A (Brief) History of Credit Reporting

By Sarah Seville, Spencer Watson, and Valerie Ploumpis
Scholars and historians who study the evolution of financial services frequently cite two wildly different dates for the beginning of the credit reporting system in the United States: either it began in the 1820s, or it began in the 1960s. Both are right, and both are wrong in different ways.
So, let’s take a quick look at what happened.
Debt and credit have existed for millennia, but credit reporting as we know it is much younger. Before the 1800s, businesses and business owners in the U.S. would secure loans from creditors by asking well-esteemed friends or neighbors to vouch for their character. Creditors would also investigate applicants themselves, collecting rumors and hearsay about them.
New bankruptcy laws enacted during the 1800s also posed a problem for lenders: if businesses could default on their debts and obtain relief, creditors would lose money. Lenders wanted a better way to predict which borrowers would successfully repay their loans. The old system – getting a loan based on a reliability recommendation from a friend or neighbor – didn’t accomplish that.
The next few decades featured several experiments to establish a standard system for evaluating a credit applicant’s character and assets. In 1841, Lewis Tappan founded the Mercantile Agency, which maintained a ledger of evaluations of borrowers in New York City. These early credit reports were frequently based on local rumors about a person’s domestic life, habits, and vices, and sometimes even speculation about their sexuality. They also frequently included remarks that were wildly racist.
Prior to the turn of the century, however, bank lending was almost exclusively for businesses. Credit from banks and commercial lenders was not yet widely available to ordinary people. Many individuals avoided taking on personal debt, which at that point came from friends, relatives, or loan sharks. Banks considered lending to individual consumers too risky.
Credit was not widely available to individual consumers in the U.S. until the 1900s. The 20th century saw an explosion in consumer credit, including the adoption of credit by retailers and department stores, General Motors’ invention of the finance company to provide car loans in 1920, the U.S. government’s invention of the 30-year mortgage in 1933, and the invention of the credit card. These developments shifted Americans’ attitudes toward personal indebtedness: consumers began to see credit as a means to acquire goods and services, and lenders increasingly began to see individuals as potential customers.[1]
The first consumer credit bureaus in the U.S. were cooperatives established by local retailers to pool the credit histories of their customers and collect debts.[2] Local finance companies and chambers of commerce organized similar pools of credit histories. As with previous credit reports, bureaus employed investigators to collect personal information about borrowers and included value judgments about individuals’ personal habits and lifestyles. There were fewer than 100 consumer credit bureaus in 1916, but by 1955 there were 1,600 credit bureaus in the U.S.
Credit managers formed a national organization in 1912 and started to create more standardized practices to collect, share, and codify consumer information. Credit bureaus were keeping an incredible amount of personal information about consumers’ lives – but up to this point, data was collected by local agencies and kept in physical ledgers. However, the massive expansion of consumer lending meant that creditors were marketing more often to customers on a regional or national scale. No single locally focused credit bureau could satisfy all of a lender’s requests for information about loan applicants.[3]
Stepping into this gap, the Retail Credit Company in Atlanta (which later became Equifax) started to computerize credit records in the 1960s, as did other newer companies like the Credit Data Corporation (which would eventually be acquired and become Experian).[4] The computerization of credit reports made them cheaper and easier to transmit to lenders, and computerized credit bureaus began to dominate the credit reporting marketplace.
In another development, credit bureaus like the Credit Data Corporation increasingly began to rely on financial accounting data provided by their subscribers to create their reports. Those figures were cheaper to acquire and more objective and accurate than the reports written by investigators. The computerization and consolidation of the credit reporting industry would lead to the emergence of the “Big Three” credit bureaus for lending that we know today: Equifax, Experian, and TransUnion. Still, thousands of smaller consumer reporting agencies continue to exist, creating tenant reports, employment background checks, and other specialty consumer reports (often using data sourced from the Big Three).
By the 1960s, consumers and policymakers had become increasingly concerned about the volume of personal, private, and subjective information contained in credit reports and the difficulty of correcting inaccuracies in credit files.[5] Starting in 1968, Congress held a series of hearings on the industry’s myriad problems with privacy, errors, and accuracy, which ultimately resulted in the passage of the Fair Credit Reporting Act of 1970 (FCRA). The Act placed limits on who could access a consumer report, required credit bureaus and other “consumer reporting agencies” to maintain reasonable procedures for maximum possible accuracy, set time limits on how long negative information could be reported, and placed specific restrictions on “investigative” consumer reports based on personal interviews. It also guaranteed consumers’ rights to access their credit reports and to contest inaccurate information.
Credit scores as we know them today are an invention of the late 1980s and early 1990s. A statistics and data firm called Fair, Isaac and Company (now FICO) developed an algorithm for assigning each consumer a numerical score based on data contained in their credit reports from the Big Three credit bureaus. Ostensibly, the scores enable lenders to assess a credit applicant’s risk of default and thereby predict whether a borrower will pay back a loan.
Now, let’s back up for a second. When scholars and historians say that credit reports started including “consumers” in the 1900s – that’s not entirely accurate. Women and people of color were frequently shut out of the credit market until the women’s rights and civil rights movements of the 1960s and 1970s.
Initially, a steady minimum income was the basis for qualifying for credit. But even when they had adequate income – which they sometimes did not, due to wage gaps and labor discrimination – women and people of color would not be approved for credit. When the first credit card was introduced in 1958, women weren’t eligible at all, due in part to the Married Women’s Property Acts – though they could sometimes be a secondary signer on a man’s card. Most credit and lending services – including loan applications, private banking, and individual credit reports after marriage – weren’t available to women until the passage of the Equal Credit Opportunity Act of 1974.
Even after ECOA, the playing field still isn’t level for women, people of color, or other minorities. Persistent gaps in income and wealth, as well as inequitable access to credit, mean that women, people of color, LGBTQI+ people, and other minorities frequently have lower credit scores and struggle more to obtain the best credit offers.
It is striking how rarely this system has been updated over the past 200 years, a span covering the Industrial Revolution, two World Wars, and the invention of the internet. So it may be unsurprising that the credit reporting industry, including the Big Three, continues to rely on legacy computer systems that are difficult to update and manage, and that have proven vulnerable to hackers and led to data breaches. This is to say: it’s time for an update to the credit reporting system.
[1] Louis Hyman, Debtor Nation 1-95 (Princeton University Press 2011).
[2] Thomas A. Durkin et al., Consumer Credit and the American Economy 247 (Oxford University Press 2014).
[3] Durkin, supra note 2, at 248-49.
[4] Hyman, supra note 1, at 206-13.
[5] Durkin, supra note 2, at 250-51.