CECNA Learning Center

Why Was Your Loan Denied? The Lender Must Tell You

It’s not unusual for a consumer’s loan application to be turned down for no apparent reason. Lenders typically claim that they consider a wide range of information in reaching their decisions.

In fact, an increasing number of those decisions are made by algorithms – so-called artificial intelligence – with little if any human involvement. They’re often based not on the information provided by the applicant but on the huge consumer databases that are also used to target advertising, spam and junk mail.

Consumers are none the wiser and are unable to find out why their application was denied or their credit was downgraded. But today, the Consumer Financial Protection Bureau (CFPB) said that’s not good enough.

The agency said that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models using complex algorithms.

The CFPB published a Consumer Financial Protection Circular to remind the public of creditors’ adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).

“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said CFPB Director Rohit Chopra. “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”


Data harvesting

Data harvesting on Americans has become voluminous, giving firms the ability to know highly detailed information about their customers before they ever interact with them, CFPB said in a news release.

Financial companies have long used advanced computational methods as part of their credit decision-making processes, and they have been able to provide the rationales for their credit decisions. However, some creditors now may be making credit decisions based solely on the outputs from complex algorithms, sometimes called “black-box” models.

The reasoning behind some of these black-box decisions may be unknown, even to the lender, which can make complying with ECOA’s requirements impossible.

The CFPB today said that laws requiring consumers to be told of the reasons for adverse decisions must be enforced, regardless of the technology used by creditors.

ECOA protects individuals and businesses against discrimination when seeking, applying for, and using credit, the circular noted. Creditors cannot lawfully use technologies in their decision-making processes if using them means that they are unable to provide these required explanations.

Read today’s Consumer Financial Protection Circular, Adverse action notification requirements in connection with credit decisions based on complex algorithms.

Consumers can submit fair lending complaints, or complaints about financial products or services, by visiting the CFPB’s website or by calling (855) 411-CFPB (2372).

