
CFPB Tells Firms "Black Box" Credit Models Used by Banks, Other Lenders Must Not Discriminate

Thomson Reuters Regulatory Intelligence
by Richard Satran
June 15, 2022

The U.S. Consumer Financial Protection Bureau has told the firms it regulates that federal anti-discrimination law extends liability to banks and other lenders for their use of algorithmic models in credit decisions. The consumer finance agency’s decision to enforce against bias in what it calls “black box” credit decisions was disclosed in a “Consumer Financial Protection Circular” that asserted authority to regulate lenders’ widely used fintech operations.

The agency underscored its intent to begin enforcing the ruling immediately; it appealed to technology workers to file whistleblower claims when they have knowledge of programming features that promote disparate lending practices. While the CFPB itself lacks a program offering rewards for tips, it could rely on other law enforcement partners that offer whistleblower incentives.

The CFPB, which has used the “circular” policymaking route to implement administrative priorities instead of the much slower rulemaking process, has aimed at tighter oversight of financial firms in a series of recent initiatives. In the new circular, issued May 26, it cited provisions of the Equal Credit Opportunity Act (ECOA), which requires lenders to explain “adverse actions” in credit denial decisions, as the legal authority supporting its regulatory action.

CFPB Could Face Legal Challenges, Implementation Obstacles

The CFPB circular lacked specifics on what actions constitute violations and on how the agency will enforce the rule, said Michael Gordon, a partner with the law firm Ballard Spahr LLP and a former senior CFPB official involved in the startup of the agency.

“But it does send a strong signal that the CFPB will closely scrutinize the use of algorithms in credit decision-making – and will have little tolerance when firms fall short of their obligation to inform consumers about adverse decisions,” Gordon said.

The CFPB’s algorithmic-bias initiative follows Biden administration orders for regulators to push broadly on both discriminatory lending violations and modernizing oversight of financial sector digital practices. CFPB Director Rohit Chopra has moved more quickly than other U.S. agencies in mapping out fintech oversight in a series of studies and initiatives on lending bias. The agency has pushed its oversight beyond financial services firms to include large tech firms that have recently launched consumer credit offerings.

The CFPB at the same time has stepped up enforcement of existing discriminatory lending laws in a series of recent actions against financial firms, marking a major departure from the Trump era, when relatively few enforcement actions were brought and those often were resolved with warning letters in place of penalties.

CFPB Innovation Unit Revamped

The circular carries weight similar to the Securities and Exchange Commission risk alerts that outline priority areas for scrutiny based on examination results. It puts the industry on notice that the agency will view discriminatory lending in fintech as no different from brick-and-mortar fair lending law violations.

“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said Chopra. He cited the Equal Credit Opportunity Act’s specific requirement that firms issue notices explaining the adverse credit decisions they make.

The CFPB in another similar policy interpretation said recently that it will monitor for discriminatory practices in how firms manage credit cards and loans after an application is approved. In that prior decision the CFPB warned lending institutions that “anti-discrimination protections do not vanish once a customer obtains a loan.”

The agency also signaled recently, in what was seen as its most significant recent policy move, that it would examine lenders and service providers for discriminatory practices by applying previously unused authority to combat Unfair, Deceptive and Abusive Acts and Practices (UDAAP) under the Dodd-Frank Act of 2010, which created the bureau.

That authority, which the agency has not used in the past, allows it to pursue lending-practice violations in actions that carry significant fines. UDAAP gives the CFPB broad power to pursue fraud in patterns that show consumer abuse without requiring evidence of intent to mislead.

More Forceful CFPB Shifts Focus

The CFPB last month signaled its more forceful regulation of fintech when it revamped the Office of Innovation, which under the prior administration had been a vehicle to encourage fintech development with "no action" letters and regulatory "sandboxes" for developers to launch new applications. The relaunched Office of Competition and Innovation aims to expose anti-competitive practices of large banks and tech firms that hamper smaller innovators.

The new competition unit will “analyze obstacles to open markets, better understand how big players are squeezing out smaller players, host incubation events, and, in general, make it easier for people to switch financial providers.” The agency made clear in its new black box initiative that creditors cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new.

The agency last October ordered major U.S. tech firms Apple, Facebook, Google, PayPal, and Square to provide information on their online payments and use of customer data, in an initiative that has also focused on Chinese tech giants outside its regulatory oversight, including Alipay and WeChat Pay.

The new move into regulating algorithmic processes could prompt legal challenges by firms over the CFPB’s authority in emerging areas, legal experts said. The CFPB’s legal status was upheld in a recent split Supreme Court ruling that nonetheless limited the unchecked authority of the agency’s director. The U.S. high court has in general taken a more skeptical view of regulatory authority in recent decades.

CFPB Director Targets Digital Practice Abuses

Within weeks of Chopra taking over as CFPB director last October, the agency launched an initiative to review the practices of large tech firms as consumer credit entities, with such programs as Buy Now, Pay Later used by millions of consumers for online purchases. Since then it has launched reviews of real estate valuation data metrics and other uses of technology in consumer finance.

“This CFPB circular is part of a larger effort by the agency to prioritize fair lending compliance and enhance scrutiny of technology innovations in consumer finance markets,” Gordon said. The CFPB said in its advisory on black box practices that its oversight would "extend beyond adverse action notices and ECOA," citing its recent inquiry into automated valuation models within the home appraisal process as an example of the types of behavior it could include in its expanded oversight.

The CFPB said it plans to invoke the adverse action provision under ECOA, which “gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”

The CFPB has said that it plans to take strong actions to offset data abuses because “data harvesting on Americans has become voluminous and ubiquitous, giving firms the ability to know highly detailed information about their customers before they ever interact with them.”

The abuses, it said, use “detailed datasets to power their algorithmic decision-making, which is sometimes marketed as ‘artificial intelligence.’ The information gleaned from data analytics has a broad range of commercial uses by financial firms, including for targeted advertising and in credit decision-making.”
