12 October 2023

CFPB Guidance Puts Creditors On Notice About AI-Involved Adverse Actions

Jones Day

Recent Consumer Financial Protection Bureau ("CFPB") guidance reiterates that creditors must provide consumers with accurate and individualized explanations for adverse actions—a task made more difficult by the complexity of artificial intelligence ("AI") systems.

The CFPB recently addressed the intersection of AI and regulatory compliance, issuing guidance for creditors that use AI and related complex modeling in their consumer credit decision-making. The guidance reiterates that creditors must comply with the Equal Credit Opportunity Act and Regulation B by providing accurate and specific explanations when they take adverse actions against consumers, even where the AI models involved are inherently difficult to explain.

Critically, the guidance restricts the use of generic "checklist" adverse action forms when communicating the basis for an adverse action to a consumer. Instead, it pushes lenders to tie the adverse action to the actual circumstances of the individual consumer. Where the sample "checklist" reasons for denial provided by the CFPB do not accurately capture why the creditor took the adverse action, the explanation must go beyond them. The CFPB states that such precautions are necessary to protect consumers' access to nondiscriminatory credit decisions in light of a surge in AI use for consumer credit decisions, along with the algorithms, machine learning, and voluminous data processing that such use entails.

One CFPB concern is that AI algorithms may consider "data that are harvested from consumer surveillance or data not typically found in a consumer's credit file or credit application." Such processes can also be difficult to explain given the complexity of the underlying models and the volume of data inputs. The guidance therefore stresses the importance of disclosing the specific negative behaviors or reasons behind a credit decision, going beyond broad categories, and cautions against exclusive reliance on checklist-based practices that are not tailored to the individual consumer.

Though the CFPB notes in a related circular that companies cannot rely on the checklist as the "sole" basis for adverse actions, indicating that checklists may still be used to some extent, lenders will be well served to ensure that the systems they use for consumer credit decision-making yield explanations that tie back to each consumer's individual data.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
