
AI in Employment Decisions: Compliance with the Fair Credit Reporting Act (FCRA)

November 16, 2024

If you use AI-generated information in making employment decisions, your activity could fall under the Fair Credit Reporting Act (FCRA) and therefore require compliance with its provisions.

The Consumer Financial Protection Bureau (CFPB) issued a circular posing the question, “Can an employer make employment decisions utilizing background dossiers, algorithmic scores, and other third-party consumer reports about workers without adhering to the Fair Credit Reporting Act (FCRA)?” The short answer is no: the circular concludes that employers cannot do so without adhering to the FCRA.

The CFPB released the circular because it is “tak[ing] action to curb unchecked worker surveillance.”

The activities covered in the circular include those that record or track:

  • Current workers’ activities
  • Personal habits and attributes
  • Biometric information

Some of the activities that employers are now tracking with AI technology include driving habits, the time it takes an employee to complete tasks, the number of messages employees send, web browsing, and keystroke activity. The companies that provide the software for monitoring these activities could meet the definition of a “consumer reporting agency,” and if they do, the employer using the software must abide by the FCRA requirements.

The circular lists “two key questions” for “enforcers” to consider when determining whether a third party falls under the FCRA:

  1. Does the employer’s use of data qualify as a use for “employment purposes” under the FCRA?
  2. Is the report obtained from a “consumer reporting agency,” meaning that the report-maker “assembled” or “evaluated” consumer information to produce the report?
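The two questions above amount to a simple gating check: FCRA obligations attach only when both are answered yes. As a hypothetical illustration (the function and parameter names below are invented for clarity, not taken from the circular or the statute):

```python
# Hypothetical sketch of the circular's two-question test.
# Names are invented for illustration; this is not legal advice.

def fcra_may_apply(used_for_employment_purposes: bool,
                   provider_assembled_or_evaluated_info: bool) -> bool:
    """FCRA obligations may attach only if BOTH are true:
    1) the data is used for "employment purposes" under the FCRA, and
    2) the report comes from a "consumer reporting agency" -- i.e. the
       report-maker "assembled" or "evaluated" consumer information.
    """
    return used_for_employment_purposes and provider_assembled_or_evaluated_info

# Example: an AI vendor scores worker productivity for promotion decisions.
print(fcra_may_apply(True, True))   # True -> FCRA requirements likely apply
print(fcra_may_apply(True, False))  # False -> provider may not be a CRA
```

The point of the conjunction is that neither question alone triggers the statute: data used for employment purposes but produced in-house, or a third-party report used for a non-employment purpose, falls outside this analysis.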

Employers should keep this in mind when using third-party providers for AI-generated information and for analysis and recommendations based on that information.

When the third-party providers do fall under the FCRA regulations, employers have many obligations under the FCRA including, but not limited to, the following:

  • Providing the subject with an FCRA-compliant disclosure as a stand-alone document with no “extraneous” information added.
  • Getting written authorization to obtain the information from the applicant/employee.
  • Providing any state/county specific notices.
  • Following the two-step pre-adverse/adverse action process, including giving the subject a copy of the Summary of Rights and the report, and informing them which part(s) of the report might be used in making an adverse decision.
  • Providing specific analysis and documentation to the subject where local law requires it.
  • Giving the subject a minimum of five business days to start the dispute process.
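The five-business-day minimum in the last bullet is a calendar calculation that skips weekends. A minimal sketch of that arithmetic (the function is a hypothetical illustration; actual waiting periods vary by jurisdiction, and this is not legal advice):

```python
# Hypothetical sketch: earliest date to finalize an adverse decision,
# assuming a minimum of five business days (Mon-Fri) for the subject
# to start a dispute. Jurisdiction-specific rules may require more.
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Return the date that falls `days` business days after `start`."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4
            days -= 1
    return current

# Pre-adverse action notice sent on a Wednesday:
notice_sent = date(2024, 11, 13)
earliest_final = add_business_days(notice_sent, 5)
print(earliest_final)  # 2024-11-20 (the following Wednesday)
```

Because the count skips Saturday and Sunday, a notice sent midweek pushes the earliest final-decision date into the following week.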

If you are currently using, or considering using, AI in employment decisions, evaluate whether the supplier of that information could be classified as a consumer reporting agency, and follow the required regulations when and where applicable.

Sources: “CFPB Says FCRA May Govern Background Report Providers and Users, Including Tech Companies,” Cooley LLP – JDSupra; “Employers’ Use of AI Tracking or Monitoring Reports May Trigger Obligations Under FCRA,” Lexology; CFPB Consumer Financial Protection Circular 2024-06, https://www.consumerfinance.gov/compliance/circulars/consumer-financial-protection-circular-2024-06-background-dossiers-and-algorithmic-scores-for-hiring-promotion-and-other-employment-decisions/


By Susan Chance, courtesy of SBAM-approved partner, ASE.
