While recent public attention has largely focused on generative artificial intelligence (AI), the use of AI for recruitment and promotion screening in the employment context is already widespread. It can help HR professionals make sense of data as the job posting and application process is increasingly conducted online. According to a survey conducted by the Society for Human Resource Management (SHRM),[1] nearly one in four organizations use automation and/or AI to support HR-related activities, such as recruitment, hiring, and promotion decisions, and that number is poised to grow. The same survey found that one in four organizations plan to start using, or to increase their use of, automation or AI in recruitment or hiring over the next five years, and one in five plan to do the same in performance management.
AI tools, when used for HR purposes, have the capacity to impact workers at every stage of employment, from applicant screening and performance counseling to promotion, discipline, and discharge decisions. Varying approaches to regulating these tools have emerged. We previously discussed the proposed No Robot Bosses Act, which would prohibit the use of “Automated Decision Systems” in some cases and give employees certain opt-out rights. Similarly, we reported on the EEOC’s recent guidance concerning the use of automated systems, with a focus on how use of these tools might violate Title VII.
New York City’s approach to regulating these tools is designed to make their use more transparent. Local Law 144 of 2021 (“Local Law 144”), effective January 1, 2023, remains one of the few laws in recent years that have passed through the legislative process to regulate AI in the employment context.[2] We have previously written about Local Law 144, from its passage, to its implementing regulation (the “Final Rule”), to the New York City Department of Consumer and Worker Protection’s (“DCWP”) release of FAQs for the law (the “FAQs”). Now that the law has been subject to enforcement for a few months—since July 5, 2023—it is a good time to take stock of its requirements.
In general, Local Law 144 requires covered employers and employment agencies that use “automated employment decision tools” (AEDT) to meet two primary requirements: (1) ensure that the AEDT is subject to a bias audit and publish its results; and (2) provide notice to applicants and employees before using an AEDT. The Final Rule and FAQs are instructive in helping employers and employment agencies navigate compliance with these requirements.
Geographic Scope of the Law
Local Law 144’s bias audit and notice requirements apply to use of AEDT “in the city.” The DCWP’s FAQs help provide additional clarity regarding this geographic limitation. First, the FAQs explain that compliance with the bias audit requirements applies to: (1) employers using AEDT for jobs based in New York City (including remote jobs associated with a New York City address); and (2) employment agencies either located in New York City or hiring for jobs in New York City. With respect to the notice requirement, the FAQs clarify that only applicants or employees who are New York City residents must receive the required notice.
What is an Automated Employment Decision Tool?
An AEDT is a computational process, derived from “machine learning, statistical modeling, data analytics, or artificial intelligence,” that issues a “simplified output” (i.e., scores, classifications, or recommendations) to “substantially assist or replace discretionary decision making” for “employment decisions” (i.e., screening candidates for employment or employees for promotion in the city). We have written about the definition of AEDT and the Final Rule’s clarification of some of the terms in its definition. For example, the Final Rule clarifies that AEDT are limited to those that (i) rely “solely” on a simplified output; (ii) use a simplified output as a factor weighted more than any other criterion in a set; or (iii) use a simplified output to “overrule” conclusions derived from other factors, including human decision-making. Applications could include resume screening software or pre-employment assessment programs. Employee productivity and monitoring tools could also conceivably be covered to the extent they generate a score or classification used to screen employees for promotion.
Auditing AEDT for Potential Bias
Before using an AEDT, employers and employment agencies must ensure that the tool has been the subject of a bias audit conducted by an independent auditor within the past year. The Final Rule clarifies the definition of “independent auditor” by requiring that the auditor: not be employed by the business using the AEDT or by the vendor that developed or distributes the AEDT; not have any involvement in the use, development, or distribution of the AEDT; and not have a direct financial interest or a material indirect financial interest in the employer or employment agency that will use the AEDT or in the vendor that developed or distributed it. Notably, however, the FAQs state that a vendor can have an independent auditor conduct a bias audit of its own tool. This may ease the burden on employers and employment agencies, although the FAQs make clear that the responsibility to ensure that a bias audit is completed still falls on the employer or employment agency, not the vendor.
Addressing public concern that Local Law 144 was largely silent on what constitutes a “bias audit,” the DCWP provided detail in its Final Rule as to what must be included in a compliant bias audit. Specifically, a bias audit must include the scoring and/or selection rate for each race/ethnicity and sex category reported on an EEO Component 1 report. It must also include the “impact ratio,” meaning the selection (or scoring) rate for each category divided by the rate for the most selected (or highest scoring) category. The impact ratios show whether any race/ethnicity and sex categories of applicants or employees are selected (or scored) at a lower rate than the most favored category.
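To make the arithmetic concrete, the short sketch below works through the impact ratio calculation on invented numbers. The category labels and counts are purely illustrative assumptions; they are not drawn from Local Law 144, the Final Rule, or any actual audit.

```python
# Hypothetical illustration of the bias-audit arithmetic described above.
# The categories and counts are invented for demonstration purposes only.

# Applicants selected (moved forward) and total assessed, per category.
audit_data = {
    "Category A": {"selected": 80, "assessed": 200},
    "Category B": {"selected": 30, "assessed": 100},
    "Category C": {"selected": 10, "assessed": 50},
}

# Selection rate = selected / assessed for each category.
selection_rates = {
    category: counts["selected"] / counts["assessed"]
    for category, counts in audit_data.items()
}

# Impact ratio = a category's selection rate divided by the selection rate
# of the category selected at the highest rate.
highest_rate = max(selection_rates.values())
impact_ratios = {
    category: rate / highest_rate for category, rate in selection_rates.items()
}

for category in audit_data:
    print(
        f"{category}: selection rate {selection_rates[category]:.2f}, "
        f"impact ratio {impact_ratios[category]:.2f}"
    )
```

On these made-up figures, the most-selected category has an impact ratio of 1.0 and the others fall below it; it is this kind of disparity that the published audit summary would surface.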
A “summary” of the bias audit results must be published on the employer’s or employment agency’s website. The summary must include the date of the most recent bias audit of the AEDT, the distribution date of the AEDT, the source and explanation of the data used to conduct the bias audit, the number of individuals the AEDT assessed that fall within an “unknown” category (i.e., those whose race/ethnicity and sex are not known), the number of applicants or candidates, the selection or scoring rates, as applicable, and the impact ratios for all categories.
In the FAQs, the DCWP has confirmed that Local Law 144 “does not require any specific actions based on the results of the bias audit”; however, the bias audit results may publicly reveal data that could fuel potential discrimination claims. As we recently reported, the U.S. Equal Employment Opportunity Commission (“EEOC”) settled with a company that it alleged had programmed screening software to reject applicants over the age of 50 in violation of the Age Discrimination in Employment Act.
Notice Requirements
Local Law 144 requires notice at least ten business days before use of the AEDT. Addressing the reality that many job applications are online, the Final Rule states that notice can be provided by placing it in the job posting or on the employment section of an employer’s or employment agency’s website at least ten business days before using the AEDT, although U.S. mail or e-mail will also suffice. The FAQs also clarify that the notice need not be “position-specific.” Notably, for candidates for promotion, the FAQs further clarify that notice can be made in a written policy or procedure. This might include, for example, the company’s employee handbook or privacy policy.
The notice itself must inform the employee or applicant that an AEDT is being used and identify the “job qualifications and characteristics” that the AEDT will analyze. It must also include instructions for how to request an alternative selection process or reasonable accommodation. While the Final Rule makes clear that employers and employment agencies are not required under Local Law 144 to provide an alternative selection process upon such a request, other laws, such as the Americans with Disabilities Act, may impose that requirement, as we have written about here.
Finally, Local Law 144 requires that an employer or employment agency provide information concerning its AEDT-related data handling practices. Specifically, it must provide: the type of data collected for the AEDT, the source of such data, and the employer’s or employment agency’s data retention policy. This information must be provided either on the employer’s or employment agency’s website or “within 30 days of written request,” and, according to the Final Rule, instructions for how to make such a written request should be made available to the applicant or employee. An employer or employment agency, however, need not provide this information if it explains to the candidate or employee that doing so would violate local, state, or federal law, or interfere with a law enforcement investigation.
What Penalties May Employers Face for Noncompliance?
Organizations that violate Local Law 144 will be liable for a civil penalty of $500 per violation on the first day, and between $500 and $1,500 for each subsequent violation. Each day that the AEDT is used without complying with the bias audit requirements constitutes a separate violation, as does each failure to provide the required notice to an applicant or employee.
The Bigger Picture
Regulation of AI in the employment context has been developing at the state and federal level at a rapid pace. As mentioned above, federal efforts include the proposed No Robot Bosses Act, EEOC guidance, and guidance from other regulators, including the FTC, CFPB, and DOJ. Additionally, state regulators, such as the California Civil Rights Council and the California Privacy Protection Agency, are working on regulations addressing the use of automated decision-making tools. New York State Senator Hoylman-Sigal has also recently proposed an expansive law governing the use of automated employment decision tools and electronic monitoring practices. Many of these proposals have overlapping requirements that are similar to Local Law 144. Thus, understanding New York City’s law will help employers and employment agencies stay ahead of these ongoing regulatory efforts.
What Should Employers Be Doing Now?
Employers or employment agencies with offices in New York City, and those who are placing employees in New York City, should conduct an assessment of the tools used for hiring and promotion and the vendors providing those tools. Depending on the results of that assessment, employers and employment agencies may need to (1) engage an independent auditor to conduct a bias audit, (2) update language in applicable job postings, and (3) revise internal policies, including their employee handbook and privacy policy. Our dedicated AI Team at EBG can assist throughout this process.
*Zoe Leid, a 2023 Summer Associate (not admitted to the practice of law) in the firm’s New York office, contributed to the preparation of this post.
[1] SHRM, Automation & AI in HR, available at https://advocacy.shrm.org/SHRM-2022-Automation-AI-Research.pdf.
[2] Illinois has also enacted a law concerning the use of facial recognition and video interviewing technology during the applicant interview process.