New York City employers who use artificial intelligence (AI) tools in hiring will soon be subject to new regulations. On April 15, Local Law 144 goes into effect, requiring employers to notify candidates when they use AI in hiring.
Local Law 144 is one of the first pieces of U.S. legislation regulating the use of AI in hiring. The NYC Department of Consumer and Worker Protection originally set an enforcement date of Jan. 1 but, citing a “substantial volume of thoughtful comments,” pushed the date back and said it would finalize the rule in the coming months.
As it’s currently written, the law would require employers to:
- Conduct a bias audit on an automated employment decision tool prior to its use
- Notify candidates and employees that the tool is in use
- Outline to candidates the job qualifications and characteristics that the AI will use
Employers who violate any of these provisions will be subject to a civil penalty.
NYC isn’t the only government body to address this issue. In January, the U.S. Equal Employment Opportunity Commission (EEOC) posted a draft of a Strategic Enforcement Plan focused on reducing bias in AI hiring technologies.
And in October, the White House released a white paper titled “Blueprint for an AI Bill of Rights: Making Automated Systems Work for the American People.”
New Tools, New Rules
A chief motivator behind these laws is the potential for discrimination and bias that AI hiring tools can introduce.
In May, the EEOC published guidance on how AI tools can be discriminatory. The agency wrote that because the software is programmed to look for specific keywords, qualifications and requirements, it will likely exclude qualified candidates who don’t fit that exact mold.
One example the agency gave was a chatbot programmed to reject candidates with significant gaps in their resumes. Such a bot would screen those candidates out of the qualified pool even if the gap was caused by a disability or maternity leave.
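To see how a rule like that can misfire, consider a minimal, hypothetical sketch of a gap screen. The function names and six-month threshold below are illustrative assumptions, not details of any real screening product; the point is that the rule has no notion of why a gap exists.

```python
from datetime import date

# Hypothetical sketch of an automated employment-gap screen.
# The rule cannot distinguish a gap caused by a disability or
# maternity leave from any other gap, so all are rejected alike.

MAX_GAP_MONTHS = 6  # illustrative threshold

def months_between(end: date, start: date) -> int:
    """Whole months from the end of one job to the start of the next."""
    return (start.year - end.year) * 12 + (start.month - end.month)

def passes_gap_screen(employment_periods: list[tuple[date, date]]) -> bool:
    """Reject any candidate with a gap longer than MAX_GAP_MONTHS between jobs."""
    periods = sorted(employment_periods)
    for (_, prev_end), (next_start, _) in zip(periods, periods[1:]):
        if months_between(prev_end, next_start) > MAX_GAP_MONTHS:
            return False
    return True

# A ten-month gap (e.g., medical or parental leave) is screened out.
history = [(date(2018, 1, 1), date(2021, 3, 1)),
           (date(2022, 1, 1), date(2023, 1, 1))]
print(passes_gap_screen(history))  # False
```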
At a Jan. 31 hearing titled “Navigating Employment Discrimination in AI and Automated Systems: A New Civil Rights Frontier,” EEOC chair Charlotte Burrows said 83% of employers, including 99% of Fortune 500 companies, now use some type of automated tool as part of their hiring process.
Work with the System, Not Against It
Of these, a smaller but still significant number of companies report using AI software in addition to automated tools in hiring.
In a poll BioSpace ran three weeks after Burrows’ statement, 22% of respondents said their organizations use AI technology in their hiring processes. Many of the respondents work in the life sciences industry.
Prestige Scientific, a life sciences recruiting and executive search firm, is one of them. Stephen Provost, managing director and co-founder, told BioSpace that the firm began using AI to screen candidates within the past six months.
“We can see the benefits of AI for the future,” Provost said. “The tool that we use updates information in real time, so it will learn from the different criteria we give it and bring back more accurate results when we conduct a search.”
He emphasized that life sciences candidates, specifically, can work with the system to reduce their chances of being excluded from the AI’s search. Because of the many acronyms and the jargon commonly used in the industry, he said, candidates should include those specific terms in their resumes.
“Most companies have some type of software filter…that looks for certain keywords,” he said. “In this industry, I suggest using both an acronym and writing out what it stands for because you don’t know how the person setting up the AI will query the database.”
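Provost’s advice follows from how a basic keyword filter behaves. The sketch below is a simplified, hypothetical example, assuming a filter that only counts exact term matches; real applicant-tracking systems are more sophisticated, but the reason to spell out both the acronym and its expansion is the same.

```python
# Minimal, hypothetical sketch of a naive keyword screen:
# only exact matches count, so "PCR" alone will not satisfy a
# query for "polymerase chain reaction", and vice versa.

def passes_keyword_screen(resume_text: str, required_keywords: list[str]) -> bool:
    """Return True if every required keyword appears in the resume text."""
    text = resume_text.lower()
    return all(keyword.lower() in text for keyword in required_keywords)

resume_with_acronym_only = "Experienced in PCR and ELISA assay development."
resume_with_both = (
    "Experienced in polymerase chain reaction (PCR) and "
    "enzyme-linked immunosorbent assay (ELISA) development."
)

# A recruiter might query on the acronym, the full term, or both.
print(passes_keyword_screen(resume_with_acronym_only, ["polymerase chain reaction"]))  # False
print(passes_keyword_screen(resume_with_both, ["polymerase chain reaction"]))          # True
print(passes_keyword_screen(resume_with_both, ["PCR"]))                                # True
```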
Anne Hunter is the founder of Hunter Marketing AI, a consultancy focused on helping businesses integrate AI tools. She recommended fighting fire with fire—using AI software that automatically edits a candidate’s resume or cover letter to match the keywords in a job description.
Hunter highlighted how AI can help prevent discrimination.
“Optimistically, AI screening will help eliminate bias because it looks for a skills match between a candidate and a role instead of judging based on personal characteristics,” she said. “This is a step up from the old biases, such as shared college hobbies or perceived demographic abilities, that a hiring manager might be influenced by upon first glance at a resume.”
Indeed, unintentional biases from an automated recruitment tool are easier to work around than intentional biases from a real person. Still, Provost cautioned against blindly filling one’s resume with certain keywords and phrases.
“I would not suggest candidates add anything to a resume if they have no experience in that area in order to get an interview,” Provost said. “For many HR and talent acquisition professionals, it makes that candidate lose credibility instantly…it’s the number one way to stand out in a bad way.”
As these tools continue to evolve, job seekers must evolve with them if they want to keep up with the competition. Likewise, with new legal actions like New York City’s Local Law 144, employers will soon be forced to do the same.