What You Should Know About Upcoming AI and Automated Employment Tool Rules
With a growing remote workforce, many companies now hire employees virtually, without meeting them in person. The HireVue 2021 Global Trends Report found that 54% of hiring leaders said virtual interviews resulted in a speedier recruitment process, and 41% said it helped them identify the best candidates.
The proliferation of more advanced HR tools has also helped make remote hiring a very attractive approach for employers and recruiters. AI-based tools are said to enable better sourcing and screening of candidates, as well as greater engagement and easier communications.
“Artificial intelligence (AI) and analytics have gained popularity among hiring teams as an effective way to accelerate the hiring of qualified candidates at scale,” said Eric Sydell, EVP of innovation at HR software firm Modern Hire.
A 2022 Zippia study found 94% of hiring professionals who use an Applicant Tracking System (ATS) say the software has improved their hiring process, and 68% agree that the best way to continue improving recruiting performance is to invest in new technology.
The process is called automated hiring, and while it’s gaining traction at a rapid pace, there are also serious considerations to keep in mind before jumping in.
Regulations Are Coming
With so much buzz surrounding automated hiring, it didn’t take very long for legislators to examine the risks of using AI in the recruitment process. New York City, for instance, has passed a law that requires that AI and algorithm-based recruitment and HR technologies be audited for bias before being used.
A bias audit requires the hiring company to have an independent auditor verify that the tool evaluates candidates impartially. At a minimum, the audit must test whether the tool provides a balanced assessment of all candidates.
But there are other aspects to the law, said Angela Preston, associate general counsel, corporate ethics and compliance at New York-based background check provider Sterling. Regulations being passed in the New York area also prohibit the use of such tools unless candidates are notified, she said.
There are still many questions surrounding the new rule and how employers can ensure compliance. An article published in Harvard Business Review discusses the privacy risks associated with using AI in recruitment.
“While an employer may not violate any laws in merely discerning an applicant’s personal information, the company may become vulnerable to legal exposure if it makes adverse employment decisions by relying on any protected categories such as one’s place of birth, race, or native language — or based on private information that it does not have the right to consider, such as possible physical illness or mental ailment,” the authors wrote.
UK job search firm Total Jobs analyzed 77,000 job postings on the company’s website and found nearly 500,000 instances of gender bias. According to their findings, the average job posting had six male-coded or female-coded words.
While many such biases are likely unintentional, it’s no surprise that states are exploring regulations to help minimize the risks of automated hiring.
So, What Should Companies Do?
The situation is still evolving, and the most important thing leaders, recruiters and HR teams can do is stay informed and seek expert counsel if needed. Changes are clearly coming, and compliance is likely to be scrutinized for some time. In particular, Preston said organizations that use AI and automated tools to help fill gaps in their workforce should keep an eye on the following items:
Note How Automated Employment Tools Are Defined
Laws are prescriptive, and definitions matter. Preston said it’s critical for leaders to understand how terms like “automated employment tools” are defined to get a clearer sense of how they apply to their use cases.
Definitions also differ from state to state, and businesses that operate in more than one state may need to speak to legal advisors to make it through the maze of regulations. Some companies may prefer to adhere to the strictest law or the narrowest definition to help protect the company.
Keep up With Federal Guidance
Similarly, employers need to ensure they are familiar with federal guidance, Preston said. There is more to this than state-specific rules and regulations. Operating in a state that does not yet have such laws may still require employers to keep an eye on how the situation is evolving at the federal level.
Identify and Stress-Test Automated Hiring Systems
Preston recommends companies that use AI in their HR processes complete an audit of the hiring systems to identify possible automated screening tools using the definitions in the NYC law.
It is also wise to review the tool and seek out any potential bias that may have been introduced inadvertently. Preston advised: “Consider whether you should conduct bias testing and whether it is legally required, and document any testing that is done.” Documenting the bias tests helps protect the business should there be a need to prove that efforts were made to eliminate bias and comply with guidelines.
The same tests should also be conducted with an eye on privacy.
Consult With Legal Counsel
Businesses that use or are looking to implement AI and automation tools in their HR processes should consult with legal counsel if at any point unsure about their level of compliance with the new rules.
Even if operations are not in states where such rules have been enacted or discussed, it’s clear that they are coming, and it’s best to be prepared.
Ensure Candidates Know About AI Use
If this isn’t yet part of the process, Sydell said employers should ensure all candidates know early in the application process that their data may be put through AI or automation techniques. An even better option would be to also offer candidates the ability to refuse to have their data processed by AI, without risk of discrimination for having opted out.
A data audit should also be conducted to find out what information is required for the position and only collect the information essential for processing the application, Sydell said.
Employers are turning to technology to assist in their hiring practices; it would be a shame if they used technological advancements to gather sensitive information for purposes other than those stipulated to candidates and employees.
This article was originally published in Reworked.
This content is offered for informational purposes only. First Advantage is not a law firm, and this content does not, and is not intended to, constitute legal advice. Information in this content may not constitute the most up-to-date legal or other information.
Readers of this content should contact their attorney to obtain advice concerning any particular legal matter. No reader or user of this content should act or refrain from acting on the basis of information in this content without first seeking legal advice from counsel in the relevant jurisdiction. Only your individual attorney or legal advisor can provide assurances that the information contained herein – and your interpretation of it – is applicable or appropriate to your particular situation. Use of, and access to, this content does not create an attorney-client relationship between the reader or user of this content and First Advantage.