Automated tech, including artificial intelligence for job recruiting, interviewing and hiring, may discriminate against qualified job-seekers.
Top officials at the Equal Employment Opportunity Commission (EEOC) and the US Department of Labor’s Office of Federal Contract Compliance Programs (OFCCP) are warning about the potential for automated technology in the workplace to accelerate discrimination.
They hosted a roundtable of experts on September 13 as their agencies consider the proliferation of automated tech, including artificial intelligence, for job recruiting, interviewing and hiring.
Jenny Yang, Director of the OFCCP, which enforces equal employment opportunity laws among federal contractors, said, “We have important work to do ahead.”
The two agencies are focused on promoting equity in tech-based hiring systems as part of a joint venture to expand access to jobs for underrepresented communities, called the Hiring Initiative to Reimagine Equity.
There’s also the Artificial Intelligence and Algorithmic Fairness Initiative, an EEOC effort meant to ensure that tech in hiring and employment decisions doesn’t flout civil rights laws.
This federal focus comes as automated technology and artificial intelligence are increasingly being used by employers.
Nearly all Fortune 500 companies use online, algorithmic recruitment sourcing and hiring tools, said Eric Reicin, president and CEO of the Better Business Bureau National Programs, a non-profit that oversees self-regulation programs for businesses.
These include video screening tools that analyze signals such as facial movements to make assessments about candidates; automated sourcing and recruitment platforms that use public data to predict competencies; and chatbots that screen potential applicants, said Wilneida Negrón, Director of Policy and Research at worker organizing platform Coworker.org.
This technology isn’t necessarily discriminatory, but it can be.
Charlotte Burrows, EEOC chair, said, “To be clear, I am not suggesting that automated hiring systems cannot be used consistently with [diversity, equity, inclusion and accessibility] initiatives. But it’s important to understand the ways in which the assumptions included in the design of some programs that automate employment decisions can affect the diversity of candidates selected.”
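Burrows’ point about design assumptions can be made concrete with a toy example. The sketch below is purely hypothetical; the feature names, weights and cutoff are assumptions for illustration, not drawn from any real hiring product. It shows how a screening score that never references a protected characteristic can still screen out qualified candidates if it penalizes features, such as employment gaps or commute distance, that correlate with those characteristics.

```python
# Hypothetical illustration only: a toy resume-screening score, not any vendor's
# actual model. It never mentions race, gender or age, yet its built-in
# assumptions (penalizing employment gaps and long commutes) can still skew
# which qualified candidates advance.

from dataclasses import dataclass


@dataclass
class Candidate:
    name: str
    years_experience: float
    employment_gap_months: int   # may reflect caregiving, illness or layoffs
    miles_from_office: float     # may reflect residential patterns


def screening_score(c: Candidate) -> float:
    """Toy linear score; the weights below are assumptions, not industry values."""
    return (
        2.0 * c.years_experience
        - 0.5 * c.employment_gap_months   # design assumption baked into the tool
        - 0.1 * c.miles_from_office       # design assumption baked into the tool
    )


SCORE_CUTOFF = 15.0  # hypothetical threshold for advancing to an interview

candidates = [
    Candidate("A", years_experience=10, employment_gap_months=0, miles_from_office=5),
    Candidate("B", years_experience=10, employment_gap_months=18, miles_from_office=40),
]

for c in candidates:
    score = screening_score(c)
    decision = "advance" if score >= SCORE_CUTOFF else "reject"
    print(f"Candidate {c.name}: score={score:.1f} -> {decision}")
```

Both hypothetical candidates have identical experience, yet B is rejected solely because of the gap and commute penalties, the kind of embedded assumption regulators are urging employers to audit for disparate impact.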
Digital advertising platforms can also allow employers to show job ads only to certain workers based on race, gender and age, said Peter Romer-Friedman, a principal at Gupta Wessler PLLC, who has worked on a lawsuit centering on Facebook’s practices in this area.
He suggested that new EEOC regulations or guidance on how digital intermediaries involved in advertising or sourcing jobs are covered under the law would be helpful, although Reicin cautioned that these are complicated and evolving technologies, so a one-size-fits-all approach may not work.
When it launched its AI initiative last year, the EEOC said it would issue technical assistance on the use of AI in employment decisions. In May, it issued guidance with the Justice Department focused specifically on the impact on people with disabilities.
An EEOC spokesperson said the agency is continuing “to gather information and listen to stakeholders to inform how we can use our tools like guidance and technical assistance to increase compliance with federal EEO laws and diversity, equity, accessibility, and inclusion in both the private and federal sector.”
Yang called out businesses that work with the federal government in particular, saying that “federal contractors have an especially important role to play in taking proactive efforts to identify potential barriers that may exclude qualified talent and contractors must take action-oriented steps to address any problem areas.”
Source: GCN