Article

5 Jul 2023

Author:
Kyle Wiggers, TechCrunch

USA: Anti-bias law for hiring algorithms goes into effect in New York City

"NYC’s anti-bias law for hiring algorithms goes into effect", 5 July 2023

After months of delays, New York City today began enforcing a law that requires employers using algorithms to recruit, hire or promote employees to submit those algorithms for an independent audit — and make the results public. The first of its kind in the country, the legislation — New York City Local Law 144 — also mandates that companies using these types of algorithms make disclosures to employees or job candidates.

At a minimum, the reports companies must make public have to list the algorithms they’re using as well as an “average score” candidates of different races, ethnicities and genders are likely to receive from those algorithms — in the form of a score, classification or recommendation. They must also list the algorithms’ “impact ratios,” which the law defines as the average algorithm-given score of all people in a specific category (e.g. Black male candidates) divided by the average score of people in the highest-scoring category.
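To make the arithmetic concrete, here is a minimal sketch of how an impact ratio as defined above could be computed. This is an illustrative example only, not an official audit tool; the category labels and scores are hypothetical data invented for the demonstration.

```python
# Illustrative sketch of Local Law 144's "impact ratio": for each
# category, the average algorithm-given score divided by the average
# score of the highest-scoring category.
from collections import defaultdict

def impact_ratios(scored_candidates):
    """scored_candidates: list of (category, score) pairs.

    Returns a dict mapping each category to its impact ratio.
    """
    by_category = defaultdict(list)
    for category, score in scored_candidates:
        by_category[category].append(score)
    # Average score per category.
    averages = {cat: sum(s) / len(s) for cat, s in by_category.items()}
    # Divide each category's average by the highest category average.
    top = max(averages.values())
    return {cat: avg / top for cat, avg in averages.items()}

# Hypothetical example data: category "A" averages 85, "B" averages 65.
candidates = [("A", 80), ("A", 90), ("B", 60), ("B", 70)]
ratios = impact_ratios(candidates)
print(ratios)  # "A" gets ratio 1.0; "B" gets 65/85, about 0.76
```

A ratio well below 1.0 for a category would flag that the algorithm scores that group substantially lower than the highest-scoring group, which is the disparity the mandated audits are meant to surface.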

Companies found not to be in compliance will face penalties of $375 for a first violation, $1,350 for a second violation and $1,500 for a third and any subsequent violations. Each day a company uses an algorithm in noncompliance with the law, it’ll constitute a separate violation — as will failure to provide sufficient disclosure.

Importantly, the scope of Local Law 144, which was approved by the City Council and will be enforced by the NYC Department of Consumer and Worker Protection, extends beyond NYC-based workers. As long as a person is performing or applying for a job in the city, they’re eligible for protections under the new law.

Many see it as overdue. Khyati Sundaram, the CEO of Applied, a recruitment tech vendor, pointed out that recruitment AI in particular has the potential to amplify existing biases — worsening both employment and pay gaps in the process.

But the risks aren’t slowing adoption. Nearly one in four organizations already leverage AI to support their hiring processes, according to a February 2022 survey from the Society for Human Resource Management. The percentage is even higher — 42% — among employers with 5,000 or more employees.

So what forms of algorithms are employers using, exactly? It varies. Some of the more common are text analyzers that sort résumés and cover letters based on keywords. But there are also chatbots that conduct online interviews to screen out applicants with certain traits, and interviewing software designed to predict a candidate’s problem-solving skills, aptitudes and “cultural fit” from their speech patterns and facial expressions.

The range of hiring and recruitment algorithms is so vast, in fact, that some organizations don’t believe Local Law 144 goes far enough.

The NYCLU, the New York branch of the American Civil Liberties Union, asserts that the law falls “far short” of providing protections for candidates and workers. Daniel Schwarz, senior privacy and technology strategist at the NYCLU, notes in a policy memo that Local Law 144 could, as written, be understood to only cover a subset of hiring algorithms — for example excluding tools that transcribe text from video and audio interviews. (Given that speech recognition tools have a well-known bias problem, that’s obviously problematic.)

“The … proposed rules [must be strengthened to] ensure broad coverage of [hiring algorithms], expand the bias audit requirements and provide transparency and meaningful notice to affected people in order to ensure that [algorithms] don’t operate to digitally circumvent New York City’s laws against discrimination,” Schwarz wrote. “Candidates and workers should not need to worry about being screened by a discriminatory algorithm.”

Parallel to this, the industry is embarking on preliminary efforts to self-regulate.

While imperfect in certain areas, according to critics, Local Law 144 does require that audits be conducted by independent entities that haven’t been involved in using, developing or distributing the algorithm they’re testing and that don’t have a relationship with the company submitting the algorithm for testing.