Many Facial-Recognition Systems Are Biased, Says U.S. Study

  • December 19, 2019
  • Business

The majority of commercial facial-recognition systems exhibit bias, according to a study from a federal agency released on Thursday, underscoring questions about a technology increasingly used by police departments and federal agencies to identify suspected criminals.

The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, the National Institute of Standards and Technology reported on Thursday. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

The technology also had more difficulty identifying women than men, and older people than middle-aged people.

“One false match can lead to missed flights, lengthy interrogations, watchlist placements, tense police encounters, false arrests or worse,” Jay Stanley, a policy analyst at the American Civil Liberties Union, said in a statement. “Government agencies including the F.B.I., Customs and Border Protection and local law enforcement must immediately halt the deployment of this dystopian technology.”

Article source: https://www.nytimes.com/2019/12/19/technology/facial-recognition-bias.html?emc=rss&partner=rss
