Photo: Steven Lilley via Flickr
Last Monday, a bill aimed at tightening oversight of police conduct was introduced in the US Congress. The Justice in Policing Act of 2020 was put forward with the backing of 166 members of the House of Representatives and 35 senators. According to its sponsors, the document "should be the first such measure to control the police, change the culture of law enforcement and build trust between law enforcement agencies and our society." The bill was a direct response to growing dissatisfaction with law enforcement and to the mass protests that began after the death of George Floyd, an unarmed Black American, during his detention by police.
Dissatisfaction with police conduct has intensified the debate over law enforcement's widespread use of face recognition technologies, a practice human rights advocates have repeatedly criticized. That criticism had already led California to adopt, last September, a three-year moratorium on the use of such technologies by the police and other public agencies.
San Francisco was the first to ban face recognition systems, and several other American cities, including Oakland, California, and Somerville, Massachusetts, later followed suit. In September, law enforcement agencies across California were barred from using the technology. One recent argument against it was an experiment conducted by the American Civil Liberties Union (ACLU): 25,000 photographs of arrested people, along with photographs of members of the California legislature, were uploaded to Amazon's Rekognition service, which falsely matched 26 legislators to the arrest photos.
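In essence, the ACLU test is a one-to-many face search. Purely as an illustration, the sketch below shows how such a search can be run against Amazon Rekognition with the boto3 SDK; the collection ID, S3 bucket and file names are hypothetical placeholders, not details of the actual experiment.

```python
# Illustrative sketch of a one-to-many face search with Amazon Rekognition.
# Collection ID, bucket and object keys are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

COLLECTION_ID = "arrest-photos-demo"   # hypothetical collection name
BUCKET = "demo-face-bucket"            # hypothetical S3 bucket

# 1. Create a collection and index the arrest photographs into it.
rekognition.create_collection(CollectionId=COLLECTION_ID)
for key in ["mugshot-0001.jpg", "mugshot-0002.jpg"]:  # placeholder keys
    rekognition.index_faces(
        CollectionId=COLLECTION_ID,
        Image={"S3Object": {"Bucket": BUCKET, "Name": key}},
        ExternalImageId=key.replace(".jpg", ""),
    )

# 2. Search the collection with a portrait photo. Every indexed face whose
#    similarity score exceeds the threshold is reported as a match.
response = rekognition.search_faces_by_image(
    CollectionId=COLLECTION_ID,
    Image={"S3Object": {"Bucket": BUCKET, "Name": "legislator-portrait.jpg"}},
    FaceMatchThreshold=80,  # the service default threshold
    MaxFaces=5,
)
for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])
```

The outcome of such a search depends heavily on the chosen similarity threshold: a lower threshold surfaces more candidate matches, and with it more false positives of the kind the ACLU experiment highlighted.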
Evidence of racial bias in such software goes further. The US National Institute of Standards and Technology (NIST) found that, with images of white women, the algorithms it examined erred in roughly one case in ten thousand, whereas with images of Black women an error occurred in roughly one case in a thousand, that is, ten times more often. Last year, researchers at the Massachusetts Institute of Technology reached similar conclusions.
The Justice in Policing Act of 2020 was introduced on June 8. On June 9, IBM issued a statement saying that it was phasing out its work on general-purpose face recognition technologies and related software. The company also said it had sent a letter to Congress on June 8 urging lawmakers to "intensify the fight against police abuse and give citizens more rights to protect their constitutional rights."
Two days later, on June 11, Amazon announced a one-year moratorium on police use of its face recognition technology. At the same time, the company noted that it will continue to provide access to Amazon Rekognition to those searching for missing or abducted children and to organizations combating human trafficking. The next day, Microsoft President Brad Smith told The Washington Post that his company would not sell face recognition software to police departments until there is a law that guarantees and protects people's rights in this area. Microsoft's statement came after US civil liberties organizations, the Electronic Frontier Foundation (EFF) and the American Civil Liberties Union (ACLU), called on the company to follow the lead of IBM and Amazon.
In any case, experts believe that the events of the past week will not only change the legislative framework governing the authorities' use of face recognition, but will also affect other players in this market, if only through increased public scrutiny. Some companies have already hastened to state their position. Representatives of NEC told Recode that they "oppose any use of face recognition technologies or any other technologies if this violates the first amendment to the US constitution".
source: www.washingtonpost.com