Facial recognition confuses politicians with criminals
According to an experiment conducted by the ACLU, Amazon's facial recognition system falsely matched 28 members of the US Congress with mugshots of criminals.
For several months, Amazon collaborated with the authorities of the city of Orlando so that they could test its Rekognition facial recognition system. The American Civil Liberties Union (ACLU) strongly opposed the project when it discovered that the system had been deployed on three of the city's surveillance cameras. Although the project was officially just a trial of the technology, several associations protested, arguing that such a system could infringe on individual freedoms. Beyond that point, the ACLU also criticized Amazon for partnering with the government on such a project. Several Amazon employees sided with the association by writing a letter to the group's CEO, Jeff Bezos. In it, the signatories denounce lending the authorities software capable of identifying up to a hundred individuals in a single image.
Moreover, Amazon's surveillance system does not yet seem ready for use, as a test run by the ACLU exposed the weaknesses of some of its capabilities. For the experiment, the association compiled 25,000 photographs of criminals and compared them with those of the 535 members of Congress, analyzing any matches the system reported. The test proved telling: Rekognition flagged 28 politicians as criminals.
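In outline, the ACLU test amounts to taking each member's best match score against the mugshot database and flagging anyone whose score meets the system's confidence threshold. The sketch below illustrates that logic with invented names and scores; it is not the ACLU's actual code, and a real system would derive scores from face embeddings rather than use hand-picked values.

```python
def flag_matches(member_scores, threshold=0.80):
    """Return the members whose best mugshot-match score meets the threshold.

    `member_scores` maps each member to the highest similarity score the
    system produced against the mugshot database (hypothetical values here).
    """
    return [m for m, s in member_scores.items() if s >= threshold]

# Toy scores standing in for the 535-member vs 25,000-mugshot comparison.
scores = {"member_a": 0.91, "member_b": 0.42, "member_c": 0.83, "member_d": 0.12}
print(flag_matches(scores))  # → ['member_a', 'member_c']
```

At the 80% default, any member whose best match happens to clear the bar is reported as a hit, even though none of the 535 members actually appears in the mugshot set.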
Minorities (still) targeted by Amazon’s surveillance system
More alarmingly, the facial recognition system disproportionately, and wrongly, flagged African-American and Latino members. While they represent only 20% of the 535 members of Congress, 39% of those identified as criminals were African American or Latino. Following these results, ACLU lawyer Jacob Snow warned: "Facial recognition will be used to fuel discriminatory surveillance and law enforcement that target communities of color, immigrants and activists (...) Once deployed, the damage cannot be undone."
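The disparity is easy to quantify from the reported figures: 39% of 28 false matches is about 11 people, roughly twice the 20% share these groups hold in Congress. A quick check, using the article's approximate percentages:

```python
# Reported figures from the ACLU test (approximate):
flagged = 28                         # members falsely matched
flagged_poc = round(0.39 * flagged)  # ~11 of the 28 false matches
share_in_congress = 0.20             # share of Congress that is African American or Latino

share_of_false_matches = flagged_poc / flagged
print(f"{share_of_false_matches:.0%} of false matches vs "
      f"{share_in_congress:.0%} of Congress "
      f"(about {share_of_false_matches / share_in_congress:.1f}x over-representation)")
```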
For its part, Amazon argued that the software was incorrectly configured: the confidence threshold should have been set at 95%, not the 80% default used in the ACLU test. A spokesperson for the group said: "While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals or other social media use cases, it would not be appropriate for identifying individuals with a reasonable level of certainty."
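The threshold dispute can be made concrete: raising the cutoff from 80% to 95% discards the weaker matches that would otherwise be reported as hits. The scores below are invented for illustration, not Rekognition output:

```python
def count_above(scores, threshold):
    """Count match scores that meet or exceed a confidence threshold."""
    return sum(s >= threshold for s in scores)

# Hypothetical best-match confidence scores for a batch of probe photos.
scores = [0.98, 0.96, 0.91, 0.88, 0.84, 0.81, 0.62, 0.40]

print(count_above(scores, 0.80))  # → 6 scores pass at the 80% default
print(count_above(scores, 0.95))  # → 2 survive Amazon's recommended 95%
```

The trade-off cuts both ways: a higher threshold yields fewer false positives but also misses more genuine matches, which is why the choice of threshold matters so much in a law-enforcement setting.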
As a reminder, several entities have expressed concern about collaborations between private companies and governments; Google, for instance, ended its own collaboration with the Pentagon following an internal controversy.