Amazon’s New Law Enforcement Facial Tech Stirs Up Protests

With the news that Amazon is selling its facial recognition software, Rekognition, to law enforcement agencies, a number of rights and privacy groups, including the American Civil Liberties Union (ACLU), have sounded the alarm.

Amazon already has sold Rekognition to the Orlando Police Department and the Washington County Sheriff's Office in Oregon, and many other police agencies have expressed interest. According to Amazon's marketing materials, the software can track and analyze hundreds of people in a photo by searching against a database containing millions of faces.
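For readers curious about what such a search looks like in practice, the sketch below uses Amazon's publicly documented Rekognition API through the boto3 Python SDK to match a photo against a previously indexed face collection. The collection name, file path, and similarity threshold are illustrative placeholders, not details of Amazon's law enforcement deployments.

```python
"""Illustrative sketch of a Rekognition face search; not any agency's actual pipeline."""
import boto3

# Rekognition client; region and credentials come from the standard AWS configuration.
rekognition = boto3.client("rekognition", region_name="us-east-1")

COLLECTION_ID = "example-faces"  # hypothetical face collection, built earlier with IndexFaces


def search_photo(photo_path: str) -> None:
    """Search a photo against a pre-built face collection and print any matches."""
    with open(photo_path, "rb") as f:
        image_bytes = f.read()

    # SearchFacesByImage takes the largest face it detects in the supplied image
    # and compares it against the faces already indexed into the collection.
    response = rekognition.search_faces_by_image(
        CollectionId=COLLECTION_ID,
        Image={"Bytes": image_bytes},
        FaceMatchThreshold=90,  # only report matches with at least 90% similarity
        MaxFaces=5,
    )

    for match in response["FaceMatches"]:
        face = match["Face"]
        print(f"Matched face {face['FaceId']} "
              f"(label: {face.get('ExternalImageId', 'n/a')}) "
              f"with similarity {match['Similarity']:.1f}%")


if __name__ == "__main__":
    search_photo("crowd_photo.jpg")  # hypothetical input image
```

Note that this single call matches only the largest face in the image; scanning every person in a crowd photo would mean first locating each face (for example with the DetectFaces operation) and searching them individually.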

Although facial recognition software is nothing new to law enforcement, the ACLU, in a letter to Amazon CEO Jeff Bezos, expressed concern that Rekognition would be used against "persons of interest" such as undocumented immigrants.

“Amazon Rekognition is primed for abuse in the hands of government,” the letter said. “This product poses a grave threat to communities, including people of color and immigrants, and to the trust and respect that Amazon has built.”

Although facial recognition technology has proven an effective law enforcement tool, it has had its pitfalls. Recall the trouble Google Photos ran into because its system wasn't built to handle all skin tones, and the embarrassment of tagging pictures of black people as gorillas. Then there were Hewlett-Packard's face-tracking webcams, which failed to detect black faces at all.

When Apple released its iPhone X, which uses facial recognition to unlock the device, the immediate concern was whether consumers could trust Apple not to send their facial data to other databases.

Privacy organizations are most concerned that Rekognition is designed to operate at a distance, without the knowledge or consent of the person being identified. Individuals have no way to prevent themselves from being identified by cameras that could be located anywhere, including on unmanned drones and even in the eyewear of strangers using devices like Google Glass.

There have been notable foul-ups at this personal level, too. Take the case of John Gass, who on April 5, 2011, received a letter from the Massachusetts Registry of Motor Vehicles (RMV) ordering him to stop driving immediately. After investigating, Gass learned that his image had been flagged by a facial recognition algorithm that determined he looked sufficiently like another Massachusetts driver who had a criminal record. When notified of its mistake, the Massachusetts RMV claimed that the burden fell on the accused to clear their names, arguing that the advantages of protecting the public outweighed the inconvenience to the wrongly targeted few.

Based on these and other examples, facial recognition has some housecleaning to do, both technologically and legally.