According to the documents, Amazon itself has also offered to connect law enforcement with other parties interested in using Rekognition, including a body camera manufacturer.
The ACLU blog responds:
“If police body cameras, for example, were outfitted with facial recognition, devices intended for officer transparency and accountability would further transform into surveillance machines aimed at the public. With this technology, police would be able to determine who attends protests. ICE could seek to continuously monitor immigrants as they embark on new lives. Cities might routinely track their own residents, whether they have reason to suspect criminal activity or not. As with other surveillance technologies, these systems are certain to be disproportionately aimed at minority communities.”
The ACLU said it reached out to Washington County and the city of Orlando for any records showing that either organization had given citizens an opportunity to discuss the possible impact of Rekognition before it was implemented. Both agencies were also asked to provide clarification on the rules of how Rekognition would be used and how privacy rights would be protected. Neither agency responded to the ACLU's request.
In a written statement released to NPR, Amazon responded to concerns by pointing out the applications of Rekognition beyond law enforcement: “[O]ur quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology. Imagine if customers couldn't buy a computer because it was possible to use that computer for illegal purposes? ... [W]e require our customers to comply with the law and be responsible when using Amazon Rekognition.”
A video released by the ACLU outlines its position on law enforcement's use of Amazon Rekognition.
The Orlando Police Department told NPR that its Rekognition pilot program is “limited to only eight city-owned cameras” and a “handful” of volunteer officers. It also said it was not “utilizing any images of members of the public for testing.”
The Latest in a Trend
While Amazon may be catching the most heat because of its size and influence, it is far from the only company releasing facial recognition technology for private and public use. Facial recognition has seen large-scale deployments in China, where companies like Yitu Technology are applying it to everything from retail experiences to ATM security. In one widely reported case, Chinese authorities apprehended dozens of wanted criminals at a festival in Qingdao, China, last year with the help of facial recognition cameras.
Boston-based AI startup Neurala recently announced that it was working with Motorola Solutions to integrate image recognition AI into cameras to help police search for people of interest, such as criminal suspects and missing children. The company hopes that by placing recognition capabilities at the edge—in the cameras themselves—it will significantly reduce law enforcement response times and assist in finding suspects and victims in crowded and chaotic environments.
In a statement released to Design News, Max Versace, CEO of Neurala, said that what Amazon is doing with Rekognition is symptomatic of a trend as AI becomes more sophisticated and looks for more complex use cases. “Much of other players' emphasis is placed today on enabling inference—or classification—on the device. This is, in a way, old news,” Versace said. “Inference has been possible and viable on devices for a while, whereas the frontier in AI today is learning, not only inferencing, on the device.”
In regard to user privacy, Versace said the best protection is to run AI on edge devices like cameras rather than sending all of the image processing to potentially vulnerable cloud-based systems. “Only on-device learning bears the promise of protecting user data—shipping your data and identity on the cloud is contrary to privacy,” he told Design News. “The future of an AI that does the job while respecting privacy—in particular when we are talking about a police force—requires learning on the device.”
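The privacy argument Versace makes can be illustrated with a toy sketch. This is not Neurala's or Amazon's actual code; the embedding function and watchlist comparison below are hypothetical stand-ins, meant only to show the architectural point that with edge AI, raw images stay on the device and only a minimal result is transmitted:

```python
import numpy as np

def embed(image: np.ndarray) -> np.ndarray:
    """Stand-in for an on-device face-embedding model (hypothetical).

    A real edge camera would run a compact neural network here; we
    simulate it with a fixed random projection for illustration.
    """
    rng = np.random.default_rng(0)           # fixed seed: deterministic "model"
    proj = rng.standard_normal((image.size, 8))
    v = image.ravel() @ proj
    return v / np.linalg.norm(v)

def on_device_match(image: np.ndarray, watchlist: list[np.ndarray],
                    threshold: float = 0.9) -> bool:
    """Compare a face embedding against a locally stored watchlist.

    Only this boolean would ever leave the device; the raw image and
    its embedding stay local. That containment is the core of the
    privacy argument for edge AI versus cloud processing.
    """
    query = embed(image)
    return any(float(query @ ref) >= threshold for ref in watchlist)

# Toy demo: an image matches its own watchlist entry; a very
# different image does not.
img_a = np.arange(16.0).reshape(4, 4)
img_b = -img_a
watchlist = [embed(img_a)]
print(on_device_match(img_a, watchlist))  # True: identical embeddings
print(on_device_match(img_b, watchlist))  # False
```

In a cloud architecture, by contrast, the full image would be uploaded before any comparison happens, which is exactly the exposure Versace argues on-device learning avoids.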
To date, no U.S. laws govern the use of real-time facial recognition by law enforcement. And courts may not rule on the constitutionality of facial recognition AI until a widespread or high-profile case emerges.
“People should be free to walk down the street without being watched by the government,” Cagle and Ozer wrote in their blog. “By automating mass surveillance, facial recognition systems like Rekognition threaten this freedom, posing a particular threat to communities already unjustly targeted in the current political climate. Once powerful surveillance systems like these are built and deployed, the harm will be extremely difficult to undo.”
How do you feel about facial recognition being deployed by law enforcement? Share your thoughts in the comments.
Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, and robotics.
[Main image source: Pixabay]