The city of Orlando, Fla., says it has ended a pilot program in which its police force used Amazon's real-time facial recognition — a system called "Rekognition" that had triggered complaints from rights and privacy groups when its use was revealed earlier this year.
Orlando's deal to open part of its camera systems to Amazon was reported by NPR's Martin Kaste in May, after the ACLU noticed that an Amazon Rekognition executive mentioned the city as a customer.
On Monday, the ACLU of Florida wrote a letter to Mayor Buddy Dyer and the Orlando City Council, demanding that the city "immediately" shut down "any face surveillance deployment or use by city agencies and departments."
On the same day, Orlando city and police officials issued a joint statement saying that the test of how its officers might use the Rekognition technology ended last week.
The statement added, "Staff continues to discuss and evaluate whether to recommend continuation of the pilot at a further date," and noted that "the contract with Amazon remains expired."
Orlando's police department is believed to be the first in the U.S. to try out a real-time facial recognition system; other agencies have used the software mainly to sift through crime scene images and compare the faces in them to mug shots.
Orlando police say the pilot was limited to only a fraction of the city's cameras and that the department tested the system by tracking its own officers.
The Rekognition deal with Orlando caused a stir, and it prompted Amazon to issue a clarification about the level of engagement, after one of its executives described Orlando's pilot program in a speech delivered in South Korea in early May.
Specifically, the company said that Ranju Das, who leads the Rekognition unit, had overstated the current use and capability of the system in Orlando when he said:
"City of Orlando is a launch partner of ours. It's a smart city; they have cameras all over the city. The authorized cameras are then streaming the data ... we are a subscriber to the stream, we analyze the video in real time, search against the collection of faces that they have."
Police could use the system to track "persons of interest," Das said, citing the case of people attending high-profile public events.
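For context, the architecture Das described maps onto Rekognition's publicly documented video APIs: a "collection" of known faces, plus a "stream processor" that watches a live camera feed and reports matches. The sketch below shows roughly how such a pipeline is wired up in code; the stream names, ARNs, and match threshold are placeholders for illustration, not details of Orlando's actual deployment.

import boto3

# Illustrative sketch of a real-time face-search pipeline using Amazon's
# Rekognition API. All names and ARNs below are hypothetical.
rekognition = boto3.client("rekognition", region_name="us-east-1")

# 1. Create a collection to hold the faces the system will search against.
rekognition.create_collection(CollectionId="example-watchlist")

# 2. Create a stream processor that reads a live Kinesis video stream,
#    searches each frame against the collection, and writes any matches
#    to a Kinesis data stream for downstream alerting.
rekognition.create_stream_processor(
    Name="example-face-search",
    Input={"KinesisVideoStream": {
        "Arn": "arn:aws:kinesisvideo:us-east-1:123456789012:stream/example-camera"}},
    Output={"KinesisDataStream": {
        "Arn": "arn:aws:kinesis:us-east-1:123456789012:stream/example-matches"}},
    RoleArn="arn:aws:iam::123456789012:role/example-rekognition-role",
    Settings={"FaceSearch": {"CollectionId": "example-watchlist",
                             "FaceMatchThreshold": 80.0}},
)

# 3. Start analyzing the live feed in real time.
rekognition.start_stream_processor(Name="example-face-search")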
As criticism poured in over the idea that Orlando police could possibly use real-time facial analysis in public spaces without notice or debate, Amazon said, "It's not correct that they've installed cameras all over the city or are using in production." The company also apologized for any confusion or misunderstandings about the use of the system.
Here's how the Orlando Police Department describes the pilot program, in an email to NPR on Tuesday:
"There are eight video streams (from existing City-owned cameras) Amazon would have access to through the pilot program and it also includes photos of the faces of seven OPD officers who volunteered to have their images used in the pilot."
In its letter attacking the Rekognition program, the ACLU wrote, "These systems enable the mass location tracking of residents without criminal suspicion. Amazon's product is primed for such abuse."
As member station WMFE reports, "The ACLU asked the city council to pass a resolution or ordinance putting an end to the program."
The letter came on the heels of a similar complaint last week, when 10 Orlando-based community groups joined the Arab American Institute to ask Police Chief John Mina to shut down the Amazon program.
In that letter, the groups said, "The context of increased ICE raids, FBI targeting of Black Lives Matter activists, the securitizing of communities through Countering Violent Extremism (CVE) initiatives, racial disparities in the use of police force, and the President's Muslim Ban has led to increased levels of distrust both within our community and across the nation."
Five Rekognition cameras were at police headquarters, and three others were in Orlando's downtown, according to the letter, whose signatories ranged from immigration advocates to the Orange County Classroom Teachers Association and the Sikh American Legal Defense and Education Fund.
Using the facial-recognition system, the groups wrote, would likely increase suspicion and reduce freedoms.
As NPR reported in May, "There are no laws explicitly barring law enforcement from using real-time facial recognition, and the constitutionality has not been tested by higher courts."