Community members react to approval of Aurora police's AI facial recognition tool



AURORA, Colo. (KDVR) — Police officers in Aurora are set to have a new tool at their disposal soon: City councilmembers passed an ordinance allowing the city’s police department to use artificial intelligence for facial recognition. Some community members are concerned.

Some council members said the new technology will help police identify suspects.
Some community members worry innocent people could end up in the crosshairs.

Aurora City Council members approved the measure in a 5-2 vote, allowing city police to use AI facial recognition software to generate tips and leads. Police said they plan to use the technology to find suspects and follow up on tips, not for arrests or immigration enforcement.

“To package this new AI system as if it’s going to protect those that it is supposed to serve, we’re not buying that. We are not buying that. This is going to be an accelerator to the discriminatory practices that we are already seeing. This is not all of a sudden going to be used to ensure that people that are not involved or guilty of a crime are protected; this actually is doing the opposite,” said MiDian Holmes, CEO of Epitome of Black Excellence and Partnership.

Police in Aurora are still under a consent decree, and some community members think back to police-involved incidents like the death of Elijah McClain, or a 2020 incident in which a family was pulled over and placed in handcuffs after police mistakenly identified their car in connection with a crime. Supporters of the change believe the new technology will reduce such incidents.

“We really need to rebuild this trust. I am hoping that this is a tool that will do that so that we aren’t pulling the wrong people out of vehicles and we aren’t incorrectly identifying people. That is what I truly believe this is going to be about. If I feel at any point that is not the case and this is some sort of an overreach of people’s rights, I will certainly be just as vocal,” said At-Large City Councilwoman Danielle Jurinsky.

The ACLU of Colorado sent FOX31 a statement about the matter:

“APD has a history of racially biased policing and excessive use of force, and the use of facial recognition would not positively impact those actions. In fact, it could worsen the racial inequities seen within police interactions in Aurora. 

“Facial recognition systems consistently misidentify women, children, older people, and especially people of color, at higher rates. Given APD’s history of disproportionate stops and arrests of Black residents, introducing a system with known bias threatens to deepen existing inequities.

“When law enforcement can match faces in public spaces or from video feeds, this creates a surveillance regime where people may self-censor, avoid demonstrations, or avoid public life. In Aurora — a city with active civic engagement, diverse languages, and cultural communities — this risk is real. Aurora is still repairing trust between communities and police and should resist adding tools that increase surveillance capacity rather than reduce invasive practices. People in Aurora deserve policing that is effective, equitable, and trusted by all residents. Introducing facial recognition will not help APD rebuild those critical pillars of policing in Aurora’s communities.” –Anaya Robinson, ACLU of Colorado

The vendor for the software, Clearview AI, recently settled a lawsuit for $51.75 million over violations of Illinois' privacy laws. Now that the measure has passed, police can begin using the technology in 30 days.
