In this promotional video from Truleo, the company shows how its AI would have analyzed Miami-Dade police officers during the recent controversial traffic stop of NFL player Tyreek Hill. (Warning: video contains profanity)
"Who will watch the watchmen?" In the age of police body cameras, the answer may be "artificial intelligence."
After a decade of explosive growth, body cameras are now standard-issue for most American police as they interact with the public. The vast majority of those millions of hours of video are never watched — it's just not humanly possible.
For academics who study the everyday actions of police, the videos are an ocean of untapped data. Some are now using "large language model" AIs — think ChatGPT — to digest that information and produce new insights.
"For us, it's a game changer," says Jennifer Eberhardt, a psychology professor at Stanford whose work on race and crime won her a MacArthur "genius grant."
She leads a team of researchers who used AI to help review and analyze videos of nearly 600 traffic stops by Oakland police.
"We could look at the first 27 seconds of the stop, the first roughly 45 words that the officer spoke, and we could use this model to predict whether that driver was going to be handcuffed, searched or arrested by the end of the stop," she says.
The research found the encounters were more likely to escalate when officers started the stop by giving orders, rather than reasons for the interaction.
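To make the idea concrete, here is a purely illustrative Python sketch — not the Stanford team's actual model, which was a trained language model, not a keyword list. It mimics the finding above by scoring command-like openings against explanatory ones within the officer's first 45 words; every phrase list and threshold here is a made-up assumption.

```python
# Illustrative heuristic only -- NOT the researchers' model.
# Hypothetical phrase lists standing in for what a trained model might learn.
ORDER_PHRASES = ["hands on the wheel", "step out", "turn off", "give me"]
REASON_PHRASES = ["the reason", "i pulled you over", "you were speeding", "because"]

def predict_escalation(opening: str, window: int = 45) -> bool:
    """Predict escalation from the first `window` words the officer speaks.

    Returns True when command-like phrases outnumber explanatory ones.
    """
    text = " ".join(opening.lower().split()[:window])
    orders = sum(p in text for p in ORDER_PHRASES)
    reasons = sum(p in text for p in REASON_PHRASES)
    return orders > reasons

# An order-first opening vs. a reason-first opening
print(predict_escalation("Step out of the car. Hands on the wheel."))            # True
print(predict_escalation("The reason I pulled you over is you were speeding."))  # False
```

The real system presumably learns such patterns from transcripts rather than matching fixed phrases, but the input-output shape — early words in, outcome prediction out — is the same.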
While academics are using AI from anonymized videos to understand larger processes, some police departments have started using it to help supervise individual officers — and even rate their performance.
"It's an early warning system to address not just bad behavior, [but] good behavior," says Nishant Joshi, chief of police in Alameda, California. When he took over as chief three years ago, he brought in a pilot version of Truleo, a system that analyzes automated transcriptions of body camera videos to assess how officers perform.
On his computer, he looks up Truleo's assessment of a recent traffic stop. The AI has found the Alameda police officer performed with a "high degree of professionalism."
"'The reason for the stop was quickly explained to the occupants,'" Joshi reads from the AI's summary. "So it realized that the officer said something like, 'The reason I'm pulling you over is you were speeding, you gotta be careful, it's dangerous.'"
The system will also flag swearing or abusive language — by the officer or other people at the scene. Joshi says he especially appreciates that it can pick out instances in which members of the public swear at the officer and the officer remains professional, a situation that might otherwise go unappreciated by supervisors.
"I send out praise a lot," Joshi says. "And Truleo has a feature in there that you can snip out a certain portion of an interaction and send the officer a compliment on how they perform." Joshi believes the system makes it easier for supervisors to catch the bad habits of officers and reinforce the good, and is well worth the $36,000 annual price tag.
Larger departments pay more for the service. Truleo says 30 police agencies now use the product, including the New York Police Department.
AI's ability to mold officers' behavior has been tested in a new independent study led by Ian Adams of the University of South Carolina. It looked at Truleo's effects on officers and control groups at two agencies: the police department in Aurora, Colo., and the sheriff's office in Richland County, S.C.
"The difference that we see in these findings is, you know, in [Aurora] it's driving the rate of low professionalism down, which is good," Adams says. "And [in Richland County], it's raising the rate of highly professional encounters, which is also good."
A pre-publication executive summary of the research says Truleo "nearly doubled the incidence of 'highly professional behavior'" among Richland deputies whose videos were monitored by Truleo, and who were able to interact with the system as it evaluated them.
Adams says he was also surprised by officers' apparent openness to being judged by AI. While some had misgivings, others liked the machine's impartiality.
But not every rollout of Truleo has gone smoothly.
"AI looking over us -- when does it stop?" says Mike Solan, president of the Seattle Police Officers Guild.
When media reports revealed early last year that the Seattle Police Department was trying Truleo, it took the union by surprise.
"They went behind our backs and rolled this thing out," Solan says. "They were indeed spying on us. And when we caught them, they panicked."
In a statement to NPR, the Seattle Police Department wrote, "SPD cancelled its contract with Truleo in early 2023, no longer has any relationship with the company, and is not paying for the product."
Solan says he's not opposed to Truleo per se, but he says how it's used is something that should be negotiated in the union contract. In November, the Seattle City Council set aside $250,000 for police to continue to use Truleo, but the department has yet to renew its relationship with the company.
Truleo's co-founder, Anthony Tassone, says Truleo works best in departments where officers are aware of the AI and can watch how it assesses them.
"Officers devour this information. They're in the car, they rewatch their footage. They're like athletes watching last night's game," Tassone says.
But that raises a question: will officers simply cater to the demands of the software? Adams says Truleo is clearly following a formula -- for instance, officers who use more than 25 words to explain something get points for "professionalism." He says some officers told the researchers they were purposely playing the game according to Truleo’s rules.
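A minimal sketch of the kind of rule officers could game, assuming nothing beyond the one criterion the article names. Truleo's actual scoring is proprietary; the 25-word threshold is the single example the researchers cite, and everything else here is invented for illustration.

```python
# Hypothetical rule-based scorer -- not Truleo's real criteria.
# The article reports one rule: explanations over 25 words earn
# "professionalism" points.

def explanation_score(utterance: str, threshold: int = 25) -> int:
    """Award one point if an explanation exceeds `threshold` words."""
    return 1 if len(utterance.split()) > threshold else 0

terse = "You were speeding."
padded = ("The reason I stopped you today is that you were going well over "
          "the posted limit on a street with a school zone, and I want to "
          "make sure everyone stays safe out here.")

print(explanation_score(terse))   # 0
print(explanation_score(padded))  # 1
```

The example also shows why gaming is easy: any utterance padded past the word count scores the point, whether or not the extra words add substance — which is exactly the "true change of heart" question Adams raises.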
"For the research team, we're sort of left with a 'So what?'" he says. He calls it a philosophical question.
"Does it matter that there's not a true change of heart, versus an officer figuring out, 'Oh, this is just what I'm supposed to do?'" he asks. "It's something to consider in this brave new world, I guess."