Police departments around the world are increasingly using body-worn cameras to improve public trust and accountability. But this has created huge amounts of data, about 95% of which is never reviewed or even seen.
Enter companies such as Axon, Polis Solutions and Truleo. These firms market artificial intelligence (AI) tools to analyze the data generated by body-worn cameras and other police technologies.
Some police departments in the United States previously trialed these tools before abandoning them over privacy concerns.
Truleo told The Conversation that police in Australia are already using its technology, but didn’t name a specific department. However, when The Conversation asked Australian police departments whether they were using Truleo's software or considering using it, all but the Queensland Police Service said they weren’t.
In a statement, a Queensland Police Service spokesperson said it was currently conducting an AI trial using “various technologies” as part of its work to combat domestic and family violence. The spokesperson added: “Once the trial is complete, a detailed assessment will be carried out before the QPS considers future options for deploying the technology.”
But AI won’t solve the challenges facing police – at least not alone.
The unfulfilled promise of body-worn cameras
The increasing use of body-worn cameras by law enforcement in recent years follows a series of high-profile cases involving police use of force. In Australia, for example, a police officer is currently in court over the involuntary manslaughter of a 95-year-old great-grandmother who was tasered.
There is ongoing debate about whether body-worn cameras actually make the behavior of police officers more transparent and accountable.
Some experts have said their effectiveness is uncertain. Others have said they’re a failed attempt at reform.
These concerns were reinforced by a large study published earlier this year.
The study examined the use of body-worn cameras in response to domestic and family violence in Australia. It acknowledged their potential usefulness, but also showed how the data from these technologies may not be used to support victims and survivors. This stems from more fundamental problems with the way police treat victims and survivors.
The diverse uses of AI in police work
Police have long used AI as a part of their work.
For example, in 2000, the New South Wales Police launched a program that used data analytics to predict which individuals were at risk of committing a crime, in order to target police surveillance.
A report from the Law Enforcement Conduct Commission later revealed that the program disproportionately targeted Indigenous youth, who were subsequently subjected to increased surveillance and more arrests for minor offences. This led NSW Police to terminate the program in 2023.
The Queensland Police Service has also proposed a program that uses AI technologies to predict the risk of domestic and family violence.
But experts have pointed to possible unintended consequences, including the criminalization of victim-survivors.
Companies like Truleo, which provides police with AI tools to analyze body-worn camera footage, say these tools improve police “professionalism”. However, it is not clear whether what is measured and assessed as “professionalism” correlates with officers’ core duties and responsibilities.
In fact, the Seattle Police Department in the US terminated its contract with Truleo, despite acknowledging the technology was “promising”.
This happened after the discovery of a case of unprofessional behavior, in which the police union argued that the use of camera footage violated the officer's privacy.
The need for structural reform
AI tools could help police manage and analyze data from body-worn cameras. But their value depends on several conditions.
First, police must thoroughly evaluate any AI tools to ensure they are fit for purpose in the local context. Many of these technologies are developed overseas and trained on data with linguistic characteristics – such as accents, inflections and insults – that may not be common in Australia.
Second, police – and the companies that provide AI data analysis tools – must also be transparent about how they use body-worn camera footage. In particular, they must communicate where, how and under what safeguards data will be processed and stored.
Finally – and most importantly – the use of AI technologies by police should not replace organizational and structural reform.
Police must examine the behaviors and processes that have led to unequal policing practices. AI technologies are not a solution to these underlying dynamics.
Without an understanding of the systemic structures that lead to inequities in the criminal justice system, police will be unprepared for the impact of integrating AI technologies into their work. These technologies are then more likely to exacerbate existing injustices and inequalities.
In short, questions about AI should not just be about the technology, but about the legitimacy of the police.