At the Oklahoma City police headquarters, Captain Jason Bussert recently demonstrated new AI-powered software named Draft One, designed to create police reports from body camera audio.
The demonstration on May 31, 2024, highlighted how the technology developed by Axon—a leading supplier of police body cameras and Tasers—can produce a crime report in just eight seconds by analyzing the audio and radio chatter captured by body cameras worn by officers.
Arguably you can’t confront your accuser in this case… evidence with no origin is worthless
— Agent NaN (@thaxter) August 27, 2024
Sgt. Matt Gilmore of the Oklahoma City Police shared his experience using the AI tool, noting that it created a report that was more accurate and flowed better than what he could have written manually.
He was particularly impressed by the AI’s ability to capture details he had missed, such as another officer mentioning the color of a suspect’s car. For many officers, the software presents an opportunity to save time on the paperwork that traditionally follows every incident.
The Oklahoma City Police Department is among several in the United States testing Draft One to generate initial drafts of incident reports. Officers who have tried the technology speak favorably of its ability to streamline their workload, allowing them more time to focus on active policing rather than sitting behind a desk.
Although he was interrogating his own collected data, using AI.
— nevso (@nevso) August 27, 2024
Rick Smith, the founder and CEO of Axon, remarked that Draft One has received a very positive reaction from officers, as it alleviates the repetitive and time-consuming task of writing reports. According to Smith, many officers did not sign up for police work to spend hours on data entry.
However, the introduction of AI tools like Draft One is not without its concerns. Prosecutors, legal scholars, and police watchdogs have expressed reservations about the potential impact of AI-generated reports on the criminal justice process.
A core worry is how these reports could affect court proceedings. Prosecutors, for example, are uneasy about officers testifying in court about reports they did not write themselves.
To address these concerns, Axon has implemented a step where officers must confirm that the report was generated using AI, adding a layer of accountability.
In Oklahoma City, police officials have decided to limit the use of Draft One to minor incidents that do not involve arrests, felonies, or violent crimes. This cautious approach follows advice from local prosecutors who suggested avoiding AI-generated reports for cases with potentially severe legal consequences.
In contrast, other police departments have adopted a more open policy regarding the use of Draft One. In Lafayette, Indiana, for instance, Police Chief Scott Galloway has allowed all officers to use the software for any report, noting that it has been “incredibly popular” since its introduction earlier this year.
Similarly, in Fort Collins, Colorado, officers are free to use Draft One for any type of report, though they have identified limitations when using it in high-noise areas like the city’s downtown bar district.
It’s simply a transcription of voice to text.
An AI that’s been around for quite a while.
As a first draft I’m not sure it’s any worse than a biased police officer’s memory of who said/did what when.
When they say witnesses can be unreliable, it’s the same for cops!
— Lisa Zed Respect Unceded Gumbaynggirr Land (@LisaZed2) August 27, 2024
Beyond legal implications, the use of AI in policing has raised deeper societal concerns. Activists such as Aurelius Francisco, a community advocate in Oklahoma City, argue that these technologies could further entrench racial biases within the criminal justice system.
Francisco, who co-founded the Foundation for Liberating Minds, points out that automating the process of writing reports could make it easier for police to conduct surveillance and engage in discriminatory practices, disproportionately affecting Black and brown communities.
AI has already made its way into various aspects of policing, including using algorithms to read license plates, recognize suspects’ faces, and even predict where crimes might occur.
However, the use of AI to generate police reports is a relatively new development. As such, there are few, if any, established guidelines to regulate its use.
I suspect there’ll be a few cases tossed.
— Greg Michaels (@KootenayGreg) August 26, 2024
Draft One, for example, relies on the same generative AI technology that powers ChatGPT, developed by OpenAI.
According to Noah Spitzer-Williams, who manages Axon’s AI products, the company has configured its version of the technology to minimize creativity and adhere strictly to facts.
This adjustment is aimed at preventing the “hallucinations” or fabricated details that sometimes occur with generative AI.
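Axon has not published Draft One’s actual configuration, but the idea of minimizing creativity maps onto familiar generative-AI controls such as a low sampling temperature and a prompt that restricts the model to the source transcript. The sketch below is purely illustrative and assumes the OpenAI Python client; the model name, prompt wording, and the draft_report helper are hypothetical and are not Axon’s implementation.

```python
# Illustrative sketch only: NOT Axon's Draft One pipeline.
# Shows how a generative model can be constrained to a transcript
# by using a low temperature and a fact-only instruction.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "Draft a police incident report using ONLY facts stated in the "
    "body-camera transcript provided. Do not infer, speculate, or add "
    "details that are not explicitly present. Mark gaps as [NOT STATED]."
)

def draft_report(transcript: str) -> str:
    """Return a first-draft report grounded in the supplied transcript."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # hypothetical model choice
        temperature=0,         # reduce sampling variance, i.e. "creativity"
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content

# Example usage with a toy transcript:
# print(draft_report("Officer: Vehicle is a blue sedan, plate not visible..."))
```

A low temperature and a transcript-only instruction are generic ways to suppress fabricated detail; whether Axon uses these specific controls, or others such as retrieval checks or post-generation review, is not stated in the source.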
As AI tools like Draft One become more integrated into everyday policing, they are already beginning to influence how officers document incidents. Some officers are becoming more verbal in narrating events while wearing body cameras, knowing that the AI will use this audio to generate reports.
If it’s accurate, and it may very well be, it may be acceptable.
— Prairie Paul (@PaulDoroshenko) August 26, 2024
This adaptation suggests a shift in how officers approach incident documentation, focusing more on ensuring that all necessary details are captured accurately.
Despite the apparent benefits, there are concerns about how much reliance on AI could change the nature of police work. Andrew Ferguson, a legal scholar from American University, is writing what is expected to be the first law review article on this emerging technology.
He emphasizes the need for caution, noting that while human-generated police reports have their flaws, it remains unclear which is more reliable: a report written by a person or one generated by AI.
Will memory experts be able to challenge this in court, as surely the physical act of writing police reports should correlate to a better memory of events?
— Bryan McCann (@McCanopener1) August 26, 2024
Ferguson worries that the ease of using AI might lead officers to be less meticulous in their documentation, potentially compromising the quality and accuracy of police reports.
As discussions about the role of AI in law enforcement continue, the legal, ethical, and practical implications remain unresolved. Without clear regulations and more robust public discourse, the introduction of AI-generated reports could affect the future of policing and the broader criminal justice system.
they are probably more accurate than what they usually make up.
— Mark Bischof (@mobischof) August 26, 2024
The integration of such technology must be carefully managed to ensure it serves to enhance justice rather than compromise it.