US Police Rely on AI Chatbots for Crime Reports Despite Bias Concerns

  • Editor
  • August 29, 2024

Key Takeaways:

  • Efficiency vs. Accountability: AI tools like Draft One significantly reduce the time police officers spend on paperwork, but concerns remain about accountability and the quality of AI-generated reports.
  • Potential Bias in AI Tools: The integration of AI into policing raises issues of racial bias, particularly regarding its use by departments already criticized for disproportionate treatment of minority communities.
  • Lack of Regulatory Frameworks: There is a lack of clear regulations and guidelines for the use of AI-generated police reports, making it a controversial addition to law enforcement practices.
  • Impact on Legal Proceedings: The reliability of AI-generated reports in court is a critical concern, especially given the potential for AI to generate inaccurate or biased information.

At the Oklahoma City police headquarters, Captain Jason Bussert recently demonstrated a new AI-powered software named Draft One, designed to create police reports from body camera audio.

The demonstration on May 31, 2024, highlighted how the technology developed by Axon—a leading supplier of police body cameras and Tasers—can produce a crime report in just eight seconds by analyzing the audio and radio chatter captured by body cameras worn by officers.


Sgt. Matt Gilmore of the Oklahoma City Police shared his experience using the AI tool, noting that it created a report that was more accurate and flowed better than what he could have written manually.

He was particularly impressed by the AI’s ability to capture details he had missed, such as another officer mentioning the color of a suspect’s car. For many officers, the software presents an opportunity to save time on the paperwork that traditionally follows every incident.

“It was a better report than I could have ever written, and it was 100% accurate. It flowed better,” Gilmore said.

The Oklahoma City Police Department is among several in the United States testing Draft One to generate initial drafts of incident reports. Officers who have tried the technology speak favorably of its ability to streamline their workload, allowing them more time to focus on active policing rather than sitting behind a desk.


Rick Smith, the founder and CEO of Axon, remarked that Draft One has received a very positive reaction from officers, as it alleviates the repetitive and time-consuming task of writing reports. According to Smith, many officers did not sign up for police work to spend hours on data entry.

“They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate,” he said.

However, the introduction of AI tools like Draft One is not without its concerns. Prosecutors, legal scholars, and police watchdogs have expressed reservations about the potential impact of AI-generated reports on the criminal justice process.

A core worry is how these reports could affect court proceedings. Prosecutors, for example, are uneasy about officers testifying in court about reports they did not write themselves.

Smith acknowledged the concern, noting that prosecutors "never want to get an officer on the stand who says, well, 'The AI wrote that, I didn't.'"

To address these concerns, Axon has implemented a step where officers must confirm that the report was generated using AI, adding a layer of accountability.

In Oklahoma City, police officials have decided to limit the use of Draft One to minor incidents that do not involve arrests, felonies, or violent crimes. This cautious approach follows advice from local prosecutors who suggested avoiding AI-generated reports for cases with potentially severe legal consequences.

“So no arrests, no felonies, no violent crimes,” said Captain Jason Bussert, who manages information technology for the 1,170-officer department.

In contrast, other police departments have adopted a more open policy regarding the use of Draft One. In Lafayette, Indiana, for instance, Police Chief Scott Galloway has allowed all officers to use the software for any report, noting that it has been “incredibly popular” since its introduction earlier this year.

Similarly, in Fort Collins, Colorado, officers are free to use Draft One for any type of report, though they have identified limitations when using it in high-noise areas like the city’s downtown bar district.


Beyond legal implications, the use of AI in policing has raised deeper societal concerns. Activists such as Aurelius Francisco, a community advocate in Oklahoma City, argue that these technologies could further entrench racial biases within the criminal justice system.

Francisco, who co-founded the Foundation for Liberating Minds, points out that automating the process of writing reports could make it easier for police to conduct surveillance and engage in discriminatory practices, disproportionately affecting Black and brown communities.

“The fact that the technology is being used by the same company that provides Tasers to the department is alarming enough,” said Francisco. He emphasized that automating those reports will “ease the police’s ability to harass, surveil and inflict violence on community members. While making the cop’s job easier, it makes Black and brown people’s lives harder.”

AI has already made its way into various aspects of policing, including using algorithms to read license plates, recognize suspects’ faces, and even predict where crimes might occur.

However, the use of AI to generate police reports is a relatively new development. As such, there are few, if any, established guidelines to regulate its use.


Draft One, for example, relies on the same generative AI technology that powers ChatGPT, developed by OpenAI.

According to Noah Spitzer-Williams, who manages Axon’s AI products, the company has configured its version of the technology to minimize creativity and stick strictly to the facts.

“We use the same underlying technology as ChatGPT, but we have access to more knobs and dials than an actual ChatGPT user would have,” he explained.

This adjustment is aimed at preventing the “hallucinations” or fabricated details that sometimes occur with generative AI.
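To make the “knobs and dials” idea concrete, here is a minimal, purely illustrative sketch. It assumes the controls in question resemble standard generation parameters such as temperature in an OpenAI-style chat API; the model name, prompt, and transcript are invented for the example, and none of this reflects Axon’s actual pipeline or configuration.

```python
# Illustrative sketch only: NOT Axon's actual pipeline, prompt, or settings.
# It shows how a standard generation "knob" (temperature) in an OpenAI-style
# chat API can be turned down so the output sticks to the supplied transcript
# instead of inventing details.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Hypothetical body-camera transcript used only for this example.
transcript = (
    "[Officer] Traffic stop at 5th and Main, blue sedan, driver cooperative, "
    "warning issued for a broken taillight."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice for illustration
    temperature=0,        # minimize "creativity"; favor literal, deterministic output
    messages=[
        {
            "role": "system",
            "content": (
                "Draft a brief incident report using ONLY facts stated in the "
                "transcript. If a detail is not in the transcript, omit it."
            ),
        },
        {"role": "user", "content": transcript},
    ],
)

print(response.choices[0].message.content)
```

In this kind of setup, a temperature of zero and a restrictive instruction are the sorts of adjustments a vendor could make that an ordinary ChatGPT user cannot, which is the distinction Spitzer-Williams appears to be drawing.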

As AI tools like Draft One become more integrated into everyday policing, they are already beginning to influence how officers document incidents. Some officers are becoming more verbal in narrating events while wearing body cameras, knowing that the AI will use this audio to generate reports.


This adaptation suggests a shift in how officers approach incident documentation, focusing more on ensuring that all necessary details are captured accurately.

“As the technology catches on, officers will become more and more verbal in describing what’s in front of them,” said Bussert.

Despite the apparent benefits, there are concerns about how much reliance on AI could change the nature of police work. Andrew Ferguson, a legal scholar from American University, is writing what is expected to be the first law review article on this emerging technology.

He emphasizes the need for caution, noting that while human-generated police reports have their flaws, it remains unclear which is more reliable: a report written by a person or one generated by AI.


Ferguson worries that the ease of using AI might lead officers to be less meticulous in their documentation, potentially compromising the quality and accuracy of police reports.

“I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing,” he said.

As discussions about the role of AI in law enforcement continue, the legal, ethical, and practical implications remain unresolved. Without clear regulations and more robust public discourse, the introduction of AI-generated reports could reshape the future of policing and the broader criminal justice system.


The integration of such technology must be carefully managed to ensure it serves to enhance justice rather than compromise it.

