AI police report writing ahead: Proceed with caution

Faster police reports, less paperwork — but at what cost? The risks and rewards of AI in law enforcement

"Artificial Intelligence" road sign under storm clouds, a concept for the dangers and warnings of technology. (Bill Oxford/Getty Images)

This article is based on research conducted as part of the CA POST Command College. It is a futures study of an emerging issue of relevance to law enforcement. Its purpose is not to predict the future but to project a variety of possible scenarios useful for planning and action in anticipation of the emerging landscape facing policing organizations.

The article was created using the futures forecasting process of Command College and its outcomes. Managing the future means influencing it — creating, constraining and adapting to emerging trends and events in a way that optimizes the opportunities and minimizes the threats of relevance to the profession.

By Lieutenant Ryan Davis

“Hey, Joe, thanks for being honest with us for once.” Detective Laci Oskins steps out of the interrogation room, leaving her partner to finish up with the suspect. She walks over to her desk, unclips her body-worn camera from her belt and docks it at her computer.

The night started many hours ago with a late-night phone call about a street robbery that left the victim in the hospital, badly hurt. She was tough and did not want to give up her purse, but she paid the price. Laci and her partner met with the victim at the hospital and reviewed some of the evidence. Luckily, Laci's partner recognized the suspect from the surveillance video they obtained, and a quick facial recognition search of the footage confirmed where the suspect had last been contacted. Laci was able to locate and arrest him, recovering the victim's purse nearby.

Now it’s time to write the report. The suspect will be due in court in less than 48 hours. Usually, Laci would’ve started this process with a cup of coffee, knowing it would take her several hours to complete. But now Laci’s body camera lets out a long “beeeeeep,” signaling it’s finished downloading.

Laci quickly types in her password and opens up the records management software. Opening the report prompt, Laci smiles to herself — she can’t believe how much easier this has gotten. The artificial intelligence-driven software has accurately formatted a clear summary of the incident with everything Laci did that night. The report is clearly broken down with headings for each step in the investigation.

The AI software has flagged several areas of the report for Laci to review and clarify. Laci quickly adjusts sentences as she scans and verifies the information. Satisfied with the thoroughness and accuracy of the report, she enters her password to verify the report has been reviewed and is complete before forwarding it to her sergeant. She steps out of the station into the cool night air and thinks back to when she used to write her reports by hand.

This story reflects what will be reality for law enforcement in the near future. Police reports are an important element of the work law enforcement does. They are relied upon by individuals, businesses, insurance companies, district attorneys and others as an accurate record of an event or investigation. [1] The public must be able to trust what they chronicle, whether to document a crime for an insurance claim or to prosecute the perpetrator. As we allow AI to begin to automate this process, we must ensure we retain human oversight, work in conjunction with the overall criminal justice system and retain transparency with the public to produce accurate reports that are free from bias.

Benefits include time savings

The time it takes to accurately prepare a police report varies widely across the United States. According to a survey of more than 11,000 police professionals across the country, officers can spend 3–4 hours a day on paperwork. [2,3] AI, however, promises to free officers from that burden and allow departments to reallocate that time to more important policing functions.

Early AI-driven report writing platforms are already in use. Axon’s Draft One and other similar products are built upon the same technology as ChatGPT and other large language models (LLMs). [4] LLMs are developed using a wide range of internet text; however, unlike humans, they cannot think or comprehend content. [5] LLMs use a type of machine learning called deep learning. Deep learning models can essentially train themselves to recognize distinctions without human intervention, although some human fine-tuning is typically necessary. [6]

Current systems have been criticized for “hallucinations,” a catch-all term for common errors, such as fabricated facts or otherwise incorrect information, often found in synthetic text. [7] Axon says it exercises tighter control over Draft One to keep the model focused on the facts rather than inventing additional information or data. Rick Smith, the CEO of Axon, has touted the company’s ability to decrease the technology’s “creativity” to ensure it focuses on the right content and limits these “hallucinations.” [7] Products like Draft One currently process only audio to develop police reports. Axon has experimented with computer vision; however, that capability needs further development. [8]

Agencies are currently using AI report-writing technology in many ways. Lafayette, Indiana, and Fort Collins, Colorado, are using report-writing tools on all levels of investigations. In Oklahoma City, police officials showed the software to local prosecutors, who advised caution before using it on high-stakes criminal cases. This resulted in Oklahoma City not using it for arrests, felonies or violent crimes. [9] Still, many more agencies are testing products to see how they can improve report writing efficiency to save officers’ time. [10]

The biggest question with the use of AI in report writing is how the courts and public will react to this change. Savvy defense attorneys will call into question the legitimacy of reports and raise doubt around inconsistencies that may arise from the use of AI. Activists and watchdog groups will question the accuracy of the reports and security of personal information.

Be transparent to gain trust

Public perception of artificial intelligence will be a key factor in how this technology is used in law enforcement. People see artificial intelligence utilized throughout many areas of their lives but are not confident the technology will be used appropriately. In a 2023 Pew Research Center poll, 52% of respondents said they felt more concerned than excited about artificial intelligence in their daily lives. [11]

Artificial intelligence has been criticized for being biased based on the data fed into it. It has also been shown to act irrationally or hallucinate in different scenarios. This makes it important for law enforcement to stress the human oversight component as artificial intelligence is added to the various systems within policing. We can trust that AI will get the report near completion, but a human will always be needed to verify that it’s complete and accurate before submitting it.

Police reports often contain sensitive personal information. Any products we use must meet the standards for protecting that information. Departments should also be transparent with community stakeholders about their uses of artificial intelligence. This will help to build trust with the use of the technology.

The court system

Officer verification and additional review steps will be critical to ensure reports are complete and accurate to the memory of the officers authoring them. Axon’s Draft One has different settings to ensure this happens.

There is a setting that allows administrators to intentionally insert obvious errors into the automatically generated reports, which users must then manually correct before saving the report, essentially forcing them to carefully review the draft for accuracy before submitting it.

In other sections, the system will create prompts for additional information that must be addressed before continuing.

A third option is to require a percentage of words to be changed or added to the report before it allows the officer to save the final report. All these settings function to ensure the officer is reading through the generated report and not submitting it without proper review and verification. AI will get the report most of the way done, but each officer would still be required to make the report personal to themselves and the incident. Still, as with human-generated reports, errors and inconsistencies will find their way in.
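The third setting above can be illustrated with a short sketch. The code below is a hypothetical "minimum edit percentage" gate, not Axon's actual implementation; the function names and the 5% threshold are illustrative assumptions. It compares the AI draft to the officer's final text with a word-level diff and refuses to save an essentially untouched report.

```python
# Hypothetical sketch of a review gate that blocks saving until the
# officer has edited a minimum percentage of the AI-generated draft.
# Names and the threshold are illustrative, not any vendor's real API.
import difflib


def percent_changed(draft: str, final: str) -> float:
    """Percentage of words in the final report that differ from the draft,
    measured with a word-level sequence match."""
    draft_words = draft.split()
    final_words = final.split()
    matcher = difflib.SequenceMatcher(a=draft_words, b=final_words)
    matched = sum(block.size for block in matcher.get_matching_blocks())
    total = max(len(final_words), 1)
    return 100.0 * (total - matched) / total


def may_save(draft: str, final: str, min_percent: float = 5.0) -> bool:
    """Allow saving only once enough of the draft has been reworked."""
    return percent_changed(draft, final) >= min_percent


draft = "The suspect fled on foot and was detained near the intersection."
edited = ("The suspect, Joe Smith, fled west on foot and was detained "
          "by me near the intersection of 1st and Main.")

print(may_save(draft, draft))   # → False: an unedited draft fails the gate
print(may_save(draft, edited))  # → True: a substantively edited report passes
```

A real product would pair a gate like this with the other two settings (planted errors and mandatory prompts), since a raw edit percentage alone cannot tell meaningful corrections from cosmetic rewording.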

Dr. Matthew Guariglia, a policy analyst for the Electronic Frontier Foundation, said he was concerned these reports were “going to destroy the ability to cross-examine officers. Because if an officer is caught in a lie on the stand, they can always just say, ‘Well, the AI wrote that.’” [12] That concern is already being expressed by prosecutors. Recently, the King County, Washington, prosecuting attorney’s office circulated a memo to Seattle-area law enforcement agencies saying it would not accept any police reports produced using artificial intelligence. The office cited the potential for AI hallucinations and other unintended errors, pointing to a previous case in which AI indicated an officer was at the scene who was never there. [13]

Another issue that arose recently was the ability of the agency and others to know whether AI was used in the creation of a police report. Through public records requests, Politico found most police departments have no way of differentiating AI-generated police reports from those written by human officers. [10] The inability to discern whether a report was written by an officer or AI raises concerns among experts who are already skeptical of the technology’s accuracy. To address the issue, in Mt. Vernon, Illinois, the police chief required his officers to disclose whether Axon’s Draft One was used when their reports were produced. [10] Other police agencies will have to individually or collectively resolve this aspect of AI reports before they can be accepted by courts and prosecutors.
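The disclosure problem described above is, at bottom, a record-keeping one: agencies simply have no provenance field to query. A minimal sketch of one possible fix follows; the record class, field names and case numbers are all hypothetical, not any records management system's real schema.

```python
# Illustrative sketch of recording AI-assistance provenance on report
# records so agencies can answer "which reports used AI?" on demand.
# All field names and values here are hypothetical examples.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ReportRecord:
    case_number: str
    narrative: str
    ai_assisted: bool = False              # was a drafting tool used?
    drafting_tool: Optional[str] = None    # product name, if disclosed
    created_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))


reports = [
    ReportRecord("24-001234", "Narrative text...",
                 ai_assisted=True, drafting_tool="Draft One"),
    ReportRecord("24-001235", "Narrative text..."),
]

# With the flag stored, a public records request or court discovery
# demand can be answered with a simple filter instead of guesswork.
ai_reports = [r.case_number for r in reports if r.ai_assisted]
print(ai_reports)  # → ['24-001234']
```

Storing the flag at write time, as the Mt. Vernon disclosure policy effectively requires, is far cheaper than trying to detect AI involvement after the fact.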

Agencies considering piloting these new systems should take a cautious approach. Discussions with district attorneys’ offices and community stakeholders should be held prior to implementation. Utilizing the systems on minor reports and incidents that do not have the potential to jeopardize court cases is a prudent option. To ensure the costs of this transition are worth it, agencies testing these systems should also assess whether they shorten report-drafting time as much as advertised.

Lastly, officers should disclose that AI was used in any report authored so others will know and can consider the veracity of the report on that basis.

Conclusion

Officers must be able to testify to the accuracy of reports generated on incidents they investigate. Human oversight will be critical in developing trust in these systems. However, as more human oversight is required of the officer, efficiency and time savings may diminish.

Minimal legislation exists to control how artificial intelligence is currently used. We will likely see more regulation in the future, and there is also the potential of court rulings to call into question reports or other documents created using artificial intelligence. It’s advisable to be upfront with any use of artificial intelligence in the completion of reports so it does not jeopardize the court process. Law enforcement leaders looking to adopt this technology should be proactive in working with their DAs’ offices and any vendors to ensure these issues and others are addressed as they pilot programs in their jurisdictions.

References

1. Graham G. The importance of police reports. Lexipol. June 2021.

2. Geremia M. IACP 2019 preview: The future of police reporting. Police1. October 2019.

3. Nuance Dragon Law Enforcement. 2019 role of technology in law enforcement paperwork annual report. October 2019.

4. Axon. Rewrite report writing.

5. Potts J. The impact of large language models on police report writing and beyond. Police1. February 2024.

6. Cloudflare. What is a large language model (LLM). October 2024.

7. Dupree MH. Cops say hallucinating AIs are ready to write police reports that could send people to prison. Futurism. August 2024.

8. Toure M. Police are using AI to write crime reports. What could go wrong? ZDNet. September 2024.

9. Murphy S, O’Brien M. Police officers are starting to use AI chatbots to write crime reports. Will they hold up in court? Associated Press. August 2024.

10. Ng A. Did an AI write up your arrest? Hard to know. Politico. September 2024.

11. Faverio M, Tyson A. What the data says about Americans’ views of artificial intelligence. Pew Research Center. November 2023.

12. Lewis S. How artificial intelligence is changing the reports US police write. The Guardian. October 2024.

13. Bishop T. Internal memo: Don’t use AI for police reports, prosecutor tells Seattle-area law enforcement. GeekWire. September 2024.

About the author

Ryan Davis has been a dedicated member of the Covina (California) Police Department since 2004 and currently serves as a patrol lieutenant. He holds a bachelor’s degree in sociology from Chapman University and a master’s degree in law enforcement and public safety leadership from the University of San Diego. Throughout his career, Davis has excelled in various roles, including field training officer, detective and SWAT team member. He is currently enrolled in California POST Command College Class 73. In his spare time, Davis enjoys golfing, traveling and spending quality time with his family.

NEXT: Generative AI in law enforcement: Questions police chiefs need to answer