Automated report writing: Benefits and risks for police

Adding AI to the process can improve efficiency, but beware of these issues


Automated report writing is the use of artificial intelligence (AI) to generate police reports from transcriptions, data and templates. A typical system uses bodycam footage to create a report by transcribing the audio, analyzing the video for relevant details, extracting key facts, matching them to a report template and generating a draft. A human must then review and edit the draft for accuracy and completeness; some systems deliberately insert obscure details the officer must catch, confirming the report was actually reviewed. The goal is to save time, reduce errors and improve consistency in police documentation.
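The pipeline described above can be sketched in a few lines of code. This is a hypothetical illustration, not any vendor's actual system: the function names, the template and the trivial fact-extraction stub are all assumptions made for the example, and the "canary" function mirrors the deliberately inserted detail a reviewing officer must catch.

```python
# Hypothetical sketch of an automated report-writing pipeline.
# Names, template and extraction logic are illustrative only.
import random

REPORT_TEMPLATE = "Incident: {incident}\nLocation: {location}\nNarrative: {narrative}"

def extract_facts(transcript: str) -> dict:
    """Stand-in for the AI fact-extraction step (a trivial stub here)."""
    return {
        "incident": "traffic stop",
        "location": "Main St and 3rd Ave",
        "narrative": transcript,
    }

def insert_review_canary(draft: str) -> tuple:
    """Append a deliberately false detail the reviewing officer must
    find and remove, proving the draft was actually read."""
    canary = random.choice(["a green bicycle", "an open umbrella"])
    return draft + "\nOfficers also observed " + canary + ".", canary

def draft_report(transcript: str) -> tuple:
    """Transcript in, draft report plus its hidden canary out."""
    facts = extract_facts(transcript)
    draft = REPORT_TEMPLATE.format(**facts)
    return insert_review_canary(draft)

draft, canary = draft_report("Driver stated the light was yellow.")
print(draft)
```

The key design point is the last step: the draft is never final until a human has located and stripped the canary, which is one way a system can enforce rather than merely request review.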

Many police departments are adopting automated report writing, citing benefits that include reducing the workload and stress associated with manually writing lengthy and complex reports; enhancing the accuracy and quality of reports by minimizing human errors, biases and inconsistencies; and increasing the efficiency and timeliness of report delivery and review through standardized formats. Additionally, it facilitates the analysis and sharing of data across units and agencies, improves the transparency and accountability of police actions by providing clear and detailed records, and allows officers to engage more with the community and allocate additional time for service calls.

Critics note, however, that automated report writing also poses challenges and risks for officers, including errors, falsification and compromised security. The critical question is how police departments can leverage the technology to minimize report-writing time and increase officer presence in the community while still producing ethical, accountable and fair police reports that protect officers’ careers. This article discusses the pros and cons of automated report writing and how departments can navigate them.

Recent ACLU report on use of AI to draft police reports

In 2024 the ACLU produced a comprehensive report providing its perspective on the use of AI to generate police reports. This report highlighted significant concerns, including the potential for biases, inaccuracies and privacy issues inherent in AI systems. It urged police departments to reconsider the adoption of these technologies, emphasizing the importance of accountability and risks of perpetuating systemic biases.

Areas of concern included:

  • Centrality to the justice system: Police reports are essential documents that influence investigations, prosecutions and public records. They often serve as the only official account of incidents, providing a critical foundation for the justice process.
  • Potential biases and inaccuracies: The ACLU argues against the use of AI in drafting police reports due to the risk of biases and inaccuracies. AI systems can fabricate facts or reflect societal prejudices, leading to unreliable and potentially harmful outcomes.
  • Lack of traceability: AI-generated reports may lack accountability, making it challenging to trace errors or biases back to their source. This can complicate efforts to hold officers accountable for their actions, undermining trust in the justice system.
  • Privacy concerns: The integration of AI in law enforcement raises substantial privacy issues, particularly regarding data handling and surveillance. The use of body-camera transcripts for AI-generated reports can impact the accuracy of evidence and integrity of officers’ memories.
  • Perpetuation of racial biases: There are concerns AI could perpetuate existing racial biases in policing, leading to unjust outcomes. AI systems can absorb and reflect the racism, sexism and other biases present in the data they are trained on.
  • Unreliable and quirky technology: AI technology is often unreliable and prone to making up facts. It can also absorb and reflect societal biases, resulting in errors or biased interpretations of events. These issues highlight the need for cautious implementation and thorough oversight.
  • Impact on evidence accuracy: Using body-camera transcripts for AI-generated reports can affect the accuracy of evidence and integrity of officers’ memories. Human memory is malleable, and subsequent information about an event can alter a person’s recollection of what took place.
  • Need for public understanding: Ensuring transparency and accountability in AI-generated reports is crucial. The public must understand how these reports are created, and defendants in criminal cases must be able to interrogate the evidence against them to ensure a fair trial.
  • Accountability in discretionary actions: AI-generated reports might remove accountability around the use of discretionary power by police officers. Written justifications for actions like stops, frisks and searches are reviewed by supervisors to identify potential misuse of authority. AI could obscure this process, making it harder to detect and address abuses of power.

Additional risks of automated report writing

While the ACLU report discusses Brady v. Maryland, the 1963 precedent requiring exculpatory evidence to be disclosed to defendants, it does not address the repercussions for officers who submit reports containing false information, who could be deemed untrustworthy and see their careers end. For example, reliance on AI systems can inadvertently introduce clerical errors for which officers are ultimately held accountable. Potential vulnerabilities, such as an AI “skeleton key” that bypasses safety measures, could undermine the security and integrity of police reports by allowing unauthorized data manipulation. Ethical and legal questions also arise around data ownership, access and use of these systems. In some cases, officers might save time by typing and verifying their reports directly rather than reviewing and correcting an automated draft.

In Oklahoma City, Oklahoma, a police sergeant shared their experience with AI-generated reports, noting the AI documented details they did not initially observe, such as the color of a suspect’s car. [1] While this can enhance the accuracy of reports, it also raises concerns about the reliability of AI in capturing all relevant details accurately and the potential for officers to overlook critical information during review.

King County, Washington, took a more cautious approach by issuing a moratorium on AI-generated reports due to concerns about errors and inaccuracies. [2] The King County prosecuting attorney’s office highlighted a case where an AI-assisted report incorrectly included a reference to an officer who was not present at the scene. This error led to an officer being placed on the Brady list. [3] Such incidents underscore the potential risks of relying on AI for critical documentation and the importance of thorough human review.

Transcriptions are not police reports

Transcriptions document what was said during incidents for general purposes, similar to bodycam footage. They are distinct from official police reports, which require analysis, precision and adherence to legal standards. Often, transcriptions do not follow the appropriate format, style and tone needed for police reports. They may include irrelevant or incomplete information and lack a proper structure, potentially causing confusion. Additionally, they may not address specific questions, issues and objectives relevant to police reports and often do not include necessary evidence, reasoning or recommendations. Therefore, they should not be considered police reports.

Transcriptions are useful for documenting routine incidents such as traffic violations and minor complaints, thereby conserving time and resources. They offer a sufficiently accurate record of the events and statements made, which aids in accountability. However, more complex situations, such as arrests and use-of-force incidents, necessitate detailed reports that include analysis and legal justifications — something transcriptions alone cannot provide. It is essential for officers to differentiate between transcriptions and reports and label them appropriately to prevent confusion. Agencies should clearly mark narratives as either LLM-generated transcriptions or officer-authored reports, ensuring they are updated or supplemented as necessary.
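One way to make that labeling requirement concrete is to attach provenance metadata to every narrative in the records system. The sketch below is a minimal illustration of the idea; the `DocType` values and `Narrative` class are hypothetical, not an existing records-management schema.

```python
# Minimal sketch of provenance labeling for narratives, distinguishing
# LLM-generated transcriptions from officer-authored reports.
# The schema is illustrative, not a real records-system format.
from dataclasses import dataclass
from enum import Enum

class DocType(Enum):
    LLM_TRANSCRIPTION = "LLM-generated transcription"
    OFFICER_REPORT = "officer-authored report"

@dataclass
class Narrative:
    text: str
    doc_type: DocType
    reviewed_by: str = ""  # officer who verified the content, if any

    def header(self) -> str:
        """Banner that accompanies the narrative wherever it is shared."""
        label = self.doc_type.value.upper()
        review = " (reviewed by " + self.reviewed_by + ")" if self.reviewed_by else ""
        return "[" + label + review + "]"

n = Narrative("Driver stated the light was yellow.", DocType.LLM_TRANSCRIPTION)
print(n.header())  # → [LLM-GENERATED TRANSCRIPTION]
```

Because the label travels with the document rather than living in an officer's head, downstream readers such as prosecutors and supervisors can tell at a glance which narratives carry the analysis and legal justification of a true report and which are raw machine output.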

Agency obligations

Police agencies must ensure public and officer safety, uphold the law and maintain trust. AI technology for automated transcription can aid these goals but requires careful implementation.

Be open about AI use and its capabilities, limitations and potential biases. Inform stakeholders, including the public, about its impact on report quality and accuracy. Provide channels for feedback, complaints and corrections to ensure continuous improvement and accountability.

Use AI to reduce time on routine tasks, thereby freeing officers for more complex duties and community engagement. This technology can streamline processes such as transcribing audio recordings and filling out forms, significantly improving productivity. Consequently, officers can focus more on crime prevention and proactive policing efforts.

AI can alleviate the stress and fatigue associated with writing lengthy and detailed reports, enhancing officer well-being and reducing liability risks. By automating these routine tasks, officers can concentrate on essential, human-centric aspects of their duties, such as building relationships, offering support and demonstrating empathy to the community. This not only improves officers’ wellness but also fosters a stronger connection between law enforcement and the public.

Conclusion

Automated report writing is a promising technology that can enhance the efficiency and quality of police documentation. However, it also poses risks and challenges that need to be addressed and mitigated. Transcriptions should not be labeled as police reports, as they are different types of documents that serve different functions and audiences. Officers should be aware of the benefits and limitations of automated report writing and use it judiciously.

References

1. Kelly J. Are Oklahoma City police officers really using AI to write reports? What we know. The Oklahoman. September 2024.

2. Harris J. AI-assisted police reports not welcome in King County due to potential errors. KOMO News. September 2024.

3. Bishop T. Internal memo: Don’t use AI for police reports, prosecutor tells Seattle-area law enforcement. GeekWire. September 2024.


Philip Lukens served as the Chief of Police in Alliance, Nebraska from December 2020 until his resignation in September 2023. He began his law enforcement career in Colorado in 1995. He is known for his innovative approach to policing. As a leading expert in AI, he has been instrumental in pioneering the use of artificial intelligence in tandem with community policing, significantly enhancing police operations and optimizing patrol methods.

His focus on data-driven strategies and community safety has led to significant reductions in crime rates and use of force. Under Lukens’ leadership, his agency received the Victims Services Award in 2022 from the International Association of Chiefs of Police. He is a member of the IACP-PPSEAI Committee and Human Trafficking Committee, PERF and NIJ LEADS, and is a Future Policing Institute Fellow. He holds a Bachelor of Science in Criminology from Colorado Technical University. He has also earned multiple certifications, including Northwestern School of Police Staff and Command, PERF’s Senior Management Institute for Police, Supervisor Institute with FBI LEEDA, and IACP’s Leadership in Police Organizations.
