Trending Topics

AI in policing: What tech mogul Larry Ellison gets right and wrong

Oracle’s chair and CTO Larry Ellison recently predicted that “every police officer is going to be supervised at all times,” raising questions about balancing enhanced accountability with privacy concerns


Larry Ellison, Oracle’s chair and CTO, recently shared some thought-provoking insights on how the tech industry will impact policing.

Mark J. Terrill/AP

At Oracle’s financial analyst meeting on September 12, 2024, Larry Ellison, Oracle’s chair and CTO, shared thought-provoking insights on the tech industry and Oracle’s strategy. While many of his remarks were accurate, some require further examination, especially as they apply to policing in the United States, where governance varies widely by state and locality.

Agreeing with Ellison

Ellison emphasized the transformative potential of Oracle Cloud Infrastructure (OCI), highlighting its performance, security and cost efficiency. He also noted AI’s dependency on vast amounts of data, a point I agree with: in policing, more data enhances AI’s ability to identify patterns, learn and make accurate predictions.

I continue to assert that AI on the back end, freeing humans to interact with the public, is essential. Ellison highlighted Oracle’s system design, suggesting that minimizing human labor reduces errors and increases security, similar to secure online transactions versus physical credit card use. He also emphasized the vital role of the company’s IoT framework in modern policing, enabling real-time data collection, analysis and response for effective police operations.

Ellison also highlighted the potential of Oracle’s IoT framework. Among the many police functions ripe for automation, I see it playing a role in resource management, where smart inventory systems can track and maintain equipment, ensuring optimal condition and availability.

Ellison’s argument for quicker, more affordable autonomous systems and data centers is also compelling. It supports my belief in the importance of transparent data dashboards and autonomous back-end systems in police agencies, which liberate personnel from administrative tasks and free them to increase community engagement.

Debunking Ellison’s statements

However, not all of Ellison’s claims withstand scrutiny. Despite technological advancements, police must uphold constitutional rights, even when doing so is less efficient or seems less safe. Ellison asserted that cameras and AI in schools would enhance safety, but the Fourth Amendment and school policies against recording children complicate matters. Some schools have cameras, yet police cannot always access or use the footage, adding further complexity.

Ellison neglected to highlight the crucial role of accurate data within the system, which is fundamental before considering how AI can be beneficial. The integrity and diversity of data determine the effectiveness and fairness of AI systems in policing: proper data management ensures that AI algorithms do not perpetuate existing biases but instead promote accurate and equitable outcomes. Consider predictive policing, where AI algorithms analyze historical crime data to forecast future criminal activity. If that data is skewed or biased, the system can disproportionately target certain communities. For example, if historical data reflects over-policing in specific neighborhoods, the AI system may continue to predict higher crime rates in those areas, perpetuating a cycle of surveillance and enforcement, a concern that surfaced in a project I have been working on with the National Academies.
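This feedback loop can be illustrated with a minimal sketch. The model below is a hypothetical toy, not any vendor’s algorithm: two areas have the same true crime rate, but one starts with a biased historical record, and patrols are allocated in proportion to recorded incidents. Because more patrols detect more incidents, the biased record sustains itself.

```python
import random

random.seed(42)

# Toy model: two neighborhoods with the SAME underlying crime rate.
# Area 0 starts with more recorded incidents (historical over-policing);
# each year, patrols are allocated in proportion to the recorded counts.
TRUE_RATE = 0.1             # identical true incident rate in both areas
DETECTION_PER_CHECK = 0.5   # chance a patrol check observes an incident
recorded = [50, 10]         # biased historical record
patrols_total = 10

for year in range(10):
    total = sum(recorded)
    patrols = [round(patrols_total * r / total) for r in recorded]
    for area in (0, 1):
        # More patrols -> more detected incidents, even at equal true rates.
        for _ in range(patrols[area] * 100):
            if random.random() < TRUE_RATE * DETECTION_PER_CHECK:
                recorded[area] += 1

share_0 = recorded[0] / sum(recorded)
print(f"Share of recorded incidents in area 0: {share_0:.2f}")
```

Even though both areas generate incidents at the same true rate, area 0’s share of the recorded total stays far above one-half, because the allocation rule keeps sending patrols where the biased record points.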

Ellison discussed Oracle body cameras, noting that while officers can request the camera be “off” during bathroom or lunch breaks, it always records; the footage would only be accessible with a court order. This implies officers have no reasonable expectation of privacy, which could be a concern if unions or agencies currently deploying the platform are unaware of it. Additionally, many departments allow members of the public to request that officers turn off their body cameras for various reasons; if the cameras are never actually off, officers risk eroding public trust.

AI can assist in video monitoring in various ways, such as supporting street staff, transcribing footage and ensuring accountability, which can improve safety and efficiency. However, some studies indicate that body cameras don’t significantly affect use of force, so we should be cautious about assuming constant monitoring will suddenly make a difference.

Ellison suggested that AI will supervise every officer, flagging issues as they arise. Such oversight must be balanced with respect for individual rights and oriented toward mentorship. Studies indicate that administrative oversight contributes to stress and staffing shortages, underscoring the need for balanced leadership.

Ellison stated that drones can respond faster than police cars, adding value by gathering information. But drones don’t replace officers, who still must respond and intervene. Additionally, callers’ cell phone cameras can provide instant live feeds during calls, further aiding information gathering without raising Fourth Amendment concerns.

Using autonomous drones to assist in tasks like spotting forest fires or detecting arson can be challenging, though sometimes feasible. AI-powered cameras already monitor for fires, and drones could be deployed autonomously for specific missions in the future. However, privacy concerns, as highlighted in Leaders of a Beautiful Struggle v. Baltimore Police Department, mean police departments must carefully weigh AI deployment against a risk matrix. In that case, the en banc Fourth Circuit held that Baltimore’s Aerial Investigation Research (AIR) program violated the Fourth Amendment, because its sustained aerial surveillance allowed police to retrospectively track individuals’ public movements across the city.

I recently presented at the 2024 NIJ Research Conference on AI in Policing, emphasizing the balance between technological advancements and ethical considerations. I urged developers and law enforcement to adopt a human-centered approach, safeguarding constitutional rights and ensuring AI enhances rather than replaces human interaction in policing.

While AI offers substantial benefits like enhanced situational awareness, improved response times, and valuable data analysis, we must avoid over-reliance on technology. There is a fine line between promoting best behavior and creating a chilling effect where people alter their actions out of fear. Over-monitoring can infringe on personal freedoms and stifle creativity. It’s vital to ensure AI aids human judgment while respecting constitutional rights and upholding public trust.

Philip Lukens served as the Chief of Police in Alliance, Nebraska from December 2020 until his resignation in September 2023. He began his law enforcement career in Colorado in 1995. He is known for his innovative approach to policing. As a leading expert in AI, he has been instrumental in pioneering the use of artificial intelligence in tandem with community policing, significantly enhancing police operations and optimizing patrol methods.

His focus on data-driven strategies and community safety has led to significant reductions in crime rates and use of force. Under Lukens’ leadership, his agency received the Victims Services Award in 2022 from the International Association of Chiefs of Police. He is a member of the IACP-PPSEAI Committee - Human Trafficking Committee, PERF, NIJ LEADS and Future Policing Institute Fellow. He holds a Bachelor of Science in Criminology from Colorado Technical University. He has also earned multiple certifications, including Northwestern School of Police Staff and Command, PERF’s Senior Management Institute for Police, Supervisor Institute with FBI LEEDA, and IACP’s Leadership in Police Organizations.
