Imagine being an officer whose partner doesn’t get sick or distracted. A partner who helps keep their officer’s eyes on the road, evaluates what is happening and then does what is needed. They remind their officer of the upcoming turn in the road, monitor traffic and driving conditions, and stay quiet when those conditions are hazardous. They also nudge the officer to complete reports and finish a log entry.
They are a partner that digs for leads by sorting, cross-referencing and analyzing data from all manner of public, private, cloud and social media sources, then serves up the findings to their partner officer as ready clues. They reduce the time spent on calls, make their officer more efficient and take up no space in the car. They whisper in the officer’s ear and even alert them when fatigue might be a factor. They are the quintessential partner who makes everyone around them that much better.
Sound too good to be true? Perhaps today, but in a few years, a Digital Voice Assistant (DVA) more advanced than today’s Siri or Alexa will perform these functions and more.
Digital voice assistants: From the kitchen to the patrol car
Today, there are an estimated 1.83 billion DVA users worldwide, [1] and “billions of sensor platforms in peoples’ pockets.” [2] Software giant Microsoft sees voice as the next big arena in computer interaction, and indications are that people like using voice to complete tasks and interact with a variety of devices. [3] So how does this future translate to law enforcement?
Pop culture is replete with examples of conversations between “smart” computers and human characters. Television shows like “Star Trek,” “The Jetsons” and “Knight Rider” all show how writers dreamed of speech interaction with “smart” computers. Digital Trends blogger Luke Dormehl wrote, “Since Siri debuted on the iPhone 4s back in 2011, voice assistants have gone from unworkable gimmick to the basis for smart speaker technology found in one in six American homes.” [4] Instead of asking Alexa to update the grocery list, or Siri to remind you when it is time to leave for your next appointment, the DVA would be directed toward the unique activities of the police officer patrolling the community.
In the police car of tomorrow, DVAs could access law enforcement and social media data and update notes on a call for service. They could also remind officers to complete tasks like mandated reporting, tagging video files from body-worn cameras and transcribing reports. Intuitive speech makes interaction with the user up to three times faster than typing, [5,6] while polymorphic characteristics, including personality and intonation, improve likability and adoption by users. [7,8]
Using geography, time and circumstance, the DVA could be programmed to hunt through sources including law enforcement data, public camera feeds and social media postings looking for relevant results. Such access might provide the officer responding to a burglary report the opportunity to review results and have leads before arriving on scene.
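A rough sketch of how such a search might be orchestrated appears below. The data-source functions are hypothetical placeholders (no such interfaces exist today); the point is simply that geography, time and call type drive every query.

```python
from datetime import datetime, timedelta

# Hypothetical stand-ins for whatever records, camera and social media
# systems an agency actually has access to -- none of these are real APIs.
def search_records(call_type, location, since):
    return [{"source": "RMS", "note": "Similar burglary reported two blocks away last week"}]

def search_camera_index(location, since):
    return [{"source": "Cameras", "note": "Vehicle loitering near the address at 02:14"}]

def search_social_media(keywords, location, since):
    return []

def gather_leads(call):
    """Use geography, time and circumstance to pull possible leads for a call."""
    since = call["received"] - timedelta(hours=12)          # look back 12 hours
    leads = []
    leads += search_records(call["type"], call["location"], since)
    leads += search_camera_index(call["location"], since)
    leads += search_social_media([call["type"]], call["location"], since)
    return leads

call = {"type": "burglary", "location": (44.05, -123.09), "received": datetime.now()}
for lead in gather_leads(call):
    print(f"[{lead['source']}] {lead['note']}")
```

In a sketch like this, the responding officer would hear only the aggregated results, with the DVA doing the searching while the car is en route.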
Writing in the Berkeley Technology Law Journal, professors Maurice Stucke and Ariel Ezrachi said, “the digital assistant sweeps in significant amounts of data that can ‘reveal much more in combination than any isolated record.’” [9] This suggests that disparate data sources that might otherwise elude review could be combed through and linked together for criminal investigations.
How digital voice assistants work
Digital voice assistant, virtual assistant and digital personal assistant are all names used for the same technology; this article uses DVA.
DVAs work through a conversational speech interface and cloud-based artificial intelligence. The speech interface replaces the keyboard or touch screen usually associated with a computer, while the cloud-based AI interprets the speech and commands given by the user. [8,10]
DVAs constantly “listen” for their unique activation word and then perform the next command they hear. [11] For instance, “Ok Google” activates the Google Assistant, which can then process a command like, “Add bread to my grocery list.” The Google Assistant processes the verbal command and replies with something like, “Okay, I’ve added bread to your grocery list.” This spoken confirmation ensures the user knows the command was both understood and executed in the absence of a visual interface. [12]
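As a minimal illustration of that listen-interpret-confirm loop, the sketch below uses placeholder functions for the speech recognition and cloud AI pieces; it is not how any vendor’s assistant is actually implemented.

```python
WAKE_WORD = "ok google"          # each assistant listens for its own activation phrase

def transcribe(audio):
    """Placeholder for speech-to-text; here the 'audio' is already a string."""
    return audio.lower()

def interpret(command):
    """Placeholder for the cloud AI that maps a spoken command to an intent."""
    if command.startswith("add ") and command.endswith(" to my grocery list"):
        return {"intent": "add_grocery_item",
                "item": command[len("add "):-len(" to my grocery list")]}
    return {"intent": "unknown"}

def speak(text):
    """Placeholder for text-to-speech; the spoken reply confirms execution."""
    print(f"ASSISTANT: {text}")

def handle(audio):
    utterance = transcribe(audio)
    if not utterance.startswith(WAKE_WORD):
        return                                   # ignore anything without the wake word
    result = interpret(utterance[len(WAKE_WORD):].strip(", "))
    if result["intent"] == "add_grocery_item":
        # ...the real assistant would update the list in the cloud here...
        speak(f"Okay, I've added {result['item']} to your grocery list.")
    else:
        speak("Sorry, I didn't understand that.")

handle("Ok Google, add bread to my grocery list")
```

A patrol-car version would follow the same loop, only with law enforcement intents in place of the grocery list.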
DVAs interact with data sources including the Internet and private data repositories, while simultaneously connecting to sensors throughout their environment. [4,13] An example of these sensors in a home could include a “smart” thermostat, whereas a patrol car DVA might be connected to proximity sensors on the car, CAD software and the law radio.
Microsoft foresees that digital assistants will not be limited to our smart speakers or phones, but will be “multi-device and multi-sensory.” [3] Microsoft researchers Christi Olson and Kelli Kemery write, “We will jump from talking to our phone to skyping on our car windshield to viewing our smart speaker screen without losing the logic of a session.” [3]
However, today’s DVAs are not up to law enforcement tasks, so where do they need to improve?
Artificial intelligence needs to be smarter for law enforcement use
A future with DVAs in your home to turn down the lights, manage grocery lists and tell you what’s playing on streaming media does not make the technology ready for prime time in law enforcement. “The age of touch as the primary user interface is giving way to a new age of voice and digital assistants,” wrote Olson and Kemery. [3] But natural language conversations are still beyond the capability of current DVAs. [6] With some notable exceptions like Google Duplex, most DVAs are incapable of fooling us into believing they’re human. Thus, the future will require additional development for specific use cases like law enforcement. [4]
Because the driving force behind DVAs is the artificial intelligence engine that supports conversations with users, development in this area will improve natural language conversation. That should also mean fewer misunderstandings between the DVA and users, which today sometimes result in inadvertent commands or mistaken recordings. [11,14] As artificial intelligence becomes “smarter,” features will develop that efficiently deliver information to the officer filtered to a best answer, much like “the best restaurant near me.” [9] Additionally, the security and confidentiality of transactions will need to improve, and distraction in the patrol car will need to be reduced.
Security needs improvement
Security of audible transactions requires improvement, including two or more factors of authentication. [14] Single-factor authentication using just a voice “activation” word is insufficient in a law enforcement environment. [15] Other security options like voice print mapping have challenges because emotion, fatigue and other factors all cause voice patterns to vary. [14] One two-factor approach suggested in the research was a device worn on the user’s wrist that matches physical vibrations with voice commands. [14] In another example, Apple’s iPhone requires the user to unlock the phone when using Siri to update a calendar event, thereby ensuring authentication. Once authenticated, though, how do the police keep their conversations with their DVA private?
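Before turning to that question, here is a rough sketch of the wrist-sensor idea described above. Both checks are hypothetical placeholders; the point is simply that a sensitive query should clear two independent factors before it runs.

```python
def voice_matches_enrolled_officer(audio_sample, enrolled_voiceprint):
    """Factor one (something you are): placeholder voiceprint comparison."""
    return True   # a real system would compare acoustic features, not return a constant

def wrist_vibrations_match_speech(audio_sample, wrist_sensor_window):
    """Factor two (something you have): placeholder check that vibrations captured by a
    wrist-worn device line up in time with the spoken command, as the research suggests."""
    return True

def authorize_query(audio_sample, enrolled_voiceprint, wrist_sensor_window):
    # Require BOTH factors before the DVA runs a law enforcement query.
    return (voice_matches_enrolled_officer(audio_sample, enrolled_voiceprint)
            and wrist_vibrations_match_speech(audio_sample, wrist_sensor_window))
```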
Voice confidentiality is essential
Conversations with a DVA are clearly audible. This creates challenges when the wrong person overhears confidential data provided to an officer by the DVA. Users surveyed in one study were reluctant to use voice commands in public places, specifically because they could be overheard. [10]
Timing verbal responses to the officer for the appropriate moment, or using some type of earpiece to prevent widely audible broadcast, might limit unauthorized listening. Earpieces could also double as an effective audio pickup in the car environment. In that case, it would be essential to preserve the officer’s ambient hearing, even though operators of emergency vehicles are generally exempt from prohibitions on wearing earphones in both ears. Listening to the DVA, even privately, is one more way an officer could be distracted. Because the DVA is an added and competing source of information, designers must develop ways to limit its opportunity to draw attention away from an officer’s surroundings.
Mobility of DVAs and reducing distraction in the car
Moving interaction from screens and keyboards to voice will be more distracting for officers in the car. Voice communication adds a cognitive load similar to that of a cellular telephone conversation [16] and greater than what is required to process visual instructions on a computer screen. [12] To avoid overburdening officers, the DVA will need to be aware of driving conditions, law radio traffic and MDC traffic, among other factors. [16]
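One way to picture that awareness is sketched below: a simple filter that holds routine messages while the vehicle is in a high-cognitive-load state. The vehicle-state inputs and thresholds are hypothetical, chosen only for illustration.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class VehicleState:
    lights_and_siren: bool = False
    speed_mph: float = 0.0

@dataclass
class Message:
    text: str
    urgent: bool = False

@dataclass
class DistractionFilter:
    held: List[Message] = field(default_factory=list)

    def high_load(self, state: VehicleState) -> bool:
        # Treat a lights-and-siren run (or high speed) as a high-cognitive-load situation.
        return state.lights_and_siren or state.speed_mph > 60

    def deliver(self, msg: Message, state: VehicleState) -> None:
        if msg.urgent or not self.high_load(state):
            print(f"SPEAK NOW: {msg.text}")
        else:
            self.held.append(msg)            # hold routine traffic until the load drops

    def release_held(self, state: VehicleState) -> None:
        while self.held and not self.high_load(state):
            print(f"SPEAK (deferred): {self.held.pop(0).text}")

dva = DistractionFilter()
code3 = VehicleState(lights_and_siren=True, speed_mph=70)
dva.deliver(Message("Records clerk asks about yesterday's report"), code3)
dva.deliver(Message("Suspect vehicle last seen northbound on Main", urgent=True), code3)
dva.release_held(VehicleState())             # routine message is spoken once the run ends
```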
While current smartphone providers are working to limit driver distraction, giving situational awareness to the DVA appears to be the answer. In this way, certain layers of communication could be delayed during high-cognitive-load situations. [16] This should minimize the impact DVAs have in the police car and increase driving safety. [16] For example, the DVA might postpone messaging between the officer and other users on a mobile data platform while the car is driven with emergency lights and siren. Even as distractions are minimized and confidentiality and security are achieved, transparency and legal issues must still be addressed if DVAs are used in law enforcement.
Legal standards: Transparency, discovery and responsibility
Several legal conversations need continued development to facilitate the future use of DVAs.
The first revolves around the development of DVA intelligence. The most effective way for DVAs to develop in a new environment involves machine learning, whereby DVAs use a large data set to program themselves. [17] Law enforcement must work with developers to build a clear standard for such learning.
With a clean, unbiased, and scrupulously reviewed data standard, [18] developers will position DVAs for transparency that supports clear and unambiguous public review. [19] As Ali Hashmi and his co-researchers wrote, “…as we move towards a more AI-based ubiquitous computing environment, the right to explanation will become increasingly important in the legal … domains.” [18] Furthermore, public trust will require a clear understanding of who the responsible party is in decision-making. While there will likely be a tendency to lean toward convenience and just assume the DVA is correct in all decisions, [9,17] it will be important to decide who is making the final choice.
Much like in today’s world, information provided from a law enforcement data repository to an officer in the field is considered unverified until the source agency confirms its validity. [20] When broad, unrefined responses come back from a records search, the ultimate authority over what decision is made currently rests with the officer. We will need to consider the legal ramifications if DVAs do the research and provide officers a filtered response to review, much like the “best restaurant near me” example. [9]
Where do we go from here?
Articles and research demonstrating how DVAs and artificial intelligence may infringe on our privacy, democracy and economy are common. In his article “How AI could kill off democracy,” Jamie Bartlett, director of the Centre for the Analysis of Social Media, wrote, “As machines get smarter, they will continually produce good, money-saving solutions compared to human decisions, which will further establish their importance in our lives... A government with a machine advising it to allocate police resources to save money and cut crime would be hard to resist.” [17] Such an opportunity for efficiencies and potential reductions in cost [19,21] must, however, be ushered forward despite the challenges DVAs create.
The major manufacturers of DVAs like Amazon, Apple, Google and Microsoft all continue to push into the space and may partner with law enforcement. However, the products they produce are often focused on e-commerce rather than specific use cases like law enforcement.
Other companies, like Nuance Communications, have focused on specific use cases such as legal or medical needs. Nuance has a DVA tailored to the medical field that cross-checks drug interactions when a new medicine is prescribed for a patient. [4] Both Google and Apple have application programming interfaces that allow other programs to leverage the DVA function of the smartphone. Through one of these avenues, a new DVA will emerge that possesses the focused functionality law enforcement needs. As that emergence occurs, effort should be made to survey the legal landscape and to guide, rather than stifle, legislative progress.
We should also learn from the fear demonstrated in San Francisco in May 2019, when the city’s Board of Supervisors enacted a law banning facial recognition, another artificial intelligence tool. [22] The city moved preemptively to eliminate the technology because of inaccuracies in the software, without considering what other safeguards existed or what other tools might be impacted. Interestingly, San Francisco had to reverse course later that year when it discovered the new law prohibited the use of iPhones the city had purchased for its employees. [22]
Because DVAs have the potential to positively interact with so much, police and sheriff’s departments should consider the following course of action:
- When evaluating future law enforcement software companies, it is essential to ask how they propose to manage court discovery, machine learning and distracted driving.
- Additionally, vehicle purchasing decisions should take into account whether DVA interfaces and sensors are available, or when they are expected to become available.
- Simultaneously, we should work with legislators, manufacturers, and industry experts to develop reasonable standards for each of the issues addressed and others that will ultimately emerge.
We should be deliberate in asking for the elements of DVA technology needed for future success. At the same time, we should be mindful of the objections to this technology and of the potential for vendors to ignore some of its more essential aspects when moving forward with implementation.
Conclusion
While many topics play a part in this law enforcement future, voice is fast, voice is intuitive, and DVAs appear likely to become increasingly effective in the coming years. [1] As with most public sector technologies, someone will inevitably insert consumer DVA technology into the law enforcement universe. Thus, the issue is not whether DVAs will creep into law enforcement, but when.
References
1. Mauze B, et al. Voice Assistants - The New Glass! Capgemini.com, 2018.
2. Livingston S, Risse M. The Future Impact of Artificial Intelligence on Humans and Human Rights. Carnegie Council for Ethics in International Affairs, 2019, 33:2, 141-159.
3. Olson C, Kemery K. Voice Report: From answers to action: customer adoption of voice technology and digital assistants. Microsoft.com.
4. Dormehl L. 5 Ways that future A.I. assistants will take voice tech to the next level. Digitaltrends.com.
5. Ruan S, et al. Comparing Speech and Keyboard Text Entry for Short Messages in Two Languages on Touchscreen Phones. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2017, 1:4, Article 159.
6. Requejo D. How Voice User Interface is taking over the world, and why you should care. Medium.com.
7. Wagner K, Nimmermann F, Schramm-Klein H. Is it Human? The Role of Anthropomorphism as a Driver for the Successful Acceptance of Digital Voice Assistants. Proceedings of the 52nd Hawaii International Conference on System Sciences, 2019, pp. 1386-1395.
8. Braun M, et al. At Your Service: Designing Voice Assistant Personalities to Improve Automotive User Interfaces: A Real-World Driving Study, 2019. CHI Conference on Human Factors in Computing Systems Proceedings.
9. Stucke ME, Ezrachi A. How Digital Assistants Can Harm Our Economy, Privacy, And Democracy. Berkeley Technology Law Journal, 2018, 32:3, 1239-1299.
10. Liao Y, et al. Understanding the Role of Privacy and Trust in Intelligent Personal Assistant Adoption. Information in Contemporary Society iConference, 2019, Vol 11420.
11. Smith CS. Alexa and Siri Can Hear this Hidden Command. You Can’t. New York Times.
12. Santos ME. Design guidelines for voice user interfaces: Revisiting 10 usability heuristics for speech interaction. Medium.com.
13. Ammari T, et al. Music, Search, and IoT: How People (Really) Use Voice Assistants. ACM Transactions on Computer-Human Interaction, 2019, 26:3, Article 17.
14. Lei X, et al. The Insecurity of Home Digital Voice Assistants - Amazon Alexa as a Case Study. Cornell University.
15. U.S. Department of Justice, FBI, CJIS. Criminal Justice Information Services (CJIS) Security Policy, version 5.8.
16. Gaspar JG, et al. Providing Views of the Driving Scene to Drivers’ Conversation Partners Mitigates Cell-Phone-Related Distraction. Psychological Science, 2014, 25: 2136-2146.
17. Bartlett J. How AI Could Kill Off Democracy. newstatesman.com.
18. Hashmi A, et al. AI Ethics: The Next Big Thing In Government. WorldGovernmentSummit.org.
19. Government Office for Science, UK. Artificial Intelligence: Opportunities and Implications for the Future of Decision Making.
20. California Commission on Peace Officers Standards and Training. Basic Course Workbook Series Student Materials Learning Domain 36 Information Systems Version 3.6. Post.ca.gov.
21. Tinholt D, et al. Unleashing the potential of Artificial Intelligence in the Public Sector. Capgemini.com.
22. Sapra B. San Francisco is changing its facial recognition ban after it accidentally made the iPhones it gave to city employees illegal. www.businessinsider.com.