
Another Set of Eyes

VELOCITY V3. 2025 | Carl Ghattas, Todd Kline, and Thong Nguyen

Intelligent tools to augment law enforcement personnel

How will law enforcement officers (LEOs) benefit when certain aspects of Tony Stark’s “Iron Man” suit become viable in real-world scenarios? Rocket-propelled armor may be the stuff of science fiction, but body-worn sensors, ultra-low-power neural compute, and high-resolution augmented reality (AR) waveguide displays are rapidly becoming more powerful. Today, these technologies are available in form factors as compact as a pair of glasses. Tomorrow, they will be the foundation for a platform that enables superhuman-like capabilities in the field.

The convergence of these technologies with AI is spawning a new paradigm of “intelligent” field-use tools that can provide capabilities such as active language translation, context-aware wayfinding, or a heads-up view of criminal activity or tactical support within the vicinity. When integrated into a simple, wearable platform with 5G connectivity, these technologies have the potential to enable LEOs to see the unseen, communicate without needing to speak, and maintain a level of situational awareness (SA) beyond what’s possible today.

However, as with night-vision devices or body-worn cameras, which took years to adopt, significant advances in field-worn technologies don’t happen overnight. Before AI-powered wearables migrate from the laboratory to the field, agencies can take certain steps to prepare themselves and their personnel. It’s imperative to understand how these capabilities can align with mission needs, what needs to be solved before they are deployed for operations, and how AI can reduce complexity and enhance SA for LEOs in the field.

Drowning Under a Tsunami of Data

Today, law enforcement and homeland security personnel face a daily challenge: managing the streams of data that are essential to everyday operations. This data comes to them in different modes (images, social media posts, tips) and from various points of origin (body-worn cameras, law enforcement databases). It is often siloed within organizational boundaries and under different classifications. According to a story published by IT Brew, in June 2024 Justin Williams, the deputy assistant director of the FBI’s information management division, said that the bureau’s Criminal Justice Information Services Division can hold over 30 petabytes of data at any given time.

Federal agencies and local law enforcement are turning to AI to translate this tsunami of data into insights. According to IT Brew, Cynthia Kaiser, the deputy assistant director of the FBI’s Cyber Division, said that the FBI has used natural language processing to analyze tip calls and identify missed information. In August 2024, the Palm Beach County Sheriff’s Office told WPEC CBS 12 that it was rolling out a tool from Axon Enterprise, Inc., that uses generative AI and audio feeds from body-worn cameras to produce drafts of incident reports. These use cases free up time while maintaining human accountability, which is essential to ensuring agencies abide by the laws that protect privacy and the rights of the citizenry and govern the admissibility of evidence in court. They also point to a future where AI can help LEOs better process complex scenarios and reduce cognitive load in the field.

Optimizing Intelligence from the Office to the Edge

While applications of AI in law enforcement today have primarily been directed at office-based tasks, the rise of edge-capable devices raises new possibilities. How could this emerging tech address the unmet needs of LEOs in the field who are often removed from information sources and decision-making hubs? How might wearable AI enhance SA, which is critical for the success of field operations? By closing the gap between LEOs and essential information, this technology could result in distributed decision making and better response effectiveness.

For LEOs in the field, SA is a necessary yet difficult state to achieve. It can easily be impaired in high-pressure circumstances when split-second decisions can be the difference between life and death. Under significant stress, LEOs must perceive environmental cues such as potential threats in unfamiliar settings, understand what those cues mean in context (e.g., delineating between a person exhibiting nervousness and a person exhibiting aggression), and be able to project future scenarios based on current observations.

Expanding individual SA into a common operating picture (COP) is an even more sophisticated challenge. It involves the real-time fusion of insights across both individuals and disparate data sources to provide a unified view while simultaneously orchestrating the distribution of information relevant to each LEO’s specific tasks. This complexity is multiplied in multiagency operations where responders may be guided by different mandates, objectives, and information sources that can make collaboration difficult. In the dynamic and rapidly changing situations that LEOs face in the field, mistakes in relaying critical information in a relevant, timely manner can have dire consequences.

Despite the importance of COP technology in aiding public safety and security response, it is not uncommon to find these systems lacking the functionality that emerging technologies provide. For example, law enforcement agencies may use sophisticated assets such as helicopters and drones to capture valuable visual data. The problem is that once this information is received and distilled by central command (C2), the resulting insights are frequently relayed to field personnel verbally, rather than digitally. Furthermore, the potential for latency and error increases as the information is further propagated across other organizations.

There are meaningful opportunities to build systems that reimagine how data flows from the field to the office, across to other agencies, and back out. These systems must assist in filtering out the “noise” and helping LEOs focus on the essential tasks at hand. The trick isn’t always how fast you can find the needle in the haystack; it’s how efficiently you can remove the enshrouding hay.

A Glimpse of the Possible

Picture a pair of glasses, nondescript in appearance but purpose-built for use by frontline LEOs. When activated and worn, the glasses biometrically authenticate the wearer, providing secure access to advanced capabilities. Mission-specific data, such as suspect profiles, maps, license plates, or building floor plans, is pushed to the glasses, enabling offline operation during gaps in wireless coverage.

Through a crisp, wide display, the LEO sees mission-relevant information overlaid on their field of view. The data ranges from route guidance and traffic flow to the locations of suspects and team members. LEOs are also alerted to threats within the vicinity. During an operation, the glasses visually identify "friend or foe" at range, reducing the chance of friendly fire on fellow on- and off-duty LEOs. The glasses also estimate crowd size and density, identifying key characteristics such as movement patterns. Facial recognition is performed on-device against the preloaded suspect profiles, reducing inefficient network round trips while protecting the privacy of law-abiding citizens. A simple verbal command prompts the glasses to capture and digitally catalog data along with its surrounding context in time and space, information that supports future forensic analysis.
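
To make that on-device privacy tradeoff concrete, the minimal sketch below shows one way matching against a preloaded watchlist could work. The profile format, embedding function, and similarity threshold are illustrative assumptions, not a description of any fielded product.

```python
# Minimal sketch: on-device matching against preloaded suspect profiles.
# All names (FaceProfile, embed_face, MATCH_THRESHOLD) are hypothetical.
from dataclasses import dataclass
import numpy as np

@dataclass
class FaceProfile:
    subject_id: str
    embedding: np.ndarray  # unit-normalized vector produced offline

MATCH_THRESHOLD = 0.6  # assumed cosine-similarity cutoff; tuned per model

def embed_face(face_crop: np.ndarray) -> np.ndarray:
    """Placeholder for an on-device face-embedding model."""
    vec = face_crop.astype(np.float32).ravel()[:128]
    return vec / (np.linalg.norm(vec) + 1e-9)

def match_on_device(face_crop: np.ndarray, profiles: list[FaceProfile]):
    """Compare a detected face only against the preloaded watchlist.

    No imagery or embeddings leave the device, and non-matches are
    discarded, which keeps bystanders out of the loop.
    """
    query = embed_face(face_crop)
    best_id, best_score = None, 0.0
    for profile in profiles:
        score = float(np.dot(query, profile.embedding))  # cosine similarity
        if score > best_score:
            best_id, best_score = profile.subject_id, score
    return (best_id, best_score) if best_score >= MATCH_THRESHOLD else None
```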

With its onboard scene intelligence capabilities, a pair of AI-powered smart glasses continuously maintains a temporal-spatial simulation of the world around it. The glasses stream delta updates of this local “world understanding” to C2, where they are fused with geographic information system (GIS) data and other sources, such as drone and closed-circuit television (CCTV) feeds, to create a live, centralized digital twin of the area of coverage. This provides a unified view along with historical analytics and predictive simulation to support data-driven decisions, and the resulting central state is regularly synchronized back out to LEOs to give each individual real-time, swarm-like insight. A quick attachment of an add-on sensor enables the glasses to detect beyond the visible spectrum, including the presence of chemical, biological, or nuclear material, allowing the LEO to see and instantaneously report those findings. All of this happens without a word being said.
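
A minimal sketch of that edge-to-command sync loop might look like the following. The message fields, transport interface, and class names are assumptions made for illustration, not an actual protocol.

```python
# Minimal sketch: the glasses push compact "delta" updates of their local
# world model to C2 and periodically pull the fused picture back down.
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class WorldDelta:
    device_id: str
    timestamp: float
    detections: list[dict] = field(default_factory=list)  # e.g., {"type": "person", "lat": ..., "lon": ...}

class EdgeSyncClient:
    def __init__(self, device_id: str, transport):
        self.device_id = device_id
        self.transport = transport   # any object exposing send()/receive()
        self.fused_view: dict = {}   # latest COP snapshot from C2

    def push_delta(self, detections: list[dict]) -> None:
        """Send only what changed since the last push, not raw video."""
        delta = WorldDelta(self.device_id, time.time(), detections)
        self.transport.send(json.dumps(asdict(delta)))

    def pull_fused_state(self) -> dict:
        """Refresh the local copy of the centrally fused digital twin."""
        payload = self.transport.receive()
        if payload:
            self.fused_view = json.loads(payload)
        return self.fused_view
```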

The underlying data streams can be captured as a historical record, invaluable not only for after-action reports but also for training new recruits and synthesizing scenarios that have yet to occur.

In addition to this passive assistance, LEOs can actively issue voice queries, asking questions and giving directions such as: What is the last observed location of the suspect? Are there civilians in the area? Get me an aerial view of where 911 calls related to this incident are coming from. What is the quickest and safest route out of the area? With an understanding of both the individual and group contexts, the glasses can respond in milliseconds with answers tailored to the situation at hand. LEOs can also request up-to-date policies and procedures, which can be challenging to recall during or after stressful events.
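
One way to ground such queries in individual and team context is sketched below. The intents, context fields, and keyword routing are hypothetical stand-ins for what would, in practice, be a tuned language model.

```python
# Minimal sketch: routing a spoken query against officer and team context.
from dataclasses import dataclass

@dataclass
class OfficerContext:
    officer_id: str
    location: tuple[float, float]
    active_incident: str | None
    team_positions: dict[str, tuple[float, float]]

def route_query(transcript: str, ctx: OfficerContext) -> str:
    """Map a transcribed request to a hypothetical backend intent."""
    text = transcript.lower()
    if "last observed location" in text and ctx.active_incident:
        return f"lookup:last_sighting:{ctx.active_incident}"
    if "civilians" in text:
        return f"analytics:crowd_presence:near:{ctx.location}"
    if "aerial view" in text and "911" in text:
        return f"map:overlay_911_calls:{ctx.active_incident}"
    if "route" in text:
        return f"nav:safest_egress:from:{ctx.location}"
    return "fallback:forward_to_llm"
```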

The construction and deployment of such a platform lies within the realm of possibility. Commercially available smart glasses such as the Ray-Ban Meta collection, Snap Spectacles, or DigiLens Argo™ feature high-resolution electro-optical sensors and a Qualcomm Snapdragon® processor that can run AI models in parallel with a full-fledged physics-based simulation at 60 frames per second. They have the wireless connectivity for low-latency reachback to secure infrastructure. They can already perform speech-to-text, text-to-speech, and computer vision tasks and stream voice and video. The next generations of these glasses will offer even more. Making them mission-effective will boil down to stitching the underlying technologies together seamlessly to serve a purpose.

Within the context of law enforcement missions—which run the gamut of public safety, including facilitating the lawful movement of people and goods, disrupting illicit activity, or responsively investigating innumerable tips and leads—the primary challenge boils down to data fusion and distribution and the user experience. Any information relayed to a frontline LEO must arrive in a way that highlights relevant context and removes noise without serving as a distraction or a cognitive burden.

To be effective in the field, this platform will have to be built under tight size, weight, and power limits. LEOs will not wear glasses that are fragile, require frequent battery changes, or are tethered to cords that inhibit movement. Many LEOs are already overloaded by gear: weapons, body cams, first aid kits, lights, and more. Updates to the devices, client apps, private cloud microservices, and corresponding infrastructure must be seamless and scalable as utilization increases, making “software-defined everything” (SDE) a necessity. If the platform is not overwhelmingly and obviously effective, adoption will be difficult, if not impossible, to achieve.

Furthermore, in an emergent situation like executing a warrant or pursuing a suspect, the glasses must be able to anticipate what the LEO needs. The integration of multimodal AI will significantly increase the value proposition by directly supporting task execution. For law enforcement missions, glasses that leverage AI’s advanced analytical abilities to proactively identify what a specific situation demands, such as information about the surrounding area or requests for backup, rather than merely waiting for verbal commands, will enhance LEOs’ SA and accelerate decision making in critical situations.

Figure 1: AI-powered wearables allow data to flow seamlessly from law enforcement officers in the field to central command, resulting in greater situational awareness for everyone.

From the Lab to the Field

It’s easy for discussions about emerging technology and its potential mission relevance to spark excitement, but achieving genuine mission transformation requires a far more nuanced and methodical approach. The technology landscape is littered with failed concepts that began with flawed assumptions. It is essential to incubate and develop these products in close partnership with current mission experts to ensure that they meet the practical needs and real-world constraints of field personnel. Iterative user testing and feedback are crucial to build trust, identify and address potential issues, improve usability, and validate core features. Any human augmentation technology employed in support of public safety must adapt fluidly to mission needs and withstand the demands of field use.

Two upcoming events illustrate how AI-powered wearables could help law enforcement surmount their data challenges: the 2026 FIFA World Cup™ and the 2028 Summer Olympic Games. The United States will host both events, and over the course of each, tens of millions of passionate international fans and tourists will converge physically and digitally in a short time span, bringing in billions of dollars in revenue, creating tens of billions of impressions, and attracting a massive surge of global attention.

These spectators and tourists will only add to the already growing tsunami of data LEOs are confronted with. But rather than a broad, gradual uptick, the finite nature of these events will acutely focus this increase in data like water pressure through a fire hose. To effectively investigate any leads, identify and disrupt illicit efforts, and facilitate the safe and lawful movement of people, LEOs will need to possess as close to full SA as possible at all times. Could intelligent field-use tools be the bridge that links these no-fail missions with the data required to execute them successfully when and where needed?

Imagine the following: Thousands of spectators are flowing toward a stadium, and a child becomes separated from her parents. The parents approach a nearby LEO seeking help, but they do not speak English. The LEO’s “smart glasses” provide instant translation, and she radios in the details of the situation. Another LEO maneuvers a drone and locates an isolated child 60 yards from where the parents are. The child’s location is spatially projected to the ground LEO’s glasses, and the LEO then guides the parents through the crowd to where the child is stranded.

On the other side of the stadium, a solitary LEO reports a fight breaking out between inebriated spectators that is too large for one person to handle. Immediately, the officer-in-charge (OIC) in the established command center pulls up a digital twin of the stadium on a screen. Reflected on the display are the real-time locations of every field LEO, each of whom is wearing tech that allows for such precision. The OIC then vectors an appropriate number of personnel to assist in de-escalating the situation while shifting others to cover new areas. The locations and distances of those reporting to assist are now reflected on the reporting officer’s display, thus guiding his plan to engage.

In each of these scenarios, the underlying value of the wearable devices lies in their ability to provide LEOs working in complex environments with access to data in a manner and at a speed that proves decisive in dealing with emergent and regular response requirements. The technologies that power these devices are evolving more rapidly than ever. In the commercial market, many observers are openly wondering if AI-powered glasses will replace earbuds and even mobile phones in the future. The next step for this technology is to find ways to fine-tune its capabilities to help the personnel tasked with maintaining order and public safety execute their missions.

Learning from Night-Vision Devices

Night-vision devices (NVDs) illustrate how a wearable technology can create game-changing advances for law enforcement personnel.

Prior to the adoption of NVDs, law enforcement entities faced significant challenges when operating at night. Lack of visibility brought with it much higher risks. This heavy piece of equipment would not be worth taking into the field if not for its immediate, obvious, and undeniable utility: it gives LEOs the ability to “see in the dark,” which in turn offers an overwhelming advantage.

Put another way, NVDs enable “data dominance.” They let LEOs operating in dark conditions see what was previously unseeable. AI-powered wearables offer a similar value proposition by providing LEOs with access to intuitive data feeds that augment their abilities without increasing their cognitive load.

Key Takeaways

  • Advanced wearable technology, including AI-powered smart glasses, could provide law enforcement officers with superhuman-like capabilities such as real-time translation, threat detection, and enhanced situational awareness.
  • Law enforcement agencies are struggling with an overwhelming volume of data from various sources, and AI could help transform this "tsunami of data" into actionable intelligence in the field.
  • While the technology for these advanced wearables exists, successful implementation will require careful development with user feedback to ensure they enhance rather than burden officers' capabilities.

Meet the Authors

Carl Ghattas

leads Booz Allen’s law enforcement and homeland security business.

Todd Kline

is a strategy and transformation leader within Booz Allen’s law enforcement and immigration market and an adjunct professor at Bethel University.

Thong Nguyen

is a human-centered technologist in Booz Allen’s BrightLabs incubator, focused on advancing emerging technology that will transform future missions.

