For law enforcement officers, the end of a long shift does not mean the work is over. After clearing the perimeter, arresting the suspect, and cataloging the evidence, the real grind begins: writing the report.  

Advances in artificial intelligence (AI) are giving officers that time back. In 2024, after a grueling search for an escaped suspect, Oklahoma City Police Sgt. Matt Gilmore docked his body camera as usual. Within seconds, an AI system processed the entire operation—commands to his K-9, radio traffic, witness statements and all—and generated a draft report. “It was a better report than I could have ever written, and it was 100% accurate. It flowed better,” Gilmore said.

While the technology is clearly useful, it is not always so precise. There are examples of AI mistaking a bag of chips for a gun and even transforming an officer into a frog. Although AI can analyze body camera footage at a speed no human can match, the importance of human review remains central. As the technology continues to advance—both in ways we can imagine and in ways we cannot—striking a balance between public safety, efficiency, and civil liberties will require ongoing vigilance.

The Economics of Body Cameras

Every agency knows that the hidden cost of body-worn cameras is the data, not the hardware. A single encounter can generate hours of video that must be processed, redacted, and stored—yet most of that footage is never reviewed at all.

A recent audit by the Spokane Police Department reveals why. To comply with a change to the Washington Public Records Act, city leaders needed to determine—down to the cent—what to charge for body camera footage requests. Researchers found that “targeted video redaction” (i.e., blurring a specific person or object) requires 11 minutes of staff time per minute of raw footage. At a labor rate of $0.76 per minute of work (based on salary and benefits), the total cost to redact one minute of video comes to $8.36.

Multiply that by millions of minutes across thousands of law enforcement agencies, and the potential for savings is profound.  
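The Spokane figures imply a simple linear cost model. A minimal sketch in Python, using only the two constants from the audit cited above (the function and variable names are illustrative, not from the audit itself):

```python
# Rough redaction-cost model based on the Spokane audit figures:
# 11 minutes of staff time per minute of raw footage, at $0.76 per
# minute of labor (salary and benefits).

STAFF_MIN_PER_FOOTAGE_MIN = 11
LABOR_COST_PER_STAFF_MIN = 0.76

def redaction_cost(footage_minutes: float) -> float:
    """Estimated labor cost (USD) to redact the given minutes of footage."""
    return footage_minutes * STAFF_MIN_PER_FOOTAGE_MIN * LABOR_COST_PER_STAFF_MIN

# One minute of video reproduces the audit's per-minute estimate:
print(f"${redaction_cost(1):.2f} per minute")      # $8.36 per minute
# Scaled to one million minutes of requested footage:
print(f"${redaction_cost(1_000_000):,.0f} total")  # $8,360,000 total
```

Even modest per-minute gains from automated redaction compound quickly at that scale, which is why the savings estimate is so large.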

AI and other technological advances are transforming unmanageable archives into active sources of insight. For understaffed agencies losing deputies to burnout, reducing paperwork is a force multiplier. AI does the busywork so humans can do the police work.

The Connected Officer

We are entering an era of always-on intelligence—a shared digital consciousness connecting every officer, every patrol car, and every data point into a unified, intelligent system.

Axon, the Arizona-based vendor that enjoys a near-monopoly on the body-camera market, recently released the “Axon Body 4,” an internet-connected camera. Going online fundamentally shifts body cameras’ mission from a reactive model of policing to a proactive one. Inside a real-time crime center, AI can break down information silos in search of actionable intel, merging an officer’s body camera feed with other data streams such as doorbell cameras, license plate readers, and even social media. For example, by pulling up an officer’s live view during a foot chase and overlaying it with nearby traffic camera footage, dispatchers can direct the officer to intercept the suspect or head to safety—all in real time.

While these systems focus on looking outward to identify external threats, a new class of AI tools is turning inward to analyze the officers themselves. Some companies offer “AI partner” services that provide contextual prompts to adhere to department procedure. In a threatening situation, an automated training reminder, such as “Slow down, let them speak,” can help officers de-escalate a dangerous situation. While research into AI systems that assess officer conduct has identified benefits to officer professionalism, police unions remain wary of a “digital supervisor” micromanaging officer decisions.

Facial Recognition

A fundamental shift occurs in the balance of power between government and citizens when unconstrained use of facial recognition eradicates the ability to remain anonymous in public. As body cameras shift from post-incident review to real-time awareness, policymakers must weigh the operational benefits against privacy rights.

When The Washington Post revealed that New Orleans police were secretly operating a live, real-time dragnet of the French Quarter, the untargeted nature of the surveillance caused an uproar. This warrantless, off-the-books experiment in AI surveillance bypassed legal guardrails without the knowledge or consent of elected officials. Although the program was shut down, it factored into last year’s city elections, with now-Mayor Helena Moreno speaking out against live facial recognition in a televised debate.

Yet police around the country regularly use AI to identify American citizens. For example, over 3,000 law enforcement agencies are using Clearview AI to match still images taken from surveillance footage or social media against a database of 70 billion facial images.

Despite regulatory and technological safeguards, the pressure to close cases can give investigators tunnel vision, leading them to treat AI-generated leads as established facts. While many agencies have policies against using facial recognition as the sole basis for an arrest, these guardrails are often ignored in practice. According to a Washington Post investigation, at least eight Americans have been wrongfully arrested after AI identified them as criminal suspects.

AI policies should make clear that technology is an investigative tool—not a shortcut around old-fashioned police work. As of early 2025, 15 states had laws regulating facial recognition in policing. So far, California and Utah are the only ones to regulate body-camera AI specifically. Both mandate written disclosures and formal policies defining which AI tools are permitted for which tasks.

The Future of Body-Camera AI

Late last year, Canada’s Edmonton Police Service became the first in the world to pilot body cameras with live facial recognition that alerts officers to individuals with outstanding warrants for serious crimes like murder, aggravated assault, and robbery.

The next generation of body-camera AI will move beyond biometrics toward behavioral prediction. Developers are currently testing systems that can detect signs of escalating conflict, concealed weapons, aggressive body language, emotional distress, and even dishonesty. Theoretically, an agentic AI could access every police dashboard and body camera in a given area to locate a vehicle linked to an Amber Alert.

The ultimate question is not whether AI will be integrated into law enforcement, but how it will be used. From Spokane to New Orleans, the shift is undeniable: Body cameras have transformed from standalone recording devices to individual nodes within a unified smart network. Yet as we grant these devices the power to see, hear, and understand our world, we must remember that power is not the same as justice. An algorithm may be able to process a crime scene in milliseconds, but it takes a human to understand it. The future of public safety depends on keeping that distinction clear.

The Criminal Justice and Civil Liberties program focuses on public policy reforms that prioritize public safety as well as due process, fiscal responsibility, and individual liberty.