The intersection of technology and policing is a fraught issue.
In the hands of civilians, it has clearly been a tool for bringing injustices to light and holding police accountable. George Floyd’s death likely would have gone unremarked if a bystander hadn’t been filming during those nine fatal minutes.
But the flip side of technology is the way that it can be used by police to target minorities or crack down on protest and dissent.
In the 25 years since the New York Police Department kicked off the modern era of technology-driven policing with its CompStat crime-tracking software, reform advocates argue, police have repeatedly used technology to target marginalized communities.
For example, the rise of smartphones, whose ubiquitous cameras have transformed police accountability efforts, has also allowed departments to track users through so-called geo-fencing technologies.
This summer, federal and state police departments used geo-fencing to collect protestors’ personal information.
We must recognize that while technology can be a boon to police accountability, it also plays a role in shaping the culture that uses it.
Perhaps nothing better illustrates the double-edged nature of policing technology than facial recognition software. Protesters were enraged when federal officials and police departments in D.C. and other cities started applying facial recognition software to social media to identify protesters.
But protesters struck back by using facial recognition themselves, this time to identify police officers who hid their nameplates and badges while making arrests. This technological arms race between officers and the public breeds mutual distrust, further widening the gap between law enforcement and those they serve.
But it is too simple to conclude that technology is good in the hands of civilians and dangerous in the hands of police. In fact, reformers discount the ways technology might both head off police misconduct before it occurs and increase police accountability when it does.
At the end of the day, technology is a tool that makes tasks, good or bad, easier.
Think back to George Floyd’s tragic death. One of the awful ironies of that event was that several of the officers involved were trainees. One of them, J. Alexander Kueng, was on just his third shift after being hired.
Would those trainees have acted differently had no camera been rolling? Could that horrifying incident have ended differently if a trainer, watching a live feed from the officers’ body cameras, had been able to intervene and stop Derek Chauvin at some point in those nine minutes?
That technology exists; it’s simply a matter of money and will to implement it.
Likewise, in a spin on the CompStat approach, an increasing number of larger departments are using “early warning” systems to identify officers who attract a disproportionate number of citizen complaints or misconduct reports.
These data-driven systems pinpoint the small number of problem officers who, as every police chief knows, drive a huge percentage of citizen complaints. For many of these officers, early intervention and mentorship can reduce complaints and prevent gross misconduct later, but departments often lack the savvy or capacity to pick out troubling patterns about individual officers from the statistical noise.
That’s where early warning systems help to replace intuition and gut feelings about “problem cops.” They can work when they are paired with swift intervention: a study from the National Institute of Justice found that early warning systems “have a dramatic effect on reducing citizen complaints and other indicators of problematic police performance among those officers subject to intervention.”
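The core idea behind these systems can be illustrated with a deliberately simplified sketch: flag any officer whose complaint count sits far above the departmental norm. The data and threshold below are hypothetical, and real early warning systems weigh many more signals (use-of-force reports, incident types, peer and shift comparisons), but the statistical intuition is the same.

```python
from statistics import mean, stdev

def flag_officers(complaints, z_threshold=2.0):
    """Flag officers whose complaint counts are statistical outliers
    relative to their peers. A toy sketch of an early-warning check,
    not a real departmental system."""
    counts = list(complaints.values())
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:  # everyone has identical counts; nothing stands out
        return []
    return [officer for officer, n in complaints.items()
            if (n - mu) / sigma > z_threshold]

# Hypothetical complaint counts per officer over one year
data = {"A": 1, "B": 0, "C": 2, "D": 1, "E": 14, "F": 2, "G": 1}
print(flag_officers(data))  # prints ['E']
```

Here officer “E” draws roughly seven times the median number of complaints, so even a crude z-score test surfaces them, which is exactly the kind of pattern a busy supervisor eyeballing paper files would likely miss.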
To be sure, technology is not a panacea for everything that ails policing in America. Even the much-hyped expansion of body-worn cameras has not delivered all the hoped-for benefits: there is little evidence that cameras alone have done much to reduce police misconduct.
Part of the issue is funding: departments that don’t invest in storing and analyzing footage blunt their cameras’ effectiveness. One dramatic example of how capacity limitations stymie accountability is Zachary Wester, the Florida officer who arrested dozens, perhaps hundreds, of innocent drivers on drug charges.
Wester’s body camera clearly showed him planting drugs in a number of instances, but backlogs in producing and processing video caused many defendants to plead guilty to false charges before a local prosecutor became suspicious and reviewed hours of footage that revealed the fraud.
Wester now faces criminal prosecution and dozens of federal lawsuits, and hundreds of convictions have been vacated, but that came too late to stop many people’s lives from being ruined by false convictions.
When we talk about “fixing” police culture, we need to start not by painting all officers with a broad brush, but by identifying the early red flags of police misconduct and acting before things spiral out of control.
If left unnoticed or simply ignored, misconduct can create its own toxic culture. Depending on how such tools are used, technology can further ingrain that culture or help to reverse it.
This is why we must take a harder look at exactly how we are using technology, not just whether it is being used.
Technology has to be paired with resources, training, and most importantly, a willingness to use it for good if it is to be a force for police accountability.
But that makes it no different from any other tool, which can be used for good or ill, effectively or not. Where there’s a will, technology can help find the way.
- “CompStat crime-tracking software”: https://compstat.nypdonline.org/
- “collect protestors’ personal information”: https://www.nbcnews.com/news/education/unc-campus-police-used-geofencing-tech-monitor-antiracism-protestors-n1105746
- “D.C.”: https://www.msn.com/en-us/news/us/facial-recognition-used-to-identify-lafayette-square-protester-accused-of-assault/ar-BB1aCYfm?ocid=msedgntp
- “facial recognition software”: https://onezero.medium.com/facial-recognition-is-law-enforcements-newest-weapon-against-protestors-c7a9760e46eb
- “identify police officers”: https://www.nytimes.com/2020/10/21/technology/facial-recognition-police.html
- “third shift”: https://www.nytimes.com/2020/06/27/us/minneapolis-police-officer-kueng.html
- “dramatic effect on reducing citizen complaints”: https://www.ncjrs.gov/pdffiles1/nij/188565.pdf
- “a local prosecutor became suspicious”: https://www.tallahassee.com/story/news/2018/09/29/prosecutor-who-sparked-jackson-drug-planting-probe-resigns-whistleblower/1441015002/
- “being ruined by false convictions”: https://www.tallahassee.com/story/news/local/2019/07/13/drug-planting-probe-florida-zach-wester-arrest-victims-justice-drugs-meth-jackson-county-arrest/1703423001/