Computers are once again learning something that humans have long understood: Putting a name to a face can be an elusive endeavor. Indeed, as facial recognition software is added to everything from iPhones to airport security, the problems that flawed technology could cause in the hands of law enforcement have been the source of great concern. However, this software will improve, and that improvement should bring another adage to mind: A little knowledge is a dangerous thing.

Right now, much of the criticism of facial recognition technology centers on the inaccuracy of the algorithms currently available. For example, facial recognition has been shown to be much less accurate for women and people of color than for white men. Likewise, when the ACLU conducted a test of Amazon’s facial recognition software, “Rekognition,” the software incorrectly matched 28 members of Congress to people who had been arrested. The false matches were disproportionately people of color. The implications this could have for policing are manifest and manifold.

Undoubtedly, the makers of this technology and its potential users must contend with this issue right now. But in a world where technology has consistently advanced in accuracy, it is easy to imagine a future where the algorithms work almost perfectly. That future should concern us, particularly as it affects the roughly one in three Americans with some sort of criminal record. When law enforcement can instantly identify a person and look up their criminal record, those individuals will inevitably be targeted, despite having served their time.

Some level of anonymity is critical to reentry. Facial recognition would give law enforcement instantaneous access to stigmatizing information, with significant consequences. With facial recognition, officers will be able to single out justice-involved individuals, making them more likely to draw attention and be stopped. Even a perfect facial recognition tool in imperfect hands can lead to unjust outcomes.

Individuals, including returning citizens, should have a right to biometric privacy. Privacy and anonymity are inherently valuable to the development of individuals in society: they give us space to develop an identity separate from surveillance and to transform and reinvent ourselves. Georgetown University law professor Julie Cohen writes, “Privacy is shorthand for breathing room to engage in the process of … self-development.” And if that process of self-development is necessary for the average citizen, it is even more vital for individuals returning to society with a criminal record.

Privacy is uniquely harmed when biometric information becomes readily available. Transforming one’s facial geometry into machine-readable data points eliminates the ability to choose when to disclose one’s identity. The face becomes another piece of information that others can control and that, if exposed in a data breach, cannot be changed the way a credit card number or password can. A series of experiments at Carnegie Mellon University found that “the convergence of face recognition, online social networks, and data mining has made it possible to use publicly available data and inexpensive off-the-shelf technologies to produce sensitive inferences merely starting from an anonymous face.”

Individuals with criminal records are already restricted from participating fully in civic life in a number of ways: from living in certain types of housing to obtaining loans for post-secondary education to voting. We know that their employment and reintegration into society improve public safety by reducing the chances they will reoffend. However, by further marking these individuals through facial recognition, their ability to reintegrate and move beyond their past is hindered even more.

Employers, despite the best of intentions, are unduly influenced once they are aware a person has a criminal record. The law, in response, has attempted to give returning individuals a second chance by shielding this information. Expungement laws are structured to give individuals a clean slate, so that they are not demonized for minor crimes or arrests that occurred sometimes decades in the past.

A world where innocents are marked as guilty because of a flawed algorithm is problematic, but a world where a guilty person can never be anything but a criminal is equally so. The law must respond through both legislative and judicial mechanisms. Unfortunately, the few states that have enacted biometric privacy laws have made exceptions for law enforcement. A handful of cities have more directly addressed the law enforcement surveillance problem, but these efforts remain few and far between.

Jurisprudence has also shifted in recent years to respond to changes in technology. Following a person around on public thoroughfares is legal, but the Supreme Court has concluded that collecting the same information via a GPS device constitutes a search under the Fourth Amendment. Watching a person make cell phone calls is legal, but earlier this year, the Supreme Court ruled that the government must obtain a warrant in order to access historical cell-site records. Chief Justice Roberts, writing for the majority, argued that “this tool risks government encroachment of the sort the Framers, after consulting the lessons of history, drafted the Fourth Amendment to prevent.”

Facial recognition may not look like a search in the traditional sense, but the intrusion is nonetheless real. We must grapple with the collateral consequences of facial recognition software, particularly for the formerly incarcerated seeking a second chance.
