Automating Our Justice System to Ensure Justice for All
Courthouses, legal aid offices, jails and prisons are obvious potential hot spots for the spread of COVID-19. So, it’s no surprise that many courthouses and legal aid offices have shut down, all but the most necessary hearings and processes have been postponed, and visitation in jails and prisons has become next to impossible.
Yet, the need for these services hasn’t magically disappeared.
As we do our best to protect petitioners, court staff and the greater public, we are reminded of how crucial technological innovations have become to our daily lives — not only increasing convenience and speed but even allowing these processes to occur amid pandemic restrictions and closures.
While COVID-19 offers an opportunity to enhance automation and technology across the board, this moment also offers us a chance to pause and ensure that the technological strides we implement improve access to justice for all.
Technology can assist individuals in many areas of the law and will be much needed in the coming months. During the recession a decade ago, civil caseloads — consumer credit, evictions, foreclosures and domestic relations — leapt by 1.5 million cases.
Our current economic downturn is even more severe.
States like Michigan and New York have lifted — or will soon lift — the eviction moratoriums they implemented at the onset of COVID-19. And the additional $600-per-week unemployment benefit ended July 31. Many of the affected individuals will need legal aid to address gaps in basic needs as the pandemic continues, and our local court systems will likely be flooded with cases.
Even simple fixes like moving forms online or offering legal services via apps can help improve both efficiency and access to services.
In Boston, technologist Quinten Steenhuis is building an app that would create mobile-friendly online court forms. The service would also make forms, such as a request for a restraining order in a domestic violence situation, easier to complete for those without legal training.
Refining existing court procedures through technology is a step in the right direction, but there are other exciting new processes that take advantage of the unique opportunities technology provides.
Clean Slate laws are one such innovation. Those with criminal records face an incredible amount of stigma and often struggle to find a job. Clean Slate laws would automate record clearances for certain individuals who have been crime-free and satisfied a waiting period.
While expungements can help some, just 6.5 percent of individuals ever apply for an expungement because the process is cumbersome and often requires legal help.
Making matters more difficult, expungements in most jurisdictions have come to a standstill since many courts and law offices are closed except for the most urgent matters.
Some states, like New Jersey, have even encountered record-setting backlogs, with one man waiting more than a year to have his record cleared. In contrast, states like Pennsylvania — which has Clean Slate legislation in place — have automated the clearance of eligible records. As a result, Pennsylvania was able to continue to process expungements despite COVID-19.
At the same time, the current moment has us re-examining the ways technology can contribute to racial injustice in our system. Some local and state governments, as well as private companies, have banned or suspended the use of facial recognition because of the biased way it performs across racial groups.
Predictive policing has also recently come under fire.
Technology can amplify existing implicit biases, while making them harder to detect. Machine learning is not racist by default, but becomes so because of the human-produced data it learns from.
The most well-known example is likely ProPublica’s study of an AI program called Correctional Offender Management Profiling for Alternative Sanctions (COMPAS). ProPublica found that COMPAS’s recidivism algorithm assigned black defendants higher risk scores than white defendants, even after accounting for criminal history and other factors.
Similarly, predictive policing uses algorithms to predict where future crime will occur. The algorithms interpret police reports and send officers to target crime hotspots and chronic offenders.
Critics report this reinforces racist patterns already present in law enforcement, with minority neighborhoods subject to over-policing. Additionally, it’s not clear that the technology works — some studies show it reduces crime, but others have found only a negligible effect.
What makes systems like COMPAS more dangerous is that bias perpetuated by technology can be harder to detect — a problem critics have termed “tech-washing”: decision-making shielded by a veneer of machine objectivity obscures the algorithmic bias underneath.
It’s clear we must distinguish between technological automation that is helpful and that which is hurtful. Part of being successful is intention and planning: Has the technology been created with the right stakeholders at the table?
Programmers are disproportionately white and Asian men, and algorithms often reflect the biases of their creators — or are simply trained on data sets too unrepresentative for equitable machine learning.
For example, one study showed that natural language processing systems performed markedly worse on African-American Vernacular English because of a lack of diversity in their training data.
Some of the best uses of technology acknowledge human bias or inequalities in the human condition and try to mitigate them. In San Francisco, the district attorney’s office uses an algorithm to obscure race information from case materials at the earliest stages, to bring equity to charging decisions.
Similarly, to dismantle a system of racist predictive policing, we could rely more on traffic cameras and “smart” streets that would enforce the law regardless of an individual’s race.
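To make the redaction idea concrete, here is a minimal, purely illustrative sketch of “blind” case review. The real San Francisco tool is far more sophisticated — it also strips names, neighborhoods and other proxies for race using machine learning — but the core idea of replacing race-revealing details with neutral placeholders before a charging decision can be shown in a few lines. All function and variable names here are hypothetical, not taken from any actual system.

```python
import re

# Hypothetical sketch: redact explicit race descriptors and personal names
# from a police report before it reaches a charging attorney.
RACE_TERMS = [
    "white", "black", "hispanic", "latino", "latina", "asian",
    "african-american", "caucasian",
]

def redact_race_info(report: str, names: list) -> str:
    """Replace given names and explicit race/ethnicity descriptors with placeholders."""
    redacted = report
    # Redact names first, since names can act as proxies for race.
    for i, name in enumerate(names, start=1):
        redacted = re.sub(re.escape(name), f"[PERSON {i}]", redacted, flags=re.IGNORECASE)
    # Redact explicit race/ethnicity descriptors.
    pattern = r"\b(" + "|".join(map(re.escape, RACE_TERMS)) + r")\b"
    return re.sub(pattern, "[REDACTED]", redacted, flags=re.IGNORECASE)

report = "Officer Smith stopped a white male, John Doe, near the park."
print(redact_race_info(report, names=["John Doe", "Smith"]))
# → Officer [PERSON 2] stopped a [REDACTED] male, [PERSON 1], near the park.
```

A keyword list like this would miss many proxies (addresses, dialect, vehicle descriptions), which is exactly why production systems pair redaction with human review and statistical auditing of charging outcomes.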
Clean Slate technology works similarly, automating expungement relief and providing more equitable second chances regardless of privilege. Expungements, which can otherwise cost individuals thousands of dollars, thus become accessible to all.
By closing court doors, COVID-19 has pushed us to innovate. By opening our eyes to racial inequities, George Floyd’s death has caused us to interrogate those innovations. Together, these two crises can bring us closer to a just and fair use of technology.