Replacing Judges with Computers Is Risky
Harvard Law Review
by Quentin L. Kopp, San Mateo Superior Court
FEBRUARY 20, 2018
Adopting technology for its own sake is unwise. Last year, the California Judicial Council proposed that California’s criminal courts jump on a failing bandwagon and inhibit — effectively replace — judicial discretion with computer-based algorithms. The present judicial system already assesses each defendant, his or her criminal history, and ties to the community. Prosecutors, defense attorneys, and judges use those facts to render a just determination of whether the defendant is a “flight risk and danger” to the victim or to the public.
The California Judicial Council says California should replace its long-standing money-bail system with a “risk-based pretrial assessment” tool of the kind adopted elsewhere. It points to the examples set by Washington, D.C., New Mexico, New Jersey, and other jurisdictions. However, those jurisdictions jumped on the computer-based bail-replacement train before technology could catch up with human judgment.
The problem with this recommendation is that these early adopters are discovering too late that, like most half-baked technology, the innovative promise of automated criminal justice is fraught with unintended consequences and errors. Courts in these states and cities have become revolving doors for individuals who offend again.
This poses a significant threat to public safety, prompting hundreds more crimes and harming more victims and percipient witnesses. Some victims haven’t survived the experiment of no-bail “justice.”
Technology cannot replace the depth of knowledge, experience, and expertise in law that judges, prosecutors, and defense attorneys possess. A complete evaluation of whether to hold or release on bail any particular defendant accused of any specific crime requires every bit of those combined skills.
Remember: no two cases — no two defendants, victims, or patterns of facts — are alike. Many defendants may be charged with the same penal code violation, but each crime and its circumstances are unique. Each case requires human judgment and the vital, natural emotion of empathy — two things artificial intelligence systems cannot provide. The California Judicial Council has recommended a laboratory approach destined to fail precisely because it cannot take human responses into account.
Would victims of brutal crimes want a coldly dispassionate Mr. Spock of Star Trek fame using blended Silicon-Vulcan logic to release their attackers through the court’s new technology-enabled revolving door? Would you, if you found yourself in this situation, want to plead your case to a human judge or to the cold justice of some tech expert’s program logic?
Close judgment calls, real criminals, real victims and witnesses placed in harm’s way, and equal treatment under the law — each poses a unique hurdle and a troubling stumbling block to implementing the California Judicial Council’s recommendation.
While modern artificial-intelligence and expert-system technologies are ahead of where they were 20 years, 10 years, or even a year ago, significant uncertainty remains about what — and even how — a machine-learning program has learned from the real world, or how it might judge real-world criminal cases.
Humans often draw from past history to apply common sense to new, unique, unexpected, or unusual situations that lie far from the norm.
We should look carefully at the jurisdictions where the “risk-based pretrial assessment” tool is in use, and evaluate with caution the reports of the problems it has created. It is significant that many are now trying to reverse their move to no-bail technology systems.
Exercising common sense and learning from those who mistakenly employed the “risk-based pretrial assessment” tool, any rational individual would conclude that California should not replace the depth of thinking, reasoning, and decision-making inherent in our present bail and judicial system with a computerized algorithm or robotic “thinking.”
Judge Quentin L. Kopp, a former member of the San Francisco Board of Supervisors, served three terms as state senator for San Francisco and San Mateo County. He was appointed by California Governor Pete Wilson to the San Mateo Superior Court in 1999, and is a Class of 1952 Harvard Law School graduate who tried criminal and civil cases for 45 years.