Computer secrets alter the justice system. In Minority Report, Tom Cruise plays a police officer in a future where a special police unit uses three “precogs,” psychic predictors, to arrest murderers before they commit their future crimes. Only occasionally does the strongest of the precogs issue a “minority report” indicating an alternative outcome in which no crime will be committed; these minority reports are concealed and the arrests made anyway on the basis of the disputed majority prediction. That is, until Police Officer Tom Cruise is himself accused of a future murder and a “minority report” places the question in doubt. This science fiction scenario has now found its way into an American court, where computer secrets are altering the justice system.
SCIENCE FICTION VS SCIENCE REALITY
When U.S. Supreme Court Chief Justice John G. Roberts Jr. recently visited Rensselaer Polytechnic Institute, he was asked a startling question, one with overtones of science fiction. “Can you foresee a day,” asked Shirley Ann Jackson, president of the college in upstate New York, “when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”
Chief Justice Roberts’s answer was more surprising than the question. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.”
Chief Justice Roberts may have been thinking about the case of a Wisconsin man, Eric L. Loomis, who was sentenced to six years in prison based in part on a private company’s proprietary software. Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.
In March, in a signal that the justices were intrigued by Mr. Loomis’s case, they asked the federal government to file a friend-of-the-court brief offering its views on whether the court should hear his appeal.
The report in Mr. Loomis’s case was produced by a product called Compas, sold by Northpointe Inc. It included a series of bar charts that assessed the risk that Mr. Loomis would commit more crimes.
The Compas report, a prosecutor told the trial judge, showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge agreed, telling Mr. Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community.”
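Compas’s actual formula is a trade secret, so no outsider knows how it maps a defendant’s questionnaire answers to the bar charts a judge sees. The sketch below is purely hypothetical: it shows how a generic actuarial risk tool of this kind typically works, with factor names, weights, and cutoffs invented for illustration only.

```python
# Hypothetical sketch of an actuarial risk tool -- NOT Compas's real formula,
# which Northpointe keeps secret. All factors, weights, and cutoffs here
# are invented for illustration.

def risk_score(answers, weights):
    """Compute a weighted sum of questionnaire answers, the basic form
    most actuarial risk instruments take."""
    return sum(weights[factor] * answers[factor] for factor in weights)

def risk_category(score, low_cutoff=3.0, high_cutoff=6.0):
    """Bucket the raw score into the Low/Medium/High labels a report displays."""
    if score < low_cutoff:
        return "Low"
    if score < high_cutoff:
        return "Medium"
    return "High"

# Invented example inputs: prior arrests, an under-25 flag, an unemployment flag.
weights = {"priors": 1.0, "under_25": 2.0, "unemployed": 1.5}
answers = {"priors": 4, "under_25": 1, "unemployed": 1}

score = risk_score(answers, weights)   # 4*1.0 + 1*2.0 + 1*1.5 = 7.5
print(risk_category(score))            # prints "High"
```

The point of the sketch is that everything a defendant would need in order to contest such a label, which factors are counted and how heavily, lives in the weights and cutoffs, which is exactly what a proprietary tool withholds.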
The issue of how computer secrets alter the justice system most recently appeared in Wisconsin. The Wisconsin Supreme Court ruled against Mr. Loomis. The report added valuable information, it said, and Mr. Loomis would have gotten the same sentence based solely on the usual factors, including his crime — fleeing the police in a car — and his criminal history.
At the same time, the court seemed uneasy with using a secret algorithm to send a man to prison. Justice Ann Walsh Bradley, writing for the court, discussed, for instance, a report from ProPublica about Compas that concluded that black defendants in Broward County, Fla., “were far more likely than white defendants to be incorrectly judged to be at a higher rate of recidivism.”
Justice Bradley noted that Northpointe had disputed the analysis. Still, she wrote, “this study and others raise concerns regarding how a Compas assessment’s risk factors correlate with race.”
In the end, though, Justice Bradley allowed sentencing judges to use Compas. They must take account of the algorithm’s limitations and the secrecy surrounding it, she wrote, but said the software could be helpful “in providing the sentencing court with as much information as possible in order to arrive at an individualized sentence.”
Justice Bradley made Compas’s role in sentencing sound like the consideration of race in a selective university’s holistic admissions program. It could be one factor among many, she wrote, but not the determinative one.
In urging the United States Supreme Court not to hear the case, Wisconsin’s attorney general, Brad D. Schimel, seemed to acknowledge that the questions in the case were substantial ones. But he said the justices should not move too fast.
“The use of risk assessments by sentencing courts is a novel issue, which needs time for further percolation,” Mr. Schimel wrote.
He added that Mr. Loomis “was free to question the assessment and explain its possible flaws.” But it is a little hard to see how he could do that without access to the algorithm itself.
The company that markets Compas says its formula is a trade secret.
“The key to our product is the algorithms, and they’re proprietary,” one of its executives said last year. “We’ve created them, and we don’t release them because it’s certainly a core piece of our business.”
Compas and other products with similar algorithms play a role in many states’ criminal justice systems. “These proprietary techniques are used to set bail, determine sentences, and even contribute to determinations about guilt or innocence,” a report from the Electronic Privacy Information Center found. “Yet the inner workings of these tools are largely hidden from public view.”
SECRETS IN COURT
JUSTICE ABHORS SECRET EVIDENCE THAT IS DENIED OPEN EXAMINATION IN COURT. The encroachment of secret evidence from secret courts was initially tied to “national security,” but, as with many things, “function creep” has expanded its place within our courts. The Foreign Intelligence Surveillance Court (FISC) was established by Congress in the Foreign Intelligence Surveillance Act (FISA) of 1978. The role of the FISC is to provide judicial oversight of Intelligence Community activities in a classified setting. The FISC is composed of federal judges appointed by the Chief Justice of the U.S. Supreme Court, and its decisions can be reviewed by the Foreign Intelligence Surveillance Court of Review (FISCR) and the Supreme Court. Since the FISA Amendments Act of 2008, the FISC has had to rule on important and novel Fourth Amendment issues raised by the government’s proposed targeting and minimization procedures. Most of the FISC’s orders and filings are highly classified, but, after the USA Freedom Act, any significant legal interpretations by the court must be made public. https://epic.org/privacy/surveillance/fisa/fisc/
The expansion of secrets in our courts traces back to a 1977 precedent, Gardner v. Florida, in which the Supreme Court ruled (http://caselaw.findlaw.com/us-supreme-court/430/349.html) that a Florida man could not be condemned to die based on a sentencing report that contained confidential passages he was not allowed to see. The Supreme Court’s decision was fractured, and the controlling opinion appeared to say that the principle applied only in capital cases.
Mr. Schimel echoed that point and added that Mr. Loomis knew everything the court knew. Judges do not have access to the algorithm, either, he wrote.

There are good reasons to use data to ensure uniformity in sentencing. It is less clear that uniformity must come at the price of secrecy, particularly when the justification for secrecy is the protection of a private company’s profits. The government can surely develop its own algorithms and allow defense lawyers to evaluate them.
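What a publicly developed algorithm would make possible can be sketched simply. In the hypothetical model below (the factor names and weight values are invented for illustration, not drawn from any real tool), the weights are published, so a defense lawyer can recompute the score and dispute each factor’s contribution line by line:

```python
# Sketch of a transparent risk model. Because the weights are public,
# anyone can recompute the score and see exactly which factor drove it.
# All names and numbers here are invented for illustration.

PUBLIC_WEIGHTS = {"prior_convictions": 0.8, "failure_to_appear": 1.2}

def explain_score(answers):
    """Return each factor's contribution and the total score,
    so the computation can be openly inspected and challenged."""
    contributions = {f: PUBLIC_WEIGHTS[f] * answers[f] for f in PUBLIC_WEIGHTS}
    return contributions, sum(contributions.values())

contributions, total = explain_score(
    {"prior_convictions": 3, "failure_to_appear": 1}
)
for factor, value in contributions.items():
    print(f"{factor}: {value}")   # a defendant can contest any line item
print(f"total: {total}")
```

The contrast with a trade-secret formula is the whole argument: the itemized breakdown, not the score alone, is what makes cross-examination of the evidence possible.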
At Rensselaer last month, Chief Justice Roberts said that judges had work to do in an era of rapid change.
“The impact of technology has been across the board,” he said, “and we haven’t yet really absorbed how it’s going to change the way we do things,” including how computer secrets alter the justice system.
“The very nature of a trial is [the] search for truth.” Nix v. Whiteside, 475 U.S. 157, 158 (1986).
“Cross-examination is the greatest legal engine ever invented for the discovery of truth.” John H. Wigmore, quoted in Lilly v. Virginia, 527 U.S. 116 (1999). Secrets are the enemy of the search for truth, and without truth justice is not possible. We must not permit the expanding use of computer secrets to alter the justice system beyond repair.