Tuesday, May 2, 2017

Live in Obedient Fear, Citizen

In today's United States, a judge can sentence you to prison based on a software-generated risk report that you have no right to review.

There is already evidence that these programs rate non-white defendants as higher risk:
When Chief Justice John G. Roberts Jr. visited Rensselaer Polytechnic Institute last month, he was asked a startling question, one with overtones of science fiction.

“Can you foresee a day,” asked Shirley Ann Jackson, president of the college in upstate New York, “when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”

The chief justice’s answer was more surprising than the question. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.”

He may have been thinking about the case of a Wisconsin man, Eric L. Loomis, who was sentenced to six years in prison based in part on a private company’s proprietary software. Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.

In March, in a signal that the justices were intrigued by Mr. Loomis’s case, they asked the federal government to file a friend-of-the-court brief offering its views on whether the court should hear his appeal.

The report in Mr. Loomis’s case was produced by a product called Compas, sold by Northpointe Inc. It included a series of bar charts that assessed the risk that Mr. Loomis would commit more crimes.

The Compas report, a prosecutor told the trial judge, showed “a high risk of violence, high risk of recidivism, high pretrial risk.” The judge agreed, telling Mr. Loomis that “you’re identified, through the Compas assessment, as an individual who is a high risk to the community.”

The Wisconsin Supreme Court ruled against Mr. Loomis. The report added valuable information, it said, and Mr. Loomis would have gotten the same sentence based solely on the usual factors, including his crime — fleeing the police in a car — and his criminal history.

At the same time, the court seemed uneasy with using a secret algorithm to send a man to prison. Justice Ann Walsh Bradley, writing for the court, discussed, for instance, a report from ProPublica about Compas that concluded that black defendants in Broward County, Fla., “were far more likely than white defendants to be incorrectly judged to be at a higher rate of recidivism.”

………

In 1977, the Supreme Court ruled that a Florida man could not be condemned to die based on a sentencing report that contained confidential passages he was not allowed to see. The Supreme Court’s decision was fractured, and the controlling opinion appeared to say that the principle applied only in capital cases.

Brad Schimel, Wisconsin's attorney general, echoed that point and added that Mr. Loomis knew everything the court knew. Judges do not have access to the algorithm either, he wrote.

There are good reasons to use data to ensure uniformity in sentencing. It is less clear that uniformity must come at the price of secrecy, particularly when the justification for secrecy is the protection of a private company’s profits. The government can surely develop its own algorithms and allow defense lawyers to evaluate them.

This is why the privatization of an essential state function is a bad thing.
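
The evaluation part would not even be hard. The disparity ProPublica measured is a difference in false-positive rates: among defendants who did not go on to reoffend, how many did the tool still flag as high risk, broken down by race. Here is a minimal sketch in Python, using invented records rather than real Compas data, of what that audit could look like if a model's outputs were open to inspection:

    # All records here are invented for illustration; this is not real
    # Compas data, just a sketch of an open audit.
    from collections import defaultdict

    # (group, flagged_high_risk, actually_reoffended)
    records = [
        ("black", True,  False), ("black", True,  True),
        ("black", True,  False), ("black", False, False),
        ("white", True,  True),  ("white", False, False),
        ("white", False, True),  ("white", False, False),
    ]

    # False-positive rate: among people who did NOT reoffend, what
    # fraction were still flagged as high risk?
    flagged = defaultdict(int)  # non-reoffenders flagged high risk, by group
    clean = defaultdict(int)    # non-reoffenders total, by group

    for group, high_risk, reoffended in records:
        if not reoffended:
            clean[group] += 1
            if high_risk:
                flagged[group] += 1

    for group in sorted(clean):
        print(f"{group}: false-positive rate = {flagged[group] / clean[group]:.0%}")

With an open algorithm, that check takes minutes. With a secret one, it cannot be done from the outside at all.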

This is about as perfect an example of a Kafkaesque situation as is possible: condemned by a secret report produced by a secret method.
