Monday, 13 March 2017

How to Upgrade Judges with Machine Learning

Judges have often been criticized for locking up nonviolent offenders while they await trial, and for missing warning signs in violent offenders who go on to hurt more citizens once released back onto the street. Issues of race, economic status, and more can also come into play at sentencing or a parole hearing. The result is that more defendants are held than need to be, which clogs the wheels of justice. The National Bureau of Economic Research, working with computer scientists and economists, has devised an algorithm to assist judges with this job. To build it, the researchers studied over 100,000 judicial cases, paying close attention to the offense each defendant was suspected of, when and where they were arrested, and the number and type of prior convictions. Age was used as a demographic marker, but not race. When applied to 100,000 more cases, the machine learning algorithm beat the judges at predicting flight risk and the propensity to commit another crime, and applying it to court cases in 40 large urban counties yielded the same results. In New York City alone, they determined that crime by defendants awaiting trial could be cut by 25% while jailing the same number of people, or that the jail population could be lowered by 40% while leaving the repeat offender crime rate the same.

The system could work as a fail-safe: the judge still makes every decision, and the software flags cases where the model believes the judge is wrong, since some judges miss behavioral markers that a person is a flight risk or likely repeat offender. What do you think? This seems like it could be a great tool to assist the court system, as 40% fewer prisoners to feed and guard sounds awesome. By the same token, other machine learning packages have been deemed too harsh on certain races because of the markers they used to predict potential for recidivism. Is this the beginning of a Minority Report-style system?
Kleinberg suggests that algorithms could be deployed to help judges without major disruption to the way they currently work, in the form of a warning system that flags decisions highly likely to be wrong. Analysis of judges' performance suggested they have a tendency to occasionally release people who are very likely to fail to show up in court, or to commit crime while awaiting trial. An algorithm could catch many of those cases, says Kleinberg.
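To make the idea concrete, here is a minimal sketch of what such a risk score and fail-safe warning could look like. The feature names, weights, and threshold below are invented for illustration; the actual NBER model was trained on hundreds of thousands of real cases and is far more sophisticated. Note how it follows the article's constraints: age is an input, race is not, and the flag only fires when the judge's release decision disagrees with a high predicted risk.

```python
import math

def risk_score(offense_severity, prior_felonies, prior_failures_to_appear, age):
    """Return a pretrial risk score in (0, 1); higher means riskier.

    All weights are hypothetical placeholders, not values from the study.
    """
    z = (0.8 * offense_severity          # 0.0 (minor) .. 1.0 (serious)
         + 0.5 * prior_felonies          # count of prior felony convictions
         + 0.9 * prior_failures_to_appear
         - 0.03 * (age - 18))            # risk tends to decline with age
    # Logistic squash maps the linear score into a probability-like value.
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_review(judge_releases, score, threshold=0.85):
    """Fail-safe mode: warn only when the judge releases a high-risk defendant.

    The judge's decision always stands; the software just raises a flag.
    """
    return judge_releases and score >= threshold

# Example: a young defendant with a serious charge and repeated
# failures to appear scores higher than an older first-time offender.
high = risk_score(offense_severity=1.0, prior_felonies=3,
                  prior_failures_to_appear=2, age=20)
low = risk_score(offense_severity=0.1, prior_felonies=0,
                 prior_failures_to_appear=0, age=40)
print(flag_for_review(judge_releases=True, score=high))   # flagged
print(flag_for_review(judge_releases=True, score=low))    # not flagged
```

In this design the model never overrules the judge; it only surfaces the disagreements, which matches Kleinberg's point about deploying the system without disrupting how courts currently work.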

Read the full article here by [H]ardOCP News/Article Feed
