In a world increasingly reliant on technology, it is unsurprising that computer algorithms are now being used to predict crime. Many have seen the movie Minority Report, starring Tom Cruise, in which a futuristic society has abolished murder by harnessing individuals’ psychic powers to foresee killings before they occur. With the progress of artificial intelligence, we seem to be inching ever closer to that world. However, a recent study suggests that the predictions this technology yields are far less reliable than commonly assumed. The study, published on the 17th of January, examined algorithms of the kind typically used to predict recidivism, that is, a criminal’s likelihood of offending again. These algorithms are already used in parole and judicial proceedings to help the justice system determine whether an offender is likely to commit another crime. The results of the study were troubling.
The researchers found that randomly selected groups of lay people could predict whether a criminal would recidivate about two-thirds of the time, essentially the same accuracy as the software that courts rely on to make that very determination.
This has potentially huge consequences for offenders: the technology performs no better than untrained human beings, yet judges lend it great credence. Julia Dressel, who conducted the study for her undergraduate thesis at Dartmouth College alongside Hany Farid, a computer science professor, cautions that “an algorithm’s accuracy can’t be taken for granted, and [the courts] need to test these tools to ensure that they are performing as we expect them to.” The two also showed that a similar level of accuracy can be achieved using just two pieces of data: the defendant’s number of past convictions and the defendant’s age. This is striking when juxtaposed against Compas, or Correctional Offender Management Profiling for Alternative Sanctions, which uses six more nuanced variables to estimate recidivism risk.

The stakes are real. In Wisconsin, a judge relying on Compas deemed Eric L. Loomis a “high risk” to the community at large and sentenced him to six years in prison for eluding the police. Loomis appealed, asserting that male and female defendants are treated differently by the Compas algorithm; the United States Supreme Court denied his petition for a writ of certiorari in 2017. Despite this denial, his equal protection argument remains noteworthy. ProPublica, an organization dedicated to investigative journalism and exposing systemic inequality, separately found that the Compas software treated black defendants inequitably, which can also raise Equal Protection concerns. Equivant, the company behind Compas, disputes Dressel and Farid’s findings, arguing that both ProPublica’s analysis and the Dartmouth study are flawed by small sample sizes.
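To make the two-variable finding concrete, here is a toy sketch: a plain logistic-regression rule trained on two features, age and number of prior convictions. Everything in it is invented for illustration; the synthetic data, the generating formula, and the resulting accuracy have no connection to Compas or to the study’s actual dataset. It only shows how a simple linear rule over two inputs can be fit and scored.

```python
import math
import random

random.seed(0)

def sigmoid(z):
    """Numerically safe logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

def synthesize(n=1000):
    """Synthetic defendants: (age, prior convictions, reoffended?).

    The generating rule below is hypothetical and exists purely so the
    example has something to fit; it is not drawn from any real dataset.
    """
    rows = []
    for _ in range(n):
        age = random.randint(18, 70)
        priors = min(int(random.expovariate(0.5)), 15)
        # Invented ground truth: risk rises with priors, falls with age.
        p = sigmoid(-0.06 * (age - 35) + 0.45 * (priors - 2))
        rows.append((age, priors, random.random() < p))
    return rows

data = synthesize()

# Fit a two-feature linear rule, score = w_age*age + w_priors*priors + b,
# with plain stochastic-gradient-descent logistic regression.
w_age = w_priors = b = 0.0
lr = 0.001
for _ in range(300):
    for age, priors, y in data:
        err = sigmoid(w_age * age + w_priors * priors + b) - y
        w_age -= lr * err * age
        w_priors -= lr * err * priors
        b -= lr * err

# Score the rule: predict "will reoffend" when the linear score is positive.
correct = sum(
    (w_age * age + w_priors * priors + b > 0) == y
    for age, priors, y in data
)
print(f"two-feature rule training accuracy: {correct / len(data):.2f}")
```

The point of the sketch is not the number it prints but the shape of the model: two inputs, three learned parameters, and no opaque machinery, which is roughly the class of simple predictor the researchers found to rival the commercial tool.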