Aletras, Nikolaos, et al. “Predicting Judicial Decisions of the European Court of Human Rights: a Natural Language Processing Perspective.” PeerJ Computer Science, 2:e93, 24 Oct. 2016, doi:10.7717/peerj-cs.93.
This study assesses the accuracy of an artificial-intelligence program at predicting judicial outcomes in the European Court of Human Rights. The program correctly predicted case outcomes 79% of the time. Because most of the relevant court documents are not accessible, the study uses the published judgments handed down by the court as a proxy; these judgments are usable because they include sections on the facts of the case, the circumstances of the case, and the relevant law. The researchers remove the parts of each judgment that foreshadow or announce the verdict.
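The study's approach, training a text classifier on the factual sections of judgments with the verdict-revealing passages removed, can be sketched in miniature. This is a toy illustration only: the case texts and labels below are invented, and a simple Naive Bayes classifier stands in for the study's actual model and features.

```python
from collections import Counter
import math

def tokenize(text):
    # Lowercase and split into word tokens.
    return text.lower().split()

def train_nb(docs, labels):
    """Train a tiny multinomial Naive Bayes model over word counts."""
    counts = {label: Counter() for label in set(labels)}
    priors = Counter(labels)
    for doc, label in zip(docs, labels):
        counts[label].update(tokenize(doc))
    vocab = {w for c in counts.values() for w in c}
    return counts, priors, vocab

def predict(model, text):
    counts, priors, vocab = model
    total = sum(priors.values())
    best, best_lp = None, float("-inf")
    for label, prior in priors.items():
        lp = math.log(prior / total)
        n = sum(counts[label].values())
        for w in tokenize(text):
            # Laplace smoothing over the shared vocabulary.
            lp += math.log((counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Invented "facts of the case" snippets standing in for the
# judgment sections the study trains on.
docs = [
    "applicant detained without prompt judicial review",
    "applicant ill treated while in police custody",
    "domestic courts gave reasoned judgment after fair hearing",
    "applicant had full access to counsel and an impartial tribunal",
]
labels = ["violation", "violation", "no-violation", "no-violation"]
model = train_nb(docs, labels)
print(predict(model, "applicant held in custody without review"))
```

The key design point the study relies on is visible even at this scale: the classifier never sees the outcome-announcing text, only the factual narrative, so its predictions reflect patterns in how cases are described.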
Delaware Access to Justice Commission's Committee on Fairness in the Criminal Justice System. Equal Justice Initiative, 2015, www.courts.delaware.gov/supreme/docs/EJI_Delaware-Bail-Paper.pdf. Accessed Nov. 2017.
This report finds that the current bail system in Delaware is expensive and increases pre-trial incarceration. The report recommends decreasing the number of people who have to pay bail in order to be released and increasing the number of people released on their own recognizance. Additionally, while the report does not recommend the use of risk assessment tools specifically, it does advise the use of a “risk-based model” in deciding when to release or detain pre-trial defendants. This report was created by a non-governmental organization as its holistic assessment of Delaware's pre-trial detention system. The report was made at the request of Delaware’s Access to Justice Commission’s Committee on Fairness in the Criminal Justice System.
Fazel, S., et al. “Use of Risk Assessment Instruments to Predict Violence and Antisocial Behaviour in 73 Samples Involving 24 827 People: Systematic Review and Meta-Analysis.” BMJ, vol. 345, 2012, doi:10.1136/bmj.e4692.
This study analyzes the effectiveness of nine major risk assessment tools at predicting violence. It is a meta-analysis: it pools the results of past studies rather than conducting a single new experiment. The study finds that all nine tools have roughly the same ability to predict violence. The tools predict violence relatively well, but not reliably enough for the authors to endorse using them on their own, without outside input. Finally, the study notes that these tools appear to have been designed primarily to assess men; women often show different statistical patterns of violent offending, and the tools do not account for them.
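The pooling step at the heart of a meta-analysis like this one can be illustrated with a minimal fixed-effect inverse-variance calculation. The numbers below are invented for illustration and do not come from the study.

```python
import math

def pooled_effect(effects, ses):
    """Fixed-effect inverse-variance pooling of per-study effect sizes.

    Each study is weighted by the inverse of its variance, so larger,
    more precise studies pull the pooled estimate toward themselves.
    """
    weights = [1 / se ** 2 for se in ses]
    est = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return est, pooled_se

# Hypothetical per-study effect sizes (e.g. log odds ratios)
# and their standard errors.
effects = [0.8, 1.1, 0.9]
ses = [0.2, 0.3, 0.25]
est, se = pooled_effect(effects, ses)
print(round(est, 3), round(se, 3))
```

The pooled estimate lands between the individual studies, closest to the most precise one, which is why a meta-analysis can draw firmer conclusions than any single study in its sample.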
Rachlinski, Jeffrey J., et al. “Does Unconscious Racial Bias Affect Trial Judges?” Notre Dame Law Review, vol. 84, no. 3, 2009, pp. 1195–1246, heinonline.org/HOL/Page?handle=hein.journals/tndl84&div=31&g_sent=1&casa_token=&collection=journals.
This study assesses whether judges harbor implicit racial bias, whether any such biases affect their rulings, and what reforms might reduce them. The study finds that most judges do harbor implicit racial bias, though not overwhelmingly so. Notably, when judges are made aware of these biases, they tend not to rule against black defendants in a biased way; in those situations an opposite trend emerges, with the more biased judges actually ruling more favorably toward black defendants. These findings were discovered somewhat coincidentally. The experimenters did not tell the participating judges that the study concerned racial bias, yet many judges reported suspecting that race was its focus, and that suspicion appears to have influenced their decisions.
Yang, Min, et al. “The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools.” Psychological Bulletin, vol. 136, no. 5, 2010, pp. 740–767, doi:10.1037/a0020473.
This study analyzed how the effectiveness of different risk assessment tools varies across demographic groups. Unsurprisingly, one finding is that tools created for a specific demographic work better for that group than more general tools do. More interestingly, many assessment tools seem to work best when their subjects are older and white. While the study cannot make a conclusive statement about the effect of gender on accuracy, risk assessment tools appear to be less accurate when the subjects are women rather than men.