By: Keisha Murray, Associate
Summary
- In 2016, the Wisconsin Supreme Court in State v. Loomis addressed whether using a COMPAS risk assessment at sentencing would violate a defendant’s right to due process because the tool’s proprietary nature prevents defendants from challenging the assessment’s scientific validity.
- In an episode of the legal drama For the People, a judge informs a public defender that she will use software to determine the appropriate sentence, allowing an algorithm to predict the defendant’s likelihood of recidivism.
- The public defender has serious concerns about the due process implications of a judge using software to assist in determining the appropriate sentence.
- AI is not a replacement for human intelligence. Under the Model Rules of Professional Conduct, lawyers have an ethical duty to remain competent and to understand the benefits and risks associated with relevant technology.
Artificial intelligence (AI) has garnered significant attention in the legal field. AI enables machines and computer programs to perform tasks that would otherwise require human intelligence. Despite its efficiencies, legal professionals must remember their ethical responsibilities when using AI in legal proceedings.
The legal drama For the People is set in the Southern District of New York and follows new attorneys as they handle the country’s most high-profile cases. In the episode discussed here (the episode), fictitious attorney Allison Adams, a federal public defender, represents an African American teenage boy facing criminal charges stemming from flashing a gun to scare off gang members. During the pre-sentencing hearing, the judge informed Allison that she would use software to determine the appropriate sentence, allowing an algorithm to predict the defendant’s likelihood of recidivism. Concerned about the potential due process implications, Allison asked to address the court on the software before the sentencing hearing.
A Real-Life Application of AI in Court Proceedings
In 2016, the Wisconsin Supreme Court in State v. Loomis addressed whether using a COMPAS risk assessment at sentencing would violate a defendant’s right to due process because the tool’s proprietary nature prevents defendants from challenging the assessment’s scientific validity. The court concluded that a risk assessment tool could be used at sentencing but circumscribed how it could be used, including limitations and cautions a judge must observe to avoid potential due process violations.
Assessment Tool’s Intended Purpose
In the episode, Allison had serious concerns about the judge using software to assist her in determining the appropriate sentence. Allison’s concerns were justified, as the court in Loomis acknowledged that COMPAS was not intended for sentencing purposes and stressed that it should only be used for correctional decisions, not to determine the severity of a sentence.
In a 2023 Ethics Opinion, the State Bar of Michigan became one of the first bodies to address judicial ethics in the use of AI in legal proceedings. The Opinion states, “As the use of technology increases, so does the requirement to maintain competence in what is available, how it is used, and whether the use of the technology in question would affect a judicial decision.”
Accuracy of the Assessment Tool’s Proprietary Algorithm
In the episode, Allison reassured her client that the judge would consider other factors, such as his familial background and the fact that the same gang had broken into his family’s apartment twice. When Allison asked the judge to educate her on what factors went into the algorithm, the judge replied that there were many factors, but the exact formula was proprietary. Allison believed the judge was essentially asking her client to have faith that the software would be fair.
Similarly, in Loomis, the developer classified COMPAS as a proprietary trade secret and would not reveal how the risk scores were calculated or how the factors were weighed. Loomis was unsure whether his sentence was based on accurate information because he could not verify the information used in COMPAS.
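To see why this opacity mattered, consider a deliberately simplified sketch of how a weighted-factor risk score could work. This is purely illustrative: the factor names and weights below are invented, and COMPAS’s actual formula has never been disclosed, which was exactly the defendants’ objection.

```python
# A purely hypothetical weighted-factor risk score. The factors and
# weights are invented; COMPAS's real formula is a trade secret.

HIDDEN_WEIGHTS = {                    # invisible to defendant and court
    "prior_arrests": 0.45,
    "age_at_first_offense": -0.30,
    "years_of_education": -0.15,
    "neighborhood_crime_rate": 0.25,
}

def risk_score(profile: dict) -> float:
    """Combine factor values into a single score from 0 to 10."""
    raw = sum(weight * profile.get(factor, 0.0)
              for factor, weight in HIDDEN_WEIGHTS.items())
    return max(0.0, min(10.0, 5.0 + raw))

score = risk_score({"prior_arrests": 3.0, "neighborhood_crime_rate": 2.0})
print(f"Risk score: {score:.1f}")   # the only thing the defendant sees
```

Even in this toy version, a defendant who sees only the final score can, at best, confirm the input facts; without the weights, there is no way to test how those facts were combined or whether the combination is scientifically valid.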
Lawyers have an ethical duty to provide competent representation to their clients. As AI becomes more integrated into the practice of law, lawyers must understand the strengths and limitations of AI tools and recognize when human intervention is necessary to ensure the quality of legal services.
While it is unclear whether Allison’s client could verify the information the judge intended to feed into the software, Allison recognized that human intervention was necessary. She asked for time to conduct her own research on the software before the sentencing hearing. To discredit the algorithm, Allison asked her mathematician brother, Eddie, to review the software’s code and explain how it worked.
Assessment Tool’s Discriminatory Biases
In Loomis, there were concerns that risk assessment tools may disproportionately classify minority offenders as high risk based on factors outside their control, such as familial background and education. Loomis asserted that the specific method by which COMPAS considers gender was unknown and that gender could be used as a criminogenic factor.
The Michigan Code of Judicial Conduct cautions judicial officers that using an AI tool that is perceived as partial or unfair may improperly influence their judgment. This can occur when bias is built into the tool’s algorithm or its training data.
In the episode, Allison was convinced that the sentencing algorithm was built to disadvantage her client and everyone who looks like him. Allison believed the algorithm was biased; her brother Eddie believed it was unbiased and, in fact, a good thing. She explained that an algorithm cannot feel, understand nuance, or empathize. Eddie encouraged Allison to use the data rather than discredit it, because machines do not have egos; people do.
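Eddie’s claim that machines cannot be biased because they lack egos misses how bias actually enters these systems: a model trained on historically skewed data reproduces the skew without any intent. The sketch below uses invented numbers and assumes, hypothetically, that recorded arrests, rather than actual reoffending, serve as the training signal.

```python
# Toy illustration of label bias. Two neighborhoods have the same true
# reoffense rate, but unequal enforcement produces unequal records.
# All numbers are invented.

TRUE_REOFFENSE_RATE = 0.20                          # identical for both
arrest_rate_if_reoffending = {"A": 0.90, "B": 0.45} # unequal enforcement

for neighborhood, catch_rate in arrest_rate_if_reoffending.items():
    recorded_rate = TRUE_REOFFENSE_RATE * catch_rate
    print(f"Neighborhood {neighborhood}: true rate "
          f"{TRUE_REOFFENSE_RATE:.0%}, recorded rate {recorded_rate:.0%}")
```

A model trained on the recorded rates would “learn” that neighborhood A is twice as risky as neighborhood B even though the underlying behavior is identical, which is precisely the kind of algorithmic bias Allison feared.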
Assessment Tool’s Data Set Comparison
In Loomis, the court cautioned that risk assessment tools must be constantly re-normed for changing populations, and jurisdictions that use these tools must monitor and maintain their accuracy over time.
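Re-norming is essentially a calibration audit: the tool’s predicted recidivism rates are compared against the rates actually observed in the local population, and the tool is adjusted when the two drift apart. The sketch below uses invented numbers and is an assumption about what such an audit could look like, not a description of any vendor’s actual procedure.

```python
# Hypothetical calibration check: predicted vs. observed recidivism
# rates by risk band. All numbers are invented.

predicted = {"low": 0.10, "medium": 0.35, "high": 0.60}  # tool's estimates
observed  = {"low": 0.08, "medium": 0.22, "high": 0.41}  # local outcomes

for band in predicted:
    drift = predicted[band] - observed[band]
    status = "RE-NORM NEEDED" if abs(drift) > 0.10 else "ok"
    print(f"{band:>6}: predicted {predicted[band]:.0%}, "
          f"observed {observed[band]:.0%}, drift {drift:+.0%} [{status}]")
```

In this invented example the tool overstates risk for the medium and high bands, the sort of drift that would matter in a courtroom like the one in the episode, where local recidivism ran far below the norm.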
In the episode, Allison’s research revealed that the recidivism rate of defendants tried and sentenced in the judge’s court was the lowest in the district and in the nation. At the sentencing hearing, Allison told the judge that she understood the appeal of the sentencing software: it appeared efficient and immune to emotion and lapses in logic. The judge acknowledged that she may have been premature in relying on the sentencing software but reminded Allison that this is the future.
According to West Virginia Advisory Opinion 2023-22, a judge should never use AI to decide the outcome of a case and must distinguish between using an AI application to make a decision and using it to inform a decision.
Ultimately, Allison’s client received a six-month jail sentence, while the government had previously sought a two-year sentence.
We Must Exercise Caution and Diligence When Using AI in Legal Proceedings
Lawyers and legal professionals must exercise caution and diligence when using AI in legal proceedings. AI is not a replacement for human intelligence. Under the Model Rules of Professional Conduct, lawyers have an ethical duty to remain competent and to understand the benefits and risks associated with relevant technology.