AI in the courtroom: Judges enlist ChatGPT help, critics cite risks
London, Bogota, and Los Angeles
Indian High Court judge Anoop Chitkara has ruled over thousands of cases. But when he refused bail to a man accused of assault and murder, he turned to ChatGPT to help justify his reasoning.
He is among a growing number of justices using artificial intelligence (AI) chatbots to assist them in rulings, with supporters saying the tech can streamline court processes while critics warn it risks bias and injustice.
"AI cannot replace a judge. ... However, it has immense potential as an aid in judicial processes," said Mr. Chitkara.
"The knowledge revolution has started, and these AI platforms have in certain situations demonstrated their capabilities to instantaneously transform queries into outstanding results."
Chatbots like ChatGPT and Google's Bard are software applications designed to mimic human conversation in response to users' questions.
Mr. Chitkara said he did not rely on ChatGPT to help decide his ruling in the 2020 case at the Punjab and Haryana High Court.
However, he wondered if he was relying too heavily on his own "consistent view" that allegations involving an unusually high level of cruelty should count against granting bail, and asked ChatGPT to summarize case law on the issue.
The justice ministry did not immediately respond to a request for comment.
The use of AI in the criminal justice system is growing quickly worldwide, from the popular DoNotPay chatbot lawyer mobile app to robot judges in Estonia adjudicating small claims and AI judges in Chinese courts.
In the Caribbean Colombian city of Cartagena, Judge Juan Manuel Padilla also turned to ChatGPT for help in a lawsuit in which an autistic boy鈥檚 parents were suing his health care provider for treatment costs and expenses.
"[ChatGPT] is generating text that is very reliable, very concrete, and applicable to a case in a specific way," said Mr. Padilla.
He asked the chatbot several legal questions, such as whether an autistic child is exempt from fees for therapy, and included the details in his ruling, which sided with the child.
Concerns over false results
But chatbots鈥 reliability is questionable, said several legal and tech experts.
"Some judges are trying to find a way to make the job faster, but they don't always know the limits or risks," said Juan David Gutierrez, professor of public policy and data at Universidad del Rosario in Bogota, Colombia.
"ChatGPT can make up laws and rulings that don't exist. In my view, it shouldn't be used for anything important."
There have been numerous examples of chatbots getting information wrong or making up plausible but incorrect answers, dubbed "hallucinations," such as inventing fictional articles and academic papers.
When Linklaters, a global law firm headquartered in London, tested ChatGPT's responses to 50 legal questions, its legal experts found the chatbot proficient in some areas but severely lacking in others.
The AI confused sections of the Data Protection Act 2018 and failed to give complete answers on English contract law.
"If you didn't already have a very good understanding of that area of law, it would be very hard for you to work that out," solicitor Peter Church, an expert in data privacy at Linklaters, told the Thomson Reuters Foundation.
Use of chatbot "a disaster"
Better technology promises a way to alleviate the huge backlog that is clogging some legal systems.
India had more than 40 million cases pending in lower courts last year, while Brazil saw 26 million new lawsuits filed in 2020 alone, more than 6,000 per judge.
But AI risks oversimplifying complex problems and could raise unrealistic expectations of tech鈥檚 capabilities, Dona Mathew and Urvashi Aneja from the research collective Digital Futures Lab wrote in a recent report.
There are also concerns over privacy violations and the exploitation of judicial data for profit.
"With biased and incomplete datasets, no legal remedies and accountability safeguards ... these changes can lead to systematic harms like threats to judicial independence and stagnation of legal principles," they wrote.
Raquel Guerrero, a lawyer for three journalists in Bolivia who were accused of posting photos of a victim of violence without her permission, expressed concerns when the court consulted ChatGPT during an online hearing in April.
Ms. Guerrero said the complainant gave permission for the photos to be shared online but later denied she had done so.
Constitutional judges asked ChatGPT about any "legitimate public interest" for journalists posting online photos of a "woman showing parts of her body" without her consent.
ChatGPT answered that it was a "violation of the person's privacy and dignity." The judges ordered the photos to be removed from social media.
The court record said ChatGPT does not replace decisions made by jurists, but that it can be used as additional support to "clarify certain concepts."
But Ms. Guerrero said the chatbot's use in the hearing was "arbitrary" and a "disaster."
"It can't be used as if it's a calculator that takes away the obligation of judges to use reason and to apply justice and to apply it correctly," Ms. Guerrero said, adding she is considering filing a complaint against the judges for using the chatbot.
"Obviously, ChatGPT doesn't stop being a robot. If you ask it in the right way, it will answer what you want to hear."
This story was reported by the Thomson Reuters Foundation.