ChatGPT 'Hallucinated' Court Cases Cited by NY Attorneys in Federal Court Filing
The case in question was a personal injury suit filed against Avianca Airlines.
The plaintiff's attorney, Peter LoDuca, cited a number of court cases that attorneys for Avianca could not substantiate.
It was later revealed that LoDuca had used ChatGPT to research cases to cite.
Federal Judge P. Kevin Castel gave LoDuca an opportunity to admit that the cases had been fabricated.
LoDuca then allegedly used ChatGPT to produce the text of the cases it had initially cited, even though they were AI fabrications.
In a court filing that could lead to sanctions against the offending attorneys, Castel wrote that the initial federal court filing was “replete with citations to non-existent cases.”
Castel also referred to the attorneys' research as “bogus judicial decisions with bogus quotes and bogus internal citations.”
The term “hallucinated” has been coined to describe instances in which ChatGPT generates information that is not real.
This case highlights a trend across research-driven fields in which AI output is used without being cross-referenced or checked for errors.
That creates new challenges in complex fields such as law, media, and academia, challenges Castel alludes to in his filing.
“The Court is presented with an unprecedented circumstance.” - Federal Judge P. Kevin Castel, via NBC News