How Many Lawyers Have Gotten in Trouble Because of AI Hallucinations? A Lot

Damien Charlotin, a senior research fellow based in Paris, focuses his research on artificial intelligence in the law. As part of his work, which includes lecturing on legal data analysis, he created a database that tracks "legal decisions in cases where generative AI produced hallucinated content – typically fake citations, but also other types of AI-generated arguments." The database also records the punishments legal counsel suffered as a result of their mistakes, including warnings, financial sanctions, and dismissal by clients. Beyond what Damien reports, there is at least one case in the US in which an attorney was suspended from the practice of law.

As of today, December 14, 2025, the database contains over 600 reported cases. For more, see AI Hallucination Cases Database – Damien Charlotin.

Damien also created an automated reference checker called PelAIken. I haven't tried it yet, but it looks impressive.

Please don't become a cautionary tale in this database. Make sure someone in your firm checks every single citation, including string cites, that you receive from AI research. Even if you use a legal database, the AI can hallucinate holdings and other information. So while a case pulled from Lexis or Westlaw may itself be real, the AI's description of what it says might still be made up.