I have had my website since 2011. Recently I decided to go through it, delete posts that no longer serve a purpose, and update posts that are out of date. With hundreds of posts, deletion is easy. Updating takes time.
I have been using Claude to help. For posts to delete, Claude reviews and offers input. If I agree that a post should be deleted, I delete it, and the link is automatically redirected back to my blog. For posts to update, I run the original through Claude, we discuss what needs to change, Claude drafts the update, I paste it into WordPress, and I review and edit. Often substantially.
The Workflow Has Been Useful…But
Today I asked Claude to update a 2011 post on obtaining Facebook data in civil litigation. The underlying legal framework has not changed much, but it has become more nuanced. Claude gave me several cases to support the changes, with citations. As I worked through the draft, I did what I tell readers to do whenever I write about AI and legal research: I looked up every case.
- The first case existed, but Claude described it in a way that overstated its reach. The case involved an investigative subpoena issued by a state attorney general under a consumer protection statute, not a civil discovery subpoena. The reasoning could be analogized to civil discovery, but the holding was narrower than the draft suggested. This is a common problem I see with Claude: it doesn't get the holdings quite right.
- The second case existed too, but Claude attributed it to the wrong jurisdiction. The case is from the D.C. Court of Appeals; Claude told me it was from New Jersey. A quick Google Scholar search turned up the real case, but if I had pasted Claude's citation into a brief or a CLE handout, I would have been wrong.
- Claude flagged a third case as one I should verify before relying on it. To Claude's credit, it acknowledged its uncertainty up front.
The second kind of mistake, where the case exists but the details are off, is the one lawyers should worry about most. Everyone has heard about the attorneys sanctioned for filing briefs with completely fabricated cases. Those cases are dramatic and instructive, but the more common AI failure is subtler. The case exists. The name is right. The reporter and page numbers may or may not be right. The holding may or may not match what the AI says. The jurisdiction may be wrong. The procedural posture may be wrong. A lawyer who runs the case name through Google Scholar, confirms the case exists, and stops there will feel reassured. That lawyer can still file a brief with a misleading citation.
The fix is not to stop using AI. The fix is to verify every factual claim against a primary source. For case citations, that means confirming the case exists, confirming the citation is correct including the jurisdiction and the year, confirming the procedural posture, and confirming the holding actually supports the proposition the AI used it for. It is also important to confirm the case is still good law. If the AI gives you six cases, you do this six times. There is no shortcut.
Used this way, AI is a real time saver, because verifying a case that exists is faster than researching to find one from scratch. Used without verification, AI is a malpractice claim waiting to happen.
I will keep using Claude to help with this blog cleanup project. I will also keep checking every citation.