Using CoCounsel to Confirm AI Hallucination

I was reviewing a motion filed by opposing counsel when I noticed an unreported case cited. This set off my AI alarm, as I had heard that such citations can be a sign that AI was used (and not double-checked). I uploaded the motion to CoCounsel and asked it to find all of the cases cited. It identified cases that did not exist (AI hallucinations) and confirmed the citations for the cases that did exist. I then manually searched for the suspected hallucinated case and confirmed that it did not, in fact, exist. Using CoCounsel saved me time investigating and gave me the confidence to confirm the presence of AI hallucination. In the end, the motion was denied.

  • Thomson Reuters staff member

    This is a great example of how you not only trusted your own intuition but were also able to leverage CoCounsel to secure a win for yourself. Thank you for sharing! 

  • Thomson Reuters staff member
    in reply to Ila Avinash

    Have you tried the Litigation Document Analyzer to help you detect erroneous citations in opposing counsel's filings? Many of the firms I work with really like that tool as well, because it lays out all the cited authority clearly and completely. That allows for rapid identification not only of hallucinations, but also of misrepresentations made using real authority, which is another very common sign that AI was involved in the drafting process. Hope this helps! 

  • I have not, but I will give it a try next time. Great tip!
