Greg Lambert – An Honest Conversation about RAG and Hallucination-Free Legal Research Tools

Greg Lambert writes, May 28, 2024:

Adam Ziegler’s post calling on legal research vendors like Thomson Reuters and LexisNexis to shoulder the burden of proving claims such as Lexis’ “The fastest legal generative AI with conversational search, drafting, summarization, document analysis, and hallucination-free linked legal citations” makes valid points.

Since I was one of the vocal critics of the Stanford report, I would say that I agree with Adam that the big legal information vendors need to be much more transparent about what their Gen AI tools can do, what they can’t do, and the issues they still have with AI hallucinations, which are a “feature” of Gen AI, not a “bug.”

My biggest concern with the Stanford Institute for Human-Centered Artificial Intelligence (HAI) report was that this was STANFORD UNIVERSITY, and such reports carry significant weight simply because of the history and prestige that name lends them. Using Practical Law as a replacement for Westlaw Precision AI was not insignificant. The fact that the authors only mention on page 9 of the report that they didn’t have access to the resource, after giving pages of results critical of the Practical Law tool, troubled me most.

I have never been shy about criticizing the big vendors for any of their tools, including their reliance on RAG to improve their AI tools. While RAG is a good first step, it is not a total fix for hallucinations. An example of this was seen a few months ago, when a law professor from Nebraska pointed out that Lexis+ AI returned results to one of his searches with citations to cases published in 2025 and 2026. Lexis’ “technical” explanation for why these cases from the future weren’t really hallucinations was that the citations were not linked in the answer, so it should have been clear to the person conducting research that these cases did not exist and should be ignored. It was a pretty weak explanation, even if it was technically true.

Read his full piece at:

https://www.linkedin.com/pulse/honest-conversation-rag-hallucination-free-legal-research-lambert-skmmc/?trackingId=T20Zc2OBuaflQxrfQzVb6Q%3D%3D