Lawsites: With Launch of ‘AllSearch,’ Casetext Unleashes Powerful Neural Net Search Technology on Litigation Documents

Some background on Casetext's new "neural" search

Last year, I wrote here about the beta release by Casetext of a powerful search tool, WeSearch, developed using an emerging branch of artificial intelligence known as neural networks, that is remarkably adept at finding conceptually related documents, even when they contain no matching keywords.

Now, Casetext is formally launching that search tool under a new name, AllSearch, and with a focus on helping litigators search large sets of legal documents, including for e-discovery or to search internal databases and repositories, such as brief banks, litigation records, deposition transcripts, and expert reports.

It is now also fully integrated with the Casetext legal research platform, so that a user can simultaneously search primary and secondary legal resources and their own document collections.

“I see this as the most important product launch in the history of the company by far,” Pablo Arredondo, Casetext’s cofounder and chief innovation officer, told me during a demonstration of AllSearch, “because I think it represents our ability to expand well beyond legal research and to bring all that Casetext is to all the other oceans of content out there that need it.”

Neural Network Framework

Before I say more about this new product, allow me to provide some background.

As I explained in that post last year, in 2020, Casetext launched Compose, a first-of-its-kind product that helps lawyers create the first draft of a litigation brief in a fraction of the time it would normally take.

A core component of Compose was Parallel Search, a powerful tool for finding conceptually related cases, even when they contain no matching keywords. As I wrote in another post, Parallel Search could be considered the secret sauce of Compose, using an advanced neural network-based technique to follow you as you draft a brief and automatically provide you with conceptually relevant precedent.

What is remarkable about Parallel Search is its ability to go beyond the kinds of results you would expect from keyword searching, finding conceptually analogous caselaw even when the cases do not use the same language.

As Arredondo puts it, compared to Parallel Search, “what others called natural language search was just casual Fridays in the keyword prison.”

(I offer some examples of the power of Parallel Search in this prior post.)

It is based on transformer-based neural networks, the same general approach behind Google's open-source framework Bidirectional Encoder Representations from Transformers, or simply BERT. Casetext tailored the approach to handle the nuance and scale that litigation requires.
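To illustrate the general idea (not Casetext's actual implementation), transformer models like BERT convert each document into a numeric embedding vector, and "conceptually related" documents are those whose vectors point in similar directions. A minimal sketch, using made-up toy vectors in place of a real transformer encoder:

```python
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm

# Toy embeddings (hypothetical values; a real system would produce these
# with a transformer encoder such as BERT).
docs = {
    "The defendant breached the contract.": [0.9, 0.1, 0.2],
    "The agreement was violated by the other party.": [0.85, 0.15, 0.25],
    "The weather was sunny yesterday.": [0.1, 0.9, 0.05],
}

# Hypothetical embedding of the query "failure to perform under the agreement".
query_vec = [0.88, 0.12, 0.22]

# Rank documents by semantic similarity rather than keyword overlap.
ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
print(ranked)
```

Note that the two contract-related sentences rank above the weather sentence even though they share almost no keywords with the query; that is the behavior keyword search cannot replicate.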

Proof of Concept

When Casetext launched WeSearch, it took that power of Parallel Search and extended it to virtually any collection of documents on which you might want to unleash it.
