In 2018—long before generative AI turned into the topic du jour in the legal firmament—a judge in the Ontario Superior Court issued what seemed like a prescient if unprecedented opinion.
Upon examining indemnity costs claimed by the prevailing defendant in an occupier’s liability case, Justice Whitten took issue with a $900 line item the defendant’s counsel sought as “legal research” fees. In his ruling, the judge questioned the reasonableness of the fee and declared that “If artificial intelligence sources were employed, no doubt counsel’s preparation time would have been significantly reduced.” It was arguably the first ruling of its kind in which a lawyer was “slammed” by a judge for not using AI.
Fast forward to 2023, and advances in machine learning have led to a step-function change in natural language processing, as evinced by the emergence of large language models (LLMs) such as GPT-4. A key factor behind these advances—the transformer neural network architecture—enables these AI models to make sense of and generate human language text* at a level of sophistication beyond anything we previously expected from machines. From answering complex queries in detail to helping humans overcome the proverbial “blank page problem,” these large language models hold out the promise of freeing all manner of knowledge work from routine and time-consuming tasks. Indeed, their impact on work that involves searching for information across large corpora of data and generating carefully drafted content is yet to be fully understood.
Need to catch up on some of the basic concepts behind generative AI? You can find some definitions in this article.
Legal work is one field where the application of LLMs, though still in its infancy, is being touted but also approached with caution. LLMs raise significant issues around misinformation, hallucination (such as inventing legal citations out of thin air), algorithmic bias, privacy, and security, any of which could pose untold risk in the context of legal work. Even so, LLMs are reportedly gaining traction at some law firms, where they are being used as content generation tools, and a recent survey by the Thomson Reuters Institute found that 51 percent of law firm professionals welcome the use of LLMs like ChatGPT.
Current Challenges with Access to Justice
One of the less discussed dimensions of this impact is the role LLMs could play in widening access to justice. Measured as a portion of gross domestic product, the US legal system is considered the most costly in the world, according to a study released by the US Chamber Institute for Legal Reform.**
In fact, a Washington Post article states that more than two-thirds of critical civil cases in the US involve a party that cannot afford a lawyer. A searing example of the prohibitive cost of legal representation comes from New York City, where 91 percent of petitioners and 92 percent of respondents were unable to afford a lawyer in child support matters, and 99 percent of tenants were unrepresented in eviction proceedings.
The problem isn’t just about affordability. It’s also about availability: according to the American Bar Association (ABA), the majority of the 1.3 million lawyers in the US (as of 2020) are located in metropolitan America. As a consequence, smaller towns and rural counties are turning into “legal deserts” with an acute scarcity of attorneys. For instance, even though New York has more lawyers than any other state, Orleans, a rural county in upstate New York, has only 31 lawyers for its 40,000 residents—three-quarters of a lawyer for every 1,000 people.
Can Generative AI Help Close the Justice Gap?
Given their computational firepower, scalability, and capacity to improve as they are trained on more data, could LLMs be the force multiplier that helps scale the availability and delivery of legal services and ultimately makes them more affordable for the average American?
Here is a sampling of ways in which LLMs could help scale legal work and democratize access to justice:
Document Preparation: LLMs can accelerate the time it takes to prepare legal documents such as contracts, wills, and court filings by generating templates or offering suggestions based on user input.
Drafting Motions: LLMs can be used to expedite the creation of initial drafts, cite case law, advance arguments, and anticipate the arguments the opposing party might make.
Turning Legal Arcana into Simple Language: LLMs can help break down complex legal jargon into simpler terms, making it easier for a layperson to understand laws and regulations.
Legal Research: LLMs can assist in legal research, finding relevant case law, statutes, and regulations, which can save time for lawyers.
Scaling Pro Bono Work: By reducing the time routine tasks require, LLMs could help legal aid organizations and pro bono lawyers extend their services to a larger cohort of the population.
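To make the “simple language” use case above concrete, here is a minimal sketch of how such a request might be composed for an LLM. The prompt wording, the `simplify_prompt` helper, and the choice of model are illustrative assumptions, not a tested or recommended recipe; any real deployment would need a human reviewing the output.

```python
import os

def simplify_prompt(clause: str) -> str:
    """Compose a plain-language rewriting request for an LLM.

    Hypothetical prompt wording for illustration only; careful prompt
    design and human review would be essential in practice.
    """
    return (
        "Rewrite the following legal clause in plain English for a "
        "non-lawyer. Do not add or remove any obligations:\n\n" + clause
    )

clause = (
    "The Lessee shall indemnify and hold harmless the Lessor from and "
    "against any and all claims arising out of the Lessee's occupancy."
)
prompt = simplify_prompt(clause)

# The API call below is a sketch using the OpenAI Python SDK and only
# runs if an API key happens to be configured in the environment.
if os.getenv("OPENAI_API_KEY"):
    from openai import OpenAI

    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    print(reply.choices[0].message.content)
```

Keeping the prompt construction in a small, testable helper like this also makes it easier to audit exactly what is being sent to the model, which matters when the input may contain confidential client material.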
In considering these use cases, however, it’s imperative to keep a human in the loop to validate the LLM’s outputs. It may also be too early to deploy LLMs for complex legal tasks, given how new the technology is. Passing the litmus test for industry-wide adoption will require LLM evangelists to prove that these AI models can perform routine legal work as accurately as, if not better than, humans.
It’s too early to predict the extent to which LLMs could broaden access to justice. To suggest that any of this will happen anytime soon would be foolishly Panglossian, despite the spate of breakthroughs in AI development over the last few months. What we do know, however, is that on a long enough time horizon, technology tends to have strong deflationary effects on the price of goods and services—making them more affordable to a wider group of people.
*While not germane to this post, it is nonetheless important to acknowledge that large language models have also been shown to work with programming languages.
**The study looked at liability cost as a portion of GDP in several developed countries around the world; liability costs were treated as a proxy for more frequent and costly claims.