If you missed Relativity Fest 2023, you missed a lot. A fan-favorite session each year is our “e-Discovery State of the Union,” in which a panel of industry voices helps distill the current goings-on of our space into fun, tangible insights and lessons learned.
This year’s panel featured a blend of familiar and newer faces:
- David Horrigan, Discovery Counsel & Legal Education Director, Relativity (moderator)
- Ryan O’Leary, Research Director, Data Privacy and Legal Technologies, IDC
- Isha Marathe, Technology Reporter, Legaltech News, ALM
- Stephanie Wilkins, Editor-in-Chief, Legaltech News, ALM
- Joe Patrice, Senior Editor, Above the Law
- Bob Ambrogi, Publisher, LawSites/LawNext
- Greg Buckles, Analyst-Consultant, eDiscovery Journal
- Joy Heath Rush, CEO, ILTA
As in past years, the group used buzzers to respond to David’s questions and liven up the discussion. But this time, they each got two buzzers—allowing for yes/no responses to group prompts before diving into the details.
Now that we’re wrapping up 2023, let’s take a look at what is top-of-mind for experts across the legal technology world today. (And yes, we’re going to talk a lot about AI—but there’s more nuance to it than just that, I promise!)
Collaborative Data and its Many Challenges
“One of the largest issues I see, and it comes up in most inquiries with corporate clients: the sudden boom of collaborative spaces when we went remote. Team growth, workspace growth, Zoom meetings—we suddenly jumped into collaborative content and context, and discovery is trailing, as usual,” Greg Buckles told Fest attendees, when asked about the biggest issues facing the e-discovery world this year. “A particularly big question coming up is: What’s a custodian anymore? A person who created something from a template? Is it the administrator, or even a site or channel? Is it the ‘Last modified by’ user? Who is the owner—or are they all the owners?”
It’s an example of how modern data is presenting modern challenges for even the most experienced and robust legal teams and technologies.
“Our legal tech platforms are generally single-owner, and we don’t have the ability to share ownership; with collaborative data, this is breaking visualizations, analytics, and other functions,” Greg continued. “We don’t have all the answers yet.”
On a related subject, an audience member asked a question about short message data and how it’s handled. The panel got into a lively conversation about whether standardizing on productions of short message data is realistic—do 24-hour blocks of conversations make sense universally? Must it be more nuanced than that? Will this always be something to hash out, case-by-case, in a meet and confer?
There are many opinions on how to handle collaborative data efficiently while gleaning the most insight from it; the debate and innovation we'll all need to confront these challenges are sure to continue in 2024.
The Intersection of AI, Accessibility, and Authenticity
“Are we panicking about deepfakes?”
Moderator David Horrigan posed this question to the group, and the panelists' responses ranged from cool-headed to low-key panicked.
“I’m in the camp of yes, and I’m not an easily panicked person. This is an area where tech intersects personal lives in a way that’s alarming,” Joy Heath Rush told the crowd. “We all need to be more educated about what tech can and cannot do. When it affects our personal lives, it becomes much more real.”
In addition to matters of personal safety, the group was concerned about authentication issues: how can AI-generated content be proven as such if it’s presented as if a human created it?
“We’ve seen an authentication problem over the last couple of years with images that have fooled large portions of people,” Ryan O’Leary noted. “There needs to be some sort of seal built into metadata for authentication purposes to prevent that slippery slope of the robots taking over the world.”
Media coverage of deepfakes—their existence, their risks, their unsettling accuracy—has helped familiarize all of us with the concept. And it helps, certainly, to be aware that not everything we encounter will be what it claims to be.
“This has been the most interesting topic coming out of generative AI, especially as we’ve reported on how the technology affects courts,” Isha Marathe said. “It affects all the characters—jury, judge, lawyers, and e-discovery professionals, as the first line of defense for evidence. Deepfakes are affecting the legal process, outpacing the education around them.”
Indeed, there was significant concern from the whole group that deepfakes could truly shake the foundations of our justice system as we conduct evidentiary proceedings today.
“I’m concerned about the erosion of the concept of truth in evidence in the legal system, and the effect of that on juries. There are many times when you present a document to a witness, they look at it, and, in the past, they’ve had to admit to recognizing it. But we live in a world where you can create a prompt that says ‘in the voice of Greg Buckles, write a document that…’ And if you create a lot of content, it might be difficult to see and remember what you’ve created,” Greg Buckles explained. “From a witness perspective, that messes with your mind. We are going to see AI content intermingled with witness content, and we’ll definitely have people saying, ‘I don’t know if I wrote that; maybe Copilot did?’ And it’ll be an ‘AI ate my homework’ world.”
And even once technology is able to reliably differentiate computer-generated content from man-made content, there may be lingering issues.
“I’m not so concerned that the tech won’t be able to authenticate; I think there will be solutions for that,” Bob Ambrogi chimed in to say. “But I’m concerned courts and litigants won’t have access to that tech, or know about it, or know how to use it. Courts will have problems to deal with there.”
Joe Patrice agreed: “Authentication is going to be a big deal, but I think we’ll be able to figure it out. We’ve figured out other ways of faking evidence before, and we’ll do it again. But access will be a problem. Mercifully, I hope there’s not tons of fake evidence in low-level criminal prosecutions, and prosecutors aren’t trafficking in that sort of thing—but you never know.”
Once again, the “approach AI with caution” refrain rang throughout the room.
Legal Teams Must Approach Cybersecurity from Many Angles
The group was also asked: what about cybersecurity? Is it still an open issue for e-discovery and legal teams?
The answer was a resounding “yes.”
“I think the biggest issue in our space right now is how we deal with cybersecurity issues. AI has become a shiny object we talk a lot more about, but cybersecurity is still probably the most important thing going on,” Joe Patrice said. “It’s an issue for law firms. The abundance of data they have is viewed by the hacker community as a soft underbelly of vulnerability, and that’s a problem.”
Stephanie Wilkins noted that “AI is raising all new cybersecurity concerns.” It’s an evolving area requiring constant attention: “On a practical level, cybersecurity is the biggest concern.”
“We’re concerned about AI and cybersecurity, and we haven’t even gotten basic security out of the way,” Ryan O’Leary agreed. “Hackers are getting better and better, faster than we can keep up.” Especially regarding social engineering attempts, he said, “your average law firm whose main job is not cybersecurity needs to now invest heavily in that effort to protect their clients.”
And let’s not forget that all that shiny new tech can be used by the bad guys, too.
“AI is going to be the most powerful tool in hackers’ arsenal,” Stephanie Wilkins warned.
Is Tech Leveling the Playing Field in Legal?
In the years since discovery went digital, there has been much discussion over whether technology has helped level the playing field between big firms and smaller ones. Does technology’s ability to parse through incredible amounts of data more quickly, and with less manpower, make discovery more doable for smaller teams—or is the cost of using that sophisticated software keeping them divided?
Bob Ambrogi felt strongly that the former is closer to reality.
“I’ve been practicing for a very long time, and the fact is, over the last couple of decades, tech has largely leveled the playing field between solo and small firms and big ones,” he observed. “It has helped those firms play in the big leagues when they want to.”
Still, it takes time, thoughtful investment, and intentionality—from legal teams themselves and software providers, as well as industry think tanks and other supportive organizations—to enable wider access and useful applications for teams of all sizes.
And still, as ever, mindful, well-educated approaches to those applications are key.
“Some applications of tech are not going to lend themselves well to helping David defeat Goliath. The AI situation is a problem because there’s a feeling everyone has that it will be democratizing, and people are already doing all sorts of things with it. But the rub is that some of this technology is still basically ‘mansplaining as a service,’” Joe Patrice said. “The real test will be who builds the best guardrails around it. That’s where I think we’ll start seeing cheap versions that aren’t as safe as expensive ones. It’s problematic because, unlike other tech solutions, where you can see that it’s not working, they’ll all act like they’re working—and it’ll be harder to tell which responses are good and which are bad.”
Thankfully, Ryan O’Leary says he’s seen the kind of mindful development the industry needs from its software providers so far.
“It’s been refreshing that most of them—some excluded—have taken a slower approach to AI development. They realize it’s going to have a significant impact on our world over the next 100 years, and they’re trying to implement it in a way that’s much better and more thoughtful than with the cloud revolution a few years ago,” he said. “We’ve taken lessons and are learning to be more thoughtful, to experiment and not just rush into this new thing. That’s refreshing, but at the same time, orgs of every size need to think about this because if you don’t, it’ll become shadow IT real quick.”
That’s Not All, Folks
Of course, these were just some of the topics covered during our panel. Others ranged from the viability and ethics of non-lawyer ownership of law firms, to insourcing versus outsourcing discovery work, to self-collection, and much more.
So, we ask you: What were your most important e-discovery learnings from 2023? What do you think all this means for what’s coming in 2024? Share your reflections and predictions on LinkedIn using #eDiscoveryStateOfTheUnion.
Sam Bock is a member of the marketing team at Relativity, and serves as editor of The Relativity Blog.