The Legal Industry Has A Long Way To Go Before GPT Matches The Talk
">
When the "Generative AI" hype storm hit a couple months back, the first thing a lot of lawyers thought was "how can we make this all about the unlicensed practice of law?" Too late... it's already passing the bar exam. But the second thing that more savvy lawyers thought was something along the lines of, "I guess this could write first drafts for me?"
Sure! If you want your draft to sass back at you about noted pro-choice jurist Clarence Thomas and the critical importance of 55 U.S.C. § 1823 to your client's claim. If that sounds less than appealing, then maybe it's not a magic draft machine.
That doesn't mean GPT won't ever get there or that it doesn't offer value to the profession today. But the tasks it's going to be doing out of the gate aren't going to match all the speculation, and there are a lot of more mundane moving parts that the industry will have to sort through first.
This year's Legalweek conference should have replaced the swag bag with an opportunity to get a full back tattoo of "GPT" in gothic script across an arrow-pierced heart. AI is certainly a topic worthy of conversation among the community of bleeding-edge legal technologists, but let's not let "GPT-sus take the wheel" quite yet.
While I was meeting with Redgrave Data -- who might be best described as the A-Team of data problems to the extent that if you have a problem, no one else can help, and if you can find them, you should hire Redgrave Data -- Mark Noel drifted into a conversation about GPT putting together that first draft and put some reasonable air brakes on the idea. "[The GPT fervor] is not new... it reminds me of when machine translation came to the market." Machine translation was also billed as the first-pass solution that would speed up human translators. But that wasn't what happened. Instead, the laborious cleanup ultimately cost as much time as it saved. According to Noel, that's what we're likely to see from GPT for at least the short term.
At ABA TECHSHOW, Casetext's Co-Founder and Chief Innovation Officer Pablo Arredondo gave a full session primer on how these advanced AI systems work and explained that the crux of a system like GPT is that it will keep trying to invent an answer that the user likes, even if it requires hallucinating to prove it. Extracting value out of these systems requires hefty amounts of training and carefully designed safeguards.
And that's just to get a product that could work for lawyers. Building something that lawyers will actually use means designing the interface with the professional skeptic in mind. When Casetext unveiled CoCounsel, the most impressive aspect we identified wasn't its smarts so much as its restraint. Knowing what it didn't know and prompting attorneys to double-check everything it couldn't guarantee struck me as the key challenge for every legal application going down this road.
Dan Broderick of BlackBoiler described this same interface philosophy in the contract review space. With the company's new ContextAI feature, it hopes to "remove the black box and give an explainable reason for the edits." The markup functionality provides the reasoning behind every redline, giving the attorney an opportunity to check how the AI took the in-house contract playbook and past examples and arrived at the answer it did. "This is necessary to build trust. Lawyers want to understand what's happening. They want to be in control."
Firms and legal departments maintain a lot of vetted historical knowledge in their systems, so it's unsurprising that AI is a high priority for document management. The last time I spoke with NetDocuments, they had just unveiled PatternBuilder to help automate attorney processes. As they see it, the automated future involves taking that knowledge and delivering it seamlessly into the attorney workflow. As artificial intelligence becomes more robust, attorneys aren't going to want to adjust their processes to meet the AI; they're going to need AI to meet them.
But a lot of banked knowledge can also be a dangerous thing for an AI.
It's certainly better to draw intelligence from a firm's files than from the interwebs, but Broderick offered the example of a buyer-side agreement. How does the AI understand not to pull answers from the thousands of seller-side agreements in the repository?
That's going to take organization. While catching up with Rich Hale of ActiveNav about the concept of Data Mapping as a Service, we got into a discussion about the importance of cleaning up data for the looming AI world.
Generally, ActiveNav's customers are wondering, "How do I get good at holding less data, at minimizing data spillage across the organization?" They want to keep costs controlled and to minimize the risk of losing PII in the pile and ending up on the wrong end of a costly breach or regulatory shortcoming. In short, they want to ensure there's "Zero Dark Data," as the company puts it.
But what happens when a hallucinating AI drafts something that gets mistakenly added to the firm's knowledge base? How does the system know which data to draw from and which data spawned from the other side's undesirable markup? Figuring out what's what within the system is going to be a big deal for keeping AI on the straight and narrow.
GPT can perform legal tasks now, but the tasks people expect of it will take a while to arrive. As impressive as it is at clowning the bar exam, or effectively summarizing and comparing long documents, or matching language to deal playbooks -- all important tasks -- getting to a tool that can provide an actionable first draft will take time, effort, and organization. Until then, the most responsible approach requires building an interface that quickly and accurately aids the lawyer in understanding how the AI reaches its conclusions.
Still, this is a significant moment. Brian Meegan of ProSearch compares it -- and the simultaneous explosion of communication methods and emojis and the slow death of forcing every square peg communication into the round hole of email -- to the early days of eDiscovery or even the introduction of the computer. It's going to take some distance to understand the paradigm shift, but it's unfolding.
It's not what GPT -- and its competitors -- can do today, but what they will do in 10 years that we'll consider entirely routine.
Earlier: New GPT-4 Passes All Sections Of The Uniform Bar Exam. Maybe This Will Finally Kill The Bar Exam.
Legal AI Knows What It Doesn't Know Which Makes It Most Intelligent Artificial Intelligence Of All
Joe Patrice is a senior editor at Above the Law and co-host of Thinking Like A Lawyer. Feel free to email any tips, questions, or comments. Follow him on Twitter if you're interested in law, politics, and a healthy dose of college sports news. Joe also serves as a Managing Director at RPN Executive Search.