When they say that a company “has an army of lawyers,” or that Disney has more lawyers than animators and things like that, do they really? Is an army of lawyers actually effective? Do companies truly have an “army” of lawyers to draft and sign documents?
Basically it means that they can handle lots of cases at the same time while still giving each one as much attention as it needs. Winning or losing a difficult case can often come down to how much time and expertise you can put into it. When you have a lot to lose, would you rather have a team of lawyers, each specializing in a different aspect relevant to the case, or a single overworked lawyer who has to prepare a different case after lunch?
The scope and visibility of the case matter as well. Complex cases require lots of lawyers with different specialties looking at them from different angles.
It's similar in engineering: you want more than one engineer working on a really big, complex project. I worked with a firm back in the day that designed a stadium; they had a whole floor of their HQ devoted to engineers who worked only on that project.
I would imagine it’s only a matter of time before AI can do the majority of the work for law firms. I’ll have to ask my IP lawyer friend about this.
Yes lots of contract and doc review billable hours are going to go away. It’s going to be devastating.
Doc review is pretty awful though so maybe it’s for the better.
Lawyers who have tried to use AI so far have lost their cases miserably.
Because we don’t actually have AI. We have people following paint by numbers, not artists.
True AI, and not the glorified pattern-matching we have now, will be more effective than any lawyer.
Who downvoted you? I’ve been arguing the same thing since AI became the buzzword of the decade. No one seems to understand what artificial intelligence actually is, or how these current systems are anything but. They aren’t even really a step in that direction, because the underlying software and hardware aren’t anywhere near ready to emulate a human brain, or even a lower animal’s.
Oh, you mean that thing that hasn’t been proven possible yet?
Two years after Wilbur Wright despaired that man wouldn’t fly for fifty years, he and his brother achieved the first successful test of powered flight. Their flight lasted 12 seconds, covered 120 ft, and reached a top speed of 6.8 mph.
The SR-71 Blackbird, flown 61 years after the first powered flight, had a top speed of 2190mph and had a range of 2,500 miles.
True AI will happen unless temporary stars are all the rage.
Yeah, the Wright brothers were great, but (it pains me to say this as a Daytonian engineer) they were also completely full of themselves. There was good reason at the time to believe heavier-than-air flight was not only possible but imminent. Lighter-than-air flight was not only already happening but had been used in conflicts; there was even a hot air balloonist involved in the Paris Commune.
But my doubts are about the possibility, immediacy, and practicality of an artificial device having human-level or greater cognition in ways that mimic organic brains. These questions aren’t me just being some doubter (though doubt is valid, given the sheer resources being thrown at these systems and the way we’re being asked to leave problems to them rather than seeking more immediate alternatives); they’re based on discussions with artificial intelligence specialists who don’t have a financial stake in the technology.
That’s because we only hear about AI being used by lawyers when they use it wrong and it hallucinates a case that doesn’t exist, and then they don’t actually verify the case themselves.
I’m sure lawyers are already using it successfully, we just don’t hear about successful cases.
And right now they’re using general-purpose LLMs. I’m sure we’ll get models actually focused on legal knowledge in the future that will do much better than the current ones.
First off, it’s not AI, it’s an LLM: basically a better way to collate and search data. It’s a tool they should be using for research, but they’d better not be using ChatGPT or any of the other publicly available ones. I would hope that by now someone has launched, or is working on, one trained on data from law books, existing case law, etc. You could then also feed it any discovery documents that come in, and it could help highlight what’s important.
[citation needed]
Though I’m sure your LLM could hallucinate some for you!
I love that term “hallucinate”.
That’s as big a euphemism as the word “faith,” and like “faith,” it’s used to mask glaring operational deficiencies. It reminds me of the time I test-drove a used car with a clear steering issue, which the salesman called a “shimmy.”
Do I need to define collate? Maybe it wasn’t the best choice of words, but the point still stands: the quality of the output is always relative to the input. That’s why a growing number of companies are training their own LLMs on data from their own databases instead of trying to rely on external datasets.
For the record, I’m not talking about ones you can ask a question and get an answer from. I was talking about law firms using a local or privately hosted LLM to scan through discovery documents and find keywords, or related keywords, that may be relevant to the case they’re working on. Especially now that a lot of discovery is digital.
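Even without an LLM, the basic idea is easy to sketch. Here's a minimal, hypothetical example in Python that scans a folder of plain-text discovery documents for case-relevant keyword stems (the folder layout, keyword list, and function name are all made up for illustration; a real pipeline would handle PDFs, OCR, and far smarter matching):

```python
import re
from pathlib import Path

# Hypothetical keyword stems; "terminat" also catches "terminate"/"termination".
KEYWORDS = {"indemnif", "breach", "terminat", "liabilit"}

def scan_documents(folder):
    """Return {filename: [matched keyword stems]} for each .txt doc in folder."""
    hits = {}
    for path in Path(folder).glob("*.txt"):
        text = path.read_text(errors="ignore").lower()
        found = sorted(k for k in KEYWORDS if re.search(rf"\b{k}\w*", text))
        if found:
            hits[path.name] = found
    return hits
```

The value of layering an LLM on top would be matching *concepts* rather than literal stems, so a document saying "ended the agreement" still surfaces for "termination."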
I can’t give more detail than this because it may not be public yet, but I’m aware of one company working on its own LLM to let clients more easily find information published on its platform, the kind that would take longer to skim through than to just ask a search engine.
All AI does is determine the probability of the next word that’s about to be said.
There definitely will come a time when an AI can craft legal thought, but it is a long, long time off.
Source: I’m a legal tech who’s actually helping my firm test legal gen AI platforms, all of which produce information that can’t be relied upon without human validation.
I currently use Copilot to help me solve problems with Microsoft Power Platform. “AI,” the generic, misleading word we use to describe advanced search, can scour the web and solve a problem for me in seconds. It’s only a matter of time before a more advanced algorithm “learns” all the case law that has ever existed. At some point, I imagine, you’ll be able to type in all the details of your case and the machine will find all the applicable court cases in a matter of minutes. It’s still up to the lawyers to apply and utilize this, but the research, the stuff that takes an army of lawyers, will be done in a fraction of a fraction of the time.
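The "type in your case, get back similar cases" step is, at its simplest, a similarity search. A minimal sketch in Python, matching a case description against a tiny set of invented case summaries by word-overlap cosine similarity (all case names are fictional; real legal research tools use embeddings and much richer ranking):

```python
import math
from collections import Counter

# Invented case summaries for illustration only.
cases = {
    "Smith v. Jones": "contract breach over late delivery of goods",
    "Doe v. Acme": "negligence claim after workplace injury",
    "Roe v. Widget Co": "breach of contract and failure to deliver goods",
}

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * \
          math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def find_similar(query, top=2):
    """Rank cases by similarity to the query description."""
    q = Counter(query.lower().split())
    scored = [(cosine(q, Counter(s.lower().split())), name)
              for name, s in cases.items()]
    return [name for _, name in sorted(scored, reverse=True)[:top]]

print(find_similar("breach of contract over goods"))
```

Word overlap is a crude proxy; swapping the word counts for sentence embeddings is what lets modern systems match "failure to deliver" against "late delivery."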
I said it’s a matter of time, not that it’s happening now or will happen tomorrow. But I do believe it will happen in our lifetime, very likely within the next ten years.
Yep. I use ChatGPT myself to help with research on coding and other issues.
I don’t expect AI is going to replace humans anytime soon, but using it is going to be an essential skill, and people and companies who don’t learn how will definitely go extinct.