this post was submitted on 11 Feb 2024
16 points (80.8% liked)

Futurology

[–] [email protected] 10 points 8 months ago

Some countries (such as the US) are already oversupplied with law school graduates. The implication of this research is that they will soon be even more oversupplied. Law degrees are expensive to obtain. Apart from tuition costs, you need to devote years to study when you are not earning anything.

One of the obvious questions posed by this research is why anyone should invest tens or hundreds of thousands of dollars in starting to study law in 2024. The old assumption was that the investment would pay for itself in lifetime earnings. That assumption seems to be collapsing around us.

[–] [email protected] 9 points 8 months ago (1 children)

I presume there is no path from law school to senior lawyer/partner/whatever without first working as a junior lawyer.
So replacing the hiring and training of junior lawyers with AI to save some bucks will mean a shortage of senior lawyers in a decade.

[–] [email protected] 10 points 8 months ago

LLMs replacing the entry-level "clerk" work might be great from the financial perspective of existing firms, but you're absolutely right that it'll have long-term effects. In the short term, though, it'll mean more law grads failing to find work, and attrition effects will kick in earlier. So we'll see even more mortgage brokers and real estate agents and such with law degrees who have never practised.

In any professional path there's a sort of junior-years attrition: a kind of professional Darwinism where some make it through the grind of the junior years and others peel off to adjacent careers. This will further front-load that attrition, because the number of junior positions available will be even smaller. But it won't eliminate it.

I experienced this funnel. I'm 40 now and survived it, starting my own business in my profession (geophysics). But the number of colleagues I've seen bail over the years is astoundingly high. Some of them became remarkably successful by leveraging transferable skills from their initial profession, but others just ended up as bartenders. The professions are vastly oversubscribed, sadly.

[–] [email protected] 8 points 8 months ago (1 children)

Yep. Let's bet the law firm on a technology with a hallucinogen habit.

[–] [email protected] -5 points 8 months ago* (last edited 8 months ago) (2 children)

Oh, sweet summer child. This is AI at the equivalent of a few-week-old baby, and it's already better than most people at most things.

Wait until it can improve itself.

[–] [email protected] 4 points 8 months ago

AI has been "improving" itself for years now; that is nothing new or marvelous. Current LLMs are not intelligent; they are datasets and statistical analysis. AI in its current ignorant form has been replacing mechanically inclined jobs ever since automation began, but that is nowhere close to the fantasy-setting AI that fanboys swoon over and imagine.

Maybe one day, but not with what we currently have!

[–] [email protected] 1 points 8 months ago

As one of the folks helping write the AI: bless your heart.

[–] [email protected] 6 points 8 months ago* (last edited 8 months ago) (1 children)

Legal research and writing is only one aspect of practicing law. How will your ChatGPT associate appear in court? Make oral arguments? Stand up to object on the record? Obtain a bar ID number? Pass the bar? Counsel clients? Console a distraught client who has just lost their child, or home, or personal liberty? Search for new business? AI can do a lot of things, but being a lawyer is much more than stringing sentences together with some Latin words thrown in.

[–] [email protected] 6 points 8 months ago (2 children)
[–] [email protected] 3 points 8 months ago

Yeah, OP clearly hasn't used LLMs very much; they can absolutely produce the text necessary for those things. Not reliably right now, mind you - people have been fined for citing made-up case law that ChatGPT invented - but it can do a great first pass.

[–] [email protected] 2 points 8 months ago* (last edited 8 months ago)

Passing the test is not the same as being admitted to practice law. Can you give me the bar ID number of a non-human AI that has sat for and passed the bar in a US state? Last time I took it, it required having a face, a Social Security number and a JD.

I think passing a standardized test, and legal research and writing, are almost certainly things that AI can do better than any human being. But that's a very different statement from saying that people should stop going to law school because AI is being developed. I know if I were facing jail time I would prefer to have a lawyer with biological neurons.

I think fewer people should go to law school because being a lawyer sucks ass, but that's a different discussion.

[–] [email protected] 5 points 8 months ago

There's been a glut of lawyers in the market since the 2008 financial crash, and many people who have law degrees and have passed the bar aren't practicing.
There are too many law schools, with total annual enrollment far in excess of the available jobs.
Don't go to law school.

[–] [email protected] 5 points 8 months ago (1 children)

Those pointing to hallucinations and such are focused on generative AI as it is today. However, it will be vastly different in the 4-6 years it takes someone who starts law school today to graduate. This technology is on a growth curve that is more rapid than most, if not all, we have seen in history.

A lot of the issues in AI today will be mitigated by the time the newly minted attorneys are ready to practice.

[–] [email protected] 3 points 8 months ago (1 children)

Hallucination isn't a solvable quirk of GPTs; it's how they function. You can't get rid of it by throwing more money at the problem; you'd need another idea.

[–] [email protected] 1 points 8 months ago

There are tools to manage major hallucinations, and more are coming: automated fact checking, pattern analysis, multiple layers of analysis, etc.

Yes, there are functional mechanisms that power hallucinations, especially in the probability models. But there are powerful tools that automate analysis of the outputs and rework them for accuracy, and those are likely to improve until they reach a level of trust sufficient for many business use cases.
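
To make that concrete, here's a minimal sketch of one such post-hoc check (the case list, pattern, and helper here are hypothetical, not any particular product): pull citations out of a drafted text and flag anything that can't be verified against a trusted reference set.

```python
import re

# Hypothetical stand-in for a trusted case-law reference set.
KNOWN_CASES = {
    "Smith v. Jones",
    "Roe v. Wade",
}

# Very rough "Party v. Party" pattern, good enough for this toy example.
CITATION_PATTERN = re.compile(r"\b[A-Z][a-z]+ v\. [A-Z][a-z]+\b")

def flag_unverified_citations(draft: str) -> list[str]:
    """Return cited cases that can't be found in the reference set."""
    return [c for c in CITATION_PATTERN.findall(draft) if c not in KNOWN_CASES]

draft = "As held in Smith v. Jones, and reaffirmed in Bogus v. Madeup, ..."
problems = flag_unverified_citations(draft)
if problems:
    # In a fuller pipeline this would trigger a regeneration/rework step
    # rather than letting the draft go straight into a filing.
    print("Unverified citations:", problems)
```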

[–] [email protected] 5 points 8 months ago

The big firms adapt slowly.

They might start using AI, but they'd still have a human review it; they're not going to risk serious money to save what a junior employee makes.

You will see a lot of lawyers thinking AI will let them open a practice with zero employees, but all that does is open up spots at the big firms when they quit.

[–] [email protected] 4 points 8 months ago (2 children)

Is there no risk of the LLM hallucinating cases or laws that don't exist?

[–] [email protected] 6 points 8 months ago

How to use ChatGPT to ruin your legal career.

AI does help with discovery, so firms don't need to spend eight days scanning emails before trial, but they'll still need lawyers and junior lawyers.
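
As a toy illustration of that kind of first-pass triage (hypothetical data and scoring, not any real e-discovery tool), you can rank messages by a crude relevance score so reviewers see the likely hits first:

```python
# Rank emails by a crude keyword-relevance score for human review.
RELEVANT_TERMS = {"contract", "invoice", "termination", "breach"}

emails = [
    {"id": 1, "body": "Lunch on Friday?"},
    {"id": 2, "body": "Attached is the termination notice for the contract."},
    {"id": 3, "body": "Please resend the invoice for the breach claim."},
]

def relevance(email: dict) -> int:
    """Count how many relevant terms appear in the message body."""
    body = email["body"].lower()
    return sum(term in body for term in RELEVANT_TERMS)

# Highest-scoring messages surface first; the judgment calls on what is
# actually responsive still land on the lawyers.
for email in sorted(emails, key=relevance, reverse=True):
    print(email["id"], relevance(email), email["body"])
```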

[–] [email protected] 2 points 8 months ago* (last edited 8 months ago)

GPT-4 is dramatically less likely to hallucinate than 3.5, and we're barely starting the exponential growth curve.

Is there a risk? Yes. But humans do it too, if you think about it, and all AI has to do is be better than humans, a milestone it already has within sight.

[–] [email protected] 3 points 8 months ago (1 children)

People are irrational to start any career that AI can do without advances in robotics, with the possible exception of IT.

But in the long run, we're all just mutated monkeys compared to AI.

[–] [email protected] 1 points 8 months ago

Trades seem like a great option. It's entirely possible we IT people will go before electricians do. Monkeys are very well evolved to climb through tangled environments, and not so much to code.

[–] [email protected] 3 points 8 months ago

The actual employment rate for law grads was depressingly low to start with, IIRC.

[–] [email protected] 1 points 8 months ago* (last edited 8 months ago)

Aside from the AI aspect, law schools all over the world are already producing more graduates than are needed. There is a surplus of lawyers in the world. So first you should do some research about your country and its conditions.

Secondly, I don't think there will be an AI that can take over law-related issues/jobs from humans (lawyers, prosecutors, judges and advocates), because that AI would need to be capable of human consciousness to understand what's going on in a case. If it achieved such capabilities, then it would be sentient and we would be discussing other things than law.

To elaborate on why I think so: laws might seem like mathematical and rigid rules, but in reality the trial process (which includes the judge, prosecutors, lawyers and, depending on your country, a jury) is what decides the outcome, and laws are applied to that process and outcome according to human understanding and interpretation. So, as a lawyer with over 20 years of experience, I don't think any AI will be taking over from humans in the law sector unless it's in a secretarial position.

Lastly, no matter what job you choose, in today's environment it's up to you to distinguish yourself from the rest, because graduation alone means nothing. You should focus on networking and developing yourself to become something of an expert in the area of your choosing; otherwise you'll be competing with everyone else for the same opportunities, and AI will be the least of your worries.

[–] [email protected] 1 points 8 months ago (1 children)

A similar issue applies to many types of degrees. Even without AI there is already a massive oversupply of graduates in many fields, especially in China. So the whole pyramid scheme of universities needs a big rethink.

[–] [email protected] 2 points 8 months ago

I'm with you on this. There's already a systemic issue (well, issues) at play with education. LLM AI might compound it, but it won't create it.