this post was submitted on 20 Oct 2024
Futurology

[–] [email protected] 8 points 1 month ago (2 children)

People have tended to think about AI and robots replacing working-class jobs: driving, factory work, warehouses, etc.

When it starts coming for the professional classes, as it now is, I think things will be different. It's a long-observed phenomenon that many well-off sections of the population hate socialism, except when they need it; then suddenly they're all for it.

I wonder what a small army of lawyers in support of UBI could achieve?

[–] [email protected] 14 points 1 month ago (1 children)

The legal profession won't touch it until it's been proven beyond doubt that hallucinations have been completely eliminated. And as for anything Sam Altman says: people rarely regret doubting him.

[–] [email protected] 5 points 1 month ago

Not all lawyers are that smart or careful. And these are just the ones who let the AI do the work without checking or validating anything! For every lawyer THIS dumb, there are hundreds who let AI do the grunt work but actually validate the output for hallucinations. The main problem is that the AI can make them worse lawyers to have, because if the AI misses something in the research, so will the lawyer.

https://www.reuters.com/legal/new-york-lawyers-sanctioned-using-fake-chatgpt-cases-legal-brief-2023-06-22/

https://bc.ctvnews.ca/a-b-c-lawyer-submitted-fictitious-cases-generated-by-chatgpt-to-the-court-now-she-has-to-pay-for-that-mistake-1.6788128

[–] [email protected] 1 points 1 month ago

LLMs are no path to socialism, and being tricked into believing that a small collection of ultra-rich capitalists taking ownership of middle- and upper-class jobs in a more literal sense will somehow bring socialism about is unfortunate. It's neither here nor there, since LLMs will never get that far, but still unfortunate.