this post was submitted on 26 Jun 2023
55 points (100.0% liked)
Technology
I mean, AI is already generating lots of bullshit 'reports' — er, stuff that 'reports' the news with zero skill. It's glorified copy-pasting, really.
If you think about how much language is rote, in law and the like, it makes a lot of sense to use AI to auto-generate it. But it's not intelligence; it's just creating a linguistic assembly line. And just like in a factory, it will require human review for quality control.
The thing is — and this is what's also annoying me about the article — AI experts and computational linguists know this. It's the laypeople now using (or promoting) these tools, since they've gone public, who don't know what they're talking about and project intelligence onto AI that isn't there. The real hallucination problem isn't with deep learning; it's with the users.
Spot on. I work on AI, and I just tell people, "Don't worry, we're not anywhere close to Terminator or Skynet or anything remotely like that yet." I don't know anyone I work with who wouldn't roll their eyes at most of these "articles" you're talking about. It's frustrating reading some of that crap, lol.
The article really isn't about the hallucinations, though. It's about the impact of AI — that's in the second half of the article.
I read the article, yes.
This is the curation effect: generate lots of chaff, then have humans search for the wheat. Thing is, someone's already gotten into deep shit for trying to use deep learning for legal filings.
It drives me nuts how often I see one smartass in an article's comment section pasting the GPT summary of that very article. The quality of that content is comparable to the "reply girl" shit from 10 years ago.