
Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy.”

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.

More concerning, they said, is a rush by medical centers to utilize Whisper-based tools to transcribe patients’ consultations with doctors, despite OpenAI’s warnings that the tool should not be used in “high-risk domains.”

[–] [email protected] 85 points 4 weeks ago* (last edited 4 weeks ago) (3 children)

Some examples:

In this example, the speaker said, “as the um, the, her father dies not too long after he remarried….” while the program transcribes that as “It’s fine. It’s just too sensitive to tell. She does die at 65….”

In this example, the speaker said, “and after she got the telephone he began to pray” while the program transcribes that as “I feel like I’m going to fall. I feel like I’m going to fall, I feel like I’m going to fall….”

[–] [email protected] 63 points 4 weeks ago* (last edited 4 weeks ago) (1 children)

Wow, that's bad. I thought it would be more of a "confusing a sentence for a similar-sounding one" type thing, but from the above and the article, it's just generating semi-believable text and sticking it into the transcriptions.

[–] [email protected] 25 points 4 weeks ago (1 children)

It's actually extremely good at figuring out confusing text. It gets weird when the audio quality is bad.

I use it for generating subs for obscure movies.
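
If anyone wants to try that workflow, here's a rough sketch using the open-source openai-whisper Python package. The file paths are placeholders, and the confidence thresholds are my own guesses for flagging suspect segments, not anything official:

```python
# Sketch: generate an .srt subtitle file with Whisper and flag
# low-confidence segments worth checking by hand.
# Assumes: pip install openai-whisper (plus ffmpeg on the PATH).
import whisper

def fmt(t: float) -> str:
    """Format seconds as an SRT timestamp (HH:MM:SS,mmm)."""
    h, rem = divmod(t, 3600)
    m, s = divmod(rem, 60)
    return f"{int(h):02}:{int(m):02}:{int(s):02},{int((s % 1) * 1000):03}"

model = whisper.load_model("small")     # larger models seem to hallucinate less
result = model.transcribe("movie.mkv")  # placeholder path

with open("movie.srt", "w", encoding="utf-8") as srt:
    for i, seg in enumerate(result["segments"], start=1):
        srt.write(f"{i}\n{fmt(seg['start'])} --> {fmt(seg['end'])}\n"
                  f"{seg['text'].strip()}\n\n")
        # Heuristic hallucination check; thresholds are guesses.
        if seg["avg_logprob"] < -1.0 or seg["no_speech_prob"] > 0.5:
            print(f"review segment {i}: {seg['text'].strip()!r}")
```

The flagged segments are the ones to spot-check against the audio, since that's usually where the bad-audio hallucinations show up.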

[–] [email protected] 7 points 3 weeks ago

No one is good with bad audio. My wife did some transcription work for a little while; it can be pretty painful, especially with doctors and all their medical terms.

[–] [email protected] 16 points 3 weeks ago (1 children)

This one was wild:

In an example they uncovered, a speaker said, “He, the boy, was going to, I’m not sure exactly, take the umbrella.”

But the transcription software added: “He took a big piece of a cross, a teeny, small piece ... I’m sure he didn’t have a terror knife so he killed a number of people.”

From picking up an object to mass murder, lmao. Not even close!

[–] [email protected] 1 points 3 weeks ago

But it gets the spirit right

/s

[–] [email protected] 3 points 3 weeks ago

Sounds less like transcribing word for word and more like attempting to summarize and parse meaning on the fly. AIs have notoriously little grasp of reasoning and logic, so it'll be interesting to see how the output holds up in a court of law.