this post was submitted on 19 Dec 2024
21 points (100.0% liked)

Technology


Which posts fit here?

Anything that is at least tangentially connected to technology, social media platforms, information technologies, and tech policy.


Rules

1. English only: Title and associated content have to be in English.
2. Use original link: The post URL should be the original link to the article (even if paywalled), with archived copies left in the body. This helps avoid duplicate posts when cross-posting.
3. Respectful communication: All communication has to be respectful of differing opinions, viewpoints, and experiences.
4. Inclusivity: Everyone is welcome here regardless of age, body size, visible or invisible disability, ethnicity, sex characteristics, gender identity and expression, education, socio-economic status, nationality, personal appearance, race, caste, color, religion, or sexual identity and orientation.
5. No ad hominem attacks: Any kind of personal attack is expressly forbidden. If you can't argue your position without attacking a person's character, you have already lost the argument.
6. No off-topic tangents: Stay on topic. Keep it relevant.
7. Instance rules may apply: If something is not covered by community rules but is against lemmy.zip instance rules, the instance rules will be enforced.


Companion communities

[email protected]
[email protected]




Large language models continue to be unreliable for election information. Our research was able to substantially improve the reliability of safeguards against election misinformation in German in the Microsoft Copilot chatbot. However, barriers to data access greatly restricted our investigations into other chatbots.

top 4 comments
[–] [email protected] 3 points 2 days ago* (last edited 2 days ago)

Stop using a helicopter to mow grass.

[–] [email protected] 5 points 2 days ago (1 children)

And everything else.

People who love jerking off to AI probably wouldn't care if their calculators were 'close enough'. Or if their bank statement balance looked likely to be true.

[–] [email protected] 1 points 2 days ago

Maybe the real intelligence was the hallucinations we made along the way.

[–] [email protected] 3 points 2 days ago

Until you can explain the entire logic path from input to output, even when the AI makes a logical mistake, you can't trust the data.

I haven't seen any results on another important requirement: an AI that can 'forget' or discard information that is bad.