this post was submitted on 25 Aug 2023
267 points (90.6% liked)
Technology
Why do we want social media companies to be the arbiters of truth anyway...
Because like it or not, that is where a lot of people get information these days. If it keeps pushing bullshit, people believe bullshit. For example, anti-vaxxers weren't so common until their bullshit was spread all over social media.
I would love for people to be wise enough to verify information against reliable sources and not just believe everything they see, but sadly that's not the world we live in.
Antivax sentiment has been around for hundreds of years, long before the Internet, spread mostly through political rhetoric and/or religion. I'm not saying the spread hasn't increased, but people have believed wrong information all the time.
There is always a nutball, but my point is that, yes, it has increased significantly. Vaccines were a settled matter; people far and wide trusted them. Now vaccination rates have gone down, and diseases we had nearly eliminated are making a comeback. This has happened because now any stupid grifter can have a worldwide platform and a following that actively spreads their nonsense.
I think we need to pursue a strategy that discourages the spread of disinformation while avoiding making the platforms the arbiters of truth.
I think social media platforms are like a giant food court. If you do nothing to discourage the spread of germs, your salad bar and buffets are all going to be petri dishes of human pathogens. That doesn't mean that the food court needs to put in hospital-level sterilization measures. It just means that the FDA requires restaurants to use dishwashers that get up to 71 C, and employees are required to wash their hands.
In this case, I think we should experiment. What if platforms were required to let users flag something as disinformation, and share a credible source if they like? Maybe users could see all the flags and upvote or downvote them. The information would still be there, but you'd go to the InfoWars page and it would say, "Hey: You should know that 95% of people say this page posts mostly bullshit."
Something like that. I don't like the role the companies play currently, but disinformation does carry the potential to cause serious harm.
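To make the idea above concrete, here is a minimal sketch of the crowd-flagging mechanism the comment describes: users flag a post as disinformation (optionally citing a source), other users vote on the flags, and the page surfaces an aggregate warning once most votes agree. All names here (`Post`, `Flag`, `credibility_warning`, the 90% threshold) are hypothetical illustrations, not any platform's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Flag:
    """A user-submitted disinformation flag, optionally citing a source."""
    reason: str
    source: str = ""
    upvotes: int = 0
    downvotes: int = 0

@dataclass
class Post:
    content: str
    flags: list = field(default_factory=list)

    def add_flag(self, reason, source=""):
        flag = Flag(reason, source)
        self.flags.append(flag)
        return flag

    def credibility_warning(self, threshold=0.9):
        """Return a warning banner if at least `threshold` of all flag
        votes agree the post is disinformation, else None."""
        up = sum(f.upvotes for f in self.flags)
        down = sum(f.downvotes for f in self.flags)
        total = up + down
        if total == 0:
            return None  # no votes yet: show nothing
        ratio = up / total
        if ratio >= threshold:
            return f"Hey: {ratio:.0%} of voters say this post is disinformation."
        return None

# Example: 19 of 20 voters agree a flag is valid, so the banner appears.
post = Post("Miracle cure the doctors don't want you to know about")
flag = post.add_flag("Contradicts published medical consensus",
                     source="https://example.org/credible-source")
flag.upvotes = 19
flag.downvotes = 1
print(post.credibility_warning())  # prints the 95% warning banner
```

The key design point is that the original post is never deleted; the community's judgment is just displayed alongside it, which keeps the platform out of the arbiter-of-truth role.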
Remember when social media was deleting news stories about a certain laptop?
Yes?
I can't tell if you're agreeing with me or not.
I am also against deleting valid news about wrongdoing by Democrats, if you're implying this stance is political in some way.
They shouldn't be the arbiters of truth anyway.