Technology
This is a most excellent place for technology news and articles.
Our Rules
- Follow the lemmy.world rules.
- Only tech-related content.
- Be excellent to each other!
- Mod-approved content bots can post up to 10 articles per day.
- Threads asking for personal tech support may be deleted.
- Politics threads may be removed.
- No memes allowed as posts, OK to post as comments.
- Only approved bots from the list below; to ask whether your bot can be added, please contact us.
- Check for duplicates before posting; duplicates may be removed.
Approved Bots
I'll agree that ISPs should not be in the business of policing speech, buuuut
I really think it's about time platforms and publishers were held responsible for content on their platforms, particularly if, in their quest to monetize that content, they promote antisocial outcomes like the promulgation of conspiracy theories, hate, and straight-up crime.
For example, Meta is not modding down outright advertising and sales of stolen credit cards at the moment. Meta is also selling information with which to target voters... to foreign entities.
The problem is that your definitions are incredibly vague.
What is a "platform" and what is a "host"?
A host, in technology terms, could mean a hosting company that you "host" a website with. If it's a private website, how would the hosting company moderate that content?
And that's putting aside the legality and ethics of one private company policing another private company, one that is also its client.
Fair point about hosts; I'm talking about platforms, as if we held them to the standards we hold publishers to. Publishing is protected speech so long as it's not libelous or slanderous, and the only reason we don't hold social media platforms to that kind of standard is that they demanded (and received) complete unaccountability for what their users put on them. That seemed okay as a choice to let social media survive as a new form of online media, but the result is that for-profit social media, being the de facto public square, have all the influence they want over speech but no responsibility to use that influence in ways that aren't corrosive to democracy or the public interest.
Big social media already censor content they don't like; I'm not calling for censorship in an environment that has none. What I'm calling for is some sort of accountability to nudge them in the direction of maybe not looking the other way when offshore troll farms and botnets spread division and disinformation.