this post was submitted on 25 Jul 2023
25 points (67.6% liked)

[–] [email protected] 48 points 1 year ago (3 children)

Mastodon is software; the content is hosted by others. What an idiotic, clickbait, and wrong headline.

[–] [email protected] 10 points 1 year ago (3 children)

It's software that also serves as a means of distributing and accessing content. But ultimately it doesn't matter; the resulting pushback will be the same.

The conclusion of the study was basically that the biggest players should enter the fediverse in order to use their capabilities to scan and police it.

Wherever this shit exists, unwanted attention and scrutiny will follow.

[–] [email protected] 7 points 1 year ago

That's like blaming vbulletin for Nazi forums or something.

[–] [email protected] 7 points 1 year ago (1 children)

This will literally do nothing; the people conducting the study obviously have absolutely no idea how federation and the fediverse in general work...

And the big players will be thrown out by everyone else. We are here because we hate them.

[–] [email protected] 0 points 1 year ago (1 children)

the people conducting the study obviously have absolutely no idea how federation and the fediverse in general work...

How do you know that?

[–] [email protected] 5 points 1 year ago (1 children)

Study:

Mastodon = server host

Mastodon = responsible for content

Mastodon = able to moderate

Reality:

Mastodon ≠ server host

Mastodon ≠ responsible for content

Mastodon ≠ able to moderate

[–] [email protected] 0 points 1 year ago (1 children)

Where does the study suggest the Mastodon software's ability to moderate, or that it's responsible for content?

[–] [email protected] 1 points 1 year ago (1 children)

Did you read the headline? Or the article?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

Did you read the actual study that this article refers to?

[–] [email protected] 1 points 1 year ago

The conclusion of the study was basically that the biggest players should enter the fediverse in order to use their capabilities to scan and police it.

Not sure if that would work as users are fleeing from those big players as they don't prioritize the safety and needs of their users.

The contradictory problem is that current major corporations prioritize money at all costs, even at the expense of their users, so their customer base flees to the next-best service or product provider.

People are currently abandoning Reddit and Twitter because their moderation system either doesn't work or has underlying contradictions to what users are asking for.

Facebook launched Threads and people only joined initially due to FOMO. With how transparent they are in harvesting user data at the expense of people's privacy I think (and hope) that people are starting to realize that this is probably not in their best interests.

I think what we're seeing is an evolutionary filtering of the web, similar to natural ecosystems, where the species with the greatest ability to adapt survives.

Based on one metric, it seems that companies structured around proprietary software (zero-sum systems) are unsustainable. This is my untested observation, however, so it could be true currently but systemically wrong once examined and tested.

So the idea that

biggest players should enter the fediverse in order to use their capabilities to scan and police it.

doesn't seem to make the most logical sense as the foundation for those companies is untrustworthy and unsustainable.

[–] [email protected] 6 points 1 year ago

Yup. Might as well blame Nginx.

[–] [email protected] 1 points 1 year ago

Classic Verge

[–] [email protected] 16 points 1 year ago

Windows is the No. 1 OS amongst cocaine, heroin and meth dealers. Time for a crackdown.

[–] [email protected] 15 points 1 year ago (4 children)

And here we go.

This will be one of the Fediverse's biggest obstacles.

We need to get this under control somehow, or else in a few years tech companies, banks, and regulators will decide that a crackdown on the fediverse as a whole is needed.

[–] [email protected] 7 points 1 year ago

The fediverse is the name for services that use ActivityPub - a communication protocol. What you are saying is like saying “tech companies, banks and regulators need to crack down on http because there is CSAM on the web”.
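For anyone unfamiliar with what "a communication protocol" means here, a minimal sketch of an ActivityPub message might look like the following. The actor and instance URLs are made up for illustration; they don't refer to any real server.

```python
import json

# An illustrative ActivityPub "Create" activity wrapping a Note (a post).
# The "instance.example" URLs are hypothetical placeholders.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://instance.example/users/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Note",
        "content": "Hello, fediverse!",
        "attributedTo": "https://instance.example/users/alice",
    },
}

# Independently run servers exchange JSON documents like this one over HTTPS;
# "the fediverse" is just the set of servers speaking this shared vocabulary.
payload = json.dumps(activity)
print(json.loads(payload)["object"]["type"])  # Note
```

The point is that there is no central party to "crack down" on: any server that emits and accepts documents like this participates.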

[–] [email protected] 5 points 1 year ago

A few years? I bet Threads is doing this right now to shut down every private instance and take the fediverse for themselves. They'll argue they're the only ones that can moderate the content, given their size and resources.

[–] [email protected] 2 points 1 year ago (1 children)

You act like the fediverse is one website, and it's not.

[–] [email protected] 1 points 1 year ago

I think they're speaking from the point of view of an uneducated body of legislators and average people who will not understand this.

It doesn't matter what we know the nature of the fediverse to be -- it matters how they perceive it, and uninformed people are perfect targets for this type of FUD

For example, the linked article exists

[–] [email protected] 13 points 1 year ago

And here we go with the corporation funded clickbait folks...

[–] [email protected] 12 points 1 year ago

100% of all child abusers have drunk water in the last two days. Clearly water is the problem here.

[–] [email protected] 8 points 1 year ago

Far-right instances Gab and Truth Social are also technically Mastodon. By this metric, Mastodon also has a Nazi problem.

Any software that allows people to communicate over the internet will be used by horrible people to do horrible things.

[–] [email protected] 5 points 1 year ago

That's like child molesters texting each other, and then saying "TELUS AND BELL ARE PERPETUATING A CHILD SEX TRAFFICKING RING"

[–] [email protected] 4 points 1 year ago

People need to read the actual report; it is reasonable in its findings and actually offers solutions: https://cyber.fsi.stanford.edu/io/news/addressing-child-exploitation-federated-social-media

[–] [email protected] 3 points 1 year ago

Well, then http[s] also has this problem

[–] [email protected] 3 points 1 year ago (1 children)

Shouldn't it be possible to create open-source bots that use the same databases as the researchers to automatically flag and block that kind of content?

[–] [email protected] 2 points 1 year ago

Those databases are highly regulated, as they are, themselves, derived from CSAM.

Apple tried to do fuzzy hashes to download them to devices, and it wasn’t able to reliably identify things at all
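For the curious, the general technique behind those fuzzy hashes (perceptual hashing) can be sketched in a few lines. This is a toy difference hash over an already-downscaled grayscale grid, not any real scanning pipeline; real systems like PhotoDNA operate on much larger inputs and their hash databases are access-controlled, as noted above.

```python
def dhash(pixels):
    """Difference hash: one bit per horizontal neighbor comparison.

    `pixels` is a 2D list of grayscale values; each row is one value wider
    than the hash width (e.g. 8 rows of 9 values yields a 64-bit hash).
    """
    bits = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes (lower = more similar)."""
    return bin(a ^ b).count("1")

# Toy 2x3 "images": the second is a uniformly brightened copy of the first,
# the third has its brightness gradients reversed.
img       = [[10, 20, 30], [40, 30, 20]]
near_dup  = [[12, 22, 32], [42, 32, 22]]
different = [[30, 20, 10], [20, 30, 40]]

assert hamming(dhash(img), dhash(near_dup)) == 0   # survives brightening
assert hamming(dhash(img), dhash(different)) == 4  # all 4 gradients flipped
```

A matcher would compare a candidate's hash against a blocklist and flag anything within some Hamming-distance threshold; the hard part, as the Apple episode showed, is picking a threshold that catches edited copies without false positives.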

[–] [email protected] 3 points 1 year ago

We need more tools, more automation, in order to fight the trash

[–] [email protected] 2 points 1 year ago

Anything that allows people to escape their corporate controls will be ostracized by these people. I laugh in LEMMY

[–] [email protected] 2 points 1 year ago

The Apache foundation has got a huge child sex problem. They must be policed by Microsoft. /s

[–] [email protected] 2 points 1 year ago

They’re basically saying the fediverse has a bunch of creeps lurking.

[–] [email protected] 0 points 1 year ago (1 children)

Just added “Stanford researchers” to my list of stupid people

[–] [email protected] 3 points 1 year ago (1 children)

https://www.aljazeera.com/news/2023/7/20/stanford-president-resigns-following-research-ethics-probe

The president of Stanford University has stepped down in the wake of an independent investigation that found “substandard practices” in research papers he was involved in.

Already plenty of support for your cause.

[–] [email protected] -1 points 1 year ago (1 children)

So they went on mastodon and started searching for CP? WTF is wrong with these sick people? I hope they're on an FBI list now.

[–] [email protected] 2 points 1 year ago (1 children)

So your advice to any organization seeking to minimize illegal activity is to willfully ignore any trace of it?

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago) (1 children)

"I swear, officer, I was just searching for CP to catch OTHER people!"

It would be just as pathetic as that scene from There's Something About Mary. "Yeah, I was just going to pee, too!"

Maybe they had some kind of legal sanctioning to do it, but holy crap, I wouldn't want that in my search history. I would hope software like that has some mechanism where searching for certain terms results in an automatic report to an FBI API somewhere. I actually know of a couple of people who got caught with that stuff. One got 25 years. The other jumped bail and they eventually caught him. I'm not sure if he's been sentenced yet, but I bet he'll get double what the other guy who cooperated got. Those people are creepy AF, and nobody in their right mind would want to be associated with any of it. Those people are 10 times worse than neo-Nazis.

The funny thing is the first guy, everybody could kind of tell he was a creep. But the FBI caught him and he completely cooperated and admitted everything. The second guy, he really seemed like he was going to be the only person in his family who actually turned out to be a decent guy. He was a really sweet kid in a super trashy family. And then all of a sudden everything goes down and everybody is in shock. Then he jumps bail. Last I heard his dad was about to lose his house because he used it as collateral to bail his piece of shit son out of jail.

This open source software needs to include code that reports certain search terms. There are ML algorithms out there that can automatically detect this stuff. Do not search for that kind of content, thinking you're some sort of vigilante. There are ways to deal with this shit without putting yourself in serious legal peril.

[–] [email protected] 1 points 1 year ago

I don't think you understand how a research organization works. This isn't three guys in a basement searching for child porn. It's a research institute at Stanford University. They'll have gotten funding to do the work by applying for federal grants, getting approval from multiple Institutional Review Boards who are charged with, among other things, making sure that the people involved in the research are appropriately taken care of. They'll be required to have counselors on board. However "legit" you think such an outfit might possibly be, multiply that by three.

This is their job. It is the same as if they worked for a law enforcement agency. When someone gets arrested for child porn, we don't also charge the police, prosecutors, and judges who might have to look at the material as part of prosecuting a case. I promise you Stanford isn't paying a team of professors and postdocs to just diddle themselves to kiddie porn all day.

[–] [email protected] -4 points 1 year ago

Reposting because the comments on the earlier post were far too informative in explaining the truth.