this post was submitted on 12 Jun 2023
245 points (100.0% liked)

Reddit has stopped working for millions of users around the world.

https://www.independent.co.uk/tech/reddit-down-subreddits-protest-not-working-b2356013.html


The mass outage comes amid a major boycott by thousands of the site’s moderators, who are protesting new changes to the platform.

On 12 June, popular subreddits like r/videos and r/bestof went dark in protest against proposed API (Application Programming Interface) charges for third-party app developers.

Among the apps impacted by the new pricing is popular iOS app Apollo, which announced last week that it was unable to afford the new costs and would be shutting down.

Apollo developer Christian Selig claimed that Reddit would charge the app up to $20 million per year to keep operating, prompting the mass protest from Reddit communities.

In a Q&A session on Reddit on Friday, the site’s CEO Steve Huffman defended the new pricing.

“Some apps such as Apollo, Reddit is Fun, and Sync have decided this pricing doesn’t work for their businesses and will close before pricing goes into effect,” said Mr Huffman, who goes by the Reddit username u/spez.

“For the other apps, we will continue talking. We acknowledge that the timeline we gave was tight; we are happy to engage with folks who want to work with us.”

In response to the latest outage, one Reddit user wrote on Twitter: “Spez, YOU broke Reddit.”

Website health monitor DownDetector registered more than 7,000 outage reports for Reddit on Monday.

Some users were greeted with the message: “Something went wrong. Just don’t panic.”

Others received an error warning that stated: “Our CDN [content delivery network] was unable to reach our servers.”


Update: Seems to be resolved for most users

[–] [email protected] 9 points 1 year ago (1 children)

What you're describing is polarization within a community transforming it into an echo chamber, driving out much of the community. Sure, truechildfree formed out of people who still wanted a community based around that aspect of themselves, but they're not the reason for the split - they're a symptom. For every user that made the journey to truechildfree, there are probably 3-10 that just unsubbed, and another 5 that just stopped participating.

My personal example is AITA. It started off as a group judgement based on the morality of the situation, but in the last few years people have become obsessed with "rights". I actually got tempbanned over a situation where a douche told a woman that by joining trivia night in a small town bar she was ruining guys' night. I responded to someone saying "IDK why your bf wasn't happy about how you handled it", and I basically said "yeah, he's the asshole, but clearly this is extremely important to him, and saying screw you I have every right to be here while he storms out didn't just ruin his night, it soured the evening for his friends who tried to stop him. That's not going to make you any friends in your new town, and a little compassion could've defused the situation". It's hard to put into words (and that's just the most salient example, I probably got more negative karma there than everywhere else put together), but the community moved from "what's the right thing to do" to "what are your legal rights".

As far as I know, there's no trueAITA - the community just morphed into something I find toxic. The nuance was gone, and it became something very different from the sub I loved participating in. I almost unsubbed, but instead I'd mostly just start writing a comment, then delete it and move on.

I think fractured, smaller communities help with this more than anything. Humans generally adjust their morality based on their peers - and the bigger the community, the more the loudest voices begin to feel like they're expressing the opinion of the majority.

If 10% of a large community upvotes a certain viewpoint, it takes all of the top slots. It's a weakness of the popularity-based ranking system - a relatively small voting bloc easily dominates the discussion. The moderates just ignore it, because they disagree, but not enough to actually fight it out.
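
To make that concrete, here's a rough toy simulation (the turnout rates are pure assumptions, it's just to show the ranking math) of a 10% bloc sweeping the top slots under straight upvote sorting:

```python
import random

random.seed(0)

N_USERS = 1000
BLOC_SHARE = 0.10   # 10% of users form a motivated voting bloc
N_POSTS = 50
BLOC_POSTS = 10     # posts that align with the bloc's viewpoint
TOP_SLOTS = 10

bloc_size = int(N_USERS * BLOC_SHARE)
scores = []
for post in range(N_POSTS):
    aligned = post < BLOC_POSTS
    score = 0
    # Assumption: bloc members almost always turn out for aligned posts.
    if aligned:
        score += sum(random.random() < 0.9 for _ in range(bloc_size))
    # Assumption: everyone else votes rarely (5%) and spreads votes evenly.
    score += sum(random.random() < 0.05 for _ in range(N_USERS - bloc_size))
    scores.append((score, "bloc-aligned" if aligned else "other"))

top = sorted(scores, reverse=True)[:TOP_SLOTS]
print(sum(label == "bloc-aligned" for _, label in top),
      "of the top", TOP_SLOTS, "slots are bloc-aligned")
```

With those made-up rates, the ten bloc-aligned posts take every top slot, even though 90% of users never particularly favoured them.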

But force people together in a smaller, more diverse group, and they moderate each other. The trick is, you can't do it through polarization - you can't fragment a community based on beliefs or you get echo chambers.

You just have to throw people together and make them talk it out. Opinions naturally balance towards the mean when the group is smaller, and the most cohesive voices dominate when the group becomes large.

[–] [email protected] 8 points 1 year ago* (last edited 1 year ago) (1 children)

Thanks for sharing your perspective with me, I really enjoyed reading it!

You raised an interesting point, the polarising of r/AITA, and it's something I've noticed a few times... I now have a theory:

Communities built around personal experiences are far more likely to move towards emotional extremes.

As these communities get larger, emotionally-invested people settle into 'black and white' morality, with everything labelled as moral or immoral based on each viewer's personal perspective.

I'm not saying our emotions are bad - if anything, many people are martyrs to their own emotional neglect - rather that many of us have not learned how to feel emotions authentically without treating them as objective judgements that justify action. (eg: this happened, I feel angry, therefore you wronged me, therefore I can defend myself, etc)

Humans are empathetic, which is truly wonderful. But we have two types of empathy:

  • affective empathy is driven by our brain's mirror neurons: feeling emotion in response to others' visible feelings. I see you feel sad, so I feel sad for you. It's innate.
  • cognitive empathy is a social skill, one facet of emotional maturity. I recognise that if this were to happen, then somebody in your situation may feel sad, and I understand why. It's learned, primarily in childhood as modelled by our parents.

So, back to your example of r/AITA - the NAH and ESH ratings are likely only being used by those engaging with cognitive empathy, (hopefully) recognising possible biases and advocating for communication that will satisfy both sides, as if they were a third party observing.

But those who engage with their affective empathy project themselves into the story - if the story is evocative, they'll readily side with OP. If the other party's behaviour angers them, they'll readily call them out. They're not here to offer perspective - only judgement.

So what does that mean for communities that want to prevent polarisation?

Haha, fuck if I know, I mostly just find the topic interesting and enjoy having a space to explore it. But I have a couple ideas, and would be curious to hear yours?

On Reddit, we see this black/white emotional judgement in upvotes/downvotes - though they're meant to indicate whether a comment contributes something, they're often used to judge whether a comment is moral according to the voter's values. Without downvotes, a comment that is bigoted can still be blocked/reported; but with them, a comment that says "I think Witcher 3 is boring because-" can be buried.

r/AITA also encourages a degree of absolutism by boiling rulings down to three letters, and groupthink by drawing an ultimate conclusion from whichever one is most popular rather than presenting the full breakdown of votes. Users can feel just and righteous - standing up for the victimised OP, or standing up for OP's victim.

So I don't know if the problem is preventable - it's a human-nature issue - but I would consider some of the following:

  • no downvoting system. It's rarely used in good faith; comments that don't contribute can be reported instead. Comments that are engaging will still rise over comments that are not.
  • active, diverse moderation. Hopefully a diverse enough mod team will slow homogenisation. eg: a mod that likes children will push for rules that discourage/ban anti-child language; a mod that doesn't like children will push for a platform that encourages/allows those struggling to vent. Together they may find guidelines that emotionally validate struggle without perpetuating hate.
  • smaller communities, like you said. Subs like r/childfree are trying to be resource communities (the list of doctors, advice, etc) and have good reason for being large, but social communities are probably better off kept smaller. eg: if they made an r/childfreesupport for venting and emotional validation.

Also, for those of you who read to the end, I really appreciate it. I know I ramble about stuff I find interesting, and despite editing out a bunch of waffle I know this is still really long. Would enjoy reading your equally long responses lol

[–] [email protected] 2 points 1 year ago

Well, first off, I like to write essays too, and I've really been enjoying the fact that people here are way more willing to engage with longer posts.

I think you're onto something with how humans empathize (it's kind of interesting to me that my first response when someone tells me about a conflict is to try to reconstruct the other person's perspective). I think there's definitely a lot to the way people think less critically the more emotional they get.

But to round it all off: smaller communities help, but really it's a matter of self-reinforcing social structures and the ways that social-network mechanisms interact with them.

Outrage is the strongest driver of participation - so posts that incite the most outrage will get far more votes and replies in either direction. People holding the outraged position are far more likely to vote, while people who don't feel as strongly engage much less. That skews the metrics most algorithms use to rank posts, and so they get more visibility.
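
As a rough sketch of how that skew plays out (all the engagement rates below are assumptions, not real platform numbers), imagine a "hot"-style ranking that just counts votes and replies, with outrage multiplying both:

```python
import random

random.seed(1)

def engagement(post, n_viewers=10_000):
    """Crude model: outrage multiplies the odds that a viewer votes or replies."""
    vote_rate = 0.02 * (1 + 4 * post["outrage"])    # assumption: peak outrage ~5x votes
    reply_rate = 0.005 * (1 + 9 * post["outrage"])  # assumption: peak outrage ~10x replies
    votes = sum(random.random() < vote_rate for _ in range(n_viewers))
    replies = sum(random.random() < reply_rate for _ in range(n_viewers))
    return votes, replies

# 20 hypothetical posts; most are mild, a few are genuinely outrage-inducing.
posts = [{"title": f"post {i}", "outrage": random.random() ** 3} for i in range(20)]

ranked = []
for p in posts:
    votes, replies = engagement(p)
    score = votes + 3 * replies   # engagement score, with replies weighted extra
    ranked.append((score, p["outrage"], p["title"]))

for score, outrage, title in sorted(ranked, reverse=True)[:5]:
    print(f"{title}: score={score}, outrage={outrage:.2f}")
```

Sort by that score and the most outrage-inducing posts reliably fill the front page, even though they're a small minority of what was posted.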

As this goes on, the group will shift - the outraged people only need to be a fraction of the group to seem like they're the majority, and people put off by it are more than likely going to leave what looks like a total echo chamber (especially if people get nasty or personal)

The outraged group also starts to feel like their position is actually the average of the group (the "silent majority"), and they might shift even further, becoming more extreme - as people's beliefs are relative to their perception of social norms.

This cycle repeats until it becomes so polarized a moderate opinion is seen as extreme, and might be attacked.

It's a difficult problem to solve - the only easy metrics are going to be votes, comments, and maybe whether people stay or leave after viewing. There are more complex systems that might work - such as using AI to score additional metrics based on content, or (an idea bouncing around in the back of my head for a while) profiling the users to try to boost consensus opinions to compete with "outrageous" ones. Obviously, this is way more computationally expensive and requires complex code that few will be in a position to understand (even if it were open source). These strategies could also be used to drive engagement or ad conversions at the expense of mental health (something that seems to be at least explored by some social media companies).
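
For what it's worth, a much cruder stand-in for the "boost consensus" idea needs no AI or profiling at all - just weight raw engagement by how lopsided the votes are. This is purely illustrative, not how any real platform ranks things:

```python
def consensus_score(upvotes: int, downvotes: int) -> float:
    """Weight raw engagement by how much voters actually agree.

    'agreement' is 1.0 when everyone votes the same way and falls toward 0
    as the vote splits 50/50 - a rough, illustrative proxy for consensus
    that needs no content analysis or user profiling.
    """
    total = upvotes + downvotes
    if total == 0:
        return 0.0
    agreement = abs(upvotes - downvotes) / total
    return total * agreement

# A divisive outrage post with lots of engagement...
print(consensus_score(600, 400))   # 1000 votes, nearly split -> 200.0
# ...ranks below a quieter post most voters agree on.
print(consensus_score(350, 50))    # 400 votes, 7:1 agreement -> 300.0
```

It would obviously be gameable and throws away genuine controversy along with outrage bait, but it shows how little machinery the basic idea actually needs.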

But small groups help in a very simple way - only so much media fits on a page. Even if the top comments are pure outrage porn, the other voices won't be buried.

The other solution is moderation (it's in the name) - effective moderation of the tone and "rules of engagement" can tamp things down. But people generally don't like to be censored, and it doesn't scale - moderators are individuals, and either there's too much for them to go through or dividing it up between larger groups of mods strips the nuance out of the process.