Comments


Musicferret t1_iu6ia5i wrote

No kidding! You can flag it, and it is virtually NEVER taken down. There needs to be some type of cumulative count of infractions by accounts that do this constantly. On reddit, mods see a log of all infractions in a particular subreddit. On facebook, it appears that every one of these “less bad” offences like misinformation is somehow viewed in a bubble.

Seriously, once the person has spent months spreading hate and obviously false information, why aren’t they banned permanently?

Actually, that’s a rhetorical question. Answer: $$$$$$$

28

gerkletoss t1_iu75f9w wrote

To get neonazi comments taken down from reddit you basically have to write a 3 page essay explaining the problem in the appeal. Ask me how I know.

15

cassy-nerdburg t1_iu7f8lx wrote

How do you know?

8

TigBiddiesMacDaddy t1_iu7i8ld wrote

Reminder that in the cats subreddit there was an entire civil war because one of the notorious five mods (a group of mods who are head mods in almost EVERY subreddit) called out another person in their group for basically empowering a neo-nazi in multiple subreddits.

14

CrypoFiend t1_iu9hxxz wrote

They just switch locations on their VPN and create a new account

1

Musicferret t1_iua3n2g wrote

See, that’s fine. Right now, nothing happens at all. After 3-4 times, it should be an auto ban. Make them recreate accounts every day or two.

1

Nostradamaus_2000 t1_iu66zkz wrote

Social media is 100% a failure. I still wonder how the fact checkers got their degrees! Or the keyboard warriors. Then let's tell the same lie 10 different times. Certainly a platform of hate mongers. It takes time to verify any truths these days. Platforms should be held accountable for the hate they spread, including the lies.

25

RealisticCurrent2405 t1_iu7cq2y wrote

Until AI gets near sentient, good luck. It’s a volume thing. It wouldn’t be profitable to go through every post from everyone’s racist uncle.

3

Socko-The-Sock t1_iu74ldj wrote

got a threat DM on my personal facebook back when i had it. i reported the message and got an auto-reply that basically was that tyler the creator “turn off your computer” tweet. same with a lot of instagram comments and messages i’ve received.

25

ButtFuckingGermans t1_iu8r5gy wrote

Same thing happened to me on reddit last year. Some unhinged guy was DMing me saying he would rape my newborn. I reported it, and reddit said they found no rule violations.

10

lotusflower64 t1_iu9jo6x wrote

Can you report it to the police?

1

Bully-Rook t1_iu7ngjg wrote

Facebook is a toxic cesspool. Left and never looked back. Not sure how the hell so many people stay there "for the marketplace".

9

CrypoFiend t1_iu9hrx2 wrote

That will change as sellers now have to KYC. Imagine getting an audit from the IRS because you didn't report the used crib you sold to a neighbor for $20.

3

BigfootSF68 t1_iu9k21f wrote

It is the mall that won't die. Just because it is there.

The kids who didn't get to use America Online get to be stuck on facebook. Too bad they didn't get to use Myspace.

2

dontpet t1_iu63n7q wrote

Reddit seems to fall into a different camp from what I can tell. There are subs that are pretty hairy but they seem more like a ghetto than a haven for hatred.

20

unresolved_m t1_iu6472k wrote

Reddit seems to have some strange bots/shills. I remember getting a lot of negative comments when talking about cops and minorities going missing.

16

HotpieTargaryen t1_iu67ojx wrote

The moderator system encourages misinformation and hate in many subs and there is no real oversight.

15

cleuseau t1_iu68yxn wrote

This is BS. They ban subs all the time. This is Musk pushing a narrative that YoU cAnT CoNtRoL ThE InTeRnEtS.

Yes they fail sometimes but it gets filtered. It is easy to filter. Do your F***ING job and filter the hate and disinformation Musk.

−7

HotpieTargaryen t1_iu6avxs wrote

The reddit mod system is passing the buck and not taking responsibility. I do believe you can control for misinformation and hate on the internet, but not by giving control to random groups of anonymous people that squatted on a topic first and happen to be slightly more clever at promoting toxicity than the true sewage subs that get banned. Twitter has no excuse.

8

grrrrreat t1_iu68vvf wrote

Still easy to game if you want to push an agenda

14

OcculusSniffed t1_iu6dess wrote

Reddit is primarily intended to be moderated by upvotes and down votes, which gives power to the users. Some subs are overrun but new ones spring up constantly and there's an interesting lifecycle there.

Facebook, Twitter, Instagram, and any site with an algorithm that moderates content seems to think that user interaction is a sign of worth. So any piece of wrong information that attracts a response, negative or positive, gets promoted. I see constant promoted ads of things that are blatantly incorrect or "tips" that just won't work, and they are flooded with comments about how they are bullshit. But that just pushes them further up the rankings.

Reddit isn't perfect, but it's a shit ton better than that.

12
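The contrast this comment draws — vote-based ranking versus raw-engagement ranking — can be shown with a toy feed sorter. This is a deliberately simplified sketch, not either platform's real algorithm; all the scoring rules and numbers are invented for illustration:

```python
# Two toy feed-ranking rules. An engagement-ranked feed promotes a post
# that drew mostly downvotes and angry comments, while a net-vote feed
# buries it. All figures are made up for illustration.

posts = [
    {"title": "cute kitten", "ups": 90, "downs": 5,  "comments": 10},
    {"title": "blatant lie", "ups": 20, "downs": 80, "comments": 200},
]

def net_score(p):
    # Reddit-style: voters decide worth; downvotes count against a post.
    return p["ups"] - p["downs"]

def engagement(p):
    # Algorithmic-feed style: any reaction at all counts as value.
    return p["ups"] + p["downs"] + p["comments"]

by_votes = sorted(posts, key=net_score, reverse=True)
by_engagement = sorted(posts, key=engagement, reverse=True)

assert by_votes[0]["title"] == "cute kitten"       # community buries the lie
assert by_engagement[0]["title"] == "blatant lie"  # outrage pushes it to the top
```

Under the engagement rule, the flood of debunking comments the commenter describes only raises the post's score.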

snap-erection t1_iu6vug1 wrote

Reddit is designed more around classic forum websites, to me it feels more like the next iteration of that as opposed to going from one person's profile to the next. It's also more about the content itself and not about the user.

3

HotpieTargaryen t1_iu67gof wrote

Yeah, but it’s the unmonitored sub/moderator system that allows this garbage to continue. r/politics mods have rules in place to protect political troll accounts and ban anyone that calls them out and there is zero oversight. Reddit isn’t better, it just hides the ball better.

9

verasev t1_iu68c6k wrote

I just got banned from there because I told one of those trolls they were full of shit after they said folks in r/politics were all college-educated pussies who never do any real work like blue-collar GED graduates do.

5

DarkDeSantis t1_iu64hfl wrote

Half of reddit has been banned, from both a user and a sub standpoint; it falls into the exact same camp. Censorship directly leads to bigotry and hate. Without checks and balances we are all doomed to extremism.

5

dontpet t1_iu65otg wrote

I'm so for keeping disinformation off of social media. The devil is always in the details, but there is objective truth out there, and there are dishonest, malevolent actors, and we are all better off pushing back at various levels.

14

NaturalNines t1_iu66ocd wrote

Problem is that's such a subjective standard. And reddit moderators are known for being of their own ideological slant.

If "disinformation" is defined as "I disagree" or "I'm more skeptical because I disagree, thus I'll enforce a higher standard," that is toxic and shuts down the conversation. This creates a bubble that helps to radicalize and further creates disinformation by presenting only limited information.

−3

bluekittydaemon t1_iu79929 wrote

As long as we're talking about fact checking as a part of moderating and not "we have to let white supremacists talk for ten minutes bc you got ten minutes".

5

NaturalNines t1_iu8j1gy wrote

As long as we're talking about actual white supremacists and not "I disagree so you're a white supremacist".

Goes both ways.

1

smitbret t1_iu6acir wrote

I see you're getting downvotes for being open minded.

Wouldn't be Reddit if it was any other way

−3

NaturalNines t1_iu6b3sr wrote

According to upvotes I'm an incredibly nice person on every different subreddit... until politics. Then I'm an irredeemable asshole because I disagree.

And these radicals think that's convincing.

2

Marsdor t1_iu6d539 wrote

It's why I don't even post on political subs, just a bunch of doomsayers with no idea how to help things out.

3

NaturalNines t1_iu6efee wrote

I have very recently learned that lesson. I tried asking questions, just simple questions. They respond with such hostility I eventually would give a "You're a nutjob" response and BAM. Banned permanently. I'd even request why, pointing out that it took constant aggression that they ignored for me to finally insult back. And I got another insult from a mod, calling me a child and asking me why the rules don't apply to me while ignoring the rest. I check out that sub later? Nothing but constant violations of what they banned me for.

Social media is not for disagreement or free thought anymore. The lunatics took control of the asylum for now.

1

Marsdor t1_iu6fg0g wrote

Makes me miss the early 2000s, when people could disagree and not be labeled some ist or phobe, and people could actually debate and change others' perspectives. Nowadays that seems a distant reality in our current social landscape.

2

NaturalNines t1_iu6g0dx wrote

I mean.... there were problems then. I'm not sure I appreciate having been able to stumble across execution videos and other such horrible shit when I was a teenager.

But debate? Different story, haha. Yes, definitely prefer being able to have discussions. Voice an opinion, wrong or right, it got corrected, we all called each other every name but didn't care, free speech is great.

Bunch of crazy disinformation! Like remember how everyone heard that Marilyn Manson removed a rib to be able to suck his own dick? Fucking crazy! Oh well.

1

Marsdor t1_iu6gcui wrote

Yeah, it was a little on the rough side, and I'm glad we can filter the extreme unwanted content out, but we're definitely missing something from then that worked better than what we have now.

2

_ilovemen t1_iu81n56 wrote

It’s called being young lol. You’ve just gotten older and now notice way more stuff.

2

Practical_Law_7002 t1_iu661vc wrote

I'd argue it's actually different because of the upvote/downvote system and seeing comment history and such since we collectively as users can weed out trolls easier than say Facebook.

10

TheKrakIan t1_iu6ebzn wrote

Maybe with Musk taking over twitter and FB losing money social media will finally fall off the face of the earth.

Maybe Tom knew what was gonna happen and just decided to bail.

16

whiplash81 t1_iu6erwf wrote

TikTok enters the chat

2

TheKrakIan t1_iu6f2zb wrote

TikTok has always been one thing, a way for China to collect data.

Not that all the other SM channels don't already have that and do inappropriate things with it.

NM, let TikTok fall as well.

3

whiplash81 t1_iu6fkq8 wrote

Trust me I'd love to see a downfall to all these social media giants.

I just don't think it'll happen without another one rising up to take its place.

6

banananailgun t1_iu6irmh wrote

Social media sites don't deal well with hate speech and disinformation because hate speech and disinformation make them lots of money. The truth is just plain boring to a lot of people.

Just one easy example: It's so much more exciting for a lot of people to believe that there is a massive, nationwide (potentially international) conspiracy to steal the election from Trump than to accept the boring, hard truth that he simply lost the election. And think of the content that comes with both expanding and refuting the lie. The 2020 election was a godsend to news and social media because they can keep getting advertising dollars for it without having to discover and market a new story.

12

snap-erection t1_iu6wmdt wrote

No, the 2016 election was the godsend. News and social media companies absolutely love Trump for the 24/7 insane news and discussion that comes from every time the guy opens his bitch mouth. And people really think the media has a bias against him. You wish. If they were against him you wouldn't hear about him. Look at the coverage Bernie Sanders got in both 2016 and 2020. Very comparable to that of Ron Paul. When the media really doesn't want a candidate (or even movement) they will straight up blacklist them. At best there's the occasional pity article that goes like "he's got a movement with some support from (insert most unrelatable people to blue collar Americans) and his policies are nice but here's why it won't work etc".

7

sup_ty t1_iu9r6nt wrote

They're just going to do what benefits them and gains them more made up power and money.

2

suarezMiranda t1_iu8gdmg wrote

You have things slightly backward. Whenever people say that they get money from hate speech and disinformation I always think of the underpants gnomes in South Park.

Step 1: Destroy civilization. Step 2: ?? Step 3: Profit

It lacks logic, and becomes flat out wrong when you consider that advertisers don’t want their ads beside negative sentiment. All things being equal, if you had two social media networks with similar demographics, advertisers would prefer their ads to be on the one that makes people happier and feel better. This has nothing to do with morals and everything to do with profit and sentiment analysis in marketing.

The issue is that people are inherently incapable of freely sharing and consuming information responsibly. They do not check against information that confirms their biases. Echo chambers form on a scale that is not possible to police without a state-sized apparatus. They are betting that ML models will solve this, but I don’t think that will be possible in the foreseeable future. I’m not worried that social media giants are evil and want to destroy the world. What worries me is that they don’t want this, are actually spending colossal sums of money on it, and are failing.

When Facebook, Twitter, and YouTube were young they were actually credited with aiding occupy movements and the Arab Spring. One Egyptian activist talked about Facebook specifically as a media where they could spread hope, and that the generations they were fighting against didn’t have the understanding to fight back. Well they developed that understanding somewhere along the way. That is how these platforms are built. That is the inevitable function of their form unless the state claims control and places it under their own moderating apparatus. Whether Zuckerberg is a saint or a lizard has 0 to do with the outcome of this type of technology.

4

lotusflower64 t1_iu9kfjd wrote

True. Check out the NewsBreak app one day if you have the time. Chock full of all kinds of hate imaginable. They never take anything down.

2

CrypoFiend t1_iu9qjsf wrote

Governments could use blockchain technology for elections. Then, within seconds, any claim of a rigged election would be defeated.

−1

banananailgun t1_iu9qxfn wrote

The people who believe the 2020 US election was rigged were fine with the outcome of the 2016 election. None of this had anything to do with election integrity, and everything to do with not liking the outcome of the election.

3

CrypoFiend t1_iua0fbu wrote

That has nothing to do with wanting to improve security and clarity and reduce misinformation.

Unclear why you downvoted me, but I will return the favor.

−1

banananailgun t1_iua11j4 wrote

Because no amount of evidence will make the mob believe that elections in the West are fair. That's why I downvoted you.

We could try your utopian blockchain plan, and hard core Trumpers would still cry foul until he was president again, even if they had to cheat to get him there.

3

CrypoFiend t1_iuaeocw wrote

That may be true. But it would be the hardest evidence available. Every voter would have a cryptographic key when they register. You could see every individual vote, and see how candidates fared by age, sex, and location. It is basically indisputable.

I do agree that some people will never accept the truth no matter how many facts are presented.

0
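The verifiability idea in the comment above — every recorded vote publicly checkable, any tampering detectable — can be illustrated with a toy append-only hash chain. This is a drastically simplified stand-in for a real blockchain (it ignores anonymity, key distribution, and consensus, which are the genuinely hard parts); all names and structure here are hypothetical:

```python
import hashlib

def h(data: str) -> str:
    """SHA-256 hex digest of a string."""
    return hashlib.sha256(data.encode()).hexdigest()

class VoteChain:
    """Toy append-only ledger: each block commits to the previous one."""

    def __init__(self):
        self.blocks = []  # list of (vote_hash, chain_hash) tuples

    def cast(self, voter_key: str, candidate: str) -> str:
        vote_hash = h(voter_key + ":" + candidate)
        prev = self.blocks[-1][1] if self.blocks else ""
        chain_hash = h(prev + vote_hash)
        self.blocks.append((vote_hash, chain_hash))
        return vote_hash  # receipt the voter can look up later

    def verify(self) -> bool:
        prev = ""
        for vote_hash, chain_hash in self.blocks:
            if h(prev + vote_hash) != chain_hash:
                return False  # tampering breaks every later link
            prev = chain_hash
        return True

chain = VoteChain()
receipt = chain.cast("voter-key-123", "candidate-a")
chain.cast("voter-key-456", "candidate-b")
assert chain.verify()

# Altering a recorded vote is detectable by anyone re-checking the chain.
chain.blocks[0] = (h("voter-key-123:candidate-b"), chain.blocks[0][1])
assert not chain.verify()
```

The point of the sketch is only the tamper-evidence property the commenter is describing: anyone holding a copy of the ledger can recompute the chain and catch a modified vote.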

Inphexous t1_iu69f1k wrote

Their concern is only running ads and making money. There's nothing social about it.

10

SpotifyIsBroken t1_iu657qb wrote

Twitter is about to be one of the worst when it comes to this (it's already bad).

8

jackflour449 t1_iu6ah06 wrote

Imagining a public forum only calmly accepting and agreeing with the dominant narrative is an absurd thing to expect. I don’t know of any other outcome than wackiness when dealing with the public. Work in public service for a week and you’ll see what dealing with the public is like.

7

downonthesecond t1_iu67wv5 wrote

>Facebook, TikTok, Twitter, and YouTube are failing to curb the spread of right-wing extremism and disinformation on their platforms and must immediately implement safeguards with the pivotal U.S. midterm elections less than two weeks away, a watchdog warned Thursday.

Is it their job to do any of that? Seems like some sites would allow it if it means more users.

Why do people only care when midterms are coming up?

5

whiplash81 t1_iu6f1j2 wrote

Safeguards needed to be put in place much sooner than "two weeks before the 2022 midterms."

This has been an issue for over a decade now.

7

FoolHooligan t1_iu69u0u wrote

They waited until Musk took over Twitter to tell us this.

4

ShadowPooper t1_iu6jjus wrote

It's not their job to police disinformation. Quit joining the two together.

4

strangefolk t1_iu7gtvo wrote

Even the terms 'dis' or 'mis' information are totally political, started by folks on the establishment left who control the language. When I see people talking about 'fact checking disinformation' all I see is people who want to control the narrative. Just look at how the CDC guidelines for COVID have changed. In some cases what's labeled 'misinformation' today is accepted truth in 6 months. Hell, I'm still banned from a sub for questioning the utility of masks and I'm just a fat guy nobody on reddit. That's how viciously and quickly the censorship trickled down.

You see the right taking on the same language game now, which is always the cue that a new vocabulary and definitions will be developed by the leftist academic establishment.

−4

amish_fortnite_gamer t1_iu7mjsn wrote

  • Misinformation (1580-90)
  • Disinformation (1965-70)

Misinformation vs Disinformation

2

strangefolk t1_iu7n6ht wrote

I've seen this before and I'm sure it's true. Cute, but neither term was in common usage. The real reason it's used is that it's a more polite way to call someone a liar and/or an idiot.

The goal, as defined by how we see it used (not what people say it's for, like in your link), is to shoot down opposing views as wrongthink, not simply to clarify a misunderstanding. That's what I mean when I say these terms are politicized.

−5

amish_fortnite_gamer t1_iu7so3v wrote

You clearly played hooky on the day that they covered Latin root words in your English class. You are free to reject the accepted definitions for words, but don't act surprised when people disunderstand you the same.

1

strangefolk t1_iu7t2qc wrote

I don't have anything more to add. Per my previous comment, your etymology is a manipulative shell game engineered to distract.

−3

amish_fortnite_gamer t1_iu7tqqf wrote

Disunderstand (dis-un-der-stand) (verb)

  1. Refusal to concede an idea. Unwillingness to acknowledge or attempt to understand a given concept, principle, act, or activity for fear that such understanding or acknowledgement is antithetical to one's own principles.

  2. To fail to comprehend or understand why something is the way it is, when it is obvious that the situation should be otherwise or the situation defies logic or common sense. Similar in meaning to misunderstand, however it implies that the speaker blames the source, often a person or group of people, for intentionally causing confusion or simply being too lazy to clarify the situation.

0

Mr_Dr_Prof_Patrick t1_iu7o8zb wrote

It makes sense that guidelines would change as covid changed

1

strangefolk t1_iu7ow0f wrote

Maybe, but it was never expressed that way.

Those items that were 'dangerous misinformation' one day just suddenly weren't censored anymore. There was never an adult conversation about what was censored, why, when, and why that perspective will no longer be physically removed from the conversation from this point onward. Thinking people notice when you don't let them have a full conversation and then suddenly change the rules. And I don't regard that as a coincidence or an honest mistake.

1

Mr_Dr_Prof_Patrick t1_iu7q6lj wrote

Never expressed that way where? By who? I thought there was pretty widely available + straightforward information about new variants, how they work, how the observed transmissibility was different, and on the other hand people harping on a guidance change without bothering to check the rationale.

I think you're conflating a lot of different people talking about a lot of different things. The guidelines themselves and how they change, third party companies and the terms of service they choose to set, all the other people with opinions to express. Where / how / with who were you expecting to see this adult conversation about what was censored?

1

strangefolk t1_iu7rguq wrote

>Where / how / with who were you expecting to see this adult conversation about what was censored?

Social media platforms have never been clear about what warrants a ban, including when CDC changed their recommendations. Aren't these conversations the job of the corporate media?

But the arbitrary and malicious enforcement really showed it to be a political cudgel. You have to be a real partisan hack to cry about hate speech, ban Jordan Peterson, but then keep ISIS and the Taliban online. I donno, in my perfect world none of it would be censored at anytime so there is no 'good' way to do it anyway.

0

REiiGN t1_iu67rhq wrote

LOL, they're there to sell ad spaces you fucking morons. IT'S TO MAKE MONEY.

3

LumpyDefinition4 t1_iu6p5x0 wrote

Instagram has to be one of the worst. Doesn’t even investigate anything. Rampant antisemitism on there all the time.

3

snap-erection t1_iu6xsae wrote

I genuinely feel like the whole internet and now media is in some Eternal September mode now. And it's because the human brain only tolerates a certain amount of negativity (of certain types, within context) per day, and is overwhelmed when it keeps happening. Now most people, and I mean like on the order of 90% or more, are decent and innocuous. Some are edgy and some are real pathetic scum who seemingly only live to make others angry. Now with the internet though, especially with how engagement works, it's not that hard to find 1000 people in a population of hundreds of millions who are complete drop dead fucking assholes whose every expression is an insult to the human experience. But if you had to personally deal with 1000 complete cunts in one day, or every other day, you would feel completely overwhelmed right? That's what I find modern living is so full of. Just too many people, and too many negative experiences just based on the fact that a small percentage of a huge number is still a pretty big number for an individual.

Anyway that's why I can relate to why fighting this type of dirt on social media platforms is such an exhausting, losing battle. It's really hard to do properly, and it doesn't actually make money. Not directly anyway. The best way to fight it is with good education that doesn't produce this many shitheads. That or make the internet hard to use again so that idiots aren't on there anymore.

3

kaybeesee t1_iu8j28x wrote

It’s on purpose.

3

rhodopensis t1_iu6iy7p wrote

“Fail on”

This suggests that it’s not the point to spread both.

2

bakedtaino t1_iu6pytg wrote

Yet we let them take over, use their services, and allow business to divert our traffic to them. Hmm.

2

Dfiggsmeister t1_iu9nwg0 wrote

It’s not a failure, it’s a feature. They deliberately pushed it through with their algorithms.

2

tjstarlit t1_iu6wjje wrote

I could not get the article to load.. anybody have a summary?

1

JC2535 t1_iu7nwt3 wrote

The algorithms that drive traffic are weighted to prefer more incendiary emotional responses. Anger and hate drives clicks and shares as people are determined to convert their friends to their own point of view. Nobody hate-clicks on a cute kitten.

1

oldastheriver t1_iu9vmw7 wrote

The hundreds of trillions of dollars that Wall Street is pouring into the Internet still hasn't found anywhere useful to go. When these companies start to act as though they had customers, then things will change. But right now all they do is shit on top of you. And that's because they have a monopoly. There are very obvious solutions and alternatives, but for some godforsaken reason people always prefer the dominant paradigm rather than the alternative that actually fits their needs. Once again, sooner or later economic reality has to come into play and you will see these things disappear also. Right now the whole world is looking to make a quick buck without having to do any work, and it will fail. And by the way, I just left Twitter.

1

CEO_of_paint t1_iu64uu8 wrote

Good. That's how speech works in the real world.

0

snap-erection t1_iu6wrnh wrote

I tell you right now, in the real world there's lines that people cross and people fucking drop these people when that happens. Just like on social media. "Real world" my ass, just don't be a bigot.

1

[deleted] t1_iu7h8wy wrote

[removed]

0

snap-erection t1_iu7krgx wrote

I thought that for some time, until it became clear that the bad faith actors out there are taking us all for a ride. Don't believe these conservatives that go on about freedom of speech, or any freedom really. They have been the ones to curtail people's freedoms and steal their money from day one. It's always a smokescreen. Just how the civil war in the US (that they are still butthurt about) was about "states rights" ... to have slaves. Hmm. Similarly now, serial con artists and disinformation peddling freaks and abusers cry about freedom of speech. Do any of these people give one fuck about actual whistleblowers and journalists out there? Of course not. These are not the issues that are being raised.

6

SirArthurPT t1_iu6y5ci wrote

The problem is: sites like those are OK with lies, hate and misinformation, as long as it's their own side's lies, hate and misinformation...

0

joshberry90 t1_iu6y5ot wrote

You don't win an ideological war by silencing the opposition. That's called authoritarianism.

0

AlexanderJablonowski t1_iu7yxqr wrote

Disinformation/misinformation = whatever the opposite political side says, according to the totalitarian leftists.

0

Atomicjuicer t1_iu8ikh0 wrote

The problem is partially due to the "like" or upvote button. There need to be two separate buttons.

One to show you agree with the sentiment and the other to show you enjoyed the statement.

0

CrypoFiend t1_iu9sf72 wrote

I have been banned from many subs in reddit. Here are a couple reasons.

1 - I jokingly said Greg Abbott's middle name was "Wheels". Banned for bullying.

2 - I told a crypto sub that I believed their coin was a scam and would fall victim to a "pump and dump", since VC investors got 60% of the coins at a 96% discount compared to the ICO.

3 - Banned from car sales sub, for adding the true cost of options on a vehicle with links to cost.

4 - Banned from a politics sub for saying cancel culture has gone too far and that we need less extremism.

5 - Banned for blocking a mod instead of responding to his gaslighting attempt. It was another car sales sub, and I told the poster he was asking a sub full of car salesmen if he should buy a car.

So until there are statistics for moderator actions, mod abuse will continue.

0

trading-abe t1_iu9vepe wrote

So what is the yardstick? Notes from Mao?

−1

mmarollo t1_iu9y8ms wrote

Was Biden claiming the vax stopped transmission disinformation/misinformation? Recall that Pfizer last week admitted before the EU that they didn't even test for that at all.

If Biden's statement (along with numerous other officials at the time) isn't misinformation then what is?

Why was I banned from several subreddits for linking to peer-reviewed studies that contradicted the prevailing narrative, but have since become fully accepted science?

Does truth / factuality play any role in these definitions of "misinformation"? Because it sure as hell feels like misinformation is anything that contradicts the prevailing progressive left orthodoxy.

−1

UltraMagat t1_iu7a5kf wrote

Is that their job? Didn't think so. If they're editorializing, then they're a publisher.

−4