override367 t1_j9pxu61 wrote

They... literally can't, unless a complaint is filed. Holy shit, this is the core of the case law around Section 230.

They can't knowingly put out material that is illegal or would get them in trouble, but they bear no liability for what they don't know about, until such time as they are made aware of it.

The reason 230 was created was that this standard only applied to websites that exercised no moderation. I.e., if the algorithm were literally a random number generator and you had an equal chance of it recommending you a cooking video or actual child pornography, YouTube would be 100% in the clear without 230, as long as it removed the latter after being notified. 230 was necessary because Prodigy, like YouTube, had moderation and content filtering, and under the case law any moderation at all meant the service was tacitly endorsing whatever remained on it, and was therefore liable for it.

This is the entire reason the liability shield was created. Section 230 means websites bear no liability in essentially any circumstance short of willful negligence, as long as they didn't upload the content themselves. SCOTUS is only considering this case because they aren't judges, they're mad wizards, and this is Calvinball, not law.


override367 t1_j9oplum wrote

Your argument is asinine. If you buy something from Target and get automatically enrolled in their mailing list, that isn't a good reason to go to the Supreme Court and demand retail stores be banned from existing. It's fucking insane they're even hearing this case.

In the case of YouTube, autoplay is a feature that comes with the service. If you don't like it, just don't use YouTube.


override367 t1_j9mccsz wrote

What the hell are you talking about? Google removes terrorist content as soon as it is reported. The case before us is more like a book in the back of the store (one that isn't even illegal) containing pictures of US soldiers tortured by the Vietcong, a book that violates the bookstore's internal code of conduct, and that offended a customer who sued even though they had a button to permanently delete that book and others like it from their own personally curated section of the bookstore.

I also want to point out that a good deal of terrorist content is legal and covered under the First Amendment. Not bomb-making instructions or the like, but their ideology can absolutely be spoken aloud in America. Google gets plenty of pressure from its advertisers to remove such content anyway.

Now, right-wing hate speech, not so much: the algorithm encourages it because it favors engagement, and highly emotional rage bait drives engagement. None of this has anything to do with Section 230, however, and yet here we are.


override367 t1_j9mc1ut wrote

Are you just going to ignore everything else I typed?

There is no way to present content that doesn't favor some weighted position, and with 3.7 million videos uploaded a day the service can't exist if you're just blindly listing them alphabetically.

That would be, again, like a bookstore being forced to randomly put books out front in the order they were received, with no ability to sort them by section.


override367 t1_j9kl948 wrote

I mean, they do under 230. They absolutely fucking do, until SCOTUS decides they can't.

Even pre-230, the algorithm wouldn't be the problem. After all, bookstores were not liable for the content of every book they sold, even though they clearly had to decide which books to face front.

The algorithm surfacing a video that should be removed is no different from a bookstore putting a book ultimately found to be libelous on a front-facing endcap. The bookstore isn't expected to have actually read the book and vetted its content; it merely has a responsibility to remove the book once the complaint is made known.


override367 t1_j6k6bq1 wrote

I wouldn't write off socialist programs like UBI yet; once the boomers die off, the political landscape will be vastly different.