Comments


Attreidies t1_j9ev86d wrote

Time for private networks to appear

10

Commercial-Honey-227 t1_j9evexi wrote

You can ignore it - it's clickbait. First, there is a sister case, Twitter v. Taamneh, which is going to hold that there is no right of action over generalized posting policies (like all social media have) for discrete acts that follow third-party posts. Twitter, in this instance, would have to have been in cahoots with the terrorists to be held liable. Having ruled on that, which is going to be a 9-0 decision, the Court will not have to answer whether Section 230 of the Communications Decency Act allows a cause of action over the algorithms used by social media sites when those algorithms potentially lead to harm.

A case will come, with better facts, where the court will have to decide the breadth of 230, but these cases aren't it.

44

redditbebigmad t1_j9ewgit wrote

Good luck arguing you're a platform when you act like a publisher.

6

sielingfan t1_j9f3bb3 wrote

I'm still recovering from the last time the internet was upended after net neutrality, and it killed me twice.

8

Huarrnarg t1_j9fg2u5 wrote

post a DD about why we listen to someone who replies in /r/Hairymanass/

3

gsasquatch t1_j9g0ffz wrote

ISIS, see you in court!

They don't need bombs to destroy 'merica, they need lawyers.

1

Commercial-Honey-227 t1_j9ga4i4 wrote

Arguments are done. Minor correction to my original post: Taamneh only concerns terrorism, as does Gonzalez, so if Taamneh goes, Gonzalez goes without the Court even considering Section 230.

Near the end of the two-hour argument, the Court did dig deeper into liability for acts committed by those who, it would be argued, were influenced by what they saw on Twitter, YouTube, Facebook, etc. The gist of the argument is that if the algorithm used to surface objectionable content (terrorism, pro-anorexia, sex trafficking) was content-neutral, the internet sites could not be held liable. There would have to be something more than just publishing or recommending a site with objectionable views through a content-neutral algorithm. There were hypotheticals on the edges (e.g., if the site was reported and YouTube kept recommending videos to people it knew had terroristic attitudes), but by and large, today was a huge win for the larger tech companies in the US. The EU already has laws in place that force sites to filter content, but unless Congress acts, that won't be coming to the U.S. anytime soon, and the odds of Congress acting are slim.

5

Commercial-Honey-227 t1_j9h7dcm wrote

Well, under the Anti-Terrorism Act, Twitter would have had to be in cahoots with the terrorists, with the goal of committing terror, for liability for that terror to attach to Twitter. That is the case being heard tomorrow, Twitter v. Taamneh. Oral arguments begin at 10 am; just go to the Supreme Court site and you can listen in along with me and other Court geeks.

2