A friendly reminder of free speech and how online "platforms" are not in compliance with current law.

"Section 230(c) was designed to address early court decisions holding that, if an online platform restricted access to some content posted by others, it would thereby become a “publisher” of all the content posted on its site for purposes of torts such as defamation. As the title of section 230(c) makes clear, the provision provides limited liability “protection” to a provider of an interactive computer service (such as an online platform) that engages in “‘Good Samaritan’ blocking” of harmful content. In particular, the Congress sought to provide protections for online platforms that attempted to protect minors from harmful content and intended to ensure that such providers would not be discouraged from taking down harmful material. The provision was also intended to further the express vision of the Congress that the internet is a “forum for a true diversity of political discourse.” 47 U.S.C. 230(a)(3). The limited protections provided by the statute should be construed with these purposes in mind.

In particular, subparagraph (c)(2) expressly addresses protections from “civil liability” and specifies that an interactive computer service provider may not be made liable “on account of” its decision in “good faith” to restrict access to content that it considers to be “obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.” It is the policy of the United States to ensure that, to the maximum extent permissible under the law, this provision is not distorted to provide liability protection for online platforms that—far from acting in “good faith” to remove objectionable content—instead engage in deceptive or pretextual actions (often contrary to their stated terms of service) to stifle viewpoints with which they disagree. Section 230 was not intended to allow a handful of companies to grow into titans controlling vital avenues for our national discourse under the guise of promoting open forums for debate, and then to provide those behemoths blanket immunity when they use their power to censor content and silence viewpoints that they dislike. When an interactive computer service provider removes or restricts access to content and its actions do not meet the criteria of subparagraph (c)(2)(A), it is engaged in editorial conduct. It is the policy of the United States that such a provider should properly lose the limited liability shield of subparagraph (c)(2)(A) and be exposed to liability like any traditional editor and publisher that is not an online provider."

https://www.federalregister.gov/documents/2020/06/02/2020-12030/preventing-online-censorship

Are you still confused?

A platform is like a bulletin board or a telecommunications company: it is simply a vehicle for you to communicate, and it cannot editorialize. Censorship beyond removing illegal material such as child pornography voids your status as a platform, and you become liable for everything your users post.

A publisher, on the other hand, is like a newspaper. It can editorialize all it wants, but it is fully responsible for everything it prints. Something like YouTube would become unviable overnight if it were responsible for everything its users posted.
