💬 Discussion

How Should Online Content Moderation Be Handled?

Wednesday, Feb 9, 2022


There’s been a lot of discussion about censorship and free expression on the internet in the wake of the recent Joe Rogan situation. We’ve previously covered the laws governing how internet companies treat user-generated content, but decided to revisit the topic in light of recent events.

📜 The basics: The First Amendment protects individuals from government censorship. Social media platforms are private companies, so they can censor what people post on their websites as they see fit.

  • But given their influential role in public discourse, many people – including some social media executives – are calling on Congress to update the rules governing online content moderation.

🇺🇸 How does it work right now?… In the US, social media companies and other online platforms are governed by a law known as Section 230 of the Communications Decency Act, which has two key subsections regarding user-generated posts:

  1. Section 230(c)(1) gives online platforms immunity from legal liability for content posted on their sites by third parties.
  2. Section 230(c)(2) allows online platforms to police their sites for harmful content, but doesn’t require that they remove anything, and protects them from liability if they choose not to.

That second clause didn’t always exist – Congress added it in response to a 1995 court ruling which held that a platform that policed any user-generated content could be held legally liable for all of the content posted to its site.

  • Lawmakers believed the judge’s ruling would make platforms unwilling to police their sites for harmful content, so they added Section 230(c)(2) to encourage moderation.
  • In 2018, Congress amended the law to remove platforms’ legal immunity from third-party content when it comes to sex-trafficking.

🏛️ What are some alternatives?… In recent years, Congress and the executive branch have proposed dozens of different bills that would alter the scope of Section 230.

  • Some lawmakers are in favor of limiting platforms’ immunity from user-generated posts to encourage them to take down more undesirable content.
  • Others are in favor of limiting platforms’ immunity for certain types of moderation decisions to encourage them to host more content.

From the business world: In testimony before Congress last year, Facebook CEO Mark Zuckerberg argued in favor of amending Section 230 to require platforms to have “adequate systems” in place to remove illegal content.

  • Other social media CEOs said Congress should keep the law as is to avoid unintended consequences that could harm free expression.