In the wake of violent protests across the U.S., another kind of confrontation is taking place on the Internet—between President Donald Trump and social media platform Twitter.
The standoff began when Twitter placed a fact-check link on one of the President's tweets, which claimed that mail-in ballots were "fraudulent." Twitter also placed a warning label on another of Trump's tweets for potentially inciting violence.
In response, Trump began calling for the revocation of Section 230 of the Communications Decency Act. This is especially significant, as under Section 230, platforms are not held liable for user-generated content. In other words, without it, social media companies could be sued over content that their users post.
So, should social media content be regulated? Let’s take a closer look.
Pros of Regulating Social Media Content
People who propose regulating social media content say platforms should be held accountable.
From the early days of social media, these platforms have used algorithms that recommend additional content based on users' browsing history and locations. For example, if you have recently read an article about a bear attack, the platform might recommend similar articles about attacks by other animals. As a result, proponents consider social media platforms responsible for the content that their users see.
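The recommendation idea described above can be illustrated with a toy sketch: rank candidate articles by how many keywords they share with what the user just read. Real platforms use far richer signals (full browsing history, location, engagement data) and far more sophisticated models; the article titles, keyword sets, and scoring here are purely hypothetical.

```python
def jaccard(a: set, b: set) -> float:
    """Keyword overlap between two articles (0 = none, 1 = identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def recommend(read_keywords: set, candidates: dict, top_n: int = 2) -> list:
    """Return candidate titles ranked by similarity to the article just read."""
    scored = [(jaccard(read_keywords, kws), title)
              for title, kws in candidates.items()]
    scored.sort(reverse=True)  # highest-overlap articles first
    return [title for score, title in scored[:top_n] if score > 0]

# The user just read an article about a bear attack...
read = {"bear", "attack", "hiker", "wildlife"}
candidates = {
    "Shark attack off the coast": {"shark", "attack", "swimmer", "wildlife"},
    "Stock markets rally": {"stocks", "markets", "economy"},
    "Moose charges tourist": {"moose", "attack", "tourist", "wildlife"},
}
print(recommend(read, candidates))
# ...so the two animal-attack stories outrank the unrelated finance story.
```

The same feedback loop that surfaces more wildlife stories can, critics note, just as easily surface more hate speech or misinformation once a user has engaged with it—which is why proponents hold platforms responsible for what these systems amplify.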
In particular, proponents are wary of hate speech and false information that could be spread through such algorithms on social media platforms. To combat these, they suggest measures like making fake news illegal (which has been done in Singapore) or requiring social media platforms to follow government restrictions on illegal content, such as Germany's Network Enforcement Act.
Cons of Regulating Social Media Content
People who oppose the regulation of social media content believe that over-regulating could threaten free speech on these platforms. How do social media platforms decide which kinds of content to censor or remove without infringing on free expression?
According to opponents, social media services are platforms where producers and consumers can exchange and share information. Regulating them would make them more similar to publishers, who publish select content marketed to a specific audience. This would subject social media platforms to the same legal limits (e.g., defamation law) applied to publishers—in turn limiting free speech and expression.
Are you a pro or a con? Here are some questions to ponder.
- Should social media content regarding fake news and hate speech be restricted?
- Should social media content be fully moderated, left unmoderated, or should a balance in between be maintained?
- Who gets to decide what is to be restricted—the government or social media platforms?
- What about establishing an independent oversight body that includes the voices of civil society groups, social media companies, and users themselves?
Sources: Washington Post, NYTimes, Wired, brookings.edu, cato.org, WSJ