
Social Media Bans ‘Highlight the Profound Censorship on Web 2.0’

The crackdown on alleged hate speech is intensifying as social media platforms either expand their policies or step up enforcement of their terms of service.

Reddit banned more than 2,000 subreddits as part of a crackdown on what it deemed hate speech, including The_Donald as well as the subreddit for the leftist podcast Chapo Trap House. Twitch temporarily banned President Trump. Facebook booted a “boogaloo” group (part of a loose affiliation of anti-government extremists agitating for a second civil war), citing its promotion of violence. And YouTube banned a group of far-right content creators, including white nationalists such as David Duke.

The actions seem spurred by a variety of factors, including rising internal pressure from tech employees, the protests around the police killing of George Floyd, Twitter enforcing its terms of service against President Trump and growing advertiser boycotts. The moves ratchet up the volume on a longstanding debate and raise important questions about free speech in the modern internet era, including what constitutes hate speech, whether platforms are obligated to allow hateful content and, most of all, who should get to make decisions about the nature of content.


“I defend the companies’ power and right to make these business decisions, as I defend the right of individuals and organizations to ‘pressure’ them to do so,” said Nadine Strossen, a law professor at New York Law School and former president of the American Civil Liberties Union (ACLU), in an email.

But she is convinced that any speech restrictions going beyond what is consistent with the U.S. Constitution’s First Amendment and international human rights principles will be at best ineffective and at worst counterproductive.

A double-edged sword

The application of social media company standards may not mitigate the potential harms of the speech at issue, according to Strossen. The standards describing the targeted speech are overly vague and broad, she said, giving full discretion to those who enforce them. Handing individuals that power means they will enforce the rules in line with their personal views, and it may mean speech by minority voices is disproportionately censored.

This has been the case previously, as when platforms such as Instagram flagged body-positive imagery as “inappropriate.” Facebook reportedly trained its moderators to take down curses, slurs and calls for violence against “protected categories” such as white males, but to allow attacks on “subsets” such as black children or female drivers. Facebook’s formulaic approach to what qualified as a protected category is what allowed some vulnerable subsets to fall through the cracks.


See also: 93 Days Dark: 8chan Coder Explains How Blockchain Saved His Troll Forum

“Ironically, many of the very same civil rights/human rights groups that are now clamoring for more restrictions by the platforms have consistently complained that the existing ‘hate speech’ standards have disproportionately silenced Black Lives Matter activists, Pipeline protesters, and other social justice advocates,” said Strossen. “Why do they think this would change in the future?”

Amy James is co-founder of the Open Index Protocol (OIP), which works like a decentralized patent filing system: it protects the content published on it, organizes it and makes sure creators get paid. She said the bans were horrifying for a number of reasons.

“Even if you disagree with information, censoring it doesn’t destroy it, it just allows it to spread without counterpoints,” said James in an email. “But on the positive side, it highlights the profound censorship … on Web 2.0, and the more widespread [the] awareness about it, the better.”

James added she absolutely sees more bans in the future, largely because the internet isn’t a real-life public place where First Amendment protections apply.

“On the web, we primarily communicate using platforms that belong to private companies, so they can and should have a right to filter content however they want – based on financial criteria, community standards, etc.,” said James.

See also: In Trump Versus Twitter, Decentralized Tech May Win

That’s a key part of this debate. By signing up for these platforms, you give them the right to moderate and regulate your speech largely as they see fit, with little to no recourse. It’s ironic that the people most adamant about keeping government out of private businesses lose sight of that when it comes to social media.

Look no further than Trump, who has aggressively dismantled business regulations but signed an executive order calling for reform of Section 230 of the Communications Decency Act, which shields social media companies from liability for content their users post.

Is there a way forward?

Rather than pursuing legislative fixes to Section 230, James said solutions offered by blockchain and the decentralized Web 3.0 provide a better path. In practice, that looks like supporting cryptocurrencies such as bitcoin and open-source web browsers like Brave. She also points to platforms building with OIP as good options for encouraging Web 3.0, which would not allow for centralized censorship: Streambed Media, a tamper-proof media index, and Al Bawaba, the largest independent news platform in the Middle East and North Africa, which is building integrations with OIP.

There are “censorship-free” platforms available now, like Gab and 4chan, but the trade-off is that some audiences may not go to them because of their content. “One person taking a stand alone has almost no effect,” she said.

Gab and 8chan (4chan’s rowdier offspring) also face constant threats to their ability to function: domain registrars such as GoDaddy and payment processors such as PayPal and Stripe have previously booted Gab off their services. Such measures go beyond a simple platform ban and fundamentally affect these websites’ ability to keep operating.

These platforms are built on a commitment that they won’t censor you, even though, given their centralized nature, they absolutely still could.

See also: Handshake Goes Live With an Uncensorable Internet Browser

Strossen envisions a market with a number of viable alternatives offering diverse content moderation standards to choose from. Ideally, this would maximally empower end users to make their own informed choices. She points to Parler, which brands itself as a free speech platform and is one recent example of where conservatives have flocked, but even its content moderation standards are “as hopelessly vague and over broad as all the other platforms,” she said.

Now, as Parler’s user base has crossed one million, CEO John Matze is also grappling with the limits of speech.

“As soon as the press started picking up, we had a ton of violations,” Matze told Fortune. “We had a queue of over 7,000 violations, and we only had three people” to police the entire site.

The Santa Clara Principles are another framework for moderation decisions. They were spearheaded by the ACLU, the Electronic Frontier Foundation and others, and lay out minimum requirements for companies disclosing information about moderation. These include publishing the number of posts removed and accounts permanently or temporarily suspended, giving notice to each affected user about the reason their content was taken down or their account was suspended, and offering a meaningful opportunity for a timely appeal of any content removal or account suspension.

Strossen said no one is going to be completely satisfied with any standards, no matter how they’re phrased or enforced, because of the subjectivity of the issues at hand.

“One person’s ‘hate speech’ is someone else’s cherished speech, one person’s ‘fake news’ is someone else’s treasured truth and one person’s ‘extremist’ speech is someone else’s freedom-fighting speech,” said Strossen.
