Supreme Court to Review Law Shielding Social Media Companies From Liability for User Posts

The Supreme Court agreed Monday to hear a case that will decide whether social media companies can be held legally responsible for what users post to their sites.

The case involves terrorist propaganda posted on Twitter, YouTube and Facebook, but it is likely to have broader implications for the legal immunity that social media and other internet platforms enjoy under Section 230 of the Communications Decency Act. The nine justices will hear arguments during the court's current term, which began Monday, and will issue a ruling by July.

It’s the first time the Supreme Court has taken on a case involving what The Wall Street Journal called “the foundational internet law.” Written in 1996, the law was intended to protect companies from liability for material supplied by others online, and it helped fuel the rise of Twitter, Facebook and other social media, according to The New York Times.

The case involves suits filed by families of terrorism victims, who maintain that the social media platforms bear some responsibility for attacks conducted by the Islamic State, based on content posted on those sites.

One concerns a woman killed in an ISIS attack in Paris in 2015. The family of Nohemi Gonzalez alleges that Google subsidiary YouTube helped ISIS by recommending the terror group’s videos to users. At issue before the court is whether recommendation algorithms are protected by the shield that Section 230 provides.

A second suit was brought against Twitter, Google and Facebook parent Meta Platforms by family members of Nawras Alassaf, who died in an ISIS attack on an Istanbul nightclub in 2017. This case turns on whether hosting content that expresses support for a terror group amounts to aiding and abetting an act of terrorism, CNN reported.

Lawyers representing the social media companies have said in court filings that they make extensive efforts to remove ISIS content, and that there’s no direct link between the postings in question and the attacks that killed Gonzalez and Alassaf.

“Section 230 bars claims that treat websites as publishers of third-party content,” lawyers for Google wrote in a brief, the Times reported. “Publishers’ central function is curating and displaying content of interest to users. Petitioners’ contrary reading contravenes Section 230’s text, lacks a limiting principle and risks gutting this important statute.”

Section 230 has become controversial in recent years, but efforts to amend it to put more responsibility on platforms for the content users share have gone nowhere at the federal level. In general, Democrats argue that social media companies do not police user posts aggressively enough, while Republicans contend they censor conservative voices.

These cases could narrow the scope of a law the tech industry says is essential to keeping the platforms free of spam, hate speech and other objectionable content, according to CNN.

Last month, in a suit brought by trade groups representing the internet companies, a federal court in Texas upheld a law that prohibits social media platforms from blocking or removing posts based on the viewpoint of the content.

In a separate case before the Supreme Court, The Onion submitted a brief in support of an Ohio man who was prosecuted for spoofing his local police force on Facebook, The Associated Press reported.

“As the globe’s premier parodists, The Onion’s writers also have a self-serving interest in preventing political authorities from imprisoning humorists,” lawyers for The Onion wrote in the brief, which was filed Monday. “This brief is submitted in the interest of at least mitigating their future punishment.”
