The Supreme Court could change the liability game for internet firms. Here's how.

A 1996 law credited — and criticized — for legally immunizing interactive websites like YouTube (GOOG, GOOGL), Facebook and Instagram (META), and Twitter, whether they moderate third-party posts or leave them alone, is about to face a challenge before the U.S. Supreme Court.

The bottom line: depending on the outcome, the case could overhaul the legal risks websites take on when they recommend social content online.

On Tuesday, the high court is scheduled to hear arguments in Gonzalez v. Google, which questions conflicting appellate court interpretations of Section 230 of the 1996 Communications Decency Act. The law, often cited as containing the “26 words that created the internet,” supported rapid proliferation of interactive websites by shielding them from legal responsibility for harms third party content may cause.

In Gonzalez, family members and the estate of Nohemi Gonzalez, a 23-year-old U.S. citizen killed in a December 2015 ISIS shooting at Paris' La Belle Equipe bistro, argue that Google should be held at least partially liable for her death. That's because, they allege, the company's YouTube service knowingly permitted and recommended, via algorithms, inflammatory ISIS-created videos that allegedly played a key role in recruiting the attackers.

The U.S. District Court for the Northern District of California dismissed the lawsuit at Google's request, concluding that Section 230 barred the claims because ISIS, not Google, created the videos. Meanwhile, judges in separate jurisdictions, faced with similar claims, applied varying interpretations to Section 230's liability shield.

The Supreme Court of the United States building in Washington D.C. (Photo by Celal Gunes/Anadolu Agency via Getty Images)

In a case alleging that Facebook's algorithmic recommendations led to killings by the Islamist militant group Hamas, the U.S. Court of Appeals for the Second Circuit similarly held in Force v. Facebook that Section 230 protected the social media company from liability. The court reasoned that the recommendations had the same effect as displaying the underlying third-party posts themselves.

However, in a partial dissent, Chief Judge Robert Katzmann argued that content recommendations convey a message from the platform itself, beyond the third-party content, and therefore fall outside Section 230's protection. Extending Section 230 to recommendations, he said, immunizes social media companies for the "unsolicited, algorithmic spreading of terrorism."

In still another case, Dyroff v. Ultimate Software Group, the Ninth Circuit held that the defendant's email recommending another party's content amounted to acting as a publisher. The majority held that Section 230 still protected the defendant from liability because its recommendation method — an automated algorithm — treated harmful third-party content the same as any other third-party content.

In Gonzalez, the high court will wrestle with the question of whether Section 230’s immunity stretches to fully immunize sites that make targeted recommendations, like the ones in Force and Dyroff, or if instead Section 230 provides only limited immunity for such recommendations.

The U.S. Court of Appeals for the Ninth Circuit ultimately affirmed the district court's ruling in Gonzalez, holding that Section 230 immunity barred the plaintiffs' claims.

Targeted recommendations from social media and other websites can include automated algorithms that select and send specific material to a particular user's account feed, email, or text. Information that a site like Facebook, YouTube, or Twitter knows about a user, such as the health issues or social groups they research, for example, can serve as a basis to recommend content associated with those searches.

Google’s ISIS video recommendations, the Gonzalez plaintiffs argue, played a uniquely essential role in the development of ISIS’ image and in the terrorist organization’s ability to grow and spread its message.

“[W]hether Section 230 applies to these algorithm-generated recommendations is of enormous practical importance,” the plaintiffs wrote in their petition to the court. "Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the U.S. who uses social media.”

Google logo is seen through broken glass in this illustration taken January 25, 2023. (REUTERS/Dado Ruvic/Illustration)

For its part, Google argues in its opposition brief that the Ninth and Second Circuit majorities got it right in ruling that Section 230 bars the company from liability for third-party content, including content recommended to its users.

Signed into law in 1996, Section 230 was created to enable online platforms to make “good faith” efforts to moderate user-generated content deemed “objectionable” without facing legal liability over that content. The law states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

In 2018, President Donald Trump signed a law weakening some of Section 230’s protections to allow victims to sue websites that knowingly facilitate sex trafficking. However, a Supreme Court ruling further opening the door to liability for recommended third-party content could jeopardize significant company revenue streams.

Once online companies transitioned from subscription-based revenue streams to advertising-based revenue streams, the plaintiffs argue, their motivation to use automated algorithms skyrocketed. The algorithms, they say, tend to increase the time that users spend at their websites.

“The public has only recently begun to understand the enormous prevalence and increasing sophistication of these algorithm-based recommendation practices,” the plaintiffs wrote in court documents.

Alexis Keenan is a legal reporter for Yahoo Finance. Follow Alexis on Twitter @alexiskweed.
