Censorship Unveiled: Gonzalez v. Google and the Legal Dynamics of Media Control

To understand the recent Supreme Court case Gonzalez v. Google LLC, we need to trace back to its origins. On November 13, 2015, Nohemi “Mimi” Gonzalez, a senior design major at California State University, Long Beach, was killed while studying abroad in France. Her death came during a series of coordinated attacks carried out across Paris by the terrorist group ISIS (the Islamic State of Iraq and Syria), which claimed the lives of 130 people from 19 countries. For the past nine years, the Gonzalez family has been seeking justice by suing Google, alleging that its platforms gave ISIS and other terrorist groups a means to recruit members.

Gonzalez v. Google LLC stands as one of the most consequential legal cases in contemporary society. The case raises questions about the extent of platforms’ liability for content uploaded by third parties and the harm that content can cause. It prompts discussions about the regulations and legal frameworks that media companies must adhere to in order to maintain a safe online environment. Ultimately, it underscores the delicate balance between freedom of expression and the responsibility to prevent harm in the digital age.

The legal proceedings began in the United States District Court for the Northern District of California, where Judge Chen dismissed the three actions brought by the Gonzalez family. The case then advanced to the United States Court of Appeals for the Ninth Circuit, which upheld the dismissal. The majority opinion referenced Twitter, Inc. v. Taamneh, a case that had recently been brought before the Supreme Court. Taamneh arose in the aftermath of the ISIS terrorist attack at the Reina nightclub in Istanbul, which left 39 people dead. Both cases implicate two important federal statutes: Section 230 of the Communications Decency Act and Section 2333 of the Anti-Terrorism Act (ATA).

The plaintiffs relied on Section 2333 of the ATA, which, as amended in 2016 by the Justice Against Sponsors of Terrorism Act (JASTA), establishes the mechanism through which victims of international terrorism can seek civil remedies in United States courts. This provision empowers individuals affected by acts of terrorism to seek redress and to hold accountable both perpetrators and entities that have materially aided such acts. The plaintiffs in this case argued that Google, through its actions, actively “communicated ISIS messaging, radicalized recruits, and furthered their mission.” On that theory, Google should be held accountable under Section 2333 and, consequently, owe compensation to the Gonzalez family.

In response, the defendants built the majority of their argument on Section 230 of the Communications Decency Act. The relevant provision states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The Act thus offers immunity from liability for content posted by third-party users. The plaintiffs countered that the immunity afforded to Google under Section 230 does not apply in this case because Google “approved” advertisement videos from ISIS and shared the proceeds through YouTube’s revenue-sharing system.

This case holds significant importance because of its reliance on Section 230 of the Communications Decency Act, a provision enacted to accommodate the growing number of individuals accessing and using the internet. To serve those growing audiences, companies like Google rely on recommendation algorithms, and Section 230 has been read to shield technology companies even when those algorithms unintentionally promote harmful content, as the Gonzalez family alleged here. The future of social media platforms hinged on the Supreme Court’s ruling. Platforms like YouTube sort through countless videos to deliver the most relevant ones to users. Had the Court sided with the Gonzalez family, future plaintiffs could have held platforms liable for what their algorithms recommend rather than for the underlying content itself, effectively circumventing Section 230(c)(1). This could have had far-reaching implications, potentially requiring companies to overhaul their algorithms to avoid liability, which in turn would reshape the way information is sorted and accessed online.

The case was initially filed with the United States Supreme Court on April 6, 2022, and a judgment was issued on June 20th of the following year. The Supreme Court ruled that “The U.S. Court of Appeals for the 9th Circuit’s judgment — which held that plaintiffs’ complaint was barred by Section 230 of the Communications Decency Act — is vacated, and the case is remanded for reconsideration in light of the court’s decision in Twitter, Inc. v. Taamneh.” The Court’s choice to refrain from ruling on the Section 230 question is thought-provoking. Because the Court held in Twitter, Inc. v. Taamneh that materially similar allegations failed to state a claim under the Anti-Terrorism Act, there was no need for it to reach the separate question in Gonzalez v. Google LLC of how Section 230 should be interpreted.

Gonzalez v. Google LLC sparked a crucial dialogue about the extent of technology companies’ liability for the content shared on their platforms. As we grapple with the intricacies of online content moderation and legal responsibility, it becomes evident that the outcome of this case will shape the future of internet regulation and the safeguarding of users’ rights. It is imperative to acknowledge the urgent need to address the accessibility of terrorist propaganda videos, but curtailing algorithms’ ability to personalize individuals’ internet experiences risks unintended consequences, such as making the internet less user-friendly. This raises a fundamental question: Are we willing to sacrifice the familiar internet landscape of today for a potentially safer tomorrow?

Natalia Riley is a freshman at Brown University concentrating in International and Public Affairs and Economics. She is a staff writer for the BULR Blog and can be contacted at natalia_riley@brown.edu.

Julian Cohen is a sophomore at Brown University, double concentrating in History and International and Public Affairs. He is an editor for the Brown Undergraduate Law Review and can be contacted at julian_@brown.edu.