In 2015, twelve ISIS shooters killed 130 people in coordinated attacks around Paris. In Istanbul, on New Year's 2017, an ISIS-trained gunman killed 39 people and wounded 69 in a crowded nightclub.
Relatives of Nohemi Gonzalez and Nawras Alassaf, who were killed in the Paris and Istanbul attacks, respectively, sued Google and Twitter under the Anti-Terrorism Act and the Justice Against Sponsors of Terrorism Act (Gonzalez v. Google and Taamneh v. Twitter). They asserted the social media companies provided material support to ISIS, a designated foreign terrorist organization, by granting it access to the communications infrastructure that ISIS used to publish terroristic content, enlist new recruits, and plan and execute attacks. Additionally, plaintiffs emphasized that the defendants shared revenue with ISIS through the monetization of its content and targeted ads.
Google and Twitter responded that the claims were barred by 47 U.S.C. § 230(c)(1), a provision of the Communications Decency Act, which states:
"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
Plaintiffs countered that Google-owned YouTube’s algorithmic content recommendations and targeted ads changed it from an “interactive computer service” to an “information content provider” (ICP), thereby removing its Section 230 immunity. Section 230(f)(3) defines an ICP as any “person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet.”
A three-judge panel of the Ninth Circuit affirmed the district court's dismissal of plaintiffs' claims. But that panel said it was bound by a recent decision of another Ninth Circuit panel issued while Gonzalez was pending. Otherwise, the Gonzalez panel said, it would have held that Section 230 did not protect a provider's content recommendations.
In October 2022, the Supreme Court granted certiorari in both Gonzalez and Taamneh. Briefs have been filed by the parties.
What’s this got to do with police officers?
The National Police Association (NPA) and the National Fallen Officers Foundation (NFOF) jointly filed an "amicus curiae" (friend of the court) brief in Gonzalez urging the Supreme Court to adopt the plaintiffs' view that Section 230 does not protect YouTube's recommendations of the content of others on the platform. Such recommendations, the brief asserted, led persons showing an interest in ISIS to ISIS's own propaganda on YouTube. That brought terrorists together, radicalized viewers and exposed them to encouragement to carry out terrorist attacks.
The brief went on to argue that police are also suffering from social-media-fueled hostility and attacks, and that a court decision against Section 230 protection for such recommendations would help decrease anti-police attitudes and violence.
NFOF's president and CEO, Sgt. Demetrick "Tre" Pennie, has said:
"This is an important landmark case that will change the landscape of public safety for future generations. Facebook, Google and Twitter have enjoyed broad liability protection under [Section] 230, while fueling societal instability and leaving police officers and citizens vulnerable to attacks facilitated by online radicalization. It's simply time to bring the outdated 1996 legislation in line with the 21st century rule of law!"
What will the Supreme Court do?
Section 230 of the Communications Decency Act was passed in 1996 – over 25 years ago. It began as an attempt to prevent minors from accessing pornographic materials on the Internet.
In 2020, Supreme Court Justice Thomas wrote about Section 230, "[I]n an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms." Gonzalez and Taamneh may be the "appropriate" cases Justice Thomas referenced.
The 167-page opinion of the Ninth Circuit contains a partial concurrence and dissent by Judge Gould that provides the Supreme Court an excellent analysis of why Section 230 was never intended to cover the social media platforms' algorithms that exist over a quarter century later.
Lawmakers have also proposed reforms to Section 230 to create accountability for computer services that help proliferate terrorist content online. So, it’s possible the Supreme Court will punt Section 230 back to Congress to fix – or not.
Whichever branch of government addresses – or fails to address – this "landmark" issue of accountability for terroristic calls to violence on the Internet, the implications for law enforcement are interwoven with the implications for terrorism in our country at large and abroad.
Oral argument will be heard on February 21, 2023. A decision from the Supreme Court should be forthcoming in June 2023. Stand by; Police1 will bring you an analysis of both.