Date: 05-18-2023

Case Style:

Reynaldo Gonzalez, et al. v. Google, LLC

Case Number: 21-1333

Judge: Per Curiam

Court: United States Supreme Court on cert from the Ninth Circuit Court of Appeals (San Francisco County)

Plaintiff's Attorney: Not Available

Defendant's Attorney: Not Available

Description: In 2015, ISIS terrorists unleashed a set of coordinated attacks across Paris, France, killing 130 victims, including Nohemi Gonzalez, a 23-year-old U. S. citizen. Gonzalez’s parents and brothers then sued Google, LLC, under 18 U. S. C. §§2333(a) and (d)(2), alleging that Google was both directly and secondarily liable for the terrorist attack that killed Gonzalez. For their secondary-liability claims, plaintiffs alleged that Google aided and abetted and conspired with ISIS. All of their claims broadly center on the use of YouTube, which Google owns and operates, by ISIS and ISIS supporters.

The District Court dismissed plaintiffs’ complaint for failure to state a claim, though it offered plaintiffs leave to amend their complaint. Instead, plaintiffs stood on their complaint and appealed, and the Ninth Circuit affirmed in a consolidated opinion that also addressed Twitter, Inc. v. Taamneh, ___ U. S. ___ (2023). 2 F. 4th 871 (2021). With respect to this case, the Ninth Circuit held that most of the plaintiffs’ claims were barred by §230 of the Communications Decency Act of 1996, 110 Stat. 137, 47 U. S. C. §230(c)(1). The sole exceptions were plaintiffs’ direct- and secondary-liability claims based on allegations that Google approved ISIS videos for advertisements and then shared proceeds with ISIS through YouTube’s revenue-sharing system. The Ninth Circuit held that these potential claims were not barred by §230, but that plaintiffs’ allegations failed to state a viable claim in any event.

We granted certiorari to review the Ninth Circuit’s application of §230. See 598 U. S. ___ (2022). Plaintiffs did not seek review of the Ninth Circuit’s holdings regarding their revenue-sharing claims. In light of those unchallenged holdings and our disposition of Twitter, on which we also granted certiorari and in which we today reverse the Ninth Circuit’s judgment, it has become clear that plaintiffs’ complaint—independent of §230—states little if any claim for relief. As plaintiffs concede, the allegations underlying their secondary-liability claims are materially identical to those at issue in Twitter. See Tr. of Oral Arg. 58. Since we hold that the complaint in that case fails to state a claim for aiding and abetting under §2333(d)(2), it appears to follow that the complaint here likewise fails to state such a claim.

And, in discussing plaintiffs’ revenue-sharing claims, the Ninth Circuit held that plaintiffs plausibly alleged neither that “Google reached an agreement with ISIS,” as required for conspiracy liability, nor that Google’s acts were “intended to intimidate or coerce a civilian population, or to influence or affect a government,” as required for a direct-liability claim under §2333(a). 2 F. 4th, at 901, 907. Perhaps for that reason, at oral argument, plaintiffs only suggested that they should receive leave to amend their complaint if we were to reverse and remand in Twitter. Tr. of Oral Arg. 58, 163.

We need not resolve either the viability of plaintiffs’ claims as a whole or whether plaintiffs should receive further leave to amend. Rather, we think it sufficient to acknowledge that much (if not all) of plaintiffs’ complaint seems to fail under either our decision in Twitter or the Ninth Circuit’s unchallenged holdings below. We therefore decline to address the application of §230 to a complaint that appears to state little, if any, plausible claim for relief.

Instead, we vacate the judgment below and remand the case for the Ninth Circuit to consider plaintiffs’ complaint in light of our decision in Twitter.

* * *

47 U.S. Code § 230 - Protection for private blocking and screening of offensive material

(a) Findings
The Congress finds the following:
(1) The rapidly developing array of Internet and other interactive computer services available to individual Americans represent an extraordinary advance in the availability of educational and informational resources to our citizens.
(2) These services offer users a great degree of control over the information that they receive, as well as the potential for even greater control in the future as technology develops.
(3) The Internet and other interactive computer services offer a forum for a true diversity of political discourse, unique opportunities for cultural development, and myriad avenues for intellectual activity.
(4) The Internet and other interactive computer services have flourished, to the benefit of all Americans, with a minimum of government regulation.
(5) Increasingly Americans are relying on interactive media for a variety of political, educational, cultural, and entertainment services.
(b) Policy
It is the policy of the United States—
(1) to promote the continued development of the Internet and other interactive computer services and other interactive media;
(2) to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
(3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material; and
(5) to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.
(c) Protection for “Good Samaritan” blocking and screening of offensive material
(1) Treatment of publisher or speaker

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
(2) Civil liability
No provider or user of an interactive computer service shall be held liable on account of—
(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).
(d) Obligations of interactive computer service

A provider of interactive computer service shall, at the time of entering an agreement with a customer for the provision of interactive computer service and in a manner deemed appropriate by the provider, notify such customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors. Such notice shall identify, or provide the customer with access to information identifying, current providers of such protections.
(e) Effect on other laws
(1) No effect on criminal law

Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
(2) No effect on intellectual property law

Nothing in this section shall be construed to limit or expand any law pertaining to intellectual property.
(3) State law

Nothing in this section shall be construed to prevent any State from enforcing any State law that is consistent with this section. No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.
(4) No effect on communications privacy law

Nothing in this section shall be construed to limit the application of the Electronic Communications Privacy Act of 1986 or any of the amendments made by such Act, or any similar State law.
(5) No effect on sex trafficking law
Nothing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit—
(A) any claim in a civil action brought under section 1595 of title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title;
(B) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 1591 of title 18; or
(C) any charge in a criminal prosecution brought under State law if the conduct underlying the charge would constitute a violation of section 2421A of title 18, and promotion or facilitation of prostitution is illegal in the jurisdiction where the defendant’s promotion or facilitation of prostitution was targeted.
(f) Definitions
As used in this section:
(1) Internet

The term “Internet” means the international computer network of both Federal and non-Federal interoperable packet switched data networks.
(2) Interactive computer service

The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
(3) Information content provider

The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
(4) Access software provider
The term “access software provider” means a provider of software (including client or server software), or enabling tools that do any one or more of the following:
(A) filter, screen, allow, or disallow content;
(B) pick, choose, analyze, or digest content; or
(C) transmit, receive, display, forward, cache, search, subset, organize, reorganize, or translate content.

Outcome: It is so ordered.

Plaintiff's Experts:

Defendant's Experts:

Comments:


