Legal Alert

Trafficking and Child Exploitation Online: The Growing Responsibilities of Entities Operating Online Platforms

by Jill Steinberg and Kelly M. McGlynn
March 7, 2022

Summary

Section 230 immunity, which long has protected entities that host online platforms from liability for their users’ actions, may be significantly cut back. Although the U.S. Supreme Court today declined to hear Doe v. Facebook, which would have given it an opportunity to clarify or narrow existing interpretations of Section 230, members of Congress have called for amendments to the law, and executive agencies have expressed support for such change. Section 230 may be amended to create a duty of reasonable care, particularly with respect to online trafficking and child exploitation. Even in the absence of legislative change, lower courts have begun, and may continue, to chip away at what previously was considered Section 230’s broad immunity.

The Upshot

The U.S. Supreme Court denied certiorari today in Doe v. Facebook, effectively passing on its opportunity to address the scope of Section 230, the provision of the Communications Decency Act that shields web platforms from liability for content, including potentially illegal content, posted by third parties. But regardless of the U.S. Supreme Court’s decision, change is likely coming. Congress aggressively is pursuing amendments to Section 230, and lower courts have begun to question the blanket protections previously given to online platforms.

Criticism of Section 230 has escalated in recent years, and legislators, legal experts, and even industry leaders have called for it to be either amended or repealed. The evidence shows that even well-intentioned online platforms may be used to traffic human beings, entice children for illegal activity, and trade child pornography. There are growing calls for those who manage our online environments to take reasonable steps to protect their users. Especially with respect to these criminal activities, website hosts may soon have affirmative legal duties to change their platforms. These duties may eventually arise from a U.S. Supreme Court reinterpretation of Section 230, but are more likely to come from congressional action and lower court decisions.

The Bottom Line

Entities that host interactive platforms online should take steps now to prevent their platforms from being used to facilitate human trafficking and child exploitation. The statutory immunity that has protected web hosts may shrink soon, requiring online service providers to take reasonable steps to keep their users safe from trafficking and child sexual abuse.

As human trafficking and child exploitation crimes continue to be perpetrated online, politicians and the public are looking to company leadership to do what they can to avoid facilitating criminal activity. Because of the protection provided by Section 230 of the Communications Decency Act (CDA), entities operating online have avoided liability for the activities of third parties on their platforms. This liability shield has been called into question recently, as legislators, judges, academics, and other expert commentators all have acknowledged that the current approach is untenable.

Human trafficking and child exploitation online are massive problems, and their prevalence is increasing. One example of this is found in the data kept by the National Center for Missing and Exploited Children’s (NCMEC’s) CyberTipline. The CyberTipline has received over 92 million reports of suspected child exploitation since 1998, and the annual volume of reports has grown dramatically in recent years. There were fewer than 10,000 reports in 1999. In 2014, there were one million reports. In 2021, the number was 21 million. Evidence similarly shows that there has been an increased use of the internet to recruit trafficking victims.

Wherever there are online communication tools, especially if they include personal messaging, bad actors can use those tools to identify, groom, and entice children for illegal activity, or recruit children and adults to be trafficked for sex or labor. For example, a social media platform that suggests friends or followers based on a user’s current and past connections can lead traffickers to vulnerable victims. The ability to connect and send messages to others allows traffickers to gain the confidence of their victims. They may use that trust to entice a victim to send an explicit image, or to meet them in person. In-person meetings or the sharing of explicit images quickly can turn these online “friendships” into exploitative and abusive relationships. Victims may be forced into trafficking. Images can be used for extortion or shared without the subject’s consent.

But social media platforms are not the only environments where this abuse occurs. Wrongdoers connect with children on multiplayer video games and coerce them into sending sexual images. Malicious users harass, impersonate, stalk, and abuse others on dating applications. Traffickers advertise victims for sex on auction or user-to-user sale websites. Pedophiles infiltrate youth-targeted messaging applications. Online gambling sites, forums for user discussions, applications to connect people around a common interest, and countless other types of platforms unfortunately all can be venues for trafficking and exploitation.

While businesses whose services may be used in furtherance of trafficking and exploitation–such as hotels, airlines, and restaurants–must take reasonable measures to prevent facilitating such activities, online service providers long have been shielded by Section 230 of the CDA, which has been interpreted to bar any suit where liability is based on content generated by third parties.

Section 230 was passed in 1996 in response to a pair of defamation cases whose holdings disincentivized online platform hosts from moderating the content users posted. The cases held that if a website screened content for offensive material or engaged in other moderation activities, it could be liable for defamatory content it hosted because it would be considered a publisher. If a web host did not screen, however, and instead allowed users to post directly, it would be considered a distributor. As a distributor, the host generally would not be liable for the content its users posted unless it knew or had reason to know the content was unlawful.

Congress passed Section 230 with the belief that it would allow the then-fledgling internet industry to grow by protecting it from excessive liability for the activities of third parties. Section 230 also sought to encourage web hosts to moderate harmful content by shielding them from liability for any outcomes of that moderation.

The key provisions of Section 230 are 230(c)(1) and 230(c)(2). Section 230(c)(1) states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Section 230(c)(2) provides: “No provider or user of an interactive computer service shall be held liable on account of–(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).”

Courts interpreted these provisions to provide expansive protection. Absent any U.S. Supreme Court precedent on the issue, every circuit court held that Section 230 prohibits liability based on content generated by third parties as long as (1) the defendant is an interactive service provider, (2) the plaintiff seeks to hold the defendant liable as a publisher or speaker, and (3) the plaintiff’s claims arise from content provided by another information content provider. This expansive interpretation of Section 230 has led to dismissals of almost all civil suits brought by victims of trafficking and exploitation against the hosts of platforms that arguably facilitated the crimes.

It is likely, however, that this expansive liability shield will soon shrink significantly. Section 230 has been the subject of high-profile political debate, with many members of Congress calling for significant amendments, or for repeal. Attorneys General from every state signed a letter supporting these efforts. Their sentiment was summarized by Senator Edward Markey (D-Mass.) at the close of a hearing in which executives from TikTok, Snapchat, and YouTube were interrogated: “the problem is clear: Big Tech preys on children and teens to make money[.] Now is the time for the legislative solutions for these problems.”

One such legislative change occurred in 2018: the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA). FOSTA created a carve-out to Section 230 immunity in cases where the underlying conduct violates the federal sex trafficking statute. While FOSTA remains relatively untested, the few cases that have invoked it show new avenues of liability to which online entities may be exposed.

FOSTA permits plaintiffs to bring claims under a beneficiary theory, meaning that companies can be liable for a victim’s harms if they benefitted from participating in a venture with sex traffickers. In one case, Doe v. Twitter, the court held that the plaintiffs plausibly alleged Twitter’s participation in a venture with traffickers because Twitter allegedly failed to disable an account when it was informed that the account posted child pornography, and because it did not remove material reported as child pornography. No. 21-cv-00485-JCS, 2021 U.S. Dist. LEXIS 157158, at *82 (N.D. Cal. Aug. 19, 2021). Under FOSTA, plaintiffs can proceed beyond a motion to dismiss when they plausibly allege that a platform host had actual knowledge, or should have known, that certain activity involved exploitation or trafficking.

FOSTA may be a blueprint for more expansive legislative change to come. In 2020, the Department of Justice (DOJ) issued a comprehensive proposal for amending Section 230 after an extensive, stakeholder-based review of the policy. DOJ’s proposal included three elements that together would shift significant responsibility to platform hosts to address online trafficking and child exploitation. Although this proposal may not advance any further, it is worth considering as an example of what legislative change could look like.

First, the DOJ proposal includes carve-outs that would eliminate Section 230 immunity for claims brought by victims of child sex abuse and cyberstalking. These carve-outs would create a tort-based standard of care, meaning that platform hosts would have an affirmative duty to take reasonable steps to prevent child sex abuse and cyberstalking. An entity’s failure to take those reasonable steps would render it civilly liable to victims for the harms they suffered. Second, DOJ proposed a carve-out for claims that a service provider hosted content that violated federal criminal law despite having notice or actual knowledge of the content’s illegal nature. Finally, under the DOJ proposal, Section 230 immunity would not apply to any defendant that intentionally hosted illicit content. Under the statutory regime proposed by DOJ, interactive service providers would need to implement measures to prevent their platforms from facilitating trafficking, child sex abuse, cyberstalking, and related criminal activity.

Attacks on Section 230 immunity have come from the judicial branch as well. Recent cases show avenues through which judges have permitted claims to survive Section 230 defenses.

One avenue is under the theory that platform hosts sometimes can be considered developers of the content that users post. Section 230 protects against suits based on “information provided by another information content provider,” but it will not apply if the host has a sufficient role in creating the content at issue. In Fair Housing Council v. Roommates.com, the Ninth Circuit held that a website host is itself a content provider when it materially contributes to the alleged illegality of the content. 521 F.3d 1157, 1169-70 (9th Cir. 2008). If a platform host induces users to create illicit content, and the inducement directly relates to the content’s illegality, then the host can be liable as a developer of that content. In M.L. v. Craigslist, a Washington federal court found Craigslist to be a developer with respect to posts in which traffickers advertised victims for commercial sex. No. C19-6153 BHS-TLF, 2020 U.S. Dist. LEXIS 166836, at *33-35 (W.D. Wash. Apr. 17, 2020). Because the plaintiffs alleged that Craigslist was aware of trafficking on its platform, Craigslist’s provision of detailed guidelines for content creation was sufficient to render it a developer of the trafficking advertisements. As a result, the plaintiffs’ claims were not barred by Section 230.

State courts also may find that state law civil trafficking suits survive Section 230. In In re Facebook, the Texas Supreme Court interpreted FOSTA to permit a state statutory claim for damages where the state statute was materially similar to the federal sex trafficking law. In re Facebook, 625 S.W.3d 80, 97 (Tex. 2021). Whether one agrees or disagrees with the Texas court’s decision, it is likely a sign of things to come. With such a strong appetite for reconsidering Section 230, other state courts may similarly hold that state law sex trafficking claims survive Section 230.

A U.S. Supreme Court opinion reinterpreting the statute could truly open the floodgates of liability. Justice Thomas recently signaled his intentions in a statement respecting the denial of certiorari in another case, where he wrote that “in an appropriate case, [the Court] should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms.” Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, 141 S. Ct. 13, 14 (2020) (statement of Thomas, J., respecting denial of certiorari).

In a statement accompanying today’s denial of certiorari, Justice Thomas reiterated this position, but noted that the denial was appropriate for procedural reasons. Doe v. Facebook, 595 U.S. ___ (2022) (statement of Thomas, J., respecting denial of certiorari). Thomas again raised his concerns about prevailing interpretations of Section 230, and wrote: “[a]ssuming Congress does not step in to clarify Section 230’s scope, we should do so in an appropriate case.” Id.

Under the narrower reading of Section 230 proposed by Justice Thomas, the statute would protect defendants only from liability as publishers or speakers, but not from liability as distributors. If distributor liability were revived, web-based platforms could be liable for ensuing harms whenever they host content that they know is illegal, or when they host content that they should know is illegal. This would raise difficult questions. Does any employee’s knowledge charge a company itself with knowledge? Does liability immediately attach when a company becomes aware of illegal content, or is there a grace period for investigation? How proactive do entities need to be in identifying illegal content? Do the answers to these questions change based on the type, size, and business model of the company involved?

The Court also could hold that Section 230 does not apply to products liability claims. Plaintiffs in online trafficking and exploitation cases often claim that the defendant provided a defectively designed, unsafe product because its platform created opportunities for illegal predatory activity. These claims typically are dismissed under Section 230. Justice Thomas recently disparaged this application of Section 230. If others on the Court agree, such claims would become viable. This similarly would raise a challenging set of questions. How will courts apply traditional tort principles of negligence to the online world?

Regardless of when the U.S. Supreme Court ultimately takes up these questions, online service providers should expect changes to come. Some lawmakers in Congress have made clear that they intend to change the law to place more responsibility on those that host online platforms. And lower courts have shown that they are willing to shift responsibility in the same direction.

Platform hosts should be thoughtful about what risks exist in their online environments, and what steps they are taking to address them. 

Beyond liability and legislative pressure, consumers will come to expect positive action. With the growth of Environmental, Social, and Governance (ESG) investing, investors are looking at more than a company’s bottom line. Whether a company acts to protect its users from child exploitation and trafficking may well affect its economic value. And as the Section 230 liability shield shrinks, details of how these harms occur online may be further revealed through the litigation process.

Companies hesitant to act may risk not only financial liability but also reputational harm. On the other hand, those who take proactive steps now have the opportunity not only to minimize those risks but also to take a leadership role.


This alert is a periodic publication of Ballard Spahr LLP and is intended to notify recipients of new developments in the law. It should not be construed as legal advice or legal opinion on any specific facts or circumstances. The contents are intended for general informational purposes only, and you are urged to consult your own attorney concerning your situation and specific legal questions you have.