By: Derek Jiang
Volume X – Issue II – Spring 2025
I. INTRODUCTION: A TRAGIC DEATH
On December 7, 2021, ten-year-old Nylah Anderson was scrolling on TikTok when TikTok’s algorithm fed her a lethal video. [1] Even though Nylah was underage (TikTok’s own policies bar users under the age of 13), TikTok’s “For You Page” recommended a video of the “Blackout Challenge,” encouraging viewers to choke themselves with items such as belts and purse strings until they pass out. [2] After watching the video, Nylah attempted the challenge herself, unintentionally hanging herself and dying of asphyxiation.
Who is to blame for Nylah’s death? Surely, the blame cannot lie with a ten-year-old child; Nylah was far too young and immature to fully comprehend the risks. Perhaps the creator of the video shoulders some responsibility for popularizing such a dangerous challenge, and Nylah’s parents probably should have done a better job monitoring their child’s online activities. But what about TikTok? After all, according to a lawsuit filed by Nylah’s mother, TikTok was fully aware of the Blackout Challenge, yet the app allowed users to post videos of the challenge, failed to warn viewers of the potential dangers, and recommended the videos to minors like Nylah. [3] Can TikTok be held legally liable for promoting videos encouraging self-harm to minors?
Under long-standing precedent, the answer is no. Looking to Section 230 of the Communications Decency Act of 1996—a provision of the U.S. Code that generally shields social media and other Internet companies from liability for content posted by others on their platforms—lower courts have granted sweeping immunity to online service providers. [4] Under this erroneous interpretation of Section 230, judges have “ruled that platforms cannot be held culpable for negligently, recklessly, or [even] knowingly facilitating” terrorism, [5] child sexual abuse, [6] revenge porn, [7] racial discrimination, [8] human trafficking, [9] and other illegal content. [10] In one infamous case, a man’s ex-boyfriend set up a fake Grindr profile impersonating him, sending over 1,400 men to sexually harass him over the course of ten months. [11] Despite being repeatedly informed of the fake profile and the harassment, Grindr did nothing, and the app escaped all legal liability. [12]
This state of affairs is not what Section 230 demands. By its own text, Section 230 plainly does not immunize online platforms for their own misconduct, and it likewise does not shield companies when they knowingly distribute illegal content. Lower courts have construed Section 230 far too broadly, and it is time for the Supreme Court to step in and narrow the statute to its true meaning.
II. LEGAL BACKGROUND AND HISTORY
When it comes to transmitting illegal content, well-established legal principles that long predate Section 230 have traditionally distinguished between “publishers and speakers” (like newspapers, magazines, and broadcast stations) and “distributors” (like newsstands, libraries, and retailers). [13] Because publishers exercised “editorial control,” they faced strict liability for any illegal content they transmitted; if the New York Times published a defamatory opinion piece written by a non-affiliated author, the Times would be liable for defamation because it ultimately decided to include the author’s words in its publication. [14] On the other hand, distributors were liable only if they knew that content was illegal and failed to remove it. [15] A bookstore, for example, does not exercise editorial control over the books it sells and could not reasonably be expected to vet every single book on its shelves; it would be required to remove legally objectionable material only once it became aware of the illicit content.
As the Internet grew, lower courts struggled to fit online service providers into the existing publisher versus distributor liability framework. In Cubby, Inc. v. CompuServe Inc. (1991), a federal court in New York held that CompuServe was a distributor, rather than a publisher, of various defamatory statements posted by users on its online forums. [16] Because it was merely a host for online content and did not exercise editorial control over users’ posts, CompuServe could not be held liable unless it knew (or had reason to know) that the posts were defamatory in nature. Four years later, however, in Stratton Oakmont, Inc. v. Prodigy Servs. (1995), a state court in New York reached the opposite conclusion, holding that Prodigy was a publisher (rather than a distributor) of defamatory statements posted by users on its online messaging board. [17] The court reasoned that because Prodigy occasionally “moderated and took down offensive content” on its forum, its exercise of editorial control “rendered it a publisher even for content it merely distributed.” [18]
Taken together, Cubby and Stratton Oakmont put Internet companies in a bind. To avoid strict liability for their users’ posts, they had no choice but to “adopt an ‘anything goes’ approach”: no editorial control and no content moderation. [19] If they attempted to foster a more welcoming and comfortable online environment—by removing racist, pornographic, or other offensive content, for example—they would become strictly liable for anything their users posted. In such a legal landscape, the most appealing option for many Internet companies may well have been to shut down their online forums entirely.
In response to this dilemma, Congress passed Section 230. In subsection 230(c)(1), Congress ensured that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” [20] A neighboring provision, §230(c)(2), immunizes Internet service providers for taking down objectionable content in good faith. In other words, an online service provider did not become a publisher—and thus did not face strict liability—“simply by hosting or distributing” third-party content, and it could engage in good-faith content moderation efforts without facing publisher liability for any illegal third-party content it “unknowingly leaves up.” [21] Put even more simply, an online platform could not be treated as the “publisher or speaker” of speech created by its users, effectively overruling Stratton Oakmont.
The first appellate court to consider the scope of Section 230, however, ruled that the statute eliminates not only publisher liability but also distributor liability. [22] Relying heavily on the supposed policy aims of Section 230, the court inexplicably classified distributors as a subset of publishers and conferred sweeping immunity on online service providers for any exercise of “traditional editorial functions,” including decisions to knowingly distribute illegal content. [23] Ever since, lower courts have consistently applied and extended this “categorical rule across all contexts,” even immunizing companies against liability for their own misconduct in cases alleging defective product design, negligence, and fraud—that is, until Anderson v. TikTok (2024). [24]
In ten-year-old Nylah Anderson’s case, the Court of Appeals for the Third Circuit held that Section 230 did not immunize TikTok from lawsuits related to the “Blackout Challenge.” The court reasoned that TikTok’s recommendations on the “For You Page” reflect TikTok’s own speech, as they involve editorial control over the “selection and presentation” of what content to promote to users, and Section 230 provides immunity only with respect to speech created by third parties. [25] In holding TikTok potentially liable for distributing the “Blackout Challenge” videos, the Third Circuit explicitly recognized that its holding “may depart from the views… of other circuits,” which have generally held that platforms are not legally responsible for their algorithmic recommendations. [26] The split among the lower courts over the scope of Section 230 cries out for clarification, and after nearly three decades of avoiding the issue, it is finally time for the Supreme Court to intervene.
III. THE LEGAL CASE
There are two primary legal reasons why the Supreme Court should rein in the scope of Section 230. Both stem from the only plausible reading of the law’s plain text.
First, lower courts erred when they relied on Section 230 to immunize online service providers for their own misconduct, even though the statute clearly confers no such immunity. The plain text of Section 230 provides immunity only for “information provided by another information content provider” (emphasis added). [27] In other words, online platforms cannot be sued for someone else’s expressive activity (third-party speech), but they remain liable for their own actions. For example, while TikTok cannot be sued for merely hosting videos uploaded by users, it may still be responsible for its own expressive activity (first-party speech); in Anderson v. TikTok, this expressive activity was TikTok’s targeted recommendations to Nylah via the “For You Page.” One may question whether algorithmic recommendations truly count as first-party speech, but the Supreme Court has explicitly held that creating and promoting “a curated compilation” of third-party material reflects “editorial discretion in the selection and presentation of content” and undoubtedly “qualifies as speech activity.” [28] Just imagine if “an adult walked up to Nylah… and said, ‘I know you, and I know you’ll like [these videos].’” [29] TikTok was more than simply a host of third-party content—it was an active promoter.
Beyond immunizing platforms’ own first-party speech, however, lower courts have also granted immunity in cases alleging a wide range of misconduct unrelated to traditional publication torts, including defective product design, negligence, and fraud. Consider the Grindr case: the victim alleged that Grindr defectively designed its app without “basic safety features to prevent harassment and impersonation” and negligently failed to remove fake profiles. [30] Another design-defect case alleged that Snapchat contained “a feature that encouraged reckless driving.” [31] One court even immunized an Internet company for “deliberately [structuring] its website to facilitate human trafficking”—for example, by “[accepting] anonymous payments, [failing] to verify e-mails, and [stripping] metadata from photographs to make crimes harder to track.” [32] The decision was so egregious that Congress passed a law in 2018, FOSTA-SESTA, specifically to strip immunity from online service providers that facilitate sex trafficking. [33]
The “common thread through all these cases” is that the lawsuits were not trying to hold online companies liable as the “publisher or speaker” of third-party content; rather, the plaintiffs were seeking to hold the platforms accountable for their own misconduct, which Section 230 plainly does not cover. [34]
Second, lower courts have incorrectly concluded that Section 230 eliminates both publisher and distributor liability, even though the statute addresses only the former. The plain text of Section 230 states that no online service provider shall be treated “as the publisher or speaker” of any third-party content (emphasis added). [35] Section 230 does not eliminate or even address distributor liability, and while some legal opinions appear to blur the distinction between publishers and distributors by treating distributors as a subset of publishers, it would be odd to interpret Section 230 as doing the same. [36] First, Congress enacted Section 230 in response to Stratton Oakmont, which explicitly distinguished between “publishers” and “distributors”; if Stratton Oakmont served “as the legal backdrop on which Congress legislated,” it would only make sense for Congress to use the same terms with the same meanings that Stratton Oakmont did. [37] Second, in the exact same Act that included Section 230, Congress also “expressly imposed distributor liability” for knowingly distributing “obscene material to children, even if a third party created” it. [38, 39] Given that Congress knew how to address distributor liability in the law, it would be logically inconsistent to suggest that Congress “implicitly eliminated distributor liability in the very Act in which Congress explicitly imposed it.” [40] Finally, if Congress genuinely intended to confer on online platforms the sweeping immunity that the lower courts have granted them, it would have simply written: “No provider shall be liable for any third-party content.” There would have been no need to mention “publisher or speaker.”
In discarding the “longstanding distinction between ‘publisher’ liability and ‘distributor’ liability,” courts have departed from the most natural and most logical reading of Section 230’s plain text. Section 230 does not provide immunity for online platforms when they knowingly distribute illegal content; a website that knowingly hosts child pornography, for example, can be held legally liable as the “distributor” of illicit material without being treated as the “publisher or speaker” of such material, which is the only thing that Section 230 forbids. To hold otherwise would be to flatly ignore the specific words that Congress chose in enacting Section 230.
In construing Section 230 so broadly, lower courts have relied too heavily on policy and purpose arguments. Reasoning that Congress probably intended to promote “freedom of speech in the new and burgeoning Internet medium,” courts have granted sweeping immunity without carefully considering the actual text of Section 230. [41] Unfortunately for online platforms, the true scope of Section 230’s immunity shield is much more modest.
IV. POLICY CONSIDERATIONS
Narrowing the scope of Section 230 would unquestionably leave online service providers more vulnerable to lawsuits, as online platforms could no longer claim immunity for their own misconduct or for knowingly distributing illegal content. For critics of Section 230, this expansion of liability is necessary to encourage Internet companies to act responsibly. Without the threat of lawsuits, companies have little incentive to remove harmful content that puts their users in danger, as they know they can rely on Section 230 as a get-out-of-jail-free card even when they negligently, recklessly, or knowingly facilitate harm. [42] For opponents of Section 230 reform, however, an expansion of liability would drastically hinder free speech on the Internet by encouraging platforms to censor and take down any legally questionable content to avoid liability, particularly when pressured by wealthy, litigious individuals and institutions. [43] The heightened legal risk may also force platforms—especially smaller platforms and those that operate in legal gray areas—to shutter some or all of their services; this phenomenon was observed after the passage of FOSTA-SESTA, as some websites for online dating and sex work shut down for fear of potential lawsuits. [44, 45, 46] An expansive interpretation of Section 230, they argue, is essential for a “free and open Internet.” [47]
While the policy arguments against narrowing Section 230 certainly hold some merit, especially those related to free speech and censorship, it is not clear that narrowing Section 230 would ultimately leave online service providers as helpless as they claim. For starters, Section 230’s liability shield already has some exceptions, such as for copyrighted material and content that facilitates sex trafficking. Online companies have successfully navigated these exceptions to date and, over time, would also be able to adapt to a new legal regime that affords less immunity. [48] In addition, “paring back the sweeping immunity courts have read into Section 230” would not automatically render online platforms liable for online misconduct; it would simply “give plaintiffs a chance to raise their claims in the first place.” [49] A design-defect plaintiff, for example, would still have to affirmatively prove that the platform was defectively designed. Finally, a narrower reading of Section 230 would still shield online service providers from strict publisher liability; they would be required to remove uploaded content only once they became aware that such material was illegal.
Ultimately, the best policy solution likely lies in legislative reform. Given the sophisticated algorithms and diverse features that modern online platforms deploy, as well as the sheer volume of material posted online, the Internet is a very different place than it was in 1996, and Congress should update “liability laws to make them more appropriate for an Internet-driven society.” [50] Indeed, there have been dozens of legislative proposals to amend Section 230. Some advocate restoring a duty of reasonable care “as a condition of receiving the liability limitations,” while others suggest “removing immunity only for certain types of claims or certain providers.” [51, 52]
In this respect, narrowing the scope of Section 230 would yield an additional benefit: it would bring social media and other Internet companies to the bargaining table on Section 230 reform. Under the current legal framework of maximal immunity, even as Democrats and Republicans alike criticize the breadth of Section 230 and call for major overhauls, Internet companies have little to no incentive to discuss reform, and many Big Tech firms actively lobby members of Congress against any such changes. [53] Should Section 230 be limited in scope, the opposite would be true: Internet companies would clamor for Congress to pass Section 230 reform, catalyzing legislative efforts to update immunity laws for the modern age. Given the bipartisan pressure to narrow Section 230 in the current political environment, such updated immunity laws would likely be narrower than the current scope of Section 230, reflecting nuance and thoughtful consideration of how the Internet actually works in today’s world.
As policymakers debate the proper level of immunity to afford online service providers, the Supreme Court has a legal duty to step in and narrow Section 230 to its true meaning. Policy arguments cannot override the plain meaning of the statute’s text. “When the express terms of a statute give us one answer and extratextual considerations suggest another, it’s no contest. Only the written word is the law.” [54] Perhaps there are good reasons to confer sweeping immunity to online platforms, but they cannot be found in the text of Section 230.
V. CONCLUSION: AN UNRESOLVED PARADOX
Just last year, in Moody v. NetChoice (2024), large social media companies argued—and the Supreme Court agreed—that curating and promoting personalized compilations of third-party content to users, in the form of newsfeeds and “For You Pages,” represents the platforms’ own constitutionally protected speech. [55] In doing so, these online platforms secured greater constitutional protections against Florida and Texas laws that attempted to regulate their content moderation practices. However, when the time came for “platforms to be held accountable for their websites,” they argued the opposite. [56] All of a sudden, they were “not speakers under Section 230,” and because the content at issue was no longer their own speech, they could not be held liable. [57] In the eyes of the Internet companies, they are fully responsible for their platforms “when it results in constitutional protections” but not responsible at all when it results in legal liability. [58] Both cannot simultaneously be true.
The current interpretation of Section 230 adopted by most lower courts has allowed social media companies to express casual indifference to the death of a ten-year-old. We are told that they did not have—and should not have—any legal duty to prevent such harms. We as a society should expect more from these companies, but more importantly, so should the law. It is time for the Supreme Court to act.
Endnotes
[1] CBS Philadelphia, “Delaware County Mother Suing TikTok After Daughter Dies While Performing ‘Blackout Challenge,’” CBS News, May 12, 2022, https://www.cbsnews.com/philadelphia/news/tiktok-blackout-challenge-lawsuit-nylah-anderson/.
[2] Anderson v. TikTok, 637 F. Supp. 3d 276 (E.D. Pa. 2022) (complaint).
[3] Anderson v. TikTok (complaint).
[4] Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).
[5] Force v. Facebook, Inc., 934 F.3d 53 (2nd Cir. 2019).
[6] Doe v. America Online, Inc., 783 So. 2d 1010 (Fla. 2001).
[7] Jones v. Dirty World Entertainment Recordings LLC, 755 F.3d 398 (6th Cir. 2014).
[8] Chicago Lawyers' Committee For Civil Rights Under Law v. Craigslist, 519 F.3d 666 (7th Cir. 2008).
[9] Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12 (1st Cir. 2016).
[10] Neil Fried, “Why Section 230 Isn’t Really a Good Samaritan Provision,” DigitalFrontiers Advocacy, March 24, 2021, https://www.congress.gov/117/meeting/house/111407/documents/HHRG-117-IF16-20210325-SD013.pdf.
[11] Herrick v. Grindr, 765 F. App'x 586 (2nd Cir. 2019).
[12] Carrie Goldberg, “Herrick v. Grindr: Why Section 230 of the Communications Decency Act Must be Fixed,” Lawfare, August 14, 2019, https://www.lawfaremedia.org/article/herrick-v-grindr-why-section-230-communications-decency-act-must-be-fixed.
[13] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[14] Restatement (Second) of Torts, § 578.
[15] Restatement (Second) of Torts, § 581.
[16] Cubby Inc. v. CompuServe, 776 F. Supp. 135 (S.D.N.Y. 1991).
[17] Stratton Oakmont, Inc. v. Prodigy Services Co., 23 Media L. Rep. 1794 (N.Y. Sup. Ct. 1995).
[18] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[19] David French, “Opinion: The Viral Blackout Challenge is Killing Young People. Courts Are Finally Taking it Seriously,” The New York Times, September 5, 2024, https://www.nytimes.com/2024/09/05/opinion/tiktok-blackout-challenge-anderson.html.
[20] 47 U.S.C. § 230(c)(1).
[21] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari)
[22] Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).
[23] Zeran v. America Online, Inc. (4th Cir. 1997).
[24] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[25] Anderson v. TikTok, Inc., No. 22-3061 (3d Cir. 2024).
[26] Anderson v. TikTok, Inc. (3d Cir. 2024).
[27] 47 U.S.C. § 230(c)(1).
[28] Moody v. NetChoice, LLC, 603 U.S. 707 (2024).
[29] David French, “Opinion: The Viral Blackout Challenge is Killing Young People. Courts Are Finally Taking it Seriously,” The New York Times, September 5, 2024, https://www.nytimes.com/2024/09/05/opinion/tiktok-blackout-challenge-anderson.html.
[30] Herrick v. Grindr, 765 F. App'x 586 (2nd Cir. 2019).
[31] Lemmon v. Snap, Inc., 440 F. Supp. 3d 1103, 1107, 1113 (C.D. Cal. 2020).
[32] Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 16–21 (1st Cir. 2016).
[33] Patrick J. Carome and Ari Holtzblatt, “Congress Enacts Law Creating a Sex Trafficking Exception from the Immunity Provided by Section 230 of the Communications Decency Act,” WilmerHale, April 16, 2018, https://www.wilmerhale.com/insights/client-alerts/2018-04-16-congress-enacts-law-creating-a-sex-trafficking-exception-from-the-immunity-provided-by-section-230-of-the-communications-decency-act.
[34] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[35] 47 U.S.C. § 230(c)(1).
[36] Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).
[37] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[38] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[39] 47 U.S.C. § 223(d).
[40] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[41] Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997).
[42] Neil Fried, “Why Section 230 Isn’t Really a Good Samaritan Provision,” DigitalFrontiers Advocacy, March 24, 2021, https://www.congress.gov/117/meeting/house/111407/documents/HHRG-117-IF16-20210325-SD013.pdf.
[43] Jason Kelley, “Section 230 is Good, Actually,” Electronic Frontier Foundation, December 3, 2020, https://www.eff.org/deeplinks/2020/12/section-230-good-actually.
[44] Samantha Cole, “Craigslist Just Nuked Its Personal Ads Section Because of a Sex-Trafficking Bill,” Vice, March 23, 2018, https://www.vice.com/en/article/craigslist-personal-ads-sesta-fosta/.
[45] Samantha Cole, “Furry Dating Site Shuts Down Because of FOSTA,” Vice, April 2, 2018, https://www.vice.com/en/article/furry-dating-site-pounced-is-down-fosta-sesta/.
[46] Siouxsie Q, “Anti-Sex-Trafficking Advocates Say New Law Cripples Efforts to Save Victims,” Rolling Stone, May 25, 2018, https://www.rollingstone.com/culture/culture-features/anti-sex-trafficking-advocates-say-new-law-cripples-efforts-to-save-victims-629081/.
[47] “Section 230,” Electronic Frontier Foundation, accessed January 1, 2025, https://www.eff.org/issues/cda230.
[48] Valerie Brannon and Eric Holmes, “Section 230: An Overview,” Congressional Research Service, January 4, 2024, https://crsreports.congress.gov/product/pdf/R/R46751.
[49] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[50] Malwarebytes, Inc. v. Enigma Software Group USA, LLC, 592 U.S. ___ (2020) (Thomas, J., respecting the denial of certiorari).
[51] Neil Fried, “Why Section 230 Isn’t Really a Good Samaritan Provision,” DigitalFrontiers Advocacy, March 24, 2021, https://www.congress.gov/117/meeting/house/111407/documents/HHRG-117-IF16-20210325-SD013.pdf.
[52] Valerie Brannon and Eric Holmes, “Section 230: An Overview,” Congressional Research Service, January 4, 2024, https://crsreports.congress.gov/product/pdf/R/R46751.
[53] David McCabe, “Tech Companies Shift Their Posture on a Legal Shield, Wary of Being Left Behind,” The New York Times, December 15, 2020, https://www.nytimes.com/2020/12/15/technology/tech-section-230-congress.html.
[54] Bostock v. Clayton County, 590 U.S. 644 (2020).
[55] Moody v. NetChoice, LLC, 603 U.S. 707 (2024).
[56] Doe v. Snap, Inc., 603 U.S. ___ (2024) (Thomas, J., dissenting from the denial of certiorari).
[57] Doe v. Snap, Inc. (2024) (Thomas, J., dissenting from the denial of certiorari).
[58] Doe v. Snap, Inc. (2024) (Thomas, J., dissenting from the denial of certiorari).