From Alpha to Omegle: Recent Developments in Product Liability and Section 230 of the Communications Decency Act

Preston Buchanan — “Talk to strangers!” seems like an innocent, attention-grabbing headline emblazoned on the front page of the Omegle website, which markets itself as a “great way to meet new friends” where users are “paired randomly with another person to talk one-on-one.” Despite the purported social and entertainment value of its services, Omegle appears to have acknowledged an underlying threat posed by sexual predators prowling its chatrooms. Warnings on its front page state that “video chat is moderated but no moderation is perfect,” that “users are solely responsible for their behavior while using Omegle,” and, most importantly, that users “must be 18 or older.” These disclaimers, reactive though they are, had until recently shielded Omegle from liability under the prevailing interpretation of Section 230 of the Communications Decency Act of 1996 (CDA).

Congress enacted the CDA when the Internet was in its infancy. It was designed for an era when interactive computer service providers (ICSs) served as conduits for third-party content in amounts so vast that moderation was seen as an impossible task. In Section 230, Congress removed the threat of tort liability from online entities serving as speakers or publishers of third-party information. Yet the Internet has evolved far beyond a mere repository of publications. From scrolling through social media, to shopping, and even working and attending class, the Internet now provides forms of interaction that were unforeseeable in 1996. The language of Section 230(c)—a mere twenty-six words—addresses Internet immunity far too broadly. While some observers have called for “high-level, future-proof legislation” in response to the CDA’s shortcomings, for the time being, victims of harm incurred online have had mixed success skirting Section 230’s publisher immunity by filing suit on alternative theories of liability.

Recently, victims filing suit against ICSs have had the greatest success under product liability theories. As Justice Thomas noted in Malwarebytes, Inc. v. Enigma Software Grp. USA, LLC, plaintiffs advancing a product liability theory are “not necessarily trying to hold the defendants liable for their own misconduct.” Justice Thomas specifically cited product liability claims as a class of claims for which platforms should not be entitled to immunity. In doing so, he pointed to the Second Circuit’s unpublished decision in Herrick v. Grindr, in which the court passed over the plaintiff’s product liability claim in favor of an overly broad construction of Section 230. There, the plaintiff was the victim of a harassment campaign by his ex-boyfriend, who had created Grindr profiles impersonating him and used them to communicate with other users in his name, directing those users to his home and workplace. The key takeaway from Herrick is that a valid product liability claim must be wholly independent of third-party content to evade Section 230’s publisher immunity.

Not all courts have blindly applied Section 230’s expansive protections for ICSs. In Lemmon v. Snap, Inc., the Ninth Circuit rejected a Section 230 defense to a claim alleging wrongful death as a result of Snapchat’s Speed Filter. The plaintiffs alleged that Snap, Inc. knew that users had sped faster than 100 miles per hour in the false belief that they would be rewarded with in-app trophies, yet failed to discourage such activity. The plaintiffs accordingly claimed that Snap’s negligent design of the Speed Filter caused their sons’ deaths. The panel concluded that the plaintiffs’ negligent design claim faulted Snap, Inc. for Snapchat’s design alone, observing that the app’s “Speed Filter and reward system worked together to encourage users to drive at dangerous speeds.” Thus, the defect in Snapchat manifested itself before the sharing of any third-party content, allowing a valid product liability claim to proceed notwithstanding Section 230 of the CDA.

The Ninth Circuit’s conclusion in Lemmon suggests that courts will no longer ignore valid product liability claims in favor of the once-prevailing view that ICSs are mere publishers or distributors of third-party content immune from suit. Indeed, plaintiffs in dozens of pending cases at the trial and appellate levels cite Lemmon in their motions and briefs. One setback for victims came in Anderson v. TikTok, in which the Eastern District of Pennsylvania held in October 2022 that the plaintiff’s product liability claims were based entirely on the TikTok defendants’ presentation of “dangerous and deadly videos” created by third parties. There, the family of a ten-year-old girl who died from self-strangulation after allegedly watching a video on TikTok encouraging participation in a “Blackout Challenge” filed suit against TikTok and ByteDance, alleging that TikTok’s algorithm was defective in serving up videos encouraging dangerous activities. The TikTok defendants noted that the Third Circuit had recently held that an algorithm is not a product for purposes of product liability, and the court added that the claims for design defect and failure to warn were “inextricably linked” to the TikTok defendants’ publication of third-party content. However unwelcome the Anderson decision is for victims, it appears entirely in line with the limitation set out in Herrick.

The most favorable post-Lemmon decision to date for victims of harm incurred online is the District of Oregon’s July 2022 decision in A.M. v. Omegle.com, LLC. There, an eleven-year-old girl sought recovery from Omegle after the site paired her with Fordyce, a sexual predator who extorted her into performing sexual acts on camera. The court concluded that the victim had adequately pled a product liability cause of action and that Omegle could have satisfied its alleged obligation to her by designing its website differently. The court reasoned that, as in Lemmon, the chats, videos, and pictures the victim sent to Fordyce were immaterial to whether Omegle was being treated as a publisher under the CDA. Although content was ultimately created, the allegedly inadequate warnings and defective design of the Omegle website preceded any third-party content and led to the illegal conduct between the victim and Fordyce.

Going forward, victims filing product liability claims against a website or app must show that a defective design or inadequate warning proximately caused their harm before the introduction of any third-party content.