Court Rules Social Media Not Liable in Buffalo Shooting


In a landmark decision, a New York state appellate court ruled that social media companies, including Meta (Facebook, Instagram), Google (YouTube), and others, cannot be held liable for the 2022 Buffalo mass shooting. The ruling overturned a lower court decision and affirmed the protections afforded to these platforms under Section 230 of the Communications Decency Act, which shields them from liability for user-generated content.

The case originated from the racially motivated attack carried out by Payton Gendron, who killed 10 Black individuals at a Tops Friendly Markets store in Buffalo, New York, on May 14, 2022. The plaintiffs argued that these platforms played a role in enabling the radicalisation of Gendron by hosting and promoting harmful content. However, the appellate court held that Section 230 protections should remain intact, as the law was designed to foster the growth of the internet by providing immunity to online platforms from content posted by users.

Justice Stephen Lindley, who authored the majority opinion, warned that holding platforms liable for user-generated content could undermine the fundamental intent of Section 230. He argued that such a decision would fundamentally alter the internet, reducing it to simple message boards rather than the dynamic, interactive platforms that support communication, business, and innovation. The dissenting justices, by contrast, argued that the platforms' failure to prevent the spread of extremist content directly contributed to Gendron's radicalisation, and pushed for greater accountability.

This ruling reinforces the legal shield that Section 230 provides to tech companies and highlights the ongoing tension between platform immunity and the push for accountability in the age of digital radicalisation. The case adds to the broader conversation about how much responsibility online platforms should bear in regulating harmful content and preventing real-world violence.

The decision has significant implications for future cases involving online platforms and could influence how courts weigh the balance between free speech, content moderation, and public safety in the digital age.

Legal Insider