Massachusetts Rules Meta Must Face Lawsuit Over Youth Addiction Design
The Massachusetts Supreme Judicial Court ruled that Meta Platforms must face a lawsuit from the state's attorney general, alleging the company deliberately designed features on Facebook and Instagram to addict young users. This decision is significant because it challenges the scope of federal protections that typically shield internet companies from lawsuits regarding user-generated content.
The Massachusetts Ruling and Allegations
The ruling confirms that the state can pursue claims against Meta regarding its own business conduct, rather than solely relying on content posted by third parties. The lawsuit, filed by Massachusetts Attorney General Andrea Joy Campbell, centers on the alleged addictive nature of the platform's design features.
Key allegations include:
Addictive Features: The use of push notifications, 'likes' on posts, and the infinite scroll function.
Profit Motive: The state alleges these features were designed to exploit teenagers' psychological vulnerabilities, including the 'fear of missing out' (FOMO), for profit.
Internal Knowledge: The lawsuit claims that internal data showed the platform was harmful to children, yet top executives rejected changes that would have improved teens' well-being.
Legal Implications and Section 230
Meta had previously attempted to dismiss the case by invoking Section 230 of the Communications Decency Act of 1996. This federal law generally shields internet companies from liability for content posted by their users.
State Argument: Massachusetts argued that Section 230 should not apply because the claims relate to Meta's own design and safety statements, not third-party content.
Court Agreement: A trial court judge agreed, ruling that the law did not apply to allegations concerning the negative impacts of Instagram's design features, as the state was seeking to hold Meta liable for its own business conduct.
Broader Context of Tech Accountability
This ruling comes amid mounting legal pressure on Meta regarding its impact on minors. The Massachusetts case is one of at least nine state-level actions since 2023.
Los Angeles Lawsuit: A recent trial found Meta and Google negligent for designing social media platforms harmful to young people, awarding $6 million to a 20-year-old woman who claimed she was addicted to social media.
New Mexico Lawsuit: Separately, a jury found Meta owed $375 million in civil penalties after New Mexico's attorney general accused the company of misleading users about platform safety and enabling child sexual exploitation.
National Trend: Thirty-four other states are currently pursuing similar lawsuits against Meta in federal court, focusing on corporate responsibility for product design.