Behind Meta's Curtain: Hidden Truths of Social Media Impact
Meta stopped internal research after findings suggested Facebook and Instagram harm users' mental health, according to a lawsuit by U.S. school districts. Accusations include neglecting product risks, inadequately addressing child safety, and optimizing for engagement over safety. The lawsuit implicates Meta, Google, TikTok, and Snapchat in hiding product risks.
Meta, the parent company of Facebook and Instagram, is facing allegations of concealing research that links its social media platforms to negative mental health effects. Internal documents from a class-action lawsuit by U.S. school districts reveal Meta halted further research after findings suggested harms to users' mental health.
The lawsuit claims Meta intentionally obscured the risks associated with its platforms from users, parents, and teachers. According to the plaintiffs, Meta allegedly prioritized growth over safety, ignoring concerns about child sexual abuse content and minors' engagement with its products.
Other social media giants like Google, TikTok, and Snapchat are also implicated in similar practices. Meta spokesperson Andy Stone contested the allegations, citing improvements in safety measures and methodological flaws in the halted study. The legal proceedings continue, with a key hearing scheduled for January.
(With inputs from agencies.)