Behind Meta's Curtain: Hidden Truths of Social Media Impact
Meta stopped internal research after findings suggested Facebook and Instagram harm users' mental health, according to a lawsuit by U.S. school districts. Accusations include neglecting product risks, inadequately addressing child safety, and optimizing for engagement over safety. The lawsuit implicates Meta, Google, TikTok, and Snapchat in hiding product risks.
Meta, the parent company of Facebook and Instagram, is facing allegations of concealing research that links its social media platforms to negative mental health effects. Internal documents cited in a class-action lawsuit filed by U.S. school districts show that Meta halted further research after findings suggested harm to users' mental health.
The lawsuit claims Meta intentionally obscured the risks associated with its platforms from users, parents, and teachers. According to the plaintiffs, Meta prioritized growth over safety, ignoring concerns about child sexual abuse content and minors' engagement with its products.
Other social media giants, including Google, TikTok, and Snapchat, are implicated in similar practices. Meta spokesperson Andy Stone contested the allegations, pointing to the company's safety improvements and to what he described as methodological flaws in the halted study. The legal proceedings continue, with a key hearing scheduled for January.
(With inputs from agencies.)