According to CNBC, newly unredacted court documents allege that Meta halted internal research in 2019 showing that people who stopped using Facebook reported lower levels of depression, anxiety, loneliness, and social comparison. The study, called Project Mercury, involved a random sample of consumers who stopped using Facebook and Instagram for a month starting in late 2019. The lawsuit claims Meta was disappointed by these initial results and, rather than "sound the alarm," stopped the research entirely. The legal filing is part of multidistrict litigation brought by school districts, parents, and state attorneys general against Meta, Google's YouTube, Snap, and TikTok. Meta spokesperson Andy Stone strongly denied the allegations, calling them "cherry-picked quotes and misinformed opinions" while defending the company's decade-long record of protecting teens.
The Tobacco Company Comparison
Here’s the thing that really stands out in these allegations. The lawsuit cites an unnamed Meta employee who apparently asked, “If the results are bad and we don’t publish and they leak, is it going to look like tobacco companies doing research and knowing cigs were bad and then keeping that info to themselves?” That’s a remarkably self-aware question, and it predicts exactly how this would look if it ever came out. And now it has.
Meta’s Counterargument
Meta isn’t just denying that it buried bad research; it’s arguing the research itself was flawed. In a series of social media posts, Andy Stone characterized the 2019 study as merely showing that “people who believed using Facebook was bad for them felt better when they stopped using it.” He argued this says nothing about the actual effect of using the platform, just confirmation bias. But here’s the question: if the research was so methodologically weak, why halt it entirely rather than improve the methodology? That’s the part that doesn’t quite add up.
Where This Is Headed
This lawsuit is part of a much larger wave of legal and regulatory pressure building against social media companies. School districts, state attorneys general, and parents are all lining up with similar claims: that these platforms knew about mental health harms and didn’t do enough. The timing couldn’t be worse for Meta, which is already facing scrutiny over youth safety features and content moderation. If these allegations gain traction in court, we could be looking at a fundamental rethinking of how social media platforms are regulated when it comes to protecting younger users. The tobacco comparison might seem dramatic, but it’s exactly the kind of narrative that could reshape an entire industry.
