US Court Filings Claim Meta Halted Research Highlighting Social Media Mental Health Risks
Newly unsealed court filings outline allegations that Meta ended internal research showing potential mental health harms, raising questions about transparency and youth safety across major social platforms.
New filings in a U.S. lawsuit brought by school districts allege that Meta stopped internal research after early results suggested that reduced use of its platforms led to improvements in users’ mental wellbeing.
The documents describe staff concerns that the findings pointed to meaningful, measurable mental health effects.
According to the filings, the 2020 project, known as “Project Mercury,” was conducted in cooperation with the research firm Nielsen to study outcomes when participants deactivated Facebook.
Researchers reportedly found that users who stepped away for a week reported lower levels of anxiety, loneliness, depression, and social comparison.
The filings state that, rather than continue the work or publicly acknowledge the results, the company halted the project.
The internal explanation described the findings as being influenced by negative public narratives, although some staff privately affirmed that the results were valid.
One researcher reportedly wrote that the study demonstrated a causal relationship between platform use and social comparison outcomes.
Another staff member expressed concern that withholding results could resemble historical cases in which harmful information was not shared publicly.
The lawsuit alleges that despite internal research suggesting harm, the company later told lawmakers it had no means of determining whether its products negatively affected young users.
Meta, however, has said the project was stopped because of methodological flaws, not to obscure its outcomes.
A company spokesperson said Meta has consistently worked to improve safety features and has taken feedback from parents and researchers over many years.
The statement emphasized that the company continues to prioritize teen safety and product improvements.
The claims are part of a broader legal action filed by school districts across the country, involving Meta alongside Google, TikTok, and Snapchat.
The plaintiffs argue that the companies hid known risks from parents, educators, and young users.
The suit also alleges that major platforms enabled underage usage, did not adequately address harmful content, and encouraged minors to remain active on their services during school hours.
It further claims that companies attempted to influence child-focused organizations to publicly support their safety policies.
One cited example involves TikTok’s sponsorship of the National PTA, after which internal communications allegedly suggested the organization would strongly support the company’s public messaging. The filing claims TikTok expected public statements and endorsements to follow from the sponsorship agreement.
Many of the most detailed allegations relate directly to Meta’s internal practices.
The filings claim that youth safety features were intentionally designed in ways that limited their impact or usage.
Documents cited by the plaintiffs also allege that the company set a high bar for removing users involved in serious violations, including sex trafficking attempts, a threshold that internal discussions reportedly described as unusually high.
Another allegation suggests that optimizing the platform to increase teen engagement resulted in higher exposure to potentially harmful content.
The documents state that concerns raised by safety staff were not prioritized due to growth considerations.
The lawsuit further asserts that internal efforts to prevent unwanted contact between adults and minors were delayed. It also references communications claiming that senior leadership focused on other priorities at key moments.
Meta disputes these assertions, saying its safety systems are effective and its policies do not tolerate serious violations. A spokesperson said the lawsuit relies on selective interpretations of internal communications.
The company has also opposed the unsealing of certain internal documents, arguing that the request is overly broad. A hearing on these motions is scheduled for late January in federal court in California.
The case represents one of the most extensive legal challenges to social media safety practices brought by public institutions.
As the litigation progresses, it is expected to shape public debates around platform design, youth wellbeing, and transparency in the tech sector.