Meta Allegedly Concealed Research Showing Facebook Linked to Depression and Anxiety

Meta, Facebook’s parent company, allegedly hid research showing that Facebook use is linked to depression and anxiety, according to recently unredacted court filings. The documents, released as part of a lawsuit brought by US school districts, allege that internal communications show the company dismissed its own findings about negative mental health effects as biased. The case has intensified ongoing scrutiny of Meta’s impact on mental health, with the company also facing pressure from the Federal Trade Commission over antitrust allegations.

The findings, which showed that users who stopped using Facebook reported lower levels of depression and anxiety, were allegedly dismissed by the company as biased. The court filings, part of a long-running, high-profile lawsuit, allege that Meta halted the project, claiming participants’ feedback was influenced by existing media narratives. This has led to accusations that the company lied to Congress about what it knew, further damaging its reputation.

In addition to the mental health allegations, Meta has faced increased scrutiny in the US in recent months. The company has been under pressure from the Federal Trade Commission, which has accused it of holding a monopoly in social networking. Last week, however, a Washington district court ruled in Meta’s favor in the antitrust lawsuit, finding that the US competition watchdog had not proven that the company currently holds a monopoly, “whether or not Meta enjoyed monopoly power in the past.”

Amid these allegations, Meta has introduced new measures to improve safety for its younger users. The company said it would add new safeguards to its “teen accounts,” allowing parents to turn off their children’s communications with the company’s AI chatbots, following earlier revelations that the chatbots could engage minors in romantic or sensual conversations. These steps, however, have not quelled all concerns, with critics arguing that more needs to be done to address the mental health impact of social media use.

The case has drawn significant public attention and is part of a broader conversation about the responsibilities of tech companies in protecting user well-being. Legal experts are awaiting further developments in the case, which could have significant implications for how companies handle internal research and transparency. The outcome may also influence regulatory action and corporate accountability standards across the tech industry.