Meta Faces Lawsuit Over Alleged Negligence in Child Safety

A significant lawsuit has emerged against Meta, the parent company of Facebook and Instagram, alleging that the company prioritized profits over the safety of minors on its platforms. According to court documents filed in U.S. District Court in Oakland, California, Meta's policies allowed accounts associated with sex trafficking to remain active even after 16 prior violations, effectively permitting harmful content to proliferate before any action was taken.

The lawsuit, brought by children, parents, school districts, and states including California, claims that Meta, along with other social media giants such as Google’s YouTube, Snap, and TikTok, intentionally designed their platforms to be addictive, knowingly exposing children to harmful content. The plaintiffs assert that these companies misrepresented their products while targeting vulnerable populations.

“Despite earning billions of dollars in annual revenue—$62.4 billion last year alone—Meta simply refused to invest resources in keeping kids safe,” the filing states. The lawsuit seeks unspecified damages and demands a court order to halt what it describes as “harmful conduct,” along with warnings for parents and minors regarding the addictive nature of these platforms.

The filing includes allegations that Meta’s internal communications reveal a troubling disregard for child safety. For instance, it claims that an account-recommendation feature on Instagram directed nearly 2 million minors toward adults seeking to exploit them. Furthermore, an internal audit reportedly found that over 1 million potentially inappropriate adults were recommended to teen users in a single day during 2022.

Meta has denied the allegations. "We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions in an attempt to present a deliberately misleading picture," a company spokesperson said. The spokesperson emphasized that for more than a decade, Meta has worked to enhance safety features, including the introduction of Teen Accounts designed with protections for younger users.

In its response, Snap contended that the allegations mischaracterize its platform, which employs distinctive features to promote safety and privacy. The company highlighted its commitment to creating a secure environment through various safeguards and partnerships with experts.

The plaintiffs have cited internal communications and sworn depositions, although many details remain sealed by the court and cannot be independently verified. Among the troubling claims is that Meta’s algorithms often recommended adult accounts to minors, exacerbating the risks associated with online interactions.

The lawsuit further criticizes Meta’s handling of child exploitation imagery. It alleges that the company did not adequately address reported cases of such content and lacked effective reporting mechanisms for users. As of March 2020, Meta reportedly had no process in place for reporting child sexual abuse material on Instagram. An internal audit revealed that even when artificial intelligence tools flagged harmful content with high confidence, the company hesitated to delete it due to concerns about potential false positives.

Internal documents reportedly indicated that when proposed safety changes conflicted with user engagement metrics, engagement often took precedence. This prioritization allegedly led to significant delays in implementing basic protections for minors, such as making teen accounts private by default.

The lawsuit also raises concerns over social media’s impact on children’s mental health, claiming that platforms like Instagram have contributed to a compromised educational environment. The plaintiffs argue that schools are now forced to allocate resources to address distractions and mental health issues stemming from social media use.

Former Meta executives have testified that there was a pervasive culture within the company that prioritized growth and engagement over user safety. One former employee expressed skepticism about Meta’s commitment to user safety, suggesting that the company did not take the potential harms seriously.

As this case unfolds, it highlights the growing scrutiny facing social media companies regarding their responsibilities toward young users. The outcome could have far-reaching implications for how these platforms operate and their accountability for the safety of their user base.