New Lawsuits and Allegations Highlight Meta’s Failure to Protect Teens
Recent revelations stemming from whistleblower disclosures and unsealed legal documents have spotlighted Meta’s apparent knowledge of a staggering number of underage users on its platforms, notably Instagram and Facebook. Despite this awareness, Meta reportedly took minimal action, disabling only a fraction of the accounts in question. The company’s handling of underage users’ data without adequate consent has raised significant concerns about its ethical obligations and practices in safeguarding minors online.
Allegations of Ignoring Child Users and Charges of Psychological Manipulation
Meta, the parent company of Facebook and Instagram, is facing new allegations that it failed to protect teens from harmful content on its platforms. According to recently unsealed court documents, Meta has received more than 1.1 million reports of users under the age of 13 on its platforms since early 2019, yet disabled only a small fraction of those accounts, raising further questions about its handling of underage users’ data without parental consent.
Beyond failing to shield teenagers from harmful content, Meta is accused of designing and deploying “harmful and psychologically manipulative product features” to keep younger users engaged, potentially at the expense of their mental health. These claims suggest a strategy that prioritizes engagement and profit over the well-being of vulnerable users.
Whistleblower Testimonies
Insights from whistleblowers Frances Haugen and Arturo Bejar have provided crucial details about internal Meta discussions of the platforms’ negative effects on young users. Haugen first warned in 2021 that Instagram was hurting teen mental health, testifying that the company knew about the harm its platforms were causing but chose to prioritize profits over user safety. Her testimony was supported by research finding a correlation between social media use and depression, anxiety, and other mental health issues in teens, and her revelations about the gap between Meta’s public assurances and its internal discussions of children’s safety sparked a wave of lawsuits against the company.
In November 2023, Congress heard from a second Meta whistleblower, Arturo Bejar, who shared evidence that the company was aware its photo-sharing app could harm teens. His testimony underscored Meta’s awareness of risks to teens’ safety and its alleged failure to take adequate action, intensifying the pressure on Meta to improve its safety measures and comply with government regulations.
Larger Federal Lawsuit
The current privacy claims are part of a larger federal lawsuit brought by 33 US states, which allege that the company has repeatedly misled the public about the substantial dangers of its platforms. The states contend that these practices violate the Children’s Online Privacy Protection Act (COPPA), which requires online services to provide notice and obtain verifiable parental consent before collecting personal data from children under 13.
The states further allege that Meta has contributed to a mental health crisis among youth by failing to address the harm teenagers experience on its platforms. For example, a recent lawsuit filed against Meta alleges that Instagram’s algorithm promotes images of self-harm and suicide to vulnerable teenagers, and that this has led to an increase in self-harm and suicide attempts among teens who use the platform. Another notable lawsuit, filed in November 2023 by a group of parents and child safety advocates, accuses Meta of ignoring warnings and failing to take action to protect children from online predators. That suit alleges that Meta’s CEO, Mark Zuckerberg, disregarded the advice of top executives who called for bolder action and more resources to protect users, especially kids and teens.
In response to the criticism, and on the question of barring younger users from its services, Meta has argued that age verification is a “complex industry challenge.” Instead, the company favors shifting the burden of policing underage usage to app stores and parents, and supports federal legislation that would require app stores to obtain parental approval whenever users under 16 download apps. Such a proposal would place app stores like Apple’s App Store and Google Play under greater scrutiny and make it harder for children to access apps that are not age-appropriate.
The Imperative for Industry-Wide Reevaluation
The glaring discrepancies between Meta’s public assurances and its internal actions underscore the urgent need for a fundamental reassessment of corporate ethics, transparency, and regulatory oversight across the tech industry. Ensuring the safety of vulnerable users, particularly minors, must be a top priority, demanding immediate action and robust safeguards to prevent further harm in the digital landscape.
Overall, Meta’s failure to protect teenagers from harmful content has prompted public outcry and mounting legal action. The impact of social media on teenage mental health is a growing concern, and it is clear that more must be done to protect vulnerable young users on these platforms.