A lawsuit against Meta Platforms Inc. has unsealed contentious internal communications suggesting that CEO Mark Zuckerberg overrode multiple proposals to introduce measures aimed at protecting the well-being of teenage Instagram users. Despite broad support among senior executives for initiatives to mitigate potential harm to teens' mental health, Zuckerberg's decisions kept certain features unchanged on the platform.
Rejected Safety Measures
At the center of the case is Zuckerberg's April 2020 decision to keep Instagram's “beauty filters,” which digitally alter users' appearances and have been criticized for contributing to unhealthy body-image perceptions among teens. Zuckerberg cited user demand for the filters and asserted a lack of data indicating harm, a stance that ran counter to the recommendations of prominent figures within the company.
Adam Mosseri, Head of Instagram, and other senior staff, including Margaret Gould Stewart, Vice President of Product Design, and Karina Newton, Instagram's policy chief, had all expressed support for disabling these filters. In the aftermath of Zuckerberg's decision, Stewart voiced her concerns to him directly, underscoring the risks of allowing the beauty filters to remain.
Meta, responding through spokesperson Andy Stone, defended its use of filters, stressing that Meta disallows those promoting cosmetic surgery, changes in skin color, or extreme weight loss, and that the platform offers a variety of tools to help teens and families manage their social media experiences.
Executives' Concerns Overlooked
The unsealed documents further reveal that Nick Clegg, President of Global Affairs, called for additional investment in well-being across the company as recently as August 2021; the request went unheeded, with Meta's CFO Susan Li citing resource constraints. This was not the first instance: in 2019, David Ginsberg, a Meta product executive, proposed enhancing well-being tools to address addiction, social comparison, and loneliness, but was similarly turned down by Li on behalf of Meta's leadership.
Arturo Bejar, a former Facebook engineering director, corroborated these accounts, highlighting the high burden of proof Zuckerberg requires before acting on research into potential harms. Bejar, who has become a whistleblower, appeared before U.S. lawmakers to provide testimony on these issues.
Advocates Demand Accountability
Tech advocacy groups have seized upon these revelations to criticize Zuckerberg's leadership and decision-making, positioning him as neglectful of user privacy and safety. Statements from the Tech Oversight Project and Design It For Us reflect a growing concern that even Meta's senior leadership faces challenges in influencing well-being policies at the company.
These disclosures form part of a broader set of allegations that Meta exploits the psychology of teen users to increase time spent on its platforms. A 2020 internal document mentioned in the complaint suggests Instagram appeals to teens' craving for novelty through dopamine-driven notifications seeking approval and acceptance. The release of these internal communications adds another layer of scrutiny to Meta's practices just as the company faces heightened examination over its social media impact on younger audiences.
Ongoing Multi-State Lawsuit
Meta is already facing legal action from 41 states that accuse it of exploiting young users with addictive features on its platforms. The states' attorneys general allege that Meta knew about the harm its platforms could cause to young people's mental health but nonetheless designed features that kept them hooked for long periods. The lawsuit, filed in federal court in California, is based on extensive investigations into Meta's practices and their impact on youth mental health.
What Meta Did Wrong
The lawsuit claims that Meta targeted young users as a “valuable, but untapped” market segment. The company allegedly ignored its own research showing the potential negative effects of its platforms on young minds and instead continued to launch features that encouraged prolonged engagement. Features such as the “like” button, constant push notifications, and the “infinite scroll” function are under scrutiny. The lawsuit argues that these features were designed to capture young users' attention by exploiting natural tendencies like the “fear of missing out” (FOMO). New York Attorney General Letitia James said, “Social media companies, including Meta, have contributed to a national youth mental health crisis.”
What the States Want
The lawsuit also accuses Meta of violating the Children's Online Privacy Protection Act (COPPA) by collecting data from users under 13 without parental consent, a claim that could carry serious legal and financial consequences for the company. The states want more than accountability: they are seeking reforms to make platforms like Facebook and Instagram safer for their youngest users.