Grieving families, including relatives who traveled from the UK, converged outside Meta’s Wanamaker Place office in Manhattan on Thursday, demanding that CEO Mark Zuckerberg address the severe online harms they say platforms like Instagram and Facebook facilitate, from lethal fentanyl sales traced to social media dealers to cyberbullying and sextortion.
Forty-five families, joined by high-profile attendees Prince Harry and Meghan, the Duke and Duchess of Sussex, held a vigil for children lost to these dangers. The protest, organized by advocacy groups Heat Initiative, ParentsTogether Action, and Design it For Us, accused Meta—a company reporting $164 billion in revenue for 2024—of prioritizing profit over user safety.
The event culminated in the delivery of an open letter demanding specific safety overhauls, backed by an online petition that had gathered 11,040 signatures toward its 12,800-signature goal.
💜"Parents, activists & survivors are gathering in front of Meta HQ to demand they take care of our children and put #KidsOverProfit"👇
— Brave Movement (@BeBraveGlobal) April 24, 2025
Holding photos of their children and signs reading “Meta profits, kids pay the price,” the parents shared harrowing stories. Perla Mendoza recounted her son’s death from fentanyl obtained through a dealer on Snapchat who, she said, also operated across Instagram and Facebook, and described frustrating platform delays in acting on user reports. She is one of many parents who have sued Snap over similar allegations.
Ellen Roome, whose son Jools Sweeney died in 2022 after a suspected online challenge, attended from the UK, representing ongoing parental struggles to access deceased children’s online data. The Duke and Duchess met with families, with Meghan Markle stating, “It is a universal truth that our children are in harm’s way by what’s happening online,” while Prince Harry told the BBC that “enough is not being done.” A memorial at the event displayed 50 smartphones showing images of children organizers said were harmed by social media.
Demands for Platform Accountability Amid Years of Warnings
The open letter presented to Meta lays out three primary demands, rooted in concerns amplified by past revelations. It calls for an end to the use of algorithmic feeds – systems personalizing content to maximize engagement – to promote harmful material (sexualizing content, hate speech, self-harm, eating disorders, drugs) to users under 18.
This demand follows years of concern, heightened by the 2021 “Facebook Files” showing the company knew internally about Instagram’s negative mental health impacts on teen girls, and recent testimony from whistleblower Sarah Wynn-Williams alleging Meta targeted ads based on teens’ negative emotions. The petition text cites Meta’s own research indicating 1 in 8 children reported unwanted sexual advances weekly on Instagram, and 1 in 4 faced discrimination.
Secondly, the letter insists on proactive measures to stop predators, drug sellers, and sextortionists from targeting minors, including purging known bad actors and maintaining effective communication barriers between unknown adults and young users’ DMs.
Organizers specifically alleged Meta previously refused to extend certain safety features, like blocking DMs from strangers, to older teens despite known risks. Thirdly, they seek transparent, rapid responses to user reports about dangerous content or interactions, including clear communication on outcomes and swift removal of harmful material.
Heat Initiative CEO Sarah Gardner told TechCrunch families feel consistently “ignored by the tech companies,” arguing Meta’s existing safety features are insufficient. She also viewed Meta’s shift toward community notes for content moderation as “letting go of more responsibility, not leaning in.”
Meta’s Response and Ongoing Measures
Meta acknowledged the protestors’ concerns. Spokesperson Sophie Vogel told TechCrunch about initiatives like “Teen Accounts” on Instagram, Facebook, and Messenger, which apply restrictive defaults limiting contact and content visibility, features Meta claims 94% of parents find helpful.
The company’s public response to the protest, reported by the BBC, included a call for wider action: “We believe teens deserve consistent protections across all the different apps they use — not just our platforms.” This stance echoes previous industry friction, such as when Google criticized Meta in March for advocating laws shifting age verification responsibility onto app stores.
Meta is also deploying automated detection internally. Just days before the vigil, it began testing a new AI system on Instagram in the US to proactively identify potentially underage users based on activity signals (Meta states it does not use facial analysis for this purpose) and automatically apply “Teen Account” restrictions, with an appeal process. This AI approach is more assertive than previous age estimation efforts and complements other verification methods, such as its partnership with Yoti for video selfie verification.
Legislative Battles and Broader Context
The protest highlights the stalled progress on federal legislation like the Kids Online Safety Act (KOSA), which Meta actively lobbied against before it failed at the end of 2024 despite strong bipartisan Senate support. The bill’s future remains uncertain in the current Congress. State-level actions, however, are moving forward.
California enacted SB 976 in September 2024 (effective 2027) banning algorithmic feeds for minors without consent. Lawmakers there also introduced legislation in December 2024 for mandatory weekly mental health warning labels, supported by parents like Victoria Hinks who described social media’s pull as “dark rabbit holes.”
These state efforts mirror US Surgeon General Dr. Vivek Murthy’s June 2024 call for federal warning labels, where he noted “Adolescents who spend more than three hours a day on social media face double the risk of anxiety and depression symptoms,” and “nearly half of adolescents say social media makes them feel worse about their bodies.”
While industry groups like the Chamber of Progress argue against such regulations, calling warning labels “like a broken fire alarm going off with no evidence of smoke,” advocates like Common Sense Media founder James P. Steyer support California’s actions, stating, “When it comes to protecting kids from the risks of social media, California has always been a leader…”
Regulatory scrutiny continues internationally, with Meta facing formal EU proceedings under the Digital Services Act regarding child safety. Although a US judge ruled in April 2024 that Mark Zuckerberg could not be held personally liable in child addiction lawsuits, the cases against Meta Platforms persist.