On March 13, 2025, Utah became the first state to pass a law requiring app stores to verify the ages of users and obtain parental consent for minors accessing certain apps.
The App Store Accountability Act aims to enhance child safety online but has already sparked controversy. Meta has been a vocal supporter of such legislation, positioning itself as an advocate for child protection.
However, Google has criticized Meta for attempting to shift the responsibility for child safety onto app stores rather than taking action within its own platforms. The ongoing clash between these tech giants sheds light on the broader debate about accountability in the digital age.
Meta’s Strategy: Offloading Responsibility onto App Stores
Meta’s support for the Utah law reflects the company’s broader strategy of urging third parties, specifically app stores, to handle age verification and child-safety compliance.
Meta argues that app stores are better positioned to implement centralized systems for age verification, seeing them as the most efficient gatekeepers for child safety measures.
The company, alongside other tech companies like Snap and X, has called for app stores to bear the responsibility of preventing minors from accessing harmful content, positioning the law as a vital step toward improved protection for children.
However, this move has been widely criticized. Google, in particular, has strongly opposed the idea, claiming that Meta’s push for app store responsibility is a deliberate attempt to offload its own duties.
Google has argued that this approach creates unnecessary privacy risks, given the vast amounts of sensitive data app stores would have to process to enforce age restrictions.
As its blog post makes clear, Google holds that responsibility should be shared between app stores and developers rather than concentrated in app stores alone, keeping data exposure to a minimum and reducing the likelihood of privacy breaches.
“There are a variety of fast-moving legislative proposals being pushed by Meta and other companies in an effort to offload their own responsibilities to keep kids safe to app stores. These proposals introduce new risks to the privacy of minors, without actually addressing the harms that are inspiring lawmakers to act. Google is proposing a more comprehensive legislative framework that shares responsibility between app stores and developers, and protects children’s privacy and the decision rights of parents.”
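In practical terms, the dispute is about what data would cross the boundary between an app store and a developer. The short Python sketch below is purely hypothetical (no company has published such a schema); it illustrates the data-minimization idea behind a shared-responsibility framework, in which a store-side check passes apps only a coarse age bracket and a consent flag rather than a birth date or identity document.

```python
# Hypothetical sketch of a data-minimizing "age signal". This is NOT a real
# Google or Apple API; it only illustrates the privacy argument: the store
# verifies age once, then shares a coarse signal instead of identity data.
from dataclasses import dataclass
from enum import Enum


class AgeBracket(Enum):
    """Coarse brackets a store could expose instead of a birth date."""
    UNDER_13 = "under_13"
    TEEN_13_17 = "13_17"
    ADULT_18_PLUS = "18_plus"


@dataclass(frozen=True)
class AgeSignal:
    """Minimal payload a developer would receive after a store-side check.

    Deliberately omits name, birth date, and ID-document details, so a
    breach on the developer's side exposes no verifiable identity data.
    """
    bracket: AgeBracket
    parental_consent: bool  # whether a parent approved access for a minor


def can_access(signal: AgeSignal, app_requires_consent: bool) -> bool:
    """Gate access to an age-restricted app using only the coarse signal."""
    if signal.bracket is AgeBracket.ADULT_18_PLUS:
        return True
    return signal.parental_consent or not app_requires_consent


if __name__ == "__main__":
    teen = AgeSignal(bracket=AgeBracket.TEEN_13_17, parental_consent=True)
    print(can_access(teen, app_requires_consent=True))  # True: consent given
```

The design choice at stake is where the sensitive records live: a store-side check still has to verify age somewhere, but passing only the coarse signal limits how far that data spreads.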
The growing concern over Meta’s motivations reveals a key aspect of the debate: whether platforms should bear responsibility for ensuring child safety, or whether this should fall to third parties like app stores. Google’s criticism underscores its belief that Meta’s advocacy is an attempt to deflect attention from its own record on child safety, which has come under increasing scrutiny in recent years.
Google’s Opposition: Privacy Risks and Accountability
Google’s position on Utah’s new law highlights its concerns about the privacy implications of such regulatory measures. The company fears that shifting responsibility to app stores will lead to increased data collection on minors, raising the potential for privacy violations.
Google has long maintained that age verification should be handled directly by developers, who can ensure that data collection is kept to a minimum.
The company’s focus on privacy has been central to its ongoing opposition to similar laws in other states, including Louisiana, where it argued that app stores should not be required to collect sensitive data.
Google’s stance is not just about privacy; it’s also a critique of Meta’s attempt to offload responsibility. By advocating for app stores to take on the role of enforcing child safety laws, Meta is seen as trying to avoid the burden of regulation on its own platforms.
State-Level Regulation: Utah Sets a Precedent
The Utah law is part of a larger wave of state-level initiatives that aim to increase child safety by regulating app stores and digital platforms.
After the federal Kids Online Safety Act (KOSA) stalled in Congress, many states introduced their own measures, with Utah the first to pass such a law.
While advocates argue that state-level regulation is necessary to protect children from online harm, others, including Google, worry that these measures will produce an inconsistent patchwork of conflicting regulations for tech companies to navigate.
By placing both age verification and parental consent squarely on app stores, the law has prompted many to ask whether state-level initiatives of this kind will effectively address child safety or simply shift responsibility without offering a comprehensive solution.
As states like California continue to propose similar bills, the future of digital child safety regulation remains uncertain. As Winbuzzer reported, California is pushing forward with laws requiring social media platforms to display warning labels about mental health risks, aligning with the growing movement for tech accountability.
California and Federal Action: A Patchwork of Laws?
California’s push to regulate social media platforms further complicates the landscape. The state has introduced its own set of regulations aimed at addressing teen mental health, including the requirement for social media platforms to display warning labels about the potential risks associated with prolonged use.
While these measures are seen as important steps toward improving child safety, they raise concerns about conflicting regulations across states. As more states follow California’s lead, the question becomes whether federal legislation is needed to standardize child safety measures or whether states should continue to lead the way.
The ongoing debate over child safety and privacy will undoubtedly continue to shape the regulatory environment in the coming years. Utah’s law has set a precedent, but the challenges it faces—particularly the concerns raised by Google about privacy risks—illustrate the complex balance that lawmakers and tech companies must strike between protecting children and safeguarding privacy rights.