To combat the potential misuse of artificial intelligence (AI) technologies, Microsoft and Google have announced commitments to implement new child safety measures in their generative AI services. The commitments were made in collaboration with Thorn, a non-profit organization dedicated to combating child sexual abuse, and All Tech Is Human, an initiative aimed at fostering a responsible tech ecosystem.
The announcements underscore the tech industry's dedication to ethical innovation and the protection of vulnerable children from sexual exploitation and abuse. Amazon, Anthropic, Civitai, Meta, Metaphysic, Mistral AI, OpenAI, and Stability AI have also committed to the initiative.
Microsoft's Proactive Measures
Microsoft has outlined its strategy for preventing the exploitation of children through its AI technologies. The company says it will develop generative AI models that are not trained on datasets containing child sexual abuse material, and commits to actively safeguarding those models from being exposed to, or generating, such content after release. In its blog post, Microsoft describes this commitment as a critical step forward in preventing the misuse of AI technologies to create or disseminate AI-generated child sexual abuse material (AIG-CSAM) and other forms of sexual harm against children.
Google's Dedicated Efforts
Similarly, Google has detailed its approach to enhancing child safety within its AI services. The company notes that a dedicated team works to identify content indicating that a child may be in danger. Google employs a multifaceted strategy to detect and remove content related to child sexual abuse and exploitation, combining hash-matching technology, artificial intelligence classifiers, and human review. Google has also expressed support for several bills in the US Congress aimed at strengthening the protection of children from abuse and exploitation.
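For readers unfamiliar with the technique, the sketch below illustrates the basic idea behind hash matching: compute a digest of a file and compare it against a list of digests of previously identified abusive material. The set name, function names, and use of plain SHA-256 here are illustrative assumptions, not Google's implementation; deployed systems typically rely on perceptual hashes and vetted hash databases rather than a local set of exact cryptographic hashes.

```python
import hashlib

# Hypothetical in-memory hash list; real systems query vetted hash
# databases maintained by child-safety organizations, not a local set.
KNOWN_HASHES: set[str] = set()

def file_digest(path: str, chunk_size: int = 8192) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_content(path: str) -> bool:
    """True if the file's digest appears in the known-hash list,
    flagging it for further review."""
    return file_digest(path) in KNOWN_HASHES
```

An exact hash such as SHA-256 only catches byte-identical copies, which is why the strategy Google describes pairs hash matching with classifiers and human review to surface novel or altered content.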