
Google Debuts AI to Detect and Remove Child Sexual Abuse Material (CSAM)

A new artificial intelligence model from Google can find Child Sexual Abuse Material (CSAM) up to seven times faster than manual searches.


Artificial Intelligence (AI) is increasingly transforming the way we live. From Google to its rivals, AI is developing at a rapid pace. Google has announced a new AI project that will help organizations detect and remove child sexual abuse material (CSAM).

Google says its AI uses deep neural networks for intelligent image processing. This helps content reviewers sort through images more quickly by automatically flagging likely CSAM for priority review. If flagged content is found to be illegal, it can then be removed.
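At its core, this kind of triage workflow amounts to scoring images with a classifier and pushing the highest-risk items to the front of the human review queue. The sketch below is only an illustration of that idea, not Google's Content Safety API (which is restricted to vetted partners); the class names, score field, and threshold are hypothetical.

```python
# Hypothetical sketch of classifier-assisted triage for a review queue.
# The ReviewItem fields and prioritize_queue helper are assumptions for
# illustration only; they do not reflect Google's actual Content Safety API.
from dataclasses import dataclass
from typing import List


@dataclass
class ReviewItem:
    image_id: str
    score: float  # classifier confidence (0.0-1.0) that the image needs urgent review


def prioritize_queue(items: List[ReviewItem]) -> List[ReviewItem]:
    """Return the queue sorted so the highest-risk images are reviewed first."""
    return sorted(items, key=lambda item: item.score, reverse=True)


if __name__ == "__main__":
    queue = [
        ReviewItem("img-001", 0.12),
        ReviewItem("img-002", 0.97),
        ReviewItem("img-003", 0.58),
    ]
    for item in prioritize_queue(queue):
        print(item.image_id, item.score)
```

In practice, a human analyst still makes the final determination; the classifier's role is simply to order the workload so that the most likely matches are seen first.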

In its announcement, Google says it wants the AI model to be freely available. It will be provided to NGOs at no cost, and industry partners will be able to use it through the Content Safety API. Susie Hargreaves, CEO of the Internet Watch Foundation, said the group is eager to start using the technology.

“We, and in particular our expert analysts, are excited about the development of an artificial intelligence tool which could help our human experts review material to an even greater scale and keep up with offenders, by targeting imagery that hasn't previously been marked as illegal material. By sharing this new technology, the identification of images could be speeded up, which in turn could make the internet a safer place for both survivors and users.”

Faster Speeds

The tool makes finding CSAM online much easier. Google says reviewers will be able to find 700% more CSAM than they previously could in the same timeframe; in the time it takes to check one piece of content manually, the AI can process seven. Over image sets running into the thousands, this could be a powerful tool.

Google adds: “Identifying and fighting the spread of CSAM is an ongoing challenge, and governments, law enforcement, NGOs and industry all have a critically important role in protecting children from this horrific crime.”

Luke Jones
Luke has been writing about Microsoft and the wider tech industry for over 10 years. With a degree in creative and professional writing, Luke looks for the interesting spin when covering AI, Windows, Xbox, and more.
