Facebook is tasking researchers with spotting deepfakes in its first-ever Deepfake Detection Challenge. The initiative is supported by Microsoft, Amazon Web Services (AWS), and a number of universities.

While the organizations originally announced the challenge back in September, Facebook has now released a dataset to support research on deepfake detection.

The Deepfake Detection Challenge involves $10 million in grants and rewards. With the grant money, researchers can develop systems that detect AI-generated deepfake videos. Facebook's dataset comprises more than 100,000 videos featuring paid actors, manipulated with state-of-the-art deepfake techniques.

While most people know deepfakes from entertaining clips on YouTube, the technology has also raised serious concerns. Authorities worry that deepfakes could be used to influence political elections.

“Ensuring that cutting-edge research can be used to detect deepfakes depends on large-scale, close-to-reality, useful, and freely available datasets. Since that resource didn’t exist, we’ve had to create it from scratch,” said Cristian Canton Ferrer, the Facebook AI research manager leading the project.

“The resulting data set consists of more than 100,000 videos featuring paid actors in realistic scenarios, with accompanying labels describing whether they were manipulated with AI.”  
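A dataset in the form Ferrer describes (video files plus labels saying whether each was manipulated with AI) would typically ship with a machine-readable label file. As a minimal sketch, assuming a JSON file that maps each video filename to a `REAL`/`FAKE` label (the field names and filenames here are illustrative assumptions, not Facebook's published schema):

```python
import json

# Hypothetical label file mapping each video to whether it was
# manipulated with AI. Filenames and field names are assumptions
# for illustration only.
metadata_json = """
{
  "video_0001.mp4": {"label": "FAKE"},
  "video_0002.mp4": {"label": "REAL"},
  "video_0003.mp4": {"label": "FAKE"}
}
"""

metadata = json.loads(metadata_json)

# Split filenames by label so a detector could be trained and evaluated
# against ground truth.
fakes = [name for name, info in metadata.items() if info["label"] == "FAKE"]
reals = [name for name, info in metadata.items() if info["label"] == "REAL"]

print(f"{len(fakes)} fake, {len(reals)} real")  # 2 fake, 1 real
```

In practice a detection entry would read the video frames for each listed file and output a probability that the clip is manipulated, scored against these labels.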

Challenge

Kaggle, a Google-owned data-science platform, is hosting the Deepfake Detection Challenge, including its leaderboard. Facebook says the challenge will run until the end of March 2020.

How the $10 million fund will be divided is unclear, although Kaggle shows a top prize of $500,000 and a second prize of $300,000. The remaining prizes are $100,000 for third place, followed by fourth- and fifth-place prizes of $60,000 and $40,000 respectively.
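A quick check of the figures above shows the five listed prizes account for $1 million, leaving the allocation of the remaining $9 million of the fund (presumably the research grants mentioned earlier) unspecified:

```python
# Prize breakdown as listed on Kaggle, per the article.
prizes = {
    "1st": 500_000,
    "2nd": 300_000,
    "3rd": 100_000,
    "4th": 60_000,
    "5th": 40_000,
}

total_prizes = sum(prizes.values())
print(f"${total_prizes:,}")  # $1,000,000

# The article's $10 million figure minus the listed prizes.
unaccounted = 10_000_000 - total_prizes
print(f"${unaccounted:,}")  # $9,000,000
```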

Facebook points out that its researchers used artificial intelligence to swap faces and alter voices relative to the original videos. Ferrer says the company is also aiming to create full-body swaps, although he admits that technology is not as mature as face swapping.