YouTube and Facebook are said to already deploy an automated content-blocking system able to “block or rapidly take down Islamic State videos and other similar material.”
The two anonymous sources cited by Reuters gave no indication of how much human review would be involved for content pre-matched by the automated systems.
According to Joseph Menn and Dustin Volz of Reuters, “[This system] looks for ‘hashes,’ a type of unique digital fingerprint that internet companies automatically assign to specific videos, allowing all content with matching fingerprints to be removed rapidly.”
Also, “internet companies including Alphabet Inc’s YouTube, Twitter Inc, Facebook Inc and CloudFlare held a call to discuss options, including a content-blocking system put forward by the private Counter Extremism Project, according to one person on the call and three who were briefed on what was discussed.”
The call, which was led by Facebook’s head of global policy management, Monika Bickert, took place back in April, two insider sources have confirmed.
Background pressure from the Obama administration
It was already known that U.S. President Barack Obama and other U.S. and European leaders had pressured leading IT companies to discuss the matter in a call.
Among the options discussed was a content-blocking system proposed by the Counter Extremism Project, led by Frances Townsend, a Homeland Security adviser to former President George W. Bush, and Mark Wallace, deputy campaign manager for Bush’s 2004 campaign.
The system was initially designed to identify and remove video content protected by copyright.
It would prevent users from reposting content that has already been deemed unacceptable, but it would not automatically block videos that have not been seen before. Experts speculate that the system would check posted videos against a database of banned content, such as videos that incite violence or show beheadings.
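The matching workflow described above can be sketched in a few lines. This is a hypothetical illustration only: production systems reportedly use perceptual hashes that survive re-encoding and cropping, whereas this sketch uses a plain SHA-256 over the raw bytes, which only catches exact reposts. All names and sample data here are invented for the example.

```python
import hashlib


def fingerprint(data: bytes) -> str:
    """Compute an exact-match fingerprint (hash) of the content bytes.

    Real deployments would use a perceptual hash robust to re-encoding;
    SHA-256 is used here purely for illustration.
    """
    return hashlib.sha256(data).hexdigest()


# Database of fingerprints from previously removed content (invented data).
banned_hashes = {fingerprint(b"previously removed video bytes")}


def should_block(upload: bytes) -> bool:
    """Flag an upload whose fingerprint matches a previously banned item.

    Content never seen before produces an unknown fingerprint, so it is
    not blocked automatically -- matching the behavior described above.
    """
    return fingerprint(upload) in banned_hashes


print(should_block(b"previously removed video bytes"))  # exact repost: True
print(should_block(b"a brand-new video"))               # unseen content: False
```

The key property this illustrates is that hash matching is reactive: it can rapidly remove reposts of known material, but something must first flag a video by other means before its fingerprint enters the database.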
Some of the issues that the companies had to contend with included conflicts between corporate authority and the government, free speech, and terrorism.
None of the internet companies were keen on implementing the Counter Extremism Project’s system, and they were wary of third-party intervention in how their sites are monitored.
According to Seamus Hughes, George Washington University deputy director for the Program on Extremism,
“It’s a little bit different than copyright or child pornography, where things are very clearly illegal.”
Companies rely on subjective criteria to identify what they consider extremist, and those that do employ automation to sort such content will not publicly discuss their methods, for fear that terrorists could game their systems.
Regardless of the methods that these companies choose to use for censoring extremist content, this is still a step forward in the fight against terrorism.