Social Websites Create Terrorism Images Database

Facebook, Microsoft, Twitter and YouTube have teamed up to create a database of the “hashes”—unique digital “fingerprints”—of violent videos or other images used for terrorist recruitment online that they have removed from their respective services.
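
A "hash" is a fixed-length fingerprint computed from a file's contents, so the same video or image produces the same value wherever it is checked. The companies have not said which hashing scheme the database uses (industry content-matching systems often rely on perceptual hashes such as Microsoft's PhotoDNA rather than cryptographic ones), so the sketch below is illustrative only, with a hypothetical file name.

```python
# Minimal sketch of computing a "hash" fingerprint of a media file.
# SHA-256 is used purely for illustration; the consortium has not disclosed
# its actual hashing scheme, and the file name below is hypothetical.
import hashlib

def fingerprint(path: str) -> str:
    """Return a hex digest that uniquely identifies the file's bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

print(fingerprint("removed_video.mp4"))  # hypothetical file name
```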

They will all be able to cross-check what the others have removed, so each can be on the lookout for that content and, if it chooses, remove it from its own site as well, a process akin to the notice-and-takedown regime for identifying pirated content online.

It is a voluntary effort, the social media sites said in a joint blog post.

"By sharing this information with each other, we may use the shared hashes to help identify potential terrorist content on our respective hosted consumer platforms," the companies said. They said it applies to the "most extreme and egregious terrorist images and videos," ones that would likely violate all of their content policies anyway.

They said no personally identifiable information will be shared, and content that matches the database will not be removed automatically; each company will apply its own policies and its own definition of terrorist content. There will be an appeals process for removal decisions, and each company will retain its own standards for reviewing, and deciding whether to comply with, any government requests for information.
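
As a rough sketch of that workflow, the snippet below (building on the fingerprint() example above) checks an upload against the shared hashes and merely queues a match for human review. All names are hypothetical assumptions for illustration and do not reflect any company's actual system.

```python
# Hypothetical sketch of consulting the shared hash database on upload.
# Per the article, a match is only a signal for review under each company's
# own policies; nothing is removed automatically.

# Hashes contributed by the participating companies (contents hypothetical).
shared_hashes: set[str] = set()

def handle_upload(path: str, review_queue: list) -> None:
    h = fingerprint(path)  # reuses the fingerprint() sketch above
    if h in shared_hashes:
        # Flag for human review; removal decisions remain with each company
        # and are subject to an appeals process.
        review_queue.append((path, h))
```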

"Throughout this collaboration, we are committed to protecting our users' privacy and their ability to express themselves freely and safely on our platforms," the companies said.

But not everyone is convinced that can be achieved.

Social media sites have been under pressure from Washington to crack down on such posts, with both Hillary Clinton and Donald Trump having voiced concerns about online recruiting.

The Center for Democracy & Technology says it is "deeply concerned" that the project will set a precedent for cross-site censorship of speech and give governments a target for suppressing it.

It argues that the companies should rethink the plan.

"The chance that this database becomes anything other than a repository of material prohibited across all participating services seems razor thin," it says. "Proposals such as this one that focus on smoothing the way for coordinated censorship create substantial dangers for the future of the information society."

John Eggerton

Contributing editor John Eggerton has been an editor and/or writer on media regulation, legislation and policy for over four decades, including covering the FCC, FTC, Congress, the major media trade associations, and the federal courts. In addition to Multichannel News and Broadcasting + Cable, his work has appeared in Radio World, TV Technology, TV Fax, This Week in Consumer Electronics, Variety and the Encyclopedia Britannica.