Tech providers and others are offering up some basic guidelines for online content moderation.
That comes as policymakers are increasingly calling for regulations on edge providers in the wake of various issues, from fake news and Russian election meddling to sex trafficking and privacy protection concerns.
New America's Open Technology Institute, whose funders include Google, Facebook, Amazon and Comcast, joined a coalition of other groups and academics to call for protecting free expression online and to outline some "minimum standards" for tech platform content moderation, which includes efforts to take down content or suspend accounts.
The so-called Santa Clara Principles boil down to three main requirements:
1. Publish the numbers of posts removed and accounts permanently or temporarily suspended for violating the platform's content guidelines;
2. Provide clear notice to all users about what types of content are prohibited, and clear notice to each affected user about the reason for the removal of their content or the suspension of their account; and
3. Enable users to engage in a meaningful and timely appeals process for any content removals or account suspensions.
“As internet companies are under increasing pressure to more aggressively police the content on their platforms, and especially as companies are increasingly relying on AI and other automated tools to make content decisions, the need for more transparency and accountability around their content takedowns has also increased," said Kevin Bankston, director of the Open Technology Institute. "The steps taken in recent weeks by companies like Facebook and Google to be more transparent about how much content they take down and why [are] a good start," he said in a statement, "but much more remains to be done. These companies are becoming the de facto arbiters of what content is allowed and not allowed on the internet, a dangerous power and an awesome responsibility that requires meaningful checks and balances. At the very least, as outlined in these new principles, users deserve to know exactly when, why, how, and how much of their content is taken down, and have an opportunity to appeal those decisions.”
Edge providers are likely hoping that self-regulation will satisfy Washington, though Facebook CEO Mark Zuckerberg has told the Hill he recognizes that carefully crafted legislation or regulation may be appropriate, and in the cards, for the edge.
Others backing the principles include the ACLU Foundation of Northern California, the Electronic Frontier Foundation and the Center for Democracy and Technology.
The principles were unveiled Monday (May 7) at the second annual Content Moderation at Scale Conference in Washington, which was organized by Santa Clara University.