The Santa Clara Principles During COVID-19: More Important Than Ever
Blog Post
Electronic Frontier Foundation
May 11, 2020
This blog post was co-authored with Jillian C. York of the Electronic Frontier Foundation. It originally appeared on the EFF blog.
As COVID-19 has spread around the world and online platforms have scrambled to adjust their operations and workforces to a new reality, company commitments to the Santa Clara Principles on Transparency and Accountability in Content Moderation have fallen by the wayside. The Principles—drafted in 2018 by a group of organizations, advocates, and academic experts who support the right to free expression online—outline minimum levels of transparency and accountability that tech platforms should provide around their moderation of user-generated content.
However, the values and standards outlined in the Principles are particularly important during crises such as the ongoing pandemic, because they hold companies accountable for their policies and practices. The spread of the virus around the globe has been unprecedented, and companies have had to respond quickly. But that’s an explanation, not an excuse: going forward, platforms must do more to safeguard digital rights and user expression.
As a result of the pandemic, many technology companies have changed the way they moderate content. For example, Reddit—which relies heavily on community rather than commercial moderation—has brought in experts to weigh in on COVID-19-related subreddits. Facebook announced in late April that it was working with local governments in the US to ban events that violated social distancing orders. And Twitter has issued a detailed guide to its policy and process changes during this time.
Several companies—including YouTube, Twitter, and Facebook—have turned to automated tools to support their content moderation efforts. According to Facebook and other companies, the increased use of automation stems from the fact that most content moderators cannot review content from home, due to safety, privacy, legal, and mental health concerns. Recognizing that automation may not always yield accurate results, some companies have put new safeguards in place. For example, Twitter has committed to not permanently suspending accounts during this time, while Facebook is giving users who are dissatisfied with moderation decisions a way to register their disagreement.
Nevertheless, companies have warned users to expect more moderation errors, given that automated tools often cannot accurately detect and remove speech, particularly content that requires contextual analysis.
The Santa Clara Principles remain a guiding framework for how companies can provide greater transparency and accountability around their practices, even, and especially, during this opaque and challenging time.
In particular:
- Companies should collect and preserve data related to content moderation during the pandemic. These numbers should inform a COVID-19-specific transparency report outlining the scope and scale of company content moderation efforts during the pandemic. These data points should also inform future discussions with policymakers, civil society, and academics on the limits of automated content moderation tools.
- Further, companies should provide meaningful and adequate notice to users who have had their content or accounts flagged or removed. They should also give users notice of any changes to content or moderation policies during this time.
- Finally, companies should continue to offer their users robust appeals processes in as timely a fashion as possible. Given internet platforms’ growing reliance on automated content moderation tools during this time, such mechanisms for remedy are more important than ever.