Facebook’s Oversight Board Is Not as Powerful as We Think. But It Can Push the Company’s Policies in the Right Direction.

Blog Post
Jan. 15, 2021

Last month, Facebook’s independent Oversight Board revealed the first cases that it will be reviewing. The entity was first conceived over two years ago, when Facebook CEO Mark Zuckerberg announced that the company would establish an independent body to review a set of content moderation cases every year, floating the idea of “a structure almost like a Supreme Court.” Ultimately, Facebook created what it calls a Content Oversight Board. These framings—as either a Supreme Court or as an Oversight Board—incorrectly imply that Facebook’s new entity has an immense amount of power over the company’s operations. As the Board begins reviewing cases, it is important to lower our expectations around the scope and scale of its powers, and instead consider what the Board can actually accomplish under its narrow mandate.

Calling Facebook’s new Board a “Supreme Court” depicts the Board as the final arbiter of the company’s content moderation rules, or “Community Standards.” As the body is currently structured, this is impossible. The Board is tasked with reviewing, and issuing binding decisions on, specific cases that have been surfaced by users and by Facebook itself, and that Facebook has deemed highly important. However, the Board’s role will be limited to deciding these particular cases, much like an intermediate appellate court. Whereas the U.S. Supreme Court may strike down congressional statutes that conflict with the U.S. Constitution, Facebook’s Board cannot demand that Facebook remove or amend a rule, even if the Board concludes that the rule conflicts with international human rights law or otherwise fails to protect users’ human rights. Thus, while the first cases announced by the Board raise questions under Facebook’s policies on hate speech, violence and incitement, and nudity, the Board is only empowered to assess whether Facebook properly applied those policies as written. In addition, although the Board can offer policy recommendations to Facebook, either proactively or in response to a policy advisory request from the company, Facebook is not bound to accept or implement that guidance.

The name Content Oversight Board also suggests more power and influence than the new entity possesses. Oversight bodies typically have the authority to identify areas of concern, and then to investigate and issue findings. But Facebook’s Board members have explained that they do not have jurisdiction to review the company’s overall compliance with its rules, as many oversight bodies do. Further, some oversight bodies, such as the Privacy and Civil Liberties Oversight Board where one of us previously served, have the authority to evaluate whether an entity’s governing rules are appropriate from a policy perspective. Facebook’s Board, by contrast, lacks the power to examine how Facebook develops its content policies and to evaluate whether those policies appropriately balance all relevant considerations. If it were actually empowered to act like an oversight body, the Board could directly address criticisms from lawmakers, civil society groups, and users regarding myriad Facebook policies, ranging from its lack of transparency around how it moderates categories of content such as hate speech, to its troubling practices for collecting and managing user data, to Facebook leadership’s decision last fall not to hold politicians accountable when they spread misinformation. The limited scope of the Board’s authority means that Facebook’s leadership will continue to operate unchecked and without true oversight.

Additionally, it is important to recognize that the Board will review only a handful of cases every year (the first group consists of just six), and although Facebook can compel the Board to review certain cases on an expedited basis, most cases will take the Board weeks, if not months, to decide. As representatives from the Content Oversight Board have stated, the Board is a “deliberative body and not a rapid response mechanism.” The lack of a rapid response mechanism has sparked frustration among Facebook critics, who in September established an alternative, unofficial oversight board to publicly analyze and critique Facebook’s content moderation policies and practices.

However, as the Board reviews its first set of cases, it is worth considering what impact it may actually have on Facebook’s content moderation operations. Since Facebook announced the creation of the Board, the company has shared more information about its structure and governance mechanisms. If the Board operates effectively, it could have a limited but positive influence on certain components of the company’s content moderation efforts.

Most notably, the Board provides users with an opportunity to appeal decisions to an independent entity. Four of the first six cases are based on user referrals. Although it is not yet clear how independent the Board will be in practice, the Board’s Charter, Bylaws, and governance structure include measures designed to foster independence, and having an additional layer of appeal can be beneficial. Facebook has also stated that the Board’s decisions will establish precedent, to be followed in subsequent cases, for how Facebook’s Community Standards should apply to particular fact patterns.

Another area in which the new Board could have a positive impact on Facebook’s content moderation practices is in addressing criticism that the company has failed to account for cultural, linguistic, and regional contexts when moderating content, whether through human reviewers or automated tools. The first group of cases relates to several different countries, including Azerbaijan, Brazil, China, France, and the United States. While the Board is unlikely to include individuals who represent all of the communities and perspectives reflected in Facebook’s user base, the company has stated that it invested significant effort in recruiting a broad set of Board members. These members will also have access to geographic and subject matter experts when making decisions. If used well, these resources could enable the Board to make more informed decisions on sensitive cases than Facebook's regular moderation staff can.

Finally, if Facebook and the Board prioritize transparency from the outset, the Content Oversight Board could provide unique insight into, and accountability around, the company’s procedures. According to its Bylaws, the Board will publish an annual report that will, at a minimum, include data points such as the number and type of cases the Board reviewed, a summary of the Board’s decisions and policy advisory statements, and an analysis of how the Board’s decisions considered international human rights. The Bylaws also state that the Board will make each of its decisions publicly available and archive them in a database. Facebook has also committed that, within 30 days of receiving a policy recommendation from the Board, it will publish a statement explaining whether or not it will adopt the recommendation. When done well, transparency reports are valuable tools. If implemented consistently, these mechanisms could provide insight into Facebook’s internal policymaking process and help hold the company accountable.

As numerous critics and organizations have outlined, whether the Facebook Oversight Board succeeds will depend largely on the independence, legitimacy, operational structure, and transparency of both the Board and Facebook. If the Board does have a positive impact, it could offer a new framework for appeals in the content moderation space, and its mandate could potentially be expanded in the future to include other forms of content, such as advertising, and other moderation procedures, such as algorithmic curation and promotion. However, as the Board begins its operations, we must keep in mind that this entity will not radically transform how Facebook operates as a whole, or even how it moderates content. Rather, it has the potential to push the company in the right direction, one limited step at a time.
