Evaluating Facebook’s New Oversight Board for Content Moderation

The board’s charter provides many of the building blocks for a better process, but how it works in practice remains to be seen.

This was originally published by Just Security.

Facebook recently released more details about its Oversight Board, publishing a governing charter in September that describes how the board will review the company’s decisions on whether to remove certain posts. The charter is a promising start, and Facebook seems to have seriously considered input from its global feedback campaign. As a founding document, it is necessarily an outline that will only be filled in once the board is appointed and begins issuing decisions. The board’s legitimacy and effectiveness will depend on many factors that are as yet unknown, including its membership, the quality of its decisions, the extent to which it is insulated from Facebook’s control, and the company’s response to board decisions and recommendations.

Facebook’s content moderation decisions have far-reaching effects on public discourse worldwide. Over 1.6 billion people share, read, and post content on Facebook each day. Deciding which posts stay up or come down is a monumental task, done through a combination of human reviewers and algorithms. Facebook has published multiple iterations of the Community Standards it uses to decide on takedowns, but the company’s implementation of these policies has long drawn complaints from across the political spectrum, from President Donald Trump to civil rights groups.

At the same time, the company faces mounting pressure to do more to moderate speech. Governments have long asked social media platforms to take down terrorist or extremist speech, which remains poorly defined. More recently, calls for removing hate speech, fake accounts, and misleading posts have proliferated, often from civil rights groups concerned about the impact of these posts on minority populations.

Uncomfortable with its role as the speech police, Facebook announced in November 2018 that it would create an independent board to review its content moderation decisions. After releasing a draft charter in January 2019, the company launched a widely publicized, six-month global input campaign — reportedly soliciting feedback from over 2,000 people through surveys and workshops held across 88 countries. Much of this feedback found its way into the charter for the Oversight Board, which provides a speech-protective framework for the board. But the framework leaves some key issues unsettled, and the board’s legitimacy will depend on how it — and Facebook — handles these matters.

Independence

An oversight board set up by a company naturally faces serious questions about its independence from its creator. Civil society groups highlighted a cluster of independence issues, such as the process for appointments to and removals from the board, the board’s freedom to select cases for review, and Facebook’s ability to use the power of the purse to influence the board.

The appointment of board members was a particularly thorny issue, with many observers questioning whether a board appointed by Facebook, or subject to removal by the company, could be regarded as independent. The compromise struck in the charter is that Facebook will appoint a limited number of co-chairs who, together with the company, will select the initial cohort of members. Thereafter, the board will select new members, with both Facebook and members of the public permitted to propose candidates. While this process includes a big role for the company in the initial stages (likely driven by practical concerns), it allows the board to become more independent from Facebook over time. Critically, the charter explicitly provides that members cannot be removed for content decisions they have made, which will insulate them from retaliation and should encourage independent decision-making.

Another important provision of the charter gives the board discretion to choose the cases it reviews. Both users and Facebook can nominate cases for review. This ensures that the board’s docket does not solely reflect the company’s priorities, but also has room for individuals seeking to challenge the company’s decisions.

To insulate the board from financial pressure from the company, the charter obligates Facebook to fund an independent trust that will maintain the board’s operating budget and compensate its members. But the trust itself — which has a role in the appointment and removal of members and in setting the board’s budget — is under Facebook’s control. This creates an opportunity for the company to influence the board’s operations, which may be mitigated by the high-profile nature of the venture, and the prospect of a public backlash if Facebook is seen as undermining the board’s independence.

Diversity

Given the extraordinary range of Facebook’s users, it is no surprise that a large majority of respondents in Facebook’s global feedback campaign highlighted the need for variety in the board’s cultural and linguistic knowledge, ideological and political views, and race or ethnicity. But, as Facebook noted, it is impractical to expect that a board of up to 40 people would exhaustively represent the various languages and cultural norms of Facebook’s approximately 1.6 billion daily active users.

The charter seeks to square this circle in two ways: by providing that the board will be composed of a “diverse set of members” possessing “a broad range of knowledge, competencies, diversity, and expertise,” and by mandating that the panel reviewing a case include at least one member from the region where the case originated, so that regional perspectives are brought to bear on individual cases.

Scope of Board’s Authority

The board’s stated function is quite narrow: to review individual content moderation decisions made by the company. Its conclusion on whether or not content stays up or comes down “will be binding and Facebook will implement it promptly, unless implementation of a resolution could violate the law.” It seems that Facebook envisions that the board simply will not hear cases where content has been removed due to local laws because such takedowns are not appealable. But this is not clear from the charter itself, which simply says that where a user disagrees with a Facebook content decision and has exhausted appeals, they can request that the board review the matter. The charter does not explicitly state that only content for which Facebook has an appeals process can be reviewed by the board. While the company has legal obligations in the countries where it operates, there is a strong case to be made that where these laws violate international human rights norms (e.g., by criminalizing homosexuality and feminism), the board should be able to weigh in. Facebook, in the implementation phase, could then explain that local laws prevent it from complying with the board’s decision and how it has complied in the most narrow way feasible given legal constraints (e.g., by removing the content only in the country where it is illegal).

Of course, the individual cases reviewed by the board will involve but a minuscule fraction of the company’s decisions. The charter expands the board’s reach beyond this in two ways.

First, “where Facebook identifies that identical content with parallel context remains on Facebook, it will take action by analyzing whether it is technically and operationally feasible to apply the board’s decision to that content as well.” Given the difficulties Facebook has faced in consistently implementing its takedown criteria, this limited commitment is unsurprising. On the other hand, the whole point of the board would be seriously undermined if the company cannot implement decisions more broadly than for individual cases.

Second, civil society was unified in pushing the company to empower the board to have input on rules and policies in addition to deciding cases. The charter provides that the Oversight Board has the authority to interpret Facebook’s Community Standards and to provide policy guidance on the company’s content policies, either as part of reviewing a case or upon Facebook’s request. Facebook is not bound by the board’s policy recommendations but has committed itself to “analyzing the operational procedures required to implement the guidance, considering it in the formal policy development process of Facebook, and transparently communicating about actions taken as a result.” While this commitment may not seem like much at first glance, the company will face immense public pressure to respond to the board’s recommendations and will not be able to simply bat them away.

Applicable “Law”

The Oversight Board is required to judge cases based on Facebook’s Community Standards, as well as a set of values identified by the company — voice, authenticity, safety, privacy, and dignity — which are meant to inform the interpretation of these standards. The charter also includes a reference to international human rights principles, a key priority for civil society groups and the U.N. Special Rapporteur for Freedom of Expression.

In keeping with the company’s emphasis on “voice,” which it describes in its Community Standards as “people’s ability to share diverse views, experiences, ideas and information,” the charter provides that “[w]hen reviewing [Facebook’s content moderation] decisions, the board will pay particular attention to the impact of removing content in light of human rights norms protecting free expression.”

Given the centrality of Facebook to discourse of all kinds, but especially to political expression and organizing, the centering of free speech serves an important signaling function. The difficulty, which the board will soon enough confront, is what to do when free speech comes into conflict with other norms enumerated in Facebook’s own values (and other international human rights principles not mentioned in the charter). Some of this balancing has already been undertaken in the company’s Community Standards, for example by banning terrorism content and hate speech. But implementing these rules has been challenging, with Facebook taking heat for its decisions from many quarters. The board will also have to make hard calls. While Facebook’s Community Standards and values and international human rights law provide guideposts, it is unclear exactly what direction takedown jurisprudence will take.

Transparency

Calls for transparency have dogged social media companies for years, and the Oversight Board — which is meant to serve as an accountability mechanism that offsets opaque content moderation practices — clearly needs to be as forthcoming as possible. The charter includes some important transparency provisions. The names of the Oversight Board’s members will be made public, but to protect their security the names of the panelists reviewing specific cases will be kept confidential. Each decision made by the board will be published and archived online. Dissenting opinions may be published along with the majority decision, although it is not clear who decides on their inclusion. The procedures for submitting a case to the board and the requirements for review (yet to be determined) will be publicly available. While the charter suggests that the board will release reports, it is not required to do so. Periodic accountings of the board’s operations, including the number of cases that are referred to it but not reviewed, and the nature of such cases, would provide a helpful overview of how the system is functioning.

What’s Next?

Facebook’s announcement of the board co-chairs and initial cohort of members will provide the first indication of whether the charter’s commitment to diversity and independence is honored in practice. Very little is known about the trust and the extent to which it will exercise control over the Oversight Board, although Facebook has promised to release relevant documents. Given the extent of the trust’s control over the board through appointments, removals, and funding, these documents will be closely scrutinized. The budget for the board’s operations is also unknown and will undoubtedly have an enormous impact on its ability to be effective, although it is not clear that it will be made public.

The board’s bylaws will also be critical. According to Facebook, they will “provide greater operational detail on the board’s institutional independence and rules of procedure” and will “include accountability mechanisms, such as a code of conduct and board member disqualifications.” The charter does not specify how the bylaws will be decided, stating only that: “[t]he board’s operational procedures will be outlined in its bylaws. The charter and bylaws will act as companion documents.” Facebook, however, has indicated that it is “crafting” the bylaws and that “ultimately the board alone will have the ability to change them.” This suggests that the company intends to put in place an initial set of bylaws that the board could change at a later point in time. Given the centrality of the bylaws, an independent Oversight Board can be expected to demand substantial input on these procedures.

The Oversight Board is an ambitious project and one that has the potential to significantly improve content moderation. The charter provides a solid foundation for its operation, but the building blocks that will be put in place over the next months and years will determine whether the board serves its goal of bringing much-needed accountability and transparency to Facebook’s content moderation policies and practices. And many of the most difficult questions facing Facebook, such as political advertising, efforts to game the platform, and downranking, will be outside the board’s purview, sitting solely with the company. The board is one part of the ecosystem of decision-making by social media platforms on the fate of speech and speakers in the digital public square, and the whole system must become more transparent and accountable.