More than two years after its inception, the Oversight Board says its guidance has helped make Meta’s rules more transparent for users, though it emphasizes that the company still needs to step up its efforts in several significant areas. The board, a group of roughly two dozen experts in human rights and free speech, released its annual report outlining its activities and interactions with Meta over the past year.
In contrast to the previous year’s report, which sharply criticized Meta’s lack of transparency, this year’s report emphasizes the impact of the board’s recommendations on the company. “In 2022, it was a positive development to see Meta instituting systemic alterations to its rules and their enforcement for the first time, including user notifications and regulations on dangerous organizations,” the board said in a statement.
The report also highlights areas where the board believes Meta can do better. Nearly two-thirds of the cases chosen for the Oversight Board’s review had their initial moderation decisions reversed by Meta, a fact that, according to the board, “calls into question the accuracy of Meta’s content moderation and the appeal process.”
The board notes that more than two years after it first recommended the company better align its Instagram and Facebook policies, Meta has repeatedly delayed doing so. The board is also concerned by Meta’s decision not to translate internal guidelines for content moderators into their native languages. While Meta argues that all its moderators are proficient in English, the board contends that an “English-only guideline could lead to context and nuances being missed across languages and dialects,” which may result in enforcement mistakes.
The board’s report also highlights the unclear nature of Meta’s “newsworthiness” exception, which allows certain rule-breaking posts to remain online if the company decides they have “public interest value.” The Oversight Board says little is known about the process Meta uses to determine whether content is newsworthy, and that the company’s responses often avoid direct answers.
The report mentions other challenges the board faces in its interactions with Meta. For instance, it took the board eight months to gain access to the company’s analytics tool, CrowdTangle. The report also notes that many of the board’s decisions were published after the 90-day window set out in its rules. The board cited several reasons for these delays, including “prolonged discussions with Meta on how much confidential information we could include in our final decision.”
The report points out that the board weighs in on only a tiny fraction of cases: in 2022, it issued just 12 decisions out of nearly 1.3 million user requests to reverse one of Meta’s moderation decisions. Although the board deliberately selects cases it believes will significantly affect Meta’s user base, these figures underscore that the board will never be able to handle most of the requests it receives, despite more than $280 million in funding from Meta. To address this, the Oversight Board plans to expedite certain cases and begin issuing swift “summary decisions” this year.
The report also discusses Meta’s suggestion that other social media companies could use the Oversight Board. The board says it welcomes companies that share its belief in transparent, accountable content governance, overseen by independent bodies, as essential to an online environment that respects freedom of expression and other human rights. While it remains to be seen whether Meta’s peers will work with the group, the board believes its experience can benefit other companies. As Oversight Board Director Thomas Hughes put it: “We’re not seeking to be the board for the entire industry, but we are seeking to share what we’ve learned and work with companies interested in establishing different bodies to standardize and …”
Frequently Asked Questions (FAQs) about Meta’s Oversight Board Improvements
What is the Oversight Board’s role in relation to Meta?
The Oversight Board, comprising approximately two dozen experts in human rights and free speech, was formed more than two years ago to provide guidance and make recommendations to Meta. Its objective is to make Meta’s rules more transparent and effective for its users.
What improvements has Meta made according to the Oversight Board’s latest report?
The report says that in 2022, for the first time, Meta made systemic changes to its rules and their enforcement, including user notifications and regulations on dangerous organizations, which the board called a positive development for Meta’s transparency.
What are the areas of improvement highlighted by the Oversight Board?
The Oversight Board highlighted the need for Meta to align its policies between Instagram and Facebook, translate internal guidelines for content moderators into their native languages, and improve the transparency of its “newsworthiness” exception.
What challenges does the Oversight Board face in its interactions with Meta?
The report mentions a few challenges such as gaining access to the company’s analytics tool, CrowdTangle, and having many of its decisions published after the 90-day timeframe set out in its rules. Some delays were also due to prolonged discussions with Meta on the inclusion of confidential information in the final decision.
What is the scale of the Oversight Board’s impact on Meta’s decisions?
The board issued 12 decisions in 2022, a small fraction of the nearly 1.3 million requests it received from users to reverse one of Meta’s moderation decisions. Although the board selects cases it believes will significantly affect Meta’s user base, it cannot handle most of the requests it receives.
What is Meta’s suggestion for other social media companies?
Meta suggests that other social media companies could also use the Oversight Board. The board is open to working with companies that share its belief in transparent and accountable content governance, overseen by independent bodies, as a crucial part of an online environment that respects freedom of expression and other human rights.