Meta’s Oversight Board extends its scope to Threads


Meta’s external advisory group, its Oversight Board, announced today that it is expanding its scope to cover Threads, in addition to Facebook and Instagram, and scrutinize Meta’s content moderation decisions on the newer app.

This means that if users on Threads are dissatisfied with Meta’s decisions on issues like content or account takedowns, they can appeal to the Oversight Board.

“The Board’s expansion to Threads builds on our foundation of helping Meta solve the most difficult questions around content moderation. Having independent accountability early on for a new app such as Threads is vitally important,” Oversight Board co-chair Helle Thorning-Schmidt said in a statement.

Mark Zuckerberg first formally floated the idea of an independent oversight board in 2018. In January 2020, Facebook proposed the board’s bylaws, and in May of that year it announced the first set of members. In October 2020, the board said that it would start reviewing cases, and in 2021 the oversight body expanded its scope to review Meta’s decisions to keep certain content up.

The Oversight Board has ruled on some important cases over the years. Its most notable ruling criticized Facebook for "indefinitely" banning former President Donald Trump. While the board agreed that Trump broke the platform’s rules, it said the guidelines had no provision for an indefinite ban.

Earlier this year, the board called on Meta to reform its “incoherent” rules about fake videos.

Content moderation on Threads

Since Threads launched in July last year, users have questioned its moderation practices multiple times. In October, The Washington Post reported that the platform was blocking search terms like "Covid" and "vaccines," along with "gore," "nude," "sex" and "porn." The same month, Instagram head Adam Mosseri said the block was temporary, but the company hasn’t lifted it as of today.

Earlier this month, Threads appeared to be running a fact-checking program, with users seeing labels on some posts. However, the company clarified that it was simply matching existing fact checks from Meta’s other properties to posts on Threads. Last year, Meta said it intended to introduce a separate fact-checking program for Threads, but it hasn’t finalized which fact-checkers will take part.

Mosseri has been adamant about the platform’s decision not to proactively recommend political content or "amplify news." However, the social network’s newly rolled-out trending topics function could include political content as long as it doesn’t break company policy, Meta said last week.

Oversight Board co-chair Catalina Botero Marino pointed out that content moderation has become harder, with elections coming up this year in countries like the U.S. and India, advances in AI and conflicts across the world.

“With conflicts raging in Europe, the Middle East and elsewhere, billions of people heading to the polls in global elections and growing abuse towards marginalized groups online, there is no time to lose. Advances in artificial intelligence can make content moderation even more challenging,” Marino said.

“The Board is dedicated to finding concrete solutions that protect freedom of expression while reducing harm, and we look forward to setting standards that will improve the online experience for millions of Threads users.”

The Oversight Board’s appeals process

The Oversight Board hasn’t changed its process for appealing Meta’s decisions on Threads. A user must first appeal to Meta; if they are unhappy with the platform’s verdict, they have 15 days from receiving it to appeal to the board.

The board can take up to 90 days from the date of appeal to review a decision. Both the board and Meta have been criticized in the past for responding slowly, but the organization isn’t changing any of its processes for Threads at the moment.

Notably, the board can issue both recommendations and decisions. While recommendations are not binding, Meta is obliged to follow the board’s rulings.

Meta’s Oversight Board takes its first Threads case


Meta’s Oversight Board has now extended its scope to include the company’s newest platform, Instagram Threads. Designed as an independent appeals body that hears cases and then makes precedent-setting content moderation decisions, the board has to date ruled on cases like Facebook’s ban of Donald Trump, COVID-19 misinformation, the removal of breast cancer photos, and more.

Now the board has begun hearing cases emerging from Threads, Meta’s Twitter/X competitor.

This is an important point of differentiation between Threads and rivals like X, where Elon Musk and other users rely heavily on crowdsourced fact-checks from Community Notes to complement the platform’s otherwise light moderation. It’s also very different from how decentralized solutions, like Mastodon and Bluesky, manage moderation duties on their platforms. Decentralization allows community members to establish their own servers with their own moderation rules and gives them the option to de-federate from other servers whose content runs afoul of their guidelines.

The startup Bluesky is also investing in stackable moderation, meaning community members can create and run their own moderation services, which can be combined with others to create a customized experience for each individual user.

Meta’s move to offload difficult decisions to an independent board that could overrule the company and its CEO Mark Zuckerberg was meant to be the solution to the problem of Meta’s centralized authority and control over content moderation. But as these startups have shown, there are other ways to do this that allow the user to be more in control of what they see, without stepping on the rights of others to do the same.

Nevertheless, the Oversight Board on Thursday announced it would hear its first case from Threads.

The case involves a user’s reply to a post containing a screenshot of a news article in which Japanese Prime Minister Fumio Kishida made a statement about his party’s alleged underreporting of fundraising revenues. The reply included a caption criticizing Kishida over tax evasion, the phrase "drop dead," hashtags calling for death, and derogatory language for someone who wears glasses. Because of the "drop dead" phrase and the hashtags calling for death, a human reviewer at Meta decided the post violated the company’s Violence and Incitement rule, even though it sounds much like a run-of-the-mill X post these days. After their appeal was denied a second time, the user appealed to the Board.

The Board says it selected this case to examine Meta’s content moderation policies and enforcement practices for political content on Threads. That’s a timely move, considering that it’s an election year and that Meta has declared it would not proactively recommend political content on Instagram or Threads.

The Board’s case will be the first involving Threads, but it won’t be the last. The organization is already preparing to announce another bundle of cases tomorrow, focused on criminal allegations based on nationality. Those cases were referred to the Board by Meta, but the Board will also receive and weigh in on appeals from Threads users, as it did with the case concerning Prime Minister Kishida.

The decisions the Board renders will influence how far Threads goes in upholding users’ ability to express themselves freely, and whether it will moderate content more closely than Twitter/X does. That will ultimately help shape public opinion about these platforms and influence users to choose one or the other, or perhaps a startup experimenting with new ways to moderate content in a more personalized fashion.