Meta's external advisory group, the Oversight Board, announced today that it is expanding its scope to Threads, along with Facebook and Instagram, to scrutinize Meta's content moderation decisions.
This means that Threads users who are dissatisfied with Meta's decisions on issues such as content or account removal can appeal to the Oversight Board.
“The Board's expansion to Threads builds on our foundation of helping Meta solve the toughest content moderation problems. It is critical that we hold ourselves accountable,” Oversight Board co-chair Helle Thorning-Schmidt said in a statement.
In 2018, Mark Zuckerberg spoke publicly for the first time about creating an independent oversight board. Facebook proposed the board's bylaws in January 2020 and announced its first members in May. In October 2020, the board announced it would begin reviewing cases. In 2021, it expanded its scope to include reviewing Meta's decisions to leave certain content up.
The Oversight Board has ruled on several important cases over the years. Its most notable ruling criticized Facebook for banning former President Donald Trump “indefinitely.” The board agreed that Trump had violated the platform's rules, but said those rules made no provision for an indefinite suspension.
Earlier this year, the board called on Meta to reform its “disjointed” rules regarding fake videos.
Moderating content on Threads
Since Threads launched last July, users have repeatedly questioned its moderation practices. In October, the Washington Post reported that the platform was blocking search terms such as “gore,” “nudity,” “sex,” and “porn,” as well as “coronavirus” and “vaccine.” That same month, Instagram chief Adam Mosseri said the block was temporary; however, the company has not lifted it as of today.
Earlier this month, Threads began running a fact-checking program, and some users' posts were being labeled. However, the company clarified that this was to match Threads posts against existing fact-checks from Meta's other properties. Last year, Meta said it planned to introduce a separate fact-checking program for Threads, but the company had not finalized which fact-checkers would participate.
Mosseri has remained adamant about his decision not to promote political content or “amplify the news” on the platform. But Meta said last week that the social network's newly rolled out Trending Topics feature could include political content as long as it doesn't violate company policies.
Catalina Botero Marino, co-chair of the Oversight Board, pointed out that content moderation will become increasingly difficult this year due to elections in countries such as the United States and India, advances in AI, and conflicts around the world.
“With conflicts escalating in Europe, the Middle East and the rest of the world, billions of people heading to the polls in global elections, and abuse against marginalized groups increasing online, we cannot afford to miss a moment. Advances in artificial intelligence could make content moderation even more difficult,” Marino said.
“The Board is committed to finding concrete solutions that reduce harm while protecting freedom of expression, and we look forward to setting standards that improve the online experience for millions of Threads users.”
The Oversight Board's process
The Oversight Board has not changed its process for appealing Meta's decisions on Threads. Users must first appeal to Meta, and if they are dissatisfied with the social platform's decision, they can appeal to the board within 15 days of receiving the verdict.
The board may take up to 90 days from the date of appeal to review a decision. Both the board and Meta have been criticized in the past for their slow responses, but the organization is not changing this timeline for Threads at this time.
Notably, the board can issue both recommendations and decisions. While its recommendations are not binding, Meta is obligated to abide by the board's decisions.