Meta’s long-running Oversight Board project is facing new challenges, with funding cuts on the way that will shrink the group’s staff, and with it, its capacity to hear future cases.
The Oversight Board was conceived as an example of how external regulation could provide a more equitable approach to content moderation, while also lessening the burden on each social media platform to make individual decisions about what should and should not be allowed in their apps. By appointing a team of experts to hear appeals to moderation decisions, the Oversight Board has played an important role in testing the validity of such rulings, while also helping to inform Meta’s policies moving forward.
But now, its capacity to act as an arbitrator could be challenged.
As per The Washington Post:
“The job eliminations will add to challenges facing the Oversight Board at a time when it has faced criticism for moving too slowly and cautiously to issue decisions and opinions that significantly shape how Meta handles contentious free speech debates. The cuts could also affect the Oversight Board’s longtime quest to be considered a viable model of governance for the social media industry in the eyes of regulators, civil society groups and the general public.”
Indeed, as we head into what’s set to be a highly contentious election year, the Oversight Board will again provide a valuable means of review for difficult moderation decisions.
The Oversight Board’s most high-profile case thus far was the banning of former President Donald Trump, which Meta implemented following the January 6th riots in 2021. The Board ultimately supported Meta’s justification for the suspension, but it also found that the decision lacked defined parameters and transparency. In response, Meta set a two-year end date for the suspension, with a review to be conducted at that stage to assess the risk of reinstatement. Trump was allowed to return to Meta’s apps last January, but has largely stayed away, with Truth Social now his primary social media focus.
This is just one example of how the Oversight Board has helped to improve the content moderation process, pressure testing Meta’s decisions and prompting the company to adopt more transparent, equitable processes as a result.
As such, it remains a valuable project, though it doesn’t seem to have gotten the attention of government groups, as Meta had initially hoped.
Part of the motivation for creating the Oversight Board was to enable Meta to showcase how a government-funded regulatory group could perform a similar role, which would put all social platforms on a more level playing field in terms of what they could and could not allow in their apps.
Amid ongoing criticism over its censorship decisions, Meta had hoped that the Oversight Board would demonstrate that third-party rulings on such calls are a better way. But despite the relative success of the project, there’s been no movement in Washington to implement a broader oversight group.
For its part, Meta says that it remains committed to the Oversight Board project despite the cuts, which suggests that it’ll be around for a while longer, helping to push the case. But external moderation enforcement looks like a long shot, meaning that content decisions will remain in the hands of each individual app, at least for the time being.
Which does make sense, given varying regulations by region, and the need for government groups to remain impartial. At the same time, it seems like there should be a more universal, impartial group making such calls on a broader scale.
Then again, such a group would also be open to manipulation, and we may ultimately be safer with individual platform programs like this one. So long as the funding for them continues.