The Facebook Oversight Board takes on its first six cases

The Facebook Oversight Board, an independent body that reviews Facebook's moderation decisions, has accepted its first cases. The six appeals involve posts removed under Facebook's hate speech rules, nudity prohibitions, and misinformation policies. They are now open for seven days of public comment, after which the board will decide whether the posts should have been taken down.
 
Most of the cases involve users outside the US posting in languages other than English, a recognized blind spot in Facebook's moderation, and at least two hinge on the nuance of someone publishing hateful content in order to criticize it. One user, for example, posted screenshots of derogatory tweets by former Malaysian Prime Minister Mahathir Mohamad, allegedly to raise awareness of his "horrible words." Another case involves a user who shared an alleged quotation from Joseph Goebbels, but who appealed by arguing that the quote was meant to compare US politics to a "fascist model."
 
Each case will be assigned to a five-member panel that includes at least one member from the region where the original content was posted. These panels will reach their decisions, and Facebook will act on them, within 90 days. The Oversight Board, whose members were first announced in May, includes digital rights advocates and former European Court of Human Rights judge András Sajó. Public comments will inform its rulings.
 
Five of the cases were submitted by users, who have appealed more than 20,000 decisions since the appeals process opened in October. The sixth was referred by Facebook itself and concerns coronavirus-related misinformation, one of the most pressing issues on the platform. Moderators removed a video criticizing French health authorities for declining to authorize hydroxychloroquine, an unproven treatment for COVID-19 that the video incorrectly described as a "cure."
 
Facebook described the case as an example of the challenges it faces in addressing the risk of offline harm that misinformation about the COVID-19 pandemic can cause.
 
Facebook CEO Mark Zuckerberg has likened the Oversight Board to a Supreme Court for Facebook. It is meant to provide a fair appeals process for users whose posts are removed, something that often seems lacking on social networks, particularly as they take tougher measures against misinformation and offensive speech. At the same time, it shifts some of the burden of moderation decisions away from Facebook. A ruling in the pandemic video case, for example, would set an independently reached precedent for whether Facebook should remove similar videos in the future.
 
 
The Oversight Board, like the Supreme Court of the United States, is primarily meant to interpret existing rules, not write new ones. However, Facebook has suggested that it could still turn to the board for policy recommendations in the future.
 
Many of Facebook's problems stem from the speed and scale of content moderation, not the fine points of interpreting its policies. The board obviously cannot hear every appeal, and we do not yet know precisely how rank-and-file moderators will apply its rulings to everyday decisions. But it is the start of a long-awaited experiment in governing Facebook (a little) more like a country.
