Facebook’s oversight board, which is tasked with reviewing the removal and moderation of posts on the site, has reversed several of the platform’s previous decisions to remove posts, highlighting the need for clearer definitions and rules.
The oversight board took up its first five cases in December 2020. While only one of the posts under review came from the US, the five posts together spanned four continents, each of which could interpret the rulings very differently. As such, the policies Facebook puts in place must be precise and must work for whatever community its moderation tools are targeting.
“Facebook’s ‘independent review’ activity must be consistent across international boundaries,” Jim Isaak, a former president of the Institute of Electrical and Electronics Engineers and a 30-year veteran of the technology industry, wrote us via email. “But what is ‘hate speech’ in the U.S. might be defined as patriotic in other autocratic societies, adding to the complexity of what is being done.”
This need for consistency and more precise rules is already apparent. Of the five cases Facebook’s oversight board took up in December, the group overturned four of the platform’s decisions, two of which clearly demonstrate the need for better moderation.