Communities continue to push back against Facebook’s hands-off approach to moderation, embodied in CEO Mark Zuckerberg’s preference for a freewheeling user experience. These decisions carry direct consequences for the company and raise particular concerns among minority groups.
Facebook and the Ethics of Moderation – The Listening Post
Facebook continues to grapple with the fallout from its decision to allow right-wing activists and militia groups to use the platform to organize counter-protests against the Black Lives Matter demonstrations that followed the police shooting of Jacob Blake in Kenosha, Wisconsin. In response, non-binary software engineer Ashok Chandwaney resigned from the company, citing its continued failure to curb hate speech and the spread of violent rhetoric.
“I am quitting because I can no longer tolerate contributing to an organization that profits from hate in the U.S. and around the world,” they wrote in their resignation letter, published by The Washington Post. “Violent hate groups and far-right militias exist, and they use Facebook to recruit and radicalize people who will then commit violent hate crimes.”
Facebook’s lax moderation and its refusal to combat misinformation and hate on the platform are ongoing problems. For years, founder and CEO Mark Zuckerberg has deflected criticism from inside and outside the company. Earlier this summer, hundreds of employees staged a virtual walkout, a rare moment of public dissent, over Zuckerberg’s decision to let President Trump’s inflammatory, violent posts continue to circulate.