Facebook is seeking to “police” its content in an effort to remove hate speech, with Myanmar and Sri Lanka among the countries in its sights.
In a blog post published this week, “Understanding Social Media and Conflict”, Facebook stresses that a dedicated team is working to remove hate speech, reduce misinformation and polarization, and inform people through digital literacy programmes.
Facebook has come under heavy criticism for allowing hate speech to flourish on its platform, and critics remain skeptical about the efforts being made by the social media giant to rein in objectionable material.
Hate speech on Facebook has been cited as a contributory factor in the Rakhine crisis in Myanmar.
The company highlighted action it has recently taken in two countries, Sri Lanka and Myanmar, both of which have experienced conflict fueled by social media.
“In the last year, we visited countries such as Lebanon, Cameroon, Nigeria, Myanmar, and Sri Lanka to speak with affected communities in these countries, better understand how they use Facebook, and evaluate what types of content might promote depolarization in these environments. These findings have led us to focus on three key areas: removing content and accounts that violate our Community Standards, reducing the spread of borderline content that has the potential to amplify and exacerbate tensions and informing people about our products and the internet at large. To address content that may lead to offline violence, our team is particularly focused on combating hate speech and misinformation,” the post says.