A number of documents and manuals used to train Facebook’s moderators have been exposed in an investigative report by The Guardian, revealing the type of content users are and aren’t allowed to post on the social networking site.
That includes taking some controversial stances. For instance, it’s allegedly Facebook policy to allow livestreams of people attempting self-harm, removing the video only “once there’s no longer an opportunity to help the person … unless [the videos] are newsworthy”.
Another example relates to violent language, which Facebook deems against the rules only if the specificity of the language makes it seem like it’s “no longer simply an expression of emotion but a transition to a plot or design”. General statements like “let’s beat up fat kids” (a direct quote) can remain on the site, whereas a request for a presidential assassination would be removed.
Moderation of the nation
The Guardian report is part of a series the site is calling ‘Facebook Files’ – a combination of articles that discuss the guidelines in depth, and also provide samples of the original moderation documents themselves. The guidelines cover a huge range of specific topics, ranging from the showing of animal cruelty to non-sexual child abuse, and detail how Facebook feels each should be addressed.
Facebook already has around 4,500 content moderators whose sole job is to wade through user reports of disturbing or inappropriate content, and the company has said it plans to hire another 3,000 to help deal with the massive workload. While this army of screening staff deals with those reports, they apparently don’t touch any of the content when it is first posted – that job is instead delegated to automated systems and checks.
These issues are obviously ethically complex, and for many people it will be irksome to see these topics discussed through the lens of corporate interest, no matter how reasonable the policy surrounding each problem may be.