Facebook has long tried to explain what is and isn’t allowed on its platform through its publicly available community standards, but today the company published what it says are the internal guidelines it uses to enforce those rules.

The underlying problem Facebook has wrestled with — almost since its inception — is that the lines between “free speech,” “obscenity,” “objectionable content,” “harassment” and a million other related terms are blurred, and the usual issues around subjectivity are often hugely divisive.

For example, Facebook has long faced criticism over how it handles photos of breastfeeding mothers, and iconic images such as the “Napalm girl” photo have been censored for breaching the company’s broad nudity guidelines.

Elsewhere, Facebook has become embroiled in debates over what constitutes “hate speech” and similarly inflammatory content. The company often finds itself in the unenviable position of playing judge and jury on matters that aren’t always clear-cut, and at a scale that no human workforce could manage in a timely manner.

The new publicly available enforcement guidelines cover violence and criminal behavior, safety, objectionable content, integrity and authenticity, respecting intellectual property, and content-related requests. Most of the sections are fairly self-explanatory and not entirely surprising — for example, if you share copyrighted material such as a video that you do not own, don’t be surprised if it magically disappears. And Facebook isn’t overly keen on content related to…