When Facebook talks about its policies, it often sounds like a government. Appearing before Congress earlier this month, Mark Zuckerberg offered testimony filled with state-like pronouncements. “I don’t want anyone to use our tools to undermine democracy. That’s not what we stand for,” he said.

But while it can sound like a state, Facebook has always been vague about its laws. The company does publish community guidelines, but it has offered little information about how it arrives at those guidelines, how it enforces them, and what people who disagree with its decisions can do to appeal them.

That’s changing. On Tuesday, Facebook announced that it’s releasing its internal content moderation guidelines and giving the public a look into the way it develops its policies. The best glimpse of these guidelines until now has come courtesy of the Guardian, which published a comprehensive report on them last year. Facebook is also introducing an appeals process for people whose posts were removed “for nudity or pornography, hate speech, and violence.”

“Our content reviewers consult a set of internal implementation guidelines in order to make decisions,” Facebook Global Policy Management VP Monika Bickert said in a blog post. “For the first time, we’re sharing updated Community Standards that include these internal guidelines.”

The added transparency, Bickert said, is coming for two reasons:

First, the guidelines will help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give…
