Facebook Finally Reveals How It Bans Content

Social network now lets users appeal when their posts are pulled down

On Tuesday, for the first time in its history, Facebook released its internal guidelines for deciding which posts are and aren’t fit to remain on its platform, shedding light on the company’s policies against bullying, “cruel and insensitive” posts, and hate speech.

Up to this point, the social network had shared only a brief snapshot of its “community standards,” leaving many users in the dark about why their pages had been suspended or deleted. Monika Bickert, Facebook’s VP of product policy and counterterrorism, said in a blog post that the company is sharing its 25-page rulebook as an “effort to explain where we draw the line when it comes to content on Facebook.”

Drawing that line appears tricky. Bickert said Facebook wants to be both a “safe place” and a “place to freely discuss different points of view” — two goals that can be at odds with one another.

Facebook’s guidelines touch on a number of reasons posts or pages can be removed, including promoting violence and posting nude pictures. “Cruel and insensitive” comments, which the company defines as those targeting “victims of serious physical and emotional harm,” are also subject to removal, as is hate speech.

“We define hate speech as a direct attack on people based on what we call protected characteristics — race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease,” said the company in its community standards. “We also provide some protections for immigration status. We define attack as violent or dehumanizing speech, statements of inferiority, or calls for exclusion or segregation.”

CEO Mark Zuckerberg touched on the subject during his congressional testimony earlier this month, saying he’s “optimistic” artificial intelligence will help the company weed out hate speech in the next five to 10 years.

Bullying is another red flag for Facebook. The company said it will “remove content that purposefully targets private individuals with the intention of degrading or shaming them.”

Facebook’s decision to post its guidelines comes after criticism that the company has been too secretive about its enforcement. Facebook added on Tuesday that it’s expanding its appeals process: users will be told why their content was removed and can then hit a “request review” button, and Facebook will get back to them within about 24 hours on whether their content can go live again. Until now, the company allowed appeals only for the removal of Groups or Pages.