Facebook Publicly Releases Community Guidelines

In the past, Facebook has typically kept quiet about its policies, especially those concerning data collection and the Facebook app, which collects an enormous amount of personal data from users' phones. However, ever since the Facebook–Cambridge Analytica scandal, Facebook has tried to be more transparent with the general public. Because of this, we're seeing far more policy changes now than at any other point in the company's 14-year history. Yesterday, Facebook surprised everyone by releasing dozens of pages of internal guidelines — the same guidelines the company uses to determine when to remove certain types of content from the platform. Find out what kind of content Facebook deems inappropriate below.

Facebook’s Community Guidelines


According to Facebook's community guidelines, the company breaks down inappropriate content into several categories: Violence and Criminal Behavior, Safety, Objectionable Content, Integrity and Authenticity, Respecting Intellectual Property, and Content-Related Requests. Each of these categories comes with its own detailed explanation of what is acceptable to post and what will be flagged for removal. For example, under the "Integrity and Authenticity" category, spam is described as something that will be taken down from the platform. This includes fake accounts, false advertising, or any practice used to artificially increase financial gain.

If you’re interested in reading through the rest of the community guidelines, you can find them here.

New Appeals System


In addition to the community guidelines, Facebook has also introduced a new appeals system. The previous system only allowed users to review the reasons why a particular piece of content was taken down; once that content was removed, there was nothing they could do to bring it back. That all changed yesterday. Instead of just receiving a content removal notification, users will also be given the option to "request a review." This is great for people who believe their content may have been taken down unjustly. Now that we have a better understanding of what constitutes inappropriate content, users can decide for themselves whether or not Facebook should review their content.

So what do you think of Facebook’s new community guidelines? Do you think the social media giant is doing enough to keep the platform safe for its users? Let us know in the comments.

You Might Also Like: The Mark Zuckerberg Hearing: Everything You Should Know