The latest revelation from The Guardian, in its series of reports, concerns Facebook's internal guidelines for content involving threats, violence, and more. This is not just two or three reports but an entire series drawing on thousands of internal documents. The leaked Facebook rules lay out the regulations the social media platform uses to determine what content stays on its site, including those covering threats, violence, and nudity. In short, everything that has landed Facebook in controversy in the past.
Facebook Rules Leaked and How!
This new insight into what the social media giant allows its users to post is pretty shocking: a threat against the U.S. President gets deleted, but a similar threat against an ordinary individual is not acted upon until it becomes really serious. Isn't that a major contradiction? A social media platform that earns its bread and butter solely from its content has been fooling its users all this while.
The Guardian states that it has reviewed more than a hundred internal spreadsheets, flowcharts, and training manuals. These are the documents that guide Facebook's moderators in taking appropriate action when content is reported.
A Platform for Free Speech
Since its inception, Facebook has presented itself as a social media platform that promotes free speech while also addressing users' privacy and safety concerns. To keep this in check, the California company typically uses automated systems to remove content related to terrorism or child sexual abuse. The remaining issues, such as threats and violence, are left in the hands of Facebook's team of moderators.
The moderators have specific guidelines to help them distinguish a baseless threat from a serious one. Posts that mention specific timings, methods, or targets are naturally given priority over general or vague ones. Moreover, Facebook specifically designates certain individuals (heads of state, senior police officers, etc.) and specific groups (such as the homeless or Zionists); threatening content about them is deleted or other preventive measures are taken immediately.
Abuse is Permitted!
The Facebook rules leaked by The Guardian also point out some striking distinctions the site draws. According to the guidelines, pictures and even videos of animal abuse are permitted. Why? Because the social media giant believes this raises awareness among people. It also allows posts in which users attempt to harm themselves. The lamest reason the company can give for this is that it does not want to punish or censor users who are in distress and may be attempting suicide.
Moderators Overwhelmed
The Guardian notes that these moderators review millions of reports of suspected content. Given the sheer volume of content they have to review, they often feel overwhelmed. This may be why they make mistakes, especially in the area of permissible sexual content.
However, Monika Bickert, Facebook's Head of Global Policy Management, told The Guardian that the company's diverse audience means the range of what is considered acceptable behavior is very wide, and that some comments may violate the company's policies in one context but not in another.
Facebook Needs to Keep a Proper Check on Content

Facebook CEO Mark Zuckerberg
Reports that Facebook is not far from hitting the 2 billion user mark did the rounds on the internet just a short time ago. At such a scale, Facebook certainly has to step up its content-checking guidelines.
In recent months, Facebook has seen critical incidents such as the live broadcast of a killing and the removal of an iconic Vietnam War photograph, among others, as reported by The Verge. These incidents prompted the company to hire more moderators for content screening and to take a fresh look at its policies.
With such postings made possible, many questions arise. Does Facebook have major loopholes in its policies? Is its response to content involving ordinary users too slow? Is content screening at Facebook weak? Has Facebook been fooling us all this while about its guidelines on content involving violence, threats, and abuse?
With an ever-increasing user base and its rules now leaked, the company will really have to buck up, improving its guidelines and addressing the protection concerns of its users.