Facebook’s internal social networking rules on a number of sensitive issues



Facebook’s internal rules for moderating sensitive content such as sex, terrorism and violence were revealed in an investigation published by the British daily The Guardian. These internal policies guide Facebook moderators in deciding which content may remain online.

Facebook’s secret rules and guidelines for deciding what its more than 2 billion users are allowed to publish on the social network have been revealed for the first time by a Guardian investigation. The revelations will most likely fuel heated debate about the role and professional ethics of the American company, the British journalists say. Reporters and editors at The Guardian reviewed more than 100 Facebook training manuals, spreadsheets and flowcharts, which offer an unprecedented look at Facebook’s methods for moderating content involving violence, hate speech, terrorism, pornography, racism and self-harm. There are moderation policies even on match-fixing and cannibalism. The Guardian’s investigative dossier, called the “Facebook Files”, appeared at a time when the American company was under intense political pressure in Europe and the United States. Facebook’s internal rules reflect the difficulties company managers face in responding to new challenges such as “revenge porn”, as well as the difficulties of the moderators themselves, who say they are overwhelmed by the volume of work and often have “only 10 seconds” to make a decision about a piece of content. “Facebook cannot keep control of its content. It has become too big, it has grown too much and too fast,” one source said. Many moderators are concerned about the inconsistent and peculiar nature of some of the company’s policies. Sexual content, for example, is the most complex and confusing.

An internal document says Facebook reviews more than 6.5 million reports a week about potentially fake accounts – known by the acronym FNRP (fake, not real person). Using thousands of slides and photos, Facebook has built a rule book that worries many people who argue that the social network is now a publisher and must do more to remove hateful, harassing and violent content. The internal rules could also alarm advocates of free expression, who worry about the de facto role Facebook holds and the risk of it becoming the world’s largest censor. Both camps will most likely demand greater transparency, The Guardian notes. Journalists at the British daily were able to analyze the documents supplied to Facebook moderators over the past year. Based on these documents, they reached the following conclusions:

- Remarks such as “Someone shoot Trump” should be deleted, because Donald Trump, as a head of state, falls into a protected category. However, statements such as “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat” and “fuck off and die” may remain, because these phrases are not considered credible threats.
- Videos of violent deaths, although marked as disturbing content, do not always have to be deleted, because they can help raise public awareness of issues such as mental illness.
- Some photos of non-sexual physical abuse and bullying of children need not be deleted or “actioned” unless they contain sadistic elements or celebrate the abuse.
- Photos of animal abuse can be shared; only extremely upsetting images must be marked as “disturbing”.
- All handmade art showing nudity and sexual activity is allowed, but digitally made art showing sexual activity is not.
Videos showing abortions are allowed as long as they contain no nudity. Facebook will let users livestream attempts at self-harm because the platform “does not want to censor or punish people in distress”. Anyone with more than 100,000 followers on the network is designated a “public figure”, which denies such users the full protections granted to private individuals. Other remarks that are allowed on Facebook, according to the documents revealed by The Guardian, include “Little girl needs to keep to herself before daddy breaks her face” and “I hope someone kills you”. These threats are considered either generic or not credible. In one of the documents, Facebook acknowledges that “people use violent language to express their frustration online” and feel safe doing so on the site. “They feel that the issue won’t come back to them and they feel indifferent towards the person they are threatening because of the lack of empathy created by communication via devices, as opposed to situations where people face each other,” the American company says. “We should say that violent language is most often not credible, until specific language gives us reasonable grounds to believe it is no longer merely an expression of emotion but a transition to a plot or a deliberate act. From this perspective, expressions such as ‘I’m going to kill you’ and ‘fuck off and die’ are not credible; they are a violent expression of dislike and frustration.” “People regularly express their disdain or disagreement by threatening or calling for violence, generally in facetious and unserious ways,” the same document states.
Facebook has also publicly acknowledged that “not all disagreeable or disturbing content violates our community standards”. Monika Bickert, Facebook’s head of global policy management, says the social network has 2 billion users and that it is difficult to reach consensus on what content should be allowed on the platform. “We have a really diverse, global community, and people will always have very different ideas about what is OK to share. No matter where you draw the line, there will always be grey areas. For instance, the line between satire and humour and inappropriate content is sometimes very grey. It is very difficult to decide whether some content belongs on the site or not,” Bickert added. “We feel responsible to our community to keep them safe, and it is our responsibility to manage this properly. It’s a company commitment. We will continue to invest proactively in keeping the site safe, but we also want to empower people to report to us any content that breaches our standards,” she said. Bickert also believes that some offensive comments may violate Facebook’s policies in certain contexts but not in others. Facebook’s policies on issues such as violent deaths, images of non-sexual physical abuse of minors and cruelty to animals show how the American company is trying to navigate a minefield. “Videos of violent deaths are disturbing, but they can help create awareness. For videos, we think minors need protection and adults need to be able to choose. We mark as ‘disturbing’ videos of the violent deaths of humans,” the Facebook Files say. Such footage must be “hidden from minors” but should not be automatically deleted, because it can be “valuable in creating awareness of self-harm, mental illness, war crimes and other important issues”. On non-sexual child abuse, Facebook’s internal guide states: “We do not action photos of child abuse. We mark as ‘disturbing’ videos of child abuse.
We remove imagery of child abuse if it is shared with sadism and celebration.” One slide explains that Facebook does not automatically delete evidence of non-sexual abuse of a child, so that the material can be shared and “the child can be identified and rescued, but we add an extra layer of protection to shield the audience”. That protection could take the form of a warning that a particular video is disturbing. Facebook confirmed that there are “some situations where we do allow images of non-sexual abuse of a child for the purpose of helping the child”. Facebook’s policies on animal abuse are also explained. “We allow photos and videos documenting animal abuse for awareness, but may add extra protections for the audience in the case of content perceived as extremely disturbing,” one company slide says. “Generally, imagery of animal abuse can be shared on the site. Some extremely disturbing imagery may be marked as ‘disturbing content’.” Photos of mutilated animals, including those showing torture, can be marked in the same way without being deleted. Moderators can also leave up images of abuse in which a human kicks or beats an animal. Facebook’s position on the issue is: “We allow people to share images of animal abuse to raise awareness and condemn the abuse, but we remove content that celebrates cruelty against animals.” The documents show that Facebook issued new guidance on nudity after the 2016 outcry over its removal of an iconic Vietnam War photograph because the girl in the picture was naked. Facebook now allows “newsworthiness exceptions” under its “terror of war” guidelines, but draws the line at images of “child nudity in the context of the Holocaust”. Facebook representatives told The Guardian that the network uses software to intercept certain graphic content before it appears on the site.
“But we want people to be able to discuss global and current events … so the context in which a violent image is shared sometimes matters.” Some critics in the United States and Europe have demanded that Facebook be regulated in the same way as mainstream publishers and broadcasters. But Bickert maintains that Facebook is “a new kind of company”. “It’s not a traditional technology company. It’s not a traditional media company. We build technology, and we feel responsible for how it’s used. We don’t write the news that people read on the platform,” she added. A report by British lawmakers, published on 1 May, states that “the biggest and richest social media companies are shamefully far from taking sufficient action to tackle illegal and dangerous content, to implement proper community standards or to keep their users safe”. Sarah T. Roberts, an expert on content moderation, said: “It’s one thing when you’re a small online community, a group of people who share principles and values, but it’s quite different when you have a large percentage of the world’s population and say ‘share yourself’. You are in quite a muddle.” “Then, when you profit from that practice, you are entering a disastrous situation,” she added. Facebook has repeatedly struggled to weigh the news value and “awareness” value of violent imagery. The company was harshly criticized recently for failing to remove the video of the murder of Robert Godwin in the United States and the video of a father killing his daughter in Thailand. At the same time, Facebook has played an important role in distributing, worldwide, video clips of police in the United States killing citizens, alongside other government abuses.
In 2016, Facebook removed a video showing the immediate aftermath of the police shooting of Philando Castile, but later reinstated the clip, saying its deletion had been “a mistake”.
