
Focusing on Buttocks: What You Need to Know About Facebook’s Newest Attempt to Clarify Its Content Rules

What can you post on Facebook, and what will Facebook remove? What’s ok and what crosses the line? Where the hell is the line?

Facebook has pulled the curtain back – at least a sliver – on its much-maligned content removal process, giving us a more detailed breakdown of each type of controversial content. For example, something like the promotion of self-injury has always been banned on Facebook – but what does the company actually mean when it says self-harm? What qualifies?

It’s questions like these that Facebook has decided to tackle with a new Community Standards page that offers more specifics than the company has ever given us on the topic.

“We have a set of Community Standards that are designed to help people understand what is acceptable to share on Facebook. These standards are designed to create an environment where people feel motivated and empowered to treat each other with empathy and respect,” says Head of Global Policy Management Monika Bickert.

“Today we are providing more detail and clarity on what is and is not allowed. For example, what exactly do we mean by nudity, or what do we mean by hate speech? While our policies and standards themselves are not changing, we have heard from people that it would be helpful to provide more clarity and examples, so we are doing so with today’s update.”

The new Standards explainer is here. You should go check it out, as it’s interesting to see how Facebook’s mind works (breastfeeding nipples ok, buttocks not ok). But I know you’re busy. I’ve been over it, and here are some of the more interesting distinctions in the new document.

– Facebook bans direct threats – those made toward other users as well as those made toward public figures. But did you know that Facebook takes the location of the threat-maker into account when attempting to determine the threat’s credibility? According to Facebook, it’ll assign more credibility to threats originating from, and targeting, people in “violent and unstable regions.”

“We may consider things like a person’s physical location or public visibility in determining whether a threat is credible. We may assume credibility of any threats to people living in violent and unstable regions,” says Facebook.

– Body modification does not qualify as self-mutilation or self-harm, so Facebook won’t remove posts about it. People can talk about suicide on the site, but only if they don’t promote or advocate it.

“We also remove any content that identifies victims or survivors of self-injury or suicide and targets them for attack, either seriously or humorously. People can, however, share information about self-injury and suicide that does not promote these things.”

– You can discuss ISIS all you want, but you cannot voice support for it or its actions.

“We remove content that expresses support for [terrorist activity or organized criminal activity]. Supporting or praising leaders of those same organizations, or condoning their violent activities, is not allowed.”

– You could get in trouble for sending too many friend requests to the same person. If they don’t want to be friends, they don’t want to be friends. In explaining its ban on bullying and harassment, Facebook says that “repeatedly targeting other people with unwanted friend requests or messages” is a no-no.

– You can discuss illegal activities on Facebook, as long as you’re not celebrating your own handiwork. And you should know this by now, but if Facebook thinks you pose a direct threat to an individual or the public at large, it’s going to tell the police.

“We prohibit the use of Facebook to facilitate or organize criminal activity that causes physical harm to people, businesses or animals, or financial damage to people or businesses. We work with law enforcement when we believe there is a genuine risk of physical harm or direct threats to public safety. We also prohibit you from celebrating any crimes you’ve committed. We do, however, allow people to debate or advocate for the legality of criminal activities, as well as address them in a humorous or satirical way.”

– For the first time, Facebook is specifically banning revenge porn.

“To protect victims and survivors, we also remove photographs or videos depicting incidents of sexual violence and images shared in revenge or without permissions from the people in the images.”

Both Reddit and Twitter have also written new language into their terms of service specifically targeting revenge porn.

– Facebook’s long and confusing history with nudity has been documented ad nauseam. Porn has always been banned on the site, and any sexual nudity has also been a deal breaker. But Facebook’s long been open to nudity when it’s art, and when it’s used in the depiction of a natural act like breastfeeding.

But of course, there are fine lines. And Facebook knows it often screws up policing said content.

“As a result, our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes. We are always working to get better at evaluating this content and enforcing our standards,” says the company.

Here’s Facebook’s thought process on nudity:

“We remove photographs of people displaying genitals or focusing in on fully exposed buttocks. We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring. We also allow photographs of paintings, sculptures, and other art that depicts nude figures. Restrictions on the display of both nudity and sexual activity also apply to digitally created content unless the content is posted for educational, humorous, or satirical purposes. Explicit images of sexual intercourse are prohibited. Descriptions of sexual acts that go into vivid detail may also be removed.”

Don’t focus in on those buttcheeks.

Did you catch that last part? Facebook may yank your status update if you talk about sex in a graphic manner. Not a photo or anything, just a simple text description.

– You can sell guns (as long as the proper checks have been done), but not pot (even where it’s legal).

– You can discuss others’ hate speech, but if you post anything attacking people on the basis of race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, or gender identity, or serious disabilities or diseases, Facebook will remove it if reported.

– If you’re going to share violent content, it can’t be to glorify the violence. Also, Facebook asks that you warn your audience that the video you’re posting is graphic.

Ok, everything’s cleared up now, right?

Probably not. And there’s still a lot of room for mistakes on Facebook’s part. Remember, Facebook isn’t scanning everything to find posts that go against its community standards – it still relies on user reports. Once Facebook’s content moderation team (which is partly outsourced) is made aware of a potentially improper piece of content, it’s a judgment call from there.

But this does give us a clearer picture. Just make sure that picture isn’t focusing on butts.