
Facebook Delves Deep Into The Reporting Process

With 900+ million people on the site, it’s inevitable that someone is going to post something that offends your sensibilities. And although Facebook has systems in place to weed out content that violates its terms of service, the company has always relied on users to report inappropriate, malicious, or otherwise unsafe content that they run across during their daily browsing.

But what happens when you click that “Report” button? Today, Facebook is giving us an inside look at the entire process – and let me tell you, a whole hell of a lot goes into it.

Here’s what they had to say in a note posted to the Facebook Security page:

There are dedicated teams throughout Facebook working 24 hours a day, seven days a week to handle the reports made to Facebook. Hundreds of Facebook employees are in offices throughout the world to ensure that a team of Facebookers are handling reports at all times. For instance, when the User Operations team in Menlo Park is finishing up for the day, their counterparts in Hyderabad are just beginning their work keeping our site and users safe. And don’t forget, with users all over the world, Facebook handles reports in over 24 languages. Structuring the teams in this manner allows us to maintain constant coverage of our support queues for all our users, no matter where they are.

Facebook goes on to explain that “User Operations” is broken up into four separate teams that each have their own area of concentration – the Safety team, the Hate and Harassment team, the Access team, and the Abusive Content team. Any content that one of those teams finds to be in violation of Facebook policy will be removed, and the user who posted it will get a warning. From there, the User Operations teams determine whether further action is required – like blocking the user from posting certain types of content or, in some cases, disabling their account altogether.

And if any of that goes down, a warning, a restriction, or a disabled account, there is also a team that handles appeals.
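To make the flow a little easier to picture, here’s a rough and entirely hypothetical sketch of how a report-routing pipeline like the one Facebook describes might look. The category names, escalation thresholds, and function names here are our own assumptions for illustration, not anything Facebook has published:

```python
from enum import Enum, auto
from dataclasses import dataclass

class Team(Enum):
    # The four User Operations teams Facebook describes.
    SAFETY = auto()
    HATE_AND_HARASSMENT = auto()
    ACCESS = auto()
    ABUSIVE_CONTENT = auto()

class Action(Enum):
    NO_VIOLATION = auto()      # content stays up; reporter can contact the poster directly
    REMOVE_AND_WARN = auto()   # content removed, poster gets a warning
    RESTRICT_POSTING = auto()  # poster blocked from posting certain content types
    DISABLE_ACCOUNT = auto()   # account disabled; a separate team handles appeals

# Hypothetical mapping from report categories to teams (our labels, not Facebook's).
ROUTING = {
    "self_harm": Team.SAFETY,
    "bullying": Team.HATE_AND_HARASSMENT,
    "hacked_account": Team.ACCESS,
    "spam_or_graphic_content": Team.ABUSIVE_CONTENT,
}

@dataclass
class Report:
    category: str
    prior_warnings: int   # assumed escalation signal
    violates_policy: bool # outcome of the team's review

def route(report: Report) -> Team:
    """Send a report to the team that specializes in its category."""
    return ROUTING.get(report.category, Team.ABUSIVE_CONTENT)

def resolve(report: Report) -> Action:
    """Decide what happens after a team reviews the reported content."""
    if not report.violates_policy:
        return Action.NO_VIOLATION
    if report.prior_warnings == 0:
        return Action.REMOVE_AND_WARN
    if report.prior_warnings < 3:  # threshold is purely an assumption
        return Action.RESTRICT_POSTING
    return Action.DISABLE_ACCOUNT

if __name__ == "__main__":
    report = Report(category="bullying", prior_warnings=1, violates_policy=True)
    print(route(report).name, resolve(report).name)
```

Again, this is just a mental model of the routing-and-escalation steps described above, not a peek at Facebook’s actual code.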

The security team adds that they aren’t alone in making sure you have a safe experience on the site:

And it is not only the people who work at Facebook that focus on keeping our users safe, we also work closely with a wide variety of experts and outside groups. Because, even though we like to think of ourselves as pros at building great social products that let you share with your friends, we partner with outside experts to ensure the best possible support for our users regardless of their issue.

These partnerships include our Safety Advisory Board that helps advise us on keeping our users safe to the National CyberSecurity Alliance that helps us educate people on keeping their data and accounts secure. Beyond our education and advisory partners we lean on the expertise and resources of over 20 suicide prevention agencies throughout the world including Lifeline (US/Australia), the Samaritans (UK/Hong Kong), Facebook’s Network of Support for LGBT users, and our newest partner AASARA in India to provide assistance to those who reach out for help on our site.

It’s obvious that the reporting process is a complicated one, and users naturally report plenty of content that they just don’t like but that doesn’t actually violate Facebook’s terms. In that case, it’s up to the user to contact the poster and ask that they take it down. Facebook says that user safety and security is of “paramount importance” around their offices, but for it to work, the onus is on the user to click that “Report” button when they come across inappropriate content.

Check out Facebook’s neat little infographic on the process below: