WebProNews

Tag: odesk

  • It Would Cost $37 Billion to Pre-Screen YouTube Annually

    It was recently reported that YouTube now sees 72 hours of video uploaded every minute, a figure that has climbed steadily since the site’s inception in 2005. By 2007, users were adding six hours of video per minute; by January 2009, that number hit 15 hours; by March 2010, 24 hours; by November of that year, 35 hours; and so forth.

    Engineer Craig Mansfield came up with a figure for how much it would cost Google to pre-screen all of its content – a staggering $37 billion a year.

    YouTube has had copyright infringement problems since its inception, with record labels and movie studios suing over the platform’s lack of control over what its users upload. YouTube recently lost a court case in Germany over 12 unlicensed songs a user uploaded to its servers. The plaintiff had urged YouTube to install better upload filters to stop illegal content from streaming. At 72 hours of uploads per minute, though, the logistics of that sort of filtering only become more complicated.

    To be more precise, Mansfield calculated that it would cost $36,829,468,840 a year to employ the 199,584 moderators needed to screen the uploads – just shy of Google’s annual revenue.
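
    Mansfield’s working isn’t reproduced in the report, but his figures can be sanity-checked with a few lines of arithmetic. Below is a minimal sketch in Python; the 2,080-hour work year (40 hours × 52 weeks) is an assumption for illustration, not one of Mansfield’s published inputs.

    # Back-of-envelope check of Mansfield's figures, using the numbers
    # quoted above. The 2,080-hour work year is an assumption, not a
    # figure from Mansfield's own calculation.
    UPLOAD_RATE_HRS_PER_MIN = 72            # hours of video uploaded per minute
    MODERATORS = 199_584                    # Mansfield's headcount
    TOTAL_COST_USD = 36_829_468_840         # Mansfield's annual cost
    WORK_YEAR_HRS = 40 * 52                 # assumed 2,080-hour work year

    # Hours of new video YouTube receives in a year.
    video_hrs_per_year = UPLOAD_RATE_HRS_PER_MIN * 60 * 24 * 365
    print(f"Video uploaded per year: {video_hrs_per_year:,} hours")    # 37,843,200

    # Review capacity implied by the headcount, versus content volume.
    reviewer_hrs = MODERATORS * WORK_YEAR_HRS
    print(f"Reviewer-hours per content-hour: "
          f"{reviewer_hrs / video_hrs_per_year:.1f}")                  # ~11.0

    # Fully loaded annual cost per moderator implied by the total.
    print(f"Cost per moderator: ${TOTAL_COST_USD / MODERATORS:,.0f}")  # ~$184,531

    Under that assumed work year, Mansfield’s headcount works out to roughly eleven reviewer-hours for every hour of video, at about $184,531 per moderator per year; whether that reflects multiple reviews per clip or generous salary assumptions, the report doesn’t say.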

    Essentially, Mansfield’s equation points out that YouTube uploads are very likely never going to be screened, even if Google staffed the job through some of oDesk’s $1-per-hour branches.

  • Facebook Apologizes for Deleting Photos

    Facebook, presently under a scanning electron microscope as the smoke clears post-IPO, is now under fire for allegedly removing photos of a child with a congenital birth defect. Hundreds of users had been sharing photos of Grayson James Walker, who was born with anencephaly, a neural tube birth defect, only to have the pictures removed. His mother, Heather, knew the family would have only a short time with him, and arranged for a photographer to take pictures, some with the baby wearing a hat and some exposing the defect.

    Some of those who moderate this sort of content are oDesk workers whom Facebook employs, all over the world, for a dollar an hour. It seems evident that the more graphic photos of the child prompted the deletions. At last count, moderators police roughly 4 billion pieces of content posted by Facebook’s 900+ million users, removing material that mostly falls under the categories of pornography, racism and violence. After the photos of Grayson were deleted, Heather re-uploaded them, which led to a temporary ban from the site. Heather made a statement to KCTV in Kansas City: “They allow people to post almost nude pictures of themselves, profanity, and so many other things but I’m not allowed to share a picture of God’s beautiful creation.”

    Facebook responded, stating, “Upon investigation, we concluded the photo does not violate our guidelines and was removed in error.” The Social Network added, alluding to the moderation hovels it bankrolls worldwide, “a billion people share more than 300 million photos a day. Our policies are enforced by a team of reviewers in several offices across the globe – This team looks at hundreds of thousands of reports every week, and as you might expect, occasionally, we make a mistake and remove a piece of content we shouldn’t have. We extend our deepest condolences to the family and we sincerely apologize for any inconvenience.”

    What Facebook likely meant to say is that sometimes porn, racism and violence make it past the eyes of those workers making $1 an hour, and that said workers likely didn’t know what to do with some of the graphic photos in question.

  • Facebook Moderators Earn $1/hr

    In a recent interview, 21-year-old Amine Derkaoui described spending three weeks working in Morocco for oDesk, an outsourcing company used by Facebook, to moderate content. Derkaoui’s job, which paid roughly $1 per hour, was essentially to implement Facebook’s strange content standards – i.e., he was to delete any pictures of “cameltoes, moose knuckles, insides of skulls” or whatever other banned images were outlined in oDesk’s “Abuse Standards” operations manual. Derkaoui’s short career shed some light on a seedy facet of the social networking giant, which has been minting hundreds of new millionaires.


    Other moderators, primarily young, well-educated people working in Asia, Africa and Central America, all describe similar, ridiculously low salaries. Adam Levin, owner of British social network Bebo, says that the process of outsourcing is “rampant” across Silicon Valley. He adds, “we do it at Bebo. Facebook has so much content flowing into its system every day that it needs hundreds of people moderating all the images and posts which are flagged. That type of workforce is best outsourced for speed, scale and cost.”

    About 4 billion pieces of content are shared every day among Facebook’s 845 million users. Most falls within acceptable standards, but a lot also falls into the categories of pornography, racism and violence – all of which is policed by an outsourced workforce in third-world countries, for $1 an hour. Graham Cluley of Sophos calls Silicon Valley’s outsourcing culture a “poorly kept dirty secret.” Levin estimates that Facebook employs between 800 and 1,000 workers through oDesk, about a third of its “regular” staff.
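
    Those numbers only square with Levin’s headcount because the moderators review flagged reports, not the whole firehose. Here is a rough consistency check, assuming 300,000 reports per week as a stand-in for the “hundreds of thousands” Facebook cites elsewhere on this page, and 900 workers as the midpoint of Levin’s estimate; both inputs are assumptions, not reported figures.

    # Rough workload check: moderators handle flagged reports, not all content.
    # 300,000 reports/week is an assumed stand-in for Facebook's vague
    # "hundreds of thousands"; 900 is the midpoint of Levin's 800-1,000 range.
    REPORTS_PER_WEEK = 300_000
    MODERATORS = 900
    ITEMS_SHARED_PER_DAY = 4_000_000_000    # per the article

    reports_each_per_day = REPORTS_PER_WEEK / MODERATORS / 7
    print(f"Flagged reports per moderator per day: {reports_each_per_day:,.0f}")  # ~48

    # Versus the impossible alternative of pre-screening everything:
    print(f"Items per moderator if screening it all: "
          f"{ITEMS_SHARED_PER_DAY / MODERATORS:,.0f} per day")                    # ~4,444,444

    Even under generous assumptions, screening everything would mean millions of items per worker per day, which is why only flagged content ever gets human eyes.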

    With Facebook mainly consisting of acquaintances explaining pictures of their breakfast, photos of countless new babies that all look roughly the same, friend requests from strangers users knew for a day 15 years before, generalized misrepresentation of one’s actual life and face, etc., it is interesting that the actual moderators of all of this content don’t even undergo criminal background screening. According to Derkaoui, his past wasn’t looked into, and there were no security measures stopping him from obtaining user information, nor any barrier blocking him from uploading whatever he wanted onto Facebook himself.

    Regardless, Facebook has a statement on the matter – “these contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service. No user information beyond the content in question and the source of the report is shared. All decisions made by contractors are subject to extensive audits.” I tend to go with what Derkaoui said.

    Still, I find it hard to believe that any revelations about the reality of Facebook’s weak privacy standards will prompt more than a handful of its 845 million users to actually quit.

  • Facebook Content Standards: Arty Nudity Okay; ‘Moose Knuckles’ Not So Much

    The next time you feel like photoshopping the blown-apart head of one of your enemies and uploading it to Facebook, know that the site’s moderators are perfectly fine with that – just make sure the picture doesn’t actually depict any of the insides of that head.

    That doesn’t make sense, but it doesn’t matter, because that’s the law of the land according to the manual used by the live content moderators who comb through Facebook’s gobs of material to decide what gets deleted and what stays. The manual, which the super-sleuths at Gawker were able to get their hands on, contains a dizzying list of guidelines that determine what content moderators are to leave posted on Facebook, what they need to delete, and the stuff in the middle that requires Facebook’s input. Facebook has to be consulted on those latter matters because it isn’t actually involved in the immediate process of content moderation. As in, Facebook doesn’t moderate the content in-house. Instead, it outsources the menial task to a company called oDesk, which pays people a commission to pore over Facebook in search of speech and images uploaded by users and make sure none of it violates Facebook’s community standards.
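
    In other words, every flagged item resolves one of three ways: it stays up, it gets deleted, or it gets kicked upstairs to Facebook. Below is a minimal Python sketch of that triage flow; the rule names are hypothetical placeholders for illustration, not categories from the actual manual.

    from enum import Enum

    class Action(Enum):
        KEEP = "leave posted"
        DELETE = "remove"
        ESCALATE = "send to Facebook for a decision"

    # Hypothetical rule tags for illustration only; the real manual's
    # categories (e.g. "Sex and Nudity") are far more detailed.
    DELETE_RULES = {"sexual_content", "graphic_insides", "credible_threat"}
    ESCALATE_RULES = {"ambiguous_context", "possibly_newsworthy"}

    def triage(rule_hits: set) -> Action:
        """The three-way decision the article attributes to oDesk
        moderators: delete clear violations, escalate edge cases,
        otherwise leave the content alone."""
        if rule_hits & DELETE_RULES:
            return Action.DELETE
        if rule_hits & ESCALATE_RULES:
            return Action.ESCALATE
        return Action.KEEP

    # A crushed-head photo with "no insides showing" stays up, per the manual.
    print(triage({"violent_imagery"}))   # Action.KEEP
    print(triage({"graphic_insides"}))   # Action.DELETE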

    The gruesome yet confusing photo I described at the beginning of this article is actually an acceptable image according to the abuse standards used by the moderators. Here’s a sample of what else you can and can’t get away with on Facebook:

  • Art nudity is okay.
  • “Bodily fluids (except semen) are ok to show unless a human being is captured in the process (to confirm as cyberbullying).”
  • No pics of people on the can or using a urinal. Or an alleyway, for that matter.
  • No pictures of “camel toes or moose knuckles” – yes, those are the exact words included in the manual. It’s on page 4 under “Sex and Nudity.”
  • Breastfeeding is still forbidden.
  • Male nipples are okay, but not female nipples.
  • Violent speech, such as “I love hearing skulls crack,” is no good. However, images of crushed heads are perfectly fine so long as “no insides are showing.”
  • “Users may not describe sexual activity in writing, except when an attempt at humor or insult.” (??)
  • No threatening people such as police officers or state officials by photoshopping “cross-hairs of a gun sight” onto their images.
  • Pot depiction is okay, just so long as you aren’t selling, buying, or growing it.
  • Have a look at the full Abuse Standards document below to see what else will get you flagged and what else will just make you an offensive jerk to all of your Facebook friends.

    Abuse Standards 6.2 – Operation Manual