WebProNews

Tag: censorship

  • ‘Right To Be Forgotten’ Dangerous, According To Web’s Inventor

    A lot of people (especially those not trying to hide information about themselves) agree that the Right to Be Forgotten in Europe is problematic for a variety of reasons, including the censorship of information.

    The latest to speak out against the current situation is none other than Tim Berners-Lee, the guy responsible for the World Wide Web. Via CNET:

    “This right to be forgotten — at the moment, it seems to be dangerous,” Berners-Lee said Wednesday, speaking here at the LeWeb conference. “The right to access history is important.”

    In a wide-ranging discussion at the conference, Berners-Lee said it’s appropriate that false information should be deleted. Information that’s true, though, is important for reasons of free speech and history, he said. A better approach to the challenge would be rules that protect people from inappropriate use of older information. An employer could be prohibited from taking into account a person’s juvenile crimes or minor crimes more than 10 years old, for example.

    The EU recently put forth some guidelines on the right to be forgotten for search engines to work with, though they don’t go very far toward quelling the biggest concerns many have with the ruling, such as those raised by Berners-Lee.

    Image via Wikimedia Commons

  • Amy Schumer Makes “Pussy” Okay On Comedy Central

    It is now okay to say “pussy” on Comedy Central. All thanks to Amy Schumer, of Inside Amy Schumer fame, as well as the show’s executive producers. During the second season, the slang for female genitalia became a point of contention due to the fact that “dick” and other terms for the male sex organ are used freely in other Comedy Central shows including The Colbert Report and South Park.

    On Saturday, November 8, Schumer and the show’s executive producers attended the New York Comedy Festival to talk about the show to a sold-out crowd at the Paley Center. Executive producer Jessi Klein brought up the topic of their fight to say “pussy” on air and referred to their victory as a “great moment in U.S. history.”


    In the end, it was Inside Amy Schumer executive producer Dan Powell who took action. According to Powell, “Halfway through the first season, we started to realize that a lot of the show was addressing women’s issues and gender politics. I’d written a letter, sort of like I’d write to my congressman, and I guess it struck a chord.” Schumer calls the letter Powell’s “Mr. Smith Goes to Washington” moment.

    Of course, in true Inside Amy Schumer fashion, they used the word in a sketch involving animated meerkats that satirizes Hollywood’s treatment of female celebrities.

    Amy Schumer is a stand-up comedian who joined the fifth season of NBC show Last Comic Standing, a stand-up comedy competition, where she finished fourth. Back in 2010, Schumer did her first Comedy Central special and this marked the beginning of her partnership with the company. She appeared in the Comedy Central roasts for Charlie Sheen and Roseanne Barr and in June 2012, she started working on a show for the network which would eventually become Inside Amy Schumer.

    Schumer will next be seen in Trainwreck, a film she co-wrote with director Judd Apatow.

  • Regarding James Foley, Should Twitter Decide What You See?

    Twitter is absolutely, one-hundred percent, within its rights to remove whatever content it deems to be in violation of its rules and terms of service. Twitter can do this, and you’re not allowed to cry free speech! Twitter is not your government, and you are not guaranteed the right to freely express yourself on the social network. Twitter is a private company, and by using its services, you agree to play by its rules. As frustrating as this can be at times, this is a simple and inarguable reality. Twitter can censor and remove whatever it wants.

    Can. Twitter can, and Twitter has. Should is an entirely different thing. Should is what we can debate.

    Should Twitter remove images and suspend accounts associated with spreading images and video of the horrific beheading of American journalist James Foley? I say no. Let’s discuss.

    Should you watch members of the radical group ISIS murder James Foley? I don’t know. I honestly don’t know the answer to that and am rather torn myself. On one hand, we have an obligation to educate ourselves about what’s happening in the world – the type of brutality that permeates. Shying away from that brutality, at its most gut-wrenching expression, isn’t going to make the problem go away. James Foley died so that you could see. Don’t we owe it to him? To me, this argument resonates.

    On the flip side – why? Why put yourself through something that’s beyond upsetting? I remember watching Daniel Pearl beheaded by Al-Qaeda in 2002 – do you? That, and I’m only saying this because I honestly can’t think of more apt words to describe it, was severely fucked up.

    What does it change? What’s the point? Isn’t that what ISIS wants? Don’t you think they want the world to watch this video, pore over the images, and collectively recoil in horror? Shouldn’t we simply shun this propaganda?

    Clearly, I’m conflicted.

    But Twitter shouldn’t be conflicted about this. Twitter, whose most important reason for existing is the unfiltered spread of real-time news and information, should let me and you decide what we see.

    – – – – – – – – – – –

    Early this morning, Twitter CEO Dick Costolo tweeted this:

    This came less than 12 hours after Twitter decided to enact a new policy concerning images of the deceased, as Twitter public policy’s Nu Wexler outlines here:

    Later, in a tweet to GigaOm’s Mathew Ingram, Wexler suggested that the reason for the site-wide search and destroy mission on any account posting images or videos of Foley’s execution had to do with a request from his family, per the new policy.

    It’s important to note that Twitter is not being that discerning in deciding which accounts to suspend – journalists, regular users, and accounts thought to be associated with ISIS were all shut down. Some have been reinstated, some haven’t.

    That new policy, enacted on Tuesday, allows the family of a deceased individual to petition Twitter to remove images “from when critical injury occurs to the moments before or after death.” The Foley video and images clearly satisfy these criteria.

    But Twitter also says that “when reviewing such media removal requests, Twitter considers public interest factors such as the newsworthiness of the content and may not be able to honor every request.”

    What could possibly be more newsworthy than these images?

    FYI, Twitter’s terms of service state that “users are allowed to post content, including potentially inflammatory content, provided they do not violate the Twitter Rules and Terms of Service.” Twitter’s rules do not outlaw violent images. Twitter bars “direct, specific threats of violence against others,” but not simply violent content.

    – – – – – – – – – – –

    There’s a movement on Twitter right now, under the hashtag #ISISmediablackout, in which people are pledging not to share, link to, tweet about, or generally give any attention to ISIS’ clear attempts at propaganda. That’s a completely reasonable choice to make. Personally, I won’t be sharing the James Foley imagery.

    But this is just one side of a complicated issue. Twitter, like it or not, is many people’s go-to place for the news. It’s the fastest-growing disseminator of information in the world. People rely on Twitter.

    Is an image of James Foley moments before being beheaded the end-all, be-all of news coverage? No. The story can be told without the image, and without the video. But it is newsworthy. As depraved as it is, it is a crucial element in an event that’s dominating global conversation. Twitter should let its users decide if it’s important enough to view.

    The “should Twitter censor offensive content” question isn’t new. One could ask the same question of Facebook or YouTube, both of which do plenty of that.

    And we should ask that question of any corporation that has a huge influence on what information we see and how we see it. Should you watch James Foley’s execution? I don’t know. Do you want Twitter deciding for you? I don’t think so.

  • Telegraph, Like Wikipedia, Keeps List Of Articles ‘Forgotten’ By Google

    The “right to be forgotten” mess continues to get even messier. At least one newspaper is keeping a list of its own articles that have been removed from Google search results because of the law, while also writing new articles about those removals.

    So here’s an example of not only why the law is inherently flawed, but also of how much time it’s wasting on pretty much everybody’s part.

    The Daily Telegraph, as described by Danny Sullivan at Marketing Land, has been on a “campaign to document all its stories that have been removed” as a result of the law. The Telegraph’s Matthew Sparkes even tweeted about how he’s spending his time (which would no doubt be better used reporting actual news).

    The list referenced in that last tweet contains eight bullet points about articles and images removed.

    Similarly, Wikipedia is keeping a running tab of stories that have been removed by Google.

    In other words, people requesting articles be removed only seem to be drawing more attention to the fact that they’ve done so, which seems to defeat the entire purpose. Shocking, right?

    For more background on the “right to be forgotten” and Google’s role, peruse our coverage here.

    Image via Google

  • Google On Complexity Of ‘Right To Be Forgotten’

    As previously reported, Google (as well as Microsoft and Yahoo) attended a meeting last week with EU regulators to discuss the “right to be forgotten” ruling and the search engines’ approach to handling it.

    Each of the companies was given a questionnaire (via The New York Times), asking about various aspects of their practices related to complying with the ruling. Google’s has been made publicly available, and in it, the company discusses complications it faces.

    Asked about criteria used to balance the company’s own economic interest and/or the interest of the general public in having access to info versus the right of the data subject to have search results delisted, Google said:

    The core service of a search engine is to help users find the information they seek, and thus it is in a search engine’s general economic interest to provide the fastest, most comprehensive, and most relevant search results possible. Beyond that abstract consideration, however, our economic interest does not have a practical or direct impact on the balancing of rights and interests when we consider a particular removal request.

    We must balance the privacy rights of the individual with interests that speak in favour of the accessibility of information including the public’s interest to access to information, as well as the webmaster’s right to distribute information. When evaluating requests, we will look at whether the search results in question include outdated or irrelevant information about the data subject, as well as whether there’s a public interest in the information.

    In reviewing a particular removal request, we will consider a number of specific criteria. These include the individual (for example, whether an individual is a public figure), the publisher of the information (for example, whether the link requested to be removed points to material published by a reputable news source or government website), and the nature of the information available via the link (for example, if it is political speech, if it was published by the data subject him- or herself, or if the information pertains to the data subject’s profession or a criminal conviction).

    Each criterion, the company continued, has its own “potential complications and challenges”. It then proceeded to list these examples:

    • It is deemed legitimate by some EU Member States that their courts publish rulings that include the full names of the parties, while courts in other Member States anonymise their rulings before publication.
    • The Internet has lowered the barrier to entry for citizen journalists, making it more difficult to precisely define a reputable news source online than in print or broadcast media.
    • It can be difficult to draw the line between significant political speech and simple political activity, e.g. in a case where a person requests removal of photos of him- or herself picketing at a rally for a politically unpopular cause.

    As previously assessed, it’s a real mess.

    Google says in the document that it has not considered sharing delisted search results with other search engines, adding, “We would note that sharing the delisted URLs without further information about the request would not enable other search engine providers to make informed decisions about removals, but sharing this information along with details or a copy of the complaint itself would raise concerns about additional disclosure and data processing.”

    For some reason, I’m reminded of that time Google accused Bing of stealing its search results.

    You can read Google’s full questionnaire responses here.

    As of July 18th, Google had received over 91,000 removal requests involving over 328,000 URLs. Earlier this week, Google announced dates for presentations to its Advisory Council, aimed at evolving the public conversation and informing ongoing strategy.

    Image via Google

  • Google Announces ‘Right To Be Forgotten’ Tour 2014

    Google has released a schedule for presentations from “experts” on the “right to be forgotten,” which will take place throughout the fall. Consider it Google’s Right to be Forgotten Tour 2014 (I hope there are t-shirts).

    The company recently announced the formation of its Advisory Council on the subject, which stems from a ruling by the Court of Justice of the European Union, saying that search engines must provide people in the EU with a means of requesting content about them be removed from search results. You can get caught up on the whole mess here, but suffice it to say, it’s been a controversial battle between privacy and censorship. Many questions and concerns remain, which is precisely why Google is holding these “in-person public consultations”.

    The schedule is as follows:

    September 9 in Madrid, Spain
    September 10 in Rome, Italy
    September 25 in Paris, France
    September 30 in Warsaw, Poland
    October 14 in Berlin, Germany
    October 16 in London, UK
    November 4 in Brussels, Belgium

    “The Council welcomes position papers, research, and surveys in addition to other comments,” says Betsy Masiello, Google Secretariat to the Council. “We accept submissions in any official EU language. Though the Council will review comments on a rolling basis throughout the fall, it may not be possible to invite authors who submit after August 11 to present evidence at the public consultations.”

    There’s a form here, for those who wish to voice their concerns and be considered for presentation.

    Last week, EU regulators held a meeting with the search engines about the subject, where Google was said to have disclosed that it had removed over 50% of the URLs requested, rejected over 30%, and requested additional info in 15% of cases. It had received requests from 91,000 people to remove 328,000 URLs through July 18.

    More on Google’s Advisory Council here.

    Image via Google

  • EU To Hold Meeting On ‘Right To Be Forgotten’ Next Week

    It doesn’t seem like the whole mess that is the “right to be forgotten” is going to be thoroughly sorted out anytime soon, as regulators in Europe are now taking issue with Google’s implementation of the rules it is being forced to adopt.

    The Wall Street Journal is reporting that EU privacy officials have called a meeting for next Thursday in Brussels with the major search engines to discuss things further. According to the report, Microsoft has confirmed that it will attend, while Google and Yahoo have said they’ll cooperate with officials but haven’t confirmed attendance for the specific meeting.

    Regulators in Germany, it says, are concerned that Google isn’t removing search results from Google.com in the same way that it is with its EU-specific sites. Likewise, the director of a French watchdog says this puts the effectiveness of the whole thing into question.

    You don’t say.

    Other areas of concern include: cases that end up having the opposite effect of the right to be forgotten, as stories are written about their very involvement with this whole larger story; and the manner in which Google is notifying publishers when their content is being hidden in search results.

    It will be surprising if Google doesn’t end up attending the meeting, as it is obviously affected greatly by this whole thing, and the whole world is watching. It’s no surprise that Microsoft has confirmed its attendance, as it has been talking about implementing its version of the “right to be forgotten” feature on Bing in recent weeks, but has admitted it’s been a difficult process. In fact, some have criticized Google for complying so quickly while Bing is taking its time. Yahoo is said to be readying its own version as well, but we haven’t heard much from the company on the matter.

    It will be interesting to see what kind of progress is made next week, if any.

    Image via Google

  • Google Acknowledges What A Mess This Is

    The whole “right to be forgotten” thing is an absolute mess, and Google knows it. Google always knew it would be, which is why it always opposed the concept, but now it has no choice but to comply with the law. The company is still being vocal in its opposition, while also trying to make people understand the difficult job it’s faced with, and why it’s going to make mistakes. From the sound of it, Google seems to be acknowledging that mistakes will continue to be made as it struggles with figuring out what it should be censoring from search results and what it should not.

    I know I wouldn’t want to be in the position of having to make that call. Do you think Google is doing a reasonable job of handling its role in the court’s decision? Share your thoughts in the comments.

    I’m going to assume that you’re at least somewhat familiar with what’s going on. If not, peruse these articles on the saga. In a very basic nutshell, the Court of Justice of the European Union ruled that search engines must take requests for content to be removed from search results when it’s “inadequate, irrelevant or no longer relevant, or excessive” in relation to the person being searched for.

    Google has to remove search results, while the actual content may remain on the sites where published. The company equates this to keeping a book in a library, but removing it from the card catalog.

    The search engine has only actually been removing content from search results for a couple of weeks now, but there have already been (unsurprisingly) controversial examples of removed results, some of which Google has now admitted it shouldn’t have removed.

    David Drummond, Google’s Chief Legal Officer and Senior Vice President of Corporate Development, wrote an article for The Guardian, which the company has re-posted to its official blog, discussing the challenges it faces, and the errors it has already made.

    Google has a team of people tasked with reviewing applications for content to be removed. According to Drummond, most of these come with very little information and “almost no context”. So Google, who shouldn’t be forced to make such judgments to begin with, has to make judgments about censorship with very little to go on. Like I said, I wouldn’t want to be in that position.

    Drummond writes, “The examples we’ve seen so far highlight the difficult value judgments search engines and European society now face: former politicians wanting posts removed that criticise their policies in office; serious, violent criminals asking for articles about their crimes to be deleted; bad reviews for professionals like architects and teachers; comments that people have written themselves (and now regret). In each case someone wants the information hidden, while others might argue that it should be out in the open.”

    “When it comes to determining what’s in the public interest, we’re taking into account a number of factors,” he adds. “These include whether the information relates to a politician, celebrity or other public figure; if the material comes from a reputable news source, and how recent it is; whether it involves political speech; questions of professional conduct that might be relevant to consumers; the involvement of criminal convictions that are not yet ‘spent’; and if the information is being published by a government. But these will always be difficult and debatable judgments.”

    He says Google is doing its best to be transparent about removals. As you may know, Google is showing the following statement on some search results pages in the EU:

    Some results may have been removed under data protection law in Europe. Learn more.

    Google says it will also include the requests in its transparency report, and will continue to notify publishers and webmasters when their pages have been pulled from results, but as Drummond notes, it can’t include specific information about why such pages were removed because it would violate the privacy rights of the individual in question. That’s part of the court’s ruling.

    “Of course, only two months in our process is still very much a work in progress,” says Drummond. “It’s why we incorrectly removed links to some articles last week (they’ve since been reinstated). But the good news is that the ongoing, active debate that’s happening will inform the development of our principles, policies and practices – in particular about how to balance one person’s right to privacy with another’s right to know.”

    That’s a semi-optimistic view, but you have to wonder how frequently Google will continue to “incorrectly remove” links. As of Drummond’s writing, Google had received over 70,000 take-down requests spanning 250,000 web pages just since May. We can only assume that they’ll continue to pour in for the foreseeable future.

    The company has set up an “advisory council of experts,” from outside of Google to advise Google on the issues at hand. These individuals come from the media, academia, data protection, civil society, and the tech sector, and are asking for evidence and recommendations from various groups, while holding public meetings across Europe to “examine these issues more deeply.”

    The experts will make a report available to the public, and it will include recommendations for “particularly difficult” removal requests like criminal convictions, as well as thoughts on the implications of the ruling for users, publishers, search engines, etc. It will also contain steps to improve accountability and transparency, according to Google.

    “The issues at stake here are important and difficult, but we’re committed to complying with the court’s decision,” Drummond concludes. “Indeed, it’s hard not to empathise with some of the requests that we’ve seen – from the man who asked that we do not show a news article saying that he had been questioned in connection with a crime (he’s able to demonstrate that he was never charged) to the mother who requested that we remove news articles for her daughter’s name as she had been the victim of abuse. It’s a complex issue, with no easy answers. So a robust debate is both welcome and necessary as, on this issue at least, no search engine has an instant or perfect answer.”

    Google has been much quicker than its search engine peers to comply with the court’s ruling, but has also been criticized for just that.

    Do you think Google jumped into this too quickly, or should it have taken its time like Yahoo and Bing? Should the whole thing be handled differently? Share your thoughts in the comments.

    Image via Google

  • As Google Begins to Forget, Do You Really Have a ‘Right to Be Forgotten’?

    The internet doesn’t forget. Ask anyone who’s been punished by the brutal truth of the web. The internet doesn’t care that you didn’t really mean what you said in that tweet. The internet doesn’t care that you were super drunk in college and that was a one time thing. The internet doesn’t care that you’re a different person now, and the past is the past.

    If you’ve done it, there’s some record of it online. The internet never forgets.

    But that doesn’t mean that search engines, the tour guides of the internet, have the same ironclad memory. And thanks to a recent court ruling, search engines are set to be forced into forgetting. It’s a digital lobotomy – and it’s just beginning.

    Should Google be forced to remove search results upon request? Let us know in the comments.

    In May, The Court of Justice of the European Union handed down a controversial ruling regarding search results and requests to remove them.

    “An internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties…Thus, if, following a search made on the basis of a person’s name, the list of results displays a link to a web page which contains information on the person in question, that data subject may approach the operator directly and, where the operator does not grant his request, bring the matter before the competent authorities in order to obtain, under certain conditions, the removal of that link from the list of results,” said the Court.

    And thus, the so-called “right to be forgotten” was born. Basically, the ruling makes Google and other search engines responsible for removing results at the request of individuals – in some cases. The decision to remove search results will be up to the search engines, but if agreements cannot be made between search engine and petitioners, then off to court they’ll go.

    More from the ruling:

    So far as concerns, next, the extent of the responsibility of the operator of the search engine, the Court holds that the operator is, in certain circumstances, obliged to remove links to web pages that are published by third parties and contain information relating to a person from the list of results displayed following a search made on the basis of that person’s name. The Court makes it clear that such an obligation may also exist in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.

    The Court points out in this context that processing of personal data carried out by such an operator enables any internet user, when he makes a search on the basis of an individual’s name, to obtain, through the list of results, a structured overview of the information relating to that individual on the internet. The Court observes, furthermore, that this information potentially concerns a vast number of aspects of his private life and that, without the search engine, the information could not have been interconnected or could have been only with great difficulty. Internet users may thereby establish a more or less detailed profile of the person searched against. Furthermore, the effect of the interference with the person’s rights is heightened on account of the important role played by the internet and search engines in modern society, which render the information contained in such lists of results ubiquitous. In the light of its potential seriousness, such interference cannot, according to the Court, be justified by merely the economic interest which the operator of the engine has in the data processing.

    It’s an interesting take on what a search results page really is – an “interconnector” of information. In a way, the court said that Google kind of creates its own content, its own narrative about any given person through a search results page for said person. But more on this later.

    This “right to be forgotten” has been the cry of many for years. Naturally, people don’t want every little thing they’ve ever done and every little thing they’ve ever been associated with appearing in a basic Google search for their name. This particular ruling from the EU court stems from the case of Mario Costeja, of Spain, who complained that an auction notice for his repossessed home, a matter long since resolved, continued to show up in Google search results, infringing upon his privacy.

    As you would imagine, Google’s argument is that being forced to remove certain search results simply because an individual doesn’t like them amounts to censorship.

    “[The ruling is] disappointing…for search engines and online publishers in general,” said Google of the ruling.

    But they complied, and soon the requests began to flow – 12,000 of them even before Google launched a reporting tool for concerned parties to air their search result grievances. After that, the numbers skyrocketed.

    “In evaluating your request, we will look at whether the results include outdated information about your private life. We’ll also look at whether there’s a public interest in the information remaining in our search results—for example, if it relates to financial scams, professional malpractice, criminal convictions or your public conduct as a government official (elected or unelected). These are difficult judgements and as a private organization, we may not be in a good position to decide on your case. If you disagree with our decision you can contact your local DPA,” explains Google.

    Apparently, Google agreed with the petitioners on some requests, and now, Google is starting to forget.

    “This week we’re starting to take action on the removals requests that we’ve received,” a Google spokesman said. “This is a new process for us. Each request has to be assessed individually, and we’re working as quickly as possible to get through the queue.”

    Google has begun to remove search results and add disclaimers at the bottom of results pages – basically saying that some search results may have been removed in order to comply with EU law.

    Google had this to say:

    We look forward to working closely with data protection authorities and others over the coming months as we refine our approach. The CJEU’s ruling constitutes a significant change for search engines. While we are concerned about its impact, we also believe it’s important to respect the Court’s judgment and are working hard to devise a process that complies with the law.

    When you search for a name, you may see a notice that says that results may have been modified in accordance with data protection law in Europe. We’re showing this notice in Europe when a user searches for most names, not just pages that have been affected by a removal.

    Clever. I may be a little heavy-handed in my reading of this, but to me it sounds like Google’s subtle way of expressing its disappointment in the EU’s ruling. Instead of simply adding that disclaimer to pages where they’ve actually yanked results, Google wants users to know on every page that the man is holding them down…man. You have incomplete search results, and you know whose fault it is.

    Whether that’s the case or not is moot. The salient aspect of this whole issue is that it’s part of a bigger trend – one that might be out of Google’s control. Google can continue to cry censorship and express “disappointment” in rulings that hamper its ability to provide complete search results – but the world seems to be turning against the company in this regard.

    In late 2012, an Australian high court likened Google to a publisher, saying,

    “Google Inc is like the newsagent that sells a newspaper containing a defamatory article. While there might be no specific intention to publish defamatory material, there is a relevant intention by the newsagent to publish the newspaper for the purposes of the law of defamation.”

    Instead of simply being a ‘link-lister’, Google was deemed a publisher of the publishers, of sorts. The distinction was made even muddier when the court, ruling on a lawsuit in which a man sued Google for associating his name and image with (untrue) claims of ties to organized crime, talked about Google Image results being a “cut and paste creation” – as in content created by Google.

    Here, Google was seen as publisher and therefore liable for defamation.

    “It follows that, in my view, it was open to the jury to conclude that Google Inc was a publisher – even if it did not have notice of the content of the material about which complaint was made. Google Inc’s submission to the contrary must be rejected. However, Google Inc goes further and asserts that even with notice, it is not capable of being liable as a publisher ‘because no proper inference about Google Inc adopting or accepting responsibility complained of can ever be drawn from Google Inc’s conduct in operating a search engine,’” said the court.

    And then there are the various autocomplete woes, wherein Google has been fined for their autocomplete suggestions. We’ve seen this happen all across the world – France, Japan, Italy, and more. The fact that Google’s autocomplete results are not manual, carefully chosen and suggested straight from the brains of Googlers (and are instead based on algorithms and search frequency) hasn’t stopped international courts from finding Google responsible for what it suggests in any given search.
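The paragraph above notes that autocomplete suggestions come from algorithms and search frequency rather than editorial choices. A minimal sketch of that idea, with an invented query log (the data and function names are illustrative only; the real system is vastly more complex):

```python
from collections import Counter

# Hypothetical log of past queries; frequency drives the suggestions.
query_log = [
    "paris weather", "paris weather", "paris hotels",
    "paris weather", "paris hotels", "paris metro map",
]

def autocomplete(prefix: str, log: list[str], k: int = 3) -> list[str]:
    """Suggest the k most frequent past queries starting with prefix."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

print(autocomplete("paris", query_log))
# ['paris weather', 'paris hotels', 'paris metro map']
```

Nothing in this pipeline involves a human choosing what to suggest, which is precisely why courts holding Google responsible for the output is such a notable legal stance.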

    The common thread between all of these cases, including the most recent “right to be forgotten” ruling, is that Google is ultimately responsible for what it provides in a search.

    You have to imagine that this is just the beginning, and Google’s fairly weak resistance to the EU court’s decision means that the “right to be forgotten” may soon become a more universal right.

    Do you really have the right to be forgotten? Or is this censorship, plain and simple? Let us know in the comments.

    Image via Google

  • Facebook Frees the Nipple, Relaxes Breastfeeding Policy

    If you’re a mother who wants to post photos of you breastfeeding your child on Facebook, the company has finally decided that it’s a-ok in their book.

    Well, in the majority of cases.

    Facebook has made a slight alteration to their policy on breastfeeding photos. In a help post asking “Does Facebook allow photos of mothers breastfeeding?”, Facebook has changed their wording a bit. Here’s what it now reads:

    Yes. We agree that breastfeeding is natural and beautiful and we’re glad to know that it’s important for mothers to share their experiences with others on Facebook. The vast majority of these photos are compliant with our policies.

    Please note that the photos we review are almost exclusively brought to our attention by other Facebook members who complain about them being shared on Facebook.

    In reality, that stance sounds very similar to the stance Facebook has had for years. But here’s the unspoken change in Facebook’s policy now and Facebook’s policy a few weeks ago –

    For a long time, Facebook allowed non-sexual female nudity in the context of breastfeeding – but only if the only nipple exposed was the one nursing the baby. If the photo had a non-babied exposed nipple, it was ripe for moderation and would likely be yanked.

    Now, Facebook is freeing the nipple, if you will, as long as the photo is still in the context of breastfeeding.

    If it seems ridiculous that we have to discuss just how exposed certain nipples are in a photo, that’s because it is. It’s ridiculous that this has to be a thing – but Facebook’s breast phobia is well-documented. For years, breastfeeding activists have lamented Facebook’s photo removal process. The issue finally came to a head in early 2012, when Facebook gave a lengthy statement where they said that they place limitations on nudity “due to the presence of minors on the site.”

    “On some occasions, breastfeeding photos contain nudity — for example an exposed breast that is not being used for feeding — and therefore violate our terms. When such photos are reported to us and are found to violate our policies, the person who posted the photo is contacted, and the photos are removed,” said Facebook at the time.

    And that’s precisely the fine print that seems to have been removed.

    One blogger and #FreeTheNipple activist decided to test Facebook’s new policy on breastfeeding photos. After Facebook’s moderation team initially removed the photo, it was later reinstated with the typical apology and “it was removed in error” explanation.

    While breastfeeding activists will see this as a victory – and they should – it’s important to remember that Facebook outsources most of its content moderation, and is really kind of bad at it. Oftentimes, things that are perfectly Facebook-legal wind up removed and users find themselves with temporary bans for posting ‘obscene’ content. In most cases, Facebook apologizes and the natural order is restored.

    But the point is – you’re still going to see breastfeeding photos removed in error. They’ll likely get restored, but Facebook’s content moderation setup simply isn’t sophisticated enough to go 100 percent error free.

    Plus, if you go back and reread their updated policy, you’ll find plenty of wiggle room.

  • Google To Include ‘Right To Be Forgotten’ Requests In Transparency Report

    The Court of Justice of the European Union recently ruled that Google and other search engines must take requests for search results to be deleted in what has become known as the “right to be forgotten”. Google then made a request tool available, and immediately started averaging about 10,000 requests per day.

    That number may dwindle a bit once the tool has been around for a while, but it’s clear that a lot of people aren’t happy with the search results that are out there about them, and intend to see these results removed from public view. Google and other search engines may ultimately not comply with such requests – they’ll be evaluated on a case-by-case basis – but we’ll at least get a good idea of what the numbers look like from time to time.

    The Guardian is reporting that Google intends to include the “right to be forgotten” requests in its bi-annual transparency reports, which have historically shown takedown request numbers related to government and copyright requests, as well as requests for information about users.

    The company describes its report as “data that sheds light on how laws and policies affect Internet users and the flow of information online.”

    The Guardian also reiterates that Google will likely alert users on search results pages when content has been removed. Josh Halliday reports:

    The search engine is considering placing an alert at the bottom of each page where it has removed links in the wake of the landmark “right to be forgotten” ruling last month.

    It is understood Google is planning to flag censored search results in a similar way to how it alerts users to takedown requests over copyright infringing material. For example, a Google search for “Adele MP3” shows that it has removed a number of results from that page after receiving complaints under the US Digital Millennium Copyright Act.

    Search reporter Danny Sullivan spoke to Google about its plans recently, and also said that Google would show users when content has been removed.

    Ironically, by doing so, Google will show users that the subject of their search has something to hide – something so bad that they’ve gone to great lengths to have it removed from Google search results. In some cases, this could be just as damaging to the subject’s reputation as if the results had been left there in the first place. At least if the results were there, the searcher would know what they were dealing with.

    More on how Google’s new tool works here.

    Image via Google

  • Jeff Dunham Forced to Censor His Act in Malaysia

    Despite past controversy and the ire of many other comedians, Jeff Dunham continues to draw huge crowds across the U.S. and the world. The comedian, best known for using ventriloquist puppets as part of his act, has now run up against government censorship due to his caricature of Islam.

    Dunham, who most recently drew crowds to his “All Over the Map” world tour, was forced by the Malaysia Ministry of Culture and Arts to alter his show for an appearance in that country.

    According to a statement released by Dunham, the ministry’s demands involved the Achmed the Dead Terrorist puppet, which Dunham uses to send up an extremely exaggerated stereotype of Middle Eastern terrorists. The puppet is one of Dunham’s most popular and is also one of the puppets Dunham uses to serve as a foil for racially or religiously sensitive comedy. The comedian describes Achmed as an “inept suicide bomber whose love of life harshly conflicts with his own profession.”

    For the show to go on, Malaysian authorities requested that Achmed’s entire likeness and name be changed. In addition, Dunham would have to censor his act of any references to virgins or any other overtly religious topics.

    Though he later stated that he is a proponent of free speech, Dunham did accede to the Ministry of Culture. The comedian changed the Achmed puppet and re-wrote portions of his act in the hours before his performance, re-casting the puppet as Achmed’s brother. Dunham said he now considers the performance one of the highlights of his tour. He also emphasized that Achmed went over very well in both the United Arab Emirates and Israel.

    “While I respect the wishes of our host country, I’m also an American and a firm believer in the freedom of speech,” said Dunham. “On the other hand, I wanted to avoid Achmed and I getting thrown in jail and being caned.

    “So let’s just say that the character that took center stage in Malaysia was strikingly familiar to all in attendance, but it was a last minute twist that became the highlight of the entire tour.”

    Image via Youtube

  • Report: Google To Have ‘Right To Be Forgotten’ Tool Up In 2 Weeks

    Earlier this week, the Court of Justice of the European Union ruled that Google and other search engines must take requests from people for search results to be deleted. It’s up to the search engines to determine whether or not to comply with such requests. If Google, for example, feels there is a legitimate reason for a particular result to be removed, it can do so. If it doesn’t, it might have to go to court on a case-by-case basis when individuals are willing to fight it.

    The Wall Street Journal is now reporting, citing German privacy officials, that Google will “create a mechanism for German users to request the removal of links to information about them from the company’s popular search engine within the next two weeks.”

    In the past, Google has been very vocal about its opposition to removing legal content from search results. The company considers this a form of censorship. With this news, Google appears to be at least playing ball with the recent ruling by giving users a new way to request that results be pulled. How often Google will actually comply remains to be seen.

    As expected, more complaints are coming out quickly as a result of the ruling. The Journal reports that privacy regulators typically see about 100 requests to suppress search results each year, but received eight on the day after the ruling alone.

    According to BBC News, Google has already received takedown requests from an ex-politician seeking re-election (regarding “behaviour in office”), a man convicted of possessing child abuse images (regarding pages about his conviction), and a doctor (regarding negative reviews from patients).

    You can already see where the ruling could be a pretty big problem. On one hand, if Google were to delete such things, it could have a dangerous outcome because of the lack of information. On the other hand, it’s going to cost money for Google to battle such individuals in court, and there will clearly be a lot more of them than ever before thanks to the ruling.

    Image via Google

  • Google May Have To Delete Search Results When Requested

    The Court of Justice of the European Union has ruled that Google and other search engines must delete search results at people’s request in some cases, and it’s up to the search engines to determine when to comply. If an agreement can’t be reached between the search engine and the person requesting the deletion of information, then they’ll have to go to court to sort it out.

    Some people have wanted to be able to have information about themselves removed from Google for years, and this is a major development in that storyline. On the other side of the coin, some would say that being forced to get rid of info about people just because they don’t like it amounts to censorship. That’s Google’s argument.

    What do you think? Should Google have to remove search results about people at their request? Share your opinion in the comments.

    This particular case involves Spanish man Mario Costeja, who complained that an auction notice for his repossessed home – a matter long since resolved – continued to show up in Google search results, infringing upon his privacy. There are at least 180 more similar cases in Spain alone, where people want Google to get rid of search results for one reason or another.

    The issue is certainly not limited to Spain. There are people all over the world who would love to see certain pieces of information about themselves disappear from Google’s search results. People charged with crimes, but acquitted, for example, don’t want stories about their arrests showing up in Google results for their names (not that those who weren’t acquitted do either).

    It’s not that Google doesn’t care about this stuff at all. Last year, they launched an algorithm update to demote shady sites that publish people’s mugshots and make them pay for removal.

    Google hasn’t written about this latest ruling yet, but Google’s Head of Free Expression William Echikson wrote in February of last year, after declining to comply with an order from the Spanish Data Protection Authority:

    We were asked to remove links from our search results that point to a legal notice published in a newspaper. The notice, announcing houses being auctioned off as part of a legal proceeding, is required under Spanish law and includes factually correct information that is still publicly available on the newspaper’s website.

    There are clear societal reasons why this kind of information should be publicly available. People shouldn’t be prevented from learning that a politician was convicted of taking a bribe, or that a doctor was convicted of malpractice. The substantive question before the Court today is whether search engines should be obliged to remove links to valid legal material that still exists online.

    We believe the answer to that question is “no”. Search engines point to information that is published online – and in this case to information that had to be made public, by law. In our view, only the original publisher can take the decision to remove such content. Once removed from the source webpage, content will disappear from a search engine’s index.

    Of course, there will also be times when information is published online that is subsequently found by a court to be incorrect, defamatory or otherwise illegal. Such content can be removed from the source website and from search engines. But search engines should not be subject to censorship of legitimate content for the sake of privacy – or for any other reason.

    I don’t imagine their stance has changed much since then.

    The AP did get a statement from Google spokesman Al Verney, who called the ruling “disappointing … for search engines and online publishers in general,” and said Google will “now need to take time to analyze the implications.”

    The ruling says that Google and other search engines have to weigh “the legitimate interest of Internet users potentially interested in having access to that information” against the privacy implications of what is being requested for removal. If the search engine doesn’t want to remove something, but the person wants to fight it, they may need to go to a local judge or regulator.

    “An internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties,” says a press release from the court. “Thus, if, following a search made on the basis of a person’s name, the list of results displays a link to a web page which contains information on the person in question, that data subject may approach the operator directly and, where the operator does not grant his request, bring the matter before the competent authorities in order to obtain, under certain conditions, the removal of that link from the list of results.”

    More from the document:

    So far as concerns, next, the extent of the responsibility of the operator of the search engine, the Court holds that the operator is, in certain circumstances, obliged to remove links to web pages that are published by third parties and contain information relating to a person from the list of results displayed following a search made on the basis of that person’s name. The Court makes it clear that such an obligation may also exist in a case where that name or information is not erased beforehand or simultaneously from those web pages, and even, as the case may be, when its publication in itself on those pages is lawful.

    The Court points out in this context that processing of personal data carried out by such an operator enables any internet user, when he makes a search on the basis of an individual’s name, to obtain, through the list of results, a structured overview of the information relating to that individual on the internet. The Court observes, furthermore, that this information potentially concerns a vast number of aspects of his private life and that, without the search engine, the information could not have been interconnected or could have been only with great difficulty. Internet users may thereby establish a more or less detailed profile of the person searched against. Furthermore, the effect of the interference with the person’s rights is heightened on account of the important role played by the internet and search engines in modern society, which render the information contained in such lists of results ubiquitous. In the light of its potential seriousness, such interference cannot, according to the Court, be justified by merely the economic interest which the operator of the engine has in the data processing.

    However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of internet users potentially interested in having access to that information, the Court holds that a fair balance should be sought in particular between that interest and the data subject’s fundamental rights, in particular the right to privacy and the right to protection of personal data. The Court observes in this regard that, whilst it is true that the data subject’s rights also override, as a general rule, that interest of internet users, this balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.

    Finally, in response to the question whether the directive enables the data subject to request that links to web pages be removed from such a list of results on the grounds that he wishes the information appearing on those pages relating to him personally to be ‘forgotten’ after a certain time, the Court holds that, if it is found, following a request by the data subject, that the inclusion of those links in the list is, at this point in time, incompatible with the directive, the links and information in the list of results must be erased. The Court observes in this regard that even initially lawful processing of accurate data may, in the course of time, become incompatible with the directive where, having regard to all the circumstances of the case, the data appear to be inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed and in the light of the time that has elapsed. The Court adds that, when appraising such a request made by the data subject in order to oppose the processing carried out by the operator of a search engine, it should in particular be examined whether the data subject has a right that the information in question relating to him personally should, at this point in time, no longer be linked to his name by a list of results that is displayed following a search made on the basis of his name. If that is the case, the links to web pages containing that information must be removed from that list of results, unless there are particular reasons, such as the role played by the data subject in public life, justifying a preponderant interest of the public in having access to the information when such a search is made.

    The Court points out that the data subject may address such a request directly to the operator of the search engine (the controller) which must then duly examine its merits. Where the controller does not grant the request, the data subject may bring the matter before the supervisory authority or the judicial authority so that it carries out the necessary checks and orders the controller to take specific measures accordingly.

    Here’s the full text of the judgment of the court. Here’s the press release about it, which is a bit easier to digest.

    Google has been fighting this battle with other parties throughout Europe for some time. The big story last year was about former Formula One Racing head Max Mosley, who allegedly attended an orgy, which came to light in a leaked video in 2008. He’s been fighting Google in court to have results about that removed.

    Google also blogged about that in September with a post called “Fighting against a censorship machine.”

    “We sympathize with Mr. Mosley, and with anyone who believes their rights have been violated,” wrote Google Associate General Counsel Daphne Keller. “We offer well-established tools to help people to remove specific pages from our search results when those pages have clearly been determined to violate the law. In fact, we have removed hundreds of pages for Mr. Mosley, and stand ready to remove others he identifies.”

    “But the law does not support Mr. Mosley’s demand for the construction of an unprecedented new Internet censorship tool,” she added. “In repeated rulings, Europe’s highest court has noted that filters are blunt instruments that jeopardise lawful expression and undermine users’ fundamental right to access information. A set of words or images may break the law in one context, but be lawful in another. As an example, a filter might end up censoring news reports about Mr. Mosley’s own court case.”

    Mosley may be happy to see the latest ruling in Spain, though it doesn’t mean he’d ultimately get the content removed as he wants. It would likely just mean more time in courtrooms – something the ruling is probably going to mean a whole lot more of for Google itself.

    It will be interesting to see how high the numbers of complaints jump up after the ruling. It’s unclear what impact the ruling will have on Google’s policy in Spain or the rest of the world. To be continued…

    What do you make of the ruling? Is this a win for privacy and online reputation management or is it a dangerous precedent opening up a huge can of worms? Share your thoughts in the comments.

    Image via Google

  • ‘South Park: The Stick of Truth’ Censored For Europe

    The release of South Park: The Stick of Truth is only one week away. After a year of delays and teases, South Park fans will finally be able to see the series translated in all of its glory into an RPG. At least, South Park fans outside of Europe will be able to.

    A review guide for Stick of Truth that surfaced online today shows that the EMEA (Europe, Middle East, and Africa) console versions of the game will be censored.

    According to the document, seven separate 20-second scenes have been removed from the European console versions of Stick of Truth. The missing scenes include two in which characters (including Randy Marsh) receive an abortion and five in which various characters are “actively” probed in the anus.

    South Park Studios, which has dealt with censorship before, is handling the matter in much the same way it does for censorship on the TV show. Instead of simply changing or blurring out the content, the entirety of the content will be removed and replaced with an image background and description text penned by South Park creators Matt Stone and Trey Parker. Knowing the studio, the text should be very colorful, descriptive, and possibly funnier than the uncensored version.

    In addition to the censorship revelation, the review guide also contains a long list of fixes for Stick of Truth that will be coming in a day-one patch. The fixes include glitches, animation improvement, and some optimization. Hopefully the patch will prevent the sort of launch bugs that Obsidian games are somewhat infamous for.

    via All Games Beta

  • FCC Wants Greater Control Of The News

    The Federal Communications Commission is planning on conducting a study to determine how news outlets decide what topics to take up. It sounds simple and innocent enough, but many are critical of the move, speculating that it is the government’s way of elbowing its way into newsrooms.

    When questioned about government surveillance of the media, FCC chair Tom Wheeler stated that the agency does not intend to regulate what broadcasters or journalists have to say. The study the FCC wants to conduct, called the Multi-Market Study of Critical Information Needs, aims only to identify if there are potential barriers in the market, and if there are, whether those obstacles have the power to affect the diversity of media voices.

    What made the media so skeptical?

    FCC commissioner Ajit Pai stated that the questions posed by the study will not be easy for broadcasters to ignore, even if participation is on a voluntary basis. Pai argues that, through the study, the administration could effectively force newsrooms to conform to what the study demands or risk being denied an FCC license.

    Critics also believe that the FCC presence in newsrooms could also be the government’s vehicle in telling news outlets what to write about. Plus, considering the recent NSA surveillance leak and the IRS controversy, it’s understandable why the media is thinking they’re next.

    Mike Cavender of the Radio Television Digital News Association thinks the study must be completely scrapped, because just the concept of having a study like that is abhorrent to those who pride themselves on their “journalistic independence.”

    What critics are saying is simply this: the government has no place in the newsroom, and the study is bound to infringe on media practitioners’ First Amendment rights.

    The American Center for Law and Justice is urging the media and concerned citizens to sign its petition opposing the study. The study’s parameters have not been finalized, however, and Wheeler has said that the commission is open to comments.

    Image via FCC.gov

  • Bing Says It’s Not Censoring Chinese Search Results In The U.S.

    If you know anything about the Internet, you’re well aware of the Great Firewall of China. The country polices what its citizens can and cannot see on the Internet, which means search engines wanting to operate in the country must censor their own results. Bing, being a search engine that operates in China, must censor its results like everybody else. What happens when Bing starts censoring in other countries as well though?

    On Tuesday, The Guardian reported that Bing was censoring search results in the U.S. for searches made in simplified Chinese. The censored topics included the usual subjects, like the Dalai Lama, Tiananmen Square and other taboo topics that the Chinese government would rather its citizenry not see. The same search made in English would result in Bing returning the usual results you would expect in the U.S.

    As you can imagine, this is all very troubling for a number of reasons. The most prominent is that Chinese users in other parts of the world, or those using VPNs to get around the Great Firewall, would no longer get uncensored search results when using Bing. The Guardian also noted that these censored results wouldn’t tell you they were censored, whereas searches made in China do.

    Since the story broke, Bing has come out and said that it doesn’t censor search results unless the IP originates from inside China. Speaking to The Verge, Stefan Weitz, senior director of Bing, said that the censorship was the result of an error. Under normal operation, Bing would return uncensored results even if the person searching had their location set to China. As long as the IP doesn’t originate from within the mainland, the search results should be censorship free.
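In other words, the intended rule Bing describes is a pure IP-origin check: filter only when the request comes from mainland China, regardless of language or location settings. A rough sketch of that logic, where the address range and function names are illustrative placeholders rather than Bing's real implementation:

```python
import ipaddress

# One example mainland-China block; a real geo-IP database would list
# thousands of such ranges. This range is illustrative only.
MAINLAND_CHINA_RANGES = [ipaddress.ip_network("1.80.0.0/12")]

def must_censor(request_ip: str) -> bool:
    """Censor only if the request IP falls inside a mainland-China range."""
    addr = ipaddress.ip_address(request_ip)
    return any(addr in net for net in MAINLAND_CHINA_RANGES)

print(must_censor("1.85.0.1"))  # True  (inside the example range)
print(must_censor("8.8.8.8"))   # False (outside any listed range)
```

The bug The Guardian uncovered would amount to keying the decision on the query's language (simplified Chinese) instead of, or in addition to, the request's IP origin.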

    Of course, such a story can’t escape the ever watchful eye of our favorite Taiwanese animators. Here’s their take on it:

    Image via Taiwanese Animators/YouTube

  • Should Google Be Forced To Filter Search Results?

    There are a lot of people out there with things in their past that they’re not proud of. Sometimes those things make their way to the Internet and do a great deal of damage to their reputation. This stuff comes up when people search on Google, and Google traditionally has not removed such content unless required to do so by law.

    One man is currently trying to get his damaging content out of Google, and not just removed, but filtered as it’s created. A French court has sided with him, and ordered Google to comply.

    Do you think Google should be forced to filter results? Let us know in the comments.

    Google has been in a legal battle in France for the past couple of months regarding reputation-damaging search results involving former Formula One Racing head Max Mosley’s attendance at an orgy, which was leaked in a video back in 2008.

    News of the World had published footage of the orgy, which was described as involving Nazi role-playing. While owning up to the orgy, Mosley denied the Nazi element – and after he sued the publication, a court said there was no evidence of it.

    Mosley sued Google with the goal of getting this content out of search results, potentially setting a dangerous precedent in search engine censorship.

    When Google went to court in September, it took to its Europe Policy blog to discuss the case, saying that Mosley requested the judge impose “an alarming new model for automated censorship.”

    Google must be alarmed now. It hasn’t posted anything about it on the blog yet. It will reportedly appeal, however.

    The New York Times reports:

    On Wednesday, the Tribunal de Grande Instance in Paris backed Mr. Mosley’s attempts to force Google to block references to the images from appearing in Google’s search results worldwide. The company said it would appeal the decision.

    Google had this to say about the case in the initial blog post:

    He wants web companies to build software filters, in an attempt to automatically detect and delete certain content. Specifically, Mr. Mosley demands that Google build a filter to screen Google’s index and proactively block pages containing images from our results – without anyone, much less a judge, ever seeing it or understanding the context in which the image appears.

    We sympathize with Mr. Mosley, and with anyone who believes their rights have been violated. We offer well-established tools to help people to remove specific pages from our search results when those pages have clearly been determined to violate the law. In fact, we have removed hundreds of pages for Mr. Mosley, and stand ready to remove others he identifies.

    But the law does not support Mr. Mosley’s demand for the construction of an unprecedented new Internet censorship tool. In repeated rulings, Europe’s highest court has noted that filters are blunt instruments that jeopardise lawful expression and undermine users’ fundamental right to access information. A set of words or images may break the law in one context, but be lawful in another. As an example, a filter might end up censoring news reports about Mr. Mosley’s own court case.

    While constituting a dangerous new censorship tool, the filter would fail to solve Mr. Mosley’s problems. Pages removed from search results remain live on the Internet, accessible to users by other means – from following links on social networks to simply navigating to the address in a browser. As an example, one page Mr. Mosley sought to remove comes from a blog, which according to public sources, receives the vast majority of its visits from sources other than web search.

    Interestingly enough, this comes after Google adjusted its algorithm on its own to prevent mug shot sites’ content from ranking in search results, which could help protect the reputations of some people.

    In another case in June, a European court said Google didn’t have to remove search results when a Spanish man sought to have it remove reputation-damaging materials.

    Do you think Google should be forced to filter results from its search engine? Let us know what you think in the comments.

    Image: Onfreespeech (YouTube)

  • Cubans Using Thumb Drives to Exchange Information

    Citizens of the Republic of Cuba remain mostly disconnected from the internet in the communist country, though they have been inventive in finding ways to access and exchange information online.

    The Republic of Cuba, along with China, Laos and Vietnam, is one of the world’s four remaining socialist states espousing communism. Whereas China has a great firewall, Cuba plainly has little internet access.

    At a meeting of the Inter American Press Association in Denver, Yoani Sanchez related the present state of Cuban press and media, comparing Raul Castro and Fidel Castro’s governments. “They play the good and the bad policeman but in the end they are two policemen,” Sanchez explained. Plainly, Raul is adept at arresting and beating those who speak out against the country, much like Fidel did.

    Legal internet access does exist in Cuba. About 200 internet cafes had popped up in the nation in 2013, though the connections are slow, heavily censored, and cost about 5 dollars an hour, which is roughly a third of an average monthly salary in Cuba.

    Regardless, people are able to blindly post things to Twitter with smartphones, which Sanchez describes as being akin to sending a message in a bottle. One cannot really be sure if the tweet was actually posted, or who is reading it.

    Interestingly, Sanchez has developed quite a following on Twitter, and a new Cuban mandate allows people to travel abroad without permission. Sanchez has visited over a dozen countries this year to speak out against the Cuban government, and while abroad she can actually see her tweets. Still, when she returns home, her internet presence is essentially invisible to her compatriots.

    Sanchez, 38, also pointed out that thumb drives are integral to the exchange of information in Cuba. She joked that when Cuba is free, the country will have to establish a monument to the thumb drive, which she said has done more to help the country than many of the people now honored by statues there.

    Image via Twitter.

  • Former Formula One Head Wants Google To Remove Results Showcasing Infamous Orgy

    Google went to court in France on Wednesday because of a suit filed by Max Mosley, former head of Formula One racing. The battle is similar to others Google has fought in the past. There is undesirable content about Mosley on the Internet, and he wants it out of Google’s index, but Google doesn’t readily remove results unless legally required to do so.

    Mosley was involved in a big orgy scandal, and understandably doesn’t want the remnants available for everybody with access to Google to be able to pull up anytime they want. The problem is that that’s not how it works.

    Back in 2008, News of the World published footage of an orgy, which was described as involving Nazi role-playing. Mosley owned up to the orgy but denied the Nazi element, and a court found no evidence of it after he sued the publication.

    Currently, if you Google “max mosley orgy,” you might see a top result from LiveLeak with a video, under the title “F1’s Max Mosley’s Nazi Orgy with 5 Hookers.”

    Google took to its Europe Policy blog to discuss the case, saying that Mosley requested the judge impose “an alarming new model for automated censorship.” Here’s an excerpt from the post:

    He wants web companies to build software filters, in an attempt to automatically detect and delete certain content. Specifically, Mr. Mosley demands that Google build a filter to screen Google’s index and proactively block pages containing images from our results – without anyone, much less a judge, ever seeing it or understanding the context in which the image appears.

    We sympathize with Mr. Mosley, and with anyone who believes their rights have been violated. We offer well-established tools to help people to remove specific pages from our search results when those pages have clearly been determined to violate the law. In fact, we have removed hundreds of pages for Mr. Mosley, and stand ready to remove others he identifies.

    But the law does not support Mr. Mosley’s demand for the construction of an unprecedented new Internet censorship tool. In repeated rulings, Europe’s highest court has noted that filters are blunt instruments that jeopardise lawful expression and undermine users’ fundamental right to access information. A set of words or images may break the law in one context, but be lawful in another. As an example, a filter might end up censoring news reports about Mr. Mosley’s own court case.

    While constituting a dangerous new censorship tool, the filter would fail to solve Mr. Mosley’s problems. Pages removed from search results remain live on the Internet, accessible to users by other means – from following links on social networks to simply navigating to the address in a browser. As an example, one page Mr. Mosley sought to remove comes from a blog, which according to public sources, receives the vast majority of its visits from sources other than web search.

    In June, a European court said Google didn’t have to remove search results when a Spanish man sought to have Google remove reputation-damaging materials. It seems fairly likely that Google will achieve a similar outcome this time around, though Google was ordered by a German court to remove defamatory autocomplete suggestions in May.

    The court will reportedly reveal its decision on the case on October 21st. The appeal process could of course take place after that.

    Image: Onfreespeech (YouTube)

  • Google May Face Full Pakistan Block Over “Blasphemous” Materials
    Pakistan’s new IT and telecommunications minister has a problem. She wants to end the longstanding ban on Google’s YouTube, but she also needs assurances that Google will work to block “blasphemous and objectionable” materials from the world’s most popular video sharing site.

    And in order to ensure that happens, the new minister is making a sort of blanket threat against Google – clean it up here in Pakistan or face an all-out ban.

    According to The Times of India, Anusha Rahman Khan sees the total Google block as a last resort.

    “It all depends on our negotiation clout. If they persist with their stance, we can block Google in Pakistan as a last resort as there are many alternative search engines available on the web,” said Khan.

    Khan made it clear that she wishes to get started on unblocking YouTube – but certain assurances from Google need to be in place first.

    “Our ministry is responsible for policy decisions, so it’s our job to ensure reopening of YouTube as soon as possible with thorough screening of objectionable material. I will immediately start work on it after a presentation by ministry officials on Monday…We will pump in extra money if needed and do whatever is in our capacity to bring YouTube back to Pakistan without compromising our ethical values,” Khan told Dawn.

    The Pakistani government has had a rocky relationship with YouTube over the past few years. The site was first banned back in 2008 after the Pakistan Telecommunication Authority cited a rise in “non-Islamic, objectionable videos.” Shortly after, the ban was lifted when much of the material was removed from YouTube servers.

    The site stayed open and accessible in the country until 2010, when Pakistan again blocked YouTube in response to “Everybody Draw Mohammed Day.” About a week later, the site was reinstated.

    The latest ban has been the most longstanding. Pakistan blocked YouTube back in September of 2012 in response to the controversial YouTube video “Innocence of Muslims,” which depicts the religion’s prophet as a fool and a child-molesting deviant. Google decided not to remove the video from YouTube.

    Of course, YouTube isn’t the only site that the Pakistani government has been known to censor. It has blocked Twitter in the past, and a few years ago imposed a sweeping ban on many porn sites.

    Khan seems to want to make sure she can assure Google’s compliance before making any sort of decision.

    “We cannot face the embarrassment of opening the website and closing it again after protests. We have to ensure that proper filtration system is in place before we open the website,” said Khan.

    [via CNET]