WebProNews

Tag: Right To Be Forgotten

  • France Wants To Impose Its Laws On Google Worldwide


Google is fighting a ruling in France that requires it not just to honor search removal requests from users in France, but also to censor those results worldwide, including in the USA. In March, the French data protection regulator (CNIL) ordered that its interpretation of French law regarding the right to be forgotten should apply not just in France, but in every country in the world.

This runs counter to the long-standing principle, which existed well before the advent of the internet, that a law in one country can’t be imposed on other countries. Otherwise, you would quickly sink to the lowest common denominator, where countries with strict censorship rules would be able to force their standards on more open countries that value free speech, such as the United States.

Google’s global general counsel Kent Walker published an article in France’s Le Monde newspaper, which Google republished in English on its Google Europe Blog. In the post, Walker rightly attacks the concept that one country can tell other countries what to do, stating:

    “For hundreds of years, it has been an accepted rule of law that one country should not have the right to impose its rules on the citizens of other countries. As a result, information that is illegal in one country can be perfectly legal in others: Thailand outlaws insults to its king; Brazil outlaws negative campaigning in political elections; Turkey outlaws speech that denigrates Ataturk or the Turkish nation — but each of these things is legal elsewhere. As a company that operates globally, we work hard to respect these differences.”

The “right to be forgotten” is a right granted to citizens of European Union member states, allowing individuals to require Google to remove search result listings about them on request. This was based on a 2014 ruling by the Court of Justice of the European Union (CJEU). As Walker points out in his article, “It lets Europeans delist certain links from search engine results generated by searches for their name, even when those links point to truthful and lawfully published information like newspaper articles or official government websites.”

Walker revealed that Google has now reviewed nearly 1.5 million webpages, delisting around 40%. In France alone, it has reviewed over 300,000 webpages, delisting nearly 50%.

In March, Google expanded the EU’s right to remove search listings so that it applies even when searching on other domains from within an EU country. In other words, if you go to the main U.S. version of Google while in France, you will still see only the French version of Google’s search results. Also in March, in response to the Paris terrorist attacks, the French government enacted a decree that enables administrative de-listing of websites from search engines without any judicial oversight. The problem with censorship for supposedly good reasons is that it often quickly leads to censoring benign content, or content that isn’t terrorism but is simply politically incorrect or disagrees with the government’s perspective. A government, for instance, may censor content that promotes free speech on the basis that it might be seen as sparking violence with Muslims. It’s a very slippery slope away from free speech and freedom in general.

Google’s Global General Counsel went on to say:

    “As a matter of both law and principle, we disagree with this demand. We comply with the laws of the countries in which we operate. But if French law applies globally, how long will it be until other countries – perhaps less open and democratic – start demanding that their laws regulating information likewise have global reach? This order could lead to a global race to the bottom, harming access to information that is perfectly lawful to view in one’s own country. For example, this could prevent French citizens from seeing content that is perfectly legal in France. This is not just a hypothetical concern. We have received demands from governments to remove content globally on various grounds — and we have resisted, even if that has sometimes led to the blocking of our services.”

    Google has filed an appeal of the CNIL’s order with France’s Supreme Administrative Court, the Conseil d’Etat.

  • Google Provides ‘Right To Be Forgotten’ Update


Google shared some new numbers related to the “right to be forgotten” ruling, which has led to individuals requesting URL removals from search results. For all the background on that, peruse our coverage here.

The stats appear on Google’s Transparency Report, where Google now claims to have evaluated 1,234,092 URLs for removal. The total number of requests it has seen dating back to May 2014 is 348,085.

    Here’s the latest look at the sites that are most impacted:

[Screenshot via Google’s Transparency Report showing the domains with the most removed URLs]

    This list, Google says, highlights the domains where it has removed the most URLs from search results. Of the total URLs requested for removal, these sites account for 9%.

    Check out the full Transparency Report here.

    Images via Google

  • Google ‘Right To Be Forgotten’ Appeal Shut Down


    In June, French regulators ordered Google to extend its “Right to be Forgotten” search engine delistings to its sites around the world rather than only in Europe. From their perspective, Google leaving such listings available in other versions of its search engine (such as the American Google.com) lets people easily get around the delistings in localized, European versions of Google. They’re not wrong about that.

On the other side of the coin, however, Google argues that by complying with this, it would effectively be enabling one regulator to have control over what happens around the entire world.

Google appealed in July, but news is out now that its appeal has been rejected, and the company now finds itself with no further avenue of appeal before facing impending fines. CNIL says the appeal has been rejected for the following reasons:

    Geographical extensions are only paths giving access to the processing operation. Once delisting is accepted by the search engine, it must be implemented on all extensions, in accordance with the judgment of the ECJ.

If this right was limited to some extensions, it could be easily circumvented: in order to find the delisted result, it would be sufficient to search on another extension (e.g. searching in France using google.com), namely to use another form of access to the processing. This would equate stripping away the efficiency of this right, and applying variable rights to individuals depending on the internet user who queries the search engine and not on the data subject.

    In any case, the right to delisting never leads to deletion of the information on the internet; it merely prevents some results to be displayed following a search made on the sole basis of a person’s name. Thus, the information remains directly accessible on the source website or through a search using other terms. For instance, it is impossible to delist an event.

    In addition, this right is not absolute: it has to be reconciled with the public’s right to information, in particular when the data subject is a public person, under the double supervision of the CNIL and of the court.
    Finally, contrary to what Google has stated, this decision does not show any willingness on the part of the CNIL to apply French law extraterritorially. It simply requests full observance of European legislation by non European players offering their services in Europe.

    You can read CNIL’s whole announcement about the rejection here.

    The Guardian shares quotes from both CNIL (the French regulator) and Google:

    CNIL said in a statement: “Contrary to what Google has stated, this decision does not show any willingness on the part of the CNIL to apply French law extraterritorially. It simply requests full observance of European legislation by non European players offering their services in Europe.”

    A Google spokesman said: “We’ve worked hard to implement the ‘right to be forgotten’ ruling thoughtfully and comprehensively in Europe, and we’ll continue to do so. But as a matter of principle, we respectfully disagree with the idea that one national data protection authority can assert global authority to control the content that people can access around the world.”

According to the report, Google faces a fine of around €300,000 if it doesn’t comply, but that could increase to between 2% and 5% of global operating costs. The company will reportedly then be able to appeal the fine with the Conseil d’Etat, which serves as the supreme administrative court in France.

    Image via Google

  • Will Google Have To Censor Search Results On A Global Basis?

    Google may end up having to censor its search results around the world, including on Google.com, just as it has been forced to do with certain European versions of its search engine.

    One of 2014’s big search industry stories was Google being forced to comply with a “right to be forgotten” regulation in the European Union. Ultimately, Google has had to remove content from its search results in cases where requests from individuals are found to be worthy of removal. The whole thing, as expected, has been a pretty big and complex mess, but Google has nevertheless played ball and continued to try and improve the process.

    Google has only been doing this in countries where it has been required to do so, and since it went into effect, it’s been facing pressure from European regulators to not only block the results in those countries, but to block the same results in all of its search engines worldwide, including Google.com. Google has not complied with that, and does not seem intent on doing so anytime soon.

However, France has given Google 15 days to comply before the company faces fines, though the appeal process will no doubt drag the whole thing on much longer. Still, it’s not only Europe where Google is facing this kind of pressure. It’s even happening right here in North America, where a Canadian court has also ordered Google to remove search results worldwide.

    Danny Sullivan at Search Engine Land explains the Canadian situation, which he notes may ultimately play out more quickly than the much more publicized situation in Europe:

    Here’s the backstory. A Canadian company named Equustek Solutions won a trademark infringement case against another company called Datalink Technologies Gateways. Equustek then wanted Google to remove links to Datalink. Google did so, but only for those using the Google Canada site.

    Back to court. Last June, a Canadian judge in British Columbia ordered that Google remove Datalink from its search results. All of them, worldwide. Google appealed; now it has lost that appeal.

    As Sullivan notes, Google may still appeal this further, but as it stands right now, things are not working out in Google’s favor, and the pressure on Google to censor search results continues to build.

    Image via Google

  • Should Europe’s Search Law Apply To The World?

    Late last year, EU regulators in Brussels said they wanted the controversial “Right to be Forgotten” ruling applied to search results on a global basis rather than just in its own jurisdiction as it stands today. In other words, if someone is successfully able to get Google (or other search engines) to remove search results about them from its index in Europe, regulators want the search engine to remove the results from all of its localized versions, including Google.com.

    Do you think results should be removed all over the world or should it be limited to Europe? Let us know what you think.

    Obviously this is a tricky subject since it leads to censorship of results in other countries with different laws.

The Google Advisory Council on the Right to be Forgotten weighed in on the subject in a report.

The report looks at an overview of the ruling, the criteria for assessing delisting requests, and procedural elements. One section deals specifically with the geographic scope issue. Here’s what that part says:

    A difficult question that arose throughout our meetings concerned the appropriate geographic scope for processing a delisting. Many search engines operate different versions that are targeted to users in a particular country, such as google.de for German users or google.fr for French users. The Ruling is not precise about which versions of search a delisting must be applied to. Google has chosen to implement these removals from all its European-directed search services, citing the CJEU’s authority across Europe as its guidance.

    The Council understands that it is a general practice that users in Europe, when typing in www.google.com to their browser, are automatically redirected to a local version of Google’s search engine. Google has told us that over 95% of all queries originating in Europe are on local versions of the search engine. Given this background, we believe that delistings applied to the European versions of search will, as a general rule, protect the rights of the data subject adequately in the current state of affairs and technology.

In considering whether to apply a delisting to versions of search targeted at users outside of Europe, including globally, we acknowledge that doing so may ensure more absolute protection of a data subject’s rights. However, it is the conclusion of the majority that there are competing interests that outweigh the additional protection afforded to the data subject. There is a competing interest on the part of users outside of Europe to access information via name-based search in accordance with the laws of their country, which may be in conflict with the delistings afforded by this Ruling. These considerations are bolstered by the legal principle of proportionality and extraterritoriality in application of European law.

There is also a competing interest on the part of users within Europe to access versions of search other than their own. The Council heard evidence about the technical possibility to prevent Internet users in Europe from accessing search results that have been delisted under European law. The Council has concerns about the precedent set by such measures, particularly if repressive regimes point to such a precedent in an effort to ‘lock’ their users into heavily censored versions of search results. It is also unclear whether such measures would be meaningfully more effective than Google’s existing model, given the widespread availability of tools to circumvent such blocks.

    The Council supports effective measures to protect the rights of data subjects. Given concerns of proportionality and practical effectiveness, it concludes that removal from nationally directed versions of Google’s search services within the EU is the appropriate means to implement the Ruling at this stage.

In other words, with the overwhelming majority of Google users in Europe using localized versions of Google, removing the results in question from versions of Google outside of Europe wouldn’t make the delistings much more effective at hiding them. It would, however, unnecessarily censor search results in parts of the world (like the U.S.) where laws favor open access to public information and media reports.


    Do you agree with the Council that the right to be forgotten should only apply to the European-based versions of Google and other search engines or do you think results should be removed from search engines on a global basis? Let us know in the comments.

  • Is The Right To Be Forgotten Dangerous?

    Google has released its latest Transparency Report, which as of earlier this year, now looks at URL removal requests from the highly-publicized Right to be Forgotten ruling in Europe. The inventor of the World Wide Web recently spoke out against the ruling, calling it dangerous. Meanwhile, the requests continue to roll in, and other parts of the world may start being affected.

    Do you agree that the Right to be Forgotten is a dangerous thing, or do you think it’s the right way for the Internet to work? Share your thoughts in the comments.

    Back in October, when Google first revealed its Right to be Forgotten removal request data in the Transparency report, it said it had evaluated 497,695 URLs for removal and received a total of 144,954 requests.

    The latest data has the numbers at 684,419 URLs evaluated and a total of 189,238 requests.

On the Transparency Report site, Google also gives examples of requests it encounters. One involves a woman who requested that Google remove a decades-old article about her husband’s murder, which included her name. The page has been removed from search results for her name.

    In another example, a financial professional in Switzerland asked Google to remove over 10 links to pages reporting on his arrest and conviction for financial crimes. Google did not remove pages from search results in those cases.

    A rape victim in Germany asked Google to remove a link to a newspaper article about the crime, which Google did in search results for the person’s name.

    According to the company, the sites that are most impacted by the URL removals are Facebook, ProfileEngine, YouTube, Badoo, Google Groups, Yasni.de, Wherevent.com, 192.com, yasni.fr, and yatedo.fr.

    One of the latest to speak out against the situation was none other than Tim Berners-Lee, the guy responsible for the World Wide Web. Via CNET:

    “This right to be forgotten — at the moment, it seems to be dangerous,” Berners-Lee said Wednesday, speaking here at the LeWeb conference. “The right to access history is important.”

    In a wide-ranging discussion at the conference, Berners-Lee said it’s appropriate that false information should be deleted. Information that’s true, though, is important for reasons of free speech and history, he said. A better approach to the challenge would be rules that protect people from inappropriate use of older information. An employer could be prohibited from taking into account a person’s juvenile crimes or minor crimes more than 10 years old, for example.

    The EU recently put forth some guidelines for the Right to be Forgotten, for search engines to work with, though they don’t go very far in terms of quelling the biggest concerns many have with the ruling, such as Berners-Lee’s.

    The Right to be Forgotten appears to be creeping out of Europe, and into other parts of the world. Consider this from earlier this month from Japan Times:

    Yes. In a possible first in Japan, the Tokyo District Court in October issued an injunction ordering Google to remove the titles and snippets to websites revealing the name of a man who claimed his privacy rights were violated due to articles hinting at past criminal activity.

    Tomohiro Kanda, who represented the man, said the judges clearly had the European court’s ruling in mind when they ordered Google to take down the site titles and snippets. Google has since deleted search results deemed by the court as infringing on the man’s privacy, Kanda said.

    But generally speaking, Japanese judges have yet to reach a consensus on how to balance the right to privacy and the freedom of expression and of information.

    Regulators in Europe have also been calling to have URLs removed from Google’s search engines worldwide rather than just from the European versions of Google.

    Are you concerned with the Right to be Forgotten? Let us know in the comments.

  • ‘Right To Be Forgotten’ Dangerous, According To Web’s Inventor

    A lot of people (especially those not trying to hide information about themselves) agree that the Right to Be Forgotten in Europe is problematic for a variety of reasons, including the censorship of information.

    The latest to speak out against the current situation is none other than Tim Berners-Lee, the guy responsible for the World Wide Web. Via CNET:

    “This right to be forgotten — at the moment, it seems to be dangerous,” Berners-Lee said Wednesday, speaking here at the LeWeb conference. “The right to access history is important.”

    In a wide-ranging discussion at the conference, Berners-Lee said it’s appropriate that false information should be deleted. Information that’s true, though, is important for reasons of free speech and history, he said. A better approach to the challenge would be rules that protect people from inappropriate use of older information. An employer could be prohibited from taking into account a person’s juvenile crimes or minor crimes more than 10 years old, for example.

    The EU recently put forth some guidelines for the right to be forgotten, for search engines to work with, though they don’t go very far in terms of quelling the biggest concerns many have with the ruling, such as Berners-Lee’s.

    Image via Wikimedia Commons

  • Google Has Some Right To Be Forgotten Guidelines To Work With

    As we here in the U.S. were entering holiday mode last week, official “Right to Be Forgotten” guidelines made their way to the public over in Europe. These come from the Article 29 Working Party, which is made up of data protection officials from throughout the European Union.

    In case you haven’t been keeping up, the Right to Be Forgotten came as the result of a ruling a few months ago. It enables people to request that search results about them be removed from search engines. Search engines like Google have been tasked with determining whether or not requests are legitimate as well as which ones to act upon. Search engines obviously don’t like removing results because it’s a form of censorship.

Now, at least the engines have some guidelines to use as criteria for their evaluations rather than just kind of winging it as Google has been doing so far. The search engine, for the record, has been discussing approaches with various experts around the world.

    The new guidelines are as follows:

    Does the search result relate to a natural person – i.e. an individual? And does the search result come up against a search on the data subject’s name?

    Does the data subject play a role in public life? Is the data subject a public figure?

    Is the data subject a minor?

    Is the data accurate?

    Is the data relevant and not excessive?

    Is the information sensitive within the meaning of Article 8 of the Directive 95/46/EC?

    Is the data up to date? Is the data being made available for longer than is necessary for the purpose of the processing?

    Is the data processing causing prejudice to the data subject? Does the data have a disproportionately negative privacy impact on the data subject?

    Does the search result link to information that puts the data subject at risk?

    In what context was the information published?

    Was the original content published in the context of journalistic purposes?

    Does the publisher of the data have a legal power – or a legal obligation– to make the personal data publicly available?

    Does the data relate to a criminal offence?

    Here’s the full document, which elaborates on each of these, courtesy of Search Engine Land (or you can find it on the government website here):

The blog also points to some findings from Forget.me, including that Bing has received only about 700 requests compared to Google’s 160,000.

    Image via Google

  • Regulators Want ‘Right To Be Forgotten’ Extended To Google.com

    As reported last week, a French court ordered Google to pay fines of €1,000 unless links to a “defamatory” article are removed from its global network. This prompted many to wonder if Google would start removing links taken out of specific European search results as a result of the Right to Be Forgotten ruling, from its search engines all over the world.

    Now, regulators in Brussels have said they want links removed under the ruling to be extended worldwide, which means removing the links from Google’s other search engines including Google.com. This would apply to other search engines like Yahoo and Bing as well, but Google is obviously the top dog and gets most of the focus. Via International Business Times:

Isabelle Falque-Pierrotin, chairwoman of the Article 29 Working Party, which issued Wednesday’s opinion, and who is also head of France’s data-protection regulator, said:

    “Huge social expectations have been created by this ruling. We believe Google, like other search engines, has been surprised by the ruling because they have new obligations to follow now. But the rules are not new; the obligations have applied to websites since 1995. The difference is that it now applies to search engines.”

    Falque-Pierrotin added: “The court says the delisting decision has to be effective. These decisions should not be easily circumvented by anybody.”

    The group issued a press release on the matter, which you can read here.

    Google has indicated that it will study the group’s guidelines carefully, but hasn’t offered much else in the way of comment.

    In semi-related news, Google also reached a settlement to remove defaming links from its search engine in a case not connected to the Right to Be Forgotten ruling.

    The company also has additional trouble in Europe as a potential break-up of the company is being weighed.

    Image via Google

  • Google Agrees To Remove Defaming Links In UK

    Google has settled a defamation suit in the UK, which was filed in response to content Google simply indexed in its search results. The settlement is noteworthy as historically Google has not claimed responsibility for the content in its results. It is, after all, just pointing to websites.

    Things have gotten trickier on that front in Europe, however, since the recent “right to be forgotten” ruling, which has forced Google to yank results based on requests and other criteria it has set, and is still trying to map out.

    This particular case actually isn’t directly related to that, but it’s certainly in the same ballpark, and further highlights how Google is treating this issue differently in Europe, even if it has no choice in some cases.

    The settlement (via Search Engine Land/BBC) was with Daniel Hegglin, a UK businessman, who had been called a murderer, a pedophile, and a KKK sympathizer by an alleged troll. He didn’t specifically target Google in the suit, but the company was brought into the case.

    Terms of the settlement were not disclosed, but Google said in a statement that it reached a “mutually acceptable agreement.”

In all likelihood, this specific example of result removal is probably more tolerable to the search giant compared to the burden of the whole right to be forgotten mess. That mess is, by the way, getting even messier, as there’s talk that Google may have to get rid of these results throughout its global network of search engines. What once may have only had to be removed from one country’s version of Google may have to be removed from all of Google.

    Image via Google

  • ‘Right To Be Forgotten’ Going Global?


    Links removed from Google’s European search engines may end up having to be removed from Google’s other search engines.

    As you may know, Google has been removing links from its European search results because of what has come to be known as the “Right to Be Forgotten” law. The law enables people to request Google get rid of search results about them.

    As expected, the whole thing has been a big mess. It’s about to get even messier, as a new court ruling indicates that search engines like Google will have to extend their hiding of search results on a global basis.

    A French court ordered Google to pay fines of €1,000 unless links to a “defamatory” article are removed from its global network, The Guardian reports (via 9to5Google). The company, it says, is considering its options. The report shares this quote from a Google spokesperson:

    This was initially a defamation case and it began before the CJEU ruling on the right to be forgotten. We are reviewing the ruling and considering our options. More broadly, the right to be forgotten raises some difficult issues and so we’re seeking advice – both from data protection authorities and via our Advisory Council – on the principles we should apply when making these difficult decisions.

    Google has been engaging in something of a Right to be Forgotten tour, traveling around the world to talk with experts on how to proceed.

    This ruling adds a new element to the whole thing, as Google could face more and more fines from others following similar legal paths.

    The Guardian report has some additional context from a lawyer involved with the suit.

    In October, Google provided an update on search result removal stats. It said it had evaluated 497,695 URLs for removal, and had received a total of 144,954 requests.

    Image via Google

  • Google Gives New ‘Right To Be Forgotten’ Stats

Google shared some new numbers related to the “right to be forgotten” ruling, which has led to individuals requesting URL removals from search results. For all the background on that, peruse our coverage here.

    The stats appear on Google’s Transparency Report, where Google explains:

    In a May 2014 ruling, Google Spain v AEPD and Mario Costeja González, the Court of Justice of the European Union found that individuals have the right to ask search engines like Google to remove certain results about them. The court decided that search engines must assess each individual’s request for removal and that a search engine can only continue to display certain results where there is a public interest in doing so. For more information about our process and the data we’re providing here, please visit our FAQ.

    The company reveals it has evaluated 497,695 URLs for removal. It has received a total of 144,954 requests.

Google also gives examples of requests it encounters. One involves a woman who requested that Google remove a decades-old article about her husband’s murder, which included her name. The page has been removed from search results for her name.

    In another example, a financial professional in Switzerland asked Google to remove over 10 links to pages reporting on his arrest and conviction for financial crimes. Google did not remove pages from search results in those cases.

    A rape victim in Germany asked Google to remove a link to a newspaper article about the crime, which Google did in search results for the person’s name.

    According to the company, the sites that are most impacted by the URL removals are Facebook, ProfileEngine, YouTube, Badoo, Google Groups, Yasni.de, Wherevent.com, 192.com, yasni.fr, and yatedo.fr.

    Image via Google

  • Google Opens Up Registration For ‘Right To Be Forgotten’ Events

    Google has opened up registration for its first public consultations regarding the “right to be forgotten” law in Europe. The company previously announced its scheduled dates, which are as follows:

    September 9 in Madrid, Spain
    September 10 in Rome, Italy
    September 25 in Paris, France
    September 30 in Warsaw, Poland
    October 14 in Berlin, Germany
    October 16 in London, UK
    November 4 in Brussels, Belgium

    Those interested in providing presentations or discussion about the effects of the law to help Google improve how it handles requests, and to otherwise help shape the public conversation, can sign up here for the Madrid meeting and here for the Rome meeting.

“At each meeting, the Council will listen to statements from invited experts, ask questions of the experts and discuss matters of law, technology, and ethics,” said Google Secretariat to the Advisory Council Betsy Masiello. “The public portion of each Advisory Council meeting will last around two and a half hours, with an intermission, and the whole meeting will also be live-streamed on the Advisory Council’s website.”

    “During the event, members of the audience can submit questions to the Council and invited experts,” she added. “The Council invites members of the public to share their thoughts on the Right to be Forgotten via the form at www.google.com/advisorycouncil – all contributions will be read and discussed.”

    Registration for each event will start roughly two weeks before it’s scheduled. Google says it will keep updating its European Policy blog with additional details.

    More on the “right to be forgotten” saga here.

    Image via Google

  • Telegraph, Like Wikipedia, Keeps List Of Articles ‘Forgotten’ By Google

The “right to be forgotten” mess continues to get even messier. At least one newspaper is keeping a public list of its articles that have been removed from Google search results because of the law, while also writing articles about those removals.

    So here’s an example of not only why the law is inherently flawed, but also of how much time it’s wasting on pretty much everybody’s part.

The Daily Telegraph, as described by Danny Sullivan at Marketing Land, has been on a “campaign to document all its stories that have been removed” as a result of the law. The Telegraph’s Matthew Sparkes even tweeted about how he’s spending his time (which would no doubt be better used reporting actual news).

The list referenced in that tweet contains eight bullet points about articles and images removed.

    Similarly, Wikipedia is keeping a running tab of stories that have been removed by Google.

    In other words, people requesting articles be removed only seem to be drawing more attention to the fact that they’ve done so, which seems to defeat the entire purpose. Shocking, right?

    For more background on the “right to be forgotten” and Google’s role, peruse our coverage here.

    Image via Google

  • Wikipedia Shows Content Google ‘Forgets’

    It was recently reported that Google is removing links to Wikipedia articles from search results in Europe thanks to the new “right to be forgotten“. The Wikimedia Foundation, which runs Wikipedia, has now put out a statement.

    Do you think the “right to be forgotten” law is going too far? Do you agree with the concept at all? Should Wikipedia articles be vanishing from Google results? Share your thoughts in the comments.

    The foundation says it has received multiple notices of intent to remove certain Wikipedia content from European search results, and that to date, the notices would affect over 50 links directing readers to Wikimedia sites.

    “The decision does not mandate that search engines disclose link censorship,” says recently appointed Wikimedia Foundation Executive Director Lila Tretikov. “We appreciate that some companies share our commitment to transparency and are providing public notice. This disclosure is essential for understanding the ruling’s negative impacts on all available knowledge.”

    What a fun time for Tretikov to be taking over, by the way. The foundation is not only dealing with this, but also with black hat paid editing.

    In terms of search engine disclosure of censorship, Google displays the following message at the bottom of search results pages:

    Some results may have been removed under data protection law in Europe. Learn more.

    The Wikimedia Foundation is keeping a running tab of notices it receives from search engines. One of them is about a link for a Wikipedia article on Gerry Hutch, which according to the article is about “an Irish convicted criminal, alleged to have been one of Ireland’s most successful bank robbers.”

    Splendidly showing how ridiculous the right to be forgotten ruling is, there’s now a section of the Wikipedia article dedicated to informing users that the URL was requested to be removed from search engines. It says:

    Due to a request under data protection laws of Europe, it was revealed in August 2014 that Google has removed the Wikipedia page on Hutch on some search results from European versions of Google.

    I imagine this will be pretty standard on affected articles. It will be interesting to see how crowded the page showing them all gets.

    “We only know about these removals because the involved search engine company chose to send notices to the Wikimedia Foundation,” the foundation says in its statement. “Search engines have no legal obligation to send such notices. Indeed, their ability to continue to do so may be in jeopardy. Since search engines are not required to provide affected sites with notice, other search engines may have removed additional links from their results without our knowledge. This lack of transparent policies and procedures is only one of the many flaws in the European decision.”

    Google further examined the complexity of complying with the decision in a questionnaire from regulators. The search engine has dates set up throughout the fall, for experts to discuss ideas and concepts for how this should all be implemented. Wikipedia co-founder Jimmy Wales will appear in Madrid next month at the first of these meetings.

    The Wikimedia Foundation has also released its first-ever transparency report, disclosing that in two years, it has received 304 general content removal requests, zero of which were granted. That seems like a surprisingly low number of requests, doesn’t it?

    “The Wikimedia Foundation is deeply committed to supporting an open and neutral space, where the users themselves decide what belongs on the Wikimedia projects,” write Legal Counsel Michelle Paulson and General Counsel Geoff Brigham.

    Additionally, it says only 14.3% of requests for user data were granted because many were found to be illegal or not up to the foundation’s standards. In other cases, the foundation just didn’t have any information to give. You can find the report here.

    Gizmodo points to an interesting thing in the transparency report showing that the foundation denied a photographer’s requests to remove pictures of a monkey because it contends that the monkey is the copyright holder. In the report, the foundation says:

    A photographer left his camera unattended in a national park in North Sulawesi, Indonesia. A female crested black macaque monkey got ahold of the camera and took a series of pictures, including some self-portraits. The pictures were featured in an online newspaper article and eventually posted to Commons. We received a takedown request from the photographer, claiming that he owned the copyright to the photographs. We didn’t agree, so we denied the request.

    A photo of Babe Ruth’s famous called shot is also among the content to have been requested for takedown. The foundation cites fair use in its denial on that one, for “its extraordinary value in illustrating the famous moment and the educational purpose it serves.”

    Is Wikipedia taking the right approach to takedowns? Is Google? Let us know what you think.

    Image via Wikimedia Commons

  • Wikipedia Articles Not Exempt From ‘Right To Be Forgotten’

    Okay, if this thing wasn’t already getting out of hand (it was), it certainly is now. A Wikipedia link is reportedly being removed from Google search results as a result of Google’s process for complying with the new “right to be forgotten” law in the EU.

Buried in the middle of a lengthy Jimmy Wales profile at The Observer is this:

    On 9 September, he will travel to Madrid as a member of a Google-appointed panel, charged with drawing up guidance for search engines on how to handle requests to remove links to web pages under Europe’s controversial right to be forgotten legislation. It is an issue close to home – Google is understood to be about to remove its first link to a Wikipedia page. “The legislation is completely insane and needs to be fixed,” says Wales.

    It’s unclear what exactly the link being removed is about, or who requested its removal. It will be interesting to see if that information comes out, as it could obviously help us get a better understanding of the context.

    For now, we’ll just have to consider it part of the larger mess that is the “right to be forgotten,” and know that not even the Internet’s encyclopedia is exempt from having information disappear from search results. This is particularly significant as Wikipedia results are often among the top results for informational queries on Google. The site even powers a great deal of the information that appears in Google’s Knowledge Graph results.

The fact that Wikipedia is a community-edited effort only makes things more complex. It’s supposed to be bias-free in the first place, leading one to wonder what grounds would call for an article to be eliminated from search results rather than simply edited to remove any bias.

    It is worth noting that Wikipedia has had some problems with undisclosed paid editing.

Google outlined the complexity of enforcing the right to be forgotten in a questionnaire from EU regulators last week. The company also released some dates for when it will hear presentations from “expert” voices on the EU’s right to be forgotten ruling. These events should help Google shape its policy for URL removal.

    Image via Google

  • Google On Complexity Of ‘Right To Be Forgotten’

    As previously reported, Google (as well as Microsoft and Yahoo) attended a meeting last week with EU regulators to discuss the “right to be forgotten” ruling and the search engines’ approach to handling it.

    Each of the companies was given a questionnaire (via The New York Times), asking about various aspects of their practices related to complying with the ruling. Google’s has been made publicly available, and in it, the company discusses complications it faces.

    Asked about criteria used to balance the company’s own economic interest and/or the interest of the general public in having access to info versus the right of the data subject to have search results delisted, Google said:

The core service of a search engine is to help users find the information they seek, and thus it is in a search engine’s general economic interest to provide the fastest, most comprehensive, and most relevant search results possible. Beyond that abstract consideration, however, our economic interest does not have a practical or direct impact on the balancing of rights and interests when we consider a particular removal request.

    We must balance the privacy rights of the individual with interests that speak in favour of the accessibility of information including the public’s interest to access to information, as well as the webmaster’s right to distribute information. When evaluating requests, we will look at whether the search results in question include outdated or irrelevant information about the data subject, as well as whether there’s a public interest in the information.

    In reviewing a particular removal request, we will consider a number of specific criteria. These include the individual (for example, whether an individual is a public figure), the publisher of the information (for example, whether the link requested to be removed points to material published by a reputable news source or government website), and the nature of the information available via the link (for example, if it is political speech, if it was published by the data subject him- or herself, or if the information pertains to the data subject’s profession or a criminal conviction).

    Each criterion, the company continued, has its own “potential complications and challenges”. It then proceeded to list these examples:

    • It is deemed to be legitimate by some EU Member States that their courts publish rulings that include the full names of the parties, while courts in other Member States anonymise their rulings before publication.
    • The Internet has lowered the barrier to entry for citizen journalists, making it more difficult to precisely define a reputable news source online than in print or broadcast media.
    • It can be difficult to draw the line between significant political speech and simple political activity, e.g. in a case where a person requests removal of photos of him- or herself picketing at a rally for a politically unpopular cause.

    As previously assessed, it’s a real mess.

    Google says in the document that it has not considered sharing delisted search results with other search engines, adding, “We would note that sharing the delisted URLs without further information about the request would not enable other search engine providers to make informed decisions about removals, but sharing this information along with details or a copy of the complaint itself would raise concerns about additional disclosure and data processing.”

    For some reason, I’m reminded of that time Google accused Bing of stealing its search results.

    You can read Google’s full questionnaire responses here.

    As of July 18th, Google had received over 91,000 removal requests involving over 328,000 URLs. Earlier this week, Google announced dates for presentations to its Advisory Council, aimed at evolving the public conversation and informing ongoing strategy.

    Image via Google

  • Google Announces ‘Right To Be Forgotten’ Tour 2014

    Google has released a schedule for presentations from “experts” on the “right to be forgotten,” which will take place throughout the fall. Consider it Google’s Right to be Forgotten Tour 2014 (I hope there are t-shirts).

    The company recently announced the formation of its Advisory Council on the subject, which stems from a ruling by the Court of Justice of the European Union, saying that search engines must provide people in the EU with a means of requesting content about them be removed from search results. You can get caught up on the whole mess here, but suffice it to say, it’s been a controversial battle between privacy and censorship. Many questions and concerns remain, which is precisely why Google is holding these “in-person public consultations”.

    The schedule is as follows:

    September 9 in Madrid, Spain
    September 10 in Rome, Italy
    September 25 in Paris, France
    September 30 in Warsaw, Poland
    October 14 in Berlin, Germany
    October 16 in London, UK
    November 4 in Brussels, Belgium

    “The Council welcomes position papers, research, and surveys in addition to other comments,” says Betsy Masiello, Google Secretariat to the Council. “We accept submissions in any official EU language. Though the Council will review comments on a rolling basis throughout the fall, it may not be possible to invite authors who submit after August 11 to present evidence at the public consultations.”

    There’s a form here, for those who wish to voice their concerns and be considered for presentation.

Last week, EU regulators held a meeting with the search engines about the subject, where Google reportedly disclosed that it had removed over 50% of URLs requested, rejected over 30%, and requested additional info in 15% of cases. It had received requests from 91,000 people to remove 328,000 URLs just through July 18.

    More on Google’s Advisory Council here.

    Image via Google

  • Google Reportedly Reveals ‘Right To Be Forgotten’ Stats

    EU regulators had that meeting with the search engines about the “right to be forgotten” ordeal on Thursday, and Google did indeed participate. There had initially been some question regarding whether or not Google would be in attendance.

    The Wall Street Journal has a source apparently with direct knowledge of what was discussed in the meeting, and shares some stats Google presented, which include:

    – It has removed over 50% of URLs requested.

    – It has rejected over 30%.

    – It has requested additional info for 15% of cases.

– It has received requests from 91,000 people to remove 328,000 URLs just through July 18.

    – 17,500 requests came from France, while 16,500 came from Germany, and 12,000 came from the UK.

    There’s no word on what kind of numbers Bing is seeing, though its tool has only been available since last week.

    Yahoo was reportedly also in attendance at the meeting, but it’s still unclear what the company’s status is in relation to the law.

    Image via Google

  • Bing Joins The ‘Right To Be Forgotten’ Party

    Google has had its “right to be forgotten” request form up and running since late May. Bing has now finally followed suit with its version.

    If you haven’t been following along with the “right to be forgotten” storyline, I suggest you catch up here. It’s just too much to keep rehashing for every related article.

Bing’s tool consists of a four-part process. Users must enter their identity, residence, and contact info, state their role in society or their community, specify the pages they want blocked, and, of course, provide a signature.

    “We encourage you to provide complete and relevant information for each applicable question on this form,” Bing tells users on the page. “We will use the information that you provide to evaluate your request. We may also consider other sources of information beyond this form to verify or supplement the information you provide. This information will help us to consider the balance between your individual privacy interest and the public interest in protecting free expression and the free availability of information, consistent with European law. As a result, making a request does not guarantee that a particular search result will be blocked.”

    It continues, “Note regarding minor children: If you are a minor, you may submit this form on your own. If you are a parent or legal guardian of a minor, you may submit this form on that minor’s behalf, in which case, all references to ‘you’ or ‘your’ will refer to the minor child.”

    “Given the many questions that have been raised about how the recent ruling from the Court of Justice of the European Union should be implemented, this form and the related processes may change as additional guidance becomes available,” it concludes before getting into the form itself. “Submissions may be reevaluated over time.”

    The EU has called upon the search engines to have a meeting next week, and Bing has already confirmed that it will be in attendance. While Google and Yahoo have indicated they’ll cooperate with the EU on the matter, they didn’t immediately confirm attendance at the specific meeting.

    Yahoo has yet to introduce a tool of its own.

    Image via Bing

  • EU To Hold Meeting On ‘Right To Be Forgotten’ Next Week

    It doesn’t seem like the whole mess that is the “right to be forgotten” is going to be thoroughly sorted out anytime soon, as regulators in Europe are now taking issue with Google’s implementation of the rules it is being forced to adopt.

    The Wall Street Journal is reporting that the EU privacy officials have called a meeting for next Thursday in Brussels with the major search engines to discuss things further. According to the report, Microsoft has confirmed that it will attend, while Google and Yahoo have said they’ll cooperate with officials, but haven’t confirmed attendance for the specific meeting.

    Regulators in Germany, it says, are concerned that Google isn’t removing search results from Google.com in the same way that it is with its EU-specific sites. Likewise, the director of a French watchdog says this puts the effectiveness of the whole thing into question.

    You don’t say.

Other areas of concern include: cases that end up having the opposite effect of the right to be forgotten, as stories are written about their very involvement with this whole larger story; and the way Google is notifying publishers when their content is being hidden in search results.

It will be surprising if Google doesn’t end up attending the meeting, as it is obviously affected greatly by this whole thing, and the whole world is watching. It’s no surprise that Microsoft has confirmed its attendance, as it has been talking about implementing its version of the “right to be forgotten” feature on Bing in recent weeks, but has admitted it’s been a difficult process. In fact, some have criticized Google for complying so quickly while Bing is taking its time. Yahoo is said to be readying its own version as well, but we haven’t heard much from the company on the matter.

    It will be interesting to see what kind of progress is made next week, if any.

    Image via Google