WebProNews

Tag: autocomplete

  • Google Alters Search Algorithm Ahead of US Election

    Google is making some major changes to how its search engine operates as the US prepares for the election in November.

    Tech companies have come under fire from both sides of the aisle for alternately doing too much and not enough to combat misinformation, false claims and divisive content. Facebook famously got in major trouble over the Cambridge Analytica scandal, resulting in multiple fines and ongoing scrutiny.

    It appears Google is already taking measures to avoid any scenarios that could put it in the hot seat, by changing how its Autocomplete algorithm works in the weeks leading to the election.

    “We expanded our Autocomplete policies related to elections, and we will remove predictions that could be interpreted as claims for or against any candidate or political party,” writes Pandu Nayak, Google Fellow and Vice President, Search. “We will also remove predictions that could be interpreted as a claim about participation in the election—like statements about voting methods, requirements, or the status of voting locations—or the integrity or legitimacy of electoral processes, such as the security of the election. What this means in practice is that predictions like ‘you can vote by phone’ as well as ‘you can’t vote by phone,’ or a prediction that says ‘donate to’ any party or candidate, should not appear in Autocomplete. Whether or not a prediction appears, you can still search for whatever you’d like and find results.”

    The new policy builds on the company’s existing practice of excluding hateful and inappropriate predictions from Autocomplete. It remains to be seen whether these measures will have a noticeable impact.

  • Google: Easy Mobile Forms Are Crucial

    Google has been making a lot of moves to give mobile users a better experience. As you may know, the company recently announced a pair of ranking signals for this purpose. One that’s already in effect will return in-app results for signed-in users who have those apps installed on their devices. The other, which is coming next month, places emphasis on sites that are mobile-friendly in general.

    Google is now stressing the importance of helping mobile users fill out online forms on your website. While this may not be directly related to search, it can certainly help in the conversions department, which is probably the reason you want to rank highly in the first place.

    Back in 2012, Google launched support for the “autocomplete” attribute in Chrome, aimed at making it easier for people to fill out forms on websites. Chrome now supports it for form fields according to the current WHATWG HTML Standard so webmasters and developers can label input element fields with things like “name,” “street address,” etc., without changing the user interface or backend.

    According to the company, many webmasters have increased the rate of form completion on their sites by marking up their forms for auto-completion.

    Here’s an example of the markup for an email address field.

    <input type="email" name="customerEmail" autocomplete="email"/>

    You can see a sample form here.
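    Building on the snippet above, a fuller checkout-style form might label each field with one of the standard autocomplete tokens from the WHATWG HTML Standard (“name,” “email,” “tel,” “street-address,” “postal-code,” and so on). This is an illustrative sketch – the name attributes are hypothetical placeholders, while the autocomplete values are standard tokens:

```html
<form method="post" action="/checkout">
  <!-- Standard autocomplete tokens let the browser fill in saved values.
       The name attributes can be whatever the backend already expects. -->
  <input type="text"  name="customerName"   autocomplete="name">
  <input type="email" name="customerEmail"  autocomplete="email">
  <input type="tel"   name="customerPhone"  autocomplete="tel">
  <input type="text"  name="shippingStreet" autocomplete="street-address">
  <input type="text"  name="shippingZip"    autocomplete="postal-code">
  <button type="submit">Submit</button>
</form>
```

    Because only the autocomplete attribute is added, a form can be marked up this way without changing its user interface or backend at all.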

    “A lot of websites rely on forms for important goals completion, such as completing a transaction on a shopping site or registering on a news site,” Google says in a blog post. “For many users, online forms mean repeatedly typing common information like their names, emails, phone numbers or addresses, on different sites across the web. In addition to being tedious, this task is also error-prone, which can lead many users to abandon the flow entirely. In a world where users browse the internet using their mobile devices more than their laptops or desktops, having forms that are easy and quick to fill out is crucial!”

    Google put out a video about the “autocompletetype” attribute a couple of years ago. In it, Matt Cutts explained how auto-completing forms can help users “whisk right through your form” and do what you want them to do.

    “Making websites friendly and easy to browse for users on mobile devices is very important,” Google says in the blog post. “We hope to see many forms marked up with the ‘autocomplete’ attribute in the future.”

    Here’s a look at Google’s recommended input name and autocomplete attribute values:

    “This is important,” said Google’s Gary Illyes in a Google+ post. “Not because it will give you higher rankings, but because filling out forms on mobile is a nightmare without autocomplete.”

    He also said he was not hinting that this is part of the mobile search ranking algorithm.

    For more on getting your site mobile-friendly, read this.

    Images via Google

  • Google Autocomplete Misogyny Serves as the Focal Point for These Powerful Ads

    Google autocomplete, for all its various detractors, is simply a reflection of the people. Google’s not picking what it suggests to you when you start typing a query – you are. Well, maybe not you entirely. Google’s autocomplete suggestions are based on a complex algorithm that takes into account the popularity of specific search terms, your own personal search history, and more. Basically, if you see some odd or downright repulsive suggestions in Google autocomplete, it’s because humanity sucks.

    UN Women, the United Nations organization dedicated to gender equality and the empowerment of women, has created some incredible new ads using Google autocomplete as the backdrop.

    Have you ever typed in “women cannot” or “women shouldn’t” into Google and looked at what comes up? Obviously it’s going to be different for each person depending on where they are in the world – but for the most part the suggestions are full of misogyny. Seriously. Try it.

    Yeah.

    To UN Women, these autocomplete suggestions show that there’s still a lot of work to do when it comes to promoting gender equality. Check out the ads:

    If you were wondering, here’s what men should do:

    Images via UN Women, AdWeek

  • Google Must Remove Defamatory Autocomplete Suggestions Says German Court

    Google’s autocomplete results are not suggestions straight from the brains of Googlers, pecking away at their keyboards. When you type something and Google attempts to finish your thought for you, it’s simply throwing up the most popular searches for that string of words. It’s an algorithm, not a manual determination – everything that appears has been previously typed by another Google user.

    But that hasn’t stopped plenty of people from going after Google when they don’t like what they see appearing next to their names or businesses. And sometimes successfully, I might add. The latest case to spring from a disputed autocomplete result comes from Germany and is bad news for Google.

    A German court has ruled that Google must manually remove autocomplete results if they are determined to be defamatory. This broad ruling could affect not only cases in Germany, but also cases in other countries, which could use the decision as a model.

    As the AP reports, the case stems from an unidentified plaintiff, only known as “R.S,” whose company sells nutritional supplements. R.S. filed a complaint when they saw that Google autocomplete results associated the name of the company with “fraud” and “Scientology” – both of which they considered defamatory.

    A lower court dismissed R.S.’s claim, but the Federal Court of Justice overruled it. According to the ruling, Google isn’t being directed to turn off autocomplete or even interfere preemptively; it is only required to eliminate defamatory autocomplete suggestions when they are brought to the company’s attention.

    This isn’t the only case in Germany involving Google’s autocomplete to make headlines. Last year, former German First Lady Bettina Wulff claimed that Google destroyed her reputation with its autocomplete suggestions. Wulff, who has battled rumors that she worked as an escort prior to marrying former German president Christian Wulff, has seen her name associated with “escort” and “prostitute” in multiple languages in Google autocomplete.

    Of course, those suggestions only exist because of the high volume of Google searches. But this new ruling could affect that case, which is still pending.

    In April, Google lost a case in Japan over its autocomplete function. A man sued Google over suggestions relating to criminal activity – activity he denied. A Japanese court ruled that Google must alter its results, and it also ordered Google to pay 300,000 yen (roughly $3,100).

    Google has also faced autocomplete complaints in France.

  • The Human Condition Tracked via Google Autocomplete Is Sad, Beautiful, Sex-Obsessed

    There is something poignant and unnervingly beautiful about this. It’s also incredibly depressing in a way. Google’s autocomplete feature uses algorithms to suggest queries based on their popularity with other users, so when Google is suggesting something to you, just know that a whole hell of a lot of people have searched it before you.

    And that’s what makes this so…just…yeah:

    “Using billions of searches, Google has prototyped an anonymous profile of its users. This reflects the fears, inquiries, preoccupations, obsessions and fixations of the human being at a certain age and our evolution through life,” says creator Marius B. Well said.

    And just FYI, he made this video in an incognito window with no user signed in, no cookies, and with a deleted history. This is pretty much the human condition, visualized via Google autocomplete. Life really is all about sex.

    [Marius B via reddit]

  • Bing Suggests You Search for ‘Sex Games for Kids’ and a Bunch of Other Questionable Queries

    Updated with comment from Microsoft below.

    Like Google, Yahoo, and most other search engines, Bing offers to autocomplete queries in their search box. While Google calls this feature “autocomplete,” Bing calls it “search suggestions.”

    Well, it looks like Bing is suggesting that you search for some pretty disturbing stuff.

    I was pointed in the right direction thanks to a reddit post. “Why don’t you have a seat over there, Bing,” it read, referencing Dateline NBC host Chris Hansen’s famous line on the show To Catch a Predator.

    Ok, I’ll bite. Here’s what Bing’s search suggestions suggest:

    Say what? It appears that Bing is suggesting that I search for “sex games for kids,” and “sex games for kids in bed” and “sex games online for children.” Hm, ok then.

    Digging a little deeper with the questionable queries produced similar results. For instance, here’s what Bing suggests when you search for “sex kids”:

    And here’s a Bing search for “sex child…”:

    Even worse, here are Bing’s suggestions for what I’m sure is one of their (and any search engine’s) most popular single-word queries, “sex”:

    Damnit, Bing.

    Also, changing your SafeSearch setting to strict does nothing to eliminate these results. When you think about it, it would probably look even worse for Bing if it did, because that would imply Bing felt a suggestion like “sex games for kids in bed” was appropriate at the moderate SafeSearch level.

    “Still seeing inappropriate content? SafeSearch uses advanced technology to filter adult content, but it won’t catch everything. If SafeSearch is set to Strict or Moderate and you’re seeing adult content, tell us about it so we can filter it in the future,” says Bing.

    But this isn’t a SafeSearch problem, this is a search suggestions problem. You can turn search suggestions off in your settings, but by default they are on. That means that the average person who pulls up bing.com and searches for “sex g…” sees these questionable suggestions.

    Now, I guess the next big question is whether or not Bing has a responsibility to filter out these search suggestions.

    On one hand you could make the argument that Bing doesn’t have to manually edit which search suggestions it gives for particular queries. The suggestions are clearly based upon popular and recent searches from the Bing community – and if that’s what they’re searching for then hey – let it be.

    On the other hand, Google limits its autocomplete results. Here’s what you’ll see when you search “sex games” on Google:

    And here’s what you see when you search “sex kid”:

    As you know, Google also censors other questionable searches. They won’t give you suggestions for sexual terms like “boobs” or “pussy,” and they won’t even display curse words like “fuck” or “shit” in autocomplete results.

    They also censor any search that has to do with the illegal downloading of copyright protected content. For instance, “game of thrones torrent” won’t autocomplete.

    Over on Bing, it’s a totally different story:

    Bing doesn’t really filter any of the types of searches that Google does. Last year, we pointed out that Bing was suggesting painless ways to kill yourself while Google was displaying the suicide prevention hotline. So, if Bing is going with a true hands-off approach to search suggestion censoring, what’s different about queries about sex games for kids?

    Well, it’s the “c’mon, dude” argument I guess. As in, Bing…c’mon dude. It doesn’t help that instead of “autocomplete,” Bing’s version of the technology is called “search suggestions.” So, when you think about it, Bing is suggesting that you search for “sex kids movies” and “sex games with kids in bed.”

    C’mon, dude.

    I’ve reached out to Bing for comment and will update when I hear back.

    UPDATE:

    As you know, Facebook partners with Bing for their search results. And you can find the same questionable suggestions inside Graph Search results:

    UPDATE 2: A Microsoft spokesperson has given me this:

    “We’re reviewing the guidelines for search suggestions related to this type of query.”

    We’ll have to see if anything changes.

  • Google Loses Lawsuit Over Autocomplete in Japan

    A Tokyo District Court has ruled that Google must alter its autocomplete results to make sure they don’t suggest criminal activity when users search for a specific man’s name.

    This case began in March of 2012, when a Japanese court demanded that Google delete certain search terms from its autocomplete function – ones relating to a specific man whose identity is still being withheld. The man claimed that when his name was searched, suggestions popped up linking him to criminal activity of which he was innocent. Clicking through to the links provided led users to websites filled with further defamation.

    The plaintiff alleged not only that Google’s autocomplete results caused him pain and personal anguish, but also that they contributed to him losing his job and being unable to procure another.

    Now, the court has ruled that Google must alter its results in the case of this anonymous man. It also ordered Google to pay 300,000 yen ($3,100) for the man’s pain and suffering – but not for the job loss, as he couldn’t prove that the two were definitively linked.

    Well, it’s another day, another foreign court making a ruling on Google autocomplete. We’ve seen plenty of this in the past. Back in January of 2012, Google chose to pay a fine issued by a French court over the company’s autocomplete results. A local insurance company complained that Google autocomplete associated its name with the term “escroc,” which roughly translates to “crook” or “swindler.”

    Later in the year, Google made another deal in a French case, this time involving autocomplete results that labeled certain high-profile celebrities and politicians as “Jewish.” The complaint was originally filed by French anti-racism groups.

    Google has also been in trouble in Germany and Italy over their autocomplete results.

    Of course, Google’s autocomplete results stem from an algorithm that is based on prior searches. Google does not manually select which terms pop up when you type in any query.

    “Autocomplete is a feature of Google search that offers predicted searches to help you more quickly find what you’re looking for. These searches are produced by a number of factors including the popularity of search terms. Google does not determine these terms manually–all of the queries shown in Autocomplete have been typed previously by other Google users,” says Google.

    But that hasn’t stopped courts from ordering that Google manually intervene in certain circumstances.

    Since Google Search isn’t based in Japan, Google isn’t required to follow this ruling – just as it wasn’t required to follow the previous injunction the court issued in the case (and it didn’t). The ruling, however, can be appealed.

  • Gmail Rolls Out Past Search Autocomplete, Contact Thumbnails

    Google is shipping an update to Gmail that should make it easier to locate that email that you just searched for the other day.

    Starting today, Google begins the global rollout of new autocomplete predictions for your past Gmail searches.

    “If you’ve searched your email for ‘supercalifragilisticexpialidocious’ or other lengthy phrases, it just got easier to find what you’re looking for. Autocomplete predictions in Gmail may now include your past Gmail searches,” says Google.

    Also rolling out – contact thumbnails in Gmail search:

    Google says that the global rollout of both of these features will take a few days and it even includes Google Apps for Business customers.

    We think that these updates will probably go over a little better than the last update to hit Gmail. Last week, Google began pushing the new compose box to all users, and there was an audible groan from a good portion of the Gmail-using population.

  • Bing Adds Some ‘Ghosting’ To Autosuggest

    Bing announced that it has made a change to its Autosuggest feature, which it says makes the search experience faster by completing your query when it’s “confident” it “really knows” what you’re looking for.

    Bing refers to its latest development as “Autosuggest Ghosting”.

    “Autosuggest algorithms are able to determine just how likely it is that you want the #1 suggestion with various degrees of confidence,” explains Dan Marantz, Senior Program Manager Lead, Bing Experiences and Query Formulation Team. “This confidence is highest in the two major patterns: Navigation and Search History. Ghosting is a way to pre-populate the query most likely to be used in the search box (blue selected-text style below) in an effort to speed up the time it takes to express your intent and get to your destination. This has seen to help users speed up by over 16%.”

    “The design challenge was to focus on simplicity and intuitiveness. The interaction should feel natural and instinctive when you need it, and easy to work around when you don’t want it,” he says. “The simplest solution is to grey-in (or “ghost”) the high-confidence suggestive text and hope you notice. The problem then becomes – how do you accept the suggestion vs ignore it?”

    Naturally, he takes a dig at Google.

    “Google’s model complicates this by not being clear about what happens when you hit <enter> to submit the query,” he says. “Will the search be for ‘bed’ or ‘bed bath and beyond’? Turns out the query is only ‘bed’ and you need to press <tab> or <down> to select the full query.”

    The Bing philosophy, he says, is not grounded in applying already-learned interaction models.

    Users can press Enter to accept a suggestion, continue to type through it with something else, or press Delete/Backspace to remove the suggested text.
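    Bing hasn’t published the algorithm, but the behavior Marantz describes can be sketched as a simple confidence gate: pre-populate the remainder of the top suggestion only when its confidence clears some threshold. The function name, data shape, and 0.8 threshold below are our own hypothetical illustration, not Bing’s implementation:

```javascript
// Hypothetical sketch of Autosuggest "ghosting".
// `suggestions` is a ranked array of { query, confidence } objects.
function ghostCompletion(prefix, suggestions, threshold = 0.8) {
  if (!prefix || suggestions.length === 0) return "";
  const top = suggestions[0];
  const isConfident = top.confidence >= threshold;
  const extendsPrefix = top.query.startsWith(prefix) && top.query !== prefix;
  // Return only the greyed-in remainder; the UI displays it after the
  // user's typed prefix, and returns "" when nothing should be ghosted.
  return isConfident && extendsPrefix ? top.query.slice(prefix.length) : "";
}
```

    In this sketch, pressing Enter would submit the prefix plus the ghosted remainder, while continuing to type simply replaces the greyed-in text – matching the “easy to work around when you don’t want it” goal described above.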

  • Does Google the Link Lister Equal Google the Publisher?

    Is Google a publisher? Or is Google simply a displayer of links? Are these two things the same?

    Those questions are at the heart of an Australian case that just tipped against Google, and are likely at the heart of many cases to come. An Australian high court has found Google liable for libelous content tying a man to organized crime. Of course, Google didn’t create the article that made the references; it simply provided a link to it within its search results.

    The man’s name is Milorad Trkulja, and he claimed that Google defamed him by associating his name and image with (untrue) claims of ties to organized crime, both in regular search results and in Google Image search. The jury in the case found Google liable, and therefore responsible for the content it links to. Google has been ordered to pay $200,000, but is in the process of appealing the ruling (as you would expect).

    Is Google responsible for the content that is found using their search engine? Or is this a ridiculous claim to make? Let us know in the comments.

    Here’s what the Judge in the case had to say:

    The question of whether or not Google Inc was a publisher is a matter of mixed fact and law. In my view, it was open to the jury to find the facts in this proceeding in such a way as to entitle the jury to conclude that Google Inc was a publisher even before it had any notice from anybody acting on behalf of the plaintiff. The jury were entitled to conclude that Google Inc intended to publish the material that its automated systems produced, because that was what they were designed to do upon a search request being typed into one of Google Inc’s search products. In that sense, Google Inc is like the newsagent that sells a newspaper containing a defamatory article. While there might be no specific intention to publish defamatory material, there is a relevant intention by the newsagent to publish the newspaper for the purposes of the law of defamation.

    Basically, Google may not want to publish it, but they are publishing the publishers. And since Google’s algorithms are tooled to find said content, they are responsible. Or at least it is plausible that a jury could see it that way. The Judge is clearly unconvinced that this stance is set in stone.

    The Judge also differentiated search results pages from Google Image searches. The plaintiff also complained of images tying him to crime figures. The Judge notes that a Google Image search is a more sophisticated version of cut-and-paste from magazines and, importantly, a Google-created page:

    As was pointed out by counsel for the plaintiff in his address to the jury, the first page of the images matter (containing the photographs I have referred to and each named “Michael Trkulja” and each with a caption “melbournecrime”) was a page not published by any person other than Google Inc. It was a page of Google Inc’s creation – put together as a result of the Google Inc search engine working as it was intended to work by those who wrote the relevant computer programs. It was a cut and paste creation (if somewhat more sophisticated than one involving cutting word or phrases from a newspaper and gluing them onto a piece of paper). If Google Inc’s submission was to be accepted then, while this page might on one view be the natural and probable consequence of the material published on the source page from which it is derived, there would be no actual original publisher of this page.

    You can see just how much of a charlie-foxtrot this is. Which pages are Google’s creation, and which are simply the “consequence of the material published on the source page from which it is derived?”

    The jury concluded that Google was a publisher, and was liable for the defamatory content even if it hadn’t yet been notified of it. Although Google contended that it doesn’t matter whether it was notified of the content or not – it’s not responsible either way – the Judge rejected that notion as well.

    It follows that, in my view, it was open to the jury to conclude that Google Inc was a publisher – even if it did not have notice of the content of the material about which complaint was made. Google Inc’s submission to the contrary must be rejected. However, Google Inc goes further and asserts that even with notice, it is not capable of being liable as a publisher “because no proper inference about Google Inc adopting or accepting responsibility complained of can ever be drawn from Google Inc’s conduct in operating a search engine”.

    This submission must also be rejected. The question is whether, after relevant notice, the failure of an entity with the power to stop publication and which fails to stop publication after a reasonable time, is capable of leading to an inference that that entity consents to the publication. Such an inference is clearly capable of being drawn in the right circumstances (including the circumstances of this case). Further, if that inference is drawn then the trier of fact is entitled (but not bound) to conclude that the relevant entity is a publisher.[42] Google Inc’s submission on this issue must be rejected for a number of reasons, the least of which is that it understates the ways in which a person may be held liable as a publisher.

    Of course, $200,000 to Google is basically nothing. The appeal really has nothing to do with the monetary damages. Google knows that this kind of decision sets an unsettling precedent for their future defenses in similar cases. Google as “automated news agent that’s responsible for what their algorithms pull out of the depths” is a view of Google that the company can’t afford to have stick.

    We’ve seen this story play out numerous times over the past couple of years with Google’s autocomplete feature. In August of 2011, Google lost a case in Italy and was forced to remove an autocomplete suggestion in its search box that tied a man to the word “truffatore,” meaning con man. A few months later, Google was fined $65,000 because one of its autocomplete suggestions labeled a French man an “escroc,” meaning crook.

    And this year, Google made an out-of-court settlement with French anti-discrimination groups over a “Jewish” autocomplete suggestion.

    Google’s argument in these cases is similar to the argument in the Australian case. We’re not suggesting anything. We’re not defaming anyone. Google’s autocomplete suggestions are based on popularity of terms. That means that if anything, Google users are the ones linking people’s names with unsavory terms. Google’s search results are also based on an algorithm. Just ask Rick Santorum about how much responsibility Google claims in what people find using its search engine.

    So, is Google a publisher? If not, what are they, exactly? How much responsibility do you think Google has for what people find using their search engine? Tell us what you think in the comments.

  • Google Makes A Bunch Of Changes To Autocomplete

    Google released a big list of 65 changes it has made over the course of August and September, and quite a few of them were tweaks to its autocomplete feature.

    This is all part of Google’s goal of getting you to what you’re looking for more quickly, and in fewer steps, something the search engine has made tremendous strides on over the years.

    The following 10 changes deal specifically with autocomplete:

    • #83197. [project “Autocomplete”] This launch introduced changes in the way we generate query predictions for Autocomplete.
    • essence. [project “Autocomplete”] This change introduced entity predictions in autocomplete. Now Google will predict not just the string of text you might be looking for, but the actual real-world thing. Clarifying text will appear in the drop-down box to help you disambiguate your search.
    • #84259. [project “Autocomplete”] This change tweaked the display of real-world entities in autocomplete to reduce repetitiveness. With this change, we don’t show the entity name (displayed to the right of the dash) when it’s fully contained in the query.
    • TSSPC. [project “Spelling”] This change used spelling algorithms to improve the relevance of long-tail autocomplete predictions.
    • Dot. [project “Autocomplete”] We improved cursor-aware predictions in Chinese, Japanese and Korean languages. Suppose you’re searching for “restaurants” and then decide you want “Italian restaurants.” With cursor-aware predictions, once you put your cursor back to the beginning of the search box and start typing “I,” the prediction system will make predictions for “Italian,” not completions of “Irestaurants.”
    • #84288. [project “Autocomplete”] This change made improvements to show more fresh predictions in autocomplete for Korean.
    • espd. [project “Autocomplete”] This change provided entities in autocomplete that are more likely to be relevant to the user’s country. See blog post for background.
    • #83391. [project “Answers”] This change internationalized and improved the precision of the symptoms search feature.
    • #82876. [project “Autocomplete”] We updated autocomplete predictions when predicted queries share the same last word.
    • #80435. [project “Autocomplete”] This change improves autocomplete predictions based on the user’s Web History (for signed-in users).

    Last month, Google Autocomplete stopped excluding the term “bisexual,” attracting some headlines for the feature – probably the most positive headlines the feature has seen in recent memory, given that they didn’t involve Google getting in trouble for making controversial suggestions about specific people.

  • Bing Beats Google, If You’re Looking to Kill Yourself [UPDATED]

    UPDATE: I’ve received the following statement from Bing:

    “In some cases we do prioritize the hotline and we’re reviewing the guidelines for instant answers related to this type of query,” says Stefan Weitz, Senior Director, Bing.

    ORIGINAL ARTICLE: Earlier this month, Bing launched its “Bing It On” challenge – a blind comparison test designed to see whether users would prefer Bing’s search results to Google’s when they didn’t know which was which. Bing said that in these blind tests, internet users chose Bing over Google by a 2-to-1 margin. The campaign has its detractors, as some pointed out that Bing was excluding features like Google’s Knowledge Graph from the challenge results. It could be argued that Knowledge Graph is one of the things people really love about Google nowadays, so that wasn’t exactly fair. And Google’s Matt Cutts pointed out a pretty big fail within Bing’s search results.

    Oh well, all of that is beside the point, except to frame the background for this: Bing totally bests Google’s search results, if your search queries involve suicide.

    Check out this comparison of Google and Bing results for the search “how to commit suicide,” courtesy of redditor naidlm:

    As you can see, Google inserts the contact information for the National Suicide Prevention Hotline at the top, before any search results. Also, Google’s autocomplete will not suggest the full phrase at any point while typing the query. Bing, on the other hand, autocompletes it for you and lists a bunch of related searches on the right.

    Additionally, “how to commit suicide” searches on both Yahoo and Ask.com display the number for the National Suicide Prevention Hotline above all results. Neither Yahoo nor Ask.com will suggest the complete phrase at any point.

    Then, there is the response to other phrases like “how to kill…” Bing autocompletes it with “yourself” and “yourself painlessly” while Google simply suggests bugs.

    To be fair, neither Google nor Bing can imagine all possible suicide-related queries and plan for them. Searches for “how to slit my wrists” appear similar on both sites. And both sites freely autocomplete phrases about committing various acts of homicide. No “get help before you kill mama” warning from Google.

    But searches for the phrases “I want to kill myself,” “I want to die,” “how to die,” “suicide” and “how to commit suicide” bring up the number for the National Suicide Prevention Hotline as a feature in Google search. No such luck with Bing.

    We’re not suggesting that Bing should nerf its autocomplete results. Plenty of people take issue with just how many words appear on that Google autocomplete blacklist. But it wouldn’t be too hard to throw a phone number for a hotline at the top of the results when someone searches one of these loaded phrases. Google added the feature through a partnership with Samaritans back in 2010. Maybe the organization has yet to push Bing to do the same?

    But it’s hard to argue that Bing wins this round of the search battle simply by giving users exactly what they want – even if what they want is information on how to end their own lives.

  • Google Autocomplete No Longer Excludes “Bisexual”

    There are thousands of words that Google has “blacklisted,” meaning they won’t trigger any suggestions within Google’s autocomplete feature. If you don’t quite understand what I’m talking about, head on over to Google and type in “football.” Before your fingers even hit the letter “t,” there’s a good chance that “football” or “football score” or something similar is dropped down as a suggestion.

    Now search for “porn.” Nothing, right?

    In some instances, Google does this to protect copyright. For instance, searchers of the word “torrent” will find a dearth of autocomplete suggestions. Just recently, Google added The Pirate Bay to its list of blacklisted search terms for autocomplete. In some cases, words like “amateur,” “porn,” “boobs,” and other related terms are blacklisted to…really I don’t know. To protect Google users from *gasp* pornography, I guess.

    One of the words that failed to produce any autocomplete suggestions was “bisexual.” Notice the past tense here. That’s because a bisexual advocacy group is claiming to have won the battle and gotten “bisexual” off the blacklist.

    “Since late 2009, Google has had “bisexual” on a list of banned words; such words were de-prioritized by the Google search algorithm, leading to a drop in search rankings for all bisexual organizations and community resources. Since its search engine would not auto-suggest or auto-complete any term with the word “bisexual”, Google made it harder for any user to find bisexual content, whether that be on coming out as bisexual or finding local support groups across the United States and elsewhere,” explains BiNet, a longtime bisexual advocacy organization.

    They are claiming victory in an effort to change that. The group’s head, Faith Cheltenham, had this to say on her personal blog:

    Google search results WILL vary by user, one user has already reported seeing “bisexual” when typing in “bi” while other users don’t even see “bisexual quotes” when typing in “bisexual q”. The block was lifted on August 21st as far as some VERY DEDICATED volunteers can tell. It was on that date that “bisexual q” started producing “bisexual quotes”. Just a few weeks later, I get these results when typing in “bisexual q”.

    For me, typing “bise” produces a suggestion for “bisexual quotes.” I’m not seeing any other suggestions for “bisexual” on its own or anything else, however – but Google’s instant suggestions are different for everybody.

    What’s particularly odd about this is that for years, Google didn’t block autocomplete results for words like “heterosexual,” “homosexual,” “asexual,” and even “trisexual.”

    “We thank Google for making the right call here and for acting as a responsive corporate citizen committed to dignity and equality,” said Kate Kendell, Director of the National Center for Lesbian Rights.

    It’s important to note that Google has always been one of the most vocal companies in their support of LGBT rights.

    [via Slate]

  • Google in Trouble (Again) over Autocomplete

    Google finds itself embroiled in another legal case stemming from their autocomplete feature in search. This time, it’s former German First Lady Bettina Wulff, who claims that Google has defamed her and “destroyed her reputation” with its instant search.

    Wulff, the wife of former German President Christian Wulff, has battled persistent rumors that she worked as an escort before the two met. The 38-year-old has denied the rumors, but of course that usually has no bearing on whether or not they continue to exist online.

    In this case, a Google search for her name does yield two autocomplete results consistent with the rumors. “Bettina Wulff escort” and “Bettina Wulff prostituierte” show up in multiple languages.

    Here’s what Google suggests when you perform a search for “Bettina Wulff” on Google’s German site:

    And the same autocomplete results appear when searching on Google’s English site:

    This definitely is not the first time that Google has found itself under fire for its autocomplete results. In June, Google settled out of court with French anti-discrimination groups over the charge that Google autocomplete was labeling certain celebrities as “Jewish.” Even if you or I don’t feel that being labeled “Jewish” is discriminatory, some groups do, and they accused the search giant of “creating probably the greatest Jewish history file ever.”

    Back in December of 2011, Google was forced to pay a $65,000 fine because one of its autocomplete suggestions labeled a French insurance company an “escroc,” meaning “crook.”

    Back in April of 2011, Google lost a case in Italy and was forced to manually intervene and eliminate autocomplete suggestions that labeled one man a “truffatore” and a “truffa” (con man and fraud).

    Of course, Google is not suggesting that Wulff is a prostitute, or the French President is Jewish. Google searchers are. Google’s autocomplete is based on algorithms that factor in popularity of certain searches:

    “As you type, Google’s algorithm predicts and displays search queries based on other users’ search activities and the contents of web pages indexed by Google. If you’re signed in to your Google Account and have Web History enabled, you might also see search queries from relevant searches that you’ve done in the past.

    Predicted queries are algorithmically determined based on a number of purely algorithmic factors (including popularity of search terms) without human intervention. The autocomplete data is updated frequently to offer fresh and rising search queries,” they say on their support page.

    Google will, from time to time, intervene and alter autocomplete results, mainly in cases involving “pornography, violence, hate speech, and copyright infringement.” Today, for instance, we learned that Google is now censoring suggestions for The Pirate Bay (a torrent site) in autocomplete.

    Google’s autocomplete results are simply expressions of current searches around the world (plus a little bit of your own personal search history). Google doesn’t just make this stuff up. Although recent decisions would suggest that authorities in some countries feel the company has a duty to manually intervene in cases where reputation is on the line.
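    As Google describes it, predictions are ranked by factors like popularity and then filtered against a manual exclusion list. A minimal sketch of that popularity-plus-blocklist behavior – using made-up queries and a hypothetical blocklist, not Google’s actual data or code – might look like this:

```typescript
// Minimal sketch of popularity-ranked autocomplete with a manual blocklist.
// The query log and blocklist below are illustrative only.
function predict(
  queryLog: string[],
  blocklist: string[],
  prefix: string,
  limit = 4,
): string[] {
  // Count how often each full query was typed.
  const counts = new Map<string, number>();
  for (const q of queryLog) counts.set(q, (counts.get(q) ?? 0) + 1);

  return Array.from(counts.entries())
    .filter(([q]) => q.startsWith(prefix))                     // prefix match
    .filter(([q]) => !blocklist.some((t) => q.includes(t)))    // manual exclusions
    .sort((a, b) => b[1] - a[1])                               // most popular first
    .slice(0, limit)
    .map(([q]) => q);
}

const log = [
  "football scores", "football scores", "football teams",
  "football scores", "pirate bay torrent",
];
console.log(predict(log, ["torrent"], "foot"));   // [ 'football scores', 'football teams' ]
console.log(predict(log, ["torrent"], "pirate")); // [] – blocklisted suggestion suppressed
```

    The key point the lawsuits turn on is visible in the sketch: nothing surfaces unless real users typed it, but the operator can still suppress specific terms after the fact.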

    [sueddeutsche.de via TechCrunch]

  • Gmail Makes Search A Little Better, Adds Features

    Google announced that a few Gmail Labs features have graduated into actual features, and that it has made some improvements to advanced search in Gmail.

    Advanced search now supports autocomplete predictions in the From: and To: fields, so you can find messages exchanged with specific people more easily.

    Google improves Gmail search

    The three Labs features that have graduated are: Refresh POP accounts, Filter import/export, and Navbar drag and drop.

    “With the graduation of Refresh POP accounts, clicking the refresh link at the top of your inbox will now not only update your inbox with your new Gmail messages, but will also fetch messages from any other POP accounts which you have set up,” Google said in a Google+ post. “From the Settings > Filters page you can download a file containing some or all of your filters or upload a file to create a set of filters all in one go. This makes it easy to share filters with friends, backup filters for later and more.”

    “Lastly, if you use gadgets on the left-hand side of Gmail, you can now rearrange them with drag n’ drop,” Google added.

    Last week, Google announced the launch of additional language support for improved search in Gmail.

  • Patent Troll Goes After Netflix And Others Over Autocomplete

    Autocomplete is pretty ubiquitous in the tech space by now. If a site has search, you can bet that they have an autocomplete feature of some sort powering the front end. That feature just got a dozen or so companies sued by the latest patent troll to climb out from underneath its bridge.

    The patent troll of the week is Data Carriers LLC, a shell company that is the epitome of the patent troll. They’re suing numerous companies over US Patent 5,388,198 for “Proactive presentation of automating features to a computer user.” To them, that translates to autocomplete, and companies like Apple, Nokia and more are the targets of their frivolous litigation.

    Gigaom is reporting that Data Carriers has expanded their patent lawsuit to even more companies now. Companies including Netflix, LinkedIn, Target, Wal-Mart and others are now included in the latest round of litigation from this particularly nasty troll.

    This latest lawsuit just further confirms the need for patent reform. It’s a problem when software like autocomplete is used as fuel in patent lawsuits. The entire Internet relies on these features, and further innovations are impeded by companies looking for a quick buck. It becomes even more complicated when large corporations create these shell companies to handle patent lawsuits all in the name of slowing innovation and draining money from those who are actually making the Internet better.

    Fortunately, the lawsuit brought by Data Carriers doesn’t seem to hold much merit. You can see for yourself in the court document below.

    Data Carriers v Netflix

  • Google Makes Deal In “Jewish” Autocomplete Case

    Back in April, Google was sued (again) over something that’s really not their fault.

    Or is it?

    Several French anti-discrimination groups, including SOS Racisme, accused Google of “creating probably the greatest Jewish history file ever,” and said that French Google users were “confronted daily with the unsolicited association” of popular figures with being “Jewish” or “a Jew.”

    And today, the AFP is reporting that Google has reached a deal with these groups under legal mediation.

    So, how exactly is Google labeling people who are not Jewish as Jewish? They aren’t really – but their autocomplete feature is. For example, French President François Hollande was one of the figures named in the proceedings as being tagged with “Jewish” by Google’s autocomplete feature.

    Of course, Google isn’t back there tinkering with their suggestions and manually pairing François Hollande with “Jewish.” Google’s autocomplete suggestions are the result of an algorithm that takes into account various data points like popular searches from across the web and personalized user activity. Here’s how Google explains it:

    As you type, Google’s algorithm predicts and displays search queries based on other users’ search activities and the contents of web pages indexed by Google. If you’re signed in to your Google Account and have Web History enabled, you might also see search queries from relevant searches that you’ve done in the past.

    Predicted queries are algorithmically determined based on a number of purely algorithmic factors (including popularity of search terms) without human intervention. The autocomplete data is updated frequently to offer fresh and rising search queries.

    Although “human intervention” is rare, it exists. Google will manually exclude certain autocomplete suggestions rooted in “pornography, violence, hate speech, and copyright infringement.” Just go to Google and type “The Dark Knight torrent” or “Kim Kardashian porn” and you’ll see this in action.

    But the French plaintiffs felt that Google was at fault for their autocomplete results. Here are the results of the deal, according to a Google France spokesman:

    “Google supports education and information against racism and anti-Semitism…together with the associations, we will develop and promote projects aimed at increasing the awareness of Internet users to values of tolerance and respect.”

    The deal is still under wraps, but it definitely involves working with the French anti-discrimination groups on “public education projects.”

    I made the case for Google when I first reported on this lawsuit in April. The autocomplete suggestions are derived from an algorithm and simply reflect a term’s popularity online. For instance, look what happens when you search for “Obama is”:

    Should Google have to go in and manually extract these suggestions? In the case of the word “Jewish,” I concede that it’s a little tricky. Labeling someone a “Jew” is still a negative in the eyes of some people, so I can see why the French activists take issue with it. But does “Jewish” necessarily equate to “hate speech?” Not on its own.

    But Google has removed autocomplete results in the past, even when they weren’t pornographic, violent, or hateful. In December of 2011, Google had to pay a $65,000 fine because an autocomplete suggestion tagged a French insurance company with the word “escroc,” meaning “crook.”

    Still, it’s unclear whether Google will manually remove the “Jewish” suggestions – just that they have reached some sort of deal. Should Google be liable for what already exists on the web? Thoughts?

  • Google Autocomplete Is… (Improved)

    Google put out its big list of algorithm changes for May, and 5 out of 39 of them are related to autocomplete predictions. From the sound of it, the predictions are getting more useful, and there’s a lesser chance that you’ll see low-quality predictions. We’ll see.

    Here are the relevant changes:

    • Autocomplete predictions used as refinements. [launch codename “Alaska”, project codename “Refinements”] When a user types a search she’ll see a number of predictions beneath the search box. After she hits “Enter”, the results page may also include related searches or “refinements”. With this change, we’re beginning to include some especially useful predictions as “Related searches” on the results page.
    • More predictions for Japanese users. [project codename “Autocomplete”] Our usability testing suggests that Japanese users prefer more autocomplete predictions than users in other locales. Because of this, we’ve expanded the number of predictions shown in Japan to as many as eight (when Instant is on).
    • Improvements to autocomplete on Mobile. [launch codename “Lookahead”, project codename “Mobile”] We made an improvement to make predictions work faster on mobile networks through more aggressive caching.
    • Fewer arbitrary predictions. [launch codename “Axis5”, project codename “Autocomplete”] This launch makes it less likely you’ll see low-quality predictions in autocomplete.
    • Improved IME in autocomplete. [launch codename “ime9”, project codename “Translation and Internationalization”] This change improves handling of input method editors (IMEs) in autocomplete, including support for caps lock and better handling of inputs based on user language.

    At times, Google’s autocomplete feature has gotten the company some unwanted attention. A few months back, for example, a Japanese court ordered Google to delete specific terms from Autocomplete. It’s interesting that they’re now offering more predictions for Japanese users.

    At the time, a Google spokesperson told us, “Autocomplete is a feature of Google search that offers predicted searches to help you more quickly find what you’re looking for. These searches are produced by a number of factors including the popularity of search terms. Google does not determine these terms manually–all of the queries shown in Autocomplete have been typed previously by other Google users.”

    Since then, we’ve seen the company sued over “Jewish” autocomplete suggestions.

    According to Google’s new and improved Autocomplete, Google Autocomplete is funny (as you can see from the image above).

    For some fun with the feature, check out: The 2012 Presidential Election, As Told By Google Autocomplete.

  • Google Places Autocomplete Removes The Tedium Of Address Entry Pages

    I have my disagreements with Google, but we’re on the level more often than not. You see, Google agrees with me that address entry pages on the Internet are stupid. They’re slow, tedious and only serve to slow me down as I’m rushing to buy some awesome new toy or t-shirt on one of those deal-a-day Web sites. Surely there’s a better way besides just using one of the many add-ons available for Firefox and Chrome that enter your address each time you get to a page like that.

    It turns out that Google has built the solution right into Google Places. During a Hangout last week, the Google Places team demoed autocomplete for Google Places. It brings the autocomplete that Google Search is known for to the Places API. While it brings up an address as a single line of text by default, you can put the address into a structured format for things like address entry pages by using either the Autocomplete.getPlace() method of the Maps API or the simpler Places API Details service.
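    That structured-format step can be sketched in a few lines. The helper below uses hypothetical names, with the shape of the input mirroring the documented address_components field of a Places result; in a real page the object would come from autocomplete.getPlace():

```typescript
// Sketch: flattening a getPlace()-style result into form-ready fields.
// The component shape mirrors the documented address_components format.
interface AddressComponent {
  long_name: string;
  short_name: string;
  types: string[];
}

interface PlaceLike {
  address_components?: AddressComponent[];
}

function structuredAddress(place: PlaceLike): Record<string, string> {
  const fields: Record<string, string> = {};
  for (const c of place.address_components ?? []) {
    if (c.types.includes("locality")) fields.city = c.long_name;
    if (c.types.includes("postal_code")) fields.zip = c.long_name;
    if (c.types.includes("country")) fields.country = c.short_name;
  }
  return fields;
}

// Hypothetical result for a US address:
const place: PlaceLike = {
  address_components: [
    { long_name: "Lexington", short_name: "Lexington", types: ["locality", "political"] },
    { long_name: "40506", short_name: "40506", types: ["postal_code"] },
    { long_name: "United States", short_name: "US", types: ["country", "political"] },
  ],
};
console.log(structuredAddress(place)); // logs the extracted city, zip, and country
```

    From there, each extracted field can be copied into the matching input of an address entry form instead of making the user type it all out.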

    If you run a Web site that features a lovely address entry form, you might want to consider using the Google Places Autocomplete feature. You can also tune it to favor certain areas in autocomplete if you’re expecting to get business from mostly one area or country.

    Besides making address entry pages bearable, Google announced two new changes to Google Places Autocomplete that should make it better for administrators and consumers. The first change is country filtering so that you can restrict autocompletes to certain countries. If you’re certain that only U.S. consumers will order from your site, then you can make sure that autocomplete will always go with U.S. addresses.

    The other filter is for city and region types. This is for those who are searching for more detailed places in autocomplete besides just states or larger cities. You can now search via autocomplete using ZIP codes. This should make the Places autocomplete even more fine-tuned to find the place you’re looking for.

    If you want to start using autocomplete in your Google Places application, you can get the latest places library for the Google Maps API here. If you’re building a native Places app, then you can also take advantage of the Google Places API.

    Google also offers a demo on the blog post that lets you try out autocomplete in Google Places. The demo has you searching for hotels, which, funnily enough, brought me results for the Kappa Omega sorority in Lexington, KY. I guess they could be renting out the house for the summer.

    On a final note, you can use Places autocomplete on any text entry field, including address entry pages. All they ask is that you put the little “Powered by Google” logo under the text field that uses autocomplete.

  • The 2012 Presidential Election, As Told By Google Autocomplete

    With Newt Gingrich dropping out of the GOP primary race today, the inevitable has happened. Mitt Romney will be your Republican candidate for President in 2012. The Romney/Obama battle has already begun with hard-hitting attacks about each candidate’s dealings with dogs – so you know we’re in for a treat.

    Earlier this week we told you that Google was once again in hot water over its autocomplete feature. A French anti-discrimination group sued the company for search suggestions that labeled many prominent public figures as “Jew” or “Jewish.”

    With that on my mind, I thought we should take some time, about six months out, to look at the race through the eyes of Google autocomplete. You know the feature – it’s when you start typing your query and Google attempts to finish it for you. Since Google’s predictive search feature is based on an algorithm that factors in popular searches from other users as well as the content of all the pages it indexes, you can almost say that Google autocomplete is a rudimentary pulse of the collective internet.

    With the exception of very specific circumstances like hate speech and violence, Google does not manually interfere with its autocomplete suggestions. So most of what you’re about to see is based on the searches of real people.

    Having said that, let’s tell the story of the upcoming 2012 election from the strange eyes of Google search.

    First, the collective feelings on Newt Gingrich

    Newt Gingrich should

    Now, on to our GOP challenger Mitt Romney

    What about the incumbent, President Obama?

    Obama is

    Forget the actual candidates, what about Google searchers’ queries about the two parties?

    First, the Republicans

    Republicans are

    Now to the Democrats

    Democrats are

    Not a lot of choice there. What about a search for “This election is…”

    This election is

    Maybe that last one has something to do with this next query about our government…

    our government is

    I guess there’s only one logical conclusion…

    America is

    What a bunch of optimists! I guess it was fun while it lasted, guys.

  • Google Sued Over “Jewish” Autocomplete Suggestions

    If you’ve spent any time on humor sites, forums, or user-submitted content aggregators like reddit, you have probably seen Google’s autocomplete search feature used as a tool for discovering the sometimes fascinating, sometimes downright odd, and oftentimes frightening collective queries of the internet population. If you want to see this in action, just go to Google and type “Why can’t I” or “Should you” or “British people are.” You’ll see that people are actively searching some pretty weird stuff.

    While autocomplete can produce this decidedly comedic result, it’s not a laughing matter for some who have accused the feature of having untold reputation-ruining powers. Today, Google is being sued over their autocomplete feature, and it’s definitely not the first time the company has faced these allegations.

    The newest lawsuit comes from France, where anti-discrimination group SOS Racisme has accused Google of the “creation of what is probably the greatest Jewish history file” ever.

    French site La Cote reports (translated from the French):

    “Numerous users of the leading search engine in France and the world are confronted daily with the unsolicited and almost systematic association of the term ‘Jew’ with the names of the most prominent figures in the worlds of politics, media, and business,” deplore these organizations.

    The claim is that Google’s autocomplete feature is mislabeling celebrities, politicians, and other high-profile people by suggesting “Jew” or “Jewish” next to their names in possible search queries. These celebs include News Corp’s Rupert Murdoch and actor Jon Hamm. As you can see above, a search for “rupert m…” suggests “Rupert Murdoch jewish” as its fourth option.

    As you’re most likely well aware, Google isn’t sitting back there hand-picking these suggestions. They are the result of an algorithm that takes into account popular searches from other users as well as your own previous Google activity (if you’re logged in).

    Here’s how Google describes its autocomplete feature:

    As you type, Google’s algorithm predicts and displays search queries based on other users’ search activities and the contents of web pages indexed by Google. If you’re signed in to your Google Account and have Web History enabled, you might also see search queries from relevant searches that you’ve done in the past. In addition, Google+ profiles can sometimes appear in autocomplete when you search for a person’s name. Apart from the Google+ profiles that may appear, all of the predicted queries that are shown in the drop-down list have been typed previously by Google users or appear on the web.

    For certain queries, Google will show separate predictions for just the last few words. Below the word that you’re typing in the search box, you’ll see a smaller drop-down list containing predictions based only on the last words of your query. While each prediction shown in the drop-down list has been typed before by Google users or appears on the web, the combination of your primary text along with the completion may be unique.

    Predicted queries are algorithmically determined based on a number of purely algorithmic factors (including popularity of search terms) without human intervention. The autocomplete data is updated frequently to offer fresh and rising search queries.

    That lack of manual intervention has gotten Google in trouble in the past. Back in December of 2011, Google was ordered to pay a $65,000 fine because of an autocomplete suggestion directed toward a French insurance company called Lyonnaise de Garantie. One suggestion inserted the word “escroc,” which means “crook.” In the ruling, the court emphasized that it felt Google should exercise some human control over these autocomplete suggestions.

    Google also found themselves in trouble in Japan earlier this year after autocomplete associated a man with crimes he apparently did not commit.

    It’s important to note that Google does manually exclude some autocomplete suggestions in very limited circumstances – those having to do with “pornography, violence, hate speech, and copyright infringement.”

    Having “Jew” or “Jewish” pop up as a suggestion with some people’s names is simply a reflection of that term’s popularity on the internet. It’s no different than the second suggestion that pops up when you search “Obama is,” but a tad different from the fourth result:

    The point is, people are going to search for untrue things. Jon Hamm may not be Jewish, but apparently enough people have heard that he is and are checking it out. I’m also aware that labeling certain high-profile public figures as “Jews” is a negative in the eyes of many. But “Jew” or “Jewish” doesn’t fall into one of those categories that would demand an intervention from Google. It’s not hate speech to say someone is Jewish, even if the people searching for it might have hate on their minds.

    But as we’ve seen, Google is vulnerable to this sort of lawsuit. The word “escroc” doesn’t qualify as pornographic, violent, hate speech, or promoting copyright infringement – it simply harms a reputation. Nevertheless, Google had to pay a fine and remove it.

    Should Google really have to take action on autocomplete results? Tell us what you think in the comments.

    [Via The Hollywood Reporter]