WebProNews

Tag: safesearch

  • Google Makes Slew of Changes to Protect Minors

    Google has announced a slew of changes to its platform in an effort to afford more protections to those under 18.

    Social media companies, and tech companies in general, have been under increased scrutiny and pressure over the negative impact social media and the internet can have on young people. Instagram recently announced it would set new accounts for those under 16 to private by default.

    Mindy Brooks, Product and UX Director, Kids and Families, outlined new features and changes Google is now rolling out, including setting YouTube uploads for teens, aged 13-17, to the most private option. The company will also prominently feature videos aimed at addressing digital wellbeing and commercial consent, concepts teens sometimes struggle with.

    Google also plans to expand its SafeSearch feature, which filters out explicit content, turning it on for the accounts of teens under 18, and making it the default mode for all new accounts created by teens. The company is making similar efforts to ensure mature content doesn’t surface when a child uses Google Assistant on a shared device.

    Location History will also receive some changes. As it stands now, the feature cannot be turned on for children with supervised accounts, and Google will extend that restriction to all users under 18 globally.

    Google’s new safety section in the Play Store will give parents more details regarding apps, letting them know which ones follow the company’s Families policies. Similarly, Google Workspace for Education will receive a number of changes to make it easier for administrators to customize experiences for different age groups.

    Google will also add additional safeguards “to prevent age-sensitive ad categories from being shown to teens, and we will block ad targeting based on the age, gender, or interests of people under 18.”

    Google’s plans represent one of the most comprehensive efforts the company has made to protect teens under 18, and will hopefully be emulated by other companies.

  • Could Google’s SafeSearch Be Costing You Traffic By Filtering Your Safe Images?

    There’s an interesting thread in WebmasterWorld, in which one webmaster claims that Google is filtering non-explicit image results from their site when SafeSearch is selected.

    Have you noticed Google filtering safe results with SafeSearch? Let us know in the comments.

    User mboydnv writes, “Our site has images of Hawaii and other scenery. When SafeSearch is selected all images disappear and 300 pages from the results.”

    User Robert Charton responds, “Hawaii suggests the possibility of too much flesh tone in some beach images, something SafeSearch might look at. I know it’s stretching, but it’s a possibility. You might also want to use view as Googlebot to make sure that your site hasn’t been hacked, which might introduce adult-related content seen only by Googlebot, that could be triggering the filter.”
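    On that last point, a rough way to approximate “fetch as Googlebot” outside of Webmaster Tools is to request the same page with a normal browser User-Agent and again with Googlebot’s, then compare the responses. A minimal sketch, with a placeholder URL; it won’t catch hacks that cloak by IP address rather than User-Agent, so Webmaster Tools’ own fetch feature remains the authoritative check:

    ```python
    # Compare what a browser sees with what a (claimed) Googlebot sees.
    # User-agent spoofing only detects UA-based cloaking; IP-based cloaking
    # requires Webmaster Tools' real "fetch as Googlebot" feature.
    import urllib.request

    URL = "https://example.com/scenery-page.html"  # placeholder: a page from the affected site

    BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1; rv:25.0) Gecko/20100101 Firefox/25.0"
    GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

    def fetch(url, user_agent):
        # Request the page while presenting the given User-Agent header.
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode("utf-8", errors="replace")

    if fetch(URL, BROWSER_UA) != fetch(URL, GOOGLEBOT_UA):
        print("Responses differ -- inspect the Googlebot version for injected content.")
    else:
        print("Responses match; no user-agent cloaking detected on this page.")
    ```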

    mboydnv isn’t the only user who has noticed issues like this. Sgt_Kickaxe adds, “I’ve reported this before. A picture of a brick, another of a duck, one of a frog and another of a hubcap were all blocked by safesearch on my site a while back, along with 50 others. There isn’t a single even remotely risky image or word on the site but that didn’t matter. An image reconsideration request and email solved nothing. The image file names and alt text are not the problem as these are benign as well. Best guess: A competitor is reporting images which gets them labeled unsafe, even if they aren’t.”

    In fact, while it doesn’t get brought up a whole lot, people have had the issue of their non-explicit results getting blocked by SafeSearch for years. There was a similar discussion thread running from 2007 to 2008 about how to get out of “Google Image SafeSearch Hell”. Barry Schwartz at Search Engine Roundtable reported on it at the time. He wrote:

    Basically, sometimes Google might label your images as being sexually explicit or not appropriate for the average searcher. If that is done, you shouldn’t come up in Google’s web search results or in the standard Google image search (unless someone changes their Google preferences).

    What we have been noticing is that Google has been a bit more sensitive on image filtering recently. This has impacted a lot of webmasters, where they have noticed a major decline in image search traffic.

    A lot has happened with Image Search since 2008, including with SafeSearch. Early this year, Google launched a major redesign of Image Search, suggesting that the new style would increase clicks to sites. Webmasters, however, disagreed furiously.

    New Image Search

    Back in the spring, Define Media published a study analyzing the image search traffic of 87 domains, finding a 63% decrease in image search referrals from Google after it launched its new interface.

    “Publishers that had previously benefitted the most from their image optimization efforts suffered the greatest losses after the image search update, experiencing declines nearing 80%,” said the firm’s Shahzad Abbas at the time.

    So obviously webmasters have enough hurdles to get over if they want any traffic from Image Search. Getting filtered for no apparent reason because someone has SafeSearch enabled is yet another obstacle, and perhaps one that some affected webmasters aren’t even aware of.

    Over the past year, Google has also quietly made adjustments to the SafeSearch feature itself. Basically, it became enabled automatically and can’t be turned off, only filtered further (which is probably where the filtering in the original poster’s situation occurred).

    So it’s possible that webmasters’ images have multiple levels of filtering going on, and more chances to get buried. To explain, Google used to have three basic levels for SafeSearch: strict, moderate and off. Off opened the floodgates for explicit content, while the other two offered different levels of filtering. Now it’s more like SafeSearch is automatically set to moderate (or perhaps even the strict side of moderate) and can only be tightened further to strict. The terminology is just different: users only have the option of enabling “Filter explicit results” or not, except that even when you don’t enable it, the results are still filtered.

    Google explains the change like this: “This change actually doesn’t prevent anyone from getting to the content they want to see. So, if you search for explicit content, you’ll be able to find it — just make sure your query reflects this intent, and Google will show the most relevant content for the search. This recent change is just one in a number of search quality improvements we make on an ongoing basis. Our data has shown that this specific change has resulted in more people finding what they were looking for, while significantly reducing the chances of stumbling upon undesired content for potentially ambiguous queries.”

    But the filtering doesn’t only affect explicit images. You can toggle it on and off on pretty much any image query and see a difference in results. It’s not always a major difference, but usually there is a little shuffling going on at the least. There’s no way to know if the broader filtering that you can’t toggle is hiding innocent images too.
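    If you want to check the toggleable layer yourself, the simplest test is to open the same image query with and without the SafeSearch filter and look for your images. The sketch below just builds the URLs to compare by hand in a browser; the “safe” and “tbm” parameter names reflect Google’s URL scheme as of this writing and aren’t guaranteed to stick around (and scraping the results programmatically would violate Google’s terms):

    ```python
    # Build Google Image Search URLs for one query at different SafeSearch
    # settings, for manual side-by-side comparison in a browser.
    from urllib.parse import urlencode

    BASE = "https://www.google.com/search"

    def image_search_url(query, safe=None):
        # "tbm=isch" selects image results; "safe" sets the SafeSearch level.
        params = {"q": query, "tbm": "isch"}
        if safe is not None:
            params["safe"] = safe  # e.g. "active" (filter on) or "off"
        return f"{BASE}?{urlencode(params)}"

    query = "hawaii beach scenery"  # swap in a query your site ranks for
    print("default:  ", image_search_url(query))
    print("filtered: ", image_search_url(query, safe="active"))
    print("'off':    ", image_search_url(query, safe="off"))  # no longer truly unfiltered
    ```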

    If sites aren’t getting any traffic from Google Image Search anymore due to the main redesign, I’m not sure how much this really matters in terms of traffic. You can’t really lose traffic you weren’t getting anyway. But if images are being filtered out, they’re still going to lose visibility.

    The late Ted Ulle (Tedster) offered a few tips in that old WMW thread, which Schwartz listed in his coverage back then. These include checking your on-page text content and image filenames for things that Google could view as adult words, checking to make sure your outbound links aren’t pointing to any adult neighborhoods, and making sure you’re not hosted on the same server as an adult site.
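    As a quick first pass on the first of those checks, a script can flag obviously risky words in a page’s body text, image filenames, and alt attributes. A minimal sketch, assuming a saved local copy of the page and a placeholder wordlist (a real audit would use a far longer list and walk every page on the site):

    ```python
    # Scan one HTML page for words a filter might read as adult content,
    # in body text, image filenames (src), and alt attributes.
    import re
    from html.parser import HTMLParser

    ADULT_WORDS = {"nude", "porn", "xxx"}  # tiny placeholder list

    class SafeSearchAudit(HTMLParser):
        def __init__(self):
            super().__init__()
            self.flags = []

        def handle_starttag(self, tag, attrs):
            if tag == "img":
                for name, value in attrs:
                    if name in ("src", "alt") and value:
                        lowered = value.lower()
                        if any(w in lowered for w in ADULT_WORDS):
                            self.flags.append(f"img {name} looks risky: {value!r}")

        def handle_data(self, data):
            for word in ADULT_WORDS:
                if re.search(rf"\b{word}\b", data, re.IGNORECASE):
                    self.flags.append(f"body text contains {word!r}")

    with open("page.html", encoding="utf-8") as f:  # placeholder: a saved copy of your page
        audit = SafeSearchAudit()
        audit.feed(f.read())

    print("\n".join(audit.flags) or "no risky words found")
    ```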

    Beyond that, your guess is as good as mine.

    Are you getting any traffic from Google Image Search these days? What about Bing Image Search or Yahoo Image Search? Let us know in the comments.

  • Laura Prepon Is The Face Of ‘Orange Is The New Black,’ According To Google

    For the last couple weeks, we’ve been hearing that Laura Prepon is leaving the cast of Netflix’s Orange is the New Black. Interestingly, she appears to be the face of the show, at least as far as the world’s largest search engine is concerned.

    Google’s Knowledge Graph, which aims to show users quick info about whatever topic they’re looking for at a glance, uses Prepon’s picture as the main photo for the show. The images in the Knowledge Graph often point to whatever image is posted on Wikipedia, as this is where it often grabs the main info for a topic. In this case, it does grab the info from Wikipedia, but turns to Google Image Search for the photo (which it also often does).

    Laura Prepon

    The image Wikipedia shows is a more general title picture for the show, but it is not listed as a Wikimedia Commons file, so that could have something to do with it.

    Still, it’s interesting that Prepon, an apparently outgoing cast member, is presented as the face of the show. It’s not necessarily surprising. I would assume that Google is somehow factoring in search popularity, and Bing recently told us that Prepon is the show’s most searched-for cast member.

    If Prepon does in fact leave the show, it could be an interesting case study into how Google updates its Knowledge Graph as things change over time. Last month, Google apparently pushed a major update to the offering.

    Another thing I find interesting here is Google’s willingness to point users toward sexually explicit (even if mildly so) content. If you switch to image results for the “orange is the new black” query, you can see Prepon’s image as the first one it shows.

    OITNB Image Search

    If you look a little more closely, you can see that this is only part of a group of images with the label “Shower”. This is part of Google’s image carousel feature. The first regular image result is a poster for the show, which would arguably be more appropriate for the Knowledge Graph result. It seems a bit odd that Google would even be pulling from this other query.

    But in terms of sexually explicit material, it’s even more interesting, considering that Google has gone out of its way recently to make such material harder to find, even with SafeSearch turned off.

    I’ll assume that “shower” is surfaced based on search popularity (you’ll notice that Prepon is in the carousel as well), and it’s still not as explicit as some of the stuff you’ll see Bing recommend on image searches (have you searched “ben affleck batman” with filtering turned off lately?), but it is an interesting example of Google’s efforts seemingly contradicting each other.

    Lead Image: Orange is the New Black (Netflix) via YouTube

  • Google SafeSearch Changes Finally Hit Germany, France, and Other Non-English Speaking Countries

    Back in December, Google made some subtle changes to their Image search. To make a long story short, Google made it so that users in the U.S. could no longer disable SafeSearch altogether. At first glance, it appeared as though Google was simply censoring adult content in Image Search – but that wasn’t the case. What they did do, however, is make it harder to unearth.

    I argued that this fragmented Google Image Search and in the end, made it worse. By choosing to alter the steps users had to take to find adult images, Google made their Image Search product weaker and less able to generate the most relevant results for individual queries. But more on that later.

    About a month later, those Image Search changes hit other English-speaking countries – including the U.K., Australia, New Zealand, and South Africa. And now, it appears as though they’re spreading to some non-English-speaking countries around the world.

    Ok, back to the long story. On the face of it, what Google has done is change the filtering options for Image Search. The old Image Search allowed users to select one of three levels of SafeSearch – STRICT, MODERATE, and OFF. Strict SafeSearch would filter out anything with a hint of NSFW content. Moderate, as you would expect, was somewhere in the middle – bikini shots ok, but exposed breasts a no-go, for instance. Turning SafeSearch off completely meant that your image results were completely unfiltered. For most queries, that’s the best way to find relevant content.

    The new SafeSearch options that hit the U.S. in December, other English-speaking countries in January, and now places like Germany, France, and Spain are a lot less varied. You only have the ability to “filter explicit results” – that’s it (and to report images, of course). The baseline Google Image Search is now set to MODERATE, and there’s no way to fully turn off SafeSearch.

    That doesn’t mean that you can no longer find adult images with Google Image Search. Google is not censoring these images. They’re just making you add qualifiers to your searches in order to find them.

    For instance, a search for “boobs” now produces moderate-level SFW content – no exposed breasts, just bras, bikinis, and such. If you choose to “filter explicit results,” it completely wipes out all results.

    Here are the new SafeSearch options for google.de (Germany):

    And here are the SafeSearch options for google.se (Sweden):

    Now, if someone wanted to see NSFW results for “boobs,” they would have to add something else – “boobs porn” or “boobs nude,” for instance.

    The problem with this is that by making some moderate level of SafeSearch mandatory, Google is making image searches worse. It’s not truly showing the most relevant images for each query – it’s only showing the most relevant PG-13 results for each query.

    The problem gets worse if you search a term with a little less ambiguity. For instance, you can’t tell me that these are the most relevant results for the search “pussy.”

    Here’s what I had to say about how it makes Google Image Search worse back when Google first enacted the changes. I know the language is a little crass, but it’s the only way to get the point across:

    Ok, so the point here is that users need to be specific with their searches. Got it. Apologies for the frankness, but if I want to find pussy images, I now have to search “pussy porn.” There is now no way that I can edit my own personal settings to make a search for just “pussy” yield all results, both NSFW and otherwise.

    In essence, Google is fragmenting their image search. A “no filter” search is a true search of the most popular images across the web. U.S. users no longer have this option. We’re now only given the choice between filtered results for “pussy” or the most popular results for “pussy porn.” That smattering of all results, both NSFW and SFW for the query “pussy,” cannot be achieved anymore.

    Plus, is there really a question about what I’m looking for when I search “pussy?” Do I really need to provide any more detail?

    It seems like a big gripe about a small change, and it is in a way. But one could make the argument that this actually is a form of censorship. If I go to Google images and search “pussy,” I want to see the best of what the web has to offer – all of it. Not what Google thinks I should see based on their desire to prevent adult results unless users are super specific.

    Go ahead and try a search for “pussy” on Google Images right now. Those aren’t really very relevant results, are they? Users should see the most relevant results for their searches, no matter what. And they should have the option to simply turn off the SafeSearch filter, which they all had just a couple of days ago.

    Google’s SafeSearch support page tells us how to disable SafeSearch, but it only tells us how to turn off SafeSearch Filtering. That still leaves us with a “MODERATE” level SafeSearch and no way to see all web results, both NSFW and SFW at once for a single query.

    “We are not censoring any adult content, and want to show users exactly what they are looking for — but we aim not to show sexually-explicit results unless a user is specifically searching for them. We use algorithms to select the most relevant results for a given query. If you’re looking for adult content, you can find it without having to change the default setting — you just may need to be more explicit in your query if your search terms are potentially ambiguous. The image search settings work the same way as in web search,” Google told me back in December.

    The changes are subtle, yes. But they do make for an Image Search that feels lacking. Shouldn’t Google be about providing the best search results, not just the best moderately non-explicit results? Or at least still give users the option to disable SafeSearch completely?

    I’ve reached out to Google for comment and will update this article accordingly.

  • Arrested Development May Have Helped Google Clean Up Its Search Results

    Mitch Hurwitz, creator of the hit show Arrested Development, participated in a reddit AMA (ask me anything) today, where he talked about the story going forward (which could be a fifth season or a movie).

    As we cover both Arrested Development and Google quite a bit, one particular response unrelated to actual ‘Arrested’ news seemed worth highlighting.

    Fans of the show know what a “never nude” is. For the rest of you, Tobias Fünke, a character played by David Cross, is one of these. It’s a psychological condition in which he can never bring himself to be nude, so instead, he always wears a minimum of blue-jean cut-offs.

    One redditor asked Hurwitz "if there was any inspiration in real life that led to the idea of Tobias being a Never Nude?"

    His response was this:

    No, there wasn’t. The Never Nude thing – I will trace the etiology of that idea, and it’s this. We had this joke that just put us out, that was Tobias keeps crying in the shower. And then I had pitched – I was thinking about production, and the way they shoot those things, they always put people in flesh colored bathing suits, and I said, what if we show part of the flesh colored bathing suits for 3-4 weeks – and then in the 4th week we reveal that he showers in a flesh-colored bathing suit because he doesn’t like showering naked. And then Richie Rosenstock (who’s an absolutely brilliant, hilarious guy – and is responsible for so many of the giant laughs in the show) said without hesitation: “Oh, he’s a Never Nude.”

    And everybody in the room froze. And looked at him, and said, “is that a real thing?” and he shrugged, and it was just so funny. It wasn’t a funny idea until Richie called him a Never Nude, which took the joke from being just a sight gag, to a psychological affliction that really elevated it in such a brilliant way. And then I remember looking up to see online if there was such a thing as a Never Nude – and guess what you can’t search for besides finding pornography? “Never Nude” – back then you’d get 25,000 pages with the word “Nude” in it. Even if you used the Boolean quotation marks, you would still get things like “Hot 18 year old who’d NEVER been NUDE in front of a boy!” So we’ll never know if it was a thing before ARRESTED. Although I suppose I could just ask Richie.

    He doesn’t specifically mention Google, but this is somewhat amusing given that Google has been going out of its way to hide nudity in image search results in recent months.

  • Heather Graham And The Difference Between Google And Bing

    As we’ve reported several times, Google has started cleaning up its Image Search experience, sometimes making results less relevant in the process. In fact, some other things that Google is doing these days make for a much more censored experience altogether. With regard to Image Search in particular, the company has made it harder to find adult content, even with SafeSearch turned off.

    Bing, on the other hand, is not only not doing this, but it is going out of its way to suggest that you search for adult content on some popular searches. If you go to Bing’s image search without typing a specific query, it displays the top twenty trending image searches like so:

    Bing image search

    Many of these, when clicked, come with some rather risque search suggestions. Number ten, Heather Graham, for example, suggests the following searches in bold at the top of the page: Heather Graham Naked, Heather Graham Nude, Heather Graham Tits, Heather Graham Pussy, Heather Graham Sex Scene, Heather Graham No Bra, Heather Graham Boobs, Heather Graham Sex, Heather Graham Hot, and Heather Graham Tesch.

    Heather Graham

    Granted, SafeSearch is off, but this is quite a bit different than what you get from Google. Click on those explicit search suggestions, and you’ll get explicit content. The suggestions change when the setting is set to moderate or strict.

    Google’s recommended searches at the top of the screen on a “Heather Graham” search are: heather graham hangover, heather graham premiere, heather graham 2013, heather graham judy moody, heather graham no makeup, heather graham husband.

    Quite a bit different.

    It’s worth noting that Google calls them “related searches,” and that this is without checking the “filter explicit results” option, which replaced the old SafeSearch style that was similar to what Bing still has.

    Heather Graham

    Now, you can still find the type of content that Bing is recommending in Google if you specifically type the keywords that Bing is suggesting. It’s not exactly missing from the index. Google has just gone out of its way in recent months to make it more difficult to find this kind of content, electing to make people get more specific with their keywords. As we’ve seen, there are times when this approach has sacrificed the relevancy of the search.

    Perusing Bing’s various trending image searches, you see similar results. Amelia Vega suggests: Amelia Vega Nude. Sloane Stephens suggests: Sloane Stephens Nude. Jodi Arias suggests: Jodi Arias Nude and Jodi Arias Naked. Lohan suggests: Lindsay Lohan Nude, Lindsay Lohan Naked, Lohan Bares All, Lohan Tits, Lohan Topless, Lindsay Lohan Playboy, Lindsay Lohan Spread Eagle, Lohan Wardrobe Malfunction, etc. Similar results come up for Taraji P. Henson and Sarah Silverman. It’s not just the women, though. Nick Lachey suggests: Nick Lachey Nude, Nick Lachey Naked, etc.

    The difference in how these two search engine competitors handle suggestions illustrates how different their philosophies are regarding certain types of content. On the other hand, we’ve seen Bing make some much more appalling suggestions recently too.

  • Google SafeSearch Changes Hit the U.K., Australia, New Zealand, and More [CONFIRMED]

    It’s official: Google’s fragmented, less-useful Image search has spread from the U.S. and is now affecting English-speaking countries all over the world.

    Google has confirmed that the previous SafeSearch changes that made it impossible for U.S. users to fully disable SafeSearch have been launched in English-speaking countries internationally.

    Although, it’s unlikely that Google would describe the changes in that fashion.

    What do you think of Google’s changes to SafeSearch? Do you want to be able to fully disable SafeSearch? Does it really matter to you? Let us know.

    Back in December, we told you that Google had made a change to its SafeSearch feature in the U.S. that made it impossible for users to entirely disable SafeSearch when searching for Images on the site.

    Long story short, Google has prevented users from disabling SafeSearch altogether in Image search. It’s important to note that this is different from Google censoring NSFW content. That’s all still there; it’s just that users must now be very specific in their queries in order to access it.

    For example, a Google Image search for “boobs” will now yield SFW results, by default. In order to find NSFW results for that query, you must now add a modifier – let’s say “boobs porn” or “boobs nude” for instance.

    Users used to be able to turn SafeSearch off completely. A little box at the top right of Image search results used to allow users to pick their level of SafeSearch: “STRICT,” “MODERATE,” or “OFF.” But now, Google only allows users to filter out all explicit results.

    What’s more, Google users are no longer given the option to turn off all types of SafeSearch filtering within the Search Settings.

    If all of this sounds a little confusing – that’s because it is. Google has fragmented their Image search in an attempt to keep NSFW materials from popping up without a specifically explicit search.

    But here’s the gist of it, in plain English: A search for “boobs” in the U.S. (and other English-speaking countries) now yields SFW results, as Google Image Search now defaults to the “MODERATE” level. Users are not allowed to fully turn off SafeSearch. In order to see those NSFW results, users have to be more specific with their searches.

    Here are your SafeSearch options for Google.co.uk, Google Australia, Google South Africa, and Google New Zealand, etc.:

    And here are the options in Germany:

    Notice the difference? We’ve tested other non-English-speaking countries like France and the Netherlands and have seen the same results that we have for Germany. It appears that, at least for the time being, non-English-speaking countries have not been affected by the changes.

    “We are not censoring any adult content, and want to show users exactly what they are looking for — but we aim not to show sexually-explicit results unless a user is specifically searching for them. We use algorithms to select the most relevant results for a given query. If you’re looking for adult content, you can find it without having to change the default setting — you just may need to be more explicit in your query if your search terms are potentially ambiguous. The image search settings work the same way as in web search,” Google told me back in December when we first reported on the changes to SafeSearch.

    Still, Google has fragmented Image search and ultimately made it worse. Here’s what I said in regards to that last month:

    Ok, so the point here is that users need to be specific with their searches. Got it. Apologies for the frankness, but if I want to find blowjob images, I now have to search “blowjob porn.” There is now no way that I can edit my own personal settings to make a search for just “blowjob” yield all results, both NSFW and otherwise.

    In essence, Google is fragmenting their image search. A “no filter” search is a true search of the most popular images across the web. U.S. users no longer have this option. We’re now only given the choice between filtered results for “blowjob” or the most popular results for “blowjob porn.” That smattering of all results, both NSFW and SFW for the query “blowjob,” cannot be achieved anymore.

    Plus, is there really a question about what I’m looking for when I search “blowjob?” Do I really need to provide any more detail?

    It seems like a big gripe about a small change, and it is in a way. But one could make the argument that this actually is a form of censorship. If I go to Google images and search “blowjob,” I want to see the best of what the web has to offer – all of it. Not what Google thinks I should see based on their desire to prevent adult results unless users are super specific.

    Go ahead and try a search for “blowjob” on Google Images right now. Those aren’t really very relevant results, are they? Users should see the most relevant results for their searches, no matter what. And they should have the option to simply turn off the SafeSearch filter, which they all had just a couple of days ago.

    Google’s SafeSearch support page gives us steps for disabling SafeSearch, but it really only tells us how to turn off SafeSearch Filtering. That still leaves us with a “MODERATE” level SafeSearch and no true way to see all web results, both NSFW and SFW at once.

    Do you think this makes Google Image search worse? Are results less relevant now that Google is automatically filtering out potential NSFW images? Or are we making a mountain out of a molehill? Let us know in the comments.

    [Image via CharlesFred, Flickr]

  • Google No Longer Allows You to Disable SafeSearch, and That Makes Google Search Worse

    UPDATE: Google SafeSearch Changes Hit the U.K., Australia, New Zealand, and More

    Google has just made their Image search worse in an effort to protect your virgin eyes.

    If you’re in the U.S. and trying to search for boobs on Google Images right now, you’re going to have a tougher time. That’s because Google has prevented U.S. users from disabling SafeSearch. And if you want to find NSFW images, you’re going to have to be more specific with your searches.

    Google users should be familiar with the SafeSearch toolbar at the top right of Image searches. Until recently, that bar allowed users to select MODERATE, STRICT, or OFF. As of right now, those options have been removed from the drop-down menu – but only in the U.S.

    Do you think Google should have messed with the SafeSearch format? Do you think it makes Google Image search worse? Let us know what you think in the comments.

    Here’s what it currently looks like:

    And here’s what it looks like on Google.co.uk (and other countries). This is what it looked like in the U.S. before today:

    Note that “SafeSearch” is the only option for U.S. users. For a search for “boobs,” for instance, this means that the results will not feature any “explicit” images (nipples showing). The only other option is to “filter results,” which in our boobs search filters out all results. “The word ‘boobs’ has been filtered from the search because Google SafeSearch is active,” reads the results page.

    It would appear that the only two options Google is giving U.S. users are STRICT and MODERATE. Or in other words, you can’t turn off SafeSearch.

    The default “SafeSearch” results for U.S. users are the exact same as the MODERATE SafeSearch results for U.K. users:

    U.S. default:

    U.K. MODERATE:

    If you go to Google’s SafeSearch help page, they tell you how to disable SafeSearch from your settings:

    Here’s how to disable SafeSearch:

    • Visit the Google Preferences page.
    • In the “SafeSearch Filtering” section, select Do not filter my search results.
    • Click Save Preferences.

    But when U.S. users visit their search settings, they are only given the option to filter explicit results:

    But moderate SafeSearch is already on by default.

    If you go to another country’s Google, say Google.com.bz, the search settings provide the option to turn on “no filtering.”

    So, what gives? Is Google guilty of some seriously awful censorship? The quick answer is no, but it’s a little more complicated than that.

    You see, you have to be specific if you want NSFW results. Searching for “boobs porn” does give you plenty of nudity. But for searches on words without specific qualifiers, you still don’t see any of these results. In essence, Google has changed their search settings to only display adult results when queries are specifically adult-oriented.

    That means for all intents and purposes, users in the U.S. now only have two options – default SafeSearch and filter explicit results. The default SafeSearch is akin to MODERATE. The option to turn off SafeSearch completely for all results is gone in the U.S. And the big question is why? What was wrong with the old Google image search format (and the one seen in the rest of the world)? Why did Google feel the need to change it?

    Let’s look at two different responses we received from Google. First, Google Webmaster Trends Analyst John Mueller had this to say:

    “The default should continue to behave similarly to what most users have had as the default so far (“moderate”). Our algorithms are designed to downgrade explicit content when you’re not specifically looking for it. If a search term is very explicit, relevant adult content may show up, but we’ll err on the conservative side. So if you want to see adult content in Image Search, just make it clear with the query — we’ll show the most relevant content for each search.”

    Now from another Google spokesperson:

    “We are not censoring any adult content, and want to show users exactly what they are looking for — but we aim not to show sexually-explicit results unless a user is specifically searching for them. We use algorithms to select the most relevant results for a given query. If you’re looking for adult content, you can find it without having to change the default setting — you just may need to be more explicit in your query if your search terms are potentially ambiguous. The image search settings work the same way as in web search.”

    Ok, so the point here is that users need to be specific with their searches. Got it. Apologies for the frankness, but if I want to find blowjob images, I now have to search “blowjob porn.” There is now no way that I can edit my own personal settings to make a search for just “blowjob” yield all results, both NSFW and otherwise.

    In essence, Google is fragmenting their image search. A “no filter” search is a true search of the most popular images across the web. U.S. users no longer have this option. We’re now only given the choice between filtered results for “blowjob” or the most popular results for “blowjob porn.” That smattering of all results, both NSFW and SFW for the query “blowjob,” cannot be achieved anymore.

    Plus, is there really a question about what I’m looking for when I search “blowjob?” Do I really need to provide any more detail?

    It seems like a big gripe about a small change, and it is in a way. But one could make the argument that this actually is a form of censorship. If I go to Google images and search “blowjob,” I want to see the best of what the web has to offer – all of it. Not what Google thinks I should see based on their desire to prevent adult results unless users are super specific.

    Go ahead and try a search for “blowjob” on Google Images right now. Those aren’t really very relevant results, are they? Users should see the most relevant results for their searches, no matter what. And they should have the option to simply turn off the SafeSearch filter, which they all had just a couple of days ago.

    How about you? Do you think this is a form of censorship? Are we making mountains out of molehills? Could this Image search tweak make you seek out a competitor’s image search? Tired of reading the word “blowjob?” Let us know what you think in the comments.

  • Google Makes Changes To Improve SafeSearch

    On Thursday, Google released a list of 65 changes it made during the months of August and September. Five of the changes are related to SafeSearch, Google’s filter feature that allows you to search without having to see adult content.

    Those changes are listed as follows:

    • Maru. [project “SafeSearch”] We updated SafeSearch to improve the handling of adult video content in videos mode for queries that are not looking for adult content.
    • Palace. [project “SafeSearch”] This change decreased the amount of adult content that will show up in Image Search mode when SafeSearch is set to strict.
    • #82872. [project “SafeSearch”] In “strict” SafeSearch mode we remove results if they are not very relevant. This change previously launched in English, and this change expanded it internationally.
    • Sea. [project “SafeSearch”] This change helped prevent adult content from appearing when SafeSearch is in “strict” mode.
    • Cobra. [project “SafeSearch”] We updated SafeSearch algorithms to better detect adult content.

    You would think Google would have this down by now, and with these changes, you would also think Google has gotten even better at SafeSearch. It’s hard to say whether or not it really has, but before these changes were announced, Search Engine Roundtable shared a story of an instance where Google was failing at SafeSearch, and even failed to make the necessary changes once notified (though it’s possible that they will still be made).

    Have you noticed whether or not SafeSearch has improved in recent months?

  • How Google Uses Twitter, SafeSearch – Matt Cutts Changes Advice

    Matt Cutts posted a new Webmaster Help video in which he answers his own question rather than the usual user-submitted one. Specifically, he asks if there’s any advice he’d like to change from what he’s said in the past.

    "I did a video back in May of 2010, that said we don’t use, for example, Twitter at all in our rankings other than as a normal web page, and the links are treated completely like normal web pages," he says.

    He then references a recent Danny Sullivan article which breaks down how both Google and Bing use Twitter. He notes that Google worked with him to ensure its accuracy. "It says that in some cases we do look at, for example, how reputable a particular person on Twitter might be, and we can use that in our rankings in some ways."

    And another thing that Cutts wanted to update…

    "SafeSearch, when I wrote the very first version, years and years and years ago – whenever you’re not able to crawl something – so for example, if it’s blocked by robots.txt, since people have deliberately said, ‘I would like a safe version – a family-safe version of Google, we would say, ‘oh, if we haven’t been able to crawl it, then we don’t know whether it’s porn or not, so we’re not going to be able to return it to users," says Cutts.

    "So, the Library of Congress or WhiteHouse.gov or Metallica at one point…Nissan, had blocked various pages from being crawled in the search engines, and so to be safe, we said, ‘you know what? We don’t know whether that’s family-safe or not, so we won’t return it’,"  he adds.

    "Luckily, the SafeSearch team has gotten much more sophisticated, and better, and more robust since I wrote the original version, so now that’s something that we might change. If something is forbidden from being crawled, but for whatever reason we think that it might be safe, now we’ll start to return it in our search results."

    It’s always good to set the record straight.