
Google SafeSearch Changes Finally Hit Germany, France, and Other Non-English Speaking Countries

Back in December, Google made some subtle changes to their Image Search. To make a long story short, Google made it so that users in the U.S. could no longer disable SafeSearch altogether. At first glance, it appeared as though Google was simply censoring adult content in Image Search – but that wasn’t the case. What they did do, however, was make that content harder to unearth.

I argued that this fragmented Google Image Search and, in the end, made it worse. By choosing to alter the steps users had to take to find adult images, Google made their Image Search product weaker and less able to generate the most relevant results for individual queries. But more on that later.

About a month later, those Image Search changes hit other English-speaking countries – including the U.K., Australia, New Zealand, and South Africa. And now, it appears as though they’re spreading to some non-English speaking countries around the world.

Ok, back to the long story. On the face of it, what Google has done is change the filtering options for Image Search. The old Image Search allowed users to select one of three levels of SafeSearch – STRICT, MODERATE, and OFF. Strict SafeSearch would filter out anything with a hint of NSFW content. Moderate, as you would expect, was somewhere in the middle – bikini shots ok, but exposed breasts a no-go, for instance. Turning SafeSearch off completely meant that your image results were completely unfiltered. For most queries, that’s the best way to find relevant content.

The new SafeSearch options that hit the U.S. in December, other English-speaking countries in January, and now places like Germany, France, and Spain are a lot less varied. You only have the ability to “filter explicit results” – that’s it (and report images, of course). The baseline Google Image Search is now set to MODERATE, and there’s no way to fully turn off SafeSearch.
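As an aside for the technically inclined: that old three-level setting also surfaced in Google’s search URLs as a “safe” parameter. Here’s a minimal sketch of what building an image search URL looked like. Note that the parameter names and values shown (“tbm=isch” for the images vertical, “safe=off/moderate/active” for the three levels) are my own recollection of how those URLs were structured at the time – treat them as assumptions, and under the new behavior “off” may no longer do anything.

from urllib.parse import urlencode

def image_search_url(query, safesearch="moderate"):
    # Hypothetical helper: builds a Google Image Search URL.
    # "tbm=isch" selects the images vertical; "safe" carried the
    # old STRICT/MODERATE/OFF choice as "active"/"moderate"/"off".
    # Both are assumptions about the URL structure of the time.
    params = {
        "q": query,
        "tbm": "isch",
        "safe": safesearch,
    }
    return "https://www.google.com/search?" + urlencode(params)

# Under the old behavior, "off" meant completely unfiltered results;
# the new settings offer no equivalent of OFF in the interface.
print(image_search_url("boobs", safesearch="off"))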

That doesn’t mean that you can no longer find adult images with Google Image Search. Google is not censoring these images. They’re just making you add qualifiers to your searches in order to find them.

For instance, a search for “boobs” now produces moderate-level SFW content – no exposed breasts, just bras, bikinis, and such. If you choose to “filter explicit results,” it completely wipes out all results.

Here are the new SafeSearch options for google.de (Germany):

And here are the SafeSearch options for google.se (Sweden):

Now, if someone wanted to see NSFW results for “boobs,” they would have to add something else – “boobs porn” or “boobs nude,” for instance.

The problem with this is that by making some moderate level of SafeSearch mandatory, Google is making image searches worse. It’s not truly showing the most relevant images for each query – it’s only showing the most relevant PG-13 results for each query.

The problem gets worse if you search a term with a little less ambiguity. For instance, you can’t tell me that these are the most relevant results for the search “pussy.”

Here’s what I had to say about how it makes Google Image Search worse back when Google first enacted the changes. I know the language is a little crass, but it’s the only way to get the point across:

Ok, so the point here is that users need to be specific with their searches. Got it. Apologies for the frankness, but if I want to find pussy images, I now have to search “pussy porn.” There is now no way that I can edit my own personal settings to make a search for just “pussy” yield all results, both NSFW and otherwise.

In essence, Google is fragmenting their image search. A “no filter” search is a true search of the most popular images across the web. U.S. users no longer have this option. We’re now only given the choice between filtered results for “pussy” or the most popular results for “pussy porn.” That smattering of all results, both NSFW and SFW for the query “pussy,” cannot be achieved anymore.

Plus, is there really a question about what I’m looking for when I search “pussy”? Do I really need to provide any more detail?

It seems like a big gripe about a small change, and it is in a way. But one could make the argument that this actually is a form of censorship. If I go to Google images and search “pussy,” I want to see the best of what the web has to offer – all of it. Not what Google thinks I should see based on their desire to prevent adult results unless users are super specific.

Go ahead and try a search for “pussy” on Google Images right now. Those aren’t really very relevant results, are they? Users should see the most relevant results for their searches, no matter what. And they should have the option to simply turn off the SafeSearch filter – an option users in these countries had until just a couple of days ago.

Google’s SafeSearch support page purports to tell us how to disable SafeSearch, but it only explains how to turn off SafeSearch filtering. That still leaves us with a “MODERATE” level of SafeSearch and no way to see all results, both NSFW and SFW, at once for a single query.

“We are not censoring any adult content, and want to show users exactly what they are looking for — but we aim not to show sexually-explicit results unless a user is specifically searching for them. We use algorithms to select the most relevant results for a given query. If you’re looking for adult content, you can find it without having to change the default setting — you just may need to be more explicit in your query if your search terms are potentially ambiguous. The image search settings work the same way as in web search,” Google told me back in December.

The changes are subtle, yes. But they do make for an Image Search that feels lacking. Shouldn’t Google be about providing the best search results, not just the best moderately non-explicit results? Or at least still give users the option to disable SafeSearch completely?

I’ve reached out to Google for comment and will update this article accordingly.