WebProNews

Could Google’s SafeSearch Be Costing You Traffic By Filtering Your Safe Images?

There’s an interesting thread in WebmasterWorld, in which one webmaster claims that Google is filtering non-explicit image results from their site when SafeSearch is selected.

Have you noticed Google filtering safe results with SafeSearch? Let us know in the comments.

User mboydnv writes, “Our site has images of Hawaii and other scenery. When SafeSearch is selected all images disappear and 300 pages from the results.”

User Robert Charton responds, “Hawaii suggests the possibility of too much flesh tone in some beach images, something SafeSearch might look at. I know it’s stretching, but it’s a possibility. You might also want to use view as Googlebot to make sure that your site hasn’t been hacked, which might introduce adult-related content seen only by Googlebot, that could be triggering the filter.”

mboydnv isn’t the only user who has noticed issues like this. Sgt_Kickaxe adds, “I’ve reported this before. A picture of a brick, another of a duck, one of a frog and another of a hubcap were all blocked by safesearch on my site a while back, along with 50 others. There isn’t a single even remotely risky image or word on the site but that didn’t matter. An image reconsideration request and email solved nothing. The image file names and alt text are not the problem as these are benign as well. Best guess: A competitor is reporting images which gets them labeled unsafe, even if they aren’t.”

In fact, while it doesn’t get brought up a whole lot, people have had the issue of their non-explicit results getting blocked by SafeSearch for years. There was a similar discussion thread running from 2007 to 2008 about how to get out of “Google Image SafeSearch Hell”. Barry Schwartz at Search Engine Roundtable reported on it at the time. He wrote:

Basically, sometimes Google might label your images as being sexually explicit or not appropriate for the average searcher. If that is done, you shouldn’t come up in Google’s web search results or in the standard Google image search (unless someone changes their Google preferences).

What we have been noticing is that Google has been a bit more sensitive on image filtering recently. This has impacted a lot of webmasters, where they have noticed a major decline in image search traffic.

A lot has happened with Image Search since 2008, including with SafeSearch. Early this year, Google launched a major redesign of Image Search, suggesting that the new style would increase clicks to sites. Webmasters, however, disagreed furiously.

New Image Search

Back in the spring, Define Media published a study analyzing the image search traffic of 87 domains, and found a 63% decrease in image search referrals from Google after it launched its new interface.

“Publishers that had previously benefitted the most from their image optimization efforts suffered the greatest losses after the image search update, experiencing declines nearing 80%,” said the firm’s Shahzad Abbas at the time.

So obviously webmasters have enough hurdles to get over if they want any traffic from Image Search. Getting filtered for no apparent reason because someone has SafeSearch enabled is yet another obstacle, and perhaps one that some affected webmasters aren’t even aware of.

Over the past year, Google has also quietly made adjustments to the SafeSearch feature itself. Essentially, it is now enabled automatically and can’t be turned off, only filtered further (which is probably where the filtering in the original poster’s situation occurred).

So basically, it’s possible that webmasters’ images have multiple levels of filtering going on, and more chances to get buried. To explain, Google used to have three basic levels for SafeSearch: strict, moderate and off. Off opened the floodgates for explicit content, while the other two offered different levels of filtering. Now it’s more like SafeSearch is automatically set to moderate (or perhaps even the strict side of moderate), and can be filtered further to strict. The terminology is also different: users only have the option of enabling “Filter explicit results” or not, except that even when you don’t enable it, the results are still filtered.

Google explains the change like this: “This change actually doesn’t prevent anyone from getting to the content they want to see. So, if you search for explicit content, you’ll be able to find it — just make sure your query reflects this intent, and Google will show the most relevant content for the search. This recent change is just one in a number of search quality improvements we make on an ongoing basis. Our data has shown that this specific change has resulted in more people finding what they were looking for, while significantly reducing the chances of stumbling upon undesired content for potentially ambiguous queries.”

But the filtering doesn’t only affect explicit images. You can toggle it on and off on pretty much any image query and see a difference in results. It’s not always a major difference, but usually there is a little shuffling going on at the least. There’s no way to know if the broader filtering that you can’t toggle is hiding innocent images too.

If sites aren’t getting any traffic from Google Image Search anymore due to the main redesign, I’m not sure how much this really matters anyway in terms of traffic. You can’t really lose traffic you weren’t getting anyway. But if images are being filtered out, they’re still going to lose visibility.

The late Ted Ulle (Tedster) offered a few tips in that old WMW thread, which Schwartz listed in his coverage back then. These include checking your on-page text content and image filenames for things that Google could view as adult words, checking to make sure your outbound links aren’t pointing to any adult neighborhoods, and making sure you’re not hosted on the same server as an adult site.

Beyond that, your guess is as good as mine.

Are you getting any traffic from Google Image Search these days? What about Bing Image Search or Yahoo Image Search? Let us know in the comments.