WebProNews

Tag: Search

  • Google Paid Inclusion Results: Sponsored, But Not Ads?

    It would appear that some companies get a privilege that most others don’t: paying to be featured in Google search results. Obviously anyone can pay for AdWords ads, but Google reportedly has a new presentation for a certain kind of ad that is a little bit different.

    Danny Sullivan, with contributions from Pamela Parker, has put together a very interesting report about Google’s apparent paid inclusion program, which comes in the form of sponsored comparison ad results. According to Sullivan, Google considers these to be something in between organic results and ads – a “third kind of thing”. As he points out, it seems very much like paid inclusion, even if it’s only on a few select types of searches (hotels, flights and financial products).

    Update: A Google spokesperson tells us, “We’re changing the design layout of our hotel, flight, credit card and bank account results, which help users complete actions such as booking flights quickly and easily. We’ve always disclosed that Google may be paid when a user completes such an action; we want to be clear and consistent in how we do that.”

    Who’s to say this won’t expand to other types of searches in the future, as Google looks for more revenue streams? Meanwhile, Google is continuing to make improvements to mobile ads, which could help in that department as well.

    The main point of Sullivan’s article is that Google has long been against paid inclusion in search. He points to Google’s 2004 Founders’ IPO Letter from its S-1 registration statement, which includes the famous “Don’t Be Evil” section. This says:

    Google users trust our systems to help them with important decisions: medical, financial and many others. Our search results are the best we know how to produce. They are unbiased and objective, and we do not accept payment for them or for inclusion or more frequent updating. We also display advertising, which we work hard to make relevant, and we label it clearly. This is similar to a well-run newspaper, where the advertisements are clear and the articles are not influenced by the advertisers’ payments. We believe it is important for everyone to have access to the best information and research, not only to the information people pay for you to see.
    Emphasis added.

    It is interesting that “financial” is mentioned in there, considering that financial products are apparently one of the verticals that display this new format.

    I’m not sure the new format is a complete contradiction to this statement, however, as they are still clearly marked as “sponsored”. Just not as clearly marked. They don’t come in the colored boxes like AdWords ads. There’s a white background that will make them blend in much like other types of Google results. Though, it’s pretty much the same approach Facebook uses on its sponsored posts. Of course the big difference there is that with Facebook, these posts were organically made by your friends in the first place, so there’s a good chance you would’ve seen them anyway.

    Google's Paid Inclusion?

    Image courtesy: Search Engine Land

    The timing of this is pretty interesting, considering Google’s massive push against webspam with the Penguin update, which is designed to target sites violating Google’s quality guidelines. Those guidelines, of course, prohibit paid links and link schemes.

    The idea behind paid links is that you are paying to influence search results. Obviously this is quite different from Google’s new comparison listings format, but in effect, it is still select companies paying to influence search results. The big differences are that Google is the one being paid and the listings are marked as sponsored. It’s that positioning of them as “a third kind of thing” rather than an ad product (and the fact that they don’t use the well-known ad background color) that might raise a few eyebrows.

    I’m sure you remember the debacle over “Local Paid Inclusion” a few months back, which sent SEOs into an uproar. If what Google is offering is a kind of paid inclusion, which is what one of the most respected names in the industry is calling it, I’m guessing we’ll see some backlash here too.

    According to Sullivan, the new sponsored comparison results format is going live over the coming days, so you may or may not see them yet. I’m not seeing them, but it looks like Google will be controlling a lot of what users see on hotel searches. I’m already seeing plenty of ads, Google Maps and Places results and Google+ Page results on a search for “hotels”:

    Google Hotels Search

    We’ve reached out to Google for more on these so-called “paid inclusion” results/ads. We’ll update with any more info that comes to light.

    What are your thoughts about what Google is doing?

  • Google’s Fresh Results: Irrelevancy In Action

    Google continues to place a certain emphasis on the freshness of search results. Even with its latest monthly list of algorithm changes (which reminds me, another one should be coming out any day now), Google had five different changes related to freshness.

    Do you think Google’s increased emphasis on freshness has made results better? Let us know what you think in the comments.

    I’ve hinted at it several times while writing about Google, but I’ve never come out and written an article specifically about this. Google’s emphasis on freshness is often burying the more relevant results. While I run into this problem fairly often, I ran into it while I was working on my last article, so I decided to go ahead and point out an example of what I’m talking about.

    WebProNews puts out a lot of content. I put out a fair amount myself, and sometimes I simply find it easiest to go to Google to search for past articles I know we’ve written, when I want to reference something we’ve talked about in the past. When I do this, I’ll usually search for “webpronews” and a few keywords I know are relevant to the article I’m looking for. Sometimes Google will give me exactly what I need immediately. Sometimes, however, freshness gets in the way, and this example illustrates that.

    In this case, I was looking for the article I wrote back in August called “Does Google Need Twitter?” So I searched, “webpronews, does google need twitter”. I can’t imagine what else could be more relevant to that query than that article. According to Google (and this is with or without Search Plus Your World turned on, mind you), two more recent stories I wrote about the Penguin update (both from today) were more relevant to that search.

    Fresh isn't always more relevant

    The only mention of Twitter in either of the two articles ranking above the one I was actually looking for comes in the author bio sections, where it says to follow me on Twitter. I’m not sure what signals Google was looking at to determine that these results would be relevant to me for that query, but clearly freshness was given too much weight.

    This is just one example, of course, but I see this all the time. I’ve seen others mention it here and there as well. We had a comment from Matt, on a past article, for example, who said:

    “I find that recency is often given more credence than relevancy. Sometimes the content I’m looking for is older. Not all of the best content on the web happened in the last week.” Exactly! I thought it was just me. Freshness over relevancy was driving me nuts; it got so bad that I started using Bing. Turns out Bing is actually pretty awesome.

    Google may be looking to compensate for its lack of realtime Twitter data, which it lost as the result of a deal between the two companies expiring last year (in fact, that’s what “Does Google Need Twitter” was about).

    We get it. Google can index new, fresh content. That’s good. I wouldn’t have it any other way. However, when Google had realtime search, it came in the form of a little box in the results, much like other universal search results appear – like when you get results from Google News. The latest tweet wasn’t presented as the top, most relevant result, just because it was indexed a minute ago.

    Realtime search was Google’s best example of freshness, in my opinion, and that went away with the Twitter deal, although Google has hinted that it could return, with Google+ and other data. I don’t think it would work as well without Twitter though. But this is one important area of search where Google isn’t cutting it anymore. If you want the latest, up-to-the-second info or commentary on something, where are you going? Google or Twitter?

    Interestingly enough, the fact that Twitter is better in this case, gives Google one line of defense against antitrust accusations. There is competition. In fact, verticals like this, with efforts from different companies (including Twitter) that have the potential to chip away at various pieces of Google dominance, may just be Google’s biggest weakness. I’ve had a conversation with one Googler, which leads me to believe the company tends to agree.

    We saw how Google was falling short in the area of realtime search, in particular, when Muammar Gaddafi died.

    Google continues to make changes to its algorithm every day, and a focus on quality, with both the Panda and Penguin updates, is good, even if these updates may not be entirely perfect. It’s also good to have content that’s as fresh as possible, so I don’t want to say that Google’s focus on improving freshness is bad either. I do feel, however, that Google may be giving a little too much weight to this signal in its ranking process, just as it may be giving a little too much weight to social signals for some types of queries.

    Either way, it clearly pays to keep putting out fresh content.

    Have you noticed relevancy being sacrificed for freshness in Google results? Let us know in the comments.

    Image: The Fresh Prince Of Bel-Air (via)

  • Google Talks About Getting Your Images Removed From Google Images

    Last week, Google’s Matt Cutts posted a video talking about why Google, in most cases, won’t remove pages about you from its index. Today, Google has uploaded a similar video about removal of images from Google Images.

    Instead of Matt Cutts, this time the explanation comes from Google Consumer Experience Specialist Jeff LaFlam.

    For the most part, the advice is pretty much the same for images as it is for web pages. If you’re hosting the image, simply take it down, and Google will follow suit. If it’s on someone else’s site, Google suggests contacting the site owner and asking them to take it down. Again, Google will follow suit, because the image won’t be there anymore for Google to index.

    “In certain cases, such as images infringing your copyright, you can submit a removal request through our removal process,” says LaFlam. “Remember to be patient. It can take time to remove images from our index.”

    Google has more tips about this in its help center.
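    By the way, if taking the image down isn’t an option, Google also supports blocking its image crawler with robots.txt, which keeps images out of Google Images while leaving them on your site. A minimal sketch (the paths here are hypothetical):

        # robots.txt - keep Google's image crawler away from specific images
        User-agent: Googlebot-Image
        Disallow: /images/private/
        Disallow: /photos/old-headshot.jpg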

  • Google Panda Update: Tips To Stay On Its Good Side

    Martin Panayotov recently wrote a post for SEOmoz about how a site (MyMovingReviews.com) he works for managed to benefit from the Panda updates. It’s an interesting article because usually you see articles about people who have been hit by the Panda update. Those who actually gained from the updates don’t have much to complain about, so they’re not as vocal.

    We reached out to Panayotov for more on the site’s Panda success, so perhaps you can learn a thing or two (if your site was negatively impacted) from what he has to say.

    Some will no doubt dispute this, but Panayotov believes Google’s search quality has gotten better with Panda.

    “Until the last update, Google emphasized on quality with the Panda updates,” he tells us. “We saw a lot of content farms going down to make space for the niche websites. Usually the niche websites will be more informative since they are operated by experts in the niche. General websites have less in-depth knowledge on the subject and given that they should be below the experts. Also duplicate content is gone and the thin pages are nowhere to be found on the 1st page.”

    So what did the site do right?

    “With My Moving Reviews we started with improving the quality,” he says. “We thought what would someone need to know before moving. Since we are in the moving and relocation niche, we try to be the most user friendly and informational source on the subject. Also we wanted to improve the visibility of all the great articles within the website.”

    “This helped keeping the visitor more engaged and at the same time improving the metrics that Google monitors so closely within the Panda algorithm – the bounce rate for example,” he says. “We also worked on the CTR to make sure we are preferred within the SERPs. We also included some rich snippet markups as well as authorship markup as our authors are experts in the niche. We wanted to make sure the readers know who created the article and to be able to interact and connect.”

    “We love user generated content,” he adds. “Starting from the moving company reviews, the company responses to the blog comments and Facebook comments – we are working on making it even easier for the visitors to share, comments and interact. We want to cover every side of the story by making the process easier and more user friendly. I think every major website and brand should try to utilize more and more UGC within their websites. This will also help with the search engines since they love fresh content.”

    They do love fresh content. Google, in particular, has displayed an increased emphasis on freshness of search results in recent months. Ever since the Freshness update in November, Google has made more subtle improvements here and there related to fresher results.

    I say improvements, though in all honesty, I think they go too far with freshness sometimes. The right answer isn’t always from this week. Though, Google certainly had to do something to help fill the void left by the expiration of its deal with Twitter.

    “We analyzed our traffic closely,” continues Panayotov. “It turned out that people are searching for moving services directly from their phones more and more. First we created iPhone and Android apps to cover that, but it wasn’t enough. This is why we created a mobile website. This was a great move because we instantly increased conversions, increased the app downloads, reduced the bounce rate and made our visitors happier.”

    By the way, Google just released some new mobile AdWords features that could help in this department as well.

    Panayotov offers five points of advice for those who have struggled with Panda:

    1. Produce great content on a regular basis. Make sure you have a content marketing plan. Don’t go for keyword articles only. There are a lot of great content opportunities out there – make sure you utilize them.

    2. Sometimes the most unexpected articles get the most shares and re-tweets, so make sure you try a different approach with every article to find the right spot.

    3. Improve social metrics, especially on Google Plus and Twitter. Be active there and connect with your industry leaders. Share great stuff on a regular basis, not only from your own websites, but from other great sources in the niche. This will help with SEO too, and will also give you ideas for new content.

    4. Work on your website and make sure you can receive user generated content. This will help your engagement metrics and will boost your rankings. Also try to mark up all content properly (see the sketch after this list). If you do, you may get rich snippets, which will increase your click-through rate. If applicable, go with the authorship markup.

    5. Improve your bounce rate and make sure visitors won’t leave your website seconds after arriving. Make them read what you have to say. You can do that by making your articles easy to scan before reading. Use a lot of H tags, bullet points and great images. Save your visitors’ time by structuring your information better. There are also some other tricks you can use, like adding a fly-box on your blogs or having a news section visible from within the posts.
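    On the markup point in number four, here’s a minimal sketch of what that looked like at the time: schema.org microdata for a review rich snippet, plus the rel=“author” link Google used to tie a page to a writer’s Google+ profile. The business name, reviewer and profile URL are made up for illustration:

        <!-- Review rich snippet via schema.org microdata (hypothetical values) -->
        <div itemscope itemtype="http://schema.org/Review">
          <span itemprop="itemReviewed">Acme Movers</span> reviewed by
          <span itemprop="author">Jane Doe</span>:
          <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
            <span itemprop="ratingValue">4</span> out of
            <span itemprop="bestRating">5</span> stars
          </span>
        </div>

        <!-- Authorship markup: link the byline to the author's Google+ profile -->
        <a rel="author" href="https://plus.google.com/00000000000000000000">Jane Doe</a>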

    With regards to recent Google updates, Panayotov says, “There is another factor introduced that I think have something to do with keyword density and synonyms. It affects some of the heavily mentioned main keywords as well as long tails. As it is a pretty fresh update there is still not a lot of information out there.”

    “On the other hand, the new Penguin update might be looking after the links,” he adds. “There are some recent speculations about penalizing footer links and also too many links from blog websites. There are some interesting signals from Google that we are just starting to analyze.”

    “As Google tries to make sure it gets harder for SEOs to manipulate the SERPs, an idea would be to structure your website and information as there was no Google,” says Panayotov. “Focus on the visitor. Make sure you lead them the way they were supposed to and not for the best SEO benefit. I believe this would be the best long-term advice.”

    Given that Google always says it wants to deliver the best content for the user, and that Google’s Matt Cutts said Google wants people doing no SEO at all to be “free to focus on creating amazing, compelling web sites,” that’s probably not bad advice.

  • Google Penguin Update: 12 Tips Directly From Google

    In case you haven’t heard the news, the official name for the Webspam update Google launched this week is reportedly the Penguin update. Google’s Matt Cutts even tweeted a picture of a stuffed Panda hanging out with a stuffed Penguin. How cute.

    Now, we don’t know exactly what all Google takes into account with this Penguin update. But Google made it pretty clear that it’s about targeting those violating its quality guidelines. Here’s an exact quote from the announcement:

    In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.

    There wasn’t this much apparent clarity with the Panda update. There was (and frankly, still is) a lot of speculation about how to survive Panda. Google did release a list of questions that webmasters should ask themselves related to how Google assesses quality, but it wasn’t completely black and white.

    Luckily, Google lists exactly what the quality guidelines are. In other words, Google tells you exactly what not to do.

    There are 8 “specific guidelines”. They are (verbatim):

    1. Avoid hidden text or hidden links.

    2. Don’t use cloaking or sneaky redirects.

    3. Don’t send automated queries to Google.

    4. Don’t load pages with irrelevant keywords.

    5. Don’t create multiple pages, subdomains, or domains with substantially duplicate content.

    6. Don’t create pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware.

    7. Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.

    8. If your site participates in an affiliate program, make sure that your site adds value. Provide unique and relevant content that gives users a reason to visit your site first.

    Some of them are more black and white than others. For example, avoiding hidden text or hidden links seems like a pretty obvious thing. Just don’t do it. The duplicate content one is a little different. What does Google consider “substantially duplicate content”? How much is too much?
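    To make the first guideline concrete, “hidden text” generally means copy that crawlers can read but users never see. A contrived example of what not to do:

        <!-- What NOT to do: keyword stuffing hidden from users -->
        <div style="display:none">cheap hotels best hotels discount hotel deals</div>

        <!-- Also covered: text styled to match the background color -->
        <p style="color:#fff; background-color:#fff">more stuffed keywords</p>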

    Beyond the specific guidelines, Google also lists 4 “basic principles”. These are:

    1. Make pages primarily for users, not for search engines. Don’t deceive your users or present different content to search engines than you display to users, which is commonly referred to as “cloaking.”

    2. Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”

    3. Don’t participate in link schemes designed to increase your site’s ranking or PageRank. In particular, avoid links to web spammers or “bad neighborhoods” on the web, as your own ranking may be affected adversely by those links.

    4. Don’t use unauthorized computer programs to submit pages, check rankings, etc. Such programs consume computing resources and violate our Terms of Service. Google does not recommend the use of products such as WebPosition Gold™ that send automatic or programmatic queries to Google.

    Google provides plenty more guidelines and elaboration on the quality guidelines in its help center.

    Of course, none of this is new. It’s just that now Google has a better way to enforce them (or at least, it hopes it does).

    More on Penguin/Webspam Update:

    Google Penguin Update: The New Name For The WebSpam Update
    Webspam And Panda Updates: Does SEO Still Matter? 
    Google Webspam Algorithm Update Draws Mixed Reviews From Users 
    Google Webspam Update: Where’s The Viagra? [Updated] 
    Google Webspam Update: “Make Money Online” Query Yields Less Than Quality Result 
    Google Webspam Update: Losers & Winners, According To Searchmetrics [Updated] 
    How Much Of Google’s Webspam Efforts Come From These Patents? 
    Google Panda Update: Data Refresh Hit Last Week

    Image: Batman Returns from Warner Bros.

  • Google Launches A Bunch Of Mobile Ad Improvements

    Google announced some new mobile search ad improvements today. These include: app promotion with a new Mobile App extension, richer info about mobile apps in ad units, the ability to track Android app downloads through AdWords and the addition of custom search ads to tablet apps.

    “For the first time, businesses can use AdWords mobile search ads as a holistic solution to promote, monetize and track their app downloads,” says Mobile Search Ads product manager Anurag Agrawal.

    The Mobile App extension is a new member of Google’s Ad Extensions for promoting apps. It lets advertisers add a mobile app download link to their search ads. Google says beta testers found a 6% increase in clickthrough rate with the extension.

    Mobile App extension

    As for the richer info in the ad units, it includes image previews, app descriptions, pricing and ratings. All of the info comes directly from Google Play and the iTunes App Store.

    Richer mobile ads

    Advertisers will be able to track app downloads from AdWords, specifically as AdWords conversions.

    The custom search ads for tablet apps are available through Google’s new AdMob SDK.

    All of this comes after two quarters in a row of sharp declines in cost per click for Google ads, for which mobile has been largely blamed. More people are searching with their mobile devices, but the CPCs just haven’t been what they are for desktop. Google believes that will change, however.

    During the company’s earnings call earlier this month, CEO Larry Page said he’s “very bullish” that mobile CPCs will get better, noting that Google is making lots of investments in that area and comparing it to how desktop search was in the early 2000s.

    “Mobile apps represent a significant opportunity for businesses to reach their customers, and mobile search is an important channel to reach these customers,” says Agrawal. “We’re looking forward to bringing new products in the coming year that will help businesses grow by promoting, tracking, and monetizing their mobile apps with Google.”

    It will definitely be interesting to see what other improvements and innovations Google comes up with for mobile ads this year. It will be even more interesting to see how CPCs react in the coming quarters.

  • Recovering From Google’s Penguin Update

    First, before you start your campaign for Penguin recovery, you should probably determine whether you were actually hit by the Penguin update, or by the Panda update (or even some other Google algorithm change).

    Shortly after the Penguin update rolled out, Google’s Matt Cutts revealed that Google had implemented a data refresh for the Panda update several days earlier. This threw off early analysis of the Penguin update’s effects on sites, as the Panda update was not initially accounted for. Searchmetrics put out a list of the top losers from the Penguin update, which was later revised to reflect the Panda refresh.

    Google also makes numerous other changes, and there’s no telling how many other adjustments they made between these two updates, and since the Penguin update. That said, these two would appear to be the major changes most likely to have had a big impact on your site in the last week or two.

    According to Cutts, the Panda refresh occurred around the 19th. The Penguin update (initially referred to as the Webspam Update) was announced on the 24th. The announcement indicated it could take a “few days”. Analyze your Google referrals, and determine whether they dropped off before the 24th (and around or after the 19th), and you should be able to determine if you are suffering the effects of Panda or Penguin, at least in theory.
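    If you’d rather let a script do the eyeballing, here’s a rough sketch in Python. It assumes you’ve exported daily Google referral counts to a CSV with date,visits rows (the filename and the 40% threshold are arbitrary), and it flags the first sharp day-over-day drop against the two update dates:

        import csv
        from datetime import date

        DROP_THRESHOLD = 0.4  # flag a day-over-day drop of 40% or more (arbitrary)
        PANDA_REFRESH = date(2012, 4, 19)
        PENGUIN_LAUNCH = date(2012, 4, 24)

        # Hypothetical analytics export: one row per day, e.g. "2012-04-18,1523"
        with open("google_referrals.csv") as f:
            rows = sorted((date.fromisoformat(d), int(v)) for d, v in csv.reader(f))

        for (_, prev), (day, visits) in zip(rows, rows[1:]):
            if prev > 0 and (prev - visits) / prev >= DROP_THRESHOLD:
                if PANDA_REFRESH <= day < PENGUIN_LAUNCH:
                    guess = "falls in the Panda refresh window"
                elif day >= PENGUIN_LAUNCH:
                    guess = "lines up with the Penguin rollout"
                else:
                    guess = "predates both updates"
                print(f"{day}: visits fell from {prev} to {visits} ({guess})")
                break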

    If it looks more likely to be Panda, the best advice is probably to focus on making your content itself better. Also, take a look at Google’s list of questions the company has publicly said it considers when assessing the quality of a site’s content. We’ve written about these in the past, but I’ll re-list them here:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    Penguin is different. Penguin and Panda are designed to work together to increase the quality of Google’s search results. Whether or not you think this is actually happening is another story, but this does appear to be Google’s goal, and at the very least, that’s how it’s being presented to us.

    Google’s announcement of the Penguin update was titled: “Another step to reward high-quality sites”.

    “The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs,” Cutts wrote in the post. “We also want the ‘good guys’ making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available ‘above the fold.’”

    If your site was hit by Penguin, you should, again, focus on quality content, and not trying to trick Google’s algorithm. All that Penguin is designed to do is to make Google better at busting you for abusing its algorithm. It’s designed to target those violating Google’s quality guidelines. The guidelines are not new. It’s not some new policy that is turning SEO on its ear. Google just found a way to get better at catching the webspam (again – at least in theory).

    So, with Penguin, rather than a list of questions Google uses to assess content, as with the Panda list, simply look at what Google has to say in the Quality Guidelines. Here they are broken down into 12 tips, but there is plenty more (straight from Google) to read as well. Google’s guidelines page has plenty of links talking about specific things not to do. We’ll be delving more into each of these in various articles, but in general, simply avoid breaking these rules, and you should be fine with Penguin. If it’s too late, you may have to start over, and start building a better link profile and web reputation without spammy tactics.

    Here’s a video Matt Cutts recently put out, discussing what will get you demoted or removed from Google’s index:

    Assuming that you were wrongfully hit by the Penguin update, Google has a form that you can fill out. That might be your best path to recovery, but you really need to determine whether or not you were in violation of the guidelines, because if you can look at your own site and say, “Hmm…maybe I shouldn’t have done this particular thing,” there’s a good chance Google will agree, and determine that you were not wrongfully hit.

    By the way, if you have engaged in tactics that do violate Google’s quality guidelines, but you have not been hit by the Penguin update, I wouldn’t get too comfortable. Google has another form, which it is encouraging people to fill out when they find webspam in search results.

    To report post-Penguin spam, fill out https://t.co/di4RpizN and add “penguin” in the details. We’re reading feedback.

    They’ve had this for quite a while, but now that some people are getting hit by the Penguin update, they’re going to be angry, and probably eager to point out stuff that Google missed, in an “if what I did was so bad, why wasn’t this person hit?” kind of mentality.

    Another reason not to be too comfortable would be the fact that Google is likely to keep iterating upon the Penguin update. We’ve seen plenty of new versions and data refreshes of the Panda update come over the past year or so. Penguin is already targeting what Google has long been against. I can’t imagine that they won’t keep making adjustments to make it better.

  • Webspam And Panda Updates: Does SEO Still Matter?

    It’s been a crazy week in search. While not entirely unexpected, Google launched its new Webspam update (which should still be in the process of rolling out, as Google said it would take a few days). This update, according to the company, is aimed at black hat SEO tactics and the sites engaging in them, to keep them from ranking over content that is just better and more relevant. While most that don’t engage in such tactics would agree that this would be a good thing, a lot of people are complaining about the effects of the update on the user experience, and on results in general.

    Do you think Google’s results have improved or gotten worse with this update? Let us know in the comments.

    The Webspam update, as it’s officially been dubbed by Google’s Matt Cutts, is really only part of the equation though. Cutts also revealed that Google launched a data refresh of the Panda update around April 19th. So it would appear that a mixture of these two updates (along with whatever other tweaks Google may have made) have caused a lot of chaos among webmasters and in some search results.

    What The Panda Update Is About

    I’m not going to spend a lot of time talking about Panda here. I feel I’ve done that enough for the past year. If you’re not familiar with Panda, I’d suggest reading through our coverage here. Essentially, it’s Google’s attempt to make quality content rise to the top. There are a lot of variables, opinions and speculation throughout the Panda saga, but in a nutshell, it’s just about Google wanting good, quality content ranking well.

    What The Webspam Update Is About

    Interestingly enough, the Webspam update is about quality content as well. In fact, Google’s announcement of the update was titled: Another Step To Reward High-Quality Sites. It can be viewed as a complement to Panda. A way for Google to keep spammy crap from interfering with the high quality content the Panda update was designed to promote. That is, in a perfect world. But when has this world ever been perfect? When has Google ever been perfect?

    When Matt Cutts first talked about this update, before it had a name or people even really knew what to expect, he said Google was going after “over-optimization”. He said, at SXSW last month, “The idea is basically to try and level the playing ground a little bit, so all those people who have sort of been doing, for lack of a better word, ‘over-optimization’ or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little more level.”

    At the time, we wrote an article about it, talking about how Google was working on making SEO matter less. This week, Cutts aimed to clarify this a bit. Danny Sullivan quotes Cutts as saying, “I think ‘over-optimization’ wasn’t the best description, because it blurred the distinction between white hat SEO and webspam. This change is targeted at webspam, not SEO, and we tried to make that fact more clear in the blog post.”

    Well, it’s clear that black hat webspam is a target, because the post says those exact words. “The opposite of ‘white hat’ SEO is something called ‘black hat webspam’ (we say ‘webspam’ to distinguish it from email spam),” Cutts says in the post, later adding, “In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content.”

    OK, so as long as you abide by Google’s quality guidelines, this update should not impact you negatively, right?

    The part that isn’t quite as clear is how much SEO tactics really matter. While he has clarified that they’re more concerned about getting rid of the black hat stuff, he also said something in that post which would seem to indicate that Google wants content from sites that aren’t worried about SEO at all to rank better too (when it’s good, of course).

    “We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites,” says Cutts. Emphasis added.

    To me, that says that Google is not against white hat SEO (obviously – Google promotes plenty of white hat tactics), but they also would like to have it matter less.

    While I’m sure many in the SEO industry would disagree (because it could cost them their businesses), wouldn’t it ultimately be better for users and webmasters alike if they didn’t have to worry about SEO at all? If Google could just determine what the best results really were?

    Don’t worry, SEOs. We don’t live in that fantasy land yet, and while Google (and its competitors) would love to be able to do this, there is little evidence to suggest that will happen in the foreseeable future. In fact, I’d expect the nature of how we consume information from the web to evolve so much by that point, that it may not even be a relevant discussion.

    But rather than talk about what the future may bring (though Google’s certainly thinking about it), let’s focus on the here and now.

    Who Has Felt The Effects Of Google’s Updates?

    You can browse any number of forum threads and blog comments and see plenty of personal stories about sites getting hit. Searchmetrics, as it usually does following major Google updates, compiled some preliminary lists of the top winners and losers. Before we get to those lists, however, there are some caveats. For one, the firm was clear that these are preview lists. Secondly, the update has probably not finished rolling out yet. Third, they were put out before the Panda refresh was made public, and Matt Cutts says the list isn’t indicative of the sites impacted by the Webspam update.

    He told Sullivan, “There’s a pretty big flaw with this “winner/loser” data. Searchmetrics says that they’re comparing by looking at rankings from a week ago. We rolled out a Panda data refresh several days ago. Because of the one week window, the Searchmetrics data include not only drops because of the webspam algorithm update but also Panda-related drops. In fact, when our engineers looked at Searchmetrics’ list of 50 sites that dropped, we only saw 2-3 sites that were affected in any way by the webspam algorithm update. I wouldn’t take the Searchmetrics list as indicative of the sites that were affected by the webspam algorithm update.”

    OK, so the lists are apparently more indicative of the latest Panda victims and winners. We still don’t really know who the biggest losers and winners on the Webspam front are. Perhaps Searchmetrics will release another list soon, with this new information taken into account.

    Here are the lists:

    Searchmetrics list

    Searchmetrics list

    Note that Demand Media’s eHow.com is not on the list. If you’ve followed the Panda saga all the way, you’ll know that it has always been in the conversation. Thought of as a content farm, it was the kind of site many thought Panda was designed to target. While it managed to escape unscathed for a while, Panda eventually caught up with it, and Demand Media made a lot of changes, which seem to have helped tremendously. They deleted a lot of articles and implemented some other things designed to keep quality up.

    During the company’s most recent earnings call (there’s another one coming in May), Demand Media said it hadn’t been affected by a Google update since July. It will be interesting to see what they say on the next call.

    There is some speculation that eHow may have benefited from recent Google updates, whether Panda or Webspam. Here’s a tweet from WebmasterWorld/PubCon Founder Brett Tabke:

    Did ‘ehow’ just make a comeback in the serps? hmmm – ran into them in 4 searches in last hour.

    We asked Demand Media if they’ve seen any increase in Google referrals. The company won’t comment because they’re in a quiet period ahead of their results announcement.

    Are Google Results Better?

    There is never a shortage of criticism of Google’s search results, yet it has managed to steadily dominate the market, so clearly they’ve remained good enough not to alienate the majority of users. There do, however, seem to be some very identifiable flaws in some search results right now.

    For example, all kinds of weird things are going on with the SERP for “viagra”. Viagra.com, the official site, was not on the first page, when it should have been the first result. Just as I was writing this piece, viagra.com reappeared at number one. More on the other viagra page issues (some of which are still there) here.

    For the query, “make money online,” the top result was a page without any content on it whatsoever. Not what Google had in mind in terms of quality, I assume. Looking now, it actually appears Google has fixed this one too.

    A couple of things we’ve seen mentioned repeatedly by webmasters, with regard to what has gotten sites’ Google rankings hit, are exact match domains and sites with a lot of links from spun content sources. Of course not every exact match domain was hit, but it could be a factor for some topics that tend to generate a lot of spam. Viagra would certainly fit that bill, and may have just been an innocent casualty, which Google had to correct. I wonder how many more of those there are, and whether Google will correct them.

    From what Google says, it’s more about things like keyword stuffing, link schemes and other things that violate its quality guidelines. You may want to go read those carefully.

    Update: Apparently, the Webspam update is now called the Penguin update, even though Cutts already called it the Webspam update. Sigh. I guess I have some re-tagging to do.

    What do you think? Did Google get its Webspam update right? As Panda continues to march on, is that making results better? Share your thoughts in the comments.

  • Reconsideration Request Tips From Google [Updated]

    If you think you’ve been wrongfully hit by Google’s Penguin update, Google has provided a form that you can fill out, in hopes that Google will see the light and get your site back into the mix.

    The update is all about targeting those in violation of Google’s quality guidelines. It’s an algorithmic approach designed to make Google better at what it has been trying to do all along. For those Google has manually de-indexed, there is still a path to redemption, so it seems likely that those impacted by the update can recover as well.

    For example, if you were busted participating in a link scheme, you’re not necessarily out of Google forever. Google says once you’ve made changes to keep your site from violating Google’s guidelines, you can submit a reconsideration request.

    To do so, go to Webmaster Tools, sign into your Google account, make sure you have your site verified, and submit the request.

    Google’s Rachel Searles and Brian White discuss tips for your request in this video:

    “It’s important to admit any mistakes you’ve made, and let us know what you’ve done to try to fix them,” says Searles. “Sometimes we get requests from people who say ‘my site adheres to the guidelines now,’ and that’s not really enough information for us, so please be as detailed as possible. Realize that there are actually people reading these requests.”

    “Ask questions of the people who work on your site, if you don’t work on it yourself,” she suggests, if you don’t know why you’re being penalized. Obviously, read the quality guidelines. She also suggests seeking help on the Google Webmaster forum, if you’d like the advice of a third party.

    “Sometimes we get reconsideration requests, where the requester associates technical website issues with a penalty,” says White. “An example: the server timed out for a while, or bad content was delivered for a time. Google is pretty adaptive to these kinds of transient issues with websites. So if you sometimes misread the situation as ‘I have a penalty,’ and seek reconsideration, it’s probably a good idea to wait a bit, see if things revert to their previous state.”

    “In the case of bad links that were gathered, point us to a URL-exhaustive effort to clean that up,” he says. “Also, we have pretty good tools internally, so don’t try to fool us. There are actual people, as Rachel said, looking at your reports. If you intentionally pass along bad or misleading information, we will disregard that request for reconsideration.”

    “And please don’t spam the reconsideration form,” adds Searles. “It doesn’t help to submit multiple requests all the time. Just one detailed concise report and just get it right the first time.”

    Google says they review the requests promptly.

    Update: Apparently reconsideration requests don’t do you a lot of good if you were simply hit by the algorithm. A reader shares (in the comments below) an email from Google in response to such a request:

    Dear site owner or webmaster of http://www.example-domain.com/,

    We received a request from a site owner to reconsider http://www.example-domain.com/ for compliance with Google’s Webmaster Guidelines.

    We reviewed your site and found no manual actions by the webspam team that might affect your site’s ranking in Google. There’s no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team.

    Of course, there may be other issues with your site that affect your site’s ranking. Google’s computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users.

    If you’ve experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site’s content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you’ve changed the URLs for a large portion of your site’s pages. This article has a list of other potential reasons your site may not be doing well in search.

    If you’re still unable to resolve your issue, please see our Webmaster Help Forum for support.

    Sincerely,

    Google Search Quality Team

    Anyhow, should you need to submit a reconsideration request (I assume Google will still take manual action as needed), these tips might still come in handy.

    Image: Batman Returns from Warner Bros.

  • Google Penguin Update: Petition Calls For Google To Kill It

    Last week, Google gave frustrated webmasters a place to complain if they felt they were unjustly hit by the Penguin update. While I’m sure Google has received plenty of feedback through that, some are gravitating towards a petition to get Google to kill the Penguin update.

    For those of you who haven’t been following, Google announced the Penguin update (formerly known as the “Webspam update”) on April 24th, to target sites violating Google’s quality guidelines.

    Here’s what the petition says:

    Penguin killed

    Please kill your Penguin update!

    With the recent Google Penguin update, it has become nearly impossible for small content based websites to stay competitive with large publishers like eHow, WikiHow, Yahoo Answers and Amazon.

    Countless webmasters have seen their livelihoods vanish with this update. Sergey Brin recently came out against “Walled Gardens” of the likes of Facebook. However, the Penguin update has created a similar garden that only admits multimillion dollar publishing platforms.

    I’ll sign off with the words of someone who has lost everything in this update:

    “I got stuffed by it. I have a 7 year old website with SEO work done on it several years ago. No real SEO done in the past 3 years. So I have been penalised for SEO work done 3 years ago is all I can think.

    My website “was” top of its niche, with several hundred multi million pound clients. In the past day we have had a 90% drop in traffic and all but a bare few keywords left with rankings. Over 250 rankings we did have that we monitor each day have gone. These were top 3 rankings, now not even in the top 200.

    We have never done any bad SEO, we need to compete, but we have never done black hat. Saying that, what we did do was borderline, but then so does everyone else so we were left with little choice.

    Overnight my business which supports my 5 children, 3 employees, pays for my mortgage and debts etc has been wiped out.

    Thanks Google. At a time where almost every country in the world is suffering, way to go with applying a little more hardship to people whom have just tried to play the game as does everyone.

    The petition, which seeks 500 signatures, has 289 so far. There are also plenty of comments from webmasters leaving their reasons for signing.

    We saw plenty of stories about people losing their businesses and having to get rid of employees when Google launched the Panda update, and it appears that the Penguin update is having a similar effect.

    It’s still the early days for Penguin. My guess is that we’ll continue to see more adjustments on Google’s part. It’s hard to gauge how well Google’s update did from the outside looking in, in terms of getting rid of webspam and not penalizing the innocent. We have seen some examples where Google results were quite questionable, though Google quickly made adjustments. Of course, examples are always out there waiting to be pointed out, independent from the Penguin update.

    Comic image courtesy: DC Comics: Batman Annual #15 (via alternity)

  • Google Obviously Powers Ask.com’s Paid And Organic Search Results

    Not that this will necessarily come as a surprise to you, but it seems pretty obvious that Google is powering Ask’s organic search results. Ask has an open partnership with Google for its sponsored search results, but will not come right out and say who is powering its regular results.

    I’m not sure that there was much doubt it was Google anyway, but after looking at Google’s results for “viagra” in light of its Penguin update, and comparing them to the results on other search engines, Ask’s SERP for the query was nearly identical, down to the specific flaws we pointed out about Google’s version. Google has corrected some of these flaws, and those same ones appear to have been corrected on Ask’s version as well.

    A spokesperson for Ask told us, “A third-party partner powers core web search on Ask.com, but that information is not public for contractual reasons.”

    “Ask’s search technology is focused on surfacing answers to questions rather than links, and it’s powered by a combination of technologies,” she said. “A third party search engine supplies the raw search feeds and we build our own algorithms on top of that, designed specifically to locate and extract answers to questions.”

    Here’s what Ask says on its Editorial Guidelines page about its automated search results:

    Ask.com delivers its primary search results using it’s proprietary search technology. These search results appear under the heading “Web Results”. Ask.com search technology uses sophisticated algorithms and Subject-Specific PopularitySM data to generate the most relevant and authoritative results on the Web.

    Here’s what it says about its sponsored links:

    Results appearing under the heading “Sponsored Web Results” or “Sponsored Web Result” are provided by Google, a third party provider of pay for performance search listings. Google generates highly relevant sponsored results by allowing advertisers to bid for placement in this area based on relevant keywords. These results, which are powered by Google’s advanced algorithms, are then distributed across the Internet to some of the world’s most popular and well-known Web sites, including Ask.

    Here’s a screen cap of the “viagra” results before they were fixed:

    Ask.com “viagra” search results (screen cap)

    You can just compare the results to the ones I talked about in this article and see the obvious similarities (which were not all duplicated on the other search engines).

    By the way, if you ask Ask.com, “Does Google power Ask.com’s search results?” the top two results are articles that suggest that Google may power Ask.com’s search results. Of course, they also happen to be the same results Google gives you when you ask the same question in a Google query.

    If you’re counting Bing and Yahoo together in those search market reports, you might as well be counting Google and Ask together as well. And AOL, of course.

  • Here’s What Matt Cutts Says to Sites That Have Been Hacked

    Google’s head of Webspam, Matt Cutts, has been in the news a lot this week, thanks to Google’s big webspam update, which has become officially known as the Penguin update. As Cutts says, Google has to deal with more types of spam than just the black hat SEO tactics the update targets. It also has to deal with sites that have been hacked.

    It’s not uncommon to stumble across compromised sites in Google’s search results. In fact, we saw a few with the “This site may be compromised” tag on the SERP for “viagra” this week, when we were analyzing the effects of the Penguin update. While Google addressed some issues with that SERP (viagra.com is ranking at the top again), there are still some compromised results on the page, even today.

    On his personal blog, Cutts posted an example email of what he tells site owners who have sites that Google has identified as hacked. The email (minus identifying details) says:

    Hi xxxxxxx, I’m the head of Google’s webspam team. Unfortunately, example.com really has been hacked by people trying to sell pills. I’m attaching an image to show the page that we’re seeing.

    We don’t have the resources to give full 1:1 help to every hacked website (thousands of websites get hacked every day–we’d spend all day trying to help websites clean up instead of doing our regular work), so you’ll have to consult with the tech person for your website. However, we do provide advice and resources to help clean up hacked websites, for example
    http://support.google.com/webmasters/bin/answer.py?hl=en&answer=163634
    https://sites.google.com/site/webmasterhelpforum/en/faq-malware-and-hacked-sites
    http://googlewebmastercentral.blogspot.com/2008/04/my-sites-been-hacked-now-what.html
    http://googlewebmastercentral.blogspot.com/2007/09/quick-security-checklist-for-webmasters.html
    http://googlewebmastercentral.blogspot.com/2009/02/best-practices-against-hacking.html

    We also provide additional assistance for hacked sites in our webmaster support forum at https://groups.google.com/a/googleproductforums.com/forum/#!forum/webmasters . I hope that helps.

    Regards,
    Matt Cutts

    P.S. If you visit a page like http://www.example.com/deep-url-path/ and don’t see the pill links, that means the hackers are being extra-sneaky and only showing the spammy pill links to Google. We provide a free tool for that situation as well. It’s called “Fetch as Googlebot” and it lets you send Google to your website and will show you exactly what we see. I would recommend this blog post http://googlewebmastercentral.blogspot.com/2009/11/generic-cialis-on-my-website-i-think-my.html describing how to use that tool, because your situation looks quite similar.

    Cutts says the best advice he can give to site owners is to keep their web server software up to date and fully patched. If you want Google’s advice on the other kind of spam, read this.
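    Incidentally, if you can’t use Fetch as Googlebot, a crude approximation is to request a page with and without a Googlebot user-agent string and compare what comes back. A rough sketch in Python (the URL is a placeholder; note that sneakier hacks cloak by IP address rather than user-agent, which this won’t catch):

        import urllib.request

        URL = "http://www.example.com/deep-url-path/"  # placeholder

        GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                        "+http://www.google.com/bot.html)")
        BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"

        def fetch(url, user_agent):
            # Fetch the page while pretending to be the given client
            req = urllib.request.Request(url, headers={"User-Agent": user_agent})
            with urllib.request.urlopen(req) as resp:
                return resp.read().decode("utf-8", errors="replace").lower()

        as_googlebot = fetch(URL, GOOGLEBOT_UA)
        as_browser = fetch(URL, BROWSER_UA)

        # Injected pill spam often only shows up in the Googlebot version
        for marker in ("viagra", "cialis", "pills"):
            if marker in as_googlebot and marker not in as_browser:
                print(f"'{marker}' is served to Googlebot but not to browsers; "
                      "the page may be cloaking hacked content")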

  • Google Penguin Update Gets Fresh Losers List From Searchmetrics

    Earlier this week, Searchmetrics put out lists of winners and losers from Google’s Penguin update (when it was still called the Webspam update). After the lists were released, Google’s Matt Cutts spoke out about them, saying that they were inaccurate, because there had also been a Panda update, and that the lists were likely more indicative of that.

    Searchmetrics has now updated the lists, acknowledging what Cutts had to say.

    “I took nearly a huge set of keywords from short-head to medium and low search volume and looked at the current rankings from position 1 to 100 and compared the rankings to April 20th,” Searchmetrics Founder Marcus Tober writes. “In the data were also some glitches from the Panda 3.5 update which was going live from April 19th to 20th, Matt Cutts mentioned. But overall you see a trend of those domains which really lost visibility within the Google Penguin update.”

    Penguin Losers

    “A lot of these losers are database-driven websites – they mainly aggregate information and use large database systems to create as many pages as possible. Sites such as songlyrics.com, cubestat.com, lotsofjokes.com or merchantcircle.com fall into this pattern. It makes sense that these sites will lose visibility,” says Tober. “Press portals and feed aggregators such as pressabout.us, newsalloy.com and bloglines.com were also affected, which makes sense from a Google point of view since these are the website types that are very often created by very aggressive (possibly overly aggressive) SEOs and often contain similar content.”

    He notes that ticketnetwork.com and ticketcity.com fit the profile of Google’s efforts against automatically generated, and possibly spun, content.

    If you need to know exactly how to avoid getting caught by Google’s Penguin update, I’d start with the tips they give you. If you think you were unfairly hit by it, you can let Google know with a new form they’re providing.

    Image: Batman Returns From Warner Bros.

  • Google Penguin Update: Google Gives You A Place To Complain

    Google’s Matt Cutts tweeted that Google has a form webmasters can fill out if they think they’ve been wrongly hit by Google’s Penguin update (also known as the Webspam Update).

    If you know a site affected by algo update that you don’t think should be affected, we made a form to provide feedback: http://t.co/eE7TTWBz

    The link takes you to a page that looks like this:

    Penguin Feedback Form

    It’s interesting that Google is giving users a form this time. With the Panda complaints, Google was trying to have them all directed to a particular thread in its Webmaster forums, though the thread eventually got broken up.

    On the form, you’ll notice that Google tells you to go to a different page to submit a spam report, and to use the word “Penguin”.

    @mattcutts Totally forgot to add the word penguin in the spam report, still going to get any attention?

    @DanielDeceuster might want to submit again and include penguin just to be safe.

    @mattcutts Thank you! We were seeing some major effects and wanted a good way to report some poor quality sites coming up in search.

    @ScottJConlon we’re definitely interested and we’ll be reading the feedback from the forms.

    Keep in mind when doing your reporting on either side of the equation that the Penguin update is specifically aimed at sites that are violating Google’s quality guidelines. Here, you can read about what those are exactly.

    More Penguin Update coverage here.

  • Zerg Rush Easter Egg Destroys Your Google Search Results

    As if it’s not already hard enough to get anything done on a Friday…

    A new Easter Egg in Google search results allows you to fight an onslaught of little Google “o”s, usually with disastrous results. Well, maybe I just suck at the game.

    Typing Zerg Rush into your search bar will produce a search page that immediately comes under attack. Your mission, should you choose to accept it, is to destroy the “o”s by clicking them with your mouse before the green bar associated with each level turns red. The attackers will come from all directions – top, bottom, sides. And it’s really pretty hard. This guy is a lot better than I am – but to be fair it’s early.

    If you don’t want to do battle, you can always click on the X at the top right of your results to stop the attack. There, you’ll also see a stat counter for how many you’ve destroyed vs. how much damage they’ve done.

    Once the search game is over, you can post your score to Google+.

    Real-time strategy gamers (and others as well) will know why Google built its little Easter Egg around the term “Zerg rush.” The Zerg come from the game StarCraft, and the “rush” (or Zerging) is now gaming-speak for “sacrificing economic development in favor of using many low-cost and weak units to rush and overwhelm an enemy by attrition or sheer numbers,” according to Wikipedia. Basically, to bum rush the hell out of someone.

    So go ahead – protect your search results before it’s too late.

  • Google Penguin Update: The New Name For The WebSpam Update

    Here we go.

    Get ready for a barrage of Penguin articles to complement the Panda articles, just as the Penguin update complements the Panda update in bringing quality to Google’s search results (or at least trying to). Yes, the Webspam Update has now been named the Penguin Update, reportedly.

    According to Danny Sullivan, whose word is pretty credible within the search industry, Google has officially named the Webspam Update the Penguin update. Sullivan had previously reported that Google’s Matt Cutts specifically called it the Webspam algorithm update, but has now altered his article, saying Google is officially calling it the Penguin update.

    Matt Cutts tweeted this Instagram photo (why no Google+?) which would seem to confirm the name:

    Matt Cutts Penguin Tweet

    At least it will be easier to find stock images of penguins (as opposed to webspam) for future articles. And it’s better than the “viagra update” (arguably).

    More coverage on the algorithm (and not the silly name) here:

    Webspam And Panda Updates: Does SEO Still Matter?
    Google Webspam Algorithm Update Draws Mixed Reviews From Users
    Google Webspam Update: Where’s The Viagra? [Updated]
    Google Webspam Update: “Make Money Online” Query Yields Less Than Quality Result
    Google Webspam Update: Losers & Winners, According To Searchmetrics [Updated]
    How Much Of Google’s Webspam Efforts Come From These Patents?
    Google Panda Update: Data Refresh Hit Last Week

  • Google Webspam Algorithm Update Draws Mixed Reviews From Users

    Google’s Matt Cutts has been talking about leveling the playing field for sites that don’t participate in “over-optimization”. Last month at SXSW, Cutts made something of a pre-announcement about such changes, and it looks like a major part of these efforts is now launching.

    According to Danny Sullivan, who spoke directly with Cutts, this is indeed the change Cutts was referring to at SXSW, though Cutts admits “over-optimization” wasn’t the best way of putting it, because it’s really about webspam, not white hat SEO techniques.

    Cutts himself announced a new algorithm change targeted at webspam, which he describes as black hat techniques. “We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings,” he says.

    Link schemes are actually something webmasters have been getting messages from Google about already. The company recently de-indexed paid blog/link networks, and notified webmasters about such links.

    “The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines,” says Cutts. “We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.”

    Google has kind of sent webmasters mixed signals about search engine optimization. They recently shared some SEO DOs and DON’Ts, specifically talking about some white hat things webmasters can do to help Google rank their content better. And Cutts’ point about not divulging specific signals so people can’t game search results is one the company has stood by for ages. But at the same time, Google does divulge algorithm changes it makes via monthly lists, which seem to dare webmasters to play to certain signals. That’s not to say they’re encouraging the kind of black hat stuff Cutts is talking about here, but doesn’t it kind of say, “Hey, these are some things we’re focusing on; perhaps you should be thinking about these things with your SEO strategy?” Isn’t that encouraging “gaming” to some extent, rather than just telling webmasters not to worry about it?

    Of course Google always says not to focus on any one signal, and just focus on making good, quality content. In fact, this new change (in line with Cutts’ comments at SXSW) indicates that sites shouldn’t have to worry about SEO at all.

    “We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites,” Cutts says. Emphasis added.

    As far as black hat SEO, it’s not as if this is some big change out of the blue. Algorithmically, it’s a change, but Google has always targeted this stuff. There’s a reason Cutts has been the head of webspam. Google has never been shy about penalizing sites violating its quality guidelines. Google even penalized its own Chrome site when some paid linking by the hands of a marketing agency was unearthed.

    If you’re engaging in SEO, and Google gets you on black hat tactics, you probably knew what you were doing. You probably knew it was in violation of Google’s guidelines. Of course, that’s assuming Google’s algorithm change does not make any errors. And what are the chances of that happening? Google will be the first to admit that “no algorithm is perfect.” As we saw with the Panda update, there were some sites hit hard that possibly shouldn’t have been.

    So is that happening this time? It’s still early. As far as I can tell, the change hasn’t even finished rolling out. But there are plenty of people already commenting about it.

    Others are critical of Google’s search quality in general:

    From the comments on Cutts’ announcement:

    So far today’s search results are worse than they’ve been for the past month. On one search for a keyword phrase there’s a completely unrelated Wikipedia page, a random Twitter account for some company, and a page from an independent search engine from 1997 showing in the top 10 results. Yeah, that’s the kind of quality user experience we want to see. Way to knock it out of the park.

    well now more rubbish results appearing in search than before. more exact domain name match results and unrelated websites . Google failed once again.

    so many .info, .co unrelated domains ranked for respected queries. are you sure no mistake in this update?

    Surely, whatever these updates are doing, they are not right. Here’s just one example. A search for “ereader comparison chart” brings up “ereadercomparisonchart dot com” on 2nd page of results and it goes “Welcome! This domain was recently registered at namecheap.com. The domain owner may currently be creating a great site for..”
    While my site which provided true value to its readers is nowhere to be found.
    Please fix this.

    there is something wrong with this update . search “viagra” on Google.com 3 edu sites are showing in the first page . is it relevant? matt you failed .

    Search Google for a competitive term such as “new shoes” — look who’s #1: Interpretive Simulations – NewShoes – (Intro to Marketing, Marketing Principles). All competitive terms have some youtube videos on the top which aren’t of any good quality even. This is not what is expected of google. Please revert.

    These results have to be a complete joke, so much unrelated content is now surfaced to the top it’s sickening.

    That’s just a sampling. There’s more in other forums, of course, such as WebmasterWorld. There is some more talk about exact match domains being hit. User Whitey says:

    News just in to me that a large network of destination related exact match domains [ probably 1000+], including many premium ones [ probably 50+], ultra optimized with unique content and only average quality backlinks with perhaps overkill on exact match anchor text, has been hit.

    A few of the premium one’s have escaped. Not sure if the deeper long tail network which were exact match have been effected, but they would have had little traffic.

    The sites were built for pure ranking purposes, and although largely white hat, didn’t do much beyond what other sites in the category do.

    User Haseebnajam says:

    Ranking Increase = squidoo, blogspot, forums, subdomains
    Ranking Decrease = exact match domains, sites with lots of backlink from spun content sources

    User driller41 says:

    I am seeing changes in the UK today, most of my affiliate sites are down which is annoying – all are exact match domains btw.

    Most of the backlinks are from web2.0 sites with spun content in the downed sites.

    One interesting point is that one of the sites which I had built most links to is unafected – the only differnce between this and my downed sites is that I never got around to adding the affiliate outlinks to this website – so google does not know that this site is an affiliate and thus no punishment has been dished out.

    We’ll keep digging for more on Google’s Webspam update.

    Update: More on that viagra thing.

    The new algorithm change is launching over the next few days, Cutts says, and it will impact 3.1% of queries in English “to a degree that a regular user might notice.” It affects about 3% of queries in German, Chinese and Arabic, but the impact is bigger in “more heavily-spammed languages,” he says. “For example, 5% of Polish queries change to a degree that a regular user might notice.”

  • Google Webspam Update: Where’s The Viagra? [Updated]

    Update: Viagra.com is back at number one.

    As you may know, Google launched a new algorithm update, dubbed the Webspam Update. According to Google, it’s designed to keep sites engaging in black hat SEO tactics from ranking. The update is still rolling out, but it’s already been the target of a great deal of criticism. You can just peruse the comments on Google’s Webmaster Central blog post announcing the change, and see what people have to say.

    I can’t confirm that Viagra.com was number one in Google for the query “viagra,” but I can’t imagine why it wouldn’t have been. Either way, viagra.com is not the lead result now. That is, unless you count the paid AdWords version.

    Google Viagra results

    As you can see, the top organic result comes from HowStuffWorks.com. Then comes… Evaluations: Northern Kentucky University? Interesting. Here’s what that page looks like:

    Northern Kentucky University

    You’ll notice that this has absolutely nothing to do with Viagra.

    Browsing through some more of the results, there is some other very suspicious activity going on. Look at this result, which points to larryfagin.com/poet.html. That URL doesn’t sound like it would have anything to do with Viagra, yet Google’s title for the result says: “Buy Viagra Online No Prescription. Purchase Generic Viagra…” and the snippet says: “You can buy Viagra online in our store. This product page includes complete information about Viagra. We supply Viagra in the United Kingdom, USA and …”

    If you actually click on the result, it has nothing to do with Viagra. It’s about a poet named Larry Fagin. Not once is Viagra mentioned on the page.

    Larry Fagin

    Also on the first results page: aiam.edu. That’s the American Institute of Alternative Medicine. At least it’s semi-drug-related. However, once again, no mention of Viagra on this page, though the title and snippet Google is providing, again, indicate otherwise. Google also informs us, “this site may be compromised”. I’m not sure what about this particular result is telling Google’s algorithm that it should be displayed on page one.

    The next result is for loislowery.com:

    Lois Lowery

    You guessed it. Yet again, nothing to do with Viagra. And once again, Google displays a Viagra-related title and snippet for the result, and tells us the site may be compromised.

    Note: Not all of these results indicate that they’ve been compromised.

    A few people have pointed out the oddities of Google’s viagra SERP in the comments on Google’s announcement of the webspam algorithm change:

    Sean Jones says, “There is something wrong with this update. Search ‘viagra’ on Google.com – 3 edu sites are showing in the first page. Is it relevant? Matt you failed.”

    Lisaz says, “These results have to be a complete joke, so much unrelated content is now surfaced to the top it’s sickening. As a funny example check this one out….Search VIAGRA and look at the results on first page for USA queries. Two completely unrelated .edu’s without viagra or ED in their content. Another site about poetry with not even a mention of viagra anywhere to be found. Then two more sites that in google that have this site may be compromised warnings. LOL what a joke this update is. Sell your Google stocks now while you can.”

    ECM says, “Google.com. buy viagra online. Position 2… UNIVERSITY OF MARYLAND lol. I have seen a big mess in results now. Doesn’t this algo change just allow spammers to bring down competitors a lot more easily, just send a heap of junk/spam links to their sites. Nice one google, you’re becoming well liked. Enter BING.”

    How’s Bing looking on Viagra these days?

    Bing Viagra Results

    Yeah, I have to give Bing the edge on this one.

    And Yahoo:

    Yahoo Viagra Results

    And Blekko:

    Blekko Viagra Results

    And DuckDuckGo:

    DuckDuckGo Viagra Results

    We’ve seen people suggesting that the new Google update had a direct effect on exact match domain names. That could explain why viagra.com is MIA. However, it doesn’t exactly explain why some of these other results are appearing.

  • Google Webspam Update: “Make Money Online” Query Yields Less Than Quality Result

    Update: It looks like the top result has been changed now.

    Google announced a big algorithm change called the Webspam update. It’s in the process of rolling out, and is designed to penalize sites engaging in black hat SEO – activities that are direct violations of Google’s quality guidelines. In theory, it sounds like a good idea, but users are already complaining about the negative effects the update seems to have had on results.

    We looked at some weird things going on with Google’s results page for the query “viagra”. For one, viagra.com is not ranking at the top. This would be the obvious, most relevant choice. Most search engines agree, based on their rankings. Now, it’s nowhere to be found on the first results page for the query in Google. There are other weird results showing up as well.

    The lack of viagra.com might be explained as an issue having to do with exact match domains. People have already been talking about this in forums, and in the comments of Google’s blog post. The update, according to various webmasters, appears to have hit a fair number of exact match domains. For example, viagra.com for the query “viagra”.

    Of course, not every exact match domain for every query is missing. For example, if you search “webpronews,” you’re still going to get WebProNews.com. But perhaps there is a subset of queries that tend to have more spam targeting that were hit in this manner, and even in a case like Viagra, in which the exact match actually is the most relevant result, the algorithm is not picking up on that.

    We’ve seen a few people point out Google’s SERP for “make money online”. I don’t know that makemoneyonline.com was the top result for this before anyway. It certainly should not be:

    Make Money Online

    But the top (organic) result now is makemoneyforbeginners.blogspot.com. As Google tells us from its own snippet, “No posts. No posts.”

    No Posts

    I don’t personally believe that the fact that it’s on Blogger (Google-owned) is much of a factor here, but it’s probably worth pointing out, given that Google is often accused of favoring its own content in search results.

    Here’s what that page looks like:

    Make?

    Hardly the “quality content” Google is demanding of webmasters these days.

    To be fair, Bing is ranking this result too, for some reason. It’s not number one on Bing, but it is number three. Why is it there at all? It could be related to the whole Bing-using-Google-results affair Google called Bing out on last year. It’s the same on Yahoo, which of course uses Bing on the back-end.

  • Google Panda Update: Data Refresh Hit Last Week

    On the 17th, we wrote about some webmasters who were suspecting a major update from Google. Google’s Matt Cutts has now come out and said that there was a Panda refresh around the 19th. They just didn’t say anything about it until now, which is interesting in itself, considering they were tweeting about Panda updates before.

    This latest Panda refresh came to light as Searchmetrics put out its winner and loser lists (though the firm specified that they were not the final lists) for Google’s new Webspam update, which is presumably still rolling out. Cutts commented in response to Danny Sullivan’s article about the lists, saying, “Hey Danny, there’s a pretty big flaw with this “winner/loser” data. Searchmetrics says that they’re comparing by looking at rankings from a week ago. We rolled out a Panda data refresh several days ago. Because of the one week window, the Searchmetrics data include not only drops because of the webspam algorithm update but also Panda-related drops. In fact, when our engineers looked at Searchmetrics’ list of 50 sites that dropped, we only saw 2-3 sites that were affected in any way by the webspam algorithm update. I wouldn’t take the Searchmetrics list as indicative of the sites that were affected by the webspam algorithm update.”

    @dannysullivan Searchmetrics data is a weekly diff & includes a Panda data refresh, so sites going up/down mostly aren’t due to algo update.

    @dannysullivan yup, believe a Panda data refresh on 4/19. I don’t think @rustybrick asked us about it; we sometimes wait for him to ask. 🙂

    Webmasters have had over a year to get used to the Panda update, but it is clearly still wreaking havoc. For one, here’s the list of losers from Searchmetrics again:

    Searchmetrics loser list

    A couple weeks ago, we wrote about DaniWeb, which managed to get hit by Google yet again, after being hit by and recovering from the Panda update multiple times over the course of the past year. The latest incident may or may not have been Panda.

  • Matt Cutts Talks About How Google Handles Ajax

    Google’s Matt Cutts put up a new Webmaster Help video, discussing how Google deals with Ajax. He takes on the following user-submitted question:

    How effective is Google now at handling content supplied via Ajax, is this likely to improve in the future?

    “Well, let me take Ajax, which is Asynchronous Javascript, and make it just Javascript for the time being,” says Cutts. “Google is getting more effective over time, so we actually have the ability not just to scan in strings of Javascript to look for URLs, but to actually process some of the Javascript. And so that can help us improve our crawl coverage quite a bit, especially if people use Javascript to help with navigation or drop-downs or those kinds of things. So Asynchronous Javascript is a little bit more complicated, and that’s maybe further down the road, but the common case is Javascript.”

    “And we’re getting better, and we’re continuing to improve how well we’re able to process Javascript,” he continues. “In fact, let me just take a little bit of time and mention, if you block Javascript or CSS in your robots.txt, where Googlebot can’t crawl it, I would change that. I would recommend making it so that Googlebot can crawl the Javascript and can crawl the CSS, because that makes it a lot easier for us to figure out what’s going on if we’re processing the Javascript or if we’re seeing and able to process and get a better idea of what the page is like.”

    As a matter of fact, Cutts actually put out a separate video about this last month, in which he said, “If you block Googlebot from crawling javascript or CSS, please take a few minutes and take that out of the robots.txt and let us crawl the javascript. Let us crawl the CSS, and get a better idea of what’s going on on the page.”

    “So I absolutely would recommend trying to check through your robots.txt, and if you have disallow slash Javascript, or star JS, or star CS, go ahead and remove that, because that helps Googlebot get a better idea of what’s going on on the page,” he reiterates in the new video.
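
    To make that concrete, here is a hypothetical robots.txt of the kind Cutts is describing; the paths are our invention, so check your own file for equivalents. Googlebot supports the * and $ wildcards shown here:

        # Before: rules like these keep Googlebot away from scripts and styles
        User-agent: *
        Disallow: /javascript/
        Disallow: /*.js$
        Disallow: /*.css$

        # After: drop those Disallow lines, keeping only rules you genuinely need
        User-agent: *
        Disallow: /private/

    With the JavaScript and CSS disallows removed, Googlebot can fetch those resources and render a much more complete picture of the page.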

    In another new video, Cutts talks about why Google won’t remove pages from its index at your request.