WebProNews

Tag: SEO

  • Google Algorithm Changes: Google Just Released The Big Lists For August And September

    Google has finally posted its big list(s) of algorithm changes for the past two months. Google was steadily putting them out on a monthly basis until last time, when they waited two months to put out two months’ worth of changes. It appears that this is becoming a trend.

    Stay tuned as we look closer at the lists, and analyze Google’s changes more.

    Update: Here are some thoughts about domain-related changes.

    More on freshness

    More on Page Quality

    More on Autocomplete

    More on “Answers”

    More on “Other Ranking Components”

    More on Knowledge Graph

    More on SafeSearch

    More on Snippets

    More on Local changes

    In the meantime, here are a bunch of other things you can consider in addition to the big EMD update and Panda update Google just launched.

    Here are Google’s lists in their entirety:

    Here’s the list for August:

    • #82862. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources.
    • #83197. [project “Autocomplete”] This launch introduced changes in the way we generate query predictions for Autocomplete.
    • #83818. [project “Answers”] This change improved display of the movie showtimes feature.
    • #83819. [project “Answers”] We improved display of the MLB search feature.
    • #83820. [project “Answers”] This change improved display of the finance search feature.
    • #83384. [project “Universal Search”] We made improvements to driving directions in Turkish.
    • #83459. [project “Alternative Search Methods”] We added support for answers about new stock exchanges for voice queries.
    • LTS. [project “Other Ranking Components”] We improved our web ranking to determine what pages are relevant for queries containing locations.
    • Maru. [project “SafeSearch”] We updated SafeSearch to improve the handling of adult video content in videos mode for queries that are not looking for adult content.
    • #83135. [project “Query Understanding”] This change updated term-proximity scoring.
    • #83659. [project “Answers”] We made improvements to display of the local time search feature.
    • #83105. [project “Snippets”] We refreshed data used to generate sitelinks.
    • Imadex. [project “Freshness”] This change updated handling of stale content and applies a more granular function based on document age.
    • #83613. [project “Universal Search”] This change added the ability to show a more appropriately sized video thumbnail on mobile when the user clearly expresses intent for a video.
    • #83443. [project “Knowledge Graph”] We added a lists and collections component to the Knowledge Graph.
    • #83442. [project “Snippets”] This change improved a signal we use to determine how relevant a possible result title actually is for the page.
    • #83012. [project “Knowledge Graph”] The Knowledge Graph displays factual information and refinements related to many types of searches. This launch extended the Knowledge Graph to English-speaking locales beyond the U.S.
    • #84063. [project “Answers”] We added better understanding of natural language searches for the calculator feature, focused on currencies and arithmetic.
    • nearby. [project “User Context”] We improved the precision and coverage of our system to help you find more relevant local web results. Now we’re better able to identify web results that are local to the user, and rank them appropriately.
    • essence. [project “Autocomplete”] This change introduced entity predictions in autocomplete. Now Google will predict not just the string of text you might be looking for, but the actual real-world thing. Clarifying text will appear in the drop-down box to help you disambiguate your search.
    • #83821. [project “Answers”] We introduced better natural language parsing for display of the conversions search feature.
    • #82279. [project “Other Ranking Components”] We changed to fewer results for some queries to show the most relevant results as quickly as possible.
    • #82407. [project “Other Search Features”] For pages that we do not crawl because of robots.txt, we are usually unable to generate a snippet for users to preview what’s on the page. This change added a replacement snippet that explains that there’s no description available because of robots.txt. [See the sketch below this list.]
    • #83709. [project “Other Ranking Components”] This change was a minor bug fix related to the way links are used in ranking.
    • #82546. [project “Indexing”] We made back-end improvements to video indexing to improve the efficiency of our systems.
    • Palace. [project “SafeSearch”] This change decreased the amount of adult content that will show up in Image Search mode when SafeSearch is set to strict.
    • #84010. [project “Page Quality”] We refreshed data for the “Panda” high-quality sites algorithm.
    • #84083. [project “Answers”] This change improved the display of the movie showtimes search feature.
    • gresshoppe. [project “Answers”] We updated the display of the flight search feature for searches without a specified destination.
    • #83670. [project “Snippets”] We made improvements to surface fewer generic phrases like “comments on” and “logo” in search result titles.
    • #83777. [project “Synonyms”] This change made improvements to rely on fewer “low-confidence” synonyms when the user’s original query has good results.
    • #83377. [project “User Context”] We made improvements to show more relevant local results.
    • #83484. [project “Refinements”] This change helped users refine their searches to find information about the right person, particularly when there are many prominent people with the same name.
    • #82872. [project “SafeSearch”] In “strict” SafeSearch mode we remove results if they are not very relevant. This change previously launched in English, and this change expanded it internationally.
    • Knowledge Graph Carousel. [project “Knowledge Graph”] This change expanded the Knowledge Graph carousel feature globally in English.
    • Sea. [project “SafeSearch”] This change helped prevent adult content from appearing when SafeSearch is in “strict” mode.
    • #84259. [project “Autocomplete”] This change tweaked the display of real-world entities in autocomplete to reduce repetitiveness. With this change, we don’t show the entity name (displayed to the right of the dash) when it’s fully contained in the query.
    • TSSPC. [project “Spelling”] This change used spelling algorithms to improve the relevance of long-tail autocomplete predictions.
    • #83689. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources.
    • #84068. [project “Answers”] We improved the display of the currency conversion search feature.
    • #84586. [project “Other Ranking Components”] This change improved how we rank documents for queries with location terms.
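
    Before moving on to September, one item above is concrete enough to sketch: #82407, the replacement snippet for pages blocked by robots.txt. Whether a normal snippet can be generated at all hinges on whether the page is crawlable, which is the kind of check shown in this minimal Python sketch (the URLs are placeholders, and this is our illustration of the rule, not Google’s code):

      # Minimal sketch of a robots.txt crawlability check (hypothetical URLs).
      from urllib.robotparser import RobotFileParser

      parser = RobotFileParser()
      parser.set_url("http://www.example.com/robots.txt")
      parser.read()  # fetch and parse the robots.txt file

      url = "http://www.example.com/private/page.html"
      if parser.can_fetch("Googlebot", url):
          print("Crawlable: generate a normal snippet from the page content.")
      else:
          print("Blocked by robots.txt: show a replacement snippet explaining")
          print("that no description is available.")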

    Here’s the list for September:

    • Dot. [project “Autocomplete”] We improved cursor-aware predictions in Chinese, Japanese and Korean languages. Suppose you’re searching for “restaurants” and then decide you want “Italian restaurants.” With cursor-aware predictions, once you put your cursor back to the beginning of the search box and start typing “I,” the prediction system will make predictions for “Italian,” not completions of “Irestaurants.” [See the sketch below this list.]
    • #84288. [project “Autocomplete”] This change made improvements to show more fresh predictions in autocomplete for Korean.
    • trafficmaps. [project “Universal Search”] With this change, we began showing a traffic map for queries like “traffic from A to B” or “traffic between A and B.”
    • #84394. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources.
    • #84652. [project “Snippets”] We currently generate titles for PDFs (and other non-html docs) when converting the documents to HTML. These auto-generated titles are usually good, but this change made them better by looking at other signals.
    • #83761. [project “Freshness”] This change helped you find the latest content from a given site when two or more documents from the same domain are relevant for a given search query.
    • #83406. [project “Query Understanding”] We improved our ability to show relevant Universal Search results by better understanding when a search has strong image intent, local intent, video intent, etc.
    • espd. [project “Autocomplete”] This change provided entities in autocomplete that are more likely to be relevant to the user’s country. See blog post for background.
    • #83391. [project “Answers”] This change internationalized and improved the precision of the symptoms search feature.
    • #82876. [project “Autocomplete”] We updated autocomplete predictions when predicted queries share the same last word.
    • #83304. [project “Knowledge Graph”] This change updated signals that determine when to show summaries of topics in the right-hand panel.
    • #84211. [project “Snippets”] This launch led to better snippet titles.
    • #81360. [project “Translation and Internationalization”] With this launch, we began showing local URLs to users instead of general homepages where applicable (e.g. blogspot.ch instead of blogspot.com for users in Switzerland). That’s relevant, for example, for global companies where the product pages are the same, but the links for finding the nearest store are country-dependent.
    • #81999. [project “Translation and Internationalization”] We revamped code for understanding which documents are relevant for particular regions and languages automatically (if not annotated by the webmaster).
    • Cobra. [project “SafeSearch”] We updated SafeSearch algorithms to better detect adult content.
    • #937372. [project “Other Search Features”] The translate search tool is available through the link “Translated foreign pages” in the sidebar of the search result page. In addition, when we guess that a non-English search query would have better results from English documents, we’ll show a feature at the bottom of the search results page to suggest users try the translate search tool. This change improved the relevance of when we show the suggestion.
    • #84460. [project “Snippets”] This change helped to better identify important phrases on a given webpage.
    • #80435. [project “Autocomplete”] This change improves autocomplete predictions based on the user’s Web History (for signed-in users).
    • #83901. [project “Synonyms”] This change improved the use of synonyms for search terms to more often return results that are relevant to the user’s intention.
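
    The cursor-aware prediction entry (“Dot”) is easier to see in code. Here’s a toy Python sketch of the idea, with a made-up vocabulary (our own illustration, not Google’s system): complete only the text to the left of the cursor, and leave the untouched text to its right intact.

      # Toy cursor-aware autocomplete (made-up vocabulary, not Google's code).
      VOCAB = ["italian", "indian", "irish"]

      def cursor_aware_predictions(text, cursor):
          prefix, suffix = text[:cursor], text[cursor:]
          # Complete only the token at the cursor; keep the rest of the query.
          return [word + " " + suffix
                  for word in VOCAB if word.startswith(prefix.lower())]

      # The user searched "restaurants", moved the cursor to the front, typed "I":
      print(cursor_aware_predictions("Irestaurants", 1))
      # -> ['italian restaurants', 'indian restaurants', 'irish restaurants']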

    What’s jumping out at you right away?

  • Google: By The Way, A Panda Update Is Rolling Out Alongside The EMD Update

    Last Friday, Google announced the EMD update. It was billed as a small and minor update, but the effects seemed to be fairly large, with many webmasters claiming to have been hit. Google’s Matt Cutts made it a point to say that the algorithm change was unrelated to both Panda and Penguin.

    He then said it was not the only update that was rolling out during that timeframe, noting that Google makes changes every day (over 500 a year). He didn’t happen to mention that there was a new Panda update, however. Finally, he has dropped the news that there was indeed a Panda update going on at the same time as the EMD update (and it’s still rolling out).

    Were you impacted by one of these updates? Are you able to discern which one it was? Let us know in the comments.

    Search Engine Land reports that Google released a Panda algorithm update (not a data refresh, but an actual update) on Thursday, and that it impacts 2.4% of English search queries (and is still rolling out). That’s significantly larger than the 0.6% of English-US queries Cutts said the EMD update affected. So, it seems that the majority of those claiming to be hit by the EMD update were likely hit by Panda (which would explain those claiming to be hit, that didn’t have exact match domains).

    Here’s the exact statement from Cutts that the publication is sharing: “Google began rolling out a new update of Panda on Thursday, 9/27. This is actually a Panda algorithm update, not just a data update. A lot of the most-visible differences went live Thursday 9/27, but the full rollout is baking into our index and that process will continue for another 3-4 days or so. This update affects about 2.4% of English queries to a degree that a regular user might notice, with a smaller impact in other languages (0.5% in French and Spanish, for example).”

    Couldn’t he have just said that in the first place? Google had to know the confusion this would cause. Since the original Panda update, Google has made more of an effort to be transparent about algorithm changes, and for the most part it has been. Lately, however, delayed transparency seems to be becoming the trend.

    For months, Google was releasing monthly lists of updates that had been made the prior month. The last time, they left people waiting before finally posting a giant list for two months’ worth of changes. It seems that Google is doing this again, as we have yet to see lists for August or September (assuming Google is about to release these lists).

    Either way, it appears the Panda continues to wreak havoc on webmasters. Wait until they get a load of the next Penguin.

    For those sites that were hit, the absence of an exact match domain makes the problem a little easier to figure out, at least in terms of which update the site was actually hit by. It seems unlikely that the EMD update would have done much to impact your site if it does not use an EMD. That leaves Panda (and, of course, any other updates that Google hasn’t told us about – they do make changes every day, and often more than one in a day).

    While Cutts said that the EMD update is unrelated to Panda, that is not necessarily the case, depending on how you view the comment. Algorithmically speaking, I presume Cutts means the two have nothing to do with each other. Conceptually, however, the two are very similar in that they both go after low quality. So, doesn’t it stand to reason that if you improve the quality of your content, you could recover from either update? That assumes the EMD update is one that can be recovered from. Let’s put it this way: if it’s possible to recover from the EMD update (and it most likely is), improving the quality of your site and content should be the main objective.

    This just happens to be the same objective for recovering from Panda. Of course, quality is subjective, and Google has its own view of what it entails. Luckily for webmasters, Google has essentially laid out exactly what it is looking for from content, specifically with regard to the Panda update.

    Google has pretty much given webmasters the rules of the road to Panda recovery, even if they’re not official rules. You’ve probably seen the list before, but if you were never hit by the Panda update until now, maybe you haven’t. Either way, here are the questions Google listed last year as “questions that one could use to assess the quality of a page or an article”:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    Of course, Google uses over 200 signals in all, but that should get you started on thinking about your site’s content.

    And with regards to the EMD update, remember, Google is targeting “low quality” EMDs. Not simply EMDs in general.

    We’ve provided tons of coverage of the Panda update since Google first launched it. To learn more about it, feel free to peruse the Panda section of WebProNews.

    Do you think Google has improved its search results with this algorithm combo? Is Google being transparent enough about algorithm updates for your taste? Let us know what you think in the comments.

    Image credit: Rick Bucich

  • Google Will Not Hesitate To Take Manual Action On Rich Snippet Abuse

    As previously reported, Google has updated its Webmaster Guidelines. Part of the update is some new stuff about rich snippets.

    Google is highlighting these changes specifically in a post on the Webmaster Central blog today.

    “Once you’ve correctly added structured data markup to your site, rich snippets are generated algorithmically based on that markup,” Google says in the post. “If the markup on a page offers an accurate description of the page’s content, is up-to-date, and is visible and easily discoverable on your page and by users, our algorithms are more likely to decide to show a rich snippet in Google’s search results.”

    “Alternatively, if the rich snippets markup on a page is spammy, misleading, or otherwise abusive, our algorithms are much more likely to ignore the markup and render a text-only snippet,” Google adds. “Keep in mind that, while rich snippets are generated algorithmically, we do reserve the right to take manual action (e.g., disable rich snippets for a specific site) in cases where we see actions that hurt the experience for our users.”

    As far as the actual quality guidelines for rich snippets go, you should basically avoid marking up content that is in no way visible to users, and avoid marking up irrelevant or misleading content, such as fake reviews or content unrelated to the focus of the particular page. If you’re selling products, the reviews on each page, for example, should be about that page’s particular product, and not the store itself.
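
    To make the “visible to users” point concrete, here’s a rough Python sketch of the kind of check a site owner could run on their own markup. It flags microdata values that live only in content attributes and never appear in the page’s visible text. This is our own heuristic with a made-up page, not Google’s enforcement logic:

      # Rough heuristic (not Google's logic): flag structured-data values that
      # exist only in markup attributes and never appear in visible page text.
      from html.parser import HTMLParser

      class MicrodataAudit(HTMLParser):
          def __init__(self):
              super().__init__()
              self.attr_values = []   # (itemprop, content) pairs from attributes
              self.visible_text = []
              self._skip = 0          # depth inside <script>/<style>

          def handle_starttag(self, tag, attrs):
              if tag in ("script", "style"):
                  self._skip += 1
              attrs = dict(attrs)
              # A content="" attribute holds the value in markup only.
              if "itemprop" in attrs and "content" in attrs:
                  self.attr_values.append((attrs["itemprop"], attrs["content"]))

          def handle_endtag(self, tag):
              if tag in ("script", "style") and self._skip:
                  self._skip -= 1

          def handle_data(self, data):
              if not self._skip:
                  self.visible_text.append(data)

      page = """
      <div itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Acme Anvil</span>
        <meta itemprop="ratingValue" content="4.8">
      </div>
      """

      audit = MicrodataAudit()
      audit.feed(page)
      text = " ".join(audit.visible_text)
      for prop, value in audit.attr_values:
          if value not in text:
              print(f"'{prop}' = '{value}' is marked up but never shown to users")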

    Read the whole Rich Snippets section here.

  • Google EMD Update: Good Or Bad For Search?

    On Friday, Google’s Matt Cutts revealed that Google was rolling out a new algorithm update geared at reducing “low-quality” exact match domains in search results. He indicated that “the EMD algo” affects 0.6% of English-US queries “to a noticeable degree”.

    As a webmaster or site owner, have you noticed an impact from this update? Have you noticed a dramatic change in search results as a user? Share your thoughts in the comments.

    Just to clear up any confusion from the start, Cutts also said the EMD update is unrelated to the Panda and Penguin updates. Here are his exact tweets:

    While 0.6% of English-US queries may not sound like an incredible amount of results impacted, there are already tons of people claiming to have been hit by the update. Here is a small sampling of the comments we’ve received from readers:

    90% of my sites got hit. Yes they had part of a keyword in the domain name but other than one site, I wouldn’t consider the rest of them low-quality sites. Each one had high quality unique content, numerous pages.

    This is utter nonsense. I have a site which was hit that the domain name contained one keyword that I was ranking for. But, I was also ranking for 15 other keywords that weren’t related to the domain name, but they are also nowhere to be seen in google. This is a site with 100′s of pages of unique, quality content, all hand written by me, with a high quality well followed facebook fan page. Just gone. I’m just glad I can rely on facebook for quality traffic, as it doesn’t seem that google can provide that anymore.

    Okay, at least I know what happened. Two of my websites are gone. Good sites, with unique content and a lot of backlinks and work behind.

    Some readers appear to welcome the update. Here are a few of the more positive comments we’ve received:

    I’ve been waiting for an update like this for a long time. I’ve speculated that something like this has been in the works because a brand is almost always going to be more valuable than a spammy exact match domain.

    Good Authoritative content is all that has ever mattered & has been the Google mantra from the start, The EMD with “Good Authoritative” root domain content will always have the edge…

    I was waiting for this update it may brings my blogs up in the google search. I have blogs which don’t have keywords in urls. This updates helps a lot.

    Here’s some additional reaction from Twitter:

    Dr. Peter J. Meyers at SEOmoz put together some research on the update using MozCast “Top-View” metrics, indicating that despite Cutts’ wording of “upcoming,” the change appears to have already begun:

    EMD data from SEOmoz

    “We measured a 24-hour drop in EMD influence from 3.58% to 3.21%,” writes Meyers. “This represents a day-over-day change of 10.3%. While the graph only shows the 30-day view, this also marks the lowest measurement of EMD influence on record since we started collecting data in early April.”
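
    For what it’s worth, the day-over-day figure follows directly from the two quoted measurements:

      # The 10.3% day-over-day change, from the two quoted measurements:
      before, after = 3.58, 3.21
      print(f"({before} - {after}) / {before} = {(before - after) / before:.1%}")
      # -> (3.58 - 3.21) / 3.58 = 10.3%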

    The following sites are some examples of those that got hit, according to Meyers (though he acknowledges he can’t prove it was definitely because of this specific update, it does seem highly likely): bmicalculatormale.com, charterschools.org, playscrabble.net, purses.org, and teethwhitening.com. None of these had actually ranked number one for their respective keywords, according to Meyers, but they dropped significantly from positions like 3, 4 and 7.

    It will be interesting to see if more domain-related changes are announced. This is the second one Cutts has tweeted about in recent weeks. He recently talked about a domain diversity update.

    When Google releases its monthly (sometimes) lists of algorithm changes, there is often a visible theme from month to month. In June, for example, there were quite a few updates related to how Google handles natural language. I wonder if we might see more domain-related tweaks when Google finally releases the September (and August) lists. Perhaps there will be more heading into October.

    What do you think of the EMD update? Good move on Google’s part? Let us know in the comments.

  • Google EMD Update Was Accompanied By At Least One Other Update

    Update: Apparently the other update people are experiencing was a new Panda update. Google transparency at its finest.

    As you probably know by now, Google’s Matt Cutts announced an algorithm change on Friday – the EMD update. The change was designed to reduce low-quality exact match domains in search results. Cutts deemed the change “small” and tweeted about it as a “minor” weather report.

    Based on all of the complaints we’re seeing (you can read plenty of them in the comments of this article), it may not have been all that minor. Cutts said that the change affects 0.6% of English-US queries to a noticeable degree, and noted that it was unrelated to Panda or Penguin. Still, based on all of these sites claiming to have been hit, you would think it was Panda or Penguin.

    Some webmasters claim to have been hit, but not necessarily on sites with exact match domains. So why would they have taken such a hit? Well, it’s not news that Google launches various changes to its algorithm on a day to day basis. The company often gives the “over 500 a year” number. This time is no different.

    Search Engine Roundtable is pointing to a reply Cutts gave to one person on Twitter about the situation, where he noted that he knows of one change that was also released during the same timeframe as the EMD update. Here’s the exchange (with another interesting one about Google’s struggle with quality thrown in):

    While it’s not that interesting that Google launched another change at the same time as the EMD update (again, it’s common knowledge that Google pushes changes every day), it is interesting that so many people are complaining about being hit when the update Cutts tweeted about was said to be so small, and that many of those claiming to have been hit were not dealing with exact match domains. If another change had an impact as big as the EMD update’s, or bigger, why wouldn’t Google announce that one?

    Meanwhile, we’re still waiting on Google to be “transparent” about the changes it has made over the course of August and September, with its monthly (at least they used to be) lists. All of that, combined with new updates to Google’s Webmaster Guidelines, should be enough to keep webmasters busy for a bit.

  • Google Webmaster Guidelines Get An Update

    Remember when an update to Google’s Webmaster Guidelines was spotted prematurely, before Google pulled it back down? Well, Google has officially updated them now.

    “Both our basic quality guidelines and many of our more specific articles (like those on link schemes or hidden text) have been reorganized and expanded to provide you with more information about how to create quality websites for both users and Google,” explains Google’s Search Quality team.

    “The main message of our quality guidelines hasn’t changed: Focus on the user,” the team adds. “However, we’ve added more guidance and examples of behavior that you should avoid in order to keep your site in good standing with Google’s search results.”

    This video is now at the top:

    The guidelines are important not only for ensuring that you aren’t manually penalized by Google, but also for ensuring that you don’t get hit by an algorithm update designed to enforce them (like Penguin). They’re also just good general rules to follow for SEO and user purposes.

    The guidelines are still divided into design/content, technical and quality guidelines. Peruse the new guidelines here.

  • Google Webmaster Tools Will Now Email You About Critical Site Issues

    Google announced today that Webmaster Tools will start letting you know when it discovers critical issues with your site, by sending you an email with more info. The company says it will only notify you about issues that it thinks have a significant impact on your site’s health or search performance and which have clear actions you can take to address the issue.

    “For example, we’ll email you if we detect malware on your site or see a significant increase in errors while crawling your site,” says Google Webmaster Trends analyst John Mueller.

    “For most sites these kinds of issues will occur rarely,” he adds. “If your site does happen to have an issue, we cap the number of emails we send over a certain period of time to avoid flooding your inbox. If you don’t want to receive any email from Webmaster Tools you can change your email delivery preferences.”

    This is just the latest in a series of new alerts from Webmaster Tools. Last month, Google launched alerts for Search Queries data to complement the Crawl Errors alerts it began sending out before that.

    Webmaster Tools also recently started sharing more detailed Site Error info, such as stats for each site-wide crawl error from the past ninety days. It also shows failure rates for category-specific errors. More on that here.

  • Google Sheds More Light On Freshness As A Ranking Signal

    There’s a new Webmaster Help video from Google’s Matt Cutts. In this one, he talks specifically about freshness as a ranking signal. The video is a response to the following user-submitted question:

    Google has expressed in the past that frequently updated pages get a boost in rankings (QDF), that seems to favor blogs and news sites over company sites, which have less reason to be updated often. How important of a signal is “freshness”?

    There’s no question that Google has put greater emphasis on freshness of content in many SERPs. Last November, Google launched the “Freshness” update, and since then, Google has made various adjustments to how it handles the signal.

    In fact, just since this video was released, Google put out a big list of algorithm changes it made throughout the past two months, and there were some freshness-related changes mentioned on it (though not as many as in past lists). More on that here.

    I’ve criticized the search engine’s emphasis on freshness in the past, as I’ve found, more times than I can count, instances where fresher results were shown at the expense of content that was actually useful to my search needs. Readers suggested that I was not alone.

    “There’s a little bit of an interesting twist in this question, where it’s not just the case that just because something is frequently updated – in terms of the pages on your blog or on your site – that you automatically should sort of be ranking higher. I wouldn’t have that interpretation of freshness,” Cutts says.

    “Sometimes people are looking for something that’s fresh-seeking, so if you’re searching for an earthquake or some event that just happened, that would be QDF (that would be query that deserves freshness)…not every query deserves freshness,” he says. “So…if it’s evergreen content – sometimes people are looking for long form content or doing more research – then freshness wouldn’t be counted as that much.”

    I’ve actually encountered a lot of the questionable results in searching for things that did happen in the news at one time, but were not necessarily news any longer. Part of my job is finding points of reference for articles, so this is pretty much a daily task. Freshness, in my experience, has often outweighed relevance to a fault.

    “We have over 200 signals that we use, and the thing that I would not do – the pitfall – the trap that I would not fall into is saying, ‘OK, I have to have fresh content, therefore, I’m going to randomly change a few words on my pages every day, and I’ll change the by-line date so that it looks like I have fresh content,’” Cutts continues. “That’s not the sort of thing that’s likely to actually lead to higher rankings.”

    “And if you’re not in an area about news – you’re not in sort of a niche or topic area that really deserves a lot of fresh stuff, then that’s probably not something you need to worry about at all,” he says. “It might be better to…like in SEO, it’s not like…there will always be some SEO events, but there’s some content that’s evergreen that lasts and stands the test of time, and it might be better to work on those sort of articles than just trying to jump onto whatever’s on the top of Techmeme or whatever the story du jour is.”

    “I wouldn’t spend so much time thinking about freshness just because it’s one of the over two hundred signals, that you sort of miss out on all the other signals,” he says. “Now, if you’re in a hot breaking area where you’re competing with Engadget, The Verge…you know, if you write about video games, there’s a lot of topical, breaking news, then it is good to try to be fresh and make sure that you have content that’s especially relevant.”

    “But it’s not the sort of thing where you need to worry about making sure that you’re re-writing your pages or changing words on your page just so you look fresh,” he concludes. “Google is relatively good about trying to suss out when it’s more helpful to be fresh and when it’s sort of just regular search, where web pages that were good yesterday were also good today.”
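
    One way to picture what Cutts is describing: freshness acts as one weighted signal among the two hundred or so, and the weight depends on how much the query “deserves freshness.” The Python sketch below is purely our illustration; the exponential decay, the 30-day half-life and the weights are invented, since Google has never published how freshness is actually scored.

      import math

      # Invented numbers throughout; Google has not published its function.
      def blended_score(relevance, age_days, freshness_weight, half_life=30.0):
          freshness = math.exp(-age_days * math.log(2) / half_life)
          return (1 - freshness_weight) * relevance + freshness_weight * freshness

      old_page = dict(relevance=0.9, age_days=400)

      # Evergreen research query: freshness barely matters, the old page holds up.
      print(blended_score(**old_page, freshness_weight=0.05))  # ~0.855

      # Breaking-news ("QDF") query: the same old page is heavily discounted.
      print(blended_score(**old_page, freshness_weight=0.7))   # ~0.27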

    You know what was really good for fresh results? Realtime search powered by Twitter.

  • Matt Cutts Just Announced A Google Algorithm Change

    Google’s Matt Cutts just announced a new Google algorithm change via Twitter. He says it will reduce low-quality “exact-match” domains in search results.

    It sounds like an extension of the last change he tweeted about, which was aimed at improving domain diversity. Here’s the new tweet:

    Update: Cutts tweeted a follow-up:

    Probably good of him to clear that up right away.

    Google is about due to publish its big list of algorithm changes for the months of August and September. When that happens, it will be interesting to see how many entries are related to domains. It seems like there are typically visible themes in the lists. For example, in the June list, there were a lot of changes related to improving how Google deals with natural language.

    Have you seen any effects from this update? Let us know.

  • Google Penalties Won’t Necessarily Kill Your Rankings

    According to Google, just because your site is manually penalized, it does not mean that your site will vanish from the rankings. In fact, it may not hurt it much at all in some cases. A lot of that depends on you, the webmaster, and the other signals you’ve managed to send the search engine.

    In a Google Webmaster Central forum thread (via Search Engine Roundtable), a webmaster discussed being denied his 4th reconsideration request, and Google Webmaster Trends analyst John Mueller had some interesting words. For that guy’s specific story, you can view the thread, but that’s not really the point. Here’s what Mueller said:

    The primary manual action that is affecting your site is that these unnatural links are being ignored. This is more or less in line with the spreadsheet that you have submitted, and would generally not be affecting the other links to your site. That said, while these things may have been counting for your site in the past, they no longer are — so it’s possible that you’d see some effect in your site’s crawling, indexing, and ranking. Past that, keep in mind that the manual action here might not be the strongest element affecting your site’s performance, we use over 200 factors in our crawling, indexing, and ranking, and regularly announce updates (such as http://googlewebmastercentral.blogspot.ch/2012/04/another-step-to-reward-high-quality.html ). My recommendation would be to not focus so much on this specific manual action, but instead to work to make sure that your site (and how it interacts with users and the rest of the web) is the best it can possibly be. Emphasis added.

    The manual action is just one thing Google is taking into consideration. If you have enough other positive signals going your way (or can at least gain some), it’s possible that Google’s manual penalty will have little effect on your site’s rankings.

    Speaking of signals, Google is about due to put out its monthly (at least it used to be monthly) list of algorithm changes. We have yet to see August’s changes, and September is almost over. Last time, they released two months’ worth of lists at the same time. Perhaps we’ll see them do that again. It seems likely at this point. Or who knows, maybe they’ll wait and give 3 months’ worth next time.

  • Google Tweaks Rich Snippets Testing Tool, Calls It Structured Data Testing Tool

    Google has launched a new version of its rich snippet testing tool, which it now calls the Structured Data Testing Tool. The company says it has improved how it displays rich snippets in the tool to better match how they appear in search results.

    Google also says the new design makes it clearer what structured data Google can extract from the page (and how it may be shown in search results). It’s also now available in languages other than English.

    The tool works with all supported rich snippets, as well as authorship markup, at least in theory.

    Former Googler Vanessa Fox, who built Webmaster Central, writes, “I’m having a bit of trouble with the tool. For instance, looking at the page they show in the blog post: http://allrecipes.com/recipe/banana-banana-bread/, the rich snippet appears in the tool correctly…However, the image is missing for this page in the actual search results. Why isn’t the image showing up?”

    You can access the tool here.

    Last month, Google introduced the Structured Data Dashboard in Google Webmaster Tools. More on that here.

  • Google May Soon Update Its Webmaster Guidelines

    It looks like Google may soon be changing its Webmaster Guidelines. Patrick Sexton from FeedTheBot claims to have spotted an updated version of the guidelines, and posted about them. However, he says, two Google employees told him that they were put up by mistake and “were not meant to go public yet.”

    Sexton removed his post, but in the comments of the following Google+ post about the post (via Search Engine Roundtable), someone posted a link to a screen cap of Sexton’s account of the guidelines.

    Eren Mckay

    Official Google Webmaster Guidelines gets updated
    thanks to +David Harry  for sharing:
    http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769 here's a post about it too:
    http://www.feedthebot.com/blog/official-google-guidelines-get-updated/


    Official Google Guidelines Updated – Webmaster Advice
    update: I have confirmed with two Google employees that these new guidelines were put up by mistake and were not meant to go public yet. I just happened to notice them and I naturally wrote about them…

    (Click the timestamp to go to the comment thread.)

    Google must have had some reason to pull the guidelines, so it’s hard to say how much of what Sexton spotted is what Google will end up going with. The webmaster guidelines are obviously important, however, in ensuring that a site stays in Google’s good graces and doesn’t face getting penalized or hit by algorithm changes designed to enforce the guidelines. The Penguin update was geared towards enforcing the quality guidelines specifically (part of the Webmaster Guidelines).

    According to the screen cap of Sexton’s post, there is some new stuff about rich snippets, which are not currently mentioned on the Webmaster Guidelines page at all.

    There are things like, “Review our recommended best practices for images, video and rich snippets,” and “Avoid abusing rich snippets markup.”

    Things to avoid include: automatically generated content, participating in link schemes, cloaking, sneaky redirects, hidden text/links, doorway pages, scraped content, participating in affiliate programs without adding sufficient value, loading pages with irrelevant keywords, creating pages with malicious behavior (such as phishing or installing viruses, trojans or other badware), abusing rich snippets markup, and sending automated queries to Google.

    It says to “engage in good practices” like: monitoring your site for hacking and removing hacked content as soon as it appears, preventing and removing user-generated spam on your site.

    Some other quotes from the post:

    “Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.”

    “Don’t deceive your users.”

    “Avoid scraped content.”

    “Avoid automatically generated content.”

    “Monitor your site for hacking and remove hacked content as soon as it appears.”

    “Prevent and remove user-generated spam on your site.”

    This is all pretty basic and common sense stuff, but that’s essentially what the guidelines are about, for the most part, anyway.

    I guess we’ll see if the changes are implemented soon, and whether Google has even more to add.

  • Did This Google Update Actually Improve Results?

    Google messes with its algorithm every day, but it’s not every day that Google lets us know about what it’s doing.

    On Friday, Google’s Matt Cutts tweeted about a new update to Google’s algorithm, which he said, “improves the diversity of search results in terms of different domains returned.”

    Has this update improved Google results? Let us know what you think.

    Most can probably agree that such an update would be helpful to users and webmasters alike (apart from those lucky enough to be dominating the SERPs). One WebProNews reader commented, “I think this is a great idea. I have no idea how many times I have seen a certain domain show up 3-4 different times in the top 5 pages. The first page is really the only one that matters anyways. But this will enable more people to rank for harder keywords. Well hopefully at least.”

    Another added, “I’ve seen something even worse, results from a same domain occupied the whole first 3 pages in SERPs. Hopefully this update works.”

    So far, however, we’ve seen little evidence that the update has done what it is supposed to do on a wide scale. In fact, we haven’t seen any examples where results have specifically improved, while there are some examples out there of where they have not.

    Danny Sullivan at Search Engine Land points to the results for the query “christopher jagmin plates,” for example. Search for that, and you’re likely to get ten results on the first page from ChristopherJagmin.com. You start getting into some other domains about halfway down page 2.

    Google Domain Diversity?

    Likewise, Barry Schwartz at Search Engine Roundtable points to a query for “bobs furniture”. This one isn’t quite as bad, but still, four out of seven results are from Yelp.

    Google domain diversity?

    In a WebmasterWorld forum thread on the topic, one user comments, “I’m still seeing typical results in travel though – for a sample ‘xxx hotel reviews’ no less than 8 out of 10 results are Tripadvisor. Admittedly they are on two different domains (.co.uk and .com), but surely Google can work out they’re effectively the same site. Looks like they still have some work to do.”

    Another adds, “I share that sentiment and find it borderline idiotic to return the same site up to 87 times in the top 100.”

    One member says, “Diversity in search results used to be standard in Google results. They’ve really messed up their search results and are now backpedaling.”

    Some do claim to be seeing some improvements.

    Brett Tabke, WebmasterWorld’s founder, even joined the conversation, saying, “Remember a few months ago when I had a search that returned 20 results from the same site? That type of multi-result is not happening anymore.”

    A WebProNews reader tells us, “I would say I noticed immediate improvements in recipe searches. Where AllRecipes.com used to command 70-80% of first and second page results, they don’t really take over until the 4th page now.”

    “In the last 24 hours I’ve seen several market categories where exact match domains seem to have dropped at the expense of other domains,” another reader added.

    Just because the examples haven’t been easy to spot does not mean Google’s update did not perform as intended. You can always point to examples of where Google updates didn’t work. The question is, how often are you organically happening onto search results pages where Google is plastering results from the same domain all over the page? If the answer is “not very often,” then perhaps Google succeeded in its goal.

    As Google will often say, no algorithm is perfect.

    Cutts did indicate that this was a minor update. It would be interesting to know how many sites have been positively and negatively impacted. There don’t seem to be nearly the number of complaints we would see with a Panda or a Penguin update.

    There was some speculation going around last week that Google may have launched a new Panda update, but the general consensus appears to be that what webmasters were experiencing was more likely a result of this change.

    Google did announce this morning, however, that a Panda refresh is rolling out now, so I guess we’ll see the reaction to the effects of that come pouring in next:

    Meanwhile, webmasters and SEOs are anxiously awaiting Google’s next Penguin update, which the company has indicated could be “jarring”.

    We’re also still waiting on Google to reveal its big list of algorithm changes for the month of August. Last time, they oddly waited a couple of months before unleashing a giant list of two months’ worth of changes, so it’s hard to say when we might get the next list.

    What do you think? Have you noticed any improvement? Let us know in the comments.

  • Google Panda Update: Refresh Is Rolling Out

    Google announced that a data refresh is rolling out for Panda, and that webmasters should expect “some flux” over the next few days. Here’s a tweet Google just posted:

    Some thought there had been a Panda update last week, but it turned out that webmasters were apparently seeing the effects of a different update Google’s Matt Cutts later tweeted about. Google was, however, about due to launch a Panda refresh, so here we are.

    We should be seeing plenty of webmaster feedback about this throughout the week. Have you noticed any changes to your Google referrals yet? Let us know.

    More on Panda here.

  • Google Launched An Update This Week To Improve Domain Diversity

    Google launched an algorithm update that affects the diversity of search results. Google’s head of webspam and Distinguished Engineer, Matt Cutts, tweeted:

    There have been complaints in recent weeks about Google showing search results pages with a lot of results from the same domain for a lot of queries. Presumably that will be better now, and users will get a more diverse set of results in more cases. Or maybe it’s just about spreading the love among more domains in general (and not just per page).

    That’s as much as we know about the update for now, but it’ll be interesting to see if the change is noticeable on a day to day basis.

    There has been talk from webmasters that there may have been a new Panda update this week. We’ve not heard from Google on that front, and it’s unclear at this point whether this could have been the change people were noticing.

    Google’s big list of algorithm changes for the month of August is due out any time now, and when it’s released, we’ll get more insight into the direction Google is going in, and its core areas of focus in recent weeks. Stay tuned.

  • Google Panda Update: Some Wonder If One Just Happened

    Some webmasters suspect that Google may have launched a new significant Panda update. This kind of suspicion is fairly common. Sometimes it turns out to be accurate, and sometimes not so much.

    Barry Schwartz at Search Engine Roundtable, who regularly monitors the forums, points to some discussion in WebmasterWorld, with one webmaster saying they saw a 50% increase in Google referrals to a “heavy panda’d site” on Thursday.

    “It has been hit by multiple rounds of Panda and overall has lost 90% of its google traffic since Feb 2011,” the webmaster said. “Has anyone else noticed a bump to a Panda’d site on Sept 14?”

    Sure enough, someone else had, but noted that they were not sure if it was Panda related.

    The last time there was a Panda refresh, Google tweeted about it a couple days later. We’ll keep an eye out for any such tweets, for sure.

    We’ve reached out to Google for comment, and will update if we receive confirmation or denial.

    Meanwhile, we’re still waiting for Google to release its big list of algorithm changes for the month of August. That could happen any day now. Or it could happen next month.

    Update: Not sure if it’s directly related, but Matt Cutts tweeted about a non-Panda algorithm change.

    Image: gigglecam (YouTube)

  • SEO Budgets Grow, Google Tells You What To Avoid

    Google has been making SEO more difficult for years, and perhaps that’s why a new study from SEMPO, the world’s largest search marketing-specific nonprofit trade organization, has found that SEO spending is still healthy, despite all of the freely available online information that has spread across the web over the past decade or so.

    Are you spending more or less on SEO than you were a year ago? Let us know in the comments.

    The organization has put out a 72-page report, published by Econsultancy, looking at a survey of nearly 900 companies and agencies.

    “Overall, the report depicts a stable industry, without dramatic changes,” says SEMPO. “Although the practices of search engine and digital marketing may have changed significantly as new tools, algorithms and platforms have come into play, the survey depicts very much the same goals in place.”

    Survey respondents have increased their SEO budgets, and only 2% of them indicated that they spent nothing on SEO at all.

    The amount of agency billing for SEO services is on the rise. “A significant drop in those spending less than $100k corresponds to higher numbers across the board, with the greatest increase in the $1 to $5 million range,” SEMPO says.

    The survey found that, while driving traffic remains a key goal for SEO, “survey responses show a drop in the blunt objective of driving traffic,” as SEMPO puts it. Meanwhile, the number of agencies citing brand/reputation as a goal doubled year-over-year.

    On the paid side of things, agencies evaluating their clients’ goals for paid search noted a significant rise in seeing brand/reputation as their top objective, SEMPO notes.

    One thing is clear: the search marketing industry is only increasing in value, despite the rise of social media, and Google algorithm updates forcing sites to become less dependent on Google:

    Search Industry Value

    “Changes to the Google algorithm affected a large percentage of marketers, or at least has them concerned,” SEMPO notes in the report. “87% call the updates of the last 12-18 months ‘significant or highly significant.’ In most cases marketers feel the overall effect to be positive, but success in combating SEO spam sites has come at the expense of many legitimate brands.”

    On Friday, there were rumblings about the possibility of a new major Panda update, but at the time of this writing, they have yet to be confirmed. A big Penguin refresh is also expected.

    If those Google updates are keeping you up at night, you may want to revisit what Google itself says about SEO, particularly when looking to hire someone.

    “Deciding to hire an SEO is a big decision that can potentially improve your site and save time, but you can also risk damage to your site and reputation,” Google says. “Make sure to research the potential advantages as well as the damage that an irresponsible SEO can do to your site.”

    Google adds that, “Many SEOs and other agencies and consultants provide useful services for website owners,” such as:

    • Review of your site content or structure
    • Technical advice on website development: for example, hosting, redirects, error pages, use of JavaScript
    • Content development
    • Management of online business development campaigns
    • Keyword research
    • SEO training
    • Expertise in specific markets and geographies.

    “Before beginning your search for an SEO, it’s a great idea to become an educated consumer and get familiar with how search engines work,” Google says, suggesting its Webmaster Guidelines and Google 101: How Google crawls, indexes and serves the web as starting points.

    Google also suggests hiring an SEO early in the site development process, like when you’re planning a redesign or planning to launch a new site altogether (many no doubt are, thanks to Panda and Penguin). In case you’re not familiar with it, Google has a list of questions that it says you should ask an SEO during the hiring process:

    • Can you show me examples of your previous work and share some success stories?
    • Do you follow the Google Webmaster Guidelines?
    • Do you offer any online marketing services or advice to complement your organic search business?
    • What kind of results do you expect to see, and in what timeframe? How do you measure your success?
    • What’s your experience in my industry?
    • What’s your experience in my country/city?
    • What’s your experience developing international sites?
    • What are your most important SEO techniques?
    • How long have you been in business?
    • How can I expect to communicate with you? Will you share with me all the changes you make to my site, and provide detailed information about your recommendations and the reasoning behind them?

    “While SEOs can provide clients with valuable services, some unethical SEOs have given the industry a black eye through their overly aggressive marketing efforts and their attempts to manipulate search engine results in unfair ways,” Google says.

    The company advises site owners to “be wary” of SEO firms and consultants or agencies that send you an email out of the blue, noting that even Google gets these emails. Google also reminds you that nobody can guarantee you a #1 ranking. Google says to be careful if a company is secretive or won’t clearly explain what they intend to do, and makes it clear that, “You should never have to link to an SEO.”

    And just in case you needed another list of things to steer clear of, Google says to “feel free to walk away” if the SEO:

    • owns shadow domains
    • puts links to their other clients on doorway pages
    • offers to sell keywords in the address bar
    • doesn’t distinguish between actual search results and ads that appear on search results pages
    • guarantees ranking, but only on obscure, long keyword phrases you would get anyway
    • operates with multiple aliases or falsified WHOIS info
    • gets traffic from “fake” search engines, spyware, or scumware
    • has had domains removed from Google’s index or is not itself listed in Google

    Despite all of this advice, Google is making it harder and harder to get on the first page of results for reasons even beyond the algorithm updates. Google is showing fewer organic results for an increasing number of queries, and showing more direct answers whenever it can provide them.

    Is increasing the SEO budget the right response? Let us know what you think.

  • Now Is Your Chance To Ask Matt Cutts A Question

    If you’re reading this, you’ve probably seen some of the Webmaster Help videos Google’s Matt Cutts has done. He regularly takes questions from webmasters about various Google behaviors, and uploads responses to YouTube. Sometimes they’re longer and more in depth, and sometimes they’re quick and to the point. We usually share them here as they become available, as they’re often filled with useful information, even if it’s not always completely new info.

    Cutts took to his blog this afternoon to announce that he’s currently taking questions for the next series. He says he plans to record some videos next week, and asks that people submit their questions via this Google Moderator page.

    Ask Matt Cutts a question

    So, now is the time to submit a question if you want a shot at getting an answer from Google’s Distinguished Engineer and head of web spam. The man can hardly walk through the hallways at an industry conference without being bombarded by people with questions. His answers are clearly in demand. Here’s your chance to get an answer without having to wait in line.

    Or, of course, you can just check out the Moderator page and upvote the questions you think are best.

  • Google’s Matt Cutts On Why Ads Can Be More Helpful Than Organic Results

    Some people think Google’s search results pages are getting too cluttered. There’s no question that Google has been adding more elements to them over the years. In fact, just today, Google announced that it is expanding the “Knowledge Graph Carousel,” its visual placement of fairly large Knowledge Graph results directly underneath the search box, to the rest of the English-speaking world. Other languages are surely not too far behind.

    Sometimes, however, it’s simply Google ads that are taking up much of the screen real estate, and as we’ve already seen, Google is showing fewer organic results on a growing number of results pages.

    As a user, do you think Google’s paid results are often more helpful than its organic results? Which perform better for your business? Let us know in the comments.

    Google’s Matt Cutts participated in an interesting discussion in a Hacker News thread, in response to an article from Jitbit called, “Google Search is only 18% Search”.

    Despite the title, the article is really about how little of the screen is used to display non-paid search results on a Google SERP. In the example author Alex Yumashev uses, Google was found to dedicate 18.5% of the screen to results (not including ads). He also found a screenshot from years ago in which Google dedicated as much as 53% of the screen to results.

    Read the article if you want to get into the methodology, the resolutions, etc. There’s certainly room for debate around some of that, but in more general terms, there’s no denying that Google’s SERPs have changed over the years.
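
    The measurement itself is simple arithmetic: sum the areas of the organic-result bounding boxes and divide by the total viewport area. Here is a minimal Python sketch of that calculation; the resolution and rectangles are hypothetical placeholders, not Yumashev’s actual data.

    ```python
    # Back-of-the-envelope version of the screen-share measurement: what
    # fraction of the viewport do the organic results occupy? The numbers
    # below are hypothetical (x, y, width, height) pixel rectangles, not
    # measurements taken from a real SERP.

    VIEWPORT_W, VIEWPORT_H = 1280, 800  # assumed screen resolution

    # Hypothetical bounding boxes for organic results visible above the fold.
    organic_boxes = [
        (180, 430, 550, 90),
        (180, 530, 550, 90),
        (180, 630, 550, 90),
    ]

    organic_area = sum(w * h for (_, _, w, h) in organic_boxes)
    viewport_area = VIEWPORT_W * VIEWPORT_H

    print(f"Organic share of screen: {organic_area / viewport_area:.1%}")
    ```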

    Cutts argued that the article has a number of “major issues,” though most of his points are based on the notion that the article is about Google reducing “search”-related elements, as opposed to just classic non-paid results, which I don’t think was really the point the author was trying to make.

    Cutts points out that the left-hand column is about search, that the search box is about search, and that whitespace is about search. He notes that there are “tons of searches” where Google doesn’t show ads.

    “A lot of people like to take a query that shows ads and say ‘Aha!’ but they’re forgetting all the queries that don’t show ads,” said Cutts. “Not to mention that our ads aren’t just a straight auction; we try to take into account things like the quality of the destination page in deciding whether and where to show ads, just like we do with web search results.”

    Of course, Yumashev did acknowledge that he was looking for a screen with as many ads as possible, indicating that this is specifically about the pages that do show ads. The “help-desk app” query the author used for the first example certainly does have a fair amount of ads “above the fold”.

    In his argument, Cutts said, “We actually think our ads can be as helpful as the search results in some cases. And no, that’s not a new attitude.”

    One reader challenged him to come up with an example.

    “Ads can totally be useful,” Cutts responded. “Here’s one from earlier today: [att cordless phones]. For Google’s web results, we often interpret a query [X] as ‘information about X.’ The #1 web search result I see is http://telephones.att.com/att/index.cfm/cordless-telephones/ which does have information about cordless phones from AT&T. But I was looking for which models of cordless phones AT&T has. There’s an ad that points to http://telephones.att.com/att/index.cfm/cordless-telephones/… which is actually more helpful because that shows me a bunch of different models.”

    “Now you can argue that Google should be able to find and somehow return the page that AT&T bought the ad for,” he added. “But that can be a hard problem (Bing returns the same page that Google does at #1 for example, as does DDG). So that ad was quite helpful for me, because it took me to a great page.”

    You can read the Hacker News thread to see Cutts’ comments in their entirety, and completely in context with the rest of the conversation. He also goes into why he thinks Google+ is a good business tool.

    There’s no question that Google is cramming more non-traditional content into search results pages than it used to, particularly with things like the Knowledge Graph, Search Plus Your World, and now the Gmail results, which are in opt-in field trial mode. Google is showing more direct answers, and on a larger number of SERPs, it’s showing fewer organic results. In fact, Google is reportedly even testing SERPs with fewer organic results than previously thought.

    It’s not all about ads (though Google’s revenue certainly is).

    Hat tip to Barry Schwartz for pointing to the Hacker News thread.

    Do you think Google is showing too few organic results? Let us know what you think.

  • Google Penguin Update Refresh Suspected

    Last month, Google’s Matt Cutts made some comments at the Search Engine Strategies conference, reportedly indicating that the next Penguin update would be “jarring”. Since then, webmasters have been waiting for the big day when Google drops the hammer.

    Last week, there were some rumblings in the forums, and further in our own comment threads, about a possible update, but Google has been silent on the matter.

    Barry Schwartz at Search Engine Roundtable is pointing to more rumblings today, saying that he suspects a Penguin refresh.

    The rumblings don’t seem to be heavy enough to suggest anything too “jarring,” so he may be right on the data refresh. Or it could simply be something else entirely. Google makes changes every day.

    Sometimes Google will actually announce an update or a data refresh, if only in a simple tweet, but so far we have yet to see one. Google’s big list of changes for the month of August is about due, although last time we had to wait two months for two months’ worth of changes, so who knows when it will actually come? Google could simply note any Penguin-related changes there.

    The last Penguin refresh, which Cutts tweeted about, came in late May.

    Image: Batman Returns (Warner Bros.)

  • What Google’s Synonym Treatment Means For Businesses

    Google is getting better at understanding synonyms, and that is part of the search engine’s decreasing dependence on keywords for returning results. What does this mean for businesses trying to get in front of Google searchers? That’s a question that could keep site owners who rely primarily on Google for traffic up at night as Google progresses in this area. The good news is it might actually make things easier. However, Google has shown that in some cases, it can usher competitors into your brand’s results.

    Have you seen Google’s use of synonyms impact search results in a negative way? As a content provider, has it made things easier? Let us know in the comments.

    Last month, after a great deal of waiting, Google released its big lists of algorithm changes for the months of June and July. In June, it was revealed that Google had made a number of changes to how it handles synonyms. As we noted at the time, the better Google gets at understanding the way users search (in terms of the language they use), the more it gets away from dependence on keywords for delivering relevant results. Combine that with Google’s increased delivery of its own quick answers-style results and the growing number of search results pages that show fewer than ten classic organic results, and it’s going to have an effect on how sites can get in front of users, for better or for worse.

    Former Googler Vanessa Fox, who built Webmaster Central (and now runs Nine By Blue), tells WebProNews, “Google was already much better than a lot of people realized at synonyms when I worked there. But things have definitely improved considerably.”

    “Since Google is always looking to better understand what the searcher is looking for and what pages on the web most satisfy that search, you can imagine that they spend a lot of time in this area — not just synonyms but overall query intent and page meaning,” she says.

    Fox tells us that Google’s decreased dependence on keywords makes things easier on content providers.

    “Write content based on how it best helps your audience, not based on getting in all the variations of keyword phrases for search engines,” she says. “Keyword-stuffed titles, headings, and text can be less engaging for users. They may skip the listing in search results and may bounce off the page if they click through. By focusing on solving a searcher’s problem, you better connect with your audience and ensure that all the work you did to enable your site to rank well pays off.”

    “This isn’t a new change for SEO,” she says, noting that it’s the core focus of her book, “and even was for the first edition published in 2010.”

    “It’s still important to do keyword research to understand what your audience is looking for, and I still think it’s important to use the most important keyword phrase in the left side of your title tag so it stands out for searchers scanning the listings,” Fox says. “But don’t create separate pages for each keyword phrase or use the list of phrases to pepper the page. Just cluster the similar queries and map one page to the cluster and then write the content based on what you think users most need.”
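
    Fox’s “cluster the queries, map one page per cluster” advice is straightforward to prototype. The sketch below is a minimal illustration, not her actual method: it groups phrases by crude token overlap (a Jaccard score over lightly normalized words), and the queries, the trailing-“s” stemming, and the 0.5 threshold are all assumptions made for the sake of the example.

    ```python
    # Minimal sketch of clustering similar keyword phrases so one page can
    # serve each cluster. Jaccard similarity over tokens stands in for
    # whatever similarity measure you prefer.

    def tokens(phrase):
        # Lowercase and strip a trailing "s" so "phones" matches "phone".
        return {word.rstrip("s") for word in phrase.lower().split()}

    def similar(a, b, threshold=0.5):
        ta, tb = tokens(a), tokens(b)
        return len(ta & tb) / len(ta | tb) >= threshold

    queries = [
        "cordless phones",
        "cordless phone models",
        "best cordless phones",
        "help desk software",
        "help desk app",
    ]

    clusters = []
    for query in queries:
        for cluster in clusters:
            if similar(query, cluster[0]):  # compare against the cluster seed
                cluster.append(query)
                break
        else:
            clusters.append([query])

    # One page per cluster, written for the intent the whole group shares.
    for page, cluster in enumerate(clusters, 1):
        print(f"Page {page}: {cluster}")
    ```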

    Fox recently wrote a piece for Search Engine Land called, “Is Google’s Synonym Matching Increasing? How Searchers & Brands Can Be Both Helped & Hurt By Evolving Understanding Of Intent,” analyzing just how Google’s treatment of synonyms can affect brands and users.

    In the article, Fox shared an interesting example of how Google’s synonym matching can go “awry” and end up showing more results from a competitor than from the brand the user actually typed into the search box.

    “Presumably, lots of people were searching for h. h. gregg in conjunction with things like laptops, TVs, and printers,” explained Fox in the article. “But lots more people were searching for laptops, TVs, and printers in conjunction with Best Buy. So when people searched for [hhgregg site], Google ranked hhgregg.com first, but ranked bestbuy.com second.”

    She pointed out that Google was also showing content from bestbuy.com for five other results on the page. If this were one of Google’s seven-result pages, that would account for every organic result apart from the top one.

    Google appears to have corrected the h.h. Gregg/Best Buy results, but one has to wonder how many similar examples are out there in the wild.

    “I see it every so often, but it’s actually pretty rare,” Fox tells us. “Typically, a branded intent is seen very differently from a topical/task intent. But you can see by the steps I outlined in the article how this can happen and seem perfectly legitimate until searcher click behavior shows signs that the result isn’t showing what the user really wanted.”

    Fox suggests in her article that if a brand experiences something like h.h. Gregg did, it should write a post about it in Google’s discussion forum.

    She tells us, “I’m not sure if they would take manual action or would adjust the algorithm. But while at Google, I created a position called ‘webmaster trends analyst’ specifically to watch for these types of issues in the forums. Search engineers take this information as they do data from searcher behavior to pinpoint what needs to be adjusted.”

    It can’t hurt, either way.

    What do you make of Google’s decreased dependence on keywords? Is the search engine doing a good job of returning relevant results? Share your thoughts.