WebProNews

Tag: SEO

  • Google’s 7-Result SERPs Having A Bigger Effect On Sites Than Panda?

Google has been making it harder to get first-page rankings. That’s not just because of all the algorithm updates the search giant keeps launching, the increased emphasis on “answers” results (which require users to click over to other sites less often), and the addition of Google’s Knowledge Graph to search results. Sure, these things all come into play, but for more and more queries, Google has also been showing fewer traditional results altogether.

Google results pages have historically shown ten main organic results, but for a growing number of queries, that number has been reduced to seven. Specifically, this is happening on results pages in which the top result displays additional “sitelinks”. Here’s what Google has had to say about it (via Danny Sullivan):

    “We’re continuing to work out the best ways to show multiple results from a single site when it’s clear users are interested in that site. Separately, we’re also experimenting with varying the number of results per page, as we do periodically. Overall our goal is to provide the most relevant results for a given query as quickly as possible, whether it’s a wide variety of sources or navigation deep into a particular source. There’s always room for improvement, so we’re going to keep working on getting the mix right.”

    Dr. Peter J. Meyers, President of User Effect, recently shared some interesting research at SEOmoz about this phenomenon, which seems to have begun in early to mid August.

    Now, BrightEdge has put out some new research on the topic based on analysis of queries for 26,000 keywords. According to CEO Jim Yu, the effects from this are even greater than those of the Panda update.

    In a piece sharing the firm’s analysis at Search Engine Land, he writes, “The percentage of keywords impacted is currently 8% across the industries we examined. This is significant, considering that a critical update like Panda affected 5% of searches.”

    “We have found that the impact varies by industry,” he adds. “The Technology – B2B sector has 9.4% of its keywords affected, while Technology – B2C industry sees 12.1% keywords impacted. Financial Services industry has about 2.7% of keywords affected, and about 3.5% of keywords in Retail are impacted by this change.”

Even if a site’s rankings did not technically drop, a move from the first page to the second page in search can create a significant barrier to visibility.

    It’s interesting that Google has not brought infinite scroll to web search as it has to image search. You can get through ten pages of image results in no time with this feature. A simple click to another page may not seem like a huge step for a user, but it’s still an additional step. It seems like introducing this feature to web search would also go along with Google’s emphasis on increasing speed in search. It’s certainly faster to scroll down further than it is to click to another page. Yet, Google seems to be going in the opposite direction, and actually reducing the number of results on the page.

To be fair, Google usually does its job in returning the information needed on the first page (at least in my experience), and if you have to go past page one, perhaps Google is not doing its job. If you have to go deeper than seven results, even, it’s not doing that great a job. There is, however, a discoverability element that is eliminated, or at least impeded, by showing fewer results. Perhaps you found what you were looking for in the top results, but missed something that could have been equally helpful or interesting had you had a chance to see it.

    Image: gigglecam (YouTube)

  • Matt Cutts On Schema.org Markup As A Ranking Signal

In 2011, Google teamed up with Microsoft and Yahoo to launch schema.org, an initiative to support a common set of schemas for structured data markup. You might wonder whether implementing these schemas influences your ranking in Google (or the other search engines, for that matter).

    Google’s Matt Cutts posted a new Webmaster Help video talking about this, responding to a user-submitted question:

    I know rich snippets can increase CTR for my mention on a SERP. But is the use of schema.org code beneficial for my actual positions on the SERPs as well?

    “On one hand, I wouldn’t necessarily count on that….Just because you implement schema.org doesn’t mean you necessarily rank higher. But there are some corner cases like if you were to type in ‘lasagna,’ and then click over on the left-hand side and click on ‘recipes,’ that’s the sort of thing where using schema.org markup might help, because then you’re more likely to be showing up in that at all. So there are some cases where it can be helpful to use schema.org markup.”

    “I wouldn’t necessarily count on that giving you any sort of ranking boost…I’m not going to take it off the table, but for example, it might make sense in some of those specific topic areas, but just because somebody implements schema.org markup, that doesn’t mean that they’re necessarily and automatically a better site, so I wouldn’t count on that giving you a good ranking boost, although it can be a good idea to markup things in a rich structure just because, you know, then different people can slice and dice and find your site more easily if they are doing more digging.”
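To make the “lasagna” example concrete, here is a minimal sketch of what schema.org Recipe markup looks like in microdata format. The itemtype, itemprop and Recipe property names come from schema.org; the page content itself is invented for illustration:

<!-- Hypothetical recipe page marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Weeknight Lasagna</h1>
  <span itemprop="author">Jane Example</span>
  Prep time: <time itemprop="prepTime" datetime="PT30M">30 minutes</time>
  <span itemprop="ingredients">12 lasagna noodles</span>
  <span itemprop="ingredients">4 cups tomato sauce</span>
  <div itemprop="recipeInstructions">
    Layer the noodles, sauce and cheese, then bake for 45 minutes.
  </div>
</div>

Markup along these lines is what lets Google recognize the page as a recipe, making it eligible for the kind of recipe filtering Cutts describes.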

    Let’s put it this way: it seems unlikely that it will hurt your rankings.

  • Google Takes mod_pagespeed Out Of Beta For A Faster Web

Nearly two years ago, Google unveiled mod_pagespeed, an open source Apache module designed to help developers make the web faster. Today, Google announced that it has left beta status, after going through eighteen releases to get there.

    “We’re committed to working with the open-source community to continue evolving mod_pagespeed, including more, better and smarter optimizations and support for other web servers,” Joshua Marantz and Ilya Grigorik of Google’s PageSpeed Team say in a joint blog post on the Webmaster Central blog. “Over 120,000 sites are already using mod_pagespeed to improve the performance of their web pages using the latest techniques and trends in optimization. The product is used worldwide by individual sites, and is also offered by hosting providers, such as DreamHost, Go Daddy and content delivery networks like EdgeCast. With the move out of beta we hope that even more sites will soon benefit from the web performance improvements offered through mod_pagespeed.”


    “mod_pagespeed is a key part of our goal to help make the web faster for everyone,” the pair adds. “Users prefer faster sites and we have seen that faster pages lead to higher user engagement, conversions, and retention. In fact, page speed is one of the signals in search ranking and ad quality scores. Besides evangelizing for speed, we offer tools and technologies to help measure, quantify, and improve performance, such as Site Speed Reports in Google Analytics, PageSpeed Insights, and PageSpeed Optimization products. In fact, both mod_pagespeed and PageSpeed Service are based on our open-source PageSpeed Optimization Libraries project, and are important ways in which we help websites take advantage of the latest performance best practices.”
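To give a rough sense of what the module actually does: mod_pagespeed sits in Apache and rewrites a page’s HTML and its resources as they are served, applying optimization filters such as CSS minification and cache-extending URL rewriting. Here is a simplified before-and-after sketch; the filenames and hashes are invented, and while the “.pagespeed.” pattern is the module’s convention for rewritten resource URLs, the exact output varies by filter and version:

<!-- Before: HTML as authored -->
<link rel="stylesheet" href="styles.css">
<img src="logo.png">

<!-- After: HTML as served with mod_pagespeed enabled (illustrative only) -->
<link rel="stylesheet" href="styles.css.pagespeed.cf.AbC123xYz.css">
<img src="logo.png.pagespeed.ic.DeF456uVw.png">

Because the rewritten URLs embed a content hash, browsers can cache the resources far more aggressively; when the underlying file changes, the URL changes with it.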


  • Google Launches New Page Layout Update (Yes, ANOTHER Update)

Google is on a roll with these updates. I think webmasters are starting to understand what Google’s Matt Cutts meant when he said a while back that updates would start getting “jarring and jolting”. It seems that, rather than one major update, we’re getting a bunch of updates in a short amount of time. This past Friday, Google launched its latest Penguin refresh. A week before that, it was the EMD update and a new Panda update.

On Tuesday, Cutts tweeted about a Page Layout update.

    The Page Layout update was first announced early this year, months before we ever saw the first Penguin update. It’s sometimes referred to as the “above the fold” update. It was designed to target pages that lack content above the fold. At the time, Cutts wrote in a blog post:

    As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.

    We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.

    It looks like Christmas has come early for webmasters this year. Although, on that note, this could be a sign that Google is getting all of this stuff out of the way before the holiday season, so they don’t mess too much with your rankings during this crucial time of year for ecommerce. They’ve shown in the past that they’ve learned from the infamous Florida update.

  • Matt Cutts Weighs In On Guest Blogging For Links

    Google’s Matt Cutts, even with all of the algorithm updates he’s been announcing, has somehow found time to release a new Webmaster Help video. This one is about guest blogging for links.

    He basically says it can be good to have a reputable, high-quality writer do guest posts on your site, and that it can be a good way for some lesser-known writers to generate exposure. However…

“Sometimes it gets taken to extremes,” he says. “You’ll see people writing…offering the same blog post multiple times or spinning the blog posts, offering them to multiple outlets. It almost becomes like low-quality article banks.”

    Obviously, this isn’t the route you want to go.

“When you’re just doing it as a way to sort of turn the crank and get a massive number of links, that’s something where we’re less likely to want to count those links,” he says.

    Watch the video for his entire response.

  • Google Penguin Update: This Person Has A “Huge Recovery” Story

    Google has been pushing major updates left and right in recent weeks, and plenty of webmasters are feeling the effects for better or for worse. In late September, Google announced the EMD update targeting low-quality sites with exact match domains. Later, we found out Google had also rolled out a new Panda update around the same time. Business owners who saw their referrals from Google decline had enough fun trying to dig through that and determine which update they were actually hit by (this should have been easier for those who did not have exact match domains).

    Before the dust settled on those updates, Google went on to announce a new Penguin data refresh on Friday, after months of anticipation. So far, we have not seen many recovery stories, but we have seen one. We also haven’t seen a whole lot of people claiming to have been hit by the latest refresh (though there have been some). We have, however, seen plenty who have been working on trying to recover from previous Penguin launches, but have not been able to please the algorithm this time around.

    Have you seen changes from the latest Penguin data refresh? Let us know in the comments.

    When Google’s Matt Cutts tweeted about the Penguin refresh on Friday, he said it would “noticeably” affect 0.3% of English queries. He later tweeted that the refresh would be completed that same night. That means that the effects of the refresh should have already been felt by any webmasters affected.

There has been at least one reported recovery from this round of Penguin. Marketer Donna Fontenot claims that one of her clients saw a “huge recovery,” a claim she has made in comments on Twitter.

    Additionally, Fontenot has been talking about the recovery in the Cre8asite forums (via Search Engine Roundtable). There, she writes, “Long story short, they needed to get rid of excessive footer backlinks, links that looked like paid backlinks (and some were), etc. The really tough part? Getting the client to be patient and wait for another Penguin update to roll around so we could determine if the efforts were going to help or not. Six months later. SIX MONTHS. To a client, six months of waiting is forever.”

    “No one got those terrible links for them,” she says later in the thread. “They accumulated the links themselves over several years. But I can attest that they didn’t go out and get new links to get out of this penalty. They strictly went through a huge process of getting rid of backlinks that looked like possible suspects for a penguin penalty.”

    She admits that she was concerned (while waiting for Penguin to roll out again) that she was having her client get rid of too many backlinks, adding, “What if I had them remove links that were actually helping rather than hurting? Then, when Penguin waddled back through, even if the penalty was lifted, it was possible that they wouldn’t recover because now they would be missing links they needed to keep their rankings. Luckily, that didn’t happen.”

While in this case it may not have happened, this is still a legitimate concern for those trying to clean up their link profiles. From what we’ve seen and heard, there has likely been a great deal of overreaction when it comes to sites getting rid of backlinks. Many sites spent time and effort getting rid of links they would otherwise have liked to keep, but elected not to on the off chance that those links could be hurting them in Google. More on all of that madness here.

It seems fairly likely that following Fontenot’s story, people will continue down a similar path. Still, there are some out there who doubt her story. Alan, commenting on the Search Engine Roundtable post, says, “No offense but there are a lot more non-recovery stories than recovery stories…Unless I see proof I won’t believe her.”

    Another reader adds, “I agree, I would like to see some proof. A ton of people got rid of links and never recovered. If it was that easy we would be seeing recoveries all over the forums. That is not the case.”

While we’re not holding our breath, perhaps Fontenot will put together a case study about the recovery for the benefit of other webmasters and SEOs.

The fact that footer links are the main component of Fontenot’s claimed recovery case is interesting. As you may know, Google recently updated its webmaster guidelines. One of the new changes is the addition of “widely distributed links in the footers of various sites” to the “link schemes” portion of the Quality Guidelines.

Keep in mind, the Penguin update is specifically aimed at targeting sites violating the quality guidelines. This is the exact quote from Google’s original announcement about the update, which Cutts also linked to in his latest announcement: “The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.”

    Back in May, we reported on another Penguin update recovery story, which also apparently had a lot to do with footer links. Read that story here.

In the forum thread, one member, Dr. Marie, writes, “Donna, yours is the only recovery I have heard of so far. I am thinking that sites who have excessive footer links pointing at them can recover because in many cases they can get the footer links removed and therefore remove a huge percentage of links. But, for sites that have participated in widespread article spam with links pointing back at their site the task for removal is way too huge. Plus, the webmasters who host these article sites are less likely to respond to requests for link removal.”

    Member Jonbey adds, “When people have these footer links etc. are they usually pointing at the homepage? I was thinking for any links pointing at internal pages you could just change the URLs of those so that the links point to 404’s instead.”

Fontenot’s response to that was, “Jon, in my client’s case, yes, the footer links generally all pointed to the home page. Some of my client’s main phrases (very competitive btw) actually ended up even better than before Penguin. Instead of being #2 or #3, for instance, some are at #1 now. And in case you were interested, Penguin had originally moved their rankings down into the 50’s, 60’s, 70’s, 100’s, and below. So it was a major fall, and now a major recovery.”

    What do you think?

    Have you seen or heard of any signs of recovery from the latest Penguin refresh? Let our readers know in the comments.

  • Google Penguin Update: Latest Refresh Completed Friday

    As previously reported, Google’s Matt Cutts announced on Friday that Google launched the latest data refresh for the Penguin update.

If you dig through Cutts’ replies to various Twitter users (as Barry Schwartz has apparently done), you’ll also find a tweet from him indicating that the refresh was completed that same day.

    So, at least you can consider the latest in the Penguin saga to be over. Considering how much this has been built up, however, we may see a lot more complaints about Penguin the next time it hits. On the other hand, releasing the EMD update, a Panda update, and a Penguin refresh all so close together could probably be described as “jarring” by some webmasters.

  • The Google +1 Button Has No “Direct Effect” On Rankings, But…

    Google’s +1s do not have a direct effect on a site’s ranking in search results. Google’s Matt Cutts said as much in a “Power Searching With Google” hangout on Google+ (via Alex Graves). However, he did indicate that Google is really only getting started with authorship, which he hinted will only become a stronger signal going forward.

    Have you seen benefits of getting people to click the +1 button on your content? Let us know in the comments.

    Right now, it seems, Google cares a lot more about authorship, and expects this to become possibly a weightier signal in the future. Here’s more of what Cutts had to say in the hangout:

    “In the short term, we’re still going to have to study and see how good the signal is, so right now, there’s not really a direct effect where if you have a lot of +1s, you’ll rank higher. But there are things like, we have an authorship proposal, where you can use nice standards to markup your webpage, and you’ll actually see a picture of the author right there, and it turns out that if you see a picture of the author, sometimes you’ll have higher click through, and people will say, ‘oh, that looks like a trusted resource.’ So there are ways that you can participate and sort of get ready for the longer term trend of getting to know not just that something was said, but who said it and how reputable they were.”

    “I think if you look further out in the future and look at something that we call social signals or authorship or whatever you want to call it, in ten years, I think knowing that a really reputable guy – if Dan has written an article, whether it’s a comment on a forum or on a blog – I would still want to see that. So that’s the long-term trend.”

“It’s just the case that that picture is just more likely to attract attention. It’s just a little more likely to get the clicks, and you know, it’s almost like an indicator of trust.”

    “The idea is you want to have something that everybody can participate in and just make these sort of links, and then over time, as we start to learn more about who the high quality authors are, you could imagine that starting to affect rankings.”
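The authorship markup Cutts mentions is, at its simplest, a rel="author" link from an article to the writer’s Google+ profile, paired with a “Contributor to” link from the profile back to the site. A minimal sketch (the byline name and profile URL here are placeholders):

<!-- Hypothetical byline linking an article to its author’s Google+ profile -->
<a rel="author" href="https://plus.google.com/112233445566778899000">Dan Example</a>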

    This is not the first time Cutts has downplayed the significance of the +1 button with regards to ranking. At SMX Advanced back in June, Cutts said (according to a liveblogged account of the conversation), “When we look at +1, we’ve found it’s not necessarily the best quality signal right now.”

    It’s going to be interesting to see how Google progresses with how it handles social signals, because it may have some major competition right around the corner. You know that little social network that just surpassed a billion active users? They’re talking about doing search. Here’s what Facebook COO Sheryl Sandberg said about it this past week:

    “As Mark said, I think people are surprised how much search is done on Facebook, you know, every day there’s enormous percentage of search. There’s also a promise in the market that search could become more social that we don’t think this has been met. When you’re looking for information, the question is who do you want it from, the wisdom of crowds or the wisdom of friends? Our answer is the information that’s most relevant for users is really about friends. That if I’m looking for a restaurant to go to in New York this week, I’d rather get a recommendation from a friend. That’s really what we’re working on.”

OK, so getting a bunch of +1s on your content is not necessarily going to get it ranked higher in Google’s organic listings, and Google is not necessarily looking to it as a quality signal. However, there are still clear benefits to the button for search visibility. Consider that many people are seeing Google’s “Search Plus Your World” results, which push social connections (frequently from Google+) into the search results, making the +1 button a much more significant factor. For that matter, Google tends to show fewer regular organic results on its pages these days. The button can also, of course, spread content throughout Google+ itself. And there’s still no reason to think Google won’t adopt it as a more significant signal in the future.
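For reference, adding the button to a page is a small embed. This sketch follows the standard snippet Google documented at the time; double-check Google’s own documentation before relying on it:

<!-- Load Google’s +1 script once per page -->
<script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
<!-- The button renders wherever this tag appears -->
<g:plusone size="medium"></g:plusone>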

Based on what Cutts said, however, authorship is practically a must for content providers. And even if you’re not seeing major benefits from it now, it sounds like we’re only at the beginning of how Google is going to use it. Of course, authorship is linked directly to your Google+ profile.

    On a related note, Google Webmaster Trends Analyst John Mueller has also been talking about authorship, and how the sites you’re linked to don’t necessarily have any effect on each other based on your connection to them. Barry Schwartz points to this Google Webmaster Help thread where he said, “No, there generally wouldn’t be a connection with regards to crawling, indexing, or ranking between two websites that are both linked from your Google Profile.”

    So, in other words, if you have one site that was penalized by Google, it should have no direct negative effect on another site you’re connected to, just because you’re part of both sites.

    Do you think Google is getting social search right? How could Google improve it? Let us know what you think in the comments.

  • Matt Cutts Warned Us About The EMD Update Over A Year Ago

Todd Malicoat wrote an interesting article about the exact match domain update for SEOmoz this week. In it, he points to a Webmaster Help video from Google’s Matt Cutts from early 2011, in which he hinted that Google would be “turning down” exact match domains as a ranking signal. Here’s what he said exactly:

    “We have looked at the rankings and weights that we give to keyword domains and some people have complained that we’re giving a little too much weight for keywords in domains. And so we have been thinking about adjusting that mix a little bit and sort of turning the knob down within the algorithm so that given two different domains, it wouldn’t necessarily help you as much to have a domain with a bunch of keywords in it.”

Malicoat also points to an article from Bill Slawski from last year looking at a patent for “Systems and methods for detecting commercial queries,” which talks about exact match domains.

  • Google EMD Update: It Will Be Back Repeatedly

    Not that this will come as much of a surprise, but Google’s EMD update, which Matt Cutts announced a week ago, will be an ongoing, periodic update, much like our other algorithmic friends Panda and Penguin.

    Danny Sullivan confirmed as much with Google. He writes, “Google confirmed for me this week that EMD is a periodic filter. It isn’t constantly running and looking for bad EMD domains to filter. It’s designed to be used from time-to-time to ensure that what was filtered out before should continue to be filtered. It also works to catch new things that may have been missed before.”

    Like I noted here, Sullivan says the advice for EMD recovery is pretty much like that for Panda recovery. “After you’ve removed the poor quality content, it’s waiting time. You’ll only see a change the next time the EMD filter is run,” he says. “When will that be? Google’s not saying, but based on the history of Panda, it’s likely to be within the next three months, and eventually it might move to a monthly basis.”

    Google, if you haven’t heard, actually did launch a new Panda update to roll out alongside the EMD update, so webmasters have had to deal with both updates at the same time, trying to figure out which one they’re actually being affected by. Luckily, the cure is probably the same for both. Quality.

    By the way, Google has been making other changes to its algorithm related to the quality of pages. More on that here. There was also another recent domain-related algorithm tweak (in addition to the domain diversity update).

  • Google Helps Users Refine Searches About People Who Have The Same Name As Someone Else

    Google released a big list of 65 “search quality highlights” or changes it made over the course of August and September. We’ve discussed numerous elements of this list in various articles. You can find those here.

    One change in particular should make it easier for Google users to find the people they’re looking for in cases when that person has the same name as other people.

    The change comes under the project name “Refinements”. Here’s what Google says exactly: “This change helped users refine their searches to find information about the right person, particularly when there are many prominent people with the same name.”

The change was actually made in August, so by now we may have experienced its effects without even realizing it.

    Other changes Google has made under the “Refinements” project label in previous lists include:

    • Zivango. [project codename “Refinements”] This change leads to more diverse search refinements.
    • Autocomplete predictions used as refinements. [launch codename “Alaska”, project codename “Refinements”] When a user types a search she’ll see a number of predictions beneath the search box. After she hits “Enter”, the results page may also include related searches or “refinements”. With this change, we’re beginning to include some especially useful predictions as “Related searches” on the results page.

    Making it easier to find a person who has a common name has been one of the things Google’s Knowledge Graph has helped with, at least in cases where “prominent” people share names.

For example, if you search for “brett butler,” Google assumes you’re talking about the actress from Grace Under Fire, but with the Knowledge Graph, Google can also show a “see results about” box for Brett Butler, the former Major League Baseball center fielder.

In other cases, Google doesn’t assume you mean one person over the other the way it does in the example above. For a query like “eric wright,” Google lets you pick the right person from the start.

  • Google Continues To Work On Getting Better At Synonyms

    Google continues to move further away from keyword dependence by understanding words and user intent better. Part of this is through how Google is able to interpret synonyms, and this is something the search engine’s team continues to work on.

    This week, Google released a big list of 65 changes it made throughout August and September. Two of them were listed under the project “Synonyms” label.

    Regarding one of them, Google said, “This change made improvements to rely on fewer ‘low-confidence’ synonyms when the user’s original query has good results.”

    Of the second one, Google said, “This change improved the use of synonyms for search terms to more often return results that are relevant to the user’s intention.”

When Google last released a big list of changes, comprising June’s and July’s changes, there were four other synonym-specific changes listed. More on those here.

    We recently had an interesting discussion with former Googler Vanessa Fox about Google’s treatment of synonyms. She told us, “Google was already much better than a lot of people realized at synonyms when I worked there. But things have definitely improved considerably.”

    “Since Google is always looking to better understand what the searcher is looking for and what pages on the web most satisfy that search, you can imagine that they spend a lot of time in this area — not just synonyms but overall query intent and page meaning,” she added.

    You can read our whole conversation here.

  • Recent Google Changes Video Creators Should Note

    On Thursday, Google released a big list of changes it made throughout August and September. There were sixty-five in all, and we’ve discussed a handful of subsets of them in various articles, which you can find here.

    A few of the changes were related to how Google deals with video, so if video is part of your content mix, these may be worth noting.

For one, Google has made changes to the way it indexes videos, and it’s also gotten better at understanding video intent in relation to when it will show universal search results. As any video provider knows, these video universal results are a major factor in what makes video good for SEO.

    Here are the changes Google announced that relate to video:

    • Maru. [project “SafeSearch”] We updated SafeSearch to improve the handling of adult video content in videos mode for queries that are not looking for adult content.
    • #83613. [project “Universal Search”] This change added the ability to show a more appropriately sized video thumbnail on mobile when the user clearly expresses intent for a video.
    • #82546. [project “Indexing”] We made back-end improvements to video indexing to improve the efficiency of our systems.
    • #83406. [project “Query Understanding”] We improved our ability to show relevant Universal Search results by better understanding when a search has strong image intent, local intent, video intent, etc.
  • Google Results Are Getting More Local


    Google released a big list of 65 changes it has made to its algorithms over the course of August and September, and some of those changes are specifically geared towards making Google better for finding local information.

    Google has been working on improving its local experience for years (though businesses aren’t always happy with the directions the search engine decides to take), and that continues to be the case.

    With recent changes, Google says it has improved the precision and coverage of its system, which helps users find more relevant local web results. “Now we’re better able to identify web results that are local to the user, and rank them appropriately,” Google says.

Google has also improved its ability to show relevant “universal” results for local, among other categories.

    Here are the local-related changes Google listed:

    • #83659. [project “Answers”] We made improvements to display of the local time search feature.
    • nearby. [project “User Context”] We improved the precision and coverage of our system to help you find more relevant local web results. Now we’re better able to identify web results that are local to the user, and rank them appropriately.
    • #83377. [project “User Context”] We made improvements to show more relevant local results.
    • #83406. [project “Query Understanding”] We improved our ability to show relevant Universal Search results by better understanding when a search has strong image intent, local intent, video intent, etc.
    • #81360. [project “Translation and Internationalization”] With this launch, we began showing local URLs to users instead of general homepages where applicable (e.g. blogspot.ch instead of blogspot.com for users in Switzerland). That’s relevant, for example, for global companies where the product pages are the same, but the links for finding the nearest store are country-dependent.

    It looks like they still have some work to do in the natural language meets local department:

    Where can I get a taco?

    This would be especially helpful in voice search scenarios. The results for the above query were no better via voice search from Android.

    Google has been working on improving its natural language understanding capabilities, but clearly this is no easy feat to master.

  • Google Has Been Messing Around With The Way It Displays Snippets In Search Results

Google has been busy as usual making numerous changes to its search algorithms, and on Thursday, the company posted a big list of 65 changes it made during the months of August and September. Seven of these changes were related to the snippets Google shows on search results pages.

Google has refreshed the data it uses to generate sitelinks in snippets, and a few of the changes are related to titles specifically. Here’s the list of snippets-related changes:

    • #83105. [project “Snippets”] We refreshed data used to generate sitelinks.
    • #83442. [project “Snippets”] This change improved a signal we use to determine how relevant a possible result title actually is for the page.
    • #82407. [project “Other Search Features”] For pages that we do not crawl because of robots.txt, we are usually unable to generate a snippet for users to preview what’s on the page. This change added a replacement snippet that explains that there’s no description available because of robots.txt.
    • #83670. [project “Snippets”] We made improvements to surface fewer generic phrases like “comments on” and “logo” in search result titles.
    • #84652. [project “Snippets”] We currently generate titles for PDFs (and other non-html docs) when converting the documents to HTML. These auto-generated titles are usually good, but this change made them better by looking at other signals.
    • #84211. [project “Snippets”] This launch led to better snippet titles.
    • #84460. [project “Snippets”] This change helped to better identify important phrases on a given webpage.

    Speaking of snippets, Google updated its Webmaster Guidelines this week, and has some new stuff about rich snippets. You can read more about that here.

  • Google Makes Changes To “Other Ranking Components”

    Google has released a new list of algorithm changes it made during the months of August and September. We’ve been looking at different groups of changes individually. So far, we’ve looked at domains, freshness, page quality, autocomplete, and answers-related changes.

Some of the changes on Google’s list come under the “Other Ranking Components” project label. This appears to be something of a catch-all label for things that aren’t categorized by other projects. Interestingly, Google acknowledges here that it is showing fewer results per page for some queries. Google says this is to show the most relevant results more quickly. Here’s that list entry:

    #82279. [project “Other Ranking Components”] We changed to fewer results for some queries to show the most relevant results as quickly as possible.

    Google has also adjusted the way links are used in ranking through what it is calling a “minor bug fix,” and has changed how it ranks documents for location-based queries. Here are three more entries to Google’s big list:

    LTS. [project “Other Ranking Components”] We improved our web ranking to determine what pages are relevant for queries containing locations.

    #83709. [project “Other Ranking Components”] This change was a minor bug fix related to the way links are used in ranking.

    #84586. [project “Other Ranking Components”] This change improved how we rank documents for queries with location terms.

  • Google Makes More Changes To How It Displays “Answers”

Google released its big list of 65 changes it made during the months of August and September, and once again, quite a few of them had to do with “answers”. These are the results Google delivers that don’t take you to another site.

    Last time Google released a big list, we looked at how Google is getting better at not having to send users to other sites. This would all be an extension of that. Here are the answers-related changes from the new list:

    • #83818. [project “Answers”] This change improved display of the movie showtimes feature.
    • #83819. [project “Answers”] We improved display of the MLB search feature.
    • #83820. [project “Answers”] This change improved display of the finance search feature.
    • #83459. [project “Alternative Search Methods”] We added support for answers about new stock exchanges for voice queries.
    • #83659. [project “Answers”] We made improvements to display of the local time search feature.
    • #84063. [project “Answers”] We added better understanding of natural language searches for the calculator feature, focused on currencies and arithmetic.
    • #83821. [project “Answers”] We introduced better natural language parsing for display of the conversions search feature.
    • #84083. [project “Answers”] This change improved the display of the movie showtimes search feature.
    • gresshoppe. [project “Answers”] We updated the display of the flight search feature for searches without a specified destination.
    • #84068. [project “Answers”] We improved the display of the currency conversion search feature.
    • #83391. [project “Answers”] This change internationalized and improved the precision of the symptoms search feature.

    It’s probably worth noting that 10 out of 11 of those changes took place in August, so perhaps Google’s focus has shifted away from this area a little for the time being.

  • Google Makes A Bunch Of Changes To Autocomplete

    Google released a big list of 65 changes it has made over the course of August and September, and quite a few of them were tweaks to its autocomplete feature.

This is all part of Google’s goal of getting you to what you’re looking for more quickly, and with fewer steps, something the search engine has made tremendous strides on over the years.

    The following 10 changes deal specifically with autocomplete:

    • #83197. [project “Autocomplete”] This launch introduced changes in the way we generate query predictions for Autocomplete.
    • essence. [project “Autocomplete”] This change introduced entity predictions in autocomplete. Now Google will predict not just the string of text you might be looking for, but the actual real-world thing. Clarifying text will appear in the drop-down box to help you disambiguate your search.
    • #84259. [project “Autocomplete”] This change tweaked the display of real-world entities in autocomplete to reduce repetitiveness. With this change, we don’t show the entity name (displayed to the right of the dash) when it’s fully contained in the query.
    • TSSPC. [project “Spelling”] This change used spelling algorithms to improve the relevance of long-tail autocomplete predictions.
    • Dot. [project “Autocomplete”] We improved cursor-aware predictions in Chinese, Japanese and Korean languages. Suppose you’re searching for “restaurants” and then decide you want “Italian restaurants.” With cursor-aware predictions, once you put your cursor back to the beginning of the search box and start typing “I,” the prediction system will make predictions for “Italian,” not completions of “Irestaurants.”
    • #84288. [project “Autocomplete”] This change made improvements to show more fresh predictions in autocomplete for Korean.
    • espd. [project “Autocomplete”] This change provided entities in autocomplete that are more likely to be relevant to the user’s country. See blog post for background.
• #83391. [project “Answers”] This change internationalized and improved the precision of the symptoms search feature.
    • #82876. [project “Autocomplete”] We updated autocomplete predictions when predicted queries share the same last word.
    • #80435. [project “Autocomplete”] This change improves autocomplete predictions based on the user’s Web History (for signed-in users).

    Last month, Google Autocomplete stopped excluding the term “bisexual,” attracting some headlines for the feature – probably the most positive headlines the feature has seen in recent memory, given that they didn’t involve Google getting in trouble for making controversial suggestions about specific people.

  • Google Page Quality Algorithm Changes Don’t Always Come Under The Panda Label

    When it comes down to it, most of the signals Google uses to rank the web’s content are rooted in quality. Google’s constantly changing algorithms are specifically geared towards creating a higher quality search experience for its users. Not everyone believes that, but it’s generally the stance Google takes. When Google releases its lists of changes it has made for each month (as it did today for August and September), it calls them “search quality highlights”. It’s ALL about quality.

Still, only a certain subset of the changes are directly related to “page quality”. Presumably, that means the quality of the web pages it is ranking in organic search results (even if fewer of them are making the first page these days).

The Panda update falls under the broader “Page Quality” project banner. In August’s list, Google notes that it refreshed data for the Panda “high quality sites algorithm,” and this is listed under project “Page Quality”. This is not the only page quality change Google announced on the big list of 65 changes, however. Here are the three others:

    #82862. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources.

    #83689. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources.

    #84394. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources.

Really helpful. I know. That’s as specific as Google is going to get on that: it’s Panda and these other vague changes. There were three known Panda updates/refreshes in August and September (see the dates here), yet these lists include four Page Quality changes for that period, and only one of them is specifically described as Panda by Google.

It should tell you one thing, however. The search industry press may talk about Panda a lot, and it has certainly wreaked havoc on plenty of webmasters and businesses, but Google is always making other changes directly related to the quality of your pages that don’t carry the Panda banner, though judging from their vague descriptions, they seem to set out to accomplish the same basic thing.

    It always comes back to quality content. See our article on the EMD Update and Panda here.

    Image: Awesome fat Panda eating=]] (YouTube)

  • Google Continues To Tinker With Freshness In Recent Algorithm Adjustments

    Is Google getting close to where it wants to be in terms of how it handles freshness of content in search results? This has been one major area of focus for Google for the past year or so. Last November, Google launched the Freshness update, and since then, it has periodically been making various tweaks to how it handles different things related to freshness.

    Google has been releasing regular lists of algorithm changes it makes from month to month all year, and some of these lists have been quite heavy on the freshness factor. On Thursday, Google released its lists for changes made in August and September. Somewhat surprisingly, “freshness” is only mentioned twice. Two changes were made (at least changes that Google is disclosing) under the “Freshness” project banner.

    We actually already discussed one of them in another article, as it is also related to how Google deals with domains (which Google seems to be focusing on more these days). That would be this list entry:

    #83761. [project “Freshness”] This change helped you find the latest content from a given site when two or more documents from the same domain are relevant for a given search query.

    That change was made in September. The other one was made in August:

    Imadex. [project “Freshness”] This change updated handling of stale content and applies a more granular function based on document age.

    Quite frankly, I’m not sure what you can really do with that information other than to consider the freshness of your content, in cases where freshness is relevant to quality.

    This is actually a topic Google’s Matt Cutts discussed in a Webmaster Help video released this week. “If you’re not in an area about news – you’re not in sort of a niche or topic area that really deserves a lot of fresh stuff, then that’s probably not something you need to worry about at all,” he said in the video.

    I’ve been a fairly vocal critic of how Google has handled freshness, as I’ve found the signal to get in the way of the information I’m actually seeking far too often. Plenty of readers have agreed, but this is clearly an area where Google is still tinkering. Do you think Google is getting better at how it handles freshness? Feel free to share your thoughts in the comments.

  • Google Reveals Yet Another Domain-Related Algorithm Tweak

    Google finally released its big lists of algorithm changes for the months of August and September. There are 65 changes on the lists in all. We’ll be discussing various components in different articles.

    The first thing that strikes me about the two-month list is that the word “domains” is only mentioned once. We know that Google launched the “Domain Diversity” update in September, as Google’s Matt Cutts tweeted about it when it happened. Then, this past Friday, he also tweeted about the EMD update targeting exact-match domains. With both of these actually announced via Twitter, it seemed to indicate a new focus on domain-related signals from the search giant.

    That’s why I’m a bit surprised that there aren’t more entries to this list that are directly related to domains. In fact, there aren’t even two (which would account for both of the ones Cutts tweeted about). Perhaps they didn’t bother to include them, because they thought the tweets were enough (though they still included a previously tweeted about Panda refresh).

    Anyhow, here’s the one domain-related entry from Google’s latest lists, and it happened sometime in September, interestingly enough, under the “Freshness” project banner:

    #83761. [project “Freshness”] This change helped you find the latest content from a given site when two or more documents from the same domain are relevant for a given search query.

So what do we know about how Google is treating domains differently now? For one, the domain name signal itself appears to have been reduced with the exact-match domain update. Google wants to show fewer results from the same domain in more instances (with the domain diversity update), and for search results pages that do still show multiple results from the same domain, Google is likely to rank the newer one higher (based on the list entry above).


    This does indeed suggest a new focus on domain-related signals, and given that much of this has come to light only around the end of September, it seems entirely possible that Google will continue this focus into October. Of course, at this rate, we’ll have to wait until sometime in December to even know about them.

We’ll talk about freshness more in a coming article.