WebProNews

Tag: SEO

  • Google Discusses Google Play SEO For First Time

    Search Engine Optimization has changed a lot over the years, and with mobile ecosystems rising in prominence, there are things that need to be taken into consideration that simply didn’t exist when the industry first came into being.

    One of those things is the rise of mobile apps and app stores like Google Play. If you have a mobile app, you are presented with the challenge of getting it in front of people, and ideally doing so while they’re using their mobile device. Little is known about the Google Play search engine, however. In fact, this is the first time the company has even discussed the Google Play search or discovery engines publicly.

    On day three of the event, Google Play’s Ankit Jain opened up about this stuff in a session called “Getting Discovered on Google Play”.

One of the first questions asked of the audience at the beginning of the session was, “How many of you would like to make even more money on Google Play?” If your answer to that question is yes, you should probably watch this.

    “As an Android application developer, your goal is to get your app discovered,” Google says in the video description. “Google Play’s goal is to surface the most relevant content for Android users. In this session, we discuss best practices for app creators in view of both goals. We will demonstrate, through concrete examples, best practices to help your application rise above others in the Google Play Store. We will discuss the signals that go into creating the top and trending lists, personalized recommendations, and Google Play Search. Come hear the inside story from the person who leads search & discovery on Google Play.”

    Android has seen 900 million activations.

  • Google Just Took Out Thousands Of Linksellers

    Earlier this week, Google’s Matt Cutts ran down a bunch of new stuff Google’s web spam team is working on. Cutts tweeted an extension of that today, noting that Google will continue to tackle link networks, and that in fact, they just took action on “several thousand linksellers” today.

    Let the good times roll.

    Webmasters continue to anxiously await an upcoming, bigger version of the Penguin update, and Cutts also indicated that Panda would be easing up a bit.

    As part of Cutts’ big video, he said Google would continue to be vigilant when it comes to all types of link spam. Already, the webspam team is making good on its word.

  • Google Penguin Update Gets Ready To Bite Webmasters’ Noses [Penguin 2.0]

    Matt Cutts revealed late on Friday that Google Penguin Update 2.0 is on the way, and that it will be a big one. Yes, there have been multiple iterations of the update to come out, but those have simply been data refreshes of the original update. Google is readying a big new version of it, and when we say big, we mean bigger than the original.

    Matt Cutts says the internal team at Google is referring to it as Penguin 2.0, despite what other numbers are making the rounds out there.

Get ready, because it’s coming.

    In case you don’t get the nose-biting reference, enjoy this scene from Tim Burton’s Batman Returns featuring Danny DeVito as The Penguin.

    Hat tip to Danny Sullivan

  • Webmasters Hope This Google Test Doesn’t Become A Reality

    Google has been running a test that eliminates URLs from search results pages (for the most part).

    Does this make results pages better? Would you be in favor of Google implementing this as a new design? Let us know in the comments.

Note: We’ve updated this article now that people have had time to react to the test.

    Google tests different things with its search interface all the time. Sometimes we cover the tests, and sometimes we don’t. Frankly there are just too many to keep track of. Matt Cutts has said that Google runs 20,000 search experiments a year.

This one is kind of interesting though, as it completely removes URLs from search results pages (apparently unless there is authorship involved). Tecno-Net tipped Search Engine Roundtable with a couple of screen caps.

    Here’s what it looks like on desktop:

    Google SERP without URLs

    And on mobile:

    Mobile No URLs

    SER’s Schwartz posted about it on Search Engine Land here. It’s clear that this would not be a popular change if implemented. Here are a few sample comments from that article:

John Mitchell: “Hmm.. not sure if I like this as a user, I tend to look at the URLs in the results as there are some sites that I don’t trust even if Google does and places their pages in the results. In the examples above I’d probably be looking for the relevant page on the Microsoft site and there is no clue as to which result(s) that is.”

    Liam Fisher: “Sounds like a huge way of opening the door to dodgy sites impersonating legitimate ones.”

    Nick Boylan: “I wouldn’t like this at all. I glance at the URLs all the time, to determine the legit-ness of the source. Particularly if I’m looking for legal information or otherwise, or government services, etc.”

    We’ve also gotten a few comments opposing the change:

    Michele: “Worst Idea Ever!!! When I search for something, I have my own opinion about various sources – if google removes that information, I have to click through to determine that I don’t want to go there. Another case where I HATE it when technology thinks it’s smarter than I am.”

    Vincent J. Eagan III: “Horrible idea! You need to see the URL so you know what kind of page it might be – otherwise it will be easy for scammers to set up pages.”

As Schwartz points out, the test is being discussed a lot in the forums.

    In Google’s own forum, one user writes, “I really hope that this isn’t a forerunner of a real change to the search results page – we’ve had too many of these recently, from the removal of the instant preview to the messing around with green arrows to see more information. Google needs to realise that people get used to and trust a particular format and anything different (like the Google+ merged results for example) only confuses people and makes them trust the results less.”

    These are all valid points, and it’s hard to imagine Google implementing any change knowing that it could help spammers. It also makes the page less informative, which seems like a step in the wrong direction. Clearly most people who have seen the change aren’t wild about it.

    Still, Google has made plenty of changes in the past with varying degrees of popularity. Recent changes to Google Image search seem to be quite unpopular with webmasters, for example.

    What do you think? Should Google get rid of URLs on results pages? Share your thoughts.

  • Is International SEO More Important Now?

    Ranking in search engines, particularly Google, is not getting any easier, but how often are you considering the search engines around the globe? Many in the industry see international SEO as only gaining in importance.

    Do you think it’s more important for marketers to optimize for different search engines around the world than it used to be? Share your thoughts in the comments.

    A recent report from BrightEdge indicates that the majority of search marketers think that it is becoming more important for sites to rank in global search engines. According to the firm’s survey, six out of ten believe it will become either “more” or “much more” important this year, compared to last year. 36% said “more,” while 27% said “much more.”

    Global SEO

“SEO marketers at global companies aspire to reach customers worldwide, and drive leads, revenue and traffic through global SEO initiatives,” says BrightEdge in the report. “Looking beyond a single country also helps them demonstrate a greater ROI on marketing investments. Not only does this boost marketing ROI but also maintains global brand consistency while accommodating local nuances. A global concerted approach to SEO marketing addresses these needs.”

Respondents were specifically asked about Chinese search giant Baidu, with 31% saying it would be more important to rank in Baidu in 2013, and 10% saying “much more important.”

    BrightEdge - Baidu

    “With roughly 540 million internet users, 900 million mobile users and 388 million mobile internet users, China is the world’s largest internet market,” says BrightEdge. “Baidu, China’s dominant search engine, is one of the most valuable gateways to this large internet user base.”

    You can download the report in its entirety here. It deals with numerous topics, far beyond the topic of global SEO.

    Another recent report (via MarketingCharts) from Covario found that Baidu generated three times more global paid search clicks than Yahoo/Bing in Q1.

    “I no longer believe it makes sense for any company to roll out an international SEO programme to multiple countries without also having a PPC campaign in place,” writes WebCertain CEO Andy Atkins-Krüger in a post for Search Engine Land about multinational SEO. “In some cases, we would recommend leading with PPC and landing pages first, rather than full blown (and relatively expensive) international SEO.”

    He adds, “There are a number of reasons why we recommend this, but one is that user satisfaction on your site can be measured much more quickly with PPC than with SEO. Behavior really matters — so if you can study it first and quickly with PPC, your SEO efforts later will work out to be much more successful. I do worry that the association of search engine warnings with SEO being ‘bad’ are beginning to stick with people who are newer to the industry, and therefore, SEO is having a health warning attached.”

    Dave Davies has a great article on international SEO considerations at Search Engine Watch, in which he concludes, “While expanding one’s market is generally a good thing, what people often forget is that you still have to maintain what you have, so make sure you have the resources. Many wars have been lost simply by trying to fight them on too many fronts.”

    “If you have just enough resources to dedicate to a successful SEO strategy in your own country, it doesn’t make sense to expand in that you’ll be drawing resources away from the strategy that’s keeping the lights on,” he adds. “You need to make sure it’s the right decision for your business and if it is, make sure that you’re picking the right strategies to maximize your odds of success in the shortest period of time.”

    In your international optimization efforts, you may also want to keep in mind some recent changes Google has made to its indexing systems. They’re now treating some country-code TLDs differently in terms of geography vs. generic. The list will change over time, but right now, these are the ccTLDs Google is considering generic: .ad, .as, .bz, .cc, .cd, .co, .dj, .fm, .gg, .io, .la, .me, .ms, .nu, .sc, .sr, .su, .tv, .tk and .ws.

    Are you increasing your focus on international SEO, or are you simply focusing on your own region? Let us know in the comments.

  • Guess Which SEO ‘Misconception’ Matt Cutts Puts To Rest

    In Google’s latest Webmaster Help video, Matt Cutts is asked about a common SEO misconception that he wishes to put to rest. The answer: Google is not doing everything you read about in patents.

Cutts says, “There’s a sort of persistent misconception that people often have, which is that just because a patent issues…that has somebody’s name on it, or someone who works at search quality, or someone who works at Google, that doesn’t necessarily mean that we are using that patent at that moment.”

    He continues, “Sometimes you’ll see speculation, ‘Oh, Google had a patent where they mentioned using the length of time that the domain was registered.’ That doesn’t mean that we’re necessarily doing that. It just means that, you know, that mechanism is patented.”

    Cutts recalls, “Somebody else at Google had gotten a patent on the idea (or the mechanism, not just the idea, the actual implementation) by which you could look at how people had changed their webpage after an update, and basically say, ‘Oh, these are people who are responding to Google, or they are dynamically SEOing their stuff,’ and so there were a lot of publishers who were like, ‘Ugh, I’m just gonna throw up my hands. Why bother at all if Google’s just gonna keep an eye?’ and you know, ‘If we change, and Google’s just using that and monitoring that, and changing their ranking in response,’ and it’s the sort of thing where just because that patent comes out, doesn’t mean that Google’s currently using that technology.”

    “So, patents are a lot of interesting ideas,” he adds. “You can see a lot of stuff mentioned in them, but don’t take it as an automatic golden truth that we’re doing any particular thing that is mentioned in a patent.”

It is true that patents provide a lot of insight into the kinds of ideas Google is thinking about, but often we can only speculate about which of them it is actually implementing.

  • Google Updates Indexing To Treat TLDs Differently

    Google has been updating its indexing systems to treat some TLDs differently than in the past. Some country-code TLDs are being treated as generic TLDs.

The list of generic country-code TLDs, which may still change over time, is as follows: .ad, .as, .bz, .cc, .cd, .co, .dj, .fm, .gg, .io, .la, .me, .ms, .nu, .sc, .sr, .su, .tv, .tk and .ws.

    Google’s Pierre Far shared the news in a Google+ post (via Search Engine Roundtable).

    Expanded list of ccTLDs treated as Generic ccTLDs

Over the past few months, we've been updating our indexing systems to treat certain country-code TLDs as generic TLDs; that is, even though the top-level domain has a country code, we would treat it, by default, as not targeting a specific country. Now that all the pieces are in place, we also updated our Help Center article listing the TLDs we treat as gTLDs:

    http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1347922

    The latest addition includes the quite-popular (and personal favorite 🙂 ) .io.


    Google’s Matt Cutts recently did a Webmaster Help video discussing location and ccTLDs. If you’re reading this article, you might find it helpful. You can get the gist of it in text if you click the link, in case you don’t feel like sitting through the two-and-a-half-minute video.

  • Google Has People Scared To Link To Their Own Content

    Some webmasters are afraid to link to their own content, for fear of Google penalizing them. This isn’t exactly new, but the point is being emphasized lately (ironically by Google itself).

    Has Google ever worried you about your own linking practices? Let us know in the comments.

This past week, Google’s Matt Cutts addressed the following question in a Webmaster Help video:

    Suppose I have a site that covers fishing overall (A) & I make another fishing site that solely focuses on lure fishing (B). Does linking to A from B violate guidelines? I’ll make sure both have high quality content & disclose that they’re both owned by me.

    “Just linking from A to B is not a violation of our quality guidelines,” says Cutts. “If you only have two sites, they’re thematically related, a person on A would be interested in B…then it makes perfect sense to link those two sites. The problem gets into [when] you don’t have two sites, but you have fifty sites, or eighty sites, or a hundred and fifty sites, and then suddenly linking all of those sites starts to look a lot more like a link network and something that’s really artificial, as opposed to something that’s organic.”

    “So if you really do have just a small number of sites – you can count them on one hand – and they’re all very related to each other, it can make perfect sense to link those together,” he continues. “It’s when you start to get a lot more sites – you know, you don’t need 222 sites about car insurance. It looks a little weird if you have howdoigetmycarinsurance.net and wheresthecheapcarinsurance.com…I’m making these domain names up, so I’m not saying these particular site owners are bad – maybe they’re great. Who knows? But if you have 222 different copies of that, usually you’re not putting as much work into each individual site, and so as a result, you’ll end up with shallow or superficial sites, lower quality content, you’re more likely to see doorways…that sort of thing.”

    It says something about Google’s power over webmasters (at least those that depend on it too much) that people have to check with Google to see if Google is okay with them putting a link on their own website to another of their own websites.

    And this isn’t the first question Cutts has addressed regarding people linking to their own content in recent days. In another video, the user asked about internal links leading to lower rankings because of the Penguin update. The exact question was:

    Do internal website links with exact match keyword anchor text hurt a website? These links help our users navigate our website properly. Are too many internal links with the same anchor text likely to result in a ranking downgrade because of Penguin?

    So here, the person knows that the links are used to help users, but they’re still concerned how Google will view them. In other words, regardless of whether or not it’s actually good for people who visit the site, they need to make sure it’s going to be okay with Google, because presumably the users will never find it in the first place if they can’t find it in a Google search. Here’s Matt’s response to that:

    “My answer is typically not,” says Cutts. “Typically, internal website links will not cause you any sort of trouble. Now, the reason why I say ‘typically not’ rather than a hard ‘no’ is just because as soon as I say a hard ‘no’ there will be someone who has like five thousand links – all with the exact same anchor text on one page. But if you have a normal site, you know…a catalog site or whatever…. you’ve got breadcrumbs…you’ve got a normal template there…that’s just the way that people find their way around the site, and navigate, you should be totally fine.”

    “You might end up, because of breadcrumbs or the internal structured navigation, with a bunch of links that all say the same thing, that point to one page, but as long as that’s all within the same domain, just on-site links, you know, that’s the sort of thing where, because of the nature of you having a template, and you have many pages, it’s kind of expected that you’ll have a lot of links that all have that same anchor text that point to a given page,” he says.
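
To picture the kind of template-driven repetition Cutts is describing, here is a minimal breadcrumb sketch (the site, URLs and category names are hypothetical). Every product page built from this template links to the same category page with the same anchor text, which is exactly the sort of repeated internal anchor text he says is expected:

    <!-- Hypothetical breadcrumb template; example.com and the category names are made up. -->
    <nav class="breadcrumbs">
      <a href="http://example.com/">Home</a> &gt;
      <a href="http://example.com/fishing-lures/">Fishing Lures</a> &gt;
      <span>Deep Diving Crankbait</span>
    </nav>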

    So basically, this isn’t an issue you should have to worry about, but if you abuse it, it could become an issue. The problem is that clearly well-intentioned people are still worried about whether their practices will be considered abuse by Google, even if they think they’re just doing what’s right for the user.

    After the Penguin update, we saw a lot of overreaction in link removal requests by those who were afraid links from other sites were hurting them. In the process, because of their fear of Google, some requested the removal of links they would have otherwise found valuable. Google has since offered the Link Disavow tool, but Google even suggests that most people don’t use it.

    It will be interesting to see how people continue to approach links moving forward. This week marked the one-year anniversary of the Penguin update, and a big update to that is expected in the near future.

    Are people worrying about Google too much when it comes to links? Let us know what you think in the comments.

  • Google Image Search Changes Have Not Been Kind To Webmasters

    Earlier this year, Google launched a new design for its image search, and ever since, there has been a substantial amount of backlash from webmasters claiming that the changes have decreased the amount of traffic they get to their sites.

    Have you seen less traffic from Google Image Search since the redesign? Let us know in the comments.

    Webmasters complaining about changes made by Google is nothing new. Every time Google releases a major algorithm update like Penguin or Panda, the outcry is everywhere. But, like it or not, that’s Google trying to better its algorithm, and ultimately improve its search results. You could also argue that any traffic one site loses, another gains. Somebody wins.

    The Image Search story is a bit different, however. This is not an algorithmic change designed to point users to higher quality images or more relevant image results. It’s a cosmetic change, and while some users may find the experience to be an upgrade, it’s clear that many webmasters have not welcomed the redesign.

    We got over seventy comments about the changes on a previous article we published. Not many were positive. In fact, most were from webmasters talking about the traffic they lost almost instantly. Here are a few examples:

    “55% dropped for websites with images…”

    “My traffic has dropped to 1/5 of what it was before the new Google Images search roll out…”

    “My traffic was cut by half overnight…”

    “My image based website has lost 2/3 of the visitors after the change…”

    “Google image traffic has dropped by 50-70% on my site…”

    We could go on. See for yourself.

    That was back in January. It doesn’t appear that things have gotten much better.

    Define Media Group published some findings from a recent study on Monday (hat tip to Search Engine Land). According to the firm, you might as well spend your time in other areas of search engine optimization and online marketing, and not worry so much about optimizing for image search anymore.

    “We analyzed the image search traffic of 87 domains and found a 63% decrease in image search referrals after Google’s new image search UI was released,” explains Shahzad Abbas. “Publishers that had previously benefitted the most from their image optimization efforts suffered the greatest losses after the image search update, experiencing declines nearing 80%.”

    “In the eleven weeks after Google’s new image search was released, there has been no recovery – which means for image search, the significantly reduced traffic levels we’re seeing is the new normal,” he adds. “In the aftermath of the new image search experience, image SEO has been severely compromised, and we have no choice but to recommend deprioritizing image SEO when weighed against other search traffic initiatives.”

    Of course, there’s always the chance that your images could turn up in universal search results on Google’s web results pages, but even then, personalized “Search Plus Your World” results tend to get the emphasis when applicable.

    It’s all made even more interesting due to the fact that Google pitched the changes as good for webmasters, indicating that they would actually drive more traffic to sites.

    “The domain name is now clickable, and we also added a new button to visit the page the image is hosted on,” wrote associate product manager Hongyi Li in the announcement. “This means that there are now four clickable targets to the source page instead of just two. In our tests, we’ve seen a net increase in the average click-through rate to the hosting website.”

    “The source page will no longer load up in an iframe in the background of the image detail view,” Li added. “This speeds up the experience for users, reduces the load on the source website’s servers, and improves the accuracy of webmaster metrics such as pageviews. As usual, image search query data is available in Top Search Queries in Webmaster Tools.”

    It’s possible that some sites are seeing more traffic from the Image Search changes, and just aren’t being as vocal, but there has been an overwhelming amount of complaints since the redesign, and this new study is not doing anything to defend Google’s case.

    Of course, Google is all about placing users first (even over webmasters), and they’ll continue to do what they think is best for them. From a user experience perspective, the changes aren’t bad. But that’s little consolation for those who now have to find other ways to get their content in front of an audience.

    Do you see Google’s recent Image Search changes as a positive or a negative? Let us know in the comments.

  • Google Penalizes Mozilla For Web Spam [Updated]

    Update: It turns out that Google only penalized a single page from Mozilla. Matt Cutts weighed in on the “penalty” in that same forum thread (hat tip: Search Engine Land).

    Google has penalized Mozilla.org, the nonprofit site of the organization that provides the Firefox browser. This doesn’t appear to be an accident like what recently happened with Digg. This was a real manual web spam penalty.

    Mozilla Web Production Manager Christopher More posted about it in Google’s Webmaster Help forum (hat tip to Barry Schwartz), where he shared the message he got from Google:

    Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles. As a result, Google has applied a manual spam action to your site.

“I am unable to find any spam on http://www.mozilla.org,” said More. “I have tried a site:www.mozilla.org [spam terms] and nothing is showing up on the domain. I did find a spammy page on an old version of the website, but that is 301 redirected to an archive website.”

    Google Webmaster Trends analyst John Mueller responded:

    To some extent, we will manually remove any particularly egregious spam from our search results that we find, so some of those pages may not be directly visible in Google’s web-search anymore. Looking at the whole domain, I see some pages similar to those that Pelagic (thanks!) mentioned: https://www.google.com/search?q=site:mozilla.org+cheap+payday+seo (you’ll usually also find them with pharmaceutical brand-names among other terms).

    In addition to the add-ons, there are a few blogs hosted on mozilla.org that appear to have little or no moderation on the comments, for example http://blog.mozilla.org/respindola/about/ looks particularly bad. For these kinds of sites, it may make sense to allow the community to help with comment moderation (eg. allow them to flag or vote-down spam), and to use the rel=nofollow link microformat to let search engines know that you don’t endorse the links in those unmoderated comments.

More tips on handling UGC (and I realize you all probably have a lot of experience in this already) are at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=81749

    Also keep in mind that we work to be as granular as possible with our manual actions. Personally, I think it’s good to react to a message like that by looking into ways of catching and resolving the cases that get through your existing UGC infrastructure, but in this particular case, this message does not mean that your site on a whole is critically negatively affected in our search results.
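
For anyone wondering what Mueller’s rel=nofollow suggestion looks like in markup, here is a minimal sketch of an unmoderated comment link (the commenter URL is made up); the attribute tells search engines the site does not endorse the link:

    <!-- Hypothetical user-generated comment; the linked URL is made up. -->
    <div class="comment">
      <p>Great post! Visit <a href="http://spammy-example.com/cheap-pills" rel="nofollow">my site</a>.</p>
    </div>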

    Let this be a lesson to all webmasters and bloggers. Keep your comments cleaned up.

    Mozilla still appears to be showing up in key search results like for “mozilla” and for “web browser”. It’s not as bad as when Google had to penalize its own Chrome browser for paid links.

  • If You’re Dumping A Ton Of Pages On The Web, Do It Gradually, Says Matt Cutts

    Google posted a new Webmaster Help video today. This time, Matt Cutts addresses a question posed by fellow Googler John Mueller, who asks:

    A newspaper company wants to add an archive with 200,000 pages. Should they add it all at once or in steps?

    Cutts says, “I think we can handle it either way, so we should be able to process it, but if we see a lot of pages or a lot of things ranking on a site all of a sudden, then we might take a look at it from the manual web spam team. So if it doesn’t make any difference whatsoever to you in terms of the timing of the roll-out, I might stage it a little bit, and do it in steps. That way, it’s not as if you’ve suddenly dropped five million pages on the web, and it’s relatively rare to be able to drop hundreds of thousands of pages on the web, and have them be really high quality.”

    “An archive of a newspaper is a great example of that,” he adds. “But, if it’s all the same to you, and it doesn’t make that much of a difference, I might tend to do it more in stages, and do more of a gradual roll-out. You could still roll them out in large blocks, but you know, just break that up a little bit.”

    So it doesn’t sound like you’re going to have any major problems if you do it all at once (provided you’re not actually spamming Google with low quality content), but you might be raising a red flag with the web spam team, so it’s probably better to err on the side of caution, as Cutts suggests.

  • Matt Cutts On The SEO Mistakes You’re Making

    In the latest Webmaster Help video from Google, Matt Cutts discusses the biggest mistakes that a lot of webmasters are making when it comes to SEO.

    The top mistake, he says, is not making your site crawlable, which includes not having a domain.

After that, it’s not including the right words on the page. Think about what the user is going to type, and include those words, he says. Going further, also have the things that people would be likely to look for on the page. For example, include your business hours. If you’re a restaurant, include a menu in plain text – not just a PDF.

The third thing is that people are thinking too much about “link building” when they should really just be thinking about creating compelling content and marketing (including things like talking to newspaper reporters).

    The fourth thing is the lack of good titles and descriptions. Make sure you include something that makes people actually want to click on the search result, and something that will be helpful, should users bookmark the site. “Something that lets them know you’re going to have the answer they’re looking for – something that makes them understand this is a good resource,” says Cutts.
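
As a rough illustration of that advice (the business and wording are hypothetical), a title and meta description that tell searchers what they will find might look something like this:

    <!-- Hypothetical title and description for a local restaurant page. -->
    <title>Joe's Pizza – Downtown Springfield | Menu, Hours & Directions</title>
    <meta name="description" content="Wood-fired pizza in downtown Springfield. Open 11am–10pm daily. See our full menu, prices, and directions.">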

    Finally, the other big mistake is not taking advantage of webmaster resources, including (but not limited to) Google’s own Webmaster Tools. Cutts also suggests following Google’s blogs and webmaster videos, attending search conferences, talking to people online (including in Google’s forums), and even mentions Blekko’s link tool. Basically, just follow the SEO industry.

  • How Page Load Speed Impacts SEO And User Experience

    Since Spring 2010 Google has used page load times as a factor in its search ranking algorithms. Google’s position is that faster-loading pages should be ranked higher because they provide a better experience for users. In Google’s own words, “Faster sites create happy users […] Like us, our users place a lot of value in speed – that’s why we’ve decided to take site speed into account in our search rankings.”

    What do we know about Google’s page speed algorithms? What can webmasters do (both for Google and for users) to speed up page load times?

    How much does page load speed impact Google rankings?

Google representatives have stated on several occasions (for an example, see this post on the Google Webmaster Central Blog) that the page speed algorithm impacts rankings for less than 1% of search queries. There have also been a few reports from webmasters of page speed significantly impacting rankings.

    This data suggests that page speed doesn’t often impact rankings, but when it does the effect can be significant. One plausible interpretation (from Geoff Kenyon) is that “site speed will affect only queries where other ranking signals are very close or when the load time is exceptionally poor.” This interpretation seems to be consistent with Google’s statements and community feedback.

Should SEOs worry about site speed? Page rankings are often based on a combination of dozens of small algorithmic factors; therefore, though page speed is a minor factor, even a small boost could be beneficial for your site. Don’t obsess over page load speeds, but it would be a good idea to dedicate a small amount of your SEO time and/or budget to speeding up your site. Page speed is also one of the factors that is totally within your control, so it’s prudent to optimize it.

    How does Google measure page load times?

Google receives site and page load speed data from several sources; Google Chrome and Google Analytics are among the possible sources for this data.

    How fast are Google’s top 5 websites?

    According to data from Experian, nearly 20% of clicks from Google SERPs go to Facebook, YouTube, Yahoo, Wikipedia and Amazon. We used Google’s PageSpeed tool to see how well the home pages of these sites are following Google’s PageSpeed best practices:

    Pagespeed Scores

    While these figures shouldn’t be overemphasized, it is interesting to note that these top 5 websites definitely have opportunities to improve their page load speeds.

What is an average page load speed? Google has conducted research on average page load times, which can be used as a benchmark:

    Page Load Times

    How does page load speed impact user experience?

    Google implemented page speed into its algorithm because research shows that faster page load times mean happier users.

  • A Google study found that “slowing down the search results page by 100 to 400 milliseconds has a measurable impact on the number of searches per user.”
  • Shopzilla achieved a 25% increase in pageviews and a 7-12% revenue increase by speeding up its site.
  • AOL presented data showing that page load speeds can impact pageviews per visit by up to 50%.
  • A 1 second delay can decrease conversions by 7%.
  • 75% of users said that they would not return to a website that took longer than 4 seconds to load.
  • Nearly half of users expect webpages to load in 2 seconds or less.

Feel the need for speed? How to make your webpages pull a fast one

    Step 1: Measure.

Your first step should be to measure your site’s page load speed. This will provide a baseline measurement from which you can work to improve. Two good tools for measuring page load speeds are the Pingdom Page Load Time tool and Google Analytics’ Site Speed reports.
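
Alongside those tools, you can get a rough in-browser baseline with the Navigation Timing API; this is a minimal sketch (not a replacement for the tools above) that you could drop into any page you control:

    <!-- Minimal sketch: logs the full page load time in milliseconds once the load event has finished. -->
    <script>
    window.addEventListener("load", function () {
      setTimeout(function () {
        var t = window.performance.timing;
        console.log("Page load time: " + (t.loadEventEnd - t.navigationStart) + " ms");
      }, 0);
    });
    </script>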

    Step 2: Upgrade your server.

    Many dynamic websites have to execute hundreds of lines of code, respond to dozens of requests, and make multiple database queries to display a single page to a single user. Hosting your website on a more powerful server can result in webpages being served faster.

    If your site is hosted on a shared hosting account, consider upgrading to a VPS or dedicated server. VPS and dedicated servers typically allow your website to have more server resources (i.e. CPU and memory) available. A VPS costs about $25 – $100 per month, depending upon the technical specifications and features you select. A dedicated server usually costs $100-$300 per month, depending upon the server’s specifications and the level of support that is included.

    Read The Differences Between VPS and Dedicated Hosting if you need help determining which is the best choice for your site.

    Step 3: Optimize your code and files.

    There are many ways to optimize your server side code, HTML/CSS/Javascript code, and images to minimize page load times. Google and Yahoo both provide excellent lists of best practices that you can implement to decrease page load time.
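
As a rough sketch of a few of the front-end items on those lists (the file names are hypothetical), a template might serve one combined, minified stylesheet, give images explicit dimensions, and defer non-critical JavaScript so it doesn’t block rendering:

    <!-- Hypothetical page template illustrating a few common front-end optimizations. -->
    <head>
      <!-- One combined, minified stylesheet instead of many small ones -->
      <link rel="stylesheet" href="/css/site.min.css">
    </head>
    <body>
      <!-- Explicit dimensions let the browser lay out the page before the image arrives -->
      <img src="/images/hero.jpg" width="640" height="360" alt="Product photo">
      <!-- defer keeps this script from blocking rendering -->
      <script src="/js/site.min.js" defer></script>
    </body>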

    Conclusion

    1. Page load speed impacts website user satisfaction, site usability and conversion rates.

    2. Google uses page load speed metrics as a minor ranking factor.

    3. Load times can be decreased by upgrading your web hosting and by implementing a list of best practices provided by Google and/or Yahoo.

    What are your favorite tools and strategies for improving page load times? How fast is your site?

  • Bing Webmaster Tools Gets Geo-Targeting, New Malware Tool

    Microsoft has launched two new features for Bing Webmaster Tools: a new malware tool and geo-targeting.

    If Bing detects that a site is serving malware (either willingly or unwillingly) it will send a notification to the webmaster in Bing Webmaster Tools, as well as alert users before they actually visit the site.

    “The new Malware tool provides information about the malware we detected on the site,” Bing says in a blog post. “Previously some of this information was encompassed in the Crawl Information tool, but with the new Malware tool you not only get access to more details that help you understand the nature of the issue, it also allows webmasters to submit and track a malware re-evaluation request once they have cleaned their site.”

    When a webmaster submits a request, they can immediately track the status and progress right inside the malware tool.

    With geo-targeting, webmasters can provide Bing with info about the intended audience of their site or of a section of their site.

“Whereas other Webmaster Tools let you geo-target sites only at the site level, Bing Geo-Targeting provides you with more flexibility: multinational sites do not need to verify each section they want to geo-target separately,” says Bing. “Instead, Bing allows you to define a country affinity for your entire website or for sections of your website from within a single view and from within a single site account.”

    Geo-targeting can be done at the domain level, the subdomain level, the directory level or the page level. Geo-targeting input will be used as one of many signals Bing uses when determining when and where to show your pages to users.

  • Google Penguin Update Is A Year Old

A year ago today, webmasters and spammers (especially spammers) were rocked by an unexpected update from Google, which has affectionately become known as Penguin.

    Initially referred to as the “webspam update,” it didn’t take long for Google to give the name of another black and white animal to associate with the algorithm. The update was designed to algorithmically enforce Google’s quality guidelines, and dealt blows to numerous sites that did not abide. Of course not everyone affected believes they were truly doing anything wrong. The whole thing also sent waves of panic throughout the web, leading to excessive link removal requests.

    There have only been two additional Penguin updates since the first one. One came in May, and the other in October. Google has indicated that another big one is on the way.

    Peruse our coverage of Google’s Penguin update from the past year here.

    Hat tip to Search Engine Roundtable for remembering the date.

Image via wanart, where you can purchase the wooden card and send it to your bitter SEO rivals if you like.

  • Google Talks Keeping “No Results” Pages Out Of Index

    Google’s Matt Cutts takes on an interesting question in today’s Webmaster Help video:

    What is being done to detect and remove results from larger sites when they don’t have unique content that is relevant to a query (e.g. yelp.com results with no reviews, Facebook “business” pages that weren’t actually created by the business)?

    Cutts says he likes the question, but wouldn’t just restrict it to larger sites.

    “In general, we look at the value add, or you know, whether there’s some compelling value add, even at a page level, and we try to write algorithms to reflect that, but it is the case that sometimes you will find pages that get indexed that say, you know, ‘Zero reviews found,’ and so there’s basically no content to actually base your opinion on when you visit that page,” says Cutts.

    He continues, “So even starting back in 2009, I found a blog post that I did – ‘Give Google Feedback on No Results Pages,’ and so if people – it was a complaint even back then – people didn’t like having empty review sites, where you click through and it says there are no reviews for that product. So either do a spam report or show up at the forum or you might even still be able to use the form that I mentioned in that 2009 blog post.”

    He adds, “But basically, we are happy to say, ‘Hey look, if you are even doing search, and there’s no search results on that page, that’s the sort of thing that users get really angry about – they complain about. And so that is the sort of thing that, under our technical guidelines (if you look at our quality guidelines), we do say that we’re willing to prune out those sort of search results.”

    Here is the blog post he references. Here is a link to Google’s Quality Guidelines.

  • Matt Cutts On Penguin And Internal Links

    In the latest Webmaster Help video from Google, Matt Cutts responds to a question about Penguin’s effect on internal links that use the same anchor text. The exact question is:

Do internal website links with exact match keyword anchor text hurt a website? These links help our users navigate our website properly. Are too many internal links with the same anchor text likely to result in a ranking downgrade because of Penguin?

    “My answer is typically not,” says Cutts. “Typically, internal website links will not cause you any sort of trouble. Now, the reason why I say ‘typically not’ rather than a hard ‘no’ is just because as soon as I say a hard ‘no’ there will be someone who has like five thousand links – all with the exact same anchor text on one page. But if you have a normal site, you know…a catalog site or whatever…. you’ve got breadcrumbs…you’ve got a normal template there…that’s just the way that people find their way around the site, and navigate, you should be totally fine.”

    He continues. “You might end up, because of breadcrumbs or the internal structured navigation, with a bunch of links that all say the same thing, that point to one page, but as long as that’s all within the same domain, just on-site links, you know, that’s the sort of thing where, because of the nature of you having a template, and you have many pages, it’s kind of expected that you’ll have a lot of links that all have that same anchor text that point to a given page.”

    Long story short, this isn’t an issue you should have to worry about. Like with everything else, just don’t abuse it and make it an issue.

  • Google On Buying Spammy Domains: Don’t Be The Guy Left Holding The Bag

    In the latest Webmaster Help video from Google, Matt Cutts takes on an interesting topic. Can you buy a domain that has been penalized by Google for spam, clean it up and recover rankings?

    Well, it depends, and Cutts explains why.

    “This is a tricky question because on the one hand there’s algorithmic spam, and then there’s manual spam, and all manual spam does have an eventual time out, so if you were to completely clean up all the content on the domain, [and] do a reconsideration request, in theory, that domain can recover,” says Cutts. “However, on the algorithmic side, if there are a ton of spammy links that the previous owner built up, that can be a little bit hard to go through, and try to clean up and get all those links taken down, and make a list of all those links.”

He continues, “The way to think about it is, there are a lot of spammers out there that do basically what’s known as a ‘churn and burn’ tactic, where they just use as many techniques to try and make a domain rank as they can, and then as soon as that domain is awful or bad, or Google has caught it, then they sort of move on, and they go on to some other exploit, and they try to tackle it with another domain. Now what you don’t want to do is be the guy who gets caught left holding the bag.”

    Long story short: how bad do you really want this domain?

  • Google Introduces ‘x-default hreflang’ Annotation For Webmasters

    Google introduced a new rel-alternate-hreflang annotation for webmasters to specify international landing pages. It’s called “x-default hreflang,” and it signals to Google’s algorithms that a page doesn’t target a specific language or location.

    “The homepages of multinational and multilingual websites are sometimes configured to point visitors to localized pages, either via redirects or by changing the content to reflect the user’s language,” explains Google Webmaster Trends analyst Pierre Far. “Today we’ll introduce a new rel-alternate-hreflang annotation that the webmaster can use to specify such homepages that is supported by both Google and Yandex.”

    “The new x-default hreflang attribute value signals to our algorithms that this page doesn’t target any specific language or locale and is the default page when no other page is better suited,” says Far. “For example, it would be the page our algorithms try to show French-speaking searchers worldwide or English-speaking searchers on google.ca.”

If example.com/en-gb targets English-speaking users in the UK, example.com/en-us targets English-speaking users in the US, example.com/en-au targets English-speaking users in Australia, and example.com/ shows users a country selector and is the default page worldwide, then the annotation would look something like this:

    <link rel="alternate" href="http://example.com/en-gb" hreflang="en-gb" /> 
    <link rel="alternate" href="http://example.com/en-us" hreflang="en-us" /> 
    <link rel="alternate" href="http://example.com/en-au" hreflang="en-au" /> 
    <link rel="alternate" href="http://example.com/" hreflang="x-default" />

    The annotation can also be used for homepages that dynamically alter their contents based on users’ geolocation or the Accept-Language headers.
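
One usage note, based on how Google documents hreflang annotations generally rather than on this announcement specifically: the annotation group is expected to be repeated on every page in the set, including a self-referencing entry, so the <head> of the en-gb page in the example above would carry the same four link elements:

    <!-- The <head> of http://example.com/en-gb repeats the whole group, x-default included. -->
    <link rel="alternate" href="http://example.com/en-gb" hreflang="en-gb" /> 
    <link rel="alternate" href="http://example.com/en-us" hreflang="en-us" /> 
    <link rel="alternate" href="http://example.com/en-au" hreflang="en-au" /> 
    <link rel="alternate" href="http://example.com/" hreflang="x-default" />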

  • Matt Cutts On Web Hosts That Host Spam

In the latest Google Webmaster Help video, Matt Cutts discusses whether a site hosted by a hosting service that also hosts spam has to worry about a negative impact on its rankings. The answer is pretty much no, but he talks about some “corner cases” where it’s a little more complicated.

    “Typically a hosting company has a lot of different stuff on it. Some of it will be good. Some of it will be bad,” says Cutts. “There will be some spam, but just because you happen to be on an IP address or using a hosting company that also hosts some spam, that doesn’t mean that you should be affected.”

    “There have been a very few situations…I remember several years ago there was a really bad hosting company,” he says. “I’m not even sure whether you would call them a hosting company. It was something like 27,000 sites of porn spam and like two legit sites. On one IP address…so if you were one of those two legit sites…in order to catch the 27,000 porn sites…you know, that was something where we were like, ‘Well, okay that’s really pretty excessive.’”

    He says they haven’t looked at IP address in a “long, long time” in terms of gathering sites. In most situations, you don’t have to worry about it.

  • Why Google Changes Your Rankings Over Time

    There has probably been at least one time when you noticed that one of your pages used to rank for a certain search query, but then later dropped for some unexplained reason. Matt Cutts, in the latest Google Webmaster Help video, talks about why this might be the case.

    Cutts responds to the following submitted question:

    When we create a new landing page with quality content, Google ranks that page on the top 30-50 for targeted keywords. Then why does the rank get decreased for the next 2 to 3 weeks? If pages didn’t have required quality, then why did it get ranked in the first week?

    “That’s a fun question because it sort of opens up how writing a search engine is kind of a complex task,” says Cutts. “You’re basically trying to make sure that you return the best quality result, but you also have to do that with limited information. For example, in the first minute after an earthquake, you might have different people, you know, saying different things. You know, ten minutes after an earthquake you have more information. An hour after an earthquake you have a lot more. With any event that has breaking news, it’s the sort of thing where it can be hard to know, even if multiple people are all saying the same thing, and one person might be the original author, one might be using that RSS. It can be difficult to try to suss out where was this content appearing originally.”

    “And over time – over the course of hours, or days, or weeks – that gets easier,” he continues. “But it can be harder over the course of just minutes or hours. So a lot of the times whenever you see something ranking for a while, we’re taking our best guess, and then as more information becomes available, we incorporate that. And then eventually, typically, things settle down into a steady state. And then when there’s a steady state, we’re typically able to better guess about how relevant something is.”

    As Cutts goes on to note, Google finds that freshness is deserved for some queries, while evergreen content works better for others. In my experience, Google struggles with this a lot, but seems to give more weight to freshness more often than not. Of course, I’m typically writing about newsy topics, so that makes sense to some extent (though there are plenty of times in researching topics that freshness gets a little too much weight).