WebProNews

Tag: SEO

  • SEO Reports in Google Analytics

    Google announced the launch of a limited pilot for SEO reports in Google Analytics, which are based on search queries data from Webmaster Tools.

    “Webmasters have long been asking for better integration between Google Webmaster Tools and Google Analytics,” Google says on the Webmaster Central Blog.

    The SEO reports also take advantage of Google Analytics’ filtering and visualization capabilities for deeper analysis, Google says. “For example, you can filter for queries that had more than 100 clicks and see a chart for how much each of those queries contributed to your overall clicks from top queries.”

    Google SEO Reports from Webmaster Tools data in Google Analytics

    Search queries data includes:

    • Queries: The total number of search queries that returned pages from your site in results over the given period. (These numbers can be rounded, and may not be exact.)
    • Query: A list of the top search queries that returned pages from your site.
    • Impressions: The number of times pages from your site were viewed in search results, and the percentage increase/decrease in the daily average impressions compared to the previous period. (The number of days per period defaults to 30, but you can change it at any time.)
    • Clicks: The number of times your site’s listing was clicked in search results for a particular query, and the percentage increase/decrease in the average daily clicks compared to the previous period.
    • CTR (clickthrough rate): The percentage of impressions that resulted in a click to your site, and the increase/decrease in the daily average CTR compared to the previous period.
    • Avg. position: The average position of your site on the search results page for that query, and the change compared to the previous period. Green indicates that your site’s average position is improving. To calculate average position, we take into account the ranking of your site for a particular query (for example, if a query returns your site as the #1 and #2 result, then the average position would be 1.5).

    Webmasters can use the search queries data to review the query list for expected keywords and compare impressions and clickthrough rates. It can also be helpful for keyword ideas for paid search campaigns.

    “We hope this will be the first of many ways to surface Webmaster Tools data in Google Analytics to give you a more thorough picture of your site’s performance,” said Trevor Claiborne of the Google Analytics Team. “We’re looking forward to working with members of the pilot to help us identify the best ways to make this happen.”

    If you’re both a Webmaster Tools verified site owner and a Google Analytics admin, you can sign up for the pilot here. Each individual user must sign up for the pilot if they want access to the new reports.

  • Who You Are Becoming More Important in Google

    Google announced today that it is now supporting authorship markup, which it will use in search results. The company says it is experimenting with using this data to help people find content from authors in search results, and will continue to look at ways it could help the search engine highlight authors and rank search results.

    This seems to indicate that Google will be placing even more emphasis on authority and/or personal connections with content. We have to wonder how this will affect content farms down the line.

    In the Webmaster Central Help Center, Google says, “When Google has information about who wrote a piece of content on the web, we may look at it as a signal to help us determine the relevance of that page to a user’s query. This is just one of many signals Google may use to determine a page’s relevance and ranking, though, and we’re constantly tweaking and improving our algorithm to improve overall search quality.”

    “We now support markup that enables websites to publicly link within their site from content to author pages,” explains software engineer Othar Hansson on Google’s Webmaster Central Blog. “For example, if an author at The New York Times has written dozens of articles, using this markup, the webmaster can connect these articles with a New York Times author page. An author page describes and identifies the author, and can include things like the author’s bio, photo, articles and other links.”

    “The markup uses existing standards such as HTML5 (rel="author") and XFN (rel="me") to enable search engines and other web services to identify works by the same author across the web,” continues Hansson. “If you’re already doing structured data markup using microdata from schema.org, we’ll interpret that authorship information as well.”

    Schema.org was revealed last week – an initiative on which Google, Bing, and Yahoo teamed up to support a common set of schemas for structured data markup on web pages. Schema.org provides tips and tools to help sites appear in search results.

    How to Implement It

    To implement the authorship markup, Google says:

    To identify the author of an article or page, include a link to an author page on your domain and add rel="author" to that link, like this:

    Written by <a rel="author" href="../authors/mattcutts">Matt Cutts</a>.

    This tells search engines: “The linked person is an author of this linking page.” The rel="author" link must point to an author page on the same site as the content page. For example, the page http://example.com/content/webmaster_tips could have a link to the author page at http://example.com/authors/mattcutts. Google uses a variety of algorithms to determine whether two URLs are part of the same site. For example, http://example.com/content, http://www.example.com/content, and http://news.example.com can all be considered as part of the same site, even though the hostnames are not identical.

    You can also link multiple profiles, as author pages can link to other web pages about the same author. You can tell Google that all of these profiles represent the same person by using rel="me" links to establish connections between the profile pages. More on this in the help center.
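
    To picture that cross-linking, here is a minimal sketch of an author page tying its other profiles together with rel="me" (the URLs are hypothetical, for illustration only):

    <!-- On the author page, e.g. http://example.com/authors/mattcutts -->
    <a rel="me" href="http://twitter.com/mattcutts">My Twitter profile</a>
    <a rel="me" href="http://profiles.google.com/mattcutts">My Google profile</a>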

    Google’s rich snippets testing tool will also let you check your markup and make sure Google can extract the proper data.

    Google has already been working with a few publishers on the authorship markup, including The New York Times, The Washington Post, CNET, Entertainment Weekly, and The New Yorker. They’ve also added it themselves to everything hosted by YouTube and Blogger, so both of these platforms will automatically include the markup when you publish content.

  • Good SEO Starts with Smart Purchasing Decisions

    I don’t know about you, but sometimes I get completely overwhelmed with the sheer amount of time, energy and raw hours that go into properly marketing a website online. The thing that gets me the most is that with SEO and other forms of online marketing, there really is no situation when you can sit back and say “we’ve arrived.” Once you optimize a site, there are still so many things that can be assessed, analyzed, uncovered and corrected that you never really can say, “It’s Miller time!”

    This is what I envy about web designers. They get to produce a finished work, then go and collect awards for it. But online marketing – that’s a different ballgame altogether. Sure, we can celebrate top rankings, but tomorrow there is another keyword that needs improvement!

    Making a Smart Purchasing Decision

    Ninety percent of the online marketing services my company provides are based on the amount of time we guesstimate the job will take to get results. There are a few expenditures the clients may have to buy into (directory submission fees, requested analytics tools, etc.), but most of the cost associated with SEO services comes down to determining how many hours are needed on a month-to-month basis.

    We look at time needed for researching, writing, analyzing, tweaking, optimizing, communicating, reporting and linking, just to name a few. Sometimes I think it’s difficult for clients to fully appreciate the time invested in doing a job properly, especially when they see “less expensive” options floating around. Sure, you can hire some kid down the street to mow your lawn, or you can hire the gardener to take care of your lawn, garden and flowerbeds and to get rid of unwanted rodents, weeds and other pests while making sure everything is properly fertilized and pruned each week. The time difference between the two is substantial.

    The problem comes, in SEO at least, when many people expect to hire the gardener at lawn-mower-kid wages. There is just no way the gardener can do their job effectively in the time it takes for the kid to mow the neighbor’s lawn across the street. Can’t happen.

    How Much Time Does a (Good) Job Take?

    When it comes to purchasing an SEO or SEM strategy for your online business, there are two things to consider: How many hours does it take to meet your expectations, and how much are you willing to pay for each hour that goes into meeting those expectations?

    Many SEOs charge a pre-determined package price. That just means they have pre-determined how many hours they will be providing you for their service. If you purchase an SEO package for $3000 per month, you can get anywhere from 30 hours ($100/hour) to 10 hours ($300/hour). The question you have to ask yourself is – can the $100/hour guy get the same results as the $300/hour team?

    If you can confidently say yes, then maybe that’s your guy. If not, maybe you need to consider the more “expensive” option. But we all know, cheap and ineffective usually turns out to cost a lot more than the expensive option that gets results!

    Ten hours per month on SEO or SEM doesn’t seem like much, but in the right hands, a lot can be accomplished. Here is a simple breakdown of what I would consider the average, high-quality SEO campaign:

    • Site Architecture and Site-Wide SEO: five to 10 hours needed at the onset to analyze the initial site architectural problems and create a concatenation schema to make all pages “search engine friendly.”
    • Keyword Research: initially, up to five hours to research the site’s core terms, determine which pages/keywords are a top priority for optimization and create an optimization plan moving forward. An additional 30-60 minutes of keyword research can go into each specific page being optimized.
    • On-Page Optimization: one to two hours per page to optimize keywords into the text, streamline the code (if necessary) and implement onto the site.
    • SEO Maintenance: two to four hours each month to review past optimization efforts and implement tweaks and changes designed to improve site performance. This also includes reviewing site usability and conversion issues.
    • Link Building/Social Media: five to six hours each month, at a minimum. New or competitive sites can, and often do, need much stronger link building or social media campaigns.
    • Analytics and Testing: three to five hours per month. No SEO campaign is complete without some way to analyze the overall performance of the optimization, usability and conversion improvement efforts that are being invested. The better the analysis, the more hours that must be invested.

    These numbers can fluctuate depending on the size of the site, but this is what we would consider a pretty basic campaign. If you’re looking for the best pricing option, how much from this do you feel you can cut before you’re cutting into your success?

    That’s the key question. If you’re looking solely at pricing and not factoring in the actual work, you’re bound to make a bad purchasing decision. The real question is, will the price you’re paying (or willing to pay) give you the ROI you need to make a profit? It’s probably not a good idea to purchase SEO until you can answer that question affirmatively.

    Originally published at E-Marketing Performance

  • DaniWeb Forum Hurt By Google Panda. Why?

    If you feel your site was wrongfully hit by Google’s Panda update, there might be hope for you yet. We recently looked at a couple sites who have seen some minor recovery since being hit hard by the update, and since then, we’ve spoken with Dani Horowitz, who runs the IT discussion forum DaniWeb (one of those sites) about what she’s been doing to get back into Google’s good graces.

    Should forum content rank well in Google search results when relevant? Comment here.

    DaniWeb’s US traffic went from about 90,000 visitors per day down to about 40,000 per day after the update, she tells WebProNews. This sent her into “complete panic mode”.

    “I just went into crazy programmer SEO mode, just removing duplicate content and things like that,” she says. She thinks duplicate content may have been a big factor – specifically, duplicate content and its relationship to backlinks.

    “We syndicate our RSS feeds, and there are a lot of websites out there that syndicate our content, duplicate our feeds legitimately…they just take our RSS feeds and they syndicate that,” she explains, noting that many of these sites were linking back to DaniWeb.

    “My hypothesis right now is that Google Panda figured out all these sites are really content farms – are really just syndicators, and we just lost half our backlinks,” she says. “So I think it might not necessarily be that Google is penalizing us for being a content farm, but that Google is penalizing all the content farms that are syndicating our content, effectively diminishing the value of half of our backlinks.”

    What DaniWeb Has Done to Aid Recovery

    First off, she says she entirely redid the site’s URL structure. The actual URL of every single page has changed, Horowitz says.

    She removed tag clouds, which were at the bottom of every single page, saying that Google frowns upon these because they can look like keyword stuffing. “What I went and did was made my tag clouds actually populate via JavaScript in such a way that it actually improves page load time for the end user because they’re not cached, except Google can’t actually spider the actual tag cloud pages, because I added them to the robots.txt file.”
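
    As a rough sketch of the approach she describes (the element ID and URL path here are hypothetical), the tag cloud becomes an empty container filled in client-side, so it never appears in the HTML that crawlers fetch:

    <!-- In the page template: an empty container, populated after load -->
    <div id="tag-cloud"></div>
    <script>
      // Fetching the tag cloud after the page renders keeps it out of the
      // crawled markup and off the critical rendering path
      fetch('/tag-cloud-fragment')
        .then(function (response) { return response.text(); })
        .then(function (html) {
          document.getElementById('tag-cloud').innerHTML = html;
        });
    </script>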

    It’s been established that Google takes page speed into consideration as a ranking factor, so certainly this could only help (though it does make you question Google’s whole philosophy of “creating pages for users and not for search engines”). In fact, Horowitz recently showed the correlation of pages Google was indexing with the improvements in page load time:

    Pages Crawled vs load time from daniweb

    Horowitz says she added a robots.txt exclusion for all search results pages, because Google also frowns upon having search-like pages in its index. Google wants to be the search engine itself, and point to the content – not to other search results.
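
    In robots.txt terms, that kind of exclusion takes only a couple of lines (the paths below are hypothetical, not DaniWeb’s actual structure):

    User-agent: *
    Disallow: /search/
    Disallow: /tags/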

    She made heavy use of nofollow and noindex tags. “Basically what I did was I took hundreds of thousands of pages out of Google’s index from our domain, but hopefully the advantage being beneficial to the end users…”

    Specifically, she noindexed forum posts with no replies, hoping that Google will recrawl them and start indexing them after they do get replies. She notes that this is simply an experiment.
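
    A thread template could express that with a robots meta tag, roughly like so (a sketch, not DaniWeb’s actual implementation):

    <!-- Emitted in the HEAD only while the thread has zero replies;
         noindex keeps the page out of the index, while follow still
         lets the crawler traverse its links -->
    <meta name="robots" content="noindex, follow">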

    Finally, she made the Facebook and retweet buttons more prominent. Clearly, Google is moving more and more toward social as an indication of relevancy, so this can’t hurt either.

    Horowitz notes that it is entirely possible that the uptick in post-Panda traffic might also be related to other updates Google has implemented since the Panda update. They make changes on a daily basis, and it could simply be that DaniWeb was positively impacted by a different tweak.

    Forums and Their Value to Search Results

    With the Panda update being all about the quality of search results and the content they deliver, we asked Dani about her thoughts on the value that forums have in this department.

    “Forums are in my opinion the best way to get content online, and to get the answers to questions that people want online, where you have not just a single publisher or an editor and team of staff writers, but actually [are] able to poll the entire Internet and [are] able to get expertise from anyone who has it,” she says. “I definitely think that forums are growing. They’re not going to end anytime soon,” she adds, noting that they may change in format.

    “It is a double-edged sword, because you have all this great content that’s contributed by the people who know the content best – know the answers best – as opposed to being limited by a team of staff writers, but the flip side is you have people who are not talking in 100% U.S. English, and you have people that don’t have correct grammar, and you have spelling mistakes,” she continues. “So now, we’re leaving it up to Google’s algorithm to try to figure out which…if someone is querying Google…which page has the correct answer. Is it the page that is written by some staff writer that doesn’t necessarily have a complete interest in the topic, but does have a three-paragraph/five-paragraph article that’s written in full-sentence English, or is it written by someone who’s a complete expert in the topic, and knows everything…but maybe isn’t a native English speaker and is writing in broken English with lots of spelling and grammar mistakes? It’s hard to have an algorithm try to figure out which is the better result to show.”

    Google did include the question, “Does this article have spelling, stylistic, or factual errors?” in its recently released list of “questions that one could use to assess the ‘quality’ of a page or an article”.

    Better Google Results?

    When asked if she thinks Google’s results are better now, she told us that the rankings for DaniWeb content have gotten a bit weird. She says that they were ranking for “round robin algorithm” (a computer science term that would make sense in terms of DaniWeb’s content) before the Panda update, but not after the update. Meanwhile, DaniWeb started ranking for the odd keyword “rectangle” after the update (though this was no longer the case after she posted about it in the Google Webmaster forum).

    “Before Panda, we were ranking number one for some really great articles that were very relevant,” she says. “Post Panda all of our number one rankings for all of these great articles went down, but we started ranking for some really weird stuff.”

    She also noted that her experience searching with Bing “sucked”.

    Recovery?

    To be clear, it’s not as if DaniWeb has experienced a full recovery since the uptick in traffic began. “We’re still nowhere near where we were before,” she says. “We’re still down nearly 50% but literally we just stopped the bleeding, and there’s [been] a very small improvement week after week the past three or four weeks, but if nothing else, it’s not going down anymore…”

    She’s still looking at other things that can be done, and concentrating on building backlinks – trying to create great linkbait.

    Do you think DaniWeb should have lost Google rankings? Tell us what you think.

  • Delegating Your SEO Tasks Into a Successful SEO Campaign

    For a lot of people, running a business is about making money. But, for me there is so much more to running my company than that. Sure, I love the opportunity to create profits, but I also love that I can do what I love, set my own schedule, and work with some of the best people in the industry.

    I also enjoy trying to build the very best business possible. I want to see my clients succeed. I always tell them, it’s in our best interest that the work we do for them is profitable. If it isn’t, we lose a client!

    As a manager, I want to see my team succeed. I want to give them the opportunity to build their strengths, and explore new areas, all while trying to create an environment where they do not dread coming to work on Monday morning.

    Very few people are an island unto themselves, and even fewer have succeeded solely on their own. I admire many of the sole practitioners in the SEO industry, but frankly, I don’t understand how they do it. It is a very difficult role to be an expert in SEO, link building, copywriting, analytics, PPC, social media, conversion analysis and coding all at once. Not to mention the time spent blogging, reporting, reading, analyzing, testing, and keeping up with the latest industry changes. That sounds like several full-time jobs to me, so kudos to those that can do it all!

    I figure they either have more hours in the day than I do, or they are getting paid an hourly rate which I have yet to attain!

    But, the job I particularly enjoy is the role of Project/Client Manager. As much as I love everything about SEO, I love running the business even more. As our company has grown, it has taken quite a bit of effort for me to let go of my old jobs and delegate those responsibilities to others. But, I can’t do it all. This, after all, is the purpose of having a team.

    A Cord of Three Strands is Not Quickly Broken

    D.L. Moody once said, “You can do the work of ten men, or get ten men to do the work.” For a business owner, doing the work of ten men yourself has its advantages. It puts more money in your pocket while also building feelings of pride and self-accomplishment. You have no one to blame for mistakes, and you can make sure the job gets done right the first time.

    The gains with such a do-it-all-yourself mentality can be substantial. But, what you lose is often far more valuable than what you gain.

    I used to not mind working 10-12 hours a day, but over the past couple of years, I realized that I wasn’t leaving much time for my kids. Not as much time as they would have liked, anyway.

    Several years ago I talked to a very successful businesswoman about her company. She told me that she decided early on that she was not going to work more than eight hours in a day. Today she flies all over the country and performs seminars for business owners looking to capitalize on their wealth, but is still determined to keep her workload to something that can be managed in those 40 hours each week.

    This flies in the face of the mentality of most small business owners, including mine at that time. We’re told that you have to put in 50-80 hour work weeks in order to succeed. Maybe this is true for a lot of small businesses, but unfortunately, once most start down that path, they find it hard to slow down and start delegating responsibilities. This, I believe, hinders their growth potential and leads to stress, burnout, and, in many cases, a lot of problems in their personal lives.

    I think the goal of any business owner should be to grow their business to the point where the business operates effectively without their involvement. The owner continues to collect a paycheck from their investment, while doing very little of the ongoing work.

    I know this is my goal, at least.

    That means that I have to focus more on growing the company and less on doing the work of the company. I make it a point to find good people that know (or can learn) more than I do about key service areas we offer. I might be able to be really good at any number of things, but I can’t be an expert in all of them. So I’ll find someone who is.

    Learning the Art of Delegation

    I never read the book Robin Hood, but years ago a friend of mine who did told me something interesting about the story. Robin would never let anyone into his gang that couldn’t beat him in a fight. If Robin could kick their butt, they were out. But, if they could kick his, then they were welcomed in.

    I try to use that same principle with my business. I want to hire people who know (or are capable of knowing) their area of expertise better than I do. Hiring this way ensures I get quality people, and I have less to worry about when delegating responsibilities to them.

    Unfortunately, too many business owners and managers are unable–or unwilling–to delegate responsibilities, despite the fact that this often holds them back from greater success.

    There are four main reasons people don’t delegate:

    Fear of losing authority

    One of the greatest fears managers and bosses have is that their employees may end up knowing more about something than they do. Once this happens, they fear, the employee will leave the job for greener pastures, demand more pay, or worse, take their job title from them.

    Poor managers combat this by holding on to certain jobs and overburdening themselves with busy work that would best be handled by someone else. By clinging to position or power out of fear, they inadvertently end up sabotaging the business.

    Delegation requires trusting others to make important decisions and allowing them to gain the knowledge and the skills necessary to do that without your input. That can be a scary thing.

    Fear of work being done poorly

    It’s often very true that if you want something done right, you have to do it yourself. But at the same time, if you don’t want to have to do everything yourself, you’ve got to delegate tasks to others.

    Will they ever fail? Yes. Will they cause delays, loss of money, and even lose a client or two? There is certainly that risk. But, there is no reward without a bit of risk attached. Properly implemented delegation can take small risks and turn them into far greater rewards.

    Fear of work being done better

    Pride can be a very strong inhibitor to doing the right thing. Smart business people surround themselves with people that have potential to shine, and help them achieve greatness!

    While he was president, Ronald Reagan had a plaque in the Oval Office that read, “There is no limit to what a man can do or where he can go if he doesn’t mind who gets the credit.” People in leadership positions often want the credit for their leadership capabilities. However, delegating means that somebody else might be recognized for a job well done that you may have had a strategic hand in.

    You can see that as someone else taking credit away from you, or see it as a credit to yourself for helping this person get such accolades. Instead of being afraid that someone below you will get rewarded for a job well done, you can take credit for finding, training, and building a team that is extraordinary.

    If you’re worried about losing your position, then chances are you’re not doing what you need to do to keep it.

    Unwillingness to take the necessary time

    I’m a very task-oriented person. I know what needs to be done, know how to do it and can do it faster than anybody else I know. (I’m humble, too!) That means I have little patience when others I’ve delegated tasks to are not as quick as I am.

    Therein lies the difficulty in delegating. You usually only do it when you can’t handle the workload any more and, by that time, you’re so swamped you don’t have the time to bring someone new up to speed.

    But, this is also why delegating early is so important. By delegating, each task may take more time individually, but collectively you get a lot more done in the same amount of time. Doing the work of ten men may seem noble, and give you a nice boost in pride, but it’s been said that nobody lies on their deathbed wishing they had spent more time at work!

    Whether you’re running an online business, are a marketing manager or perhaps managing an SEO firm, delegating your SEO responsibilities isn’t just about freeing up your time to do more things outside of work (clearly a benefit), it’s also about freeing up your time at work to be the brains rather than the brawn. Letting others do the “important” operational tasks frees you up to provide more oversight, develop new ideas and make your company more profitable.

    Theodore Roosevelt said, “The best leader is the one who has a sense to pick good people to do what he or she wants done, and enough self-restraint to keep from meddling with them while they do it.” Good SEO delegation creates a business far greater than the sum of its parts.

    Originally published on E-Marketing Performance

  • The Latest On Panda Straight From Google

    Google’s Matt Cutts engaged in a live chat with webmasters on YouTube, and had some things to say about the Panda update.

    Barry Schwartz posted the above video, capturing a Panda-related segment of the chat, in which Cutts discusses the update.

    “It came from the search quality team,” he says. “It didn’t come from the web spam team, so web spam engineers have been collaborating with search quality folks on it since the initial launch, but it originated from the search quality team, and it’s just an algorithmic change that TENDS to rank lower quality sites lower, which allows higher quality sites to rank higher, so it’s not a penalty, and I talked about how algorithms are re-computed, so there’s been no manual exceptions.”

    “I don’t expect us to have any manual exceptions to Panda,” he says. “This is something where the signal is computed, and then when the signal is re-computed, if the sites are slightly different, then that can change the sites that are affected, and we’re going to keep iterating.”

    “So we’ve had Panda version 1 in February and Panda version 2 in April I believe, and…possibly March…and that started to use blocking of sites along with some other signals,” he continues. “And then we’ve had smaller amounts of iterations…”

    Referring to before the update came about, he says, “We had heard a lot of complaints. We’ve been working on it even before we’d heard a lot of the complaints to try and make sure that lower quality sites were not ranking as highly in Google search results.”

    He then mentions the list of questions Google released a few weeks ago for content providers to ask themselves about their own content quality. The list, he says, “Helps to step into the Google mindset and how we think about these sorts of things.”

    In the talk, Cutts mentioned that the update will still roll out internationally in other languages in time, “maybe in the next couple months”. So far, it’s been launched globally, but only in the English language.

  • Google Panda Victim EzineArticles Calls on Users to Improve Link Quality

    Since Google’s Panda update, we’ve been looking at a lot of the sites negatively impacted, as examples to learn from. Determining what these sites have been doing wrong can help us understand how other sites may be viewed in Google’s eyes.

    We’ve also been looking at what some of these sites have been doing to try and recover some of their lost search traffic. Some sites have indeed seen an uptick in search referrals, after being victimized by the update.

    EzineArticles has been one of the more widely-publicized victims of the update, and also one of the most vocal in terms of reaching out to its users in efforts to improve content. On the company blog, there have been numerous tips and guidelines discussed in recent months. This continues with a new post from a managing editor, who discusses landing page quality for article contributors.

    The editor lists a few “article rejection-worthy scenarios,” including: linking to a site unrelated to the article content, duplicating article content on landing pages, having more than one exit pop-up, having limited or poor user-navigation or forcing users to affiliate pages without transparent intent, and having a poor or unbalanced ad-to-content ratio.

    In terms of navigation, the editor specifically mentions slow-loading landing pages.

    While much of the guidance EzineArticles has been giving to users of late has been focused on the content that actually appears on EzineArticles itself, it is interesting to see them now turning focus to the content that EzineArticles is linking to.

    In other words, the goal is not only to improve the site’s content, but to improve the content that is being associated with it in the eyes of the search engines.

    Linking out to poor quality content is not good SEO. This isn’t new to Panda. It’s a pretty old, well-known element of the game. EzineArticles’ reminder of this to its users somewhat reflects a point we discussed in another recent article. Panda victims will do well not only to dwell on the Panda update specifically, but to get back to SEO basics.

    Remember, Google has over 200 signals, and just because your site may have gotten hit by the Panda update doesn’t mean there aren’t other SEO practices you could be doing better – practices that may have benefited you all along, pre- and post-Panda.

    The point about site speed is a valid one too. We know Google uses site speed as a ranking factor. We don’t know the weight of it (though DaniWeb has some interesting stats related to this). If Google views the landing page as a lower-quality page because of its load time, it can’t help the article linking to that page either.

    So, if EzineArticles is able to get its users to take quality more seriously on their own sites, it could go a long way in helping Google’s perception of EzineArticles itself. Whether users will follow the advice and this will happen remains to be seen.

    It’s worth noting that EzineArticles is still running some pretty ad-heavy content. At the time of this writing, this is the top article on the site. There are three Google ads above the content body, six ads directly below it, two Google ads below those ads, a bunch of “related” links below those, and five more ads below those. To the right of the article body there are ten Google ads, stretching a vertical length of about twice the amount of the body itself. Oh, and there are four more Google ads at the very top of the page, above the title. This seems to be the basic article template the site is running with.

  • Are Some Sites Recovering From The Google Panda Update?

    It would appear that some of the victims of Google’s Panda algorithm update are starting to see at least slight recoveries after using some elbow grease. A couple examples of sites that have gained some attention for upswings in traffic post-Panda, after getting hit hard by the update, are DaniWeb and One Way Furniture.

    Have you seen any recovery in search traffic since Panda hit? Let us know.

    DaniWeb Sees an Uptick in Traffic Post-Panda

    DaniWeb is an IT discussion community site. It’s a place where people can go to discuss issues related to hardware, software, software development, web development, Internet marketing, etc. This is exactly the kind of site that can actually provide great value to a searcher. I can’t tell you how many times I’ve had some kind of frustrating software issue only to find the solution after a Google search pointing me to a discussion forum with people openly discussing the pros, cons, and merits of a given solution or idea. The very fact that it is a discussion forum means it is a potentially great place for different angles and ideas to any given topic, with the ongoing possibility of added value. More information means you can make better informed decisions.

    Sure, there is no guarantee that all of the information is good information, but that’s the beauty of discussion. There is often someone there to shoot down the bad. The point is, many searchers or search enthusiasts might take issue with a site like DaniWeb being demoted in search because of an algorithm change that was designed to crack down on shallow and lesser-quality content.

    The good news for DaniWeb, and anybody that finds it to be a helpful resource, is that since being hit by the update it is starting to bounce back. To what extent remains to be seen. Time will tell, but Dani Horowitz, who runs the site, recently revealed a Google Analytics graph showing an upswing:

    Daniweb traffic Panda and Post-panda

    “The graph indicates a slight dip towards the end of February when just the US was affected by Panda, and then a huge dip when Panda went global,” she says. “However, you can see that over the past couple of weeks, traffic has been on the upswing, increasing day after day. We’re not yet near where we were before Panda, but there definitely is hope that we will get back there soon.”

    “DaniWeb has recovered from Google Panda … Sorta” http://bit.ly/liGYiT

    She is careful to note, “Many algorithm changes have already gone into effect between when Panda first was rolled out and today. Therefore, I can’t say without a doubt that our upswing is directly related to us being un-Pandalized in Google’s eyes and not due to another algorithm change that was released. In fact, in all honesty, that’s probably what it is.”

    Still, it should serve as a reminder that Panda isn’t everything. Google has over 200 ranking signals, don’t forget.

    One Way Furniture Slowly Climbs Back Up

    If you’re a regular reader of WebProNews or have been following the Panda news, you may recall earlier this month when NPR ran a story about a furniture store called One Way Furniture that had been feeling the wrath of the Panda, mainly due to its use of unoriginal product descriptions, which the e-commerce site was drawing from manufacturer listings.

    Internet Retailer Senior Editor Allison Enright spoke with One Way Furniture CEO Mitch Lieberman this week (hat tip to SEW), and he said that the site is slowly climbing back up in the search rankings. “It’s been extremely challenging, but exciting, too,” he is quoted as saying. “Even in a downturn like this, it is exciting to see the effects of what you are doing to get you back to where you were.”

    How They Are Doing It

    So great, these sites are evidently working their way back into Google’s good graces. How does that help you? Luckily, they’ve shared some information about the things they’ve been doing, which appear to have led to the new rise in traffic.

    “In a nutshell, I’ve worked on removing duplicate content, making use of the canonical tag and better use of 301 redirects, and adding the noindex meta tag to SERP-like pages and tag clouds,” says Horowitz. “I’ve also done a lot of work on page load times. Interestingly enough, I’ve discovered that the number of pages crawled per day has NOT decreased in tandem with Panda (surprisingly), but it HAS been directly affected by our page load times.”

    Look at the correlation between DaniWeb’s pages crawled per day and time spent downloading a page:

    Pages Crawled vs load time from daniweb

    “I guess it also goes without saying that it’s also important to constantly build backlinks,” says Horowitz. “Like many other content sites out there, we are constantly scraped on a regular basis. A lot of other sites out there syndicate our RSS feeds. It is entirely possible/plausible that Google’s Panda algorithm [appropriately] hit all of the low quality sites that were just syndicating and linking back to us (with no unique content of their own), ultimately discrediting half of the sites in our backlink portfolio, killing our traffic indirectly. Therefore, it isn’t that we got flagged by Panda’s algorithm, but rather that we just need to work on building up more backlinks.”

    According to Internet Retailer, Lieberman fired the firm he was using to get inbound links before, and hired a new one. He also hired some new copywriters to write original product descriptions aimed at being “friendly to search engines.” Enright writes:

    For example, a bar stool that previously used a manufacturer-supplied bullet list of details as its product description now has a five-sentence description that details how it can complement a bar set-up, links to bar accessories and sets the tone by mentioning alcoholic beverages, all of which makes it more SEO-friendly, Lieberman says. “We decided to change it all up,” he says. “What we’re seeing now is what is good for customers and what they see on the site is also good for Google.”

    OneWayFurniture.com is also slimming down content that causes pages to load more slowly because this also affects how Google interprets the quality of a web page. “We’re focused on the basics, the structure of the site and on doing things that are not going to affect us negatively,” Lieberman says.

    More Things You Can Do to Recover from Panda

    In addition to the things discussed by Horowitz and Lieberman, there are plenty of other things to consider in your own SEO strategy that might just help you bounce back if you were negatively impacted by the Panda update.

    First off, simply check up on your basic SEO practices. Just because you got hit by the Panda update doesn’t mean there aren’t other totally unrelated things you could be doing much better. Remember – over 200 signals. They’re not all Panda-related.

    You should also keep up to date on future changes. Read Google’s webmaster blog and its new search blog. Follow Google’s search team on Twitter. Read the search blogs. Frequent the forums. Google makes changes every day. Stay in the loop. Something that has worked for years might suddenly stop working one day, and it might not get the kind of attention a major update like Panda gets.

    Panda doesn’t like thin content, so bulk it up. Dr. Peter J. Meyers, President of User Effect, lays out seven types of “thin” content and discusses how to fatten them up here.

    Some have simply been relying more heavily on professional SEO tools and services. SEOmoz founder Rand Fishkin said in a recent interview with GeekWire, “I can’t be sure about correlation-causation, but it seems like that’s [Panda] actually been a positive thing for us. The more Google talks about their ranking algorithm, how it changes how people have to keep up, the more people go and look for SEO information, and lots of times find us, which is a good thing.”

    You may need to increase your SEO budget. Like search strategist Jason Acidre says on Blogging Google at Technorati, “This just shows how imperative it is to treat SEO as a long-term and ongoing business investment, seeing that Google’s search algorithm is constantly improving its capability to return high-quality websites to be displayed as results to their users worldwide. As the biggest search engine in the world is requiring more quality content and natural web popularity from each website who desires to be on the top of their search results, it would certainly require quality-driven campaigns and massive fixes on their websites, which of course will necessitate them to upsize their budgets to acquire help from topnotch SEO professionals.”

    “Authority websites that were affected by this recent Google update are losing money by the day,” he adds. “They are in need of high quality service providers who can actually meet their needs, and in order to get the kind of quality that can be seen genuinely useful by both users and search engines, they’ll probably need to make a much expensive investment on content management and link development, as this campaign would require massive work and hours to really materialize.”

    Set up alerts for SEO elements of your site, so you’re constantly up to speed on just what’s going on. Arpana Tiwari, the Sr. SEO Manager of Become Inc. has some interesting ideas about this.

    We all know that Google loves local these days. Local content even appeared to benefit from the Panda update to some extent. If you have anything of value to offer in terms of local-based content, it might not be a bad idea to consider it. Obviously quality is still a major factor. The content must have value.

    Then of course there’s Google’s own “guidance”. Don’t forget the 23 questions Google laid out as “questions that one could use to assess the ‘quality’ of a page or an article”.

    The silver lining here for Panda victims is that there is hope of recovering search visibility from Google. Nobody said it is going to be easy, and for a lot of the victims, it’s going to be harder than others. Let’s not discount the fact that many of the victims were victimized for a reason. Google’s goal is to improve quality, and much of what was negatively impacted was indeed very lackluster in that department.

    Serious businesses will continue to play by Google’s rules, because today, Google is still the top traffic source on the web. It’s simply a vital part of Internet marketing, and the Internet itself is a much more significant part of the marketing landscape than it has ever been before.

    Impacted by Panda? What are some things you’ve done to aid your recovery? Share in the comments.

  • Fame Trumps SEO in Battle of David Leonhardt Rankings

    All those of you with common first and last names like John Smith or Jessica Jones or Bob Johnson will appreciate how hard it is to rank for your personal brand – your name. There must be hundreds of people active on the Internet who share your name.

    And any reader with a name like Drew Barrymore or Larry Page… well, you know the chances you’ll ever rank well for your name.

    But perhaps the worst off are those with common first and last names who also share their name with a huge celebrity. Think Dan Brown or George Harrison or Megan Fox.

    David Leonhardt Posers

    Well, this is a personal story. If you search “David Leonhardt” right now, you will see there are three of us with the exact same name with a presence on the Internet. (Guess who the two imposters are.)

    When I first started on the Internet, the guy with the domain name ranked #1 – DavidLeonhardt.com ranked at the top for “David Leonhardt”. In fact, the David Leonhardt Jazz Group held several top-10 rankings, as he was in fact the original David Leonhardt active on the Internet.

    As I grew increasingly active, some pages related to me started to rank in Google’s top ten for my name. Yay!

    But another dude who writes for the New York Times was also getting active, so he also was breaking into the top 10 in a big way.

    This New York Times David Leonhardt was in fact causing problems for me offline, too. A friend saw his byline in the Toronto Star (I think it was), and since the topic was even related to my happiness book, he thought it was my article.

    Even worse, my brother saw one of his articles in the Globe and Mail (I think it was) and again the topic was related to my happiness book. This time my brother thought it was my article.

    And just over a month ago, this New York Times guy who shares my name (never asked my permission, mind you) goes and wins himself the Pulitzer Prize for “Commentary”. Thanks a lot!

    As anyone who reads this blog knows, I am all over on the Internet, commenting on blogs, active in social media, building links, networking – you don’t get more active than me.

    And the winner is…

    So let’s take a look at what Google thinks of all of us David Leonhardts. This is a snapshot at the time of writing…

    1. New York Times writer
    2. New York Times writer
    3. New York Times writer
    4. New York Times writer
    5. New York Times writer
    6. Me
    7. Me
    8. Jazz Group
    9. Me
    10. New York Times writer

    What can we conclude from this case study?

    We know that the domain name is important, as is anchor text – and surely the David Leonhardt Jazz Group has plenty of inbound links with “David Leonhardt” in the link text. (I did not check, but I do know he owns a number of other name-related domains specifically for wedding performances, etc.)

    We also know that activity, inbound links, social media signals – all the stuff that I am doing just naturally every day (with a bit of SEO-savvy thrown in) are also important.

    But it appears fame trumps SEO. New York Times David has six out of ten positions, including the top five. I am holding my own, sort of, perhaps down just a bit from my peak a couple years ago (I think I had as many as five spots at one point, including the third place ranking). And the once dominant Jazz Group David risks being pushed off the top 10 completely.

    The lesson: If you want top rankings, get famous. Do things that win you real acclaim out in the real world, and Google will reward you on the Internet for your renown.

    Originally published at David Leonhardt’s SEO and Social Media Marketing

  • Reasons Google Might Skip Your Canonical Tag

    This week, Google’s Matt Cutts has been discussing rel=canonical, providing some info that webmasters might find pretty helpful. A user submitted a question to Matt, which said, “It takes longer for Google to find the rel=canonical pages but 301 redirects seem to lose impact (link juice) over time. Is there similar churn with rel=canonical?”

    He addressed this in the above video. Cutts’ response was that some people ask how much PageRank/link juice they lose if they use a 301 redirect, and that they lose just a “tiny, little bit” or “not very much at all”.

    “If you don’t lose any, then there’d be some temptation for people to use 301 redirects for all the stuff on their site rather than links, since some amount of PageRank always sort of evaporates or disappears whenever you follow a link – people would say, ‘Oh, why use links and not just use 301 redirects for everything?’” he says.

    In regards to 301 redirects vs. rel=canonical, he says that in general he would use 301 redirects if you can, because they’re more widely supported, everyone knows how to follow them, and any new search engine is going to have to handle them. Also, if you can have it work within your own CMS, he says, then the user’s browser gets carried along with the redirect.
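
    For reference, a server-side 301 is often a one-line configuration change. Here is a minimal Apache .htaccess sketch (the URLs are illustrative):

    # Permanently redirect an old URL to its new home
    Redirect 301 /old-page.html http://www.example.com/new-page/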

    Cutts also took to his personal blog to discuss rel=canonical a bit more, and said that Google actually doesn’t use it in all cases. “Okay, I sometimes get a question about whether Google will always use the url from rel=canonical as the preferred url. The answer is that we take rel=canonical urls as a strong hint, but in some cases we won’t use them,” he says.

    This applies to cases where Google thinks you’re “shooting yourself in the foot by accident,” like pointing it to a non-existent/404 page, or if they think your site has been hacked and the hacker added a malicious rel=canonical.

    Google will also not use rel=canonical if it is in the HTML body or if it sees “weird stuff” in the HEAD section of the HTML. “For example, if you start to insert regular text or other tags that we normally only see in the BODY of HTML into the HEAD of a document, we may assume that someone just forgot to close the HEAD section,” he says, suggesting that you make rel=canonical one of the first things (if not THE first thing) in your HEAD section.
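
    Following that advice, the top of a page’s HEAD might look something like this (a sketch; the URL is illustrative):

    <head>
      <!-- rel=canonical placed first, before anything that could be
           mistaken for BODY content -->
      <link rel="canonical" href="http://www.example.com/webmaster-tips/">
      <title>Webmaster Tips</title>
    </head>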

    Here’s what Cutts had to say about the canonical tag when it was announced and WebProNews interviewed him about it a couple years ago:

  • Why Time is a Big Factor in Big-Time SEO Success

    To know the value of one year – ask the student who failed their final.
    To know the value of one month – ask the mother of a premature baby.
    To know the value of one week – ask the editor of a weekly magazine.
    To know the value of one day – ask the wage earner with six children.
    To know the value of one hour – ask the lovers who are waiting to meet.
    To know the value of one minute – ask the person who missed the plane.
    To know the value of one second – ask the person who survived the accident.
    To know the value of one millisecond – ask the Olympic silver medalist.
    –John Maxwell

    In SEO and SEM, time management is critical. Almost anybody in the industry will tell you that you can spend countless hours “tweaking” a website, looking at traffic analysis and conversion stats, and employing link building campaigns. These are all essential parts of a good SEO service, but at the same time, some limits have to be placed on the amount of time you will spend on these activities for any single client.

    Newer clients, or those that have a lot of problems, need to have more time dedicated to each of the activities above. Yet, the SEO has to maintain a workable time budget in order to prevent profits from circling the drain.

    We often find ourselves needing more time in the day to get things done. I know I’ve wished for more. I honestly don’t know how people much busier than me do it. I have the same 24 hours to use each day as Trump or Obama have.

    When I go home after a full day of what feels like non-stop rushing to manage one client after another, I often think about how these guys must feel. They have much more responsibility than I, but still the same number of hours in which to get stuff done, and always seem to find time for golf!

    If I could have one wish, it would be to have more hours in the day and to require less sleep each night. OK, that’s two wishes, but I’d settle for either one of those (preferably the latter.)

    Value Add or Cost Add?

    When a client asks, “What more can we do to stay ahead of our competitors?”, one of the first things I do is look at their current contract. If the plan they have has any areas of weakness, I’ll let them know what more can be done to reach their goals. Inevitably, it is based on their willingness to invest in the additional time and resources required for a more aggressive campaign.

    Unfortunately, that usually ends the conversation for some clients.

    When they ask, “What do we have to do?”, what they really mean is, “What more will you do?” I’m always willing to do more, but there is that pesky issue of our time and whether we’re willing to work for free or not. Usually not.

    I never mind providing a value-added service every now and then. Sometimes we’ll do less of one thing so we can do more of another. But, eventually there comes a point of diminished profits, unless the client is willing to step up and pay for what they want us to help them achieve.

    Accurately Budgeting Time for Process and Results

    Whenever I put together a proposal for a prospective client, my goal is to estimate the number of hours that will be needed over the contract’s duration. This includes one-time only tasks, monthly tasks, and yearly tasks. All of that gets thrown in to create an estimated number of hours that we then use to figure a monthly pricing level.

    That becomes our benchmark, and we use it with the knowledge that clients will occasionally need more time spent each month on a task (especially in the early months), and less time in other months.

    Trying to accurately predict the number of hours needed over the next 12 months can be daunting. I have to look beyond time spent on research and implementation. Both ongoing consulting and client communications factor in a great deal, as does analysis. Most clients don’t realize that every call or email requesting a status update is time that is taken away from research, analysis and implementation.

    Every SEO must determine how much consulting time will be factored into the campaign cost. Ultimately, the client wants, and needs, to feel taken care of. Failure to factor consulting and management into pricing will reduce campaign performance, or create a client that feels out of the loop. Both can be hazardous to client satisfaction.

    Time management, regardless of your field, becomes one of the most important aspects of your professional and personal life. It affects what you can do and what your client feels you should do. Those that don’t manage their time wisely are doomed to fail.

    If the SEO wants to be successful – and if the client wants the SEO to be successful – then both must consider the time involvement in any new task or request being made. These things add up and eventually, if left unchecked, can tip the scales, bringing both the SEO and the client into unprofitable territory. This is a lose/lose scenario. But, if both manage time expectations and costs, the SEO and the client can be in a win/win situation that will bring big-time success.

    Originally published on E-Marketing Performance

  • SERP Alert: Google Social Search Goes Global

    Google announced via its new official Search Blog that it is rolling out Social Search around the globe. This comes just days after Bing upped the ante in the social search game by integrating Facebook data in much more elaborate ways. Google’s social search, however, may prove useful in some cases, but you may see more content from strangers than you do from your real friends.

    Does Google’s Social Search make results less relevant? Comment here.

    Google has been doing social search since 2009, and earlier this year it was updated to be more useful, with social results appearing throughout the SERP, as opposed to just in a cluster at the bottom. Google says they’re mixed in based on relevance.

    “For example, if you’re looking for information about low-light photography and your friend Marcin has written a blog post about it, that post may show up higher in your results with a clear annotation and picture of Marcin,” says Google software engineer Yohann Coppel.

    “Social Search can help you find pages your friends have created, and it can also help you find links your contacts have shared on Twitter and other sites. If someone you’re connected to has publicly shared a link, we may show that link in your results with a clear annotation,” says Coppel. “So, if you’re looking for information about modern cooking and your colleague Adam shared a link about Modernist Cuisine, you’ll see an annotation and picture of Adam under the result. That way when you see Adam in the office, you’ll know he might be a good person to ask about his favorite modern cooking techniques.”

    How Google Determines What to Show In Social Search Results

    First of all, users must be logged into Google to get the benefits of social search. “If you’re signed in, Google makes a best guess about whose public content you may want to see in your results, including people from your Google Chat buddy list, your Google Contacts, the people you’re following in Google Reader and Buzz, and the networks you’ve linked from your Google profile or Google Account. For public networks like Twitter, Google finds your friends and sees who they’re publicly connected to as well,” explains Coppel.

    Google deserves credit for giving users a great deal of control over whose content is used here, though it could still go further. You can go to your Google Dashboard, find the Social Circle and Content section, and edit accordingly. If you click the “view social circle” link, you can see every single person, listed by:

    • Direct connections from your Google Chat buddies and contacts. It even shows you which of these people have content and which don’t. For the ones that do, it shows you which sites they have content on.

      One important thing to note: it actually does include Facebook Page content. For example, I’m connected to Danny Sullivan in my social circle, and Google will show me updates from his Facebook page, as he has it linked to his Google Profile. What’s missing, however, is your personal Facebook network of friends (which, in my opinion, is the most valuable social data currently on the web, if you’re a Facebook user).

    • Direct connections from links through Google Profiles or Connected Accounts. “For example, if you listed your Twitter account on your profile or if your Twitter posts appear in your public Buzz stream, then relevant content from people you follow on Twitter will show up in your search results,” Google explains in that section. “You can change these relationships by visiting the corresponding services and adding or removing connections.”
    • Secondary connections that are publicly associated with your direct connections. In other words – friends of friends (at least public friends of friends). There is a little less control here, unfortunately. You can’t remove these people from your social circle unless you remove the friend that’s connecting you to them.

      To me, this actually seems like a step backwards in relevancy of social search. You’re probably a lot less likely to care about what someone thinks just because they know someone you know, than you are if you actually know them. A lot of people don’t even care about what the people they actually do know think.

      Naturally, this is the biggest list and potential source of material for Google to draw from, making it more likely that you see results from people you don’t know than people you do.

    A cool thing about the entire list is that you can click “show paths” next to any name that has content, and it will show you exactly how you’re connected. You can be linked to someone via Twitter, and if that person links their Twitter account to their Quora account, you might see their Quora content too. If that Quora account links to their Facebook account, you might see stuff from their Facebook account as well, provided you have permission to see that content (that is, if it’s set to public or you’re Facebook friends).

    Where are my friends?

    I notice one gaping hole in Google’s social search strategy besides the lack of comprehensive Facebook integration (though it’s certainly connected to that): the absence of most of my actual closest friends. I can only assume that many users have a similar issue.

    That’s exactly why Bing’s Facebook integration is a very important factor in its competition with Google. Bing, unlike Google, does tap into your actual Facebook friends for search relevancy (though there is plenty of room for improvement on Bing’s part as well). The Wajam browser extension is still currently a better solution to the problem, if you ask me. It will add your Facebook and Twitter friends to your results on both Google and Bing.

    It is also for this reason (at least partially) that Google is competing more directly with Facebook now in social. Google wants users to develop, on Google’s own network, the kinds of relationships among friends that people currently have on Facebook. That network runs throughout various products, but ultimately centers on the Google account, which sits at the center of nearly everything – Gmail, YouTube, Buzz, Docs, Chrome OS, and so on.

    As long as Google and Facebook aren’t going to play nice together, Google needs to succeed in social to have the best search relevancy in the social part of search. And that part of search is clearly becoming more and more important. That’s simply one competitive advantage Bing has over Google right now. It’s also why Facebook itself is a threat to Google search in some ways.

    It will be very interesting to see how far Google takes social search over time. We know Google is currently working on increasing its presence as a force in social, and the upcoming +1 button should play a significant part in that. As search gets more social, however, it presents new challenges for search engine optimization, and perhaps less significance on algorithm updates (like Panda) from the webmaster point of view.

    Social can not only be a signal of relevance on a personalized level, but if content is shared a lot, it can also be seen as a signal of quality, because people don’t share content that sucks, unless they’re doing it as a joke or using it as an example of what not to do (like I said, it’s just a “signal”). This is nothing new, but it shows the importance of diversifying your traffic sources.

    If you rely heavily on search, as many of the big victims of the Panda update have, you will always be at the mercy of the search engines. If you can find ways to get more love from social networks and links from others, it’s bound to help you in search as well.

    Is Google’s social search helpful or does it miss the mark? Tell us what you think.

  • J.C. Penney Sees Some Google Visibility Recovery After Paid Link Scandal

    J.C. Penney Sees Some Google Visibility Recovery After Paid Link Scandal

    Earlier this year, J.C. Penney was caught gaming Google. A New York Times article exposed that the company had been benefiting enormously from excessive paid links, which is obviously against Google’s rules. The retailer had ranked number one or close to it for some very prominent search queries like “skinny jeans,” “home decor,” “comforter sets,” “furniture,” “tablecloths,” etc.

    As the news came out, Google took action to penalize the site for its practices. A J.C. Penney spokesperson had stated, “J. C. Penney did not authorize, and we were not involved with or aware of, the posting of the links that you sent to us, as it is against our natural search policies.”

    SearchMetrics has uncovered data which shows that J.C. Penney has seen its search visibility rise again. “SearchMetrics’ Organic Performance Index recorded a dramatic drop in visibility at the time for the site, but now it sees that there has been a significant increase in visibility,” a representative for SearchMetrics tells WebProNews.

    J.C. Penney gets a second chance http://ow.ly/4XjmT

    In a post on the SearchMetrics blog, the company shows a couple of graphs: one for general organic performance, and one for a specific keyword, “jewelry”:

    J.C. Penney SearchMetrics data (two graphs: overall organic performance and the keyword “jewelry”)

    “What happened? We cannot see a massive reduction/change in their link structure – this also would be way too fast and require more time,” SearchMetrics says on the blog. “So it might be that the people at J.C.Penney have managed to convince Google that they really had no clue about what was going on at their agency – or the algorithm is giving them another chance. We have observed this happening for algorithm penalties many times before: after a couple of weeks or months the penalty has been taken back – at least partially. What’s surprising in this case is that the reinstatement did happen for what clearly was a manual adjustment.”

    SearchMetrics, which also shared a lot of widely-publicized data about the Panda update and its victims, is careful to note that none of this is actually Panda-related.

    J.C. Penney isn’t the only site recently penalized by Google to bounce back. In February, Google penalized Overstock.com after the site’s pages had ranked near the top of results for dozens of common searches. The site had been encouraging college and university sites to post links to Overstock pages for discounts, though by the time the penalty hit, the site had already discontinued the program.

    Late last month, Overstock announced (even putting out a press release) that it was no longer being penalized by Google. Overstock’s CEO Patrick Byrne was quoted as saying, “Google has made clear they believe these links should not factor into their search algorithm. We understand Google’s position and have made the appropriate changes to remain within Google’s guidelines.”

    Google, of course, would not comment on any specific site, so why should that be any different with J.C. Penney? Granted, this policy seems to only apply in certain cases, as Google’s Matt Cutts did tweet about J.C. Penney after the New York Times story came out, saying, “I really wish that our algorithms or other processes had caught this much faster – I’m definitely not celebrating.”

  • Despite New Panda Guidelines, Google Still Burying Authoritative Results

    Despite New Panda Guidelines, Google Still Burying Authoritative Results

    There are a lot of elements of Google’s Panda update to discuss, and we’ve certainly discussed many of them over the last few months, but let’s not lose sight of the reason the update was launched to begin with – to improve search quality.

    Do you think Google’s search results are better now? Tell us what you think.

    While quality is often in the eye of the beholder, there are certain kinds of queries where the information being retrieved is simply more important than others. We’ve talked about this before, as it’s been a problem in some Google results.

    One example we’ve looked at a few times is where an eHow article written by a freelance writer with no clear authority on cancer (and whose body of work includes a lot of plumbing-related articles) was ranking at the top of Google’s results for the query “level 4 brain cancer” above numerous other sources that would seem to be of greater authority on such a subject.

    Level 4 Brain Cancer in Google

    In fact, the article did get bumped down after the Panda update, but it does still rank number 2, followed by another result from eHow. Granted, this is just one example, and Demand Media has efforts in motion to improve its own content quality, but you get the point.

    Queries related to things like health or law demand authoritative advice. Not SEO’d content.

    We had a conversation with Mark Britton, founder and CEO of Avvo, about this subject. Avvo is a site that offers Q&A forums where consumers can ask medical or legal questions and get responses from qualified doctors and lawyers. It provides apparently authoritative content in these two areas from certified professionals.

    This seems like the kind of content that should be ranking well for a lot of these types of queries. Does it not? Britton thinks it’s “very important” for commentary from experts in the medical and legal fields to surface high in search results for relevant topics.

    “There is a lot of noise both online and offline regarding health and legal issues,” he tells us. “This comes in the form of lay people, professional commentators and even celebrities who often offer advice that is well-intentioned but inherently inferior to that of a doctor or lawyer trained in the area. However, it is not always easy to get doctors and lawyers to speak. Some still look down on the Internet as a publishing or marketing vehicle. Others just downright fear it, as they have seen too many movies where someone says something on the Internet and they are subsequently hunted and killed by terrorist hackers.”

    “There is always room for improvement — especially with our newer pages,” he says of Avvo’s own search rankings. “We just launched our doctor ratings directory and our free medical question and answer forum in November, and it will take some time for those pages to rank as well as our legally related pages.”

    Look at the results for a query like “Does type 2 diabetes shorten life expectancy?” Avvo’s page on the subject ranks on the second page, while eHow ranks at the top of the first. The Avvo result has actually fallen since I began writing this article. It used to be right below the number one result from eHow and the number 2 from Yahoo Answers.

    Diabetes Results in Google

    eHow’s is an article (not very long by any means) by a guy whose bio says he “has been a freelance writer since 2007. He writes extensively in the fitness, mental health and travel sectors and his work has appeared in a range of print and online publications including Scazu Fitness and USAToday Travel Tips…[and] holds a Master of Arts in community psychology.”

    Keep in mind that USA Today has a deal with Demand Media for travel tips. So that presumably means his Demand Media content is simply published by USA Today. Does “Master of Arts in community psychology” indicate more authority to answer a life/death question about type 2 diabetes than say a licensed and practicing MD? That’s who provided an answer on Avvo’s page, which just got pushed further down in the search results.

    If you change the query to something simpler like “type 2 diabetes life expectancy” eHow still ranks close to the top, and Avvo’s result slips to….get ready for it….page 18! That’s with various articles from places like eHow, EzineArticles and Suite101 (all victims of the Panda update) ranking ahead of it. Now, I’m not saying that Avvo’s result is necessarily the one ultimate result for this query and should necessarily be the highest ranked, but come on. Interestingly enough, the result was on page 3 for this query when I started writing the article (yesterday) and it’s slipped that much further into obscurity just since then. I wonder where it will be in another day.

    Google has given publishers a list of questions to ask themselves about their content, as guidelines the company goes by as it writes its algorithms. The very top one is “Would you trust the information presented in this article?”

    While neither of the articles provide any helpful links to sources of information, the Avvo article comes from a medical doctor. I think most people would find that slightly more trustworthy, even if the article isn’t as long or as well SEO’d. Here’s the eHow article. Here’s the Avvo one.

    The second question on Google’s list is, “Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?”

    While Google makes it clear that these questions aren’t actual ranking signals, they must be used to determine the signals at least, and you have to wonder just how much weight authority on a topic carries.

    Britton maintains that ALL of the site’s advice comes from qualified professionals, claiming that this is one of the site’s “greatest differentiators.”

    “We CERTIFY every doctor and lawyer offering free advice on the site in two principal ways: First, we verify with the state licensing authorities that the answering doctors or lawyers are licensed and in good standing,” he explains. “Second, we rate the professionals from 1 (“Extreme Caution”) to 10 (“Superb”), which was unheard of prior to Avvo’s entry into the professional ratings arena. We are big believers that not every doctor or lawyer is ‘Super’ or ‘Best’ which was the steady-state in professional ratings for decades.”

    “This was really just an extension of the Yellow Pages model, where the ‘recommended’ professional is the one paying the most money to advertise,” he continues. “But consumers are getting wise and demanding greater transparency regarding the qualifications of their doctors and lawyers.”

    “We have three ratings that speak to the expertise of our contributors: The Avvo Rating, client/patient ratings and peer endorsements,” says Britton. “For the Avvo Rating, we start with the state licensing authorities and collect all the information we can regarding a professional. We then load that information into our proprietary web crawler, which we call ‘Hoover.’ Hoover goes out and finds all the additional information it can regarding the professional. We match the licensing data with the Hoover data and then we score it. The scoring is based on those indicators of the professional’s reputation, experience and quality of work.”

    Britton says Avvo was not really affected by Google’s Panda update. “We saw a small dip, but things came back fairly quickly.”

    “While I understand the intent of Google’s latest update, I’m not sure they entirely hit their mark,” he says. “We noticed a number of pure lead-generation sites – i.e., sites that are selling leads to the highest bidder — jump ahead of us in certain key terms, which is not good for consumers.”

    Avvo encourages people to ask questions on the site, claiming its Q&A boasts a 97% response rate.

    Avvo asked us to let readers know that in support of Skin Awareness Month, it is donating $5 to the Melanoma Research Foundation for every doctor review during the month of May.

    Should authority and certification of expertise carry greater weight in Google’s search rankings? Comment here.

  • SEO Isn’t A Fairy Tale

    SEO Isn’t A Fairy Tale

    There are many reasons companies invest in Search Engine Optimization ranging from a desire to attract new customers through online marketing channels to diversifying customer acquisition to ego.  That’s right, ego. Not every marketer makes SEO investment decisions based on pulling in prospects and customers to brand content for engagement and conversions.

    Oftentimes, brands think of themselves as the leader in their category and therefore think their website should top Google’s list for queries on generic industry terms. The trouble is, leading an industry offline isn’t the same thing as being the BEST answer for a search query online. Chasing after such terms is very much ego-driven and not unlike chasing unicorns in a fairy tale, with the expectation that being #1 for a single word will magically solve the company’s problems.

    However, going after broad industry terms isn’t a complete waste of time. When ego-driven SEO is productive, it’s geared towards building brand reputation and PR value. Of course, by “PR” I mean public relations, not page rank.  The affinity and credibility that comes from being in a top position for a generic industry term can add a lot of value to online public relations efforts, recruiting and investor relations.

    Achieving top placement on broad keywords can certainly drive a substantial amount of website traffic. In fact, TopRank Marketing has quite a few clients with top spots for generic industry phrases, and some with single-word terms, sending a good portion of their organic search visitors.

    In terms of the buying cycle, broad queries tend to attract “tire kickers” and have value for creating awareness and education, but not conversions. And that’s OK, because the search experience isn’t just a single event – especially in B2B or with more sophisticated buying decisions. But brands that want those top spots need to understand what it takes to translate their offline industry dominance to search engines like Google and Bing.

    A while back I had a client who said he wanted to be #1 on Google for the word “brain”. This client had a blog with a few thousand uniques per month. While many SEO consultants will talk about how tough that will be and suggest options, my first response is always to ask “Why?”. Understanding motivation (chasing unicorns vs. a fighting chance at achieving goals) is essential for assessing the value and contribution to business goals.

    The client wanted top visibility for “brain” because it was a fairly relevant and highly popular search term. Top placement for such a word would send a significant amount of traffic and, hopefully, sales. A few things to consider in such a situation include the following (a rough scoring sketch follows the list):

    • What is the potential contribution to website goals in what timeframe for a first page or top of fold position for the phrase?
    • What resources in what timeframe might it take to achieve this goal?
    • What are the current brand content and digital assets available to work with?
    • What is the current inbound link profile for the brand site?
    • What is the current position for brand content on the desired keyword(s)?
    • How many search results pages (SERPs) are there for the keyword(s)?
    • How many of those SERPs contain the exact match keyword(s) in title tags, on-page titles, in URLs?
    • How many inbound links are there to the top ranking pages for the target keyword(s)?
    • How many inbound links contain the exact match keyword(s)?
    • What is the distribution of website types as link sources? (news, blogs, web pages, .edu, .gov, etc)
    • How often are the top webpage URLs mentioned in Tweets, FB updates and other social streams?
    • What is the link acquisition growth over time for the current top pages for the target keyword(s)?
    • How many pages on the current websites showing well for the target keyword(s) are specifically optimized for those terms?
    • How old are the sites currently showing well for the target keyword(s)?
    • How much content is dedicated to the target keyword(s) on and offsite for top pages?
    • What is the difference on key metrics like quantity/quality of optimized pages, inbound links and social mentions of brand content vs. pages that occupy the top 5-10 positions for the target keyword(s)?
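
    As a rough illustration of how answers to those questions can roll up into a go/no-go number, here is a hypothetical scoring sketch. The metric names, weights, and values are all assumptions made for illustration; this is not a real difficulty formula.

    ```python
    # Hypothetical keyword-difficulty sketch: roll a few competitive
    # metrics into one score. Weights and inputs are illustrative only.

    def difficulty_score(metrics: dict) -> float:
        """Return 0-100; higher means harder to rank for."""
        weights = {
            "competing_pages_millions": 2.0,  # size of the SERP for the keyword
            "exact_match_title_ratio": 30.0,  # share of top results with the keyword in the title
            "avg_links_to_top_pages": 0.001,  # inbound links to current top pages
            "avg_domain_age_years": 1.5,      # age of the top-ranking sites
            "social_mentions_per_day": 0.05,  # tweets/shares of the top URLs
        }
        raw = sum(weights[k] * metrics.get(k, 0) for k in weights)
        return min(raw, 100.0)

    # Illustrative numbers for a broad head term vs. a long-tail phrase.
    brain = {"competing_pages_millions": 900, "exact_match_title_ratio": 0.9,
             "avg_links_to_top_pages": 50_000, "avg_domain_age_years": 12,
             "social_mentions_per_day": 400}
    long_tail = {"competing_pages_millions": 2, "exact_match_title_ratio": 0.4,
                 "avg_links_to_top_pages": 800, "avg_domain_age_years": 4,
                 "social_mentions_per_day": 5}

    print(difficulty_score(brain), difficulty_score(long_tail))  # 100.0 vs. ~23.1
    ```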

    A competitive assessment plus a forecast of resources, timeframe and business impact can paint a clearer picture for brands that want to chase after “unicorn” keywords and SEO.  When budget is not an issue at all, then by all means, satisfy basic business case requirements and go for it. But unlimited budget is rarely the situation.  Most SEO programs operate within a scope of work and resources must be allocated according to the SEO strategy.

    In the case of the “brain” client, a presentation of the numerous hospital, university and government websites, plus the sites with thousands of pages and many years’ head start on link building, led to the conclusion that going after “brain” would be a losing proposition, especially within the scope of available hours. The decision was made to go after a mix of keyword phrases representative of the interests potential customers might have in the client’s offering. It is better to pursue keyword phrases that are achievable within a shorter time frame and that produce business outcomes like sales than to allocate a substantial portion of the program to a keyword that might take a year or more to reach a first-page placement. This client’s blog has now achieved upwards of 350,000 uniques per month by focusing on long-tail phrases, and it opened up a new advertising-based business model.

    Does this mean going after all broad industry keyword terms is chasing keyword unicorns? No. Go after the broad phrases or word(s) if:

    • There are substantial resources for content creation (creativity and diversity), link building, online PR, social media and networking and reverse link engineering.
    • The brand site is nearly the online leader in content and links for the desired keyword(s) and simply needs SEO refinement, targeted link building and internal process adjustments.
    • The acquisition of top placement for the broad phrases is forecast within a reasonable time period and with a desirable outcome in comparison to resources and budget necessary.

    Companies that expect to drive customer acquisition and ongoing engagement through search should be focusing on customer-centric keywords anyway, not on ego phrases that give them a warm fuzzy with little chance of returning business value. In our experience, keywords that represent consideration and purchase behaviors in the buying cycle are achievable more quickly. The interesting thing is that, over time, broad phrase visibility can still occur.

    The fork in the eye of my logic is when a senior executive at the brand simply wants the unicorn, period. They want that trophy, and the internal marketer or SEO vendor is charged with finding a way to make it happen. If budget and resources allow for success – great. If not and logic fails, there’s not much more you can do.

    What’s your decision process for going after broad or single terms in a keyword mix? Do you dismiss in favor of long tail? Do you see it as a challenge and go after it anyway? Do you evaluate on the criteria I’ve listed above? What additional criteria would you include?

    Originally published on Lee Odden’s Online Marketing Blog


  • Does the Number of Ads on Your Website Affect Your Linkability?

    Does the number of ads on your website affect its linkability? IMHO, yes it does. I do most of my reading on my iPad, using either Ziteapp or Instapaper. Both of these tools do a good job of stripping the content down to its most essential elements and displaying it in a readable format. Unfortunately for publishers, this includes stripping out the advertising (see Advertising and Usability).

    Often, I will come across something I enjoy reading or am inspired by and want to link to, so I will email the link to myself. When I get to the full page version, I am often shocked. Recently this happened on a post called How To Write A Killer Article in 30 Minutes. Compare the stripped-down version in Instapaper on the left to the ad-saturated version on the right:


    IMHO, the number of advertisements on the website might be negatively affecting the number of links the post gets. Now this may seem a bit hypocritical because anyone who views my site will see a similar number of ads in the sidebar and integrated into the content. That said, I think the SEO space in general is a lot more tolerant of advertising than many other vertical markets, but I have made some conscious decisions about implementation.

    Post Age – Ads don’t appear on posts when they are published. They only appear on posts that are more than 7 days old.
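
    As a minimal sketch of that age-gating rule (the Post type and show_ads() helper are hypothetical, not the actual implementation):

    ```python
    # Sketch of age-gated ad display: no ads on posts younger than 7 days.
    # The Post type and show_ads() helper are hypothetical illustrations.
    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    AD_FREE_WINDOW = timedelta(days=7)

    @dataclass
    class Post:
        title: str
        published: datetime

    def show_ads(post: Post, now: Optional[datetime] = None) -> bool:
        """Ads appear only once a post is more than 7 days old."""
        now = now or datetime.utcnow()
        return (now - post.published) > AD_FREE_WINDOW

    fresh = Post("New post", datetime.utcnow())
    old = Post("Archive post", datetime.utcnow() - timedelta(days=30))
    print(show_ads(fresh), show_ads(old))  # False True
    ```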

    Ad integration – I can’t tell you how many websites I visit where the first thing on the page–sometimes even before the post title–is an advertising banner or an AdSense block. I made a very conscious decision to show the title, picture, and byline before showing any ads. IMHO, nothing screams MFA like a block of ads before content (yes, images are content–see my posts on image optimization for more information). I operate many websites outside the SEO space and have avoided linking to, tweeting about, sharing on Facebook, and social media bookmarking many related sites because of their overly aggressive advertising implementation.

    Finding a Balance – I think it’s important that sites monetize themselves (see Adsense: Why Bloggers Don’t Get it). I also think it’s important to integrate ads and thank your advertisers (see Blog Advertising is Broken). It’s something I do every month. However, I think that having too many ads can work against you. The extra pennies you make don’t offset the links and social signals you are giving up. Google recently filed a patent about ad detection … just sayin’ …

    Dealing with Ad Blockers – Ad blocking plugins and integrated browsing/reading technology like Instapaper and Readability are on the rise. In fact, Apple will be including the technology in an upcoming browser version. Publishers need to find ways to display ads in a format that allows them to remain financially viable.

    Here are the takeaways from this post:

    • Look at your site from a user’s perspective. Does your site have so many ads it turns readers off?
    • Look at your pages from a long-term linkability angle. Is your ad placement too aggressive, and is it turning off the linkerati?
    • Try to find a balance that allows you to make money without turning off the linkerati or discouraging social sharing.
    • Look at your website using ad blocking technology. Find a workaround that shows your ads but isn’t offensive.

    Originally published on Graywolf’s SEO Blog

  • Not Every Google Tweak Is Still Panda.

    There’s talk going around of a “Panda 3.0” or a “Panda 2.1” in reference to recent tweaks to the Google algorithm. The fact is that Google makes many adjustments to its algorithm on an ongoing basis, and generally only feels the need to officially comment on the really big ones.

    Don’t expect guidance from the company every time it makes a tweak. It’s actually somewhat surprising they’ve discussed the Panda update as much as they have.

    On May 6, Google Fellow Amit Singhal wrote in a post on the Google Webmaster Central blog, “Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.”

    Search Engine Land Editor-in-Chief Danny Sullivan says, “Google won’t release the percentage of queries impacted but says this is far less than in the other updates. Changes were made in the past few days.”

    In other words, not every tweak Google makes needs a name. Nor should every tweak since Panda be considered part of the Panda update.

    “We’re continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search,” said Singhal. “As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture.”

    Those would be the questions we looked at here.

  • Website Siloing: An SEO Strategy For Optimal Results

    Relevancy in Google’s eyes begins with strong content. Furthermore, that same content needs to be well structured. One of the best ways to boost your organic campaign is to organize your website symmetrically – a practice known as silo structuring.

    A disjointed website will often wreak havoc on, and minimize, your site’s keyword performance. On the other hand, if you tightly theme your content, search engines will index your site better and give you a boost in keyword rankings.

    A clean site structure is clearly one of the most powerful and underrated SEO strategies today.

    Latent Semantic Indexing (LSI)

    Before we delve into silo structures, let’s touch on Latent Semantic Indexing (LSI). The definition of LSI is complex, and the benefits many. As it relates to search engines, it is used as an indexing and retrieval method to identify patterns in the relationships between the terms and concepts contained in an unstructured collection of text.

    How does LSI work? In addition to determining which keywords a web page or document contains, the method examines other web pages and documents for the same words. In essence, LSI attempts to determine relevancy on a grand scale.

    As you can see, LSI plays a direct role in your silo strategy. Your goal is to help your website play nicely within the LSI confines.
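
    For readers who want to see the idea in action, here is a minimal sketch of latent semantic indexing using the textbook LSA recipe (TF-IDF vectors reduced with truncated SVD). The documents and component count are toy values chosen for illustration; search engines operate at a vastly larger scale.

    ```python
    # Toy latent semantic indexing (LSA): TF-IDF vectors reduced with
    # truncated SVD, then compared by cosine similarity.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    docs = [
        "how to play acoustic guitar chords",
        "acoustic guitar lessons for beginners",
        "electric guitar amp settings and tone",
        "classical guitar fingerstyle technique",
    ]

    tfidf = TfidfVectorizer().fit_transform(docs)
    lsi = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

    # Documents about similar concepts land close together even when
    # they don't share every literal keyword.
    print(cosine_similarity(lsi[:1], lsi[1:]))
    ```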

    Silo Structure to the Rescue

    Silos are used to categorize themed content in an effort to help the search engines determine relevancy. The tighter the focus, the more relevant Google will consider each page of your website.

    Figure 1 shows a typical website siloing structure:

    Silo Structuring

    For example, suppose you have a website about guitars. If you offer up relatively dissimilar topics, such as how to play acoustic guitar, how to play electric guitar, and how to play classical guitar, all on the same page, the relevancy will be diluted. Even though your content may be far better and more useful than your competitors’, poor structure will hurt you at some level.

    To aid the LSI process, the webmaster can structure the site in one of two ways: physical or virtual. Both silo types are highly effective for SEO because each creates a working ecosystem for the content.

    Physical Silo Structures

    A physical silo is a means of theming or grouping your website content into like categories. Also referred to as directory silos, physical silos are the easiest to set up and maintain. Think of a directory silo as a filing cabinet: everything in a file must be distinctly associated with the category to remain relevant.

    The directory-style silo structure is the easiest for both search engines and visitors to follow, and in most cases it should be the starting point when designing your site.

    Virtual Silo Structures

    Once a website becomes established, the structure can break down over time. The virtual silo model can enforce the structure once again through a highly targeted internal linking strategy.

    Virtual silos use a drill-down cross-linking structure to enforce distinct categories. In other words, the top landing page of each silo is supported by the pages linking to it. Figure 2 shows how you can prop up content using a virtual silo model:

    Virtual Silo Model

    Internal linking is a major component of virtual silos. Linking should be done between like topics, avoiding unrelated categories as much as possible.

    Obviously, there will be times when you need to link to a different silo to make a point or improve an article. When cross-linking between unlike silos, use the rel=“nofollow” attribute to reinforce the structure.
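
    As a quick illustration of policing that rule, here is a hedged sketch that scans a page’s internal links and flags cross-silo links missing rel="nofollow". The silo paths and sample HTML are invented, continuing the guitar example above.

    ```python
    # Sketch: flag internal links that cross silo boundaries without
    # rel="nofollow". Silo paths and the sample HTML are illustrative.
    from html.parser import HTMLParser

    SILOS = ("/acoustic-guitar/", "/electric-guitar/", "/classical-guitar/")

    def silo_of(path: str):
        return next((s for s in SILOS if path.startswith(s)), None)

    class SiloLinkAudit(HTMLParser):
        def __init__(self, page_path: str):
            super().__init__()
            self.page_silo = silo_of(page_path)
            self.violations = []

        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            attrs = dict(attrs)
            href, rel = attrs.get("href", ""), attrs.get("rel") or ""
            target = silo_of(href)
            if target and target != self.page_silo and "nofollow" not in rel:
                self.violations.append(href)

    audit = SiloLinkAudit("/acoustic-guitar/strumming-basics")
    audit.feed('<a href="/acoustic-guitar/chords">ok</a>'
               '<a href="/electric-guitar/amps">cross-silo</a>')
    print(audit.violations)  # ['/electric-guitar/amps']
    ```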

    For new websites, a physical silo structure should almost always be used, since directory silos are much easier to set up and manage. Established sites, on the other hand, can find value in virtual silos if the physical structure is non-existent or has broken down over time.

    Get on Track using a Silo Structure

    Many websites never reach their potential due to a lackluster theming strategy. You can have a good-looking website with great content, but if your site lacks clarity in terms of topical relevance, your targeted keywords will be discounted, which can devastate even the most well-meaning campaign.

    Siloing your content is a form of on-page SEO that deserves special attention throughout the life of your website.

    The main objective of siloing is to create a website that ranks well for both short and long-tail keywords. Get on track by using sound silo structuring. A symmetrical organization is one sure-fire way to propel your keyword rankings and improve visitor experience.

  • Google Panda Update: New Advice Directly From Google

    Google Panda Update: New Advice Directly From Google

    Google’s Panda update left a slew of victims in the wake of its warpath (the war, of course, being on shallow and low-quality content). While Google has dropped some hints here and there about its philosophies on what it considers low quality, the company has now been clearer than ever about what it’s looking at.

    Do you think Google’s results have improved since the Panda update? Tell us what you think.

    “Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year,” writes Google Fellow Amit Singhal on the Google Webmaster Central blog. “In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.”

    Google lists the following as “questions that one could use to assess the ‘quality’ of a page or an article”:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    The company is careful to note that it’s not disclosing actual ranking signals used in its algorithms, but these questions will help you “step into Google’s mindset.” These questions are things that Google says it asks itself as it writes algorithms.

    Singhal also reminds webmasters, “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low quality pages to a different domain could eventually help the rankings of your higher-quality content.”

    We’ve already seen victims of the update respond by taking this approach. For one, Demand Media announced a big new clean-up initiative, in which it is cleaning house on user-generated content used on its eHow site – deleting some articles, while sending others back through the editorial process.

    I’m sure we will be digging into all of this more very soon.

    Are there any of Google’s questions that surprise you? Let us know in the comments.

    [Image Credit: Stéfan on Flickr]

  • Google Algorithm Update – Is Bounce Rate a Ranking Signal?

    Update: Looks like we have a direct answer now.

    Forget for a moment everything you think you know about Google and how they rank content. Put yourself in the role of a person who is tasked with ranking results. One result gets clicked often, but most of the time the user only stays on the page for a few seconds (if that), returns to the results page, and clicks on another result.

    Meanwhile, another result on the same page gets clicked on a lot too, but when users click on that one, they stay on the page longer, and don’t even return to the results page to find another result to click on. Nor do they refine their query. Which page is most likely the one that has the better content for that particular search?

    Should bounce rate be a ranking signal? Comment here.

    Well, being a human, you have the luxury of looking at both pages and making that call. Now, pretend you’re not a human. You’re a computer algorithm tasked with ranking the world’s information for the majority of searchers. While you have over 200 signals that can help you determine which one should rank higher, wouldn’t this be one that could help?

    This is not exactly bounce rate, but it’s related. In this case, it is the bounce back to the SERP, and while there has been a lot of discussion and argument about whether Google uses actual bounce rate as a signal, it seems pretty likely that they are looking at this specific element of it.

    SearchMetrics, after releasing data about the Panda winners and losers in the UK, said, “It seems that all the loser sites are sites with a high bounce rate and a less time on site ratio. Price comparison sites are nothing more than a search engine for products. If you click on a product you ‘bounce’ to the merchant. So if you come from Google to ciao.co.uk listing page, then you click on an interesting product with a good price and you leave the page. On Voucher sites it is the same. And on content farms like ehow you read the article and mostly bounce back to Google or you click Adsense.”

    “And on the winners are more trusted sources where users browse and look for more information,” the firm added. “Where the time on site is high and the page impressions per visit are also high. Google’s ambition is to give the user the best search experience. That’s why they prefer pages with high trust, good content and sites that showed in the past that users liked them.”

    WebmasterWorld Founder Brett Tabke wrote in a recent forum post, discussing what he calls the “Panda metric“, that “Highly successful, high referral, low bounce, quality, and historical pages have seen a solid boost with panda.”

    In a recent video on ranking in 2011, Google’s Matt Cutts talks about how increasing site speed can keep users on your site longer (i.e., not bouncing) and increase your ROI. Speed is a ranking signal. We know that. Speed can reduce bounce rate. Even if Google doesn’t use bounce rate directly, there is a strong relationship here.

    A reader (hat tip to Jordy) sent us this link from Matt McGee at SearchEngineLand, posted last June:

    Bounce rate and rankings? Matt [Cutts] says Google Analytics is not used in the general ranking algorithm. “To the best of my knowledge, the rankings team does not use bounce rate in any way.” He tiptoed around this question a bit, choosing his words very carefully.

    The part about tiptoeing is somewhat intriguing in and of itself, but it’s also important to note that this was nearly a year ago, and the Panda update was not announced until just this past February (and has even been tweaked since then).

    We also picked the brain of SEO vet Jim Boykin. We asked Jim how important he thinks bounce rate is. He says, “I think that some aspects of bounce rate are very important in the post-panda world.”

    “It’s important to note how Google defines bounce rate,” he adds. Google’s definition is below:

    “Bounce rate is the percentage of single-page visits or visits in which the person left your site from the entrance (landing) page. Use this metric to measure visit quality – a high bounce rate generally indicates that site entrance pages aren’t relevant to your visitors. The more compelling your landing pages, the more visitors will stay on your site and convert. You can minimize bounce rates by tailoring landing pages to each keyword and ad that you run. Landing pages should provide the information and services that were promised in the ad copy.”

    He also points to how it is defined in Google Analytics:

    “The percentage of single page visits resulting from this set of pages or page.”

    “Personally, I don’t think that a single page visit is a bad thing. To me, it tells me the visitor found what they were looking for. Isn’t that what Google would want? If I were Google, I’d want a searcher to find the answer to their search on the exact page they clicked on in a search result…not 1 or 2 clicks in. If I were Google, I’d look more at ‘Who Bounces off that page, and returns to the same Google search, and clicks on someone else, and then never returns to your site,’ but I’m not Google, and that’s just my ‘if I were Google’ thoughts”.
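
    To make that distinction concrete, here is a minimal sketch that computes both plain bounce rate (single-page visits) and the return-to-SERP rate Boykin describes, from a toy click log. The log format and events are assumptions for illustration; Google’s actual signals are not public.

    ```python
    # Toy comparison: plain bounce rate vs. "returned to the same search
    # and clicked a competitor" rate. The event log format is hypothetical.
    from collections import defaultdict

    # (session_id, event) pairs; events are page views or SERP behavior.
    events = [
        ("s1", "view:/answer"),  # found it and left happy: still a "bounce"
        ("s2", "view:/answer"), ("s2", "serp_return"), ("s2", "clicked_competitor"),
        ("s3", "view:/answer"), ("s3", "view:/related"),
    ]

    sessions = defaultdict(list)
    for sid, ev in events:
        sessions[sid].append(ev)

    visits = len(sessions)
    bounces = sum(1 for evs in sessions.values()
                  if sum(e.startswith("view:") for e in evs) == 1)
    pogo_sticks = sum(1 for evs in sessions.values() if "clicked_competitor" in evs)

    print(f"bounce rate: {bounces / visits:.0%}")              # 67%
    print(f"return-to-SERP rate: {pogo_sticks / visits:.0%}")  # 33%
    ```

    Session s1 counts against the plain bounce rate even though the searcher left satisfied, while only s2 shows the signal Boykin suggests actually matters.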

    Regardless, it can’t be a bad thing to strive to make every page of yours the best page of its type – the solution to the searcher’s problem. At its heart, that is really what the Panda update is about. Really, that’s what search ranking is about in general. Delivering the BEST result for the query – signals aside.

    As far as links go, Boykin says it’s “kind of” fair to say that making sure your links point to quality pages can have a major impact on how Google ranks your site post-Panda, but he adds, “The final solution should be to remove or fix the low quality pages, and thus, all your links would point to ‘quality pages’.”

    Again, this should improve bounce rate.

    “I think most agree that there’s a ‘Page Score’ or a ‘set of pages score,’ and when that has a bad score, it affects those pages, and somehow ripples up the site,” Boykin adds. “It could quite well be that if you have a page that links out to 100 internal pages, and if 80 of those pages are ‘low quality’ then it just might affect that page as well. A lot of this is hard to prove, but there are some smoking guns that can point in this direction.”

    “Bounce rate is important, and yes, many sites that got hit did have a high bounce rate, but comparing this to sites/pages that weren’t hit doesn’t exactly show any ‘ah ha’ moments of ‘hey, if your bounce rate is over 75%, then you got Panda pooped on,’ because the bounce rate Google shows the public is missing many key metrics that they know, but don’t share with us.”

    I think the best advice you can follow in relation to all of this is simply to find ways to keep people from leaving your site before they complete the task you want them to complete. That means providing content they want.

    Is bounce rate important in the post-Panda world? Tell us what you think.

  • Google Panda Update – Webmasters Still Trying to Crack the Code

    Google Panda Update – Webmasters Still Trying to Crack the Code

    WebmasterWorld Founder Brett Tabke recently started an interesting thread in the forum looking at different user behavior elements, and a combination of factors that make up what he calls the “Panda Metric.”

    “In this socialized world, it just makes sense that Google would start using more engagement metrics such as demographic, psychographic, and behavioral metrics. I started to put together a list of possible data sources Google could use as signals, and the list quickly grew large,” says Tabke. “Most of the engagement metrics Google can use, will fall into the realm of user behavior. Those data sets can be combined with a successful search result into a powerful metric for your website. I believe that metric is now replacing Page Rank as the number one Google indicator of a quality site. I have been calling this mythical metric, the User Search Success Rate (USSR) or the Panda Metric (PM). This is the rate at which any search results in a happy searcher.”

    He goes on to break down different things Google is looking at, like referrals, location data, browser request headers, site/advertiser tracking, cookies, query entry, SERP behavior, and results clicked. He concludes that “Highly successful, high referral, low bounce, quality, and historical pages have seen a solid boost with panda.”
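
    Purely as an illustration of Tabke’s hypothesized metric (Google has never confirmed any such formula), a toy “search success rate” might look like the following; every input and the formula itself are invented.

    ```python
    # Toy sketch of Tabke's hypothesized "User Search Success Rate".
    # Every input and the formula itself are invented for illustration.
    def search_success_rate(clicks: int, pogo_sticks: int, refinements: int) -> float:
        """Fraction of result clicks that appear to end the search happily."""
        if clicks == 0:
            return 0.0
        unhappy = pogo_sticks + refinements
        return max(0.0, (clicks - unhappy) / clicks)

    print(search_success_rate(clicks=100, pogo_sticks=20, refinements=10))  # 0.7
    ```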

    SEO vet Jim Boykin references Tabke’s post and concludes that Panda is all about “User Behavior in relation to each page of your website (or sets of pages on your site).”

    Bounce rate is a large part of the user behavior formula, and that’s certainly come up in the past when discussing Panda. When SearchMetrics put out its data on the top Panda winners/losers in the UK last month, CTO and co-founder Marcus Tober suggested that time spent on site is a major factor.

    “It seems that all the loser sites are sites with a high bounce rate and a less time on site ratio. Price comparison sites are nothing more than a search engine for products. If you click on a product you ‘bounce’ to the merchant. So if you come from Google to ciao.co.uk listing page, then you click on an interesting product with a good price and you leave the page. On Voucher sites it is the same. And on content farms like ehow you read the article and mostly bounce back to Google or you click Adsense.”

    “And on the winners are more trusted sources where users browse and look for more information,” he added. “Where the time on site is high and the page impressions per visit are also high. Google’s ambition is to give the user the best search experience. That’s why they prefer pages with high trust, good content and sites that showed in the past that users liked them.”

    Of course, original content is key as well. Some amount of syndicated content doesn’t appear to be totally frowned upon, but if the majority of your content isn’t original, you might be in trouble.

    NPR actually has a new report out about a furniture seller called One Way Furniture feeling the wrath of the Panda update, possibly because of unoriginal product descriptions; the company has been pulling descriptions from manufacturers. The report says:

    “We all wanted to believe it was something else,” he says. “Because rewriting the content is a tremendous task, when you have 35,000 pages.”

    All those pages showcase dozens of individual pieces of furniture. Lauren Fernstrom is one of the writers tasked with rewriting the product descriptions of each of these items. She writes 20 in an hour. (That’s three minutes per item.) And the rewrites cost Lieberman about $1 per barstool. But each item on the site — and there are a lot of them — gets the makeover.

    Anybody with a substantial amount of content that was hit by the Panda update has a decision to make: scrap the content that dropped off (as it must be considered low quality by Google), unless it is providing value in other ways, or put in the work (and the time and money it takes) to improve it. For sites with loads of content, the decision might not be an easy one, particularly if Google is penalizing good content based on the less-than-stellar content it is associated with, as some seem to believe.

    The general consensus among seasoned SEO professionals seems to be that backlinks have little (if anything at all) to do with Panda. However, if you’re linking out to lower-quality content, that might be a different story.

    This would make sense on several different levels. For one, if you’re linking to content that’s not useful, it makes your content itself less useful. Secondly, you can control what you’re linking to; you can’t control who links to you. Since you’re most likely linking to your own content in many cases, it can’t hurt to make sure that content is up to snuff itself. That, in turn, can only help the time users spend on your site.

    The bounce rate discussion is continued in the comments, and further in this article.