WebProNews

Tag: SEO

  • How Often Should You Update Evergreen Content?

    The idea of evergreen content is that it is essentially unchanging, intended to last “a long time” and to require little or no maintenance. So isn’t updating evergreen content a bit of an oxymoron? In some cases yes, in some cases no.

    This discussion is part of a larger discussion I’ve been having on Twitter about re-tweeting old posts that have no publish dates on them. While I do have a solution for displaying custom date formats that doesn’t adversely affect my Google click-through rate, the fact remains that I may be tweeting outdated information.

    So what’s a workable solution to evergreen content?

    Make it part of your annual content audit process. Every 6-18 months, review and prune your dead posts. Review your top content to see what needs to be updated or cleaned up. Treat posts that you update like seasonal content and keep the living URL the same. Clear the “already tweeted” or “already published” flags (so the post will retweet when you change the publish date) and hit the “publish” button. (Side note for those of you who use a plugin to post your tweets and may not know: when a post is published, the plugin sets a field in the database so the post doesn’t retweet when you make edits. In this case you want to override that behavior and make the post retweet again as if it were new.)
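
    If your blog runs on WordPress, the flag-clearing step usually boils down to deleting one row of post meta and bumping the post date. Below is a minimal sketch of that idea in Python against a MySQL-backed install; the meta key name is hypothetical and varies by plugin, so check what your tweet plugin actually stores (and back up the database) before running anything like this.

    # Minimal sketch: clear a hypothetical "already tweeted" flag and bump the
    # publish date for one WordPress post. The meta key name is an assumption;
    # check what your plugin actually writes. Requires mysql-connector-python.
    import mysql.connector

    conn = mysql.connector.connect(
        host="localhost", user="wp_user", password="secret", database="wordpress"
    )
    cur = conn.cursor()

    post_id = 123  # the evergreen post you just refreshed

    # Remove the plugin's "already tweeted" flag so it fires again on republish.
    cur.execute(
        "DELETE FROM wp_postmeta WHERE post_id = %s AND meta_key = %s",
        (post_id, "_already_tweeted"),
    )

    # Bump the publish date so the post surfaces as new in feeds and archives.
    cur.execute(
        "UPDATE wp_posts SET post_date = NOW(), post_date_gmt = UTC_TIMESTAMP() WHERE ID = %s",
        (post_id,),
    )

    conn.commit()
    cur.close()
    conn.close()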

    If you review the content and it needs very minor changes or no changes at all, treat it as seasonal content: clear the tweeted flags and update the publish date. This will add a bit of maintenance but not much. If the information is still up to date, your followers won’t mind “a rerun or two from last season” as long as there are regular posts and you don’t tweet them in “batch mode.” (Side note: as SEOs, we like to work in batch mode, so updating 20 posts in one day and having them retweet in “batch” probably won’t win you any friends.) If your audience is made up of whiny SEOs or short-attention-span social media gurus, expect some hating. Regular people who aren’t on Twitter all day don’t really mind; in fact, many studies have shown retweeting is an effective way to reach these people.

    So what are the key takeaways from this post:

    • When you do a content audit, also look for posts with outdated information.
    • Decide if you need a full rewrite or just a cleanup.
    • Treat the content like a living URL and don’t lose any existing link equity.
    • Clear out any “already tweeted” and “already published” flags.
    • Update the publish date, republish, and re-tweet.

    Originally published at Graywolf’s SEO Blog

  • Google Panda Update – A Broader View of U.S. Traffic Patterns

    Experian Hitwise has released some new Panda-related data (obtained by Forbes), casting a broader view of what some of the update’s victims’ search traffic is looking like since early in the year – before Panda’s first wave.

    There are some interesting findings here indeed. Forbes was kind enough to share a spreadsheet of the data, looking at U.S. weekly downstream traffic from Google.com to selected sites. The spreadsheet would appear to show the true top losers of the Panda update in the U.S.

    Hitwise Panda Data

    It’s worth noting that not all of the data here is necessarily representative of Panda – just Google traffic in general. Believe it or not, Panda is not the only thing that can come into play here.

    The thing that has everybody talking is the -40% hit Demand Media has taken in downstream traffic from Google in the U.S. Demand Media’s Answerbag took a -80% hit, LiveStrong took a -57% hit, and the company’s real bread and butter site, eHow, took a -29% hit. That’s from January 8 to April 23.

    The usual suspects are also included on the list. For the same time period, Articlebase, the top loser on this list, took a -83% hit, Suite101 took a -79% hit, Mahalo took a -78% hit, EzineArticles took a -77% hit, HubPages took a -67% hit, and Yahoo’s Associated Content took a -61% hit.

    Again, this data only runs from near the beginning of the year up to three days ago. I wonder how the patterns will develop for these sites after another month or two.

    Also worth noting – Overstock.com is on the list at -32%. Just this week, the company announced that they are no longer in “Google’s Penalty Box”.

    Among the winners: Walmart, JC Penney (interesting considering recent events), Forbes, Whitepages, Etsy, eBay, YouTube (a discussion on whether this is justified here), YellowPages, and About.com.

    Taming the Panda

    If you are one of Panda’s victims, you’ve likely already been doing your fair share of site evaluation (and perhaps business model evaluation) and soul searching. There are many factors to consider when trying to get your site up to Google’s code for quality. Of course nobody knows exactly what that code is, but there are plenty of hints and starting points. We’ve looked at a lot of them here.

    SEO Jim Boykin wrote an interesting piece about Panda, with a bit of a history lesson, referencing Google’s “supplemental index,” which was heavily discussed about 5 years ago.

    “I believe that after they removed the ability to clearly see which pages are in the supplemental results, that they then went on a binge of putting way way more % of pages into this ‘Supplemental index’,” he says. “So something to understand today with Panda is that google was already pretty good at tossing the majority of everyone’s pages on their sites into the supplemental results. At least the Deep Pages, and the Pages with Little content, and the pages of dup content…”

    He goes on to talk about different signals Google has added to its algorithm since then, and looks at post-Panda interviews with Google that we have looked at in the past (see all of our past Panda coverage here). Boykin’s lengthy article is worth a read, but he concludes that the biggest question site-owners should be asking themselves (to avoid Panda Hell) is: “How do I get people to not quickly go back to the same google search?”

    The answer, I would say, is to provide as much relevant information as possible to answer the questions users are likely seeking the answers to. Of course you have to consider that Google has a total of over 200 different factors it’s looking at.

  • URL Shorteners and SEO, According to Google

    With Google looking more at social media as a ranking signal these days, a lot of webmasters continue to wonder how Google treats URL shorteners in terms of SEO.

    This isn’t completely new information, but it still seems to be a topic that continues to come up fairly regularly. Google’s Matt Cutts addressed the issue in a video posted to Google’s Webmaster Help YouTube channel.

    “Custom URL shorteners are essentially just like any other redirects,” he explains. “If we try to crawl a page, and we see a 301 or permanent redirect, which pretty much all well-behaved URL shorteners (like bit.ly or goo.gl) will do, if we see that 301 then that will pass PageRank to the final destination.”

    “So in general, there really shouldn’t be any harm to using custom URL shorteners in your SEO,” he continues. “The PageRank will flow through. The anchor text will flow through, and so I wouldn’t necessarily worry about that at all.”

    “Now, just to let you know, if you look at, for example, Twitter’s web pages, many of those links have a nofollow link,” he adds. “So those links that are on the webpage, may not necessarily flow PageRank, but we might be able to find out about those links through some other way – maybe a data feed or something like that. But just URL shorteners, as far as how they relate to SEO, are not necessarily a problem at all.”
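
    If you want to see this behavior for yourself, it only takes a couple of lines to inspect what a shortener actually returns. Here’s a quick sketch, assuming the Python requests library is installed; the short URL is just a placeholder.

    # Check how a URL shortener redirects: a well-behaved one should answer with
    # a 301 (permanent) redirect and a Location header pointing at the final
    # destination. The short URL below is a placeholder.
    import requests

    short_url = "http://goo.gl/example"  # substitute a real short link

    resp = requests.head(short_url, allow_redirects=False, timeout=10)
    print(resp.status_code)              # expect 301 for a permanent redirect
    print(resp.headers.get("Location"))  # where the redirect points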

    When we spoke with Gil Reich of Answers.com at SMX Advanced last summer, he suggested using shorteners that let you get keywords in the URLs.

    Google actually updated its own URL shortener, goo.gl, with some new features this past week. New features include: copy to clipboard, remove from dashboard, spam reporting, and improvements in speed and stability.

    “Even as we add features, we continue to focus on making goo.gl one of the fastest and most reliable URL shorteners on the web,” said Google software engineer Devin Mullins. “We’ll continue working hard to ensure that we add minimal latency to the user experience and extend our track record of rock-solid reliability—we’ve had no service outages since we launched last September.”

    Anything to add to the URL-shortener/SEO conversation? Comment here.

  • Sorry, Folks. It’s Not Always About the SEO.

    Yesterday, Matt Cutts, Supreme Master of Google Space, Time, and SEO, posted his latest video response to a question that I thought was just priceless.

    If you don’t want to play the video, the question was, “When analyzing rankings for highly competitive keywords in our industry, we have found sites not as optimized as ours (on-page), and that have few links, and little content are still ahead of us. What gives? Why are ‘unoptimized’ sites ranking so well?”

    Matt provides a perfect and accurate answer to the question, in a technical sense, which I’m not going to copy here, but basically he chalked it up to a variety of unseen factors, like the fact that you can’t see all the links to a competitor’s site using the “link:” operator, etc. etc.

    Look, you can get as technical with this answer as you want, but the one thing he really didn’t say in the end (and this isn’t a dig at all) is that maybe that site is just more relevant than hers. Sure, Google uses its algorithm to mimic the way a human would see something as being more relevant than another, but it still comes back to one site being more relevant than another.

    You can optimize the daylights out of a site… do everything right, get the content, set the architecture, get the inbound links and still this site is sitting on top of you, probably for a very good reason.

    Now, there are a whole host of goofballs who comment on Matt’s posts on YouTube, and they all talk about spam and “FAIL!” and all sorts of other crazy, kooky theories, as if Google were some sort of shadow government. But the truth is, Google doesn’t need to BS you with its answers… the real answers are more complicated than you can imagine.

    This is why we always start our relationships with new clients with an SEO audit that looks at their site with best practices in mind for our big three target areas: content, site architecture, and inbound links. Because 99% of the time, you’ve missed something that was pretty basic. Now you have a plan, a strategy for attacking the SEO issues that are right there on the surface, and there’s a great chance it will help you move up the ranks.

    But still, we can’t promise anything… and especially not that you’re going to overtake your competitor for that one term that really, really bugs you. Any agency that does is selling you snake oil, and you should run away from them with a quickness.

    Just please don’t try and make it sound like Google is picking on you… it just sounds pitiful (but that’s for another post).

    Originally published at fang digital marketing

  • The Top 10 Things to Consider When Searching for a Professional SEO Company

    Nowadays, it seems that everyone does search engine optimization, or at least claims to. But it’s more than just developing good content and slapping on a few meta tags. True SEO takes a lot of research, implementation, monitoring, and time. It truly is its own service. I suggest seeking out a company that specializes in SEO for the best results possible.

    When it comes to shopping around for a quality SEO company, many are not sure where to begin. There are multitudes of SEO companies out there offering different services at different rates. That being said, I have put together a list of what I believe to be the “top 10 things to consider” when looking for the right search marketing agency – a guide to finding your diamond in the rough.

    1. Guarantees specific rankings

    No reputable SEO company will ever make such a claim! The Google Webmaster Guidelines state, “No one can guarantee a #1 ranking on Google.” No outside company runs Google or has any “insider secrets” to its complex ranking algorithm. Google does not just hand out #1 rankings. If it did, I would be out of a job.

    2. Offers instant results

    Rankings take time. It can take weeks or even months for a page to rise in Google search results. Plus, your site will be competing against other sites for relevancy and popularity.

    3. Guarantees a specific number of inbound links

    The important thing to focus on is your content – keep it fresh and relevant. It can take time, but you want your links to be valuable. Quality inbound links are created by humans and can be born and die at any time, so they cannot be guaranteed – be sure the company does NOT buy the links.

    4. Does not offer understandable reports

    You want to know where all your money is going, right? And wouldn’t it be great to be informed about which keywords have risen and dropped in the rankings? If they cannot offer clear, concrete reports, there is something wrong.

    5. How does their site look

    “If you can talk the talk, then you can walk the walk”: clean and simple – does their site look good and is it optimized? If they cannot even promote themselves, how are they going to help you?

    6. Unsolicited offers

    If you open a generic-looking email template offering you their services, then you have probably received spam. Do not get me wrong; there is a difference between a company reaching out to you and a company reaching out to thousands with the same email. Most reputable companies will not reach out to you unless there is a reason for it.

    7. Does not offer insights

    Correct me if I am wrong, but isn’t it the job of the agency to lend a hand in your search efforts? After all, you did hire them. Not only should they be offering assistance on ways for your site to rank higher, but also show you where these insights are coming from (this is where reporting comes in).

    8. How much do they cost

    Good quality SEO is not cheap. When it comes to cost, SEO service prices vary greatly. At one end of the spectrum, you will have a company offering you the world for practically nothing, and at the other end, there are companies that charge in the thousands for their services. It is best to keep in mind that you get what you pay for.

    9. Does not do in-depth keyword research

    Understanding how people perform searches related to your brand is the foundation of SEO. Slacking off here can mean the difference between having a successful campaign or not. If you do not know how your customer is searching for your brand or products, then how are you going to reach them?

    10. Does not perform black-hat SEO practices

    This is very important, and may be hard to spot for the untrained SEO eye. What is important here is making sure the company sticks to good ol’ SEO practices – make sure they follow the Google Webmaster Guidelines. Black-hat SEO is not only unethical (buying links, stuffing a site with keywords, etc.), but it can also get you banned from Google. If it appears that the company is trying to be sly and pull a fast one on Google, then step away. We all saw how JC Penney got ratted out and Google-slapped.

    Once you weed out the competition and narrow it down to some worthy prospects, here are some questions you might want to ask about their agency (just to make sure they are up to snuff):

    1. Can you show me some recent case studies?
    2. Does your company abide by the Google Webmaster Guidelines?
    3. How long have you been in business?
    4. How familiar are you with my industry?
    5. How do you measure success?
    6. Will you supply me with reports?
    7. How often will we be in contact?

    These questions should help you get even closer to finding what you are looking for. Also, do not be afraid to ask questions. You want an agency that wants to hear everything you have to say. This only gives them more information on creating a winning game plan for your site.

    When in doubt, keep this in mind: “You get what you pay for.” And not to be cynical, but if it sounds too good to be true, then it probably is.

    Remember, the goal of SEO is to ensure that the content of your site is matched to relevant search queries and displayed in search engine results – it is all about the end user and what they are searching for. This process does not happen overnight and takes a lot of time and care to implement. Even though some companies offer it as an add-on package, it truly is its own niche. A truly dedicated search company that primarily focuses on quality SEO services will often provide the best results.

    Do not become a victim of bad SEO. Use this guide to find the right internet marketing company, and speak with a search engine optimization consultant who can help you devise a strategic plan that works best for you and your company.

    Happy hunting!

    Originally published at Search Marketing Sage

  • Google Panda Update Winners, Losers, and Future Considerations

    So the controversial Google Panda update is now live throughout the world (in the English language). Since Google’s announcement about this, data has come out looking at some of the winners and losers (in terms of search visibility) from both SearchMetrics and Sistrix. Hopefully we can all learn from this experience, as search marketing continues to be critical to online success.

    Has Panda been good to you? Comment here.

    Throughout this article, keep a couple of things in mind. The SearchMetrics and Sistrix data are limited to the UK and Europe. The Panda update has been rolled out globally (in English). It seems fair to assume that while the numbers may not match exactly, there are likely parallel trends in visibility gain or loss in other countries’ versions of Google. So, while the numbers are interesting to look at, they’re not representative of the entire picture – more a general view.

    Also keep in mind that Google has made new adjustments to its algorithm in the U.S. In the announcement, they said the new tweaks would affect 2% of queries (in comparison to 12% for the initial Panda update). Also note that they’re now taking into consideration the domain-blocking feature in “high confidence” situations, as the company describes it. So that may very well have had an impact on some of these sites in the U.S.

    As there were the first time around, the new global version of Panda has brought with it numerous interesting side stories. First, let’s look at some noteworthy sites that were negatively impacted by the update.

    eHow

    eHow managed to escape the U.S. roll-out of the Panda update, and actually came out ahead, but the site’s luck appears to have changed, based on the SearchMetrics/Sistrix data. According to the SearchMetrics data, eHow.co.uk took a 72.30% hit in visibility. eHow.com took a 53.46% hit. Sistrix has eHow.co.uk as its top loser with a -84% change.

    eHow Hit by Panda in the UK

    In the U.S., after looking at some queries we tested before, it does appear that eHow has lost some positioning in some areas – most notably the “level 4 brain cancer” example we’ve often referenced to make a point about Google’s placement of non-authoritative content over more authoritative content for health terms.

    EzineArticles

    EzineArticles, which was heavily impacted the first time around, got nailed again, based on the data. SearchMetrics, looking at UK search data, has EzineArticles with a drop in search visibility of as much as 93.69%. Sistrix, looking at Europe, has the site as its number two loser with a change of -78%.

    This is after an apparently rigorous focus on quality and guidelines following the U.S. update.

    Mahalo

    You may recall that after the U.S. update, Mahalo announced a 10% reduction in staff. “All we can do is put our heads down and continue to make better and better content,” CEO Jason Calacanis told us at the time. “If we do our job I’m certain the algorithm will treat us fairly in the long-term.”

    Since the global roll-out, we’ve seen no indications from Calacanis or Mahalo that more layoffs are happening. “We were impacted starting on February 24th and haven’t seen a significant change up or down since then,” he told us.

    Still, the SearchMetrics UK data has Mahalo at an 81.05% decrease in search visibility. The Sistrix data has the site at a -77% change.

    “We support Google’s effort to make better search results and continue to build only expert-driven content,” Calacanis said. “This means any videos and text we make has a credentialed expert with seven years or 10,000 hours of experience (a la Malcolm Gladwell).”

    HubPages

    HubPages was hit the first time, and has been hit again. SearchMetrics has the site at -85.72%. Sistrix has it at -72%.

    An interesting thing about HubPages is that a Googler (from the AdSense department) recently did a guest blog post on the HubPages blog, telling HubPages writers how to improve their content for AdSense.

    Suite101

    Suite101, another one of the biggest losers in the initial update, was even called out by Google as an example of what they were targeting. “I feel pretty confident about the algorithm on Suite 101,” Matt Cutts said in a Wired interview.

    Suite101 CEO Peter Berger followed that up with an open letter to Matt Cutts.

    This time Sistrix has Suite101 at a -79% change, and SearchMetrics has it at -95.39%. Berger told us this week, “As expected by Google and us, the international impact is noticeably smaller.”

    Xomba

    Xomba.com had a -88.06% change, according to the SearchMetrics data. They also had their AdSense ads temporarily taken away entirely. This doesn’t appear to be related to the update in any way, but it is still a terrible inconvenience that got the company and its users a little frantic at a time when their traffic was taking a hit too.

    Google ended up responding and saying they’d have their ads back soon. All the while, Google still links to Xomba on its help page for “How do I use AdSense with my CMS?”

    A lot of price comparison sites were also negatively impacted. In the UK, Ciao.co.uk was a big loser at -93.83%, according to SearchMetrics.

    You can see SearchMetrics’ entire list of losers here.

    The Winners

    As there are plenty of losers in this update, somebody has to win, right? The big winners appear to be Google, Google’s competitors, news sites, blogs, and video sites. A few porn sites were sprinkled into the list as well.

    All winner data is based on the SearchMetrics data of top 101 winners.

    Google properties positively impacted:

    – youtube.com gained 18.93% in visibility.
    – google.com gained 6.14% in visibility.
    – google.co.uk gained 3.99% in visibility.
    – blogspot.com (Blogger) gained 22.8% in visibility.
    – android.com gained 33.92% in visibility.

    Google competitors positively impacted:

    – yahoo.com increased 9.47%
    – apple.com increased 15.19%
    – facebook.com increased 9.14%
    – dailymotion.com increased 17.80%
    – wordpress.com increased 18.62%
    – msn.com increased 8.13%
    – metacafe.com increased 6.45%
    – vimeo.com increased 18.85%
    – flickr.com increased 12.39%
    – typepad.com increased 43.86%
    – tripadvisor.co.uk increased 7.81%
    – mozilla.org increased 19.44%
    – windowslive.co.uk increased 29.46%
    – live.com increased 6.62%

    News sites, blogs, and video sites positively impacted include (but are not limited to):

    – youtube.com – 18.93% visibility increase
    – telegraph.co.uk – 16.98% visibility increase
    – guardian.co.uk – 9.73% visibility increase
    – bbc.co.uk – 5.46% visibility increase
    – yahoo.com – 9.47% visibility increase
    – blogspot.com (Blogger) – 22.8% visibility increase
    – dailymail.co.uk – 12.72% visibility increase
    – dailymotion.com – 17.8% visibility increase
    – ft.com – 16.17% visibility increase
    – independent.co.uk – 21.53% visibility increase
    – readwriteweb.com – 152.46% visibility increase
    – theregister.co.uk – 13.47% visibility increase
    – itv.com – 22.38% visibility increase
    – cnet.com – 14.21% visibility increase
    – mirror.co.uk – 24.87% visibility increase
    – mashable.com – 22.61% visibility increase
    – wordpress.com – 18.62% visibility increase
    – techcrunch.com – 40.72% visibility increase
    – time.com – 55.24% visibility increase
    – metacafe.com – 6.45% visibility increase
    – reuters.com – 36.82% visibility increase
    – thenextweb.com – 3.85% visibility increase
    – zdnet.co.uk – 34.04% visibility increase
    – vimeo.com – 18.85% visibility increase
    – typepad.com – 43.86% visibility increase

    Bounce Rate Significance?

    SearchMetrics sees a pattern in the winners, in that time spent on site is a major factor. “Compare the winners against the losers,” SearchMetrics CTO and Co-Founder Marcus Tober tells WebProNews. “It seems that all the loser sites are sites with a high bounce rate and a less time on site ratio. Price comparison sites are nothing more than a search engine for products. If you click on a product you ‘bounce’ to the merchant. So if you come from Google to ciao.co.uk listing page, then you click on an interesting product with a good price and you leave the page. On Voucher sites it is the same. And on content farms like ehow you read the article and mostly bounce back to Google or you click Adsense.”

    “And on the winners are more trusted sources where users browse and look for more information,” he continues. “Where the time on site is high and the page impressions per visit are also high. Google’s ambition is to give the user the best search experience. That’s why they prefer pages with high trust, good content and sites that showed in the past that users liked them.”

    “This conclusion is the correlation of an analysis of many Google updates from the last 6 months,” he adds. “Also the Panda US and UK updates.”

    The Future

    It doesn’t look like Panda is slowing down search marketing ambition:

    Even Post Panda Search Marketing To Reach $19.3 Billion In ’11, Mobile On Rise http://bit.ly/e5PsgH #Mediapost

    Well, SEO isn’t getting any easier, so that makes sense. Beyond the Panda update, it’s not like Google is going to slow down in its algorithm tweaks. As webmasters and publishers get used to the latest changes, more continue to pour out.

    In an AFP interview, Google’s Scott Huffman says his team tested “many more than” 6,000 changes to the search engine in 2010 alone. 500 of them, he said, went on to become permanent changes. What are the odds that number will be lower in 2011?

    Huffman also told the AFP that plenty of improvements are ahead, including those related to understanding inferences from different languages.

    And let’s not forget the Google +1 button, recently announced. Google said flat out that the information would go on to be used as a ranking signal. Google specifically said it will “start to look at +1s as one of the many signals we use to determine a page’s relevance and ranking, including social signals from other services. For +1s, as with any new ranking signal, we’ll be starting carefully and learning how those signals affect search quality over time.”

    Make friends with Google’s webmaster guidelines.

    Have you been impacted by the global roll-out of Panda? For better or worse? Let us know in the comments.

  • Google Panda Update Officially Goes Global (In English)

    Earlier, we reported that webmasters were finding signs indicating the Panda update may have been launched in more countries. Now, Google has addressed it in a post on the Webmaster Central Blog.

    They’ve rolled out the update globally to all English-language Google users. “We will continue testing and refining the change before expanding to additional languages, and we’ll be sure to post an update when we have more to share,” says Google’s Amit Singhal.

    Singhal also says Google has incorporated “new user feedback signals”.

    “In some high-confidence situations, we are beginning to incorporate data about the sites that users block into our algorithms,” he says. “In addition, this change also goes deeper into the ‘long tail’ of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before. The impact of these new signals is smaller in scope than the original change: about 2% of U.S. queries are affected by a reasonable amount, compared with almost 12% of U.S. queries for the original change.”

    It would be very interesting to know what constitutes a “high-confidence situation”. Domain blocking as a ranking signal could be a tricky area, in terms of the potential for abuse. The company has said in the past that it would look at making this a ranking signal, and that it would tread lightly.

    “Based on our testing, we’ve found the algorithm is very accurate at detecting site quality,” he says. “If you believe your site is high-quality and has been impacted by this change, we encourage you to evaluate the different aspects of your site extensively.”

    Well, that’s exactly what we’ve seen a lot of sites doing. Whether or not it has been working for them remains to be seen. It’s just going to take some time to tell.

    Singhal does say webmasters should look at Google’s own quality guidelines. He also suggests posting in the Webmaster Help Forum. “While we aren’t making any manual exceptions, we will consider this feedback as we continue to refine our algorithms,” he says.

    Now the fun really begins. Now we can see how hard some of the sites that were already heavily impacted by the Panda update get hit on a global scale.

  • Google Panda Update Launched in More Countries

    If you’ve been following the search industry for the last month or two, I probably don’t have to tell you how big an impact Google’s Panda update has had on a lot of sites. Google had only rolled the update out in the U.S., and it had already sent tremendous ripples throughout the web in terms of traffic and search engine rankings.

    A lot of webmasters felt that their sites were undeserving of being victimized by the update, while many others were more obviously the intended targets. Either way, it was just in the U.S., and publishers have been bracing (by scrambling to improve the quality of their content/design) for the next wave of the Panda update – the expansion into other countries.

    There is some talk in the WebmasterWorld forum from webmasters who think it has indeed begun rolling out in other countries. So far, this is unconfirmed.

    “Today, for the first time, after checking daily, I can now see US Panda results on Google properties worldwide including but not limited to: .co.uk, .ca, .co.in, .com.au, .fr, .de, .se, etc,” said user incrediBILL, who started a thread on the topic. He later added, “I see the same results on multiple non-US data centers…However, it could be rolling out in waves meaning I’m seeing a segment of the index changed others aren’t seeing yet, but the US Panda results for my niches and some other stuff I track are now worldwide best I can tell.”

    “It has definitely hit the UK, my sites have been hit pretty hard -80% in the last couple of hours,” another user, c41lum, said.

    “It has been live in France for a while now ..but only over the last 48/72 hours have I seen it hammering directories and public info scraper/repackagers hard..they were moving agitatedly before..now some are dropping hard ( 100 to 200 places ) all across the sites,” wrote user Leosghost. “It has hit very hard over the last 15 to 21 days on the shopping comparative sites that just acted as “bridge” pages / aff feeders to other sites ( who frequently did not have the goods ).So no loss there..IMO.”

    Throughout the thread there is a mixture of people indicating that their sites have also been affected, and some who haven’t noticed any change.

    Again, whether or not this is Panda rolling out in these other countries is unconfirmed. Google has yet to respond to a request for comment. It’s worth noting that Google did make a very public (and proud) announcement when the update was initially launched in the U.S. It’s hard to say if we could expect them to do the same for the rest of the world, though the original announcement did say, “To start with, we’re launching this change in the U.S. only; we plan to roll it out elsewhere over time. We’ll keep you posted as we roll this and other changes out, and as always please keep giving us feedback about the quality of our results because it really helps us to improve Google Search.”

    They’ll keep us posted. It would seem that this was either an empty promise, or Panda has not rolled out to more countries yet.

    Update: Google has made it official. More here.

  • Long-tail Keyword – Stretch the Tail for More Conversions

    Many organic campaigns start out with high expectations. There is clear focus and intent on working head keywords to drive traffic and begin branding efforts. However, every seasoned SEO knows that with a focused plan, heads will eventually turn to tails. Remain pertinacious, and the tail will continue to grow and ultimately result in what is known as the long-tail keyword phenomenon – a converting machine.

    Definition and History of the Long-Tail Keyword

    The long-tail keyword is simply a combination of multiple words (usually three or more) that is specific and relevant to a product or service sold. In essence, long tails are considerably more definitive and are often used by visitors to refine the search process.

    Targeted buyers anyone?

    The long tail was first introduced by Chris Anderson, whose work later resulted in the popular 2006 book “The Long Tail: Why the Future of Business is Selling Less of More.”

    The premise of the long tail is a Pareto distribution combined with web technology, introducing the concept that a larger proportion of “buyers” exists in the tail of the distribution. We no longer need to target the head of the distribution to produce results, as we did in traditional marketing times. Keep in mind, the theory behind the long tail is one of shifting away from a relatively small number of searches at the head to a huge number of niches in the tail.

    Right Church, Wrong Pew?

    If long-tail keywords convert so well, why is there such a high degree of emphasis placed on the head keywords at the onset of a campaign? That’s simple: we’re human and want hard and fast results.

    The intent from many organic campaigns is to rank for head or brand keywords in an effort to gain traction in a market. Initial keyword research will show that the head is “where the action is” and becomes too tempting to pass up.

    There is one problem though.

    Head keywords are oftentimes too broad and correspond only to people at the initial phase of a buying cycle or in information-seeking mode. Hardly an equation for converting visitors. Normally, the long tail becomes a force when content becomes ingrained in the site. When you look at the web analytics of a mature site with a multitude of tail niches, head keywords represent only a small piece of the overall traffic.

    “Forget squeezing millions from a few mega hits at the top of the charts. The future of entertainment is in the millions of niche markets at the shallow end of the bit stream.” – Chris Anderson

    Welcome to the World of the Long-Tail

    Depending on market and site authority, there will be anywhere from 3 to 20 head keywords.

    Figure 1 shows a graph of how head and tail keywords would look in a typical market. The dotted line represents where the head separates from the tail. This is also where the magic begins.

    Long tail Figure 1

    For illustration purposes, below is a recap of one of my sites for a recent 30-day period:

    • 42,268 Total Visits
    • 36,457 Visits from Search
    • 17,424 Keywords from Search
    • Top 10 keywords made up 2,727 searches
    • Top 10 keywords accounted for 7% of the total searches.
    • Over 36,000 visits resulted from over 17,000 different keywords!

    Lesson learned: you will receive a great deal of volume from the top 10 keywords. However, the numbers pale in comparison to the aggregate total of searches from the other 17,000 keywords.

    Search Optimization: Achieving Power with SEO and SEM

    Okay, so we know the long-tail keyword is powerful. How do we work a campaign to achieve long-tail, highly targeted results? Every word on your site is a potential keyword; therefore, content is the answer.

    If SEO execution is adequate at the beginning, companies should easily rank for their products or company name.

    While the initial SEO campaign begins to pick up steam, Pay per Click (PPC) can work the tail and long-tail keywords. PPC can prove an invaluable and cost-effective tool to measure and test conversions. The results can then be used to ramp-up an aggressive and highly targeted organic campaign.

    Leveraging SEO to brand and unleashing PPC for the long tail is a powerful combination that works very well at the onset of your campaign.

    Producing tail-biased content is one of the most effective ways to lengthen the tail of your marketing campaign. Below is one of many methods to “cherry pick” long-tail keywords and incorporate them naturally into your content to increase search volume over time:

    Summary of the Long-tail Keyword Content Creation Method:

    1. Using the Google AdWords Keyword Tool, select your main targeted tail keywords (usually one or two main tail keywords) that have decent traffic.

    2. Using the same tool, select 3-5 similar longer-tail keywords that have lower traffic volumes than the targeted keywords.

    3. Write your content targeting the primary tail keywords, sprinkling the long-tail keywords into the content naturally.
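
    As a rough illustration of steps 1 and 2, the sketch below sorts a keyword-tool CSV export into primary tail keywords and longer-tail supporting phrases. The file name, column names, and volume thresholds are all assumptions; adjust them to whatever your keyword tool actually exports.

    # Rough sketch: split a keyword export into primary tail keywords and
    # longer-tail supporting phrases. Column names ("keyword", "monthly_searches")
    # and the thresholds are assumptions -- adapt them to your own export.
    import csv

    primary, long_tail = [], []

    with open("keyword_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            phrase = row["keyword"].strip().lower()
            volume = int(row["monthly_searches"])
            words = len(phrase.split())
            if words >= 3 and 10 <= volume < 200:
                long_tail.append((phrase, volume))  # sprinkle these into the copy
            elif volume >= 200:
                primary.append((phrase, volume))    # candidate primary tail keywords

    primary.sort(key=lambda kv: kv[1], reverse=True)
    long_tail.sort(key=lambda kv: kv[1], reverse=True)
    print("Primary targets:", primary[:2])
    print("Supporting long-tail phrases:", long_tail[:5])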

    Summary

    Key Takeaways:

    • Long-tail keywords reside toward the end of the Pareto distribution curve.
    • Many leaders of web strategies will want to focus only on head keywords for immediate results and branding.
    • Leverage SEO and SEM strengths at the onset of your campaign for best results.
    • Long-tail keywords from a healthy head and tail will result in targeted buyers.
    • As a site matures and content increases, tail keywords will increase as well.
    • For a mature site, the majority of your traffic will come from long-tail keywords.
    • Increase long-tail keywords by increasing content and targeting long-tails in your content.

    The long-tail keyword is one of the best ways to target buyers of a market. People at the end of a tail keyword can also be at the end of the buying cycle.

    A long-term plan of consistently creating quality content using several long-tail keywords will slowly add massive keyword substance to your site and increase conversions.

  • How to Optimize Your Images For Search Engine Traffic

    The following is part of a multi-part series covering image optimization techniques. This article is intended for beginner through intermediate SEOs; if this doesn’t pertain to you, you may want to skim, as most of this will probably be review material for you.

    Some of the big questions many people ask are: why would they even want to perform image optimization? Doesn’t it just help people who want to steal or hotlink images? And is there really any meaningful traffic or are there any links that you can get from image optimization? IMHO, the answer is yes. Let’s say someone is going on a trip to Italy. They might do image searches for things to do or see in Italy and for famous Italian landmarks like the Leaning Tower of Pisa, the Trevi Fountain, or St. Peter’s Basilica. Thanks to Google’s universal search results, images provide a way to get onto the first page (or, in some cases, the top result) and get a click-through, an ad view, or an AdSense impression. It might even get a lead generation completion. Maybe you run a fish store. If a university professor or government agency needs a picture of a fish and your image result appears, and you allow your images to be reused in exchange for a link, this can be a huge way to passively build links slowly over time (true story! It happened for a client I used to have). Now that we’ve got the why out of the way, let’s talk about the “how” of image optimization.

    Filenames

    This is one of the most basic elements of image optimization. If you have an image of blue widgets, I would name your image “blue-widgets.jpg” or “blue-widgets.gif”. You can use other formats like PNG, but I have gotten better results with “jpg” and “gif” files. You can use other characters like underscores as word delimiters, but I get better results with hyphens. You can run the words together if they are separated in other factors. I have found stemming plays a role (i.e. widget vs. widgets), but you can get around it using other factors. I haven’t seen capitalization play a role, but I prefer to use all lower case because I usually use Apache servers and case sensitivity matters. If you are going to have multiple images of the same object type, I suggest adding a “-1” or “-2” onto the end.

    Now, before the hate mail or hate tweets start, it is entirely possible to have an image rank without the keywords being in the file name IF there are enough other factors in place. However, you should ask yourself why you would give up a chance to give a search engine a signal about what an image is about. If you work on a large ecommerce platform or other large database application, chances are good that your gold diamond earrings will have an image file name like “GDX347294.jpg” that corresponds to the item’s SKU or other internal classifier. So, yes, sometimes you will have to sacrifice the keyword for business reasons.
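
    If you generate image names programmatically, the convention above is easy to automate. Here’s a small sketch in Python; the helper name and sample phrases are purely illustrative.

    # Turn a descriptive phrase into a lowercase, hyphen-delimited image file
    # name, with an optional counter for multiple shots of the same object.
    import re

    def image_filename(phrase, index=None, ext="jpg"):
        slug = re.sub(r"[^a-z0-9]+", "-", phrase.lower()).strip("-")
        if index is not None:
            slug = f"{slug}-{index}"
        return f"{slug}.{ext}"

    print(image_filename("Blue Widgets"))           # blue-widgets.jpg
    print(image_filename("Blue Widgets", index=2))  # blue-widgets-2.jpg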

    ALT Text

    Let’s get the basic information out of the way: ALT text was designed so that screen readers and visually impaired users can know what they aren’t seeing. Your goal is to use it to satisfy the screen readers while being keyword-focused enough for the search engines and without being a keyword-stuffing spammer. Here’s an example of ALT text variations:

    Keyword stuffed: discount hotel room paris france

    ALT text only: Eiffel Tower

    SEO optimized: Eiffel Tower from Louvre Bons Enfants hotel room

    Striving to find a balance between pleasing the search engines and text readers can be a juggling act. If you are risky with some of your other SEO techniques, I’d play this on the safe side.
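
    One easy way to keep yourself honest here is to audit a page for images that are missing ALT text entirely (or carrying an empty one) before you worry about phrasing. Here’s a minimal sketch, assuming the requests and BeautifulSoup libraries are installed; the URL is a placeholder.

    # List the images on a page that have no alt attribute or an empty one.
    # Assumes requests and beautifulsoup4 are installed; the URL is a placeholder.
    import requests
    from bs4 import BeautifulSoup

    url = "http://example.com/"  # placeholder
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            print("Missing or empty ALT text:", img.get("src"))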

    Headings and Bold Text

    If image optimization for a particular image is important, I really like to optimize the image with bold or a heading tag of the term I’m chasing right above the image. I’ve found this really helps give a strong signal to the engines.

    Oceanus Statue from Trevi Fountain

    Image Captions

    Image captions like the one to the right are another way I really like to give the search engines a good nudge in the direction I want them to go. Try to place the search term you are trying to optimize for at the front of the caption.

    Image size

    I’ve found that if you keep your images a reasonable size you generally do better with image optimization. That’s not to say really big or really small images won’t rank, just that images that are larger than 100×100 and smaller than 1200×1200 work best. Using a thumbnail that links to a larger picture can be helpful.
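
    For the thumbnail approach, an image library such as Pillow makes it painless to keep the displayed image small while linking out to the full-size file. Here’s a sketch, assuming Pillow is installed and using placeholder file names.

    # Create a small thumbnail that can link to the full-size image.
    from PIL import Image

    with Image.open("trevi-fountain.jpg") as img:
        img.thumbnail((300, 300))             # shrinks in place, preserving aspect ratio
        img.save("trevi-fountain-thumb.jpg")  # link this thumbnail to the original file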

    Image Traffic

    So what can you expect from image traffic? Like all things, it depends on what you are chasing, but I have one image that ranks on the first page for a single-word term and brings in hundreds of views for me every month. The page has AdSense on it and, over a single year, it brings in several hundred dollars’ worth of revenue. It’s something to think about before you write off image optimization.

    Image Traffic Data

    So what are the takeaways from this post:

    • Try to name your images with your keywords if possible, using the hyphen as a delimiter.
    • Shorter names are better than longer. Avoid using more than 4 words if possible.
    • Keep your ALT text keyword focused without being stuffed or spammy.
    • If possible use headings or bold tags above or directly next to the image.
    • Use captions if at all possible and keep the keywords closer to the front of the caption.
    • Keep the images a reasonable size. They should be large enough for people to see but small enough to fit on a screen.
    • If you own the image, encourage people to reuse your image in exchange for a link.
    • Try to find a way to monetize image traffic with CPM advertising, AdSense, or affiliate links.

    Originally published at Graywolf’s SEO Blog

  • Using Living URLs for Seasonal Content

    I’ve mentioned incorporating living URLs into your SEO strategy before. Now that I have been experimenting with them for a while, I have some tips about using them a little more intelligently.

    If you read fashion magazines, you know that every year they publish a spring fashion update; bridal magazines publish new spring wedding dresses; and bands that have been around a long time will always return to give concerts in major cities like New York or Los Angeles. Traditionally, publishers would use a solution like this:

    Title: Women’s Spring Fashion For 2011

    URL: example.com/womens-spring-fashion-2011/

    Title: Spring Bridal Dresses 2011

    URL: example.com/spring-bridal-dresses-2011/

    Title: U2 Concert Tickets Madison Square Garden 2011

    URL: example.com/u2-concert-tickets-madison-square-garden-2011/

    To use the living URL strategy, you need to understand how Google handles dates and to know when to use them and when to avoid them. My recommendation is to keep the year in the page/html title (updating it every year) but to keep the year out of the URL. So it would look like this:

    Title: Women’s Spring Fashion For 2011

    URL: example.com/womens-spring-fashion/

    Title: Spring Bridal Dresses 2011

    URL: example.com/spring-bridal-dresses/

    Title: U2 Concert Tickets Madison Square Garden 2011

    URL: example.com/u2-concert-tickets-madison-square-garden/

    In the examples used above, I would put up new content on the existing URLs and not save the old content. However, there are some cases where you would want to save the content, as in my WordPress SEO plugins post, but that’s a decision you’re going to have to make for yourself based on your niche and whether the archived content has value. A third option is to put the new content on the old URL and move the old content to a new URL. It sounds complicated, but it’s really pretty easy. I go into more detail in the new car model section of my post on keeping an editorial calendar.

    Next you need to consider the implications of social media for your living URL strategy. If your blog auto-tweets new posts, you’ll need to clear/reset the fields that control those actions. Next, you need to update the publish date, especially if you do any date-based sorting or display, or if you include the date in your XML sitemap. If your content has a link bait or social bookmarking quality, you need to experiment with how the site you are targeting handles submissions of the same URL. I suggest experimenting with someone else’s domain – not your own. I also suggest trying to add parameters or camouflaging your intent with dummy parameters like “utm_source=rss”.
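
    For the dummy-parameter experiment, appending something innocuous like “utm_source=rss” is a one-liner with the Python standard library. The sketch below is just that; the URL is a placeholder, and whether a given social site treats the result as a fresh submission is exactly what you would be testing.

    # Append a harmless dummy parameter to a URL so a social site may treat it as
    # a new submission. Whether this works on a given site is the experiment.
    from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

    def with_dummy_param(url, key="utm_source", value="rss"):
        parts = urlparse(url)
        query = dict(parse_qsl(parts.query))
        query[key] = value
        return urlunparse(parts._replace(query=urlencode(query)))

    print(with_dummy_param("http://example.com/womens-spring-fashion/"))
    # http://example.com/womens-spring-fashion/?utm_source=rss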

    So what are the takeaways from this post:

    • Use predictive SEO and an editorial calendar to identify candidates for a living URL strategy.
    • Use the date in the title but avoid it in the URL.
    • Have a plan for archiving, relocating, or eliminating the old content.
    • Update the publish date and reset any autotweet fields.
    • Know how any social media sites handle resubmissions. Try using parameters as a workaround.

    Originally published at Michael Gray Graywolf’s SEO Blog

  • Google’s Algorithm Impact Over the Years in Graphic Detail

    SEOBook has posted a very interesting infographic from Jess.NET, about Google’s “Collateral damage” and “How the Evolving Algorithm Shapes the Web”.

    The infographic illustrates the story of Google’s rise to Internet power and the impact it has had on webmasters and publishers. While not covering every algorithm change over the years, it does a pretty good job of highlighting the major shifts in webmaster practices that have been largely influenced by Google.

    It wraps up with the Panda update and plays heavily on Demand Media’s content business model, which is still proving successful. It does emphasize, however, just how dependent on Google webmasters and publishers have become, and shows why it is in your best interest to diversify your traffic sources.

    Google’s Collateral Damage

    “Rather than using unobtrusive measurement, Google both measures & monetizes the publishing ecosystem,” says SEOBook. “Their most recent algorithmic update likely shifted over $1 billion in online ad revenues. Their editorial philosophy & ad programs have likely had more impact on the shape of the web than anything or anyone since Tim Berners-Lee created it.”

    “Some of the biggest problems in search (like content farms) were created by Google,” the site adds. “This image highlights how the search ecosystem has changed since Google has become a serious player, and how Google has used their amazing marketshare to bend the web to their will.”

    Google’s Panda update has been incredibly controversial for something that was intended to improve the quality of results for users. Many think that the results are indeed better now, while others are skeptical or flat out disagree. Either way, it has affected a great many sites – some deservedly so, and others more debatably.

  • Branding vs. Keyword Rich Domain Names – Which Does Google Prefer?

    When it comes to picking a domain name, most webmasters face a dilemma: choose a keyword-rich domain name that will get better rankings on Google, or a branded domain name that is more likely to attract visitors to the site and be easier to remember.

    In recent times, Google has been putting more emphasis on brand names over keyword-rich domains, which has led to the widespread presence of branded online businesses at top positions on the search result pages even when they did not meet all the SEO guidelines. Though Google has always denied such favoritism, its criteria for assessing the relevancy of any site are constantly changing to keep away spammers, gamers, and black-hat sites, and Google is now adopting more novel signals to help improve its page ranking system.

    A recent video from Google’s Matt Cutts explains the situation further. Matt emphasizes that Google is ready to experiment with mixing up the old and new signals in its algorithm and see the impact on how websites are evaluated.

    Google has been looking at using more robust signals like link authority, PageRank, and keyword-rich domains over traditional signals like on-page optimization, link anchor text, domain names, meta tags, and meta descriptions. However, we can’t forget about parameters like branding, social networking, reviews, and personalization – they seem to be getting more attention from Google in analyzing the credibility of a site.

    Aaron Wall from SEOBook sums up the current situation pretty well:

    Classical SEO signals (on-page optimization, link anchor text, domain names, etc.) have value up until a point, but if Google is going to keep mixing in more and more signals from other data sources then the value of any single signal drops.

    Have you adjusted your SEO strategies lately as a result of these new ranking factors? Feel free to share your thoughts below.

    Originally published on ineedhits

  • Has SEO Peaked?

    Richard J. Tofel at Nieman Lab posted an interesting article, saying that, “someday, the sun will set on SEO,” and that “the business of news will be better for it.”

    Will the sun set on SEO? Tell us what you think.

    To sum up a lengthy post (at least my interpretation of it), the point Tofel makes is that publishers are abusing search to get views (nothing new there), and the news industry is suffering for it, but with Google taking stronger action, SEO tactics might fall by the wayside.

    He does make some interesting points. For example, “The Huffington Post/AOL deal may mark something of a watershed in this progression,” he writes. “Much of the $300-million-plus in value HuffPo has built has been in playing very smartly by the SEO rules of the first decade of this century. But if it is true that most entrepreneurs sell out near the top, and it is, then perhaps we have just been sent a signal by one of its masters that the dark arts of SEO have peaked and that the century’s second decade will see them fade, perhaps into near nothingness by the third decade. In other words, it seems increasingly likely that, when the history of this era is written, SEO will turn out to have been a transitional phenomenon.”

    He also refers to Google’s recent crackdown (Panda update) on low quality content as a “small step in an inevitable direction, with the direction being the sunset of SEO.”

    Google’s Panda Update did make it pretty clear that quality content is more of a focus than ever before, as many sites felt its wrath. Google’s search results still have quality issues, and probably always will. It’s not perfect, and Google would no doubt be the first to acknowledge that. That’s why Google’s engineers always have their work cut out for them, keeping up with new tactics (and old ones that still work) employed by site-owners to get their content moved up in the rankings. Google has even said that its current algorithm can still be gamed. You can still optimize.

    Google, of course, encourages many white hat optimization tactics – those which help it index content more efficiently and provide a better user experience. It is the black hat and even “gray” hat stuff, I think, that Tofel is mostly referring to. He does say “the dark arts of SEO.”

    A tactic’s placement on the gray scale will vary depending on who you talk to, anyway. The Panda update should be considered a call to action not to rely totally on Google for web traffic. Smart site-owners have always known this, but sometimes (to quote Cinderella), “you don’t know what you got, ’til it’s gone.” That goes for web traffic.

    We’re seeing a large trend of publishers trying to put a great deal more emphasis on social channels to decrease their dependence on Google. Meanwhile, social channels are also becoming go-to sources for finding a lot of types of information.

    This is why getting social search right is so important. Google is trying, but so far failing. Don’t get me wrong, Google’s social search can be useful, and it is getting better, but as long as it doesn’t include Facebook data, it’s just not going to be as good as it otherwise could be – that is unless Facebook goes the way of MySpace, which is looking more and more unlikely at this point. Even then, however, it would still leave Google imperfect. There are still millions of people who use MySpace.

    This is working in Bing’s favor. Bing has been doing more with Facebook data, but it has the challenge of winning over users – a challenge that isn’t as difficult for Google at this point, and there are still plenty of areas where Google simply offers a good user experience. Social isn’t everything. But it is very important.

    This is also why Facebook itself could eventually make a huge splash in the search space, if it chose to do so. The social network seems to be taking on just about every other industry (now movie rental). Why not search? We discussed this in more detail here.

    Facebook Search Update Separates Categories

    Every time the “is SEO dead?” conversation comes up, which is fairly often, the general consensus is ordinarily along the lines of “no, it’s just changing.” I think that still holds true. Even as social grows more important, search will always have its place, because users will always need to find something, and their friends will not always have the answers. Even if some friend does have the answer, they’ll have to find it – sometimes without starting a new conversation.

    While the big brands battle with corporate politics and struggle to strike a perfect version of social search, other smaller companies are doing interesting things, building upon the big services we already use – companies like Greplin, Wajam, Backupify, etc.

    Right now, users have options to enhance their own search experiences. These options will continue to improve. It will be interesting to see if the big players can take these types of things mainstream, and make them the normal search experience for the average user.

    As far as SEO peaking, I’m going to go with the usual, “it’s just continuing to evolve” conclusion.

    What do you think? Comment here.

  • Should SEO, Web Hosting Firm Be Held Responsible For Counterfeit Site?

    Precedent-setting, or lone case? I’m sure SEO experts who have discovered this case are interested to see what becomes of it. A South Carolina judge has found the SEO firm Bright Builders Inc. responsible for damages done by a counterfeit golf club site.

    The judge ruled Bright Builders was guilty of contributory trademark infringement and other charges for providing marketing and web hosting services to the site. Bright Builders was ordered to pay $770,050 in statutory damages, while the site’s owner, Christopher Prince, was ordered to pay $28,250.

    Do you agree with the difference in damages owed? Let us know.

    Cleveland Golf was the company that filed the suit, and it originally targeted only Prince. When Cleveland’s lawyers discovered Bright Builders’ services, they decided to file suit against the firm as well. The argument presented by Cleveland’s lawyers was that Bright Builders was knowingly aware of the scam and continued providing its services.

    Christopher Finnerty, one of the lawyers for Cleveland Golf, said this of the ruling: “For Internet Intermediaries like SEOs and web hosts, this should be a cautionary warning,” he continues, “The jury found that web hosts and SEO’s cannot rely solely on third parties to police their web sites and provide actual notice of counterfeit sales from the brand owners. Even prior to notification from a third party, Internet intermediaries must be proactive to stop infringing sales when they knew or should have known that these illegal sales were occurring through one of the web sites they host.”

    Finnerty also stated that this was the first time a service provider has been found liable for infringement without being notified prior to the lawsuit. Being the first of its kind, it raises the question of whether or not this is a precedent-setting case.

    I’m not totally surprised Bright Builders was found guilty of contributing to trademark infringement, considering we have no idea how much knowledge of web hosting or SEO practices the court had, or developed during the case. What I find odd is the discrepancy in damages between Prince and Bright Builders. How could the firm be responsible for so much more than the actual owner of the site?

    Bright Builders has never had a sterling reputation. If you research the company on the BBB, you’ll find a rating of “C-” and 26 complaints filed against it. If you search for Bright Builders on Google, one of the top links directs to a website called scam.com.

    The case certainly leaves a worrisome feeling for SEO experts and firms. It has the potential to make practitioners more aware of the content they’re working with, and it opens up a debate among those interested.

    Should SEO and hosting services be held responsible for the tools they provide to counterfeit sites? If they have no knowledge of a site being counterfeit, is there a defense to be found? Considering how much an SEO service needs to know in order to be successful, it makes for a rousing debate.

    Let us know how you feel about this case in the comments.

    EDIT: To clarify, the verdict was reached by a jury and the judgment was handed down by the South Carolina judge.

    UPDATE: Stephen Gingrich, Vice President of Global Legal Enforcement and HR for Cleveland® Golf/Srixon, said this in a press release: “While individuals who sell counterfeits pose major problems for the manufacturer, companies like Bright Builders who can amplify the impact and scope of this problem are even more dangerous,” he continues, “Counterfeiting has existed for thousands of years but has been a localized issue. The Internet, ease of global shipping and payments, combined with SEO’s and web hosts injecting steroids into the situation, has brought the issue into every consumer’s living room.”

  • Decreasing Google Dependence: A Growing Trend

    John Citrone, editor at the online writing community Xomba.com, says Xomba saw Google’s Panda update coming and began preparing last summer, when it drew up a plan for an “algorithmic shift” from Google.

    “Around the first of the year, we began creating a new site design with new community networking features for people who want to express themselves in more than 140 characters,” he tells WebProNews. “Our new design will reduce or eliminate our dependence on Google to bring us traffic through its search results; our focus is to build a community of people who want to network with each other and share their experiences and their passions.”

    “Xomba’s approach to revenue sharing is similar to that of HubPages and the like, but we will no longer emphasize that aspect of writing at Xomba,” he says, referring to one of the sites that got hit hard by the Google Panda update. “We are, instead, changing the way people approach writing online.”

    “We were lucky, considering the broad-reaching impact the Google changes have meant for sites like ours,” Citrone tells us. “We have been preparing for this for quite a while, so though we may have experienced an immediate hit, we are confident that our site relaunch will not only put us where we were pre-algorithmic shift, but will also mean more independence for us in the future.”

    “In the past, Google was our primary source of traffic, but last summer we decided to make changes that would offer us more flexibility and independence,” he continues. “We knew this would be important if we wanted to remain viable in the long-term.”

    “Since we are moving toward building a community of users, we will capitalize on integrating with existing social networks like Facebook and Twitter,” he adds. “We have created a host of networking features within the site as well, to offer our users greater command over where their content is posted and how it is being viewed. We’re more interested in building a community similar to Twitter or Tumblr than competing with ‘low-quality content farms.’”

    While most content sites rely on Google or search in general for the bulk of their traffic, Xomba’s approach reflects a newer way of thinking throughout the web – that less dependence on search (and being less at the mercy of algorithms) is a better approach for a sustainable business. In other words, it’s best not to put all your eggs in one basket.

    Sites are likely to find that their best traffic comes from the channels they put the most effort into. If SEO is your game – and you really take it seriously – you probably get most of your traffic from search. If you ignore SEO, but spend endless hours improving your social strategy, you probably get more from social channels. Of course a good mix is ideal, but the point is, there are potential traffic sources besides Google.

    That said, Google is an incredible force on the web with its huge share of the search market. It’s hard to trump Google visibility, but people are spending a great deal of time these days using social channels, and compelling content is what people are sharing.

    “Last summer, we decided to push our users toward posting substantive content and promoting it through networking features,” says Citrone. “We even went so far as to change our posting rules to raise the ‘quality’ bar. We want users to post content they care about. Our position is that content that is well-put-together, content people inject with their own passion, will find an audience. It’s also something they can be proud of and promote independently, without relying solely on Google.”

    “All user-generated content sites could be considered content farms,” he says. “But let’s be clear: We don’t employ writers nor do we instruct our users on what to write about. The term ‘content farm’ has taken on a stigma that is unbecoming, and it certainly is not what we are about. We want people coming to Xomba because the content is strong, entertaining and substantive. If it hits high in search engines, great. If not, it’s still worthwhile. That’s what matters to us.”

    Sites most commonly lumped into the content farm category (the actual definition of the phrase is widely debated) are also going out of their way to improve quality. Demand Media, which is often the first company associated with this label in the public eye (though the company itself will have no part of it), has been particularly vocal about improving its quality – even long before the Panda update – and even long before its recent IPO.

    The fact of the matter is that most sites with large amounts of content have a wide spectrum of quality. Demand Media’s properties are included in that. Demand has been making high-profile deals with brands and celebrities to enhance the quality and perception of the content it offers. The company has also put an increasing amount of focus on social media – less dependence on Google.

    Demand Media is looking at adding some kind of “curation layer” to its content, which the company has said will be a way of “using (something like) Facebook” to give feedback on how helpful articles are. The company would then use that feedback to improve the quality of its content.

    HubPages has taken steps to improve its own site search – a good way to keep visitors from going right back to Google to continue searching for what they’re looking for.

    “We foresaw the Google changes and have been working hard preparing for them over the last year,” Citrone tells us. “This seems to have kept us in front of the curve. Specifically, over the last couple of months we’ve informed our users that we’re raising the standards for acceptable content.”

    “The new website is being built with a philosophy similar to websites like Twitter, but for people who want to write more than 140 characters,” he says. “There’s a high demand for this kind of community, especially now with the fall of ‘low-quality’ ‘content farms.’ Our success should not be determined by changes in any search engine algorithm, but by the acceptance and enthusiasm of our audience.”

    Xomba Redesigns Site, Hopes to Reduce Dependence on Google For Traffic

    Xomba is in its final design stages and is ready to launch its new design strategy next month. While we have yet to see how effective Xomba’s redesign itself will be, the philosophy behind it is dead on. Create a great user experience that makes people want to use your site. The best traffic is direct traffic. That’s the traffic that sticks around for a while and comes back after it’s gone. Never stop looking for ways to improve the user interface of your own site.

    As far as referral traffic, Google should not be ignored, but there is also a growing number of potential new sources as more ways to share become available – new social channels, new mobile apps, etc.

  • Google Panda Update: Lack of Consistency on Quality?

    We’ve been talking with a lot of people who have had their sites impacted negatively by Google’s recent “Panda” algorithm update. Our thinking is that the more sides of the story we hear, the more webmasters and content producers will be able to learn from it. With that, we had a conversation with Paul Edmondson, CEO of HubPages, which made the list of the hardest-hit sites.

    Have you gained additional insight into the Panda Update as time has progressed? Share in the comments.

    HubPages, which launched in 2006 as a social content community for writers to write “magazine-like articles,” shares 60% of ad impressions with its writers. You may find how-to articles not unlike those you would find at eHow, but also content in a variety of other styles. Authors on HubPages publish nearly 3,000 “Hubs,” over 7,000 comments, and thousands of questions, answers, and forum posts a day, according to Edmondson. Last year, there were over 13,000 total incremental pieces of content a day, he said.

    When directly asked if HubPages is a content farm, Edmondson told us, “Actually, HubPages is to articles what YouTube is to video. Like YouTube where enthusiasts post videos of their choice, our community write articles about whatever they wish and are passionate about. This covers a wide range of content from poetry to recipes, and pretty much everything in between. Writers choose what they write about, and they own their content.  In return, they stand behind the content, build readership and interact within the HubPages community.”

    Paul Edmondson of HubPages Talks Quality

    On where HubPages stands out compared to sites from Demand Media, Associated Content, and others, Edmondson said, “First, we think authors rule! We align our interests with our authors – and this is key to our long-term success. At HubPages, authors choose what they write about and they own the content they write. We also share impressions with the author for as long as the content is published on HubPages. Our incentives are aligned with the authors’ needs. At its core, HubPages is a passionate community of writers. The value that is created goes well beyond the revenue opportunity,” he said. “To have a truly healthy ecosystem, writers need social interaction, feedback and praise. To this end, HubPages has offered the ability to fan authors – which turned out to be commonly known as ‘follow’ in 2006. Some authors have thousands of followers for everything they publish. We also developed an accolade system to give back positive feedback and encouragement. When these items are combined, HubPages is revealed to be a very unique collection of people, with authentic voices, sharing their knowledge with the world.”

    When asked how frequently HubPages articles rank among the top results for searches in Google, Edmondson said, “Some of our content ranks very high, and some isn’t in the index at all. We don’t have the data to answer this question for every piece of content and the potential search terms that a Hub could rank for in the search results.”

    When asked if articles are written specifically for search, he said, “It’s up to our writers. We let them choose what they wish to write within our editorial policies. We offer tools and education for our users to become better online writers – this includes – among a vast array of things – best practices for search.”

    “SEO has to be an important part of any publisher’s traffic sources,” he said. “We make SEO tools available to our writers; it’s up to them whether they want to use them or not. We are very sensitive around not abusing search engine practices and will take down articles that are obviously trying to game the system.”

    We had a separate conversation with a HubPages writer, who for this article will just go by the name Chuck. He is a college economics instructor. Chuck tells us,  “HubPages is a very good site for earning money.  This has been both my experience as well as that of other writers on the site who have published Hubs on their earnings experience.”

    Chuck says the freedom to write on whatever topics a writer wants to write about is one of the main things that makes HubPage attractive. “This not only allows writers, including me, to focus on topics that interest or excite us but also gives us freedom to explore new areas of writing.  I find that this freedom offers me an opportunity to challenge myself as well as the opportunity to broaden the range of my writing.  It also lets me test the money making potential of other areas outside my immediate area of interest and expertise.”

    Chuck says the HubPages team is constantly updating the site by regularly giving writers new tools to work with. “These changes and updates also include continual updating of the look and feel of the site which keeps it fresh and new-looking for visitors,” he says. “HubPages has an excellent training area on the site which enables both new and existing users to learn how to use the various tools as well as allowing all of us to keep our skills current in other business aspects of the site.”

    “The Professionalism and quality of the site attracts good writers which in turn attracts increasing numbers of viewers which benefits all of us,” he continues. “The HubPages team puts a strong emphasis on marketing which not only continually brings in new viewers and writers, but also enables the team to keep writers informed about current reading tastes and habits of visitors.”

    “HubPages is also a good social networking site,” he adds. “It offers the opportunity to meet and interact with others from around the world both through exchanges on Hubs as well as in the Forums. In addition to learning from the writings of others and comments left on my Hubs and those of others that I read, I have met and been in direct contact with a few people, both Hubbers and visitors, which have resulted in some mutually beneficial exchanges of information. These exchanges have included my receiving some photographs from fellow Hubber Ralph Deeds which I was able to use on 2 of my Hubs. On my Hub about Mathew Juan I was contacted by a visitor who not only provided me with additional information for my research but I was also able to provide him with some information which he used to update his website. I received a nice email from a local artist whose work I wrote about on my Hub about Public Art who requested permission to use my photos in his advertising. I have had other, similar exchanges, one of which I am still following up on and which might lead to some paid writing assignments.”

    Chuck says he’s never written for Demand Media, Associated Content, or Suite101 (all of which have seen some impact from the Panda update – some more than others). “However, I have done some writing for TheInfomine.com, MyGeoInfo.com, Xomba.com and SheToldMe.com,” Chuck says. “ Most of my writing on all of these have consisted of articles related to Hubs I have written (but not copies) with links back to my related Hub article. I have collected a few dollars in Google Adsense money from these but am reconsidering keeping my AdSense code on them as I believe it was Jimmy The Jock in his piece on Success Stories who said that he found he made more Adsense earnings by not having ads on outside sites as more of the readers then tended to go to his Hubs rather than wandering off to an ad on the other site.  His experience showed him that bringing the people to HubPages generated more ad clicks.”

    Last week, HubPages launched a new ad platform. “Changing the long-held equation between advertisers and individual online writers, HubPages is launching its HubPages Ad Program that will give its writers access to the premium ad rates that so far have been restricted to giant publishers,” Edmondson said. “This offering is the first time that any online writer will be able to access significant advertising revenues, as available via premium advertising networks and direct sales, while retaining all rights to their own content. Individual writers have always been considered too small to be worth advertiser attention and the agency model wasn’t built to work with millions of content producers. While the democratization of content has occurred, the earning power has not been available until now. HubPages is leveraging its size and scale as a top 50 site (Quantcast) to negotiate better ad money on behalf of our writers.”

    Edmondson addressed the Panda update on the HubPages blog recently, saying that they hadn’t seen it consistently drive traffic to better-quality Hubs. “On one hand, some of our best content has seen a drop in traffic; simultaneously, we have seen traffic rise on Hubs that are just as great,” he wrote. “We are taking this seriously — behind the scenes, we have been crunching data and focusing on making sure that we are doing everything right from our side. We have an editorial policy and internal system that rewards original useful content, and this aligns with what Google wants, too.”

    “We have several internal quality metrics that make up HubScore and we have deeply analyzed things like content length, view duration, Hub Hopper ratings, and HubScore,” he added. “These elements have been compared to changes in Google referrals, and again, based on the way we rate content quality, the fluctuation so far looks random at this stage of the update. We believe that a change of this size will take a settling-in period. We have reached out to Google and will continue to study the update.”
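
    Edmondson doesn’t share the underlying data, but the kind of check he describes (comparing an internal quality score against changes in Google referrals) is easy to sketch. The snippet below is a minimal, hypothetical illustration; the scores and traffic deltas are made up, and this is not HubPages’ actual analysis. A correlation close to zero would match the “looks random” observation.

    ```python
    # Hypothetical sketch: does an internal quality score correlate with the
    # change in Google referral traffic after an algorithm update?
    from statistics import correlation  # Python 3.10+

    hubs = [
        # (internal quality score, % change in Google referrals post-update)
        (92, -35.0),
        (88, 12.0),
        (71, -18.0),
        (65, -42.0),
        (59, 5.0),
    ]

    scores = [score for score, _ in hubs]
    deltas = [delta for _, delta in hubs]

    r = correlation(scores, deltas)  # Pearson's r
    print(f"Pearson correlation: {r:.2f}")
    # r near 0 suggests the fluctuation is unrelated to the quality score;
    # a strongly positive r would mean higher-rated content held up better.
    ```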

    Like any other site that has user-generated content or a massive amount of articles, it stands to reason that there is a mix of both good and bad quality content on HubPages – not unlike YouTube. It’s how Google ranks the content in search results that ultimately matters to users of the search engine, which at this time accounts for the majority of Internet users. HubPages’ quality metrics are probably not identical to Google’s quality metrics, but it’s interesting that Edmondson thinks some of the site’s best content was negatively impacted. My guess is that HubPages is not alone in this.

    We’ve still seen examples in the wild, where Google continues to rank less authoritative content over more authoritative results. We’ve referred to the “level 4 brain cancer” example several times, which continues to show an eHow article as the top result over actual experts in the brain cancer field. In fact, one of our own articles is even showing up on the first page now (presumably from having referenced it a few times). While we’re flattered that Google would consider us enough of an authority on the subject, I think users would still prefer to see more useful advice from a medical standpoint.

    Another interesting side story to this whole Google search quality thing is that Google has a patent application out for essentially what Demand Media does – suggesting topics for people to write about based on search. Are we going to see Knol results “filling in the gaps”? More on that here.

    Many of the sites hit hardest by the Panda update are trying to find ways to become less dependent on Google. It’s wise not to be too dependent on any one traffic source anyway, but the Panda update has really driven this point home. HubPages has taken some time to improve its own internal search. As Mike Moran said in a recent article, this is a good way to keep from driving your visitors back to Google to find what they’re looking for.

    For additional insight:  Google “Panda” Algorithm Update – What’s Known & What’s Possible

    Thoughts on Google’s search quality post-Panda? Share in the comments.

  • Blekko and BrightEdge Partner on Fighting Brand-Based Search Spam

    Blekko has partnered with SEO platform BrightEdge to fight search spam. Blekko is giving BrightEdge access to real-time link graph information, so companies can verify that their own links are legitimate, as well as see whether sites ranking ahead of them are using black hat or otherwise unethical tactics to game search engine rankings.
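
    Neither company has published the technical details of the integration, but the basic check is easy to picture: pull a site’s inbound links from a link-graph source and see how many come from known or suspected paid-link domains. The sketch below is purely illustrative; the domains, data structures, and spam list are hypothetical, and this is not BrightEdge’s or Blekko’s actual API.

    ```python
    # Hypothetical sketch: estimate what share of a site's inbound links
    # come from suspected paid-link or spam domains, given link-graph data.
    from urllib.parse import urlparse

    # Pretend this came from a real-time link-graph feed (domains are made up).
    inbound_links = {
        "competitor-a.example": [
            "http://cheap-paid-links.example/dir/page1",
            "http://news-site.example/coverage",
            "http://link-farm-network.example/row/42",
        ],
        "our-brand.example": [
            "http://industry-blog.example/review",
            "http://news-site.example/roundup",
        ],
    }

    # Hypothetical list of suspected spam / paid-link domains.
    suspect_domains = {"cheap-paid-links.example", "link-farm-network.example"}

    def suspect_link_share(links, suspects):
        """Fraction of inbound links whose source domain is on the suspect list."""
        if not links:
            return 0.0
        flagged = sum(1 for link in links
                      if urlparse(link).netloc.lower() in suspects)
        return flagged / len(links)

    for site, links in inbound_links.items():
        print(f"{site}: {suspect_link_share(links, suspect_domains):.0%} suspect links")
    # competitor-a.example: 67% suspect links
    # our-brand.example: 0% suspect links
    ```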

    “This is an unprecedented partnership between such companies in an effort to break down the search black box so that brands can rise above all the clutter on the Web,” a representative for Blekko tells WebProNews. “As you know, blekko’s mission has been to clean up the swampy Web since day one, and this partnership marks the next step.”

    Brands will be able to see how spam creators are manipulating paid links and using black hat strategies to take advantage of brand searches.

    “Next to consumers, the biggest losers in the current state of search are legitimate brands that are being outranked by spam,” said Blekko CEO Rich Skrenta. “But too often legitimate companies encounter the unknown when trying to figure out how to get their information in front of consumers who are searching for it. This is a continued step in our efforts to bring transparency to the SEO process.”

    “This unprecedented partnership enables everyone from the CMO to the SEO manager to have confidence that the SEO being practiced around their brand delivers results in a responsible manner,” added BrightEdge CEO Jim Yu. “Through this partnership, any brand on the web can instantly monitor its own and any competitor’s SEO practices on an almost real-time basis.”

    Financial terms of the partnership have not been disclosed.

    A couple weeks ago, Blekko launched its AdSpam algorithm, flagging over a million domains as spam. This is in addition to other hand-picked domain blocking of content farm-like sites.

    Blekko also lets users block sites from their search results themselves – a practice that Google has just recently adopted, and one that screams at content producers to maintain a high level of quality.

    The fact is that you now not only have to worry about how Google is going to rank your content based on quality – which it is also greatly focused on – but also about putting out good enough content that people won’t block your site from future searches.

    Google’s Personal Blocklist Chrome extension, which the company released before actually integrating this functionality into the search results themselves, is being installed 11,464 times a week. You have to wonder how frequently domains are being blocked by searchers.

  • Google Domain-Blocking Chrome Extension Installed 11,464 Times a Week

    There are 129,004 people using the Personal Blocklist Chrome extension from Google. This is the browser extension that the company launched on February 14, ahead of the Panda Update, which lets users simply block domains from ever appearing in their Google search results.

    According to the extension page, it’s been getting 11,464 weekly installs. For an extension that has been around for just over a month and is available for a single browser, it’s interesting that this many people are using it and blocking domains from their search results.
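
    Conceptually, the extension is doing something very simple: keeping a per-user list of blocked domains and dropping any result whose domain is on that list before the results are displayed. The snippet below is only a rough sketch of that filtering idea, with made-up domains; it is not Google’s actual implementation.

    ```python
    # Rough sketch of domain blocking: filter results against a user's blocklist.
    from urllib.parse import urlparse

    blocked_domains = {"thin-howto.example", "scraper-site.example"}  # hypothetical

    search_results = [
        {"title": "Useful expert answer", "url": "http://expert-site.example/answer"},
        {"title": "Thin rewrite",         "url": "http://scraper-site.example/page"},
    ]

    def filter_blocked(results, blocklist):
        """Drop results whose domain appears in the user's blocklist."""
        return [r for r in results
                if urlparse(r["url"]).netloc.lower() not in blocklist]

    for result in filter_blocked(search_results, blocked_domains):
        print(result["title"], "-", result["url"])
    # Only the expert-site.example result is shown.
    ```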

    Personal Blocklist Chrome Extension from Google

    I wonder how many of those who have installed the extension are actively blocking domains. When Google unleashed the Panda update shortly after it released the extension, the company said:

    It’s worth noting that this update does not rely on the feedback we’ve received from the Personal Blocklist Chrome extension, which we launched last week.

    However, we did compare the Blocklist data we gathered with the sites identified by our algorithm, and we were very pleased that the preferences our users expressed by using the extension are well represented. If you take the top several dozen or so most-blocked domains from the Chrome extension, then this algorithmic change addresses 84% of them, which is strong independent confirmation of the user benefits.

    Google was apparently so thrilled with the response to the Chrome extension that they later announced they were incorporating similar functionality into the Google experience itself – no extension needed. This feature has not finished rolling out yet, but you have to wonder – if this many people have been taking advantage just through a Chrome extension, how much domain-blocking is going to go on once it’s just a click away for every Google user? Update: A Google spokesperson tells WebProNews: “The feature is fully rolled out on google.com in English for people using Chrome 9+, IE8+ and Firefox 3.5+, and we’ll be expanding to new regions, languages and browsers soon.”

    Block Domains in Google Results

    If this isn’t a call for quality content, I don’t know what is. Tom Blue made an excellent point in the comments of our previous article on the feature:

    This change is 5X more helpful than the Panda update. The Panda update is trying to make a judgement call on what people should see, this allows people to choose what they want to see. I believe they shouldn’t even use these results in future algos as it can be gamed and only certain people will use this feature.

    Google did say it would consider making data from domain blocking a ranking signal down the line. Upon the release of the Chrome extension, Matt Cutts said, “If installed, the extension also sends blocked site information to Google, and we will study the resulting feedback and explore using it as a potential ranking signal for our search results.”

    Then, when the feature was announced for Google results themselves (the Chrome-less version), Google said, “While we’re not currently using the domains people block as a signal in ranking, we’ll look at the data and see whether it would be useful as we continue to evaluate and improve our search results in the future.”

    One has to wonder how Google would use such a signal in a way that could not be gamed by people getting their competitors’ sites blocked. There are other potential abuse scenarios as well. Personal spite comes to mind.

    It’s possible that it could become a ranking signal and nobody will ever know for sure. Google will not reveal its entire list of signals. They may keep this one close to the chest, although that doesn’t mean it won’t still get abused based on hunches.

    Keep in mind, you do have to visit a link before the block option appears. Still, sites will do well to pay closer attention than ever before to the quality of the content they’re putting out. This really should benefit the web as a whole in the long run. Plus, quality content tends to get shared more anyway.

  • Google Search News You Should Know About

    Probably the biggest piece of search news today is Google’s launch of a domain-blocking feature. We discussed this more here, but essentially, Google has taken the functionality of the Chrome extension it recently released and turned it into part of the search interface. People will now be able to have more control over the quality of the results they see for search queries.
    Block Domains in Google Results

    An interesting tidbit out of SMX West – Barry Schwartz reports: “Google and Bing admitted publicly to having ‘exception lists’ for sites that were hit by algorithms that should not have been hit…Matt Cutts explained that there is no global whitelist but for some algorithms that have a negative impact on a site in Google’s search results, Google may make an exception for individual sites.”

    One would think that this could explain Cult of Mac regaining its rankings after being hit by the Panda update, but Google recently said, “This is an algorithmic change and it doesn’t have any manual exceptions applied to it…”

    Google Operating System reports that Google is testing a feature that shows word counts next to certain search results. Such a feature would complement the search engine’s current instant previews feature, which lets users get an idea of what they’re getting ready to click through to. Instant Previews, by the way, were launched for mobile this week.

    Google added click-to-call technology for emergency-related mobile search results, which applies to things like poison control, suicide hotlines, and common emergency numbers.

    U.S. Senator Herb Kohl (D-WI), Chairman of the Senate Subcommittee on Antitrust, Competition Policy, and Consumer Rights, has announced an active agenda for the new session of Congress. The section on Competition in Online Markets/Internet Search Issues says:

    Access to the wealth of information and e-commerce on the Internet is essential for consumers and business alike. As the Internet continues to grow in importance to the national economy, businesses and consumers, the Subcommittee will strive to ensure that this sector remains competitive, that Internet search is fair to its users and customers, advertisers have sufficient choices, and that consumers’ privacy is guarded. In recent years, the dominance over Internet search of the world’s largest search engine, Google, has increased and Google has increasingly sought to acquire e-commerce sites in myriad businesses. In this regard, we will closely examine allegations raised by e-commerce websites that compete with Google that they are being treated unfairly in search ranking, and in their ability to purchase search advertising. We also will continue to closely examine the impact of further acquisitions in this sector.

  • Adobe to Help Search Marketers Keep Paid Campaigns From Cannibalizing Organic SEO

    Back in the fall of 2009, Adobe announced it was acquiring web analytics software provider Omniture. As a result of that acquisition, Adobe runs SearchCenter+, a paid search campaign management tool. 

    Adobe announced today that SearchCenter+ now allows users to integrate data from both paid search campaigns and organic SEO. 

    "Our customers derive significant value from our ability to serve as the hub to gather, analyze and take action on their valuable marketing data,” said John Mellor, VP, strategy and business development at Adobe’s  Omniture Business Unit. "Customers use SearchCenter+ to collectively manage more than $1 billion of the global paid search spend. Adobe now helps search marketers ensure that their paid search initiatives do not compete with or cannibalize the search volume they’re already receiving from natural search.”

    Adobe SearchCenter

    Users can measure paid and organic search rankings, onsite engagement, or conversions in one report, use performance of natural search to adjust their paid search bids, gain insight to help in making keyword bidding decisions, etc.
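
    Adobe hasn’t spelled out the logic behind the feature, but the cannibalization idea itself is easy to illustrate: if a keyword already ranks well organically, aggressive paid bidding on that same keyword buys traffic you might have received anyway. The sketch below is a hypothetical rule of thumb for scaling bids back on strong organic keywords, not the SearchCenter+ algorithm.

    ```python
    # Hypothetical rule of thumb: scale back paid bids on keywords that already
    # perform well organically, to limit paid/organic cannibalization.
    keywords = [
        # keyword, organic rank (1 = top result), current paid bid in dollars
        {"keyword": "golf clubs",        "organic_rank": 1,  "bid": 2.50},
        {"keyword": "custom golf clubs", "organic_rank": 8,  "bid": 1.80},
        {"keyword": "golf club repair",  "organic_rank": 35, "bid": 0.90},
    ]

    def adjust_bid(organic_rank, bid):
        """Reduce the bid where organic visibility is already strong (illustrative only)."""
        if organic_rank <= 3:    # top organic result: pull back hard
            return round(bid * 0.5, 2)
        if organic_rank <= 10:   # first-page organic result: modest pullback
            return round(bid * 0.8, 2)
        return bid               # weak organic presence: leave the paid bid alone

    for kw in keywords:
        new_bid = adjust_bid(kw["organic_rank"], kw["bid"])
        print(f'{kw["keyword"]}: ${kw["bid"]:.2f} -> ${new_bid:.2f}')
    ```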

    The new version of SearchCenter+ is currently in beta, and is expected to become generally available in the second quarter.

    The Omniture acquisition has been big in terms of making Adobe a bigger deal for search marketers. Late last year, the company also introduced a tool for site search, aimed at helping marketers anticipate visitor search intent and promote relevant products and content.