WebProNews

Tag: SEO

  • Google Panda Update: Antitrust Connections Being Tossed Around

    Ah, the Google Panda update. The search story of the year. It just won’t leave the news, will it?

    As recently reported, Google could be facing a massive fine over antitrust complaints in Europe. The European Commission is expected to issue Google a 400-page document talking about its alleged “abuse of dominance.”

    While it may be a bit of a stretch in my opinion, a new Guardian article draws a connection between this and the Panda update. The piece talks about an Irritable Bowel Syndrome site that was hit by Panda back in February, and then jumps to:

    Any day now, the European commission is expected to announce whether it will formally object to what some see as Google’s abuse of its power in the way that it treats smaller sites that offer the same sorts of services as it does.

    If that happens, Google could be forced to comply with strictures on the way it treats rival sites offering particular sorts of search – for news, products, maps, shopping, images or videos – rather than pushing its own on the site. Alternatively, it could face fines of millions of pounds.

    The piece goes on to quote Adam Raff, co-founder of Foundem, a vertical search engine that has filed a complaint against Google:

    “Panda is a collection of disparate updates,” Raff says, adding that though Panda was widely touted as an attack on content farms, “it also marks an aggressive escalation of Google’s war on rival vertical search services. First, vertical search services are in many ways the polar opposite of content farms” – because they link to multiple different sites, rather than containing content on one site.

    “Panda wasn’t just deployed in the midst of these investigations; we suggest that it was deployed in direct response to them. By bundling these diametrically opposed updates together, the ‘content farm’ elements could be viewed as providing cover for the vertical search targeted elements.”

    I guess that’s one point of view.

    While on the topic of Panda, a WebmasterWorld member (Content_ed) has an interesting story up about moving his good content from a Panda-hit site to a site that was actually boosted by Panda.

    “I moved a half dozen pages that were drawing a few hundred visitors a day from Google on my Pandalized (down 80%) site to my Panda pleased (up over 300%) site this weekend,” he says. “It took a little over 24 hours for Google to start indexing the pages on the new site so I’m not sure if Monday results represent a full day. Of the half dozen pages, three were slightly above their pre-Panda level (year-over-year) on Monday, and three were around 20% under. The average Google traffic for the six pages Monday was around 250 visitors each.”

    It’s something to consider, given HubPages’ strategy of subdomaining to separate out the better stuff.

    WebmasterWorld moderator Tedster had the following response to Content_ed’s post: “A lot may depend on the number of pages that each site contains. If you moved half a dozen pages to a domain that contains hundreds or thousands of other pages, you may see no changes with future Panda iterations. There’s also a chance, since Panda has a site-wide influence, that these pages were not the source of the Panda problem on their original domain. In which case, you made an excellent move.”

    With that, I’ll leave you with an infographic about Panda from Cognitive SEO:

  • Google Algorithm Updates: The Latest Things To Consider

    Google has been making a big deal about wanting to be more transparent about its search algorithm lately (without revealing the secret sauce too much of course). And so far, I have to say they’re making good on that promise fairly well.

    Is Google being transparent enough for your liking? Let us know in the comments.

    We’ve seen plenty of algorithmic announcements made from the company over the course of the year. In November, they discussed ten recent changes they had made. Here’s a recap of those:

    • Cross-language information retrieval updates: For queries in languages where limited web content is available (Afrikaans, Malay, Slovak, Swahili, Hindi, Norwegian, Serbian, Catalan, Maltese, Macedonian, Albanian, Slovenian, Welsh, Icelandic), we will now translate relevant English web pages and display the translated titles directly below the English titles in the search results. This feature was available previously in Korean, but only at the bottom of the page. Clicking on the translated titles will take you to pages translated from English into the query language.
    • Snippets with more page content and less header/menu content: This change helps us choose more relevant text to use in snippets. As we improve our understanding of web page structure, we are now more likely to pick text from the actual page content, and less likely to use text that is part of a header or menu.
    • Better page titles in search results by de-duplicating boilerplate anchors: We look at a number of signals when generating a page’s title. One signal is the anchor text in links pointing to the page. We found that boilerplate links with duplicated anchor text are not as relevant, so we are putting less emphasis on these. The result is more relevant titles that are specific to the page’s content.
    • Length-based autocomplete predictions in Russian: This improvement reduces the number of long, sometimes arbitrary query predictions in Russian. We will not make predictions that are very long in comparison either to the partial query or to the other predictions for that partial query. This is already our practice in English.
    • Extending application rich snippets: We recently announced rich snippets for applications. This enables people who are searching for software applications to see details, like cost and user reviews, within their search results. This change extends the coverage of application rich snippets, so they will be available more often.
    • Retiring a signal in Image search: As the web evolves, we often revisit signals that we launched in the past that no longer appear to have a significant impact. In this case, we decided to retire a signal in Image Search related to images that had references from multiple documents on the web.
    • Fresher, more recent results: As we announced just over a week ago, we’ve made a significant improvement to how we rank fresh content. This change impacts roughly 35 percent of total searches (around 6-10% of search results to a noticeable degree) and better determines the appropriate level of freshness for a given query.
    • Refining official page detection: We try hard to give our users the most relevant and authoritative results. With this change, we adjusted how we attempt to determine which pages are official. This will tend to rank official websites even higher in our ranking.
    • Improvements to date-restricted queries: We changed how we handle result freshness for queries where a user has chosen a specific date range. This helps ensure that users get the results that are most relevant for the date range that they specify.
    • Prediction fix for IME queries: This change improves how Autocomplete handles IME queries (queries which contain non-Latin characters). Autocomplete was previously storing the intermediate keystrokes needed to type each character, which would sometimes result in gibberish predictions for Hebrew, Russian and Arabic.

    Now, they’ve put out a similar post on the Inside Search Blog, revealing ten more changes that have been made since that post.

    We just announced another ten algorithmic changes we’ve made! Read more here: http://t.co/VYIow0z8

    Google lists them as follows:

    • Related query results refinements: Sometimes we fetch results for queries that are similar to the actual search you type. This change makes it less likely that these results will rank highly if the original query had a rare word that was dropped in the alternate query. For example, if you are searching for [rare red widgets], you might not be as interested in a page that only mentions “red widgets.”
    • More comprehensive indexing: This change makes more long-tail documents available in our index, so they are more likely to rank for relevant queries.
    • New “parked domain” classifier: This is a new algorithm for automatically detecting parked domains. Parked domains are placeholder sites that are seldom useful and often filled with ads. They typically don’t have valuable content for our users, so in most cases we prefer not to show them.
    • More autocomplete predictions: With autocomplete, we try to strike a balance between coming up with flexible predictions and remaining true to your intentions. This change makes our prediction algorithm a little more flexible for certain queries, without losing your original intention.
    • Fresher and more complete blog search results: We made a change to our blog search index to get coverage that is both fresher and more comprehensive.
    • Original content: We added new signals to help us make better predictions about which of two similar web pages is the original one.
    • Live results for Major League Soccer and the Canadian Football League: This change displays the latest scores & schedules from these leagues along with quick access to game recaps and box scores.
    • Image result freshness: We made a change to how we determine image freshness for news queries. This will help us find the freshest images more often.
    • Layout on tablets: We made some minor color and layout changes to improve usability on tablet devices.
    • Top result selection code rewrite: This code handles extra processing on the top set of results. For example, it ensures that we don’t show too many results from one site (“host crowding”). We rewrote the code to make it easier to understand, simpler to maintain and more flexible for future extensions.

    Seeing just these 20 tweaks listed all together, as changes made in just the past month or so, really puts into perspective how much Google is adjusting the algorithm. That doesn’t even include the integration of Flight Search results announced after these updates.

    Google also points to the recently launched Verbatim tool, the updated search app for the iPad and the new Google bar as other recent changes to be aware of regarding Google search.

    Google says all the time that it makes over 500 changes to its algorithm each year, and that it has over 200 signals it uses to rank results. There is always a possibility that one of these changes or signals can have a major impact on your site, as many have found out this past year with the Panda update.

    Even a huge company like Yahoo is at the mercy of Google’s algorithm when it comes to search visibility, and they just finally made some big adjustments with Associated Content, not unlike what Demand Media has done this year.

    Last month, Google also indicated that it is testing algorithm changes that will look more at what appears above the fold of a webpage.

    We’re getting close to a new year, and there’s no reason to expect Google’s changes to slow down. Google has been clear, however, that it aims to be more transparent about when these changes occur, and what those changes are. Granted, this transparency will only go so far, because Google will not make all of its signals known and leave its results too open to gaming. That wouldn’t be good for anybody (except maybe Google’s competitors).

    Google does say that these lists of algorithm changes are now a monthly series.

    What do you think about the latest changes? Good or bad? Let us know in the comments.

  • Google Has “Exciting” Scraper Related Stuff in the Pipeline

    Google Webmaster Central tweeted out a new Matt Cutts video today, discussing the Panda update and scrapers. The specific question addressed is: “I understand that the recent farmer update (Panda) gives a penalty for poor content. Given the penalty, scrapers have been outranking original sites. Should webmasters spend time in fighting scrapers directly or work on the poor content?”

    “My advice would be to really concentrate on the quality of your own site,” says Cutts. “It is the case that sometimes scrapers can be returned in Google search results, despite our best efforts. And it is the case that sometimes you see scrapers more or less often, but it’s also the case that Google has been working on trying to find and fix the problems related to scrapers.”

    “We’ve got engineers working on that,” he says. “They’re going to keep working on that. We’ve actually got some good stuff in the pipeline that I’m pretty excited about.”

    Now, it’s worth noting that the actual upload date of this video is August 8, though it was not released as a new video until today.

    New webmaster video: Should I spend more time on improving my content or on fighting scrapers? http://t.co/naZeRPGW

    On August 26, Cutts tweeted:

    Scrapers getting you down? Tell us about blog scrapers you see: http://t.co/6HPhROS We need datapoints for testing.

    “So we’ll keep working on the scraper side of things,” Cutts says in the video. “My advice for people who may have been affected by the ‘Farmer’ or the ‘Panda’ update is to concentrate primarily more on the quality side – the content side – thinking about how can you either improve the quality of the content if there’s some part of your site that’s got especially low quality content or stuff that was really not all that useful, then it might make sense to not have that content on your site. Things along those lines.”

    See eHow’s strategy.

    “So if you think about it, the fundamental problem, if you’re affected by this particular algorithm update, is that Google is thinking that your site is not providing as high quality content as some other sites,” he says. “So the best thing to do is to concentrate on the root of the issue. To concentrate on trying to make sure that you have the highest quality content so that Google sees that and can assess that, and then you don’t have to worry nearly as much about the scrapers, because you’re doing much better.”

  • Is Google+ Imperative For SEO?

    We all know that Google has over 200 signals it uses when ranking content, but you know what Google loves to rank well when it’s relevant? Google stuff.

    Should Google rank its own properties over others’ sites? Tell us what you think.

    While there have been plenty of complaints and much regulatory scrutiny of this topic, that’s a discussion for another article. The fact remains that Google gives businesses and sites plenty of tools and resources with which you can use Google’s own search results to your advantage.

    One example of this would be Google Places. It’s simply a great tool for consumers to find local businesses. How often do you use the phone book these days?

    Another example would be YouTube. It may be hard to get your site to rank for certain keywords, but Google loves to put video results on page one when relevant, and Google just so happens to own the world’s biggest video site. Even if this means they’re technically ranking their own stuff for visibility, you can benefit from this by using YouTube and videos to promote your business.

    I’m not going to go through all of these examples, but suffice it to say, you should be trying to be found in Google’s various other search engines, besides straight up Google Web Search. These can in turn get you found in results from regular searches via universal search.

    Google+ has introduced a whole new realm of SEO possibilities based on getting found via Google’s own properties.

    For one, Google ranks Google+ posts in search results, and they often appear on the first page.

    The +1 button obviously helps your search visibility cause. Google made it clear from the beginning that this would be a search signal. If enough people like your content enough to give it a +1, it must be good, right? Why not bump it up in the rankings?

    Direct Connect lets consumers find your brand’s page pretty easily from the Google search box.

    Google likes “freshness” now. Stuff that is recent can appear higher in rankings these days. Google+ updates tend to be recent, and can be very rich in content, depending on how much you put into them. They can also spark conversations and sharing, which should all help your cause.

    Google+ is also what Google wants to replace Twitter with in realtime search. You can bet that this will come back sooner or later. Google is failing its mission in search without it.

    Google+ now has trending topics. It will hardly be a surprise if this finds its way more into the Google Search experience in one way or another.

    You can optimize your Google+ Profile itself. Kristi Hines at Search Engine Watch has a great article on this. This image she shares drives the point home.

    Google Plus Profile For SEO

    It’s become quite clear that your Google profile matters tremendously in Google search now. Watch this video from Matt Cutts last year, well before the launch of Google+ and the loss of the Twitter firehose.

    “We’re also trying to figure out a little bit about the reputation of an author or creator on Twitter or Facebook,” he says.

    Of course, now things have evolved a bit. Enter Google+ – a direct way for Google to get access to YOU.

    Also enter authorship markup. Google has been pushing this as a way for content authors to send Google direct signals that content is indeed connected to them specifically. Now, we see author pictures appearing next to Google search results on page one all the time. And guess what those link to. The Google+ profile. This is why in an article a while back, we called your Google profile the next ranking signal.

    Here are some clips from Google instructing you how to implement authorship markup:

    By the way, Google also recently added circle counts and comments to authorship.

    Google+ Pages are of course available now for businesses. If you haven’t started using them, what are you waiting for? And don’t be like McDonald’s and set one up only to not add any content. It’s not worked so well for them so far (although I doubt McDonald’s is too hard up for search traffic).

    And don’t forget about the analytical benefits you can get from Google+.

    At BlogWorld a couple of weeks ago, Chris Brogan said, “Google doesn’t index all of Facebook right now. It’s a lost cause for SEO, they’re also no longer indexing Twitter. Google does index anything publicly for Google+.”

    One thing you will do well to keep in mind is that “Google+ is Google.” This is a mentality that Google has expressed on more than one occasion. If you’re ignoring Google+, you’re ignoring Google. Remember, if you want Google to RESPECT you (give you more search visibility), you should probably respect Google and the means you’re given. Google+ is only going to get more integrated with every aspect of Google.

    Have you had an impact in your search visibility since using Google+? Let us know.

  • Is Google Giving Your Site RESPECT?

    A lot of people feel that Google is treating them unfairly when it comes to search rankings. If you are one of these people, let me be perfectly blunt. There’s a good chance this is your fault. You have to play by Google’s rules if you want to have a good chance of being found in Google (and while there are certainly other ways to generate web traffic, Google is obviously a pretty big one). That said, Google will also be the first to tell you that “no algorithm is perfect”. Sometimes they don’t get it right. But are you doing everything in your own power to earn Google’s RESPECT?

    Is Google giving you the RESPECT you deserve? Comment here.

    Is your site showing up in search results for its targeted keywords? If not, maybe you’re not effectively using these keywords. Google is on to keyword stuffing, and content that is purely written for search. Do not over-saturate your content with keywords you wish to rank for. That said, you can use them as they make sense without compromising the flow of your content. Think titles, image labels (alt tags/title tags/captions), etc. It doesn’t hurt to keep this stuff in mind as you produce content. Just don’t do it in a way that compromises the quality of your page.
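
    To make this concrete, here is a minimal sketch of a keyword check along those lines. It is not a Google tool; the URL and keyword are placeholders, and it assumes the third-party requests and BeautifulSoup libraries are installed. It only reports where a keyword appears, so you can sanity-check usage without stuffing.

    ```python
    import requests
    from bs4 import BeautifulSoup

    def keyword_usage(url: str, keyword: str) -> dict:
        """Report where a target keyword appears on a page (title, image alt text)."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        title = (soup.title.string or "") if soup.title else ""
        alts = [img.get("alt", "") for img in soup.find_all("img")]

        return {
            "keyword_in_title": keyword.lower() in title.lower(),
            "images_total": len(alts),
            "images_missing_alt": sum(1 for alt in alts if not alt.strip()),
            "images_with_keyword_in_alt": sum(1 for alt in alts if keyword.lower() in alt.lower()),
        }

    if __name__ == "__main__":
        # Placeholder URL and keyword, for illustration only.
        print(keyword_usage("https://www.example.com/", "red widgets"))
    ```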

    Sitelinks

    Is Google showing site links for your site when it appears in search results?


    Right now, sitelinks are automated, but Google says it may incorporate webmaster input in the future. Frankly, I’d be very surprised if they didn’t. Still, there are best practices you can follow. “For example, for your site’s internal links, make sure you use anchor text and alt text that’s informative, compact, and avoids repetition,” Google says.

    If Google is showing sitelinks for your site, but you don’t like the ones they’ve chosen to display, you can demote URLs to let Google know which ones you don’t think are appropriate. To do this, go to Webmaster Tools, click the site, and go to “Sitelinks” under “Site configuration”. In the “For this search result” box, enter the URL of the page whose sitelinks you want to change. In the “Demote this sitelink URL” box, enter the URL of the sitelink you don’t want to appear. Note that it might take Google a while to reflect this in search results.

    The Algorithm Updates

    It’s not just about what Google has done in terms of algorithm updates. It’s about what you should be doing. But perhaps you have been hit by recent algorithm tweaks. If Panda, for example, hit your site, then drastic changes may be needed. Google considers your site to be of low quality. Perhaps a site redesign is in order. Google has a whole list of questions you should be asking yourself about your site in terms of quality.

    Included on that list is “Does this article have an excessive amount of ads that distract from or interfere with the main content?” Google said last week that it is testing algorithms that look more at this factor above the fold. Be prepared for that.

    Google also just listed ten of its most recent algorithm changes.

    Google Is Listening.

    If you think you’ve done everything you need to do to make your site Panda-friendly, and Google is still not giving you the RESPECT you think you deserve, then let them know. The company insists that it is listening. Go to this thread and make your voice heard. Last week, they even said they have an Excel sheet of about 500 sites from this thread (at least, I assume this is the thread they were referring to). There is a person responsible for false positives, they said. You may have a legitimate beef, and Google, at least to some extent, recognizes this.

    Are You Expecting Google To Be Perfect?

    Google isn’t perfect. They know this. In fact, they make this point themselves all the time. It’s why they constantly tweak their algorithm. They’re not launching all of these updates just to mess with webmasters. Google makes over 500 changes to its algorithm over a year’s time. They’re trying to improve the quality of their search results. It’s not in Google’s best interest to return results to users that aren’t helpful. They don’t want to send people to Bing, which is marketing its search engine much more heavily than Google. Whether you think the quality of Google’s results has gotten better or not, this is their goal. Google considers Panda a “positive change across all of its known measurements,” by the way. I’m sure some of you disagree.

    What Are Your Competitors Doing Right?

    Still, you might see lesser competitors ranking above you in search results, and that can be very frustrating. For some reason, Google is giving them more RESPECT. Do you think it’s going to do you any good to just sit back and complain though? It’s your responsibility to analyze your competition. Look at the page that is ranking above yours. Are there some things about that content or page that they are doing better than you? Richer content? A cleaner design? Google has over 200 signals. Keep this in mind. Look for anything positive about that page, and then look at yours and compare and contrast.

    Ultimately, it’s about Google seeking to rank sites on every topic imaginable by using a combination of at least three big factors to determine relevancy: quality, authority and recency. Sites can rank above you based on how Google is ranking the importance of these three factors at the moment a search is done. If something is in the news, and your site has an up-to-the-minute article on the topic, you might rank above sites with more authority for a while. Your goal as a webmaster should be to become an authority on a specific topic. Get people to link to you. Create author pages, if you are a publisher, and utilize authorship markup.

    Is Google Getting It Right?

    As mentioned, Google is not perfect, but is it getting it right in most cases? What do you think?

    Let’s look at an example of a set of search results for the query “google panda”.

    Google Panda search results

    Note: This is a search performed while logged out, from Lexington, KY (location can sometimes play a role; it’s hard for me to say how much of a role it played in this case, as it’s not a location-specific query).

    WebProNews ranks number 5 in this case. Now, the Google Panda update is something that WebProNews has covered rigorously all year long. This is a topic I feel fairly comfortable saying that Google gives us some amount of authority on. But we’re OK with our ranking here. The first result is an informational Wikipedia entry. It makes sense for something like this to appear first for such a broad Panda-related query. It tells you what the term is. Second is an article from SEOmoz – certainly an authority in the search industry. Same with the third result, Search Engine Watch. While we may be competitors with these sites on a query like this, both of those sites are very focused on search. It makes sense that they would rank well. While WebProNews certainly covers search a great deal, and we do consider ourselves an authoritative site on this topic, we also have a much broader spectrum of topics we cover.

    Number 4 is unrelated, but it comes from code.google.com itself, and is about “pandas – Powerful Python data analysis toolkit”. This result is a little questionable, but on the other hand, it is from Google’s own domain, and it might be tricky for Google’s algorithm to know this isn’t what I was referring to. Keep in mind, this query was performed while logged out. When I perform the same query logged in, I get more actual panda search-related results before that one.

    Number 5 is WebProNews (though honestly I’m not sure if this is the most relevant article of ours on the subject to come up for this search), and number 6 is Search Engine Land. While Search Engine Land is more in line with SEOmoz and Search Engine Watch in terms of its focus on search only, you can see that the WebProNews piece that ranks above it is more recent, and that may have played a role in this case.

    Are the results for this query the absolute best they could be? I would say no, but they’re not terrible. Again, Google isn’t perfect.

    Create a site that has rich content and is easily crawlable by Googlebot, and loads quickly for visitors. That’s a good start.

    Scrapers

    It’s also frustrating when sites scraping your content are ranking above your own site. The fact is that Google’s algorithm simply can’t always tell which one is the original piece. You might think that the recency factor even favors the scrapers. You can file DMCA complaints and whatnot, but this can be a huge pain, especially if you put out a lot of content, and it’s all getting scraped, and by multiple sites.

    Well, you can let Google know as soon as you publish your content, so they know it is posted before the scraped version. This was one thing discussed at PubCon last week. Barry Schwartz’s liveblogged account of Google’s presentation says:

    Another trend is sending information to Google, such as for scraper sites ranking above you, what if you can ping google with information so they have it first.

    What if you could send a ping to Google to let Google know you’ve published content…

    He then shows a picture of the slide Google used in the presentation, which says:

    Worth doing now:

    1. Sign up for Webmaster Tools
    2. Sign up for email alerts
    3. Set up “fat pings” when you publish: pubsubhubbub.appspot.com
    4. Subscribe to: Webmaster Blog, Inside Search Blog, Webmaster Video Channel
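
    Item 3 on that slide refers to PubSubHubbub “fat pings.” As a rough illustration of what a publish ping looks like (this follows the generic PubSubHubbub publish-notification format rather than any Google-specific API, and the feed URL is a hypothetical placeholder), a publisher can notify the hub the moment new content goes live:

    ```python
    from urllib.parse import urlencode
    from urllib.request import Request, urlopen

    HUB = "https://pubsubhubbub.appspot.com/"
    FEED_URL = "https://www.example.com/feed.xml"  # hypothetical feed URL

    def ping_hub(hub: str, feed_url: str) -> int:
        """Send a PubSubHubbub 'publish' notification for the given feed."""
        data = urlencode({"hub.mode": "publish", "hub.url": feed_url}).encode("utf-8")
        request = Request(hub, data=data)  # supplying a body makes this a POST
        with urlopen(request, timeout=10) as response:
            return response.status  # hubs typically answer 204 No Content on success

    if __name__ == "__main__":
        print(ping_hub(HUB, FEED_URL))
    ```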

    Google also has a form for reporting scraper pages. On the form page, it says, “Google is testing algorithmic changes for scraper sites (especially blog scrapers). We are asking for examples, and may use data you submit to test and improve our algorithms.”

    Do You Deserve Google’s RESPECT?

    In some cases, Google isn’t giving you RESPECT because you don’t deserve it. Either your site is of poor quality, it lacks a following (backlinks, social activity, etc.), or you simply aren’t following the basic Webmaster Guidelines that Google lays out. You can find these here. They’re broken up by Design/content, Technical and Quality.

    You may feel like a seal jumping through hoops, but if you want Google’s RESPECT you gotta RESPECT those hoops.

    Fulfill Searchers’ Needs.

    Another piece of advice to get more Google RESPECT would be to place more focus on the long tail. There are some keywords that you’re just not going to rank for. But a lot of search isn’t about that one coveted key phrase. It’s about people looking for help with things, and their queries often stretch far beyond that key phrase. People have gotten better at searching over the years. They are entering longer queries, and are often very specific. This is what has made sites like eHow so successful. The key is to make sure the content is up to par in the quality category. When it’s not, that’s when it becomes a problem.

    There is still opportunity to rank in results for queries seeking very specific things. You should be providing relevant content to satisfy these needs. The great thing about content is that if you keep writing content that is relevant to your industry (including newsworthy topics related to your industry), that will help you send Google those recency signals. If you’re writing every day, for example, you’re always going to have something that’s recent. The quality has to be there obviously, but if you can consistently put out quality, relevant content that will also help establish you as an authority. Then suddenly, you’ve got fresh, relevant, authoritative content, and Google is probably looking at you in a much better light, and hopefully ranking you accordingly.

    RESPECT Google.

    If you want Google’s RESPECT, the most important thing you can do is listen to what they say. Follow news about Google. Pay attention to Webmaster Tools. Listen to Matt Cutts and other Googlers when they talk about how Google search works, whether that means things said at conferences, Matt’s regular webmaster videos (which we often cover here), or things posted to Twitter, Google+, etc. Just pay attention to what Google is saying.

    You may think Google has too much power over the Internet, and in some ways maybe they do, but in the end it’s really about the users. It’s the users using Google that give Google its power.

    You may wish to decrease your dependence on Google for traffic, and by all means do so. That’s a good thing. However, if you want Google’s RESPECT, you simply have to utilize the information they give you, because Google is going to do what Google wants (within regulatory approval), and its share of the search market isn’t shrinking. Ultimately, you’re not pleasing Google to please Google the company. You’re pleasing Google to get regular people to your site, because a whole lot of regular people use Google, and they use it a lot. And generally speaking, they don’t care about Google/webmaster politics. They just want to find what they’re looking for.

    Is RESPECTing Google’s ways enough to improve your search rankings? Tell us what you think in the comments.

    Top Image Credit: odinartcollectables.com

  • Google Raters Guide: Ratings Don’t Directly Affect Rankings

    Earlier this year, following the initial roll-out of the Panda update, Google’s Matt Cutts and Amit Singhal did an interview with Wired, in which they talked about people they had rating search result quality.

    Last month, the raters handbook was leaked. Morris Rosenthal had some interesting things to say about it here.

    The raters are discussed a bit more in this video from Google:

    Cutts and Singhal spoke together at PubCon this week. Here are some takeaways from their keynote discussion (which includes talk about algorithms in testing, which would focus more on sites with too many ads above the fold).

    Matt Cutts was reportedly talking about the raters guide at a PubCon networking event. In the WebmasterWorld forums (via Barry Schwartz), user Tedster says the following clarifications came from Cutts:

    Webmasters tend to put a slightly skewed angle on this. The quality raters are actually rating a SERP (that is, a particular algo configuration) as a quality control measure for the algo team. Their ratings do not directly change rankings, but they help the algo team see if the algo worked as planned or not.

    Also, note that this document is not for the spam team. They also have a training document and use human quality raters – but that document has never been leaked.

    While it may not hurt to pay attention to this handbook, I would consistently refer back to that set of questions Google put out earlier this year. That “above the fold” stuff is right in line with some of the stuff on that list.


  • Google: Page Speed Affects Rankings In 1 Out Of 100 Searches

    With PubCon going on, Google’s Matt Cutts tweeted out a link to what he calls “a special webmaster video for #pubcon”. It’s about how Google determines page speed.

    Specifically, it’s Matt’s response to a user question:

    “How does Google determine page speed? In GWT some pages are listed as very slow (8+ seconds). But I have tested on older computers/browsers and they do not take anywhere near that long to load. Why might Google show such high numbers?”

    “The fact is we’re looking at using toolbar data, and that’s using toolbar data only from people who have opted in. But that’s looking at real world load times from people for example, in the United States, we might say, how long does it take to load this particular page?” says Cutts. “And so if we’re looking at that, and it takes a long time, sometimes it’s not necessarily your site. It could be the network connectivity. But it’s a good thing to bear in mind.”

    “It’s coming from all these different users, who can have dial-up lines,” he says. “They can have slow connections. And so a lot of times, people say, I’m just going to throw a 500 kilobyte page out there, and they forget there are a lot of people with slower connectivity. So that data is based primarily on toolbar data.”

    “And we’re looking at what it looks like for real users,” he continues. “And so if you’ve got a lot of users who are having a slow experience, then that can affect the overall rating. One thing to bear in mind, however, is that only something like one out of 100 searches is site speed such a factor that it would actually change the rankings to a noticeable degree.”

    “So that’s something on the order of one in 1,000 sites have truly site speed as a really big issue for them,” he says. “It’s always good to see if you can move a little bit faster and try to return results to users a little bit faster. It makes your website experience more fluid. It makes your users happier. There are studies that say the return on investment is definitely worth it. But at the same time, I wouldn’t stress overly about it.”
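
    If you want a rough sanity check of your own load times, a minimal sketch along these lines (example.com is a placeholder) times the raw HTML fetch from a single machine. As Cutts explains, that will not match the Webmaster Tools numbers, which come from opted-in toolbar users on real-world, often slow, connections:

    ```python
    import time
    from urllib.request import urlopen

    def average_fetch_time(url: str, runs: int = 3) -> float:
        """Average time to download a page's HTML from this machine (seconds)."""
        timings = []
        for _ in range(runs):
            start = time.perf_counter()
            with urlopen(url, timeout=30) as response:
                response.read()  # pull the full document body
            timings.append(time.perf_counter() - start)
        return sum(timings) / len(timings)

    if __name__ == "__main__":
        print(f"{average_fetch_time('https://www.example.com/'):.2f} seconds")
    ```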

    Cutts is speaking at PubCon tomorrow morning with Google Fellow Amit Singhal in a session called “Hot Google Topics & Trends.” It should be interesting to see what these two Google search guys have to say.

    Cutts also tweeted out a picture of gummy pandas:

    Pandas made out of acai? This might be the official #pubcon snack. http://t.co/UbkaJPj6

    Note: The image at the top is from Matt’s personal blog, chronicling his moustache adventures for Movember. That’s from last week. Duane Forrester (Bing’s counterpart to Matt Cutts) told me at BlogWorld his moustache isn’t coming along so well. I believe he likened himself to a Spanish cop.

  • Google “Freshness” Update Helps A Bunch of News Sites

    A few days ago, Google confirmed that it launched a new algorithm update with freshness in mind. The update is built “upon the momentum” of Caffeine, the infrastructure update Google completed last year, designed to index fresher content more quickly.

    This new update, the company says, impacts roughly 35% of searches. It’s designed to show more high quality pages that are only minutes old for queries related to recent events or hot topics.

    Even without specifying in your query that you want the most recent event in a series of recurring events, you’re supposed to see the fresh ones – whether that be sports scores, earnings reports, TV shows, or other events.

    You should also notice more frequently updated info on topics where the information changes often, even when it’s not a hot topic or recurring event. Google uses the examples of “best slr cameras” or “subaru impreza reviews”. Google tries to give you the latest info when it thinks it’s relevant.

    There have been plenty of interesting reactions to this update from the SEO community – some skeptical, some hopeful. We looked at a few in a recent article.

    SearchMetrics, which has provided data about the winners and losers of search visibility resulting from various iterations of Google’s Panda update, has released a list of top winners and losers from this “freshness” update.

    Interestingly, both lists include plenty of big brands.

    Here’s the winner list:

    Google freshness update winners

    Here are the losers:

    Google Freshness losers

    As SearchMetrics founder Marcus Tober points out, the winners list shows a lot of news sites, while the losers list consists of a wide variety of types of sites with no real clear category of site sticking out.

  • SEO Experts React To Google Algorithm Update

    I’m not sure what other writers are calling Google’s algorithm update, as we’ve only had it for a day. I’m staying with the KISS (keep it simple stupid) method, and calling it “Google Fresh“. No doubt, SEO experts, webmasters, and even users will feel the effects of Google Fresh, with many of those people voicing their opinions on the changes. In fact, many have already sounded off. Let’s take a look at what they’ve had to say.

    SEOmoz’s Rand Fishkin is the only responder I’ve seen who’s released a video concerning the update. He discussed the changes with Mike King of @iPullRank. One of the most important takeaways from the video is how they believe timestamps, specifically in XML sitemaps, could see a boost in terms of importance. It’s a trend you’ll notice as you read more reactions.
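
    Here is a minimal sketch of the kind of sitemap entry they mean, with an explicit <lastmod> date on every URL. The URLs and dates are made-up placeholders; the point is only that each entry carries an accurate last-modified value:

    ```python
    import xml.etree.ElementTree as ET

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def build_sitemap(entries) -> str:
        """Build a sitemap where every URL carries a <lastmod> timestamp."""
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for loc, lastmod in entries:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = loc
            ET.SubElement(url, "lastmod").text = lastmod  # W3C date, e.g. 2011-11-04
        return ET.tostring(urlset, encoding="unicode")

    if __name__ == "__main__":
        print(build_sitemap([
            ("https://www.example.com/news/google-fresh-update/", "2011-11-04"),
            ("https://www.example.com/reviews/slr-cameras/", "2011-10-28"),
        ]))
    ```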


    Gianluca Fiorelli of I Love SEO calls the update “Caffeine 2.0” and shares some interesting thoughts about how this change was brought about by advertising needs, claiming that Google is driven by ad space and needed to make changes to reflect this:

    Google is substantially an editor (even if it will never recognize it) that sells ad spaces, and Search is still its main product to sell toward the advertiser. So Google needs to have the best product to continue selling ad space. That product was ironically endangered by Caffeine. With Panda Google tries to solve the content issue and with the Social Signals linked inextricably to the Authors and Publishers tries to solve the obsolescence of the Link Graph.

    Ben Will, an author over at Marketing Pilgrim, has already devised strategies for succeeding under the Google Fresh changes. He notes that having correct and up-to-date timestamps is of utmost importance, as Fishkin and King stated above. However, one of his strategies struck me as interesting: adding forums.

    Forums…the original social network. The benefits are the fluid conversations that happen. The downsides are that forums require a fair amount of work to be managed. Choose this option carefully.

    One aspect of this change which I saw go unnoticed by most is how it will impact paid search. Jeff Allen, of PPCHero, believes the changes will have an important impact.

    Google didn’t specify if this change would have an impact on paid search. However, their trend has been towards narrowing the gap between organic and PPC. Because of this, I would venture to guess it will have some impact in the future. Time will tell, but my guess is that advertisers in verticals affected by these changes (such as eCommerce sites selling SLR cameras) will want to keep their content fresh.

    As I’ve read reactions, it seems to me that this could potentially be the most subtle algorithm update Google has released. Most of the strategies I’ve read in reaction to Google Fresh are ones most have been trying to follow for quite some time. It’s always been about producing relevant content, and doing so on a consistent basis. If your site is based around a topic which doesn’t require timely updates, then this change probably won’t have an effect – that is, if Google’s claim that recency only affects certain topics holds true. On the flip side, if your site relies on current events or timely topics, you’re probably giddy about this update.

    Have you noticed any changes in your results since the Google Fresh update went live? Do you see vast potential for your site from Google placing more relevance on freshness?

    Update: There have been more reactions to the algorithm update. There seems to be a divide over whether or not Google putting relevance on freshness will indeed provide better quality for readers. Most of the skepticism comes from those who are unsure whether or not Google will know when recency should truly be important.

    A user from WebMasterWorld had an interesting reaction to the update, “So you think the pro staffed sites have been at an advantage before? How about now? On the surface, this looks like the dagger for a lot of folks. Hope I’m wrong. A small enterprise of a few cannot compete with freshness of hundreds of staffers. I haven’t dug into this, but certainly this adds to the already pile of BS that a lot of us have been dealing with and now it’s another heaping truck full on my door step.

    With all the changes, Google results must really have sucked. Guess it was a fluke to gain that market share on a system that so broke that it needs to be gutted. In real terms, there is something fundamentally wrong with ADD characteristics of a company that has 97% of mobile search and what 80%+ or regular search.”

    Could the update lead to certain businesses creating blogs, simply for the aspect of showing up as a fresher result?

    Businesses urged to create blogs, online video as Google changes algorithm to emphasise new, relevant content http://t.co/RZszfS7J

  • Google Adds New Duplicate Content Messages to Webmaster Tools

    Google announced today that it is launching new Webmaster Tools messages to notify webmasters when its algorithms select an external URL instead of one from their website, in duplicate content scenarios.

    Google defines “cross-domain URL selection” as when the representative URL (the URL representing the content in a duplicate content scenario that Google’s algorithm decides to use) is selected from a group of duplicate URLs that spans different sites.

    In other words, the selection Google goes with when two or more sites are showing the same content.

    “When your website is involved in a cross-domain selection, and you believe the selection is incorrect (i.e. not your intention), there are several strategies to improve the situation,” says Google Webmaster Trends Analyst Pierre Far.

    Google highlights three main reasons for unexpected cross-domain URL selections: duplicate content including multi-regional sites, configuration mistakes and malicious site attacks. Far points to various resources for each scenario in this blog post.

    “In rare situations, our algorithms may select a URL from an external site that is hosting your content without your permission,” says Far. “If you believe that another site is duplicating your content in violation of copyright law, you may contact the site’s host to request removal. In addition, you can request that Google remove the infringing page from our search results by filing a request under the Digital Millennium Copyright Act.”

    At least the WMT messages should help alert you when it’s happening.

  • SEOs Not Buying Google’s Privacy Motive for Encrypting Search

    Google caused quite a ruckus in the search marketing community after it announced some changes to search. Last week, the search giant said that it would begin encrypting searches by default for users who are logged into Google.com. This further integration of Secure Sockets Layer (SSL) will prevent search marketers from receiving keyword referral data when consumers click through to their websites from Google search results.

    What do you think of Google’s move to encrypt searches? We’d love to know.

    While this change is only supposed to affect a single-digit percentage of referral data, many SEOs are not happy with the move and believe that Google has gone too far. Eric Enge, the Founder and President of Stone Temple Consulting, told us that he was completely “baffled” when he saw the news. Rebecca Lieb, the Digital Advertising and Media Analyst at the Altimeter Group, was also surprised by the move and called it “evil.”

    “I hate to say this about Google because they’re a company that I admire and like and respect, but I think this is evil,” she said.

    “Google is taking something away that is a very, very valuable tool for anybody practicing SEO,” Lieb added.

    Amanda Watlington, the Owner of Searching for Profit, also shared with us that she would not be able to give her clients as much value as she has in the past.

    “I have learned more from the referral data that comes into the site that lets me benefit the user – I won’t have that data to mine,” she said. “Personally, it will make it harder for me to (a) understand what the performance of my pages are and (b) to learn from my pages.”

    Google has said that it did this in order to make search more secure, but the SEO community doesn’t agree. Enge told us that he didn’t recall any outcry from privacy organizations in regards to search term data and, therefore, is not convinced that security was Google’s real motive. If this were the case, he thinks that Bing and Yahoo would have had to make changes as well.

    Others, including Amanda Watlington, think that Google did this for financial purposes. She told us that it was “all about the Benjamins.” Matt Van Wagner of Find Me Faster also said that he could see the search giant thinking this move would make its search engine look more attractive to shareholders since it could potentially push more people to use paid search – its primary revenue model.

    Lieb takes a slightly different approach and said that Google could have done this to appease regulators. What’s bad though, as she points out, is that most regulators don’t understand referral data and other aspects of Internet marketing.

    “I think Google may (It’s a theory – I can’t prove it) be throwing a bone to somebody on Capitol Hill with this move,” she said.

    Is Google making moves to try to improve its reputation with regulators? What do you think?

    Todd Friesen, the Director of SEO at Performics, agrees that Google made this move as part of a greater effort. He told us that Google frequently makes small moves and waits to see how everyone reacts before it pushes out its bigger plan.

    “Google doesn’t do anything on a whim,” he said. “They’re definitely thinking 5 and 10 years out.”

    “There’s definitely a bigger plan behind it, and it’s probably big and scary with teeth and claws,” he added.

    A big part of the reason why SEOs aren’t buying into the privacy theory is because the changes do not impact advertisers. This is ironic since consumers don’t typically complain about organic search data, but they are usually concerned about targeted advertising. It seems as though Google is saying that consumer information is important for advertisers to make money, but it turns into a consumer privacy issue when it relates to organic search results.

    “The fact that they’re keeping all this referrer data alive for advertisers is strongly, if not irrefutably, indicative that the money is not where the mouth is,” said Lieb.

    Friesen also said that it’s a “hypocritical standpoint” on Google’s part. If the motive is really about privacy, he doesn’t think that Google should be passing referrer data to advertisers, or anyone for that matter.

    Another point that Lieb raised was that paid search could eventually take a hit from this move. If small businesses that are investing in organic search through Google are not able to get the data they need, she doesn’t think that they would want to pursue a paid search campaign with it either.

    “It’s certainly something that would make me, as an advertiser, almost inclined to go to Bing or Yahoo just because… just because this isn’t right,” added Lieb.

    Google maintains that this change is very small and that it will only impact a small percentage of searches. Matt Cutts also pushed this message on Twitter:

    @Sam_Robson I believe it will affect things based on the referrer, but only for a small percentage of searches (signed in on .com).

    @Rhea And we’ll be rolling out slowly (weeks). We ran some tests before launch, and I don’t think anyone even noticed the change. @blafrance

    The SEOs, however, are not convinced. There are so many unanswered questions that this move raises that one can’t help but wonder about the future of SEO. Watlington, for instance, told us that she could see Google monetizing the data going forward and that this move is the first step.

    “To me, the move to give it to an advertiser is a monetization of the data,” she said. “What additional monetization will be, I’m waiting to see.”

    Van Wagner told us that, since he primarily does paid search, he is glad that Google didn’t include advertisers at this point. But this move could result in more competition in paid search, which is not something he is in favor of either.

    The biggest concern is the fact that no one knows what is next. Lieb told us that if Google does decide to roll this out further, SEO could really be in danger.

    “People have a right to be upset about this because, even if it’s only 10 percent now, or only 15 percent now, it could get more dire,” she said.

    Watlington believes that search marketers may have to rethink what they do moving forward. She even said that they might have to “look away from search” and focus more on traditional marketing. At this point, Google is the primary search player and everything it does directly impacts search marketers, which, according to Watlington, does not indicate a promising future for search marketing.

    “We have one very large player, a monopolistically-sized player… holding enough of the cards,” she said. “That’s not exactly what I call a real long-term strategy because whatever that player does, it impacts us.”

    Friesen, on the other hand, doesn’t really think that this impacts what SEOs do. He thinks that the process of how they track and report on it changes but said that the job of an SEO doesn’t actually change.

    “What, unfortunately, it does is drives us back to rank checking as a more important metric,” he explained.

    He does admit that the SEO industry could be more heavily impacted if Google makes a further move in this area.

    “At this point, it’s less than 5 percent… but if it starts to climb, then we get into a reporting issue,” said Friesen. “We get back to the ‘SEO is black magic voodoo stuff.’”

    Incidentally, a petition called Keyword Transparency has been created that hopes to get Google to reverse this action. The “About” section on the site says:

    This petition has been created to show Google the level of dissatisfaction over their recent changes to keyword referral information, and will be presented to the search quality and analytics teams at Google.

    The argument that this has been done for privacy reasons sadly holds little weight, and the move essentially turns the clock back in terms of data transparency.

    The argument that this only affects <10% of users is also concerning as this is likely to increase over time, even up to a point where it affects the majority of users being referred from search.

    At this point, there are over 1,000 signatures on the petition.

    Is Google’s move to encrypt searches just the first of many? And if so, is the future of SEO in question? Let us know your thoughts in the comments.

  • Petition Seeks to Keep Google From Blocking Referral Data

    Earlier this month, Google announced that it would begin encrypting search queries with SSL as the default experience at Google.com for users who search while logged into their accounts.

    Sites visited from Google’s organic listings will be able to tell that the traffic is coming from Google, but they won’t be able to receive info about each individual query. They will, however, receive an aggregated list of the top 1,000 search queries that drove traffic to the site for each of the past 30 days in Webmaster Tools.

    “This information helps webmasters keep more accurate statistics about their user traffic,” said Google product manager Evelyn Kao. “If you choose to click on an ad appearing on our search results page, your browser will continue to send the relevant query over the network to enable advertisers to measure the effectiveness of their campaigns and to improve the ads and offers they present to you.”

    “When a signed in user visits your site from an organic Google search, all web analytics services, including Google Analytics, will continue to recognize the visit as Google ‘organic’ search, but will no longer report the query terms that the user searched on to reach your site,” said Amy Chang on the Google Analytics blog. “Keep in mind that the change will affect only a minority of your traffic. You will continue to see aggregate query data with no change, including visits from users who aren’t signed in and visits from Google ‘cpc’.”

    “We are still measuring all SEO traffic. You will still be able to see your conversion rates, segmentations, and more,” she added. “To help you better identify the signed in user organic search visits, we created the token ‘(not provided)’ within Organic Search Traffic Keyword reporting. You will continue to see referrals without any change; only the queries for signed in user visits will be affected. Note that ‘cpc’ paid search data is not affected.”

    Since all of this was announced there has been a fair amount of backlash from the webmaster/SEO community. There’s a petition at KeywordTransparency.com (via Danny Sullivan) for Google not to take away referral data.

    The about section attached to the petition says: “This petition has been created to show Google the level of dissatisfaction over their recent changes to keyword referral information, and will be presented to the search quality and analytics teams at Google. The argument that this has been done for privacy reasons sadly holds little weight, and the move essentially turns the clock back in terms of data transparency. The argument that this only affects <10% of users is also concerning as this is likely to increase over time, even up to a point where it affects the majority of users being referred from search.”

    It is certainly true that Google is doing a lot to get people signing up for Google accounts (obviously Google+).

    The actual letter that you’re signing when you sign the petition says:

    Dear Google,

    As publishers of content on the internet, we feel that the removal of keyword referrer information from the natural search results damages our ability to deliver good quality content to our users.

    By removing this data Google is not only hurting legitimate websites, but potentially pushing lower quality sites further into black hat data collection methods (ie spyware) in order to compensate for this data loss.

    We believe that the security argument is fatally undermined by the inconsistency in allowing keyword data to still be sent unsecured via your advertisers.

    There are ways of securely sending keyword referral information to websites without compromising privacy, and without negatively affecting webmasters’ ability to create good quality websites, and we ask that you seriously consider alternatives to the current implementation that would support this.

    Yours,
    [insert name here]

    And take a look at the list of signatories: Matt Cutts even appears on it multiple times (presumably not the real Matt Cutts).

    Do you agree with what this petition is saying? Let us know what you think in the comments.

  • Occupy Google, Siri Pours Beer & Giant LEGO Man Washes Ashore

    Today’s video round-up, as it often does, contains several Google-related pieces, including Matt Cutts talking about whether or not Google sees SEO as spam. No real surprises here, but always nice to hear it from the horse’s mouth. There’s also a nice mix of cool, funny, and creepy.

    View more daily video round-ups here.

    Matt Cutts talks Google’s view on SEO:

    Occupy Google (radio satire):

    Diablog III:

    Darren Rowse looks at Google Analytics Real Time Stats:

    The Nest Learning Thermostat:

    Get Siri to pour you a beer (via Revision3/GeekBeat.Tv):

    Creepy:

    Conan does Siri:

    Any unicycle stunt is unbelievable as far as I’m concerned:

    Giant LEGO man washes ashore. Why not?

  • 5 Steps Beyond Competitive Link Analysis

    The job of link building is getting tougher. The introduction of encrypted searches, the series of Panda updates, and whatever Google comes up with next are putting more and more pressure on us all to drop any shortcuts and concentrate on quality link building. And the proven formula for quality link building is ‘great content, well promoted, equals great links’.

    Not only must we continue to create great content, we’ve got to find more quality sites from which to get links. And to find more quality sites, we’ve got to go beyond simple competitive link analysis.

    For many marketers, the first thought in link building is to do a competitive link analysis and then target the sites that are linking to your competitors but not to you.

    That’s a good start, but it will never bring you all you need: if ‘follow your competitors’ is all you do, you’ll only be chasing links from sites where your competitors have already succeeded, which means you’ll always be behind them.
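
    For reference, that baseline really is just a set difference: domains that link to a competitor but not to you become candidate prospects. The sketch below illustrates the idea; the domain lists are made-up examples, and in practice they would come from whatever backlink tool you export data from.

    def link_prospects(competitor_links, my_links):
        """Return domains that link to the competitor but not (yet) to us."""
        return sorted(set(competitor_links) - set(my_links))

    # Hypothetical example data; in practice, export these from a backlink tool.
    competitor_links = ["chow.com", "epicurious.com", "vegweb.com", "some-food-blog.com"]
    my_links = ["epicurious.com", "another-blog.com"]

    print(link_prospects(competitor_links, my_links))
    # ['chow.com', 'some-food-blog.com', 'vegweb.com']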

    To be really effective in link building we’ve got to be more creative and go way beyond competitive link analysis in looking for new link opportunities.

    Step 1: Broaden your idea of relevance

    You have got to have relevant links, right?

    That’s true but it’s only part of the picture. Many people’s idea of relevance is limiting.

    Take BobsRedMill.com, which produces whole grain foods. As you’d expect, they get links from food sites like Chow.com, Epicurious.com and VegWeb.com.

    But they also get links from:

    All of these are relevant links in the context in which they appear.

    If you take only a limited view of relevance, you won’t even think of opportunities like these.

    Step 2: Maximize your relationship with sites that already link to you

    Sites that have already taken the step of linking to you have done so for a reason. Do you really understand what that reason is, and what their motivation is for going to the trouble of writing the code that gives you the all-important link?

    Perhaps:

    • they’ve used your products and found them particularly useful
    • you solved a specific problem for them
    • they’re compiling a resource list
    • they’re posting on a specific topic and found something you wrote relevant
    • …etc.

    Discovering the specific reasons why gives you a basis for strengthening your relationship with a site. That could lead to:

    • further coverage and links in the future
    • keyword-rich linking text
    • links to deep content on your site
    • interest in joint ventures or affiliate relationships
    • and much more…

    You get the idea – linking to you is a sign that they’re interested in what you’re doing and you should follow up with something that strengthens your relationship.

    Step 3: Check out who links to the sites that link to you

    So you’ve looked at sites that link to you, you’ve understood why they linked, and you’ve approached them to strengthen your relationship. Now it’s time to move on and win some new links.

    The sites that already link to you can be seen as an informal ‘organic link network’ that has evolved due to their interest in what you do. Sites that link to them are also sites that are likely to be interested in what you do.

    For example, Footlocker.com gets a link from the fashion blog Nitrolicious.com, which in turn gets a link from another fashion blog, ElementsOfStyle.com – that blog and many hundreds of others could be further targets for Footlocker.com.

    So find out who links to the sites that link to you and you’ll find many more linking opportunities.
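
    One way to operationalize this is a simple two-hop expansion: take the domains that already link to you, look up who links to each of them, and collect anything new. A rough sketch, assuming a hypothetical get_linking_domains() lookup backed by whatever backlink data source you use:

    def second_degree_prospects(my_domain, get_linking_domains):
        """Collect link prospects two hops out in the 'organic link network'.

        get_linking_domains(domain) is a hypothetical callable returning the
        domains that link to `domain`; plug in your own backlink data source.
        """
        first_hop = set(get_linking_domains(my_domain))
        prospects = set()
        for linker in first_hop:
            prospects.update(get_linking_domains(linker))
        # Exclude sites that already link to us, and the domain itself.
        return sorted(prospects - first_hop - {my_domain})

    The same traversal works for Steps 4 and 5 below: seed it with top magazines or top-blog lists instead of your own domain.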

    Step 4: Check out who links to the top magazines in your market

    Top magazines in your market can be a great source of high quality link prospects.

    Magazines, newspapers and online news sites often quote and link to each other, so compiling links to the top magazine will reveal many other media outlets. This helps you build lists of target publications and identifies journalists and editors who could be interested in your company. Furthermore, bloggers will comment upon, link to and share any interesting article or news piece they come across.

    So if you’re interested in ‘gourmet food’, for example, sites that link to leading food magazines like bonappetit.com, saveur.com, cooksillustrated.com, foodandwine.com and epicurious.com are likely to be of great interest.

    Step 5: Collect lists of the top blogs in your market

    Link building is a tough task and you need all the help you can get. So how about getting some help from all those wonderful people out there who compile lists of the ‘top blogs’ in any given industry? Such people will probably have reviewed the sites, and may even have published some metrics that can help you identify blogs to target.

    For example, SportsManagementColleges.net provides a list of the top 50 skiing blogs.

    And even better, you can use the top blogs that you find in the same way I’ve used the top magazines in the example above. Blogs tend to link to and comment upon posts by other blogs in their industry, so using them as a source for finding new links is very productive.

    Finally

    I’ve outlined 5 techniques for going beyond competitive link analysis and being creative about where you look for new link prospects. But of course, don’t just follow my ideas; develop your own unique methods and you’ll soon be discovering a ton of relevant link prospects that your competitors haven’t even thought of.

    And, if you’ve got some good prospecting techniques I haven’t mentioned, please post them in the comments below.

  • Changing the “Used Car Salesman” Perception of Selling

    When you think of selling, what do you think of? If you’re like me, you think of the sleazy used car salesman in the image below. It’s not, of course, that all salesmen are like this, or even all used car salesmen for that matter, but that stigma is naturally associated with the profession.

    Can this perception of selling be changed? Let us know in the comments.

    For salesmen, overcoming this obstacle is an obvious challenge. It’s also a challenge for people trying to start a business, because they don’t want to be associated with a group known for taking advantage of people. This association has had such an impact on society that it even discourages some people from pursuing their goals.

    According to Johnny B. Truant, an outspoken blogging coach, there is a way to change this perception. He told us that he struggled with this issue but then realized a way around it. As he explained to us, both parties involved in a sales transaction should be happy. The customer wants/needs what you have, so you’re helping them. In turn, they are helping you because they are buying from you.

    “It’s definitely a better way to do business for somebody like me because I don’t feel like I’m taking advantage of everybody,” said Truant.

    Truant runs a different type of business, one that Sonia Simone of Copyblogger calls a “Third Tribe.” Simone coined the term to bridge the gap between the two groups that largely make up the Internet marketing crowd: one tribe consists of marketers focused on selling, and the second is the social media crowd, primarily concerned with connecting.

    Simone felt that a lot of people didn’t fit into either of these categories, and, as a result, Copyblogger has built an entire brand around the concept of the “Third Tribe.”

    “[It’s] the idea that the best connections lead to the best customers and so forth, and the best customers come out of creating true connections,” explained Truant.

    He went on to say that he doesn’t abide by a lot of the standards that people think are necessary for business including SEO and social media.

    “I’m really bad at SEO,” he said. “I don’t even attempt to make it work.”

    It’s not that he thinks these methods are ineffective, but it’s that he has found other ways to do business successfully. He believes so strongly in this method that he encourages others to embrace it as well.

    What do you think about these unconventional business practices?

    WebProNews is partnering with BlogWorld and New Media Expo, the world’s first and largest new media conference, in an effort to broadcast how new media can grow your business, brand, and audience. BlogWorld takes place November 3-5 in Los Angeles. Stay tuned to WebProNews for much more exclusive coverage.

  • Facebook Failures, Twitter Manners & Content for SEO

    In today’s infographic round-up, we get a look at Facebook’s “Wall of Shame,” manners on Twitter, and how content pertains to SEO.

    View more daily infographic round-ups here.

    Wordstream looks at Facebook’s “Wall of Shame”:

    Facebook's wall of shame

    Marketing Optimist and One Lily look at Twitter manners:

    Manners and Twitter

    Brafton looks at Content for SEO:

    SEO and Content  

  • Google Panda Update: Cutts Confirms Another Tweak

    As previously reported, Google’s Matt Cutts tweeted out on October 5 to “expect some Panda-related flux in the next few weeks, but will have less impact than previous updates (~2%).”

    SearchMetrics has since shared numbers for sites it found to have bounced back. DaniWeb, which originally tipped us off to the most recent major Panda update (commonly referred to as 2.5), saw a recovery even before that.

    When asked if there was a “minor Panda update” on October 14 (Friday), Matt Cutts replied on Twitter:

    @maheshhari late last night, yes.

    Indeed, a number of webmasters reported over the weekend in various forums that they experienced some level of change that would coincide with Matt’s response. Some saw ups, some saw downs, and others saw little to no change.

    Search Engine Roundtable has an interesting look at people who have lost their jobs because of the Panda Update. Barry Schwartz, who runs that site, ran a poll for which he got 250 responses, with the following results:

    Panda Jobs Poll  

    This isn’t the first we’ve seen of the Google Panda update killing jobs. You may recall earlier this year when Mahalo’s Jason Calacanis announced he had to reduce his staff in response to the update. More recently, we’ve seen Demand Media implement a drastic reduction in new article assignments for its freelance writers as part of a larger effort to increase the overall quality of its sites like eHow, which has (or at least had) the reputation of being the kind of site the Panda update was designed to have an impact on.

    There’s an entire thread about unemployment caused by Panda in Google’s own Webmaster Central forums. The initial question asked in that is, “How many people are now unemployed due to Google’s launch of the Panda update?”

    The response listed as the “best answer” is: “How many people are employed because of all the free traffic that website[s] get from Google?”

    The person who started the thread said his company began with 15 layoffs (3 administration staff, 1 editor, 9 writers and 2 artists).

    One SEO manager said, “I have more clients today after Panda, so I am happy with it.”

    Demand Media started out as something of a poster child for the kind of content the Panda update goes after, but despite its reduction in article assignments, the company has also become a poster child for how to move on from Panda: diversifying traffic sources, becoming less reliant on Google, and instead putting more focus on quality and on driving traffic from social sources and content distribution partnerships.

    The lesson here is that while Google can be a tremendous source of traffic, you’d do well to put the effort into creating and improving upon other sources. Google is not going to stop altering its algorithm. Even if you’ve managed to do well so far with regards to Panda survival, you could be hit by another iteration of it, or some other algorithmic change entirely.

  • Does SEO Help Or Hurt User Experience?

    Jeremy Schoemaker, who runs the popular Shoemoney blog, wrote a post about a year and a half ago called “Where My Hatred of SEO Comes From”. It’s basically about site owners who put more effort into pleasing the search algorithms than pleasing users. Given the impact Google’s Panda update has had on a lot of sites, the topic of discussion seems as relevant as ever.

    Would you make a change to your site if you knew it would help you in search, but that your users would hate? Let us know in the comments. And if you find this topic interesting, why not share it on StumbleUpon, Facebook, Twitter, Google+, LinkedIn, etc.

    Editor’s note: The opinions expressed in this article belong to those who expressed them, and do not necessarily reflect those of WPN.

    On the one hand, the update is aimed at getting sites that do provide a quality experience ranked better, but on the other hand, it’s sent sites into a frenzy trying to appease Google’s algorithms (though the smart ones have tried to find ways to become less dependent on Google for traffic).

    We had a conversation with Schoemaker about search quality and site quality, as well as one with Atlas Web Service owner and President Michael Gray, whom Schoemaker referenced in his original article as an example of worrying more about search than users, for having turned off the comments on his blog.

    “I think its a big mistake to completely turn off blog comments,” Schoemaker tells WebProNews. “I recently switched ShoeMoney.com over to Facebook comments. In doing so I lost a TON of user generated content. Over 140,000 comments in all and now they are down probably 80-90%. The ones that are [there] now are 100% real people and since their name is attached there are real conversations and discussions taking place.”

    Facebook Comments

    “It’s once again enjoyable for me to engage with our commentators,” he adds.

    One factor to consider with comments might be how they impact a page’s quality in terms of how the Panda update looks at content.

    As far as comments impacting a site, Schoemaker says he called a Google engineer friend and asked about that. He says he was told that, if anything, it’s “diluting the quality score of my page” by possibly diluting overall keyword density. Another factor could be that the few clearly spammy comments that do get through send signals that the page is not being well maintained.

    “So he said he did not see a positive to leaving indexable comments on my site,” Schoemaker says. That’s when he alluded to the Google+-based comments system we told you about recently.

    Michael Gray turned off his comments in July of 2009, and even went so far as to completely delete all old comments in April 2010. “It was one of the best decisions I made, and regret not doing it sooner,” Gray tells us.

    Michael Gray talks comments and SEO

    “Lots of people hate it, especially the unicorn and rainbow crowd, who say it’s not a blog and has no sense of community,” he adds. “To be honest I’m not building a community, I’m not looking to hand out gold stars, trophies or make people feel better, special or that they belong. I’m a thought leader, I have my views, opinions, messages that I want to spread, and vision of where I want my blog to go. There’s a reason there’s only one captain on a ship. Are comments 100% useless? Obviously not, but they aren’t part of my vision. Actually they were an obstacle, and were holding me back.”

    “Is my decision the right one for everyone? No, but there is no law, rule, or guideline that says you have to have comments or that you have to build a community or tribe,” he adds. “Sometimes you standout by doing what everyone else tells you is wrong or crazy. There are pros & cons no matter which way you go, any consultant or employee who tells you you have to ‘have’ them and doesn’t consider or acknowledge that there is a downside, should be promptly shown own the door.”

    We asked Michael if he believes comments can have a significant impact on your time on site metric, and/or that this metric carries significant weight in Google rankings.

    “Does Google take a look at factors like time on site and bounce rate? IMHO yes, but you should be looking to increase those with good information, and solid actionable content, not comments,” he says. “The biggest effect comments have is giving Google a date to show in the SERP’s. This is a huge factor who’s importance can’t be unstated. If I’m looking for how to fix the mouse on my computer, or what dress Angelina Jolie wore to an awards show, having the date show up in the SERP has a lot of value for the user. If I’m looking to learn how to structure a website, the date plays almost no role. The author’s expertise and understanding of information architecture trumps the date.”

    “While I’m not living in the SEO world of 1999, things like keyword focus and density do play a role,” he adds. “If you’re doing your job as an SEO in 95% of the cases the keyword you are trying to rank for should be the most used word/phrase on your page. If you’ve gone to all the trouble to do that why would you now let and knucklehead with a keyboard and internet connection come by and screw that up with comments?”
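
    For what it’s worth, keyword density itself is simple arithmetic: occurrences of the target phrase divided by the page’s total word count. Here is a minimal sketch of that calculation (not any particular tool’s formula):

    import re

    def keyword_density(text, phrase):
        """Share of the page's words accounted for by the target phrase."""
        words = re.findall(r"[a-z0-9']+", text.lower())
        phrase_words = phrase.lower().split()
        if not words or not phrase_words:
            return 0.0
        hits = sum(
            words[i:i + len(phrase_words)] == phrase_words
            for i in range(len(words) - len(phrase_words) + 1)
        )
        return hits * len(phrase_words) / len(words)

    print(keyword_density("Panda update news: the Panda update explained.", "panda update"))
    # ~0.57 (4 of the 7 words belong to the phrase)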

    We asked Schoemaker whether he thinks Google is beyond manipulation for most sites at this point. “No… and will never be,” he says. “Link selling companies like TLA are recording record profits. Link Wheels are popping up and giving great results.”

    “Here is the thing,” he says. “When you have a branded, quality site you can feel pretty safe in buying links to it, which is ironic because these people are the least likely to feel safe doing so. But as you have seen in the past even when sites get busted for grossly violating Google’s rules they are only out for a couple days.”

    “It makes Google look bad when users can’t find what they are looking for,” he adds. “Plain and simple.”

    “Everyday, people type in ShoeMoney in Google to get to my site,” he says. “If Google were to take me out then people would go to all these scraper sites that may or may not contain my content. This gives the user a bad experience and makes Google look incompetent.”

    “So again, build a good site that people enjoy using,” he says. “The rest will sort itself out.”

    “I just focus on building sites and services that people want to use,” he says. “They tend to rank well because people link to them.  I know it’s not a juicy quote but I have made a lot of money and ranked #1 for every competitive term I have gone for.  It’s not rocket science.”

    While there have been some interesting points made here about the value of comments, we’d still like your feedback. Your comments are welcome.

  • Tips for Diagnosing A Drop In Google Rankings, From Matt Cutts

    Google has posted a new Webmaster Help video. As usual, Head of Web Spam Matt Cutts has answered a user-submitted question. The question is:

    “When you notice a drastic drop in your rankings and traffic from Google, what process would you take for diagnosing the issue?”

    “One thing I would do very early on, is I would do ‘site:mydomain.com’ and figure out are you completely not showing up in Google or do parts of your site show up in Google?” he begins. “That’s also a really good way to find out whether you are partially indexed. Or if you don’t see a snippet, then maybe you had a robots.txt that blocked us from crawling. So we might see a reference to that page, and we might return something that we were able to see when we saw a link to that page, but we weren’t able to see the page itself or weren’t able to fetch it.”

    “You might also notice in the search results if we think that you’re hacked or have malware, then we might have a warning there,” he adds. “And that could, of course, lead to a drop in traffic if people see that and decide not to go to the hacked site or the site that has malware.”

    Then it’s on to Webmaster Tools.

    “The next place I’d look is the webmaster console,” says Cutts. “Google.com/webmasters: prove that you control or own the site in a variety of ways. And we’re doing even more messages than we used to do. Not just things like hidden text, parked domains, doorway pages. Actually quite a few different types of messages that we’re publishing now when we think there’s been a violation of our quality guidelines. If you don’t see any particular issue or message listed there, then you might consider going to the Webmaster Forum.”

    “As part of that, you might end up asking yourself, is this affecting just my site or a lot of other people? If it’s just your site, then it might be that we thought that your site violated our guidelines, or, of course, it could be a server-related issue or an issue on your side,” he says. “But it could also be an algorithmic change. And so if a bunch of people are all seeing a particular change, then it might be more likely to be something due to an algorithm.”

    We’ve certainly seen plenty of that lately, and will likely see more tweaks to Panda for the time being, based on this recent tweet from Cutts:

    Weather report: expect some Panda-related flux in the next few weeks, but will have less impact than previous updates (~2%).

    “You can also check other search engines, because if other search engines aren’t listing you, that’s a pretty good way to say, well, maybe the problem is on my side. So maybe I’ve deployed some test server, and maybe it had a robots.txt or a noindex so that people wouldn’t see the test server, and then you pushed it live and forgot to remove the noindex,” Cutts continues in the video. “You can also do Fetch as Googlebot. That’s another method. That’s also in our Google Webmaster console. And what that lets you do is send out Googlebot and actually retrieve a page and show you what it fetched. And sometimes you’ll be surprised. It could be hacked or things along those lines, or people could have added a noindex tag, or a rel=canonical that pointed to a hacker’s page.”

    “We’ve also seen a few people who, for whatever reason, were cloaking, and did it wrong, and shot themselves in the foot,” he notes. “And so they were trying to cloak, and instead they returned normal content to users and completely empty content to Googlebot.”

    “Certainly if you’ve changed your site, your hosting, if you’ve revamped your design, a lot of that can also cause things,” he says. “So you want to look at if there’s any major thing you’ve changed on your side, whether it be a DNS, host name, anything along those lines around the same time. That can definitely account for things. If you deployed some really sophisticated AJAX, maybe the search engines are[n’t] quite able to crawl that and figure things out.”

    Cutts, of course, advises filling out a reconsideration request once you think you’ve figured out the issue.
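
    The first few checks Cutts describes are easy to approximate yourself. Here is a rough sketch using only the Python standard library; it is a starting point for self-diagnosis, not a substitute for Webmaster Tools or Fetch as Googlebot, and the example URL is a placeholder.

    from urllib import request, robotparser
    from urllib.parse import urlparse
    import re

    def diagnose(url, user_agent="Googlebot"):
        """Check robots.txt, meta noindex, and rel=canonical for one page."""
        parts = urlparse(url)
        rp = robotparser.RobotFileParser()
        rp.set_url(f"{parts.scheme}://{parts.netloc}/robots.txt")
        rp.read()
        print("robots.txt allows crawling:", rp.can_fetch(user_agent, url))

        req = request.Request(url, headers={"User-Agent": user_agent})
        html = request.urlopen(req).read().decode("utf-8", "replace")
        print("meta noindex present:", bool(re.search(r"<meta[^>]+noindex", html, re.I)))
        canonical = re.search(
            "<link[^>]+rel=[\"']canonical[\"'][^>]*href=[\"']([^\"']+)", html, re.I)
        print("rel=canonical points to:", canonical.group(1) if canonical else "(none)")

    diagnose("https://www.example.com/some-page")  # placeholder URL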

  • Google Panda Update: Latest Confirmed as Global, Tweaked with Fresh Data

    If you’ve been following the Google Panda saga, you know that an update was launched last week. At least one site that was negatively impacted by the update (DaniWeb) suddenly regained its search visibility after Google apparently made some kind of adjustment.

    We asked Google if they made any tweaks to the Panda update since last week’s iteration was launched, to which a spokesperson replied, “We pushed a fresh version of data that incorporated more of the signals that we’ve incorporated after the initial launch of Panda.”

    She also pointed us to the Matt Cutts tweet we wrote about earlier:

    Weather report: expect some Panda-related flux in the next few weeks, but will have less impact than previous updates (~2%).

    So expect more fine-tuning of the latest update.

    The spokesperson also told us that the latest update was global, in case there was any question about that.

    From what we’re hearing, it sounds like we would be correct in assuming that Panda updates from here on out will be global.

    In August, Google rolled out the update across most languages.

  • Google Panda Update: Could Inaccurate Google Data Be Costing Sites Traffic?

    Late last week, it was discovered that Google had rolled out another version of the Panda update earlier in the week. Industry voices have dubbed the update “2.5”. Google dubbed it “one of the roughly 500 changes we make to our ranking algorithms each year.”

    Did you notice a drop or increase in traffic in the past week or so? Let us know.

    SearchMetrics put out lists of the top winners and losers from the update. Some sites were surprising, some weren’t. Interestingly enough, eHow and EzineArticles, which were previously “pandalized” were not on the loser list this time. EzineArticles would not offer comment, and eHow (Demand Media) told us that they’ve been pleased with the results of a massive content clean-up initiative they’ve implemented this year.

    Another previous victim, HubPages, was even able to make the winners list this time around. Some of the more surprising “losers” were press release distribution services Business Wire (which actually just patented its SEO strategy) and PR Newswire, and tech blog TheNextWeb. There have been some questions raised over the accuracy of the SearchMetrics data, however.

    “I’m glad to say we had a good summer as far as traffic is concerned,” Rod Nicolson, VP User Experience Design & Workflow for PR Newswire tells us. “We’ll continue to monitor closely, but so far we’re not seeing any unusual changes to our traffic due to Panda 2.5.”

    TheNextWeb Editor in Chief Zee Kane tells us, “We haven’t noticed any effect right now but we’re still digging in. Will hopefully know more over the course of the next week.”

    We’ve reached out to SearchMetrics for comment, but are still awaiting a response. We’ll update when we receive one.

    Update: Here’s what SearchMetrics tells us about its data:

    We monitor a selected and representative set of keywords for Google (in several countries) once a week and analyze the search results pages for these keywords. One of the main indices we calculate from this is the Organic Search Engine Visibility. This is a culmination of figures collated from search volume (ie how often people are searching for a keyword or phrase) and how often and on which position (ie what position on a Google results page) a domain/web site appears. Add them all up (plus some more math applied) and you get the performance index – an estimate for how visible a site is on Google in a specific country.

    The basis for our analysis is a local keyword set for every country we analyze. Our values are local, that’s why we can give you an overview over the SEO and SEM visibility per country. The keyword sets are representative and varied between some hundred thousand and 10 million. The keyword sets are extended every month with new keywords added and irrelevant / outdated keywords deleted.

    While we track millions of keywords, we obviously don’t track every single keyword that is searched. We can be viewed as providing a very good indication of underlying trends. However, results can be off when, for example, a web site has only a very small visibility and is ranking for a small number of keywords or a higher percentage of the keywords a domain is ranking for is not included in our keyword set.

    Please note: Visibility is not the same as traffic. Further, sites that are listed among a ‘losers’ list may still generate traffic from other sources and can still potentially continue to prosper. Our data can only be used as a trend for search engine visibility on Google. But Google isn’t the only traffic source websites can have. So, if a site experiences a reduction in Google visibility, it may still continue to generate good traffic and continue to prosper. Other sources of traffic include real ‘type-in’ traffic (when visitors type in a URL); social media traffic (ie from Facebook, Twitter, blogs and other); and affiliate traffic etc.
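
    SearchMetrics doesn’t publish the exact formula (“plus some more math applied”), but the general shape it describes, search volume weighted by ranking position and summed over a keyword set, is easy to illustrate. Everything in the sketch below, the 1/position weighting in particular, is an assumption for illustration rather than the company’s actual math.

    def visibility_index(rankings, position_weight=lambda pos: 1.0 / pos):
        """Toy visibility score: search volume weighted by ranking position.

        `rankings` is a list of (search_volume, position) pairs for one domain
        across a tracked keyword set. The 1/position weight is an illustrative
        assumption, not SearchMetrics' formula.
        """
        return sum(volume * position_weight(pos) for volume, pos in rankings if pos)

    # Hypothetical example: three tracked keywords for one domain.
    print(round(visibility_index([(10000, 1), (5000, 4), (2000, 12)]), 1))  # ~11416.7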

    DaniWeb, which has been an ongoing sub-plot of the Panda storyline throughout the year, due to its victimization and full recovery, was hit again by the most recent update. In fact, Dani Horowitz, who runs the IT discussion community, is the one that tipped us that this was even going on.

    Horowitz and her team have of course been doing some investigating themselves, and documenting this a bit in a Google support forum. In it, she writes:

    So, everyone, thanks to DaniWeb’s handy dandy systems administrator, we have come to a conclusion. Our ‘time on site’ statistic decreased by 75% at 1 pm on August 11th, and has been holding steady at the reduced number, as a result of Google Analytics rolling out their new session management feature.

    There have been MANY reports across the web of the bounce rate and time on site being inaccurate ever since August 11th, especially when multiple 301 redirects are involved (which we use heavily).

    As a result, we have been hit by Panda. Or so I gather.

    Now, this is not confirmed, but could a Google Analytics change and inaccurate data on Google’s part be responsible for sites losing over half of their traffic? If so, that’s not cool.

    Google, which famously won’t reveal its secret recipe for search rankings (or even list the factors, let alone their weights), has been historically vague about its use of Google Analytics metrics in search. Michael Gray recently wrote a post suggesting that you can almost guarantee that Google is using your Analytics data, but he mentions how Google always manages to sidestep questions about its use (or non-use) of data for bounce rate, exit rate, time on site, etc.

    Another interesting side-story to the Panda saga is that Google-owned sites have done well (according to the SearchMetrics data). The timing of the most recent Panda update, for which SearchMetrics counts YouTube and Android.com as major winners, is interesting given recent Senate discussions about Google favoring its own content in search results. A Google spokesperson gave us the following statement on the matter:

    “Our intent is to rank web search results in order to deliver the most relevant answers to users. Each change we make goes through a process of rigorous scientific testing, and if we don’t believe that a change will help users, we won’t launch the change. In particular, last week’s Panda change was a result of bringing more data into our algorithms.”

    The Panda update has appeared to favor video content throughout its various iterations (and not just YouTube). I can tell you that video has some major SEO benefits regardless of Panda, and that it is also great for increasing time on site. If a user is watching a video on your page, they’re on the page for the duration of the video or at least until they lose interest (so use good video content).

    Even Demand Media told us, after announcing the eHow clean-up, that it wouldn’t much affect its YouTube strategy.

    Update: Dani Horowitz tells us that DaniWeb has already recovered. More here.

    Do you think Google is improving its search results with the Panda update? Let us know what you think in the comments.