WebProNews

Tag: webspam

  • Google Notifies Webmasters Of Manual Actions Regarding ‘Spammy Structured Markup’

Google wants you to use structured data markup on your site to give it better information and help it create more compelling search results.

    “If Google understands the markup on your pages, it can use this information to add rich snippets and other features to your search result,” the company explains. “For example, the search snippet for a restaurant might show its average review and price range. You can add structured data to your page using the schema.org vocabulary and formats such as Microdata and RDF, alongside other approaches such as Microformats. You can also add structured data by tagging the data on your page using Data Highlighter.”
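To make that concrete, here's a minimal, purely illustrative sketch of what such markup could look like using the schema.org Restaurant type with Microdata (the restaurant name, rating, and review count are made-up placeholders):

<div itemscope itemtype="http://schema.org/Restaurant">
  <span itemprop="name">Example Bistro</span> –
  <span itemprop="priceRange">$$</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span> out of 5,
    based on <span itemprop="reviewCount">87</span> reviews.
  </div>
</div>

With markup along these lines, Google can read the name, price range, and average rating directly from the page rather than having to infer them.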

    Google doesn’t, however, want you to use this markup in spammy ways, which people have obviously been doing.

Google is now reportedly sending notifications of manual actions to webmasters who abuse rich snippet markup. Search Engine Roundtable points to a Google Webmaster Help forum post where one webmaster shared a message they received:

    Spammy structured markup
    Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google’s Rich Snippet Quality guidelines.

    The webmaster who posted it said his team couldn’t find the issue. Suggestions from others in the discussion included marking up his name as the author in three different ways on a page, marking up things that aren’t visible on the page, marking up one item as both an article and a blog post, marking up empty space in the footer, and marking up a Facebook page as a publisher of the webpage.

    Maybe we’ll get a Matt Cutts video on the subject soon.

    Image via Google

  • The Latest From Google On Guest Blogging

    The subject of guest blogging has been coming up more and more lately in Google’s messaging to webmasters. Long story short, just don’t abuse it.

    Matt Cutts talked about it in response to a submitted question in a recent Webmaster Help video:

He said, “It’s clear from the way that people are talking about it that there are a lot of low-quality guest blogger sites, and there’s a lot of low-quality guest blogging going on. And anytime people are automating that or abusing that or really trying to make a bunch of links without really doing the sort of hard work that really earns links on the basis of merit or because they’re editorial, then it’s safe to assume that Google will take a closer look at that.”

    “I wouldn’t recommend that you make it your only way of gathering links,” Cutts added. “I wouldn’t recommend that you send out thousands of blast emails offering to guest blog. I wouldn’t recommend that you guest blog with the same article on two different blogs. I wouldn’t recommend that you take one article and spin it lots of times. There’s definitely a lot of abuse and growing spam that we see in the guest blogging space, so regardless of the spam technique that people are using from month to month, we’re always looking at things that are starting to be more and more abused, and we’re always willing to respond to that and take the appropriate action to make sure that users get the best set of search results.”

    But you already knew that, right?

  • Google Goes After Yet Another Link Network

    Earlier this month, Google revealed that it would be cracking down on more link networks, following a larger trend that has been taking place throughout the year.

    Google’s Matt Cutts hinted on Twitter that Google was taking action on the network Anglo Rank.

    He went on to note that they’d be “rolling up a few.”

On Friday, Cutts tweeted similarly, apparently in reference to another network, BackLinks.com.

    No surprises really, but Google is making it quite clear that it’s going to continue to penalize these types of sites.

    Hat tip to Search Engine Land.

    Image: BackLinks.com

  • Google Gives Advice On Speedier Penalty Recovery

Google has shared some advice in a new Webmaster Help video about recovering from Google penalties incurred as the result of a period of spammy link building.

    Now, as we’ve seen, sometimes this happens to a company unintentionally. A business could have hired the wrong person/people to do their SEO work, and gotten their site banished from Google, without even realizing they were doing anything wrong. Remember when Google had to penalize its own Chrome landing page because a third-party firm bent the rules on its behalf?

    Google is cautiously suggesting “radical” actions from webmasters, and sending a bit of a mixed message.

    How far would you go to get back in Google’s good graces? How important is Google to your business’ survival? Share your thoughts in the comments.

    The company’s head of webspam, Matt Cutts, took on the following question:

    How did Interflora turn their ban in 11 days? Can you explain what kind of penalty they had, how did they fix it, as some of us have spent months try[ing] to clean things up after an unclear GWT notification.

    As you may recall, Interflora, a major UK flowers site, was hit with a Google penalty early this year. Google didn’t exactly call out the company publicly, but after reports of the penalty came out, the company mysteriously wrote a blog post warning people not to engage in the buying and selling of links.

But you don’t have to buy and sell links to get hit with a Google penalty for webspam, and Cutts’ response goes beyond that. He declines to discuss a specific company because that’s typically not Google’s style, but proceeds to try and answer the question in more general terms.

“Google tends to look at buying and selling links that pass PageRank as a violation of our guidelines, and if we see that happening multiple times – repeated times – then the actions that we take get more and more severe, so we’re more willing to take stronger action whenever we see repeat violations,” he says.

    That’s the first thing to keep in mind, if you’re trying to recover. Don’t try to recover by breaking the rules more, because that will just make Google’s vengeance all the greater when it inevitably catches you.

    Google continues to bring the hammer down on any black hat link network it can get its hands on, by the way. Just the other day, Cutts noted that Google has taken out a few of them, following a larger trend that has been going on throughout the year.

The second thing to keep in mind is that Google wants to know you’re taking its guidelines seriously, and that you really do want to get better – you really do want to play by the rules.

“If a company were to be caught buying links, it would be interesting if, for example, you knew that it started in the middle of 2012, and ended in March 2013 or something like that,” Cutts continues in the video. “If a company were to go back and disavow every single link that they had gotten in 2012, that’s a pretty monumentally epic, large action. So that’s the sort of thing where a company is willing to say, ‘You know what? We might have had good links for a number of years, and then we just had really bad advice, and somebody did everything wrong for a few months – maybe up to a year, so just to be safe, let’s just disavow everything in that timeframe.’ That’s a pretty radical action, and that’s the sort of thing where if we heard back in a reconsideration request that someone had taken that kind of a strong action, then we could look, and say, ‘Ok, this is something that people are taking seriously.’”

    Now, don’t go getting carried away. Google has been pretty clear since the Disavow Links tool launched that this isn’t something that most people want to do.

    Cutts reiterates, “So it’s not something that I would typically recommend for everybody – to disavow every link that you’ve gotten for a period of years – but certainly when people start over with completely new websites they bought – we have seen a few cases where people will disavow every single link because they truly want to get a fresh start. It’s a nice looking domain, but the previous owners had just burned it to a crisp in terms of the amount of webspam that they’ve done. So typically what we see from a reconsideration request is people starting out, and just trying to prune a few links. A good reconsideration request is often using the ‘domain:’ query, and taking out large amounts of domains which have bad links.”

    “I wouldn’t necessarily recommend going and removing everything from the last year or everything from the last year and a half,” he adds. “But that sort of large-scale action, if taken, can have an impact whenever we’re assessing a domain within a reconsideration request.”

In other words, if you’re willing to go to such great lengths and eliminate such a big number of links, Google’s going to notice.

    I don’t know that it’s going to get you out of the penalty box in eleven days (as the Interflora question mentions), but it will at least show Google that you mean business, and, in theory at least, help you get out of it.

    Much of what Cutts has to say this time around echoes things he has mentioned in the past. Earlier this year, he suggested using the Disavow Links tool like a “machete”. He noted that Google sees a lot of people trying to go through their links with a fine-toothed comb, when they should really be taking broader swipes.

    “For example, often it would help to use the ‘domain:’ operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links,” he said. “That’s one reason why we sometimes see it take a while to clean up those old, not-very-good links.”

On another occasion, he discussed some common mistakes he sees people making with the Disavow Links tool. The first time people attempt a reconsideration request, they often take the scalpel (or “fine-toothed comb”) approach, rather than the machete approach.

    “You need to go a little bit deeper in terms of getting rid of the really bad links,” he said. “So, for example, if you’ve got links from some very spammy forum or something like that, rather than trying to identify the individual pages, that might be the opportunity to do a ‘domain:’. So if you’ve got a lot of links that you think are bad from a particular site, just go ahead and do ‘domain:’ and the name of that domain. Don’t maybe try to pick out the individual links because you might be missing a lot more links.”

    And remember, you need to make sure you’re using the right syntax. You need to use the “domain:” query in the following format:

    domain:example.com

Don’t add an “http” or a “www” or anything like that. Just the domain.
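For illustration only, a hypothetical disavow file mixing individual URLs with whole domains (all site names here are placeholders) might look like this:

# Pages we asked to have removed but couldn't
http://spammyforum.example.net/thread/123
http://spammyforum.example.net/profile/456
# The machete approach: disavow these domains entirely
domain:cheap-link-network.example.com
domain:spammy-directory.example.org

Lines starting with “#” are treated as comments, and each URL or “domain:” entry goes on its own line.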

So, just to recap: Radical, large-scale actions could be just what you need to take to make Google seriously reconsider your site, and could get things moving more quickly than trying to single out individual links from domains. But Google wouldn’t necessarily recommend doing it.

    Oh, Google. You and your crystal clear, never-mixed messaging.

    As Max Minzer commented on YouTube (or is that Google+?), “everyone is going to do exactly that now…unfortunately.”

    Yes, this advice will no doubt lead many to unnecessarily obliterate many of the backlinks they’ve accumulated – including legitimate links – for fear of Google. Fear they won’t be able to make that recovery at all, let alone quickly. Hopefully the potential for overcompensation will be considered if Google decides to use Disavow Links as a ranking signal.

    Would you consider having Google disavow all links from a year’s time? Share your thoughts in the comments.

  • Did Google Give Webmasters What They Need This Time?

Webmasters, many of whom have businesses that rely on search rankings, have been wanting Google to communicate with more specificity what is hurting their sites in its search rankings. The search engine can’t seem to do enough to please everybody, but it does continue to launch tools and resources.

    Is Google doing enough to communicate issues it has with sites, or does it still need to do more? What exactly should Google be doing? Let us know what you think.

    Google has added a new feature to Webmaster Tools called the Manual Action Viewer. This is designed to show webmasters information about when Google’s manual webspam team has taken manual action that directly affects their site’s ranking in the search engine.

    To access the feature, simply click on “Manual Actions” under “Search Traffic” in Webmaster Tools. If Google hasn’t taken any action against your site, you should see a message that says “No Manual webspam actions found.” Obviously, this is what you want to see.

Google notes that fewer than 2% of the domains it sees are actually manually removed for webspam, so the likelihood that you’ll see anything other than the message above seems pretty minimal (that is, of course, if you’re not spamming Google).

    The company will still notify you when you get a manual spam action, but the feature is just giving you another way to check. Here’s what you might see if you did have a manual action taken against you:

    Manual Action Viewer

    “In this hypothetical example, there isn’t a site-wide match, but there is a ‘partial match,’” Google’s Matt Cutts explains in a post on the Webmaster Central blog. “A partial match means the action applies only to a specific section of a site. In this case, the webmaster has a problem with other people leaving spam on mattcutts.com/forum/. By fixing this common issue, the webmaster can not only help restore his forum’s rankings on Google, but also improve the experience for his users. Clicking the “Learn more” link will offer new resources for troubleshooting.”

    “Once you’ve corrected any violations of Google’s quality guidelines, the next step is to request reconsideration,” he adds. “With this new feature, you’ll find a simpler and more streamlined reconsideration request process. Now, when you visit the reconsideration request page, you’ll be able to check your site for manual actions, and then request reconsideration only if there’s a manual action applied to your site. If you do have a webspam issue to address, you can do so directly from the Manual Actions page by clicking ‘Request a review.’”

As Cutts notes, this new feature is something that webmasters have been requesting for some time. While he emphasizes that a very small percentage of webmasters will actually see any actions in the viewer, it is at least a new way to know for sure whether Google has indeed taken a manual action.

    Reactions in the comments of Google’s announcement are a little mixed. Most of the visible comments are praising the tool. One person says they’re already putting the feature to good use. Another says, “Finally!”

    I say visible comments because many of them say, “Comment deleted. This comment has been removed by the author.”

One user says, “If we have followed Matt’s advice and Google’s guidelines, why would we need this tool? Please give us a tool that can really help us, not distract us.”

In addition to the new WMT feature, Google has put out a series of seven new videos to go with its documentation about webspam, explaining what each type really means. Cutts, with the assistance of a few other Googlers, covers unnatural links, thin content, hidden text, keyword stuffing, user-generated spam, and pure spam. You can find all of them here.

    This is Google’s latest attempt to make its documentation more helpful. A couple weeks ago, Google updated its Link Schemes page to discuss article marketing and guest posting, advertorials and press release links.

    Of course this is all only applicable to those who have been hit with manual penalties, and is of little comfort to those hit by algorithm changes. If that’s your problem, you may want to look into the whole authorship thing, which just might be influencing ranking significantly.

    Are Google’s most recent “webmaster help” efforts truly helpful to webmasters? Let us know in the comments.

  • Here’s A New Google Video About Hidden Text And Keyword Stuffing

Okay, one more. Google cranked out seven new Webmaster Help videos featuring Matt Cutts (and in some cases, other Googlers) talking about various types of webspam.

    So far, we’ve looked at three videos about unnatural links, one about thin content, one about user-generated spam and one about pure spam. You can find them all here.

Finally, on to hidden text and/or keyword stuffing. This, like much of the content found in the other videos, is pretty basic stuff and pretty common SEO knowledge, but that doesn’t mean it’s not valuable information to some.

  • If Google Has Accused Your Site Of ‘User-Generated Spam,’ You’ll Want To Watch This Video

    Google pumped out a batch of new videos about webspam via its Webmaster Help YouTube channel. You can find others from the series here.

    We just looked at one about the “pure spam” manual action label. This one is about “user-generated spam”.

    User-generated spam could include forum spam, spammy user profiles, spammy blog comments, spammy guestbook comments, etc.

“The good thing is that normally when you see this kind of message, it normally means that the manual action we’ve taken is pretty precisely scoped,” Cutts says. “If possible, we try to avoid taking action on the whole domain. We might say something like, ‘Okay, don’t trust this forum. Don’t trust this part of the site,’ and that’s kind of nice because it doesn’t affect the primary part of your site, as long as your site is high quality. It might just affect your forum. So that’s how we try to do it unless we see so many different parts of the site that have been defaced or have been overrun that we end up taking action on the entire site.”

    The advice if you get this message is basically to clean it up. He suggests looking at new users that have been created, finding the spammy ones and kicking them out of your system. Also, deleting threads that are spammy would be a good idea.

You also want to take preventive measures like CAPTCHAs and comment moderation.

    Google is clearly doing more to educate people about its manual actions. The company also just put out a new Webmaster Tools feature that lets users see when they have a manual action against them.

  • Google’s Cutts Explains The ‘Pure Spam’ Manual Action Label

    Google has put out a series of videos discussing various forms of webspam. You can see others from the series here.

    In this one, Google’s Matt Cutts explains the “Pure Spam” manual action label.

This basically includes scraping, cloaking and automated black hat drivel. This kind of spam accounts for the vast majority of the sites Google takes action on, Cutts says.

    He does talk about the scenario of buying a domain that had earned this label and getting Google to trust it under your ownership, which some people may find helpful.

    Google, in case you haven’t heard yet, has just added a new feature to Webmaster Tools called Manual Action Viewer, which will let webmasters see if Google has taken a manual action against their site. According to Cutts, this only happens for less than 2% of domains.

  • …And Here’s One More Video Of Matt Cutts Talking About Unnatural Links

    Google has put out a new group of videos about various webspam topics. Three of these are specifically about unnatural links. Here’s one on unnatural links from your site, and here’s one on unnatural links to your site.

    While both of these videos featured Matt Cutts with other Googlers, this one is just Cutts himself talking about unnatural links and their impact.

“Over time, we’ve gotten more granular, and our approaches have become more sophisticated, and so as a result, if you think perhaps that your site overall is good, but there might be some bad links (it’s not all bad links, but a portion of bad links), then we might be taking targeted action on just those links. And that can be bad links or links that you might not think of as typically bad.”

    He goes on to talk about various examples. If you’ve got about ten minutes to spare, you’ll probably want to give it a watch.

  • For Better Or Worse, A Lot Of Change Is Coming To Google SEO

    Google has a lot of stuff in the works that will have a direct impact on webmasters and the search engine optimization community. In a seven-minute “Webmaster Help” video, Google’s Matt Cutts (sporting a Mozilla Firefox shirt), ran down much of what Google’s webspam team has planned for the coming months, and what it all means for webmasters. It involves the Penguin update, the Panda update, advertorials, hacked sites, link spam, and a lot more.

    Are you paying close attention to Google’s algorithm updates these days? Are you looking forward to the updates, or are you afraid of what they will bring? Let us know in the comments.

Cutts is careful to note that any of this information is subject to change, and should be taken with a grain of salt, but this is pretty much the kind of stuff they have planned at the moment.

    Penguin

    We already knew the Penguin update was on the way, and he touches on that.

“We’re relatively close to deploying the next generation of Penguin,” says Cutts. “Internally we call it ‘Penguin 2.0,’ and again, Penguin is a webspam change that’s dedicated to try to find black hat webspam, and try to target and address that. So this one is a little more comprehensive than Penguin 1.0, and we expect it to go a little bit deeper and have a little bit more of an impact than the original version of Penguin.”

    Before the video came out, Cutts was already talking about this update on Twitter, saying that it would be “larger” and roll out in the “next few weeks”.

    Updates To Panda

Google recently changed its updating strategy for Panda. Webmasters used to anxiously await coming Panda updates, but Google has turned it into a rolling update, meaning that it will continue to update often and regularly, to the point where anticipating any one big update is not really possible any longer. On top of that, Google stopped announcing them, as it just doesn’t make sense for them to do so anymore.

    That doesn’t mean there isn’t Panda news, as Cutts has proven. It turns out that the Panda that has haunted so many webmasters over the last couple years may start easing up a little bit, and become (dare I say?) a bit friendlier.

    Cutts says, “We’ve also been looking at Panda, and seeing if we can find some additional signals (and we think we’ve got some) to help refine things for the sites that are kind of in the border zone – in the gray area a little bit. And so if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, then that will help sites that have previously been affected (to some degree) by Panda.”

    Sites And Their Authority

    If you’re an authority on any topic, and you write about it a lot, this should be good news (in a perfect world, at least).

    “We have also been working on a lot of ways to help regular webmasters,” says Cutts. “We’re doing a better job of detecting when someone is more of an authority on a specific space. You know, it could be medical. It could be travel. Whatever. And try to make sure that those rank a little more highly if you’re some sort of authority or a site, according to the algorithms, we think might be a little more appropriate for users.”

    Advertorials

    Also on the Google menu is a bigger crackdown on advertorials.

“We’ve also been looking at advertorials,” says Cutts. “That is sort of native advertising – and those sorts of things that violate our quality guidelines. So, again, if someone pays for coverage, or pays for an ad or something like that, those ads should not flow PageRank. We’ve seen a few sites in the U.S. and around the world that take money and do link to websites, and pass PageRank, so we’ll be looking at some efforts to be a little bit stronger on our enforcement of advertorials that violate our quality guidelines.”

    “There’s nothing wrong inherently with advertorials or native advertising, but they should not flow PageRank, and there should be clear and conspicuous disclosure, so that users realize that something is paid – not organic or editorial,” he adds.
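As context for the “should not flow PageRank” point: the standard way at the time to keep a paid link from passing PageRank was to add rel="nofollow" to it. A sponsored link might be marked up something like this (the URL and anchor text are placeholders):

<a href="http://advertiser.example.com/" rel="nofollow">Sponsored: Example Advertiser</a>

That keeps the ad visible to users while telling search engines not to count it as an editorial endorsement.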

    Queries With High Spam Rates

    Google will also be working harder on certain types of queries that tend to draw a lot of spam.

    Cutts says, “We get a lot of great feedback from outside of Google, so, for example, there were some people complaining about searches like ‘payday loans’ on Google.co.uk. So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including for example, some more pornographic queries, and some of these changes might have a little bit more of an impact on those kinds of areas that are a little more contested by various spammers and that sort of thing.”

    Denying Value To Link Spam

    Google will continue to be vigilant when it comes to all types of link spam, and has some new tricks up its sleeve, apparently.

    Cutts says, “We’re also looking at some ways to go upstream to deny the value to link spammers – some people who spam links in various ways. We’ve got some nice ideas on ways that that becomes less effective, and so we expect that that will roll out over the next few months as well.”

    “In fact, we’re working on a completely different system that does more sophisticated link analysis,” he adds. “We’re still in the early days for that, but it’s pretty exciting. We’ve got some data now that we’re ready to start munching, and see how good it looks. We’ll see whether that bears fruit or not.”

    Hopefully this won’t lead to a whole lot of new “fear of linking” from webmasters, as we’ve seen since Penguin first rolled out, but that’s probably wishful thinking.

    Hacked Sites

    Google intends to get better on the hacked sites front.

    “We also continue to work on hacked sites in a couple different ways,” says Cutts. “Number one: trying to detect them better. We hope in the next few months to roll out a next-generation site detection that is even more comprehensive, and also trying to communicate better to webmasters, because sometimes they see confusion between hacked sites and sites that serve up malware, and ideally, you’d have a one-stop shop where once someone realizes that they’ve been hacked, they can go to Webmaster Tools, and have some single spot where they could go and have a lot more info to sort of point them in the right way to hopefully clean up those hacked sites.”

    Clusters Of Results From The Same Site

    There have been complaints about domain clustering in Google’s results, and Google showing too many results from the same domain on some queries.

Cutts says, “We’ve also heard a lot of feedback from people about – if I go down three pages deep, I’ll see a cluster of several results all from one domain, and we’ve actually made things better in terms of – you would be less likely to see that on the first page, but more likely to see that on the following pages. And we’re looking at a change, which might deploy, which would basically say that once you’ve seen a cluster of results from one site, then you’d be less likely to see more results from that site as you go deeper into the next pages of Google search results.”

    “We’re going to keep trying to figure out how we can give more information to webmasters…we’re also going to be looking for ways that we can provide more concrete details, [and] more example URLs that webmasters can use to figure out where to go to diagnose their site.”

So Google has a lot of stuff in the works that SEOs and webmasters are going to want to keep a close eye on. It’s going to be interesting to see the impact it all has. Given that Google makes algorithm changes every day, this has to be far from everything they have in the works, but I guess the video makes up for the lack of “Search Quality Highlights” from Google in recent months. Still wondering if those are ever coming back. They were, after all, released to keep Google more transparent.

What do you think of the changes Matt Cutts talked about? Looking forward to any of them? Dreading any? Let us know in the comments.

  • Matt Cutts: Panda Update Coming Friday, ‘Big’ Penguin Update Later This Year

    According to Google webspam head Matt Cutts, we can expect the next Panda refresh to occur within the next few days.

    Speaking at the SMX conference, Cutts said that the next Panda update will take place this Friday, March 15th or by Monday, March 18th at the latest.

    The last Panda update rolled out on January 22nd, and Google said that it affected 1.2% of queries. Even if a Panda update launches this Friday, it will have been the longest time between updates in recent memory. Google previously released a Panda update a few days before Christmas, and two back in November.

    Although the Panda refresh is coming sooner, a Penguin update is also on the horizon – and Cutts said that it’ll be a big one. Cutts said that it will be one of the most talked-about updates of the year.

    They are “working on the next generation of Penguin,” said Cutts.

    More algorithm changes were discussed at SXSW last week. There, Cutts announced a possible crackdown on bad online merchants.

    “We have a potential launch later this year, maybe a little bit sooner, looking at the quality of merchants and whether we can do a better job on that, because we don’t want low quality experience merchants to be ranking in the search results,” he said.

Check here for more on the future of Panda and Penguin in 2013.

  • Matt Cutts Talks About Fighting Webspam Around The World

    Google put out a new Webmaster Help video today. This time Matt Cutts talks about the company’s efforts to fight webspam on a global scale, as opposed to just in the U.S. and in English.

    The video was a response to the user-submitted question:

    Europe is small compared with USA, so will Google get a webspam team for smaller markets?

“It turns out we actually do have a webspam team based in Europe (in Dublin, in fact), and they’re able to handle webspam and tackle spam reports in a wide variety of languages, so on the order of well over a dozen – dozens of languages, because there’s a lot of smart people there,” says Cutts. “So we actually have people on the ground in a lot of different offices around the world, and we also have engineers in Zurich. We have an engineer in Hong Kong, but there’s a lot of people who have native experience…people who think about spam in Russia, but also a lot of people in Dublin, who have done a fantastic job dealing with, you know, if an algorithm misses something, they’re there to find the spam. They know the lay of the land. They know who the big players are, and they’re really quite expert.”

“But if there’s some kind of really unique link spam going on in Poland, for example, there’s a person there, and those people are on top of that situation,” he adds. “So, I think it’s important that Google not be just a U.S.-centric or an English-centric company. We want to be international. We want to deal with all different languages, and it is the case that we might not have webspam full-time on every single language, but you would be pretty shocked at the number of languages that the webspam team collectively is able to fight spam in.”

    Webspam is always a big issue for Google, but it’s been a particularly big issue in the search industry this year, thanks to Google’s launch of the Penguin update, designed to algorithmically tackle sites violating Google’s quality guidelines.

    In another video from Google this week, Cutts said that about 90% of the messages Google sends out to webmasters are about black hat webspam.

    More recent Webmaster Help videos from Matt Cutts here.

  • Spinfographics: When Will Google Crack Down On Infographic Spam?

    Infographic spam may soon take its rightful place in the grand lineage of splogs, duplicate-content articles and mass directory submissions.

Spinfographics. Get ready, because they’re about to flood the Internet.

You never heard of “spinfographics” before? It means, well, if you know what a splog is, you will probably understand exactly what a spinfographic is. If not, we had best go back to the beginning.

    In the beginning there was Google.

    Google created the Internet, and saw that it was good.

    Then Google created websites, and saw that it was good.

    Then Google ranked websites, and saw that it was good.

    And Google told webmasters to make their websites for users, not for higher rankings. But webmasters were tempted, and they took of the fruit of the Tree of Knowledge, that they might be like Google and know the ranking algorithm.

    And nothing was the same again. Every time webmasters took another bite of the fruit, webmasterkind would spoil it. The pattern was always the same…

1. Knowledge: A few people discover that they can rank better by adding more terms in the keywords meta tag.

    2. Temptation: Everybody decides to stuff their keyword meta tag so they can rank for everything.

    3. A big mess!: Suddenly rankings are scalable, everybody can do it and replicate with ease. Too much quantity, too little quality.

    4. Banishment: Google removes keywords meta data from its algorithm.

    Sometime later…

    1. Knowledge: People learn that directory links can be useful for ranking well.

2. Temptation: Some people realize that if they can create tons of directories, they can get lots of webmaster traffic. Other people discover that if they can auto-submit sites, they can make money for building tons of links.

    3. A big mess!: Suddenly link-building is scalable, everybody can do it and replicate with ease. Too much quantity, too little quality.

    4. Banishment: Google devalues directory links in its algorithm.

    Then…

    1. Knowledge: People learn that article directory links can be useful for ranking well.

    2. Temptation: Some people realize that if they can create tons of article directories, they can get lots of webmaster traffic. Other people discover that if they can auto-submit articles, they can make money for building tons of links. Quickly. Cheaply.

    3. A big mess!: Suddenly article submissions are scalable, everybody can do it and replicate with ease. Too much quantity, too little quality.

    4. Banishment: Google devalues links from duplicate content in its algorithm.

    Then, of course…

    1. Knowledge: People figure out that if they spin each article into various versions, they can use the same basic content without creating duplicate content.

    2. Temptation: Some people realize if they can automate the spinning process, they can create lots of articles easily from the same content. Quickly. Cheaply.

    3. A big mess!: Suddenly article spinning is scalable, everybody can do it and replicate with ease. Too much quantity, too little quality. In fact, so little quality that it starts turning the Internet into a waste bin.

    4. Banishment: Google devalues spun content in its algorithm and penalizes heavy users.

    We are getting closer. And then…

    1. Knowledge: People figure out that keyword rich links in blog content are the best links for ranking well.

    2. Temptation: Some people realize how much money they can make by offering tons of in-content blog links for very little work by creating blogs just to sell links.

    3. A big mess!: Suddenly in-content blog link-building is scalable, and splogs (spam blogs) are popping up like weeds. Too much quantity, too little quality. Yes, the Internet really is looking more and more like a waste bin.

    4. Banishment: Google de-indexes whole networks of splogs and penalizes heavy users. Can you say “Penguin”?

    And next…

    1. Knowledge: Some people figure out that they can get lots of good links by sharing Infographics.

    2. Temptation: Infographics galleries start popping up and some people realize there is a market to be made selling “cheap, easy, DIY Infographics”.

    3. A big mess!: Suddenly Infographics creation and distribution becomes ______________ . Too much quantity, too little quality. (Fill in the blank. Hint, it rhymes with “shwalable”). Yes, we transition from Infographics to spinfographics.

    4. Banishment: Google _________________________ (Fill in the blanks). What do you think Google will do to spinfographers – to webmasters who mass produce and mass distribute Infographics?

    Listen carefully, and you can already hear the moaning and groaning on future webmaster forums, as people complain with surprise that their sites have been penalized or lost rankings because they were mass distributing Infographics to artificially boost their rankings.

    “But Google says, ‘The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community.’”

    OK, sure. But the pattern is always the same. Huge swarms of webmasters looking for shortcuts, trying to mass produce quality, totally oblivious to the oxymoron of their business model. And they spoil it for the rest of us. Already people are advertising services to create “easy” infographics “in minutes” for a very “cheap” price.

    Does this mean the days of Infographics are numbered? I don’t think so. There always has been a place for graphical displays of data. Newspapers have been doing it for decades, and it will continue on the Internet.

    However, I am certain that any popular link-bait strategy using Infographics today will be outdated a year or two from now. Smart webmasters will go back to the table and reconsider how to use Infographics to boost their promotions.

Done right, I am confident that these will always be useful for search engine rankings. Just like blog links. And content spinning. And article links. And directory links. And…well, maybe not meta tags.

    Just as in all these previous techniques, webmasters will have to make sure that it is perfectly clear to the search engines that they are not mass-producing, mass-linking or using a scalable or automated method to create or distribute content.

    And there is a single strategy that applies to all of these. Don’t do it for the search engines; do it for reaching out to new markets. Don’t ignore the search engines; keep one eye on them with everything you do. But if the main goal of any action is aimed at reaching new markets, you will end up creating and distributing the kind of content that Google wants you to. Or at least that Google is now saying that it wants you to – but that is another scary topic for another discussion.

    For now, the key thing is to avoid Spinfographics, because with the Penguin update, Google has shown that it is ready to do more than just devalue scalable links – they are willing to penalize sites involved.

  • Matt Cutts: Here’s How To Expose Your Competitors’ Black Hat SEO Practices

    Google’s Matt Cutts put out a Webmaster Help video discussing how to alert Google when your competitors are engaging in webspam and black hat SEO techniques. The video was in response to the following user-submitted question:

    White hat search marketers read and follow Google Guidelines. What should they tell clients whose competitors use black hat techniques (such as using doorway pages) and whom continue to rank as a result of those techniques?

Do you think Google does a good job catching webspam? Let us know in the comments.

    “So first and foremost, I would say do a spam report, because if you’re violating Google’s guidelines in terms of cloaking or sneaky JavaScript redirects, buying links, doorway pages, keyword stuffing, all those kinds of things, we do want to know about it,” he says. “So you can do a spam report. That’s private. You can also stop by Google’s Webmaster forum, and that’s more public, but you can do a spam report there. You can sort of say, hey, I saw this content. It seems like it’s ranking higher than it should be ranking. Here’s a real business, and it’s being outranked by this spammer…those kinds of things.”

He notes that there are both Google employees and “super users” who keep an eye on the forum, and can alert Google about issues.

“The other thing that I would say is if you look at the history of which businesses have done well over time, you’ll find the sorts of sites and the sorts of businesses that are built to stand the test of time,” says Cutts. “If someone is using a technique that is a gimmick or something that’s like the SEO fad of the day, that’s a little less likely to really work well a few years from now. So a lot of the times, you’ll see people just chasing after, ‘OK, I’m going to use guest books,’ or ‘I’m going to use link wheels’ or whatever. And then they find, ‘Oh, that stopped working as well.’ And sometimes it’s because of broad algorithmic changes like Panda. Sometimes it’s because of specific web spam targeted algorithms.”

    I’m sure you’ve heard of Penguin.

    He references the JC Penney and Overstock.com incidents, in which Google took manual action. For some reason, he didn’t bring up the Google Chrome incident.

    This is actually a pretty timely video from Cutts, as another big paid linking controversy was uncovered by Josh Davis (which Cutts acknowledged on Twitter). Google ended up de-indexing the SEO firm involved in that.

    “So my short answer is go ahead and do a spam report,” Cutts continues. “You can also report it in the forums. But it’s definitely the case that if you’re taking those higher risks, that can come back and bite you. And that can have a material impact.”

    He’s not joking about that. Overstock blamed Google for “an ugly year” when its revenue plummeted. Even Google’s own Chrome penalty led to some questions about the browser’s market share.

    Cutts notes that Google is also happy to get feedback at conferences, on Twitter, online, blogs, forums, “if you’re seeing sites that are prospering and are using black hat techniques.”

    “Now, it’s possible that they have some low-quality links, and there are some links that people aren’t aware of that we see that are actually high quality,” Cutts notes. “But we’re happy to get spam reports. We’re happy to dig into them. And then we’ll try to find either new algorithms to try to rank the things more appropriately in the future. Or we’re certainly willing to take manual action on spam if it’s egregious or if it violates our guidelines. We have a manual web spam team that is willing to respond to those spam reports.”

    According to Cutts, you can even submit spam reports using Google Docs. Here’s a conversation he had on Twitter recently:

@mattcutts Can we send a link to a Google Docs spreadsheet when reporting spam? #penguin

    After Google launched the Penguin update, Cutts tweeted the following about post-Penguin spam reports:

To report post-Penguin spam, fill out https://t.co/di4RpizN and add “penguin” in the details. We’re reading feedback.

    Shortly thereafter, he tweeted:

@Penguin_Spam yup yup, we’ve read/processed almost all of them. A few recent ones left.

    I’m sure plenty more reports have rolled into Google since then, but it does seem like they process them fairly quickly.

    Do you think Google has done a good job at cleaning up webspam? Share your thoughts.

  • Google’s Amit Singhal: Penguin A Success

Early this morning, Google Fellow Amit Singhal was interviewed by Danny Sullivan and Chris Sherman on stage at SMX London, the sister conference of Search Engine Land. Singhal discussed a variety of Google search-related topics.

We were hoping to get some in-depth discussion about Google’s recent Penguin update, but apparently that wasn’t a major point of conversation. Daniel Waisberg liveblogged the discussion at Search Engine Land, and Penguin only came up briefly. Here’s the relevant snippet of the liveblog:

    Danny talks about Penguin and asks how it is going from Google standpoint, are search results better? Amit says that in the end of the day, users will stay with the search engine that provides the most relevant results. Google’s objective was to reward high quality sites and that was a success with Penguin. One of the beauties of running a search engine is that the search engines that can measure best what the users feel is the one that will succeed more.

    From Google’s perspective they use any signal that is available for them, more than 200 of them. They have to make sure they are accurate and good. They will use any signal, whether it is organic or not.

“Google Penguin’s objective is to reward high quality sites and authors” Amit Singhal #smxlondon

Panda and penguin update has gone really well… Can someone show amit the results for Viagra #smx

@dannysullivan please ask Amit if he has any Penguin recovery tips apart from removing links #smx

    Google’s Matt Cutts also recently said that Google has considered Penguin a success, though plenty out there disagree.

    If you want Google’s advice on Penguin recovery, check out these videos Matt Cutts says to watch, these tips he endorsed on Twitter, and of course Google’s quality guidelines.

  • What If The Google Penguin Update Inadvertently Killed The Web As We Know It?

    Note: Perhaps the headline of this article is a little sensational, but don’t overlook the “what if” part. I’m not suggesting Google has some plot to kill the web. However, many businesses rely on Google and people are freaking out about backlinks. Some are going so far as to threaten legal action if links are not removed. Links. If such legal action ever resulted in the outlawing of links in any capacity, the web as we know it could be put into great jeopardy. People would be afraid to link. I don’t think Google intends for anything like that to happen, but people don’t always respond to things in the most rational of ways. I don’t believe we will see links outlawed, or that the Penguin update will kill the web. However, reactions to Google penalties are leading to some pretty strong actions from some.

    Google has said on multiple occasions that it thinks the Penguin update has been a success. Do you agree? Let us know in the comments.

    PageRank And The Web

WWW, as you may know, stands for World Wide Web. It’s a web because it’s connected by links. Sites all over the web link to one another, creating a path for users to click from one page to the next. Often those pages are on different sites. This is the way it has worked for years. Just think what it would be like if sites couldn’t freely link to one another. The web would be broken, and users would suffer.

    When Google launched with its PageRank algorithm, it was a revolution in search. It seemed to be a better way of doing search. It gave a rhyme and reason to the ranking of search results. Today, Google uses over 200 signals to rank its search results, which are becoming more personalized than ever before. PageRank still matters, but it’s far from the only thing that matters.

Yet, it is PageRank that has given links on the web so much power to influence the visibility of web content. Now that just about everyone is on the web, everyone is fighting to have their content seen. Once upon a time, you would have thought: the more links the better. More links can only lead to more chances people will see your content. Now, somewhat ironically, people are finding that the links they have out there are making their content less visible. In some cases, they’re making it practically non-existent in Google, or at least so buried, it might as well be non-existent.

    Freak Out Time?

    Google’s Penguin update has been a major wake up call to webmasters about certain kinds of linking practices. The update was designed to target sites violating Google’s quality guidelines. Among those guidelines are: “Don’t participate in link schemes” and “Avoid hidden text or hidden links.”

    Some of Google’s guidelines are obvious – avoid obviously unethical practices. But in the link schemes department, things can get a little blurry. Just ask WPMU.org, which got hit by Penguin over a few questionable links (interestingly enough, after seemingly benefiting from Google’s Panda update, designed to reward higher quality sites).

A lot of webmasters have taken to the forums and blogs to complain about the Penguin update, but Google has, on more than one occasion, deemed the update a success. We’ll also be seeing it come back around every so often, much like its Panda predecessor.

Even before Penguin, Google was sending out tons of messages to webmasters alerting them of questionable links. All of this has gotten webmasters into a frenzy to clean up their link profiles, and reduce the number of links Google considers to be of poor quality, in hopes that their content can find its way back into Google search visibility.

    Legal Action Over Links?

    Some webmasters have even gone so far as to threaten legal action over sites that are linking to them. We referenced this in another article after Barry Schwartz at Search Engine Roundtable mentioned that this was happening. Now, Greg Finn at Search Engine Land has pointed to a specific example where PSKL got a DMCA take down notice from LifeShield, after writing a positive review.

    Now, to be clear, this DMCA takedown notice is not in reference to any content theft or content use. It’s about links. It threatens legal action. It says:

    I request you to remove from following website (pskl.us)
    all links to www.lifeshield.com website as soon as possible.
    In order to find the links please do the following:
    1) If this is an online website directory, use directory’s search system to find “LifeShield” links.
    2) If there are hidden links in the source code of website, open website’s main page and view its source code. Search for “lifeshield.com” in the source code and you will see hidden links.

    It also says:

    LifeShield, Inc will be perusing legal action if the webmaster does not remove the referenced link within 48 hours.

    Jeremy at PSKL actually shares the entire conversation around the matter, which did include an apology, indicating that PSKL shouldn’t have been on the list of sites that received a notice. Jeremy, however, took issue that there was a list of sites getting such notices. Throughout the conversation, it is revealed that LifeShield had a site “cloak lifeshield and generate over 700K back links” without LifeShield’s knowledge, and that “Google stepped in and slapped” them with a penalty, which led to layoffs at the company.

    Jeremy responded with, “So you’re saying that somebody went out and bought 700K back links for you, knowing that it would get you penalized by Google? So does that mean you had (Company name) send out 700K DMCA notices? Talk about throwing good money after bad. Report the linkspam to the spam team at Google, then spend that money on an SEO expert rather than on trying to bully people with intimidation.”

    The response was actually longer than that, and included the metaphor of putting out a house fire with manure, but that was the main gist.

    I suggest reading Jeremy’s entire post. It’s pretty interesting.

    Is This Where The Web Should Go?

    He does make another important point in this: A party creating large quantities of backlinks to a site in order to generate SEO (or, in this case, destroy SEO) is unethical. It is not illegal.

While many may not have a problem with such practices becoming illegal, it’s the idea that the law could intervene with linking in any form that could lead to greater problems. Just consider all of the gray area there already is in fair use law. There will always be different interpretations, and that can get dangerous.

    For the record (granted, I’m no lawyer), I wouldn’t expect any legal action, such as that threatened in LifeShield’s DMCA notice to hold much water in a court of law. Finn also points to two cases (Ford Motor Company v. 2600 Enterprises) and (Ticketmaster Corp. v. Tickets.com, Inc.), where the legality of linking prevailed.

    But even if things like this have to go to court, it’s going to be a major inconvenience, and legal fees will have to be paid. If sites practicing legitimate, ethical linking habits get caught up in this, where will that leave the web?

    Is this what linking on the World Wide Web will become? Will you have to worry about getting sued because you linked to a site, and that site may or may not find your site to be a strong enough site to desire a link from? Could you get sued because your page didn’t have a high enough PageRank, and not enough link juice to help the site you’re linking to in its search engine visibility?

LifeShield seems to be targeting some very specific webspam, but sending out notices to a whole list of sites. It’s likely that LifeShield isn’t the only company panicking and resorting to such action. It’s unfortunate for the company if some negative SEO (it’s unclear whether this came from a competitor) was able to have such an impact on its business, but as Jeremy suggests, this may not be the best way of trying to resolve the issue.

    Let’s Give Google Some Credit.

    You can point to Google’s guidelines and its algorithm updates, which clearly do cause some to think this way, but just the same, Google can’t be held entirely to blame for this kind of mentality either. The company has said in the past that people shouldn’t obsess with PageRank, and that it uses over 200 signals to rank content. PageRank is not the only thing that matters. In fact, the company puts out huge lists of signal changes every month.

It shows the power over society that Google really holds, though. It shows how much businesses rely on Google search when they will go so far as to threaten legal action against sites that are simply linking to them.

    Should such legal action ever lead to a victory in court, that could mean very bad news for the Web as we know it, and people could be afraid to link. I would imagine that would spawn more issues of sites not getting the credit (and possible referral traffic) they deserve.

    Do you think Google’s guidelines and penalties can have an influence on the law? Now that would be power, and made even more ironic still, by the fact that Google is constantly under scrutiny of its own.

    Share your thoughts in the comments.

    Image: Batman Returns (Warner Bros.)

  • Gooey Search: Kickstarter Project Claims To Be Google On Steroids (With Privacy)

    There’s an interesting Kickstarter project called Gooey Search, which bills itself as “Google on Steroids with Privacy”. It was developed by a small software company called Visual Purple. The tool comes in the form of an iPad app, as well as a Firefox add-on. I would assume it would be expanded to other platforms, should it reach its funding goal.

    Update: Here’s another interesting search engine project on Kickstarter.

    Visual Purple’s Megan Rutherford reached out to us to tell us a little about the project. First, check out the video:

    “We’ve been building bleeding-edge technology and advanced training simulations for years,” she says. “Recently, we developed and launched a professional data discovery tool for analysts and researchers. The tool is GisterPRO, and does some very powerful things such as read unstructured data – things computers don’t normally like to read.”

    “We agonized over finding a way to bring this technology to the rest of us,” she adds. “Then, we stumbled upon Kickstarter.”

    “Like most people, Google is our go-to search engine,” the Kickstarter page says. “From our extensive study of the mathematics of language, we found a great way to combine smart web bots and intelligent reading technology. Our technology reads every Google result, strips out the spam, and bubbles up only the best results along with the strongest concepts in a kinetic, Gooey Graph.”

    It strips out the spam and bubbles up the best results? Maybe Google should be checking this out as a possible acquisition target, given all the complaints that have been going around regarding the Penguin update.

    “Instead of marketing tags, these concepts are discovered entities that empower you to interact with and explore Google results like never before. Gooey makes search fun and rewarding for kids of all ages,” the page continues. “All you have to do is type your search terms into the search bar (just like you would any Google search). We issue your search to Google but our smart bots literally check every result returned – verifying each link and reading each document for you.”

    “On the right side of the Graphic User Interface (GUI for short) is Gooey Graph – an alive, real-time network diagram of discovered concepts,” the page explains. “Just play with Gooey Graph by deleting or stacking concepts to quickly sort results and find what you need.”

    Rutherford says Gooey Search is designed to bring “professional-grade data discovery technology to the rest of us”.

    “The sub-rosa story is that Gooey brings complete privacy, anonymity and automatic entity extraction to Google searches while neutralizing ‘Filter Bubble’ biasing of search results,” she says.

    More on the Filter Bubble here.

    The Kickstarter page includes the following image to illustrate how the user can manipulate the “Gooey Graph”:

    Gooey Search

    What do you think?

    Gooey will only be funded if it gets $125,000 in pledges by Friday, June 8. So far, it has attracted 44 backers pledging $2,380.

  • Matt Cutts Shares Something You Should Know About Old Links

    Google’s Matt Cutts has put out a new Webmaster Help video discussing something that’s probably on a lot of webmasters’ minds these days: what if you linked to a good piece of content, but at some point, that content turned spammy, and your site is still linking to it?

    In light of all the link warnings Google has been sending out, and the Penguin update, a lot of webmasters are freaking out about their link profiles, and want to eliminate any questionable links that might be sending Google signals that could lead to lower rankings.

    A user submitted the following question to Cutts:

    Site A links to Site B because Site B has content that would be useful to Site A’s end users, and Google indexes the appropriate page. After the page is indexed, Site B’s content changes and becomes spammy. Does Site A incur a penalty in this case?

    “OK, so let’s make it concrete,” says Cutts. “Suppose I link to a great site. I love it, and so I link to it. I think it’s good for my users. Google finds that page. Everybody’s happy. Users are happy. Life is good. Except now, that site that I linked to went away. It didn’t pay its domain registration or whatever, and now becomes maybe an expired domain porn site, and it’s doing some really nasty stuff. Am I going to be penalized for that? In general, no.”

    “It’s not the sort of thing where just having a few stale links that happen to link to spam are going to get you into problems,” he continues. “But if a vast majority of your site just happens to link to a whole bunch of really spammy porn or off-topic stuff, then that can start to affect your site’s reputation. We look at the overall nature of the web, and certain amount of links are always going stale, going 404, pointing to information that can change or that can become spammy.”

    “And so it’s not the case that just because you have one link that happens to go to bad content because the content has changed since you made that link, that you’re going to run into an issue,” he concludes. “At the same time, we are able to suss out in a lot of ways when people are trying to link to abusive or manipulative or deceptive or malicious sites. So in the general case, I wouldn’t worry about it at all. If you are trying to hide a whole bunch of spammy links, then that might be the sort of thing that you need to worry about, but just a particular site that happened to go bad, and you don’t know about every single site, and you don’t re-check every single link on your site, that’s not the sort of thing that I would worry about.”

    Of course, a lot more people are worried about negative SEO practices, and inbound links, rather than the sites they’re linking to themselves.

    More Penguin coverage here.

  • Comment Spammers: These Links Are Not Helping You

    In light of Google’s Penguin update, it seems like a good time to suggest that you don’t spam blog comments. Even if you’re not technically spamming, and are leaving semi-thoughtful comments (but your ultimate goal is to get a link), it’s very likely that the blog you’re commenting on implements the nofollow attribute on comment links, which keeps the links from passing PageRank.

    Don’t forget that nofollow was introduced with blog comments in mind. Google put out a post in early 2005 called “Preventing Comment Spam,” in which it said:

    If you’re a blogger (or a blog reader), you’re painfully familiar with people who try to raise their own websites’ search engine rankings by submitting linked blog comments like “Visit my discount pharmaceuticals site.” This is called comment spam, we don’t like it either, and we’ve been testing a new tag that blocks it. From now on, when Google sees the attribute (rel=”nofollow”) on hyperlinks, those links won’t get any credit when we rank websites in our search results. This isn’t a negative vote for the site where the comment was posted; it’s just a way to make sure that spammers get no benefit from abusing public areas like blog comments, trackbacks, and referrer lists.
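
    For illustration, this is roughly what a nofollowed comment link looks like in a page’s HTML (a generic example, not markup lifted from any particular blog):

      <!-- Comment author link carrying rel="nofollow": search engines are told not to pass PageRank through it -->
      <a href="http://example.com/" rel="nofollow">discount pharmaceuticals</a>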

    SEO consultant Carson Ward recently wrote a great article at SEOmoz about types of link spam to avoid. One of those was comment spam.

    “If I were an engineer on a team designed to combat web spam, the very first thing I would do would be to add a classifier to blog comments,” he wrote. “I would then devalue every last one. Only then would I create exceptions where blog comments would count for anything.”

    “Let’s pretend that Google counts every link equally, regardless of where it is on the page. How much do you think 1/1809th of the link juice on a low-authority page is worth to you?” he wrote, referring to a screen cap of a spam comment on a page with 1808 other comments. “Maybe I’m missing something here, because I can’t imagine spam commenting being worth anything at any price. Let’s just hope you didn’t build anchor text into those comments.”

    It may seem like common sense to many, but it’s amazing how frequently comment spam occurs, even today, even on blogs that implement nofollow on comment links.

    For the Bloggers

    Matt Cutts put out a pretty popular blog post in 2009 about PageRank sculpting. Here’s what he had to say about blog comments in it:

    Q: If I run a blog and add the nofollow attribute to links left by my commenters, doesn’t that mean less PageRank flows within my site?

    A: If you think about it, that’s the way that PageRank worked even before the nofollow attribute.

    Q: Okay, but doesn’t this encourage me to link out less? Should I turn off comments on my blog?

    A: I wouldn’t recommend closing comments in an attempt to “hoard” your PageRank. In the same way that Google trusts sites less when they link to spammy sites or bad neighborhoods, parts of our system encourage links to good sites.

    Some bloggers aren’t opposed to turning off comments though. We had a couple of interesting conversations with bloggers Jeremy Schoemaker and Michael Gray last year, following the Panda update. Panda was all about the quality of content on a page, and obviously blog comments can carry varying degrees of quality.

    Schoemaker told us that he called a Google engineer friend and asked about this. Schoemaker said he was told that if anything, it’s “diluting the quality score of my page” by possibly diluting overall keyword density. Another factor could be comments that go through, but are clearly spam. These send signals that the page is not being well maintained.

    Gray, who turned off his blog comments years ago, told us last year, “While I’m not living in the SEO world of 1999, things like keyword focus and density do play a role. If you’re doing your job as an SEO, in 95% of the cases the keyword you are trying to rank for should be the most used word/phrase on your page. If you’ve gone to all the trouble to do that, why would you now let any knucklehead with a keyboard and internet connection come by and screw that up with comments?”

    Google says in its help center, “If you can’t or don’t want to vouch for the content of pages you link to from your site — for example, untrusted user comments or guestbook entries — you should nofollow those links. This can discourage spammers from targeting your site, and will help keep your site from inadvertently passing PageRank to bad neighborhoods on the web.”

    “In particular, comment spammers may decide not to target a specific content management system or blog service if they can see that untrusted links in that service are nofollowed,” it says. “If you want to recognize and reward trustworthy contributors, you could decide to automatically or manually remove the nofollow attribute on links posted by members or users who have consistently made high-quality contributions over time.”

    As far as I can tell, nofollow hasn’t done much to deter spammers, but at least it does keep you from passing PageRank to bad neighborhoods.

  • The Blurry Lines Of Google’s Paid Links Policy

    As you probably know, Google isn’t a fan of people paying for links that pass PageRank. It’s considered to be a manipulation of search results and a violation of Google’s quality guidelines, which are the focus of Google’s Penguin update. It’s interesting that there seem to be exceptions to the rule, such as a directory like Best Of The Web, which has users pay for their sites to be considered for links.

    Update: BOTW has gotten back to us since this article was published. Please see BOTW President Greg Hartnett’s comments toward the end of the article.

    Perhaps more interesting is that some similar directory sites, which aren’t necessarily in clear violation of Google’s guidelines, seem to be getting penalized, or at the very least drawing the ire of unhappy webmasters looking to get their link profiles cleaned up after receiving messages from Google.

    Should a directory in which you have to pay to get a listing be treated like other sites that offer paid links? Let us know what you think in the comments.

    Google recently launched a PageRank update, and many directory sites saw their PR plummet. Best Of The Web, meanwhile, has managed to maintain 4s, 5s and 6s. At a time when flustered webmasters are looking to eliminate lower-end links, the topic of directory links on the web seems more relevant than it’s been for quite some time.

    Webmasters Are Angry

    Barry Schwartz at Search Engine Roundtable ran a very interesting story about Google being “the cause of lawsuits over links to web sites.”

    “Can you imagine writing a story, linking that story to other relevant web sites and then years later being hit with a lawsuit over linking to a web site?” he asks.

    The gist is that webmasters who have been receiving those messages from Google about unnatural links are threatening to sue sites that are linking to them. “Some webmasters are taking extreme measures and threatening to sue publishers and webmasters who are linking to them,” he reports.

    I don’t know how often this is actually happening, but I can’t say it’s much of a surprise. If any such lawsuit is successful, then we have a problem.

    I don’t know about the legal threats, but I do know a lot of directories are getting angry emails from webmasters who have links coming from them.

    Google has taken issue with directories in the past – sort of. Here’s what the company told us in 2007:

    There’s no “outright penalty” for being a directory, but we do value, as I’m sure you’ve heard, “unique, compelling content.”

    Directories can run into the problem of not containing original information.

    There do seem to be some directories that have historically received a bit more respect from Google. This includes Best Of The Web, which as I said, charges users for possible inclusion.

    Google has talked about this in the past. Here’s a video about it from Matt Cutts from 2009:

    The user-submitted question Cutts was responding to was:

    Will Google consider Yahoo! Directory and BOTW as sources of paid links? If no, why is this different from another site that sell[s] links?

    He doesn’t entirely answer the question, however. He does say:

    “Whenever we look at whether a directory is useful to users, we say, ‘OK, what is the value add of that directory?’ So, you know, do they go out and find their entries on their own, or do they only wait for people to come to them, you know, how much do they charge and what’s the editorial service that’s being charged?”

    “If a directory takes $50 USD and every single person who ever applies in the directory automatically gets in for that 50 dollars, there is not as much editorial oversight as something like the Yahoo directory, where people do get rejected. So, you know, if there is no editorial value add there, then that is much closer to paid links.”

    So basically, it sounds like if a directory rejects some things, this is OK.

    How Best Of The Web Works

    So how does Best Of The Web work, exactly? You go to submit a site, and you’re presented with a page like this:

    Best of the Web

    It’s clear that the main motivation for submitting to this directory is to help your search engine rankings. It says, “Listing your website in the internet’s most respected directory will help increase your website’s visibility in major search engines.”

    The first example of a “link scheme” Google lists on its page about them is: “Links intended to manipulate PageRank.” While I can’t find anything on BOTW that specifically says anything about PageRank, is that not what submitters are after here?

    Best Of The Web presents multiple quotes from various marketing types, like:

    “After implementing a plan with listings across several BOTW directories, we were able to see immediate and quantifiable improvement in our rankings. Working with BOTW has been a great success for Marriott.” — Benjamin Burns, Search Specialist

    “BOTW provided excellent service for us and our listings. I would hire them over and over again every time we need directory listings.” — Marek Wawrzyniak, SEO Specialist

    “Best of the Web has proven to be a successful strategy for Extra Space Storage when coupled with other local SEO techniques. We have seen a consistent ranking improvement in many areas with our local storage facilities by having Best of the Web part of our organic strategy.” — Tim Eyre, Interactive Marketing Manager

    It’s obvious that the reason one would want to be listed in this directory is SEO. It’s not because people are going to the directory to search for businesses. It’s an SEO strategy – something BOTW seems pretty up-front about.

    Link Schemes

    Let us refer to that “Link Schemes” help center page (linked to from its Quality Guidelines page) for a moment. That says:

    Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links count towards your rating. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity. However, some webmasters engage in link exchange schemes and build partner pages exclusively for the sake of cross-linking, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. This is in violation of Google’s Webmaster Guidelines and can negatively impact your site’s ranking in search results. Examples of link schemes can include:

    • Links intended to manipulate PageRank
    • Links to web spammers or bad neighborhoods on the web
    • Excessive reciprocal links or excessive link exchanging (“Link to me and I’ll link to you.”)
    • Buying or selling links that pass PageRank

    Let’s read that last one again. “Buying or selling links that pass PageRank.”

    Links That Pass PageRank

    As far as I can tell, if you have managed to get listed in Best Of The Web, the link will pass PageRank. The links I looked at do not include the nofollow attribute, which would prevent them from passing PageRank.

    The links marked as “ads” at the top of category pages do include the nofollow attribute.
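
    To make that distinction concrete, here is roughly what the difference looks like in a page’s HTML (illustrative markup only, not copied from BOTW):

      <!-- A regular listing link with no rel="nofollow"; it can pass PageRank -->
      <a href="http://example-listing.com/">Example Listing</a>

      <!-- A sponsored "ads" link carrying rel="nofollow", which keeps it from passing PageRank -->
      <a href="http://example-advertiser.com/" rel="nofollow">Example Advertiser</a>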

    The category page I looked at has a PageRank of 5. Some pages are higher, and some are lower. The home page has a 6.

    Back To The Submission Process

    If you click to get started, you are prompted to provide your email address (twice), and then to fill out a large form. The last part of that form is for the payment details:

    Best Of The Web Payment Details

    You can choose from two plans: an annual fee or a one-time fee. Once you click submit, your card will be charged. You must check the box that says you’ve read the ToS and privacy policy. It’s only when you click through to the ToS, and through one more link there, that you find out your site may not even appear in the listings. It says, “There is no guarantee that my site will be added to the directory” and that the charge is non-refundable. You agree that you understand that “BOTW editors, in their sole and final judgement, shall determine the suitability, placement, title and description of all sites listed in the BOTW Directory.”

    There’s nothing wrong with BOTW wanting to be selective in the editorial process. That’s what Google has indicated in the past is actually what makes directories like this higher quality in Google’s eyes. That said, Google is always preaching about user experience, and encouraging sites to provide what’s best for the user. User trust has been a major theme, particularly since the Panda update.

    BOTW does require submitters to confirm they’ve read the ToS before charging them, but the part about potentially not being included, even with no refund, seems a bit buried.

    Is BOTW’s practice OK in Google’s eyes because they’re using enough judgment not to include EVERY link that people are paying for in hopes of a listing?

    Is This What Google Wants From A Directory?

    I’m not going to advise you to sell or pay for links at all, but I feel like Google is sending some very mixed signals here.

    Search engine industry vet Tim Mayer, who worked at Yahoo until 2010, tells WebProNews, “It is interesting as they [BOTW] are positioned similarly to the Yahoo directory of old with editors and payment. Other directories’ models, such as business.com’s, failed due to Google changing their treatment of them. Not sure if this was due to quality or the lack of editorial oversight.”

    “Many other directories are or are considered spam sites/directory link farms as they are just pages of paid links,” he adds. “Seems to me this may be legacy treatment. But I have not looked at BOTW and analyzed it in some time. Google probably has a better sense of if this is a good authority hub or not. If it is, they should use it. I would bet that they are better quality than most directory sites.”

    But it’s not really even an issue of quality. It seems like more of a double standard on Google’s part, given that the company clearly lists “Buying or selling links that pass PageRank” as an example of a link scheme.

    Editorial judgment is clearly a factor, but is it really the “best” of what the web has to offer, or is it some of the best, with some sites that actually paid for reviews getting in there too, regardless of whether they’re really the best? Update: Hartnett says “an almost imperceptible percentage” of the links are from those who paid for the reviews.

    Look at this listing for Caagal.com on BOTW’s Business Classifieds category page, for example. A quick glance at this site (complete with loading errors) doesn’t suggest “best” of what the web has to offer in this niche, though this is certainly subjective. It doesn’t even seem to be largely business-oriented, but more property and boat oriented. For the record, I have no idea if this site paid or not.

    Granted, the site is nowhere to be found in Google, for the query “business classifieds” (at least within the first six pages). It’s hard to say how much value that site may have gotten from paying to be listed in Best Of The Web, but I guess they at least got a PageRank 4 link out of it (PR for that category page).

    Obsess With Google’s Quality Guidelines or Not?

    Webmasters are frantically trying to distance themselves from some directory sites after getting messages from Google about unnatural links. Even directories that have never offered paid links are getting emails from upset webmasters. Jayde, for example (disclosure: owned by WPN parent iEntry), has gotten quite a few. Jayde has never offered paid links, and recently made all links nofollow.

    If webmasters are looking to start suing sites that are linking to them because they are under the impression that these links are hurting them, that’s pretty bad.

    Interestingly enough, Google used to encourage directory submissions.

    “In fact, if you look at our webmaster quality guidelines, we used to have a guideline that says, you know, submit your site to directories, and we gave a few examples of directories,” Cutts explains in that video. “And what we find, or what we found was happening, was people would get obsessed with that line and go out and look for a lot of directories.”

    “We ended up taking out that mention in our webmaster guidelines so that people don’t get obsessed with directories and think, ‘Yes, I have to go find a bunch of different directories to submit my site to,’” says Cutts in the video.

    I realize this video is 3 years old, but I have to say, this seems to be an example of mixed signals coming from Google again. This would indicate that you shouldn’t obsess over the things in Google’s quality guidelines, but as you probably know, the Penguin update, which launched a couple weeks ago, was all about targeting sites violating the quality guidelines.

    To Sum Up

    – Google used to encourage directory submissions in its quality guidelines.

    – Google decided people shouldn’t obsess about that.

    – Now people are freaking out about links that they have from such directories that they submitted to, and some may even be so angry as to threaten legal action (though I can’t imagine there are any legitimate grounds).

    – Best Of The Web, which charges money for the chance to have links designed to influence search visibility (something that would seem to violate Google’s guidelines), isn’t considered a major problem.

    Something seems wrong with that picture.

    We’ve reached out to Google for comment and have not heard back from them.

    Update: We have received a thoughtful response from Best Of The Web President Greg Hartnett.

    On the criteria for sites to be considered the “best” and gain a listing, Hartnett says, “Our guidelines for listing are pretty straightforward: we list sites that contain quality, unique content in the most relevant category within the directory. If the site does not provide a user with informative content then we don’t list it. We have always been focused on providing the user with quality content from trustworthy sources.”

    “When users (humans or spiders) come to BOTW, they know that they can trust that (for instance) all of the listings in a San Francisco real estate category contain relevant information about San Francisco real estate,” he adds. “A human being has been in there and verified it. We’ve got a dedicated team of fantastic editors that ensure that.”

    On the percentage of submissions that are rejected, Hartnett says, “I don’t work the submission queue, so I don’t really have a handle on the specific numbers. However, as a percentage of total submissions, I believe that we reject fewer sites now than we did in the past. The overall quality of submissions has increased as the years have gone by. Perhaps in general, people are now building better sites. Perhaps it’s a matter of more people knowing that BOTW doesn’t accept low quality sites, and they don’t even bother submitting. Whatever it is, I know that it makes our editors happier.”

    We asked: It seems like Google advises against paid links, but doesn’t Best of the Web charge users to have their links reviewed for possible listing?

    “Google certainly advises against paid links,” Hartnett tells us. “We’re not a pay for placement, or link buying platform. Payment for review in no way influences whether or not a site is listed within the directory. The fee is for the review, and is non-refundable. It’s not for a link. We caught a lot of flack about that policy in the early years of the directory, but we did it for a reason. We retain complete editorial control and integrity with each submission and listing. It’s completely up to our editors to decide if the site gets listed, and if listed, the title, description and category placement.”

    “It should also most definitely not be overlooked that the review model accounts for a minuscule amount of the listings within the directory,” he adds. “We have millions of listings, of which our editors have added approximately 95% for free. They work daily scouring the web adding quality sites to relevant categories to build a more comprehensive resource. An overwhelming majority of the listings in the directory have had zero interaction with BOTW at all, nonetheless paid for a review.”

    “I have no idea why Google does or does not approve of what it is we are doing,” says Hartnett. “I don’t work for or with Google and I don’t have any access to them outside of what Joe Internet does. I’d be surprised if they thought about us at all, but if they did I would like to think that they respect what it is we have been doing for all these years.”

    “We feel we have put together (and continue to build) a fantastic resource for users that are interested in finding resources that they can trust,” he says. “We have always focused on providing the user with quality resources, and figured users appreciated, and will continue to appreciate, that effort. We’ve recently added the ability for editors and site owners to add social information for each listing, as we continue to evolve with the landscape and provide users with additional information about listings as well. It’s really been a fantastic project to have been working on for the last decade or so, and we’re excited to continue on our mission.”

    Do you think Google is sending mixed signals about paid links? Let us know in the comments.

  • Google Penguin Update Punishes WordPress Theme Creators?

    James Farmer at WPMU.org wrote a very interesting Penguin-related article, which doesn’t make the update look too great, despite its apparently honorable intentions.

    The update hit WPMU.org, sending it from 8,580 visits from Google on one day pre-Penguin to 1,527 a week later. Farmer shares an Analytics graph illustrating the steep drop:

    Penguin drop

    Farmer maintains that WPMU.org engages in no keyword stuffing or link schemes, and has no quality issues (presumably Panda wasn’t an issue).

    According to Farmer, the Sydney Morning Herald spoke with Matt Cutts about the issue (which may or may not appear in an article), and he provided them with three problem links pointing to WPMU.org: a site pirating their software, and two links from one spam blog (splog) using an old version of one of their WordPress themes with a link in the footer. According to Farmer, Cutts “said that we should consider the fact that we were possibly damaged by the removal of credit from links such as these.”

    That raises a significant question: why were pirate sites and splogs getting so much credence to begin with? And why did they make such an impact on a site with a reasonably sized, loyal audience, one that appears to be a legitimate, quality site with many social followers?

    Farmer wonders the same thing. He writes, “We’re a massively established news source that’s been running since March 2008, picking up over 10,400+ Facebook likes, 15,600+ Twitter followers and – to cap it all 2,537 +1s and 4,276 FeedBurner subscribers – as measured by Google!”

    “How could a bunch of incredibly low quality, spammy, rubbish (I mean a .info site… please!) footer links have made that much of a difference to a site of our size, content and reputation, unless Google has been absolutely, utterly inept for the last 4 years (and I doubt that that’s the case),” he adds.

    Farmer concludes that the site was punished for distributing WordPress themes; that is, for creating themes that people wanted to use, only to be penalized because spammers also used them and linked back to the site. He suggests that others who may run into this issue either remove any attribution link they put in anything they release, or add nofollow to it.
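
    As a rough sketch of what Farmer is suggesting, a theme’s footer credit might look something like this (hypothetical markup, not WPMU.org’s actual theme code):

      <!-- Footer attribution link with rel="nofollow" added, so sites running the theme don't pass PageRank back to the theme's author -->
      <p class="theme-credit">Theme by <a href="http://example-theme-author.com/" rel="nofollow">Example Theme Author</a></p>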

    Hat tip to SEOmoz CEO Rand Fishkin for tweeting the article. Fishkin, by the way, has acknowledged that Penguin hasn’t been Google’s greatest work. He recently told WebProNews, “It’s done a nice job of waking up a lot of folks who never thought Google would take this type of aggressive, anti-manipulative action, but I think the execution’s actually somewhat less high quality than what Google usually rolls out (lots of search results that look very strange or clearly got worse, and plenty of sites that probably shouldn’t have been hit).”

    The whole thing speaks volumes about what many have been saying about Penguin’s effects on negative SEO practices – the kind that Fishkin has challenged the web with. For Fishkin, however, everything seems to be going well so far.

    Google is usually quick to admit that “no algorithm is perfect,” and I’m guessing they know as much about Penguin. It will be interesting to see whether sites that shouldn’t have been hit recover in a reasonably timely fashion, although at this point, it’s hardly timely anymore.