WebProNews

Tag: SEO

  • Paid Links Scandal Gets Marketing Firm iAcquire De-Indexed From Google

    Earlier this week, we posted an article about an investigation from blogger Josh Davis, which exposed marketing firm iAcquire for allegedly purchasing backlinks for clients. Davis’ report caught the attention of Google’s head of webspam, Matt Cutts:

     

    @JoshD nice write-up. Most people don’t go the extra mile to call up and try to get comments from the other side.

     

    Now, it turns out that iAcquire has been de-indexed from Google (along with other parties involved in the scandal, according to Search Engine Land’s Barry Schwartz). Schwartz even got iAcquire’s Director of Inbound Marketing to mention this on Twitter:

     

    @rustybrick sure was. There’s no network for them to kill so that’s them throwing their hissy fit.

    On the topic of whether or not it’s better for companies to run their own SEO in-house, Davis told WebProNews, “It is hard to say. From small businesses all the way up to large corporations there are so many hours in the day. SEO seems to be one area where considerable oversight is needed as various black and grey hat techniques still seem to be part of some SEO companies’ toolboxes.”

    As far as trusting a firm not to engage in paid linking on your behalf, Davis says, “Backlink monitoring is certainly a key. There are a number of enterprise grade resources that provide daily updates on links. In my case I was just using a crude, free backlink service, but this space is filled with vendors who offer high quality monitoring. Having a third party do an audit of links might also be needed when a large company’s reputation is at stake.”
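
    To make the monitoring idea concrete, here is a minimal, hypothetical sketch (not any particular vendor’s tool, and the file names are placeholders): it assumes you periodically export your backlink profile as a flat list of referring URLs, and it simply diffs two snapshots to surface new links worth a manual review.

    ```python
    # Hypothetical sketch: diff two backlink snapshots (one URL per line)
    # to flag newly discovered links for manual review. File names are
    # placeholders; any link tool that exports a flat list of URLs will do.

    def load_snapshot(path):
        """Return the set of referring URLs stored in a snapshot file."""
        with open(path, encoding="utf-8") as f:
            return {line.strip() for line in f if line.strip()}

    def new_links(old_path, new_path):
        """URLs present in the new snapshot but not in the old one."""
        return sorted(load_snapshot(new_path) - load_snapshot(old_path))

    if __name__ == "__main__":
        for url in new_links("backlinks_last_week.txt", "backlinks_today.txt"):
            print("New link to review:", url)
    ```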

  • Google Penguin Update: Webmasters Wondering If Another One Came Out

    Update: Google says there has not been another one.

    As usual, webmasters are speculating about the latest big name Google algorithm change as some have experienced sudden traffic issues. This has happened quite frequently since last year’s Panda update was launched. Sometimes it was Panda, and other times it wasn’t.

    Now, there is talk in the WebmasterWorld forum that there may have been another Penguin update. We’ve reached out to Google for confirmation one way or another, and will update accordingly.

    In one forum thread, member Tedster writes, “A number of members who were hit by Penguin are now reporting some movement in various threads. Can anyone else see evidence of a Penguin refresh?”

    Some are seeing lower traffic, but assuming it’s because of the coming holiday weekend. However, user Anteck writes, “Getting loads of zombie traffic, suddenly the majority of visitors arnt converting. Googles up to something. All Australian sites. No holidays here.”

    There is also some speculation that the Knowledge Graph came into play. Member Rasputin writes:

    I would say that our ‘penguin’ sites recovered a few % around the 19th May while non-penguin sites fell a similar amount at the same time (no changes made – they are mostly old, small sites), but the variation after a few days is small enough that it could possibly just be seasonal variation.

    It is also possible that the introduction of Knowledge Graph is distorting the figures for our (travel) sites that fell a small amount around the same time (or the sunny weather in the UK could have got people away from their computers…)

    Google, by the way, appears to consider Knowledge Graph one of its crowning achievements. CEO Larry Page was sure talking it up this past week.

    Either way, it’s put a little less emphasis on Google+ profiles for some search results, though it’s still thrusting them in the spotlight for others (like Mark Zuckerberg’s).

    It’s hard to believe, but here we are close to the end of May already. Before too long, we should be seeing a new giant list of Google algorithm updates for the past month.

    Image: The Batman Season 4 Episode 2 (Warner Bros.)

  • Do You Trust Your SEO Service Not To Do Paid Links?

    Josh Davis at LLsocial.com put together a pretty in-depth report on what appeared to be a Fortune 1000 company purchasing links, violating Google’s Webmaster Guidelines. It turned out, as he shared in an update, that the company purchasing the links was a separate company from the Fortune 1000 company. There was some confusion, as the company had sold certain assets to another company, and formed a company with a very similar name. In light of all that, we’ll just omit the name of the company for this article.

    Still, the whole thing is a pretty interesting story, and we’ve had a conversation with Davis on the subject that is still worth sharing. The whole thing even caught the attention of Matt Cutts, Google’s head of webspam:

     

    @JoshD nice write-up. Most people don’t go the extra mile to call up and try to get comments from the other side.

    The whole thing started when Davis was sent an email from a third party offering to pay him for placing a link to one of the company’s pages. Davis determined that this third party was linked to “a prominent enterprise Search Engine Optimization (SEO) company,” which would lead one to believe that it is part of the efforts of the SEO agency the main company (the beneficiary of the link) had hired.

    I’m not going to rehash the entire thing here. If you want to know more about it, I suggest reading Davis’ report, but the whole thing raises some questions about hiring outside agencies to do your SEO. Google even had to penalize itself when a marketing agency had solicited paid links for Google’s Chrome browser. Google’s Chrome landing page suffered a 60-day penalty. Questions were raised about how this may have impacted Chrome’s market share.

    Google takes this stuff seriously (even if the lines around paid links are blurry at times).

    Davis tells WebProNews it’s difficult to tell if these links help the company at hand. “If you look around the [company’s] site you will see that they have extensive onsite SEO optimization. They also have the advantage of businesses websites placing a [company] badge which links back to [company].”

    “That said, some of their subpages don’t have many inbound links which may be the point of hiring an agency to do offsite SEO,” he adds.

    We asked Davis if he thinks there is a lot more of this paid linking for big companies going on than most people realize.

    “I think it is pretty dangerous to do paid backlinking explicitly for a large company, but I have come across some other smaller companies which seem to be doing it (maybe one other large one, but I am still researching that),” Davis tells us. “At 3000+ words I wasn’t about to try to tackle other companies, but it is possible it is going on.”

    One might wonder whether in-house SEO is the way to go, if it’s SEO companies that are engaging in paid linking on behalf of clients without the clients knowing. There are surely many, many white hat SEO companies out there that would never do this, but how does a business know it won’t be getting something like this?

    “It is hard to say,” says Davis. “From small businesses all the way up to large corporations there are so many hours in the day. SEO seems to be one area where considerable oversight is needed as various black and grey hat techniques still seem to be part of some SEO companies’ toolboxes.”

    As far as trust, Davis says, “Backlink monitoring is certainly a key. There are a number of enterprise grade resources that provide daily updates on links. In my case I was just using a crude, free backlink service, but this space is filled with vendors who offer high quality monitoring. Having a third party do an audit of links might also be needed when a large company’s reputation is at stake.”

    The company in question, according to emails Davis received, is looking into the situation further. Something tells me that Google is too.

     
  • Google: There Was No Penguin Update

    You know that Penguin update we’ve been talking about for the past month (and to some extent even longer)? Well, that doesn’t exist. That is, according to one Googler.

    Danny Sullivan at Search Engine Land shares an amusing story about how one Googler claimed that Google had no such thing. While he doesn’t share any names, he indicates that someone who worked for Google was asked about the Penguin update, was pointed toward some search results on the topic, and responded:

    “I assure you 100% that there has been nothing at Google referred to as ‘Penguin.’…If you notice on those search results you sent me, not a single source is from Google itself…From what I just saw on this whole Penguin thing–it sounds to me like a lot of SEO companies that use shady and unethical practices are upset that their loop holes have been cut out!”

    Google announced the update, which came to be known as Penguin, on April 24, in a blog post titled “Another Step To Reward High-Quality Sites.” This appeared to be the update Google’s Matt Cutts “sort of pre-announced” at SXSW. The update was being called the “webspam update” at first, but then, in a conversation with Sullivan, Google revealed that it was called Penguin.

    Right around that time, Cutts also tweeted this picture:

    Matt Cutts Penguin Tweet

    Of course Googlers like Cutts and Amit Singhal have openly discussed the Penguin update. In fact, both have even indicated that it has been a success.

    At SMX London, Singhal engaged in a keynote interview with Sullivan, and mentioned that nobody at Google understands everything at Google. I guess that is being illustrated pretty well here.

    Lead Image: cheezburger.com

  • Matt Cutts Shares Something You Should Know About Old Links

    Google’s Matt Cutts has put out a new Webmaster Help video discussing something that’s probably on a lot of webmasters’ minds these days: what if you linked to a good piece of content, but at some point, that content turned spammy, and your site is still linking to it?

    In light of all the link warnings Google has been sending out, and the Penguin update, a lot of webmasters are freaking out about their link profiles, and want to eliminate any questionable links that might be sending Google signals that could lead to lower rankings.

    A user submitted the following question to Cutts:

    Site A links to Site B because Site B has content that would be useful to Site A’s end users, and Google indexes the appropriate page. After the page is indexed, Site B’s content changes and becomes spammy. Does Site A incur a penalty in this case?

    “OK, so let’s make it concrete,” says Cutts. “Suppose I link to a great site. I love it, and so I link to it. I think it’s good for my users. Google finds that page. Everybody’s happy. Users are happy. Life is good. Except now, that site that I linked to went away. It didn’t pay its domain registration or whatever, and now becomes maybe an expired domain porn site, and it’s doing some really nasty stuff. Am I going to be penalized for that? In general, no.”

    “It’s not the sort of thing where just having a few stale links that happen to link to spam are going to get you into problems,” he continues. “But if a vast majority of your site just happens to link to a whole bunch of really spammy porn or off-topic stuff, then that can start to affect your site’s reputation. We look at the overall nature of the web, and certain amount of links are always going stale, going 404, pointing to information that can change or that can become spammy.”

    “And so it’s not the case that just because you have one link that happens to go to bad content because the content has changed since you made that link, that you’re going to run into an issue,” he concludes. “At the same time, we are able to suss out in a lot of ways when people are trying to link to abusive or manipulative or deceptive or malicious sites. So in the general case, I wouldn’t worry about it at all. If you are trying to hide a whole bunch of spammy links, then that might be the sort of thing that you need to worry about, but just a particular site that happened to go bad, and you don’t know about every single site, and you don’t re-check every single link on your site, that’s not the sort of thing that I would worry about.”
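
    Cutts’ point that nobody re-checks every single link suggests an occasional automated pass over your own outbound links. The sketch below is illustrative only (the list of links is made up, and it assumes the requests library is available): it fetches each outbound URL and flags ones that no longer resolve, return errors, or now redirect to a different domain, which is roughly the “expired domain” scenario he describes.

    ```python
    # Illustrative sketch: re-check outbound links for pages that have died
    # or now redirect somewhere unexpected. The list of links is a placeholder.
    from urllib.parse import urlparse
    import requests

    OUTBOUND_LINKS = [
        "http://example.com/useful-article",
        "http://example.org/old-resource",
    ]

    def check_link(url, timeout=10):
        try:
            resp = requests.get(url, timeout=timeout, allow_redirects=True)
        except requests.RequestException as exc:
            return f"UNREACHABLE ({exc.__class__.__name__})"
        if resp.status_code >= 400:
            return f"BROKEN (HTTP {resp.status_code})"
        if urlparse(resp.url).netloc != urlparse(url).netloc:
            return f"REDIRECTED to {resp.url} (worth a manual look)"
        return "OK"

    for link in OUTBOUND_LINKS:
        print(link, "->", check_link(link))
    ```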

    Of course, a lot more people are worried about negative SEO practices and inbound links than about the sites they’re linking to themselves.

    More Penguin coverage here.

  • Can Negative Parameters Tell You If You Were Hit By Google’s Penguin Update?

    Barry Schwartz posted this week about a Google query hack, which may or may not enable you to confirm that your site was hit by Google’s Penguin update. Schwartz credits WebmasterWorld member Martin Ice Web with the find.

    The user added a negative parameter for amazon with his keywords, and Google returned his site to pre-Penguin rankings. “For example, he search for [blue widget -amazon] and his rankings pre-Penguin showed up on Google for the query [blue widget],” Schwartz explains.

    It didn’t seem very reliable at first, so I didn’t bother to cover it. Even Schwartz noted that it failed on some of the queries he tried. However, the comments on his article have poured in since then, and it sounds like it has worked for quite a few people, so perhaps it’s worth taking a look at.

    One commenter notes that other negative keywords seem to work as well, such as “-ebay”.
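
    If you want to run the comparison a little more systematically, the hypothetical helper below works on result lists you copy out of the two searches by hand (it does not fetch anything from Google, and all of the URLs shown are made up). It simply reports where your domain ranks for the normal query versus the query with the negative keyword appended.

    ```python
    # Hypothetical helper for the "-amazon" check described above: given the
    # ordered result URLs for [blue widget] and [blue widget -amazon]
    # (copied manually from the two searches), report where your domain
    # ranks in each list. All URLs here are placeholders.
    from urllib.parse import urlparse

    def rank_of(domain, results):
        """1-based position of the first result hosted on `domain`, or None."""
        for position, url in enumerate(results, start=1):
            if urlparse(url).netloc.endswith(domain):
                return position
        return None

    normal_results = [
        "http://www.amazon.com/some-blue-widget",
        "http://www.example.com/blue-widgets",
    ]
    negative_results = [
        "http://www.example.com/blue-widgets",
        "http://www.other-widgets.com/",
    ]

    print("Rank for [blue widget]:        ", rank_of("example.com", normal_results))
    print("Rank for [blue widget -amazon]:", rank_of("example.com", negative_results))
    ```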

    Remember, if you believe that you were unfairly hit by the Penguin update, Google has a form where you can complain. Of course, it’s likely that Google disagrees. After all, they consider the update to be a success.

    View our Penguin coverage here.

  • Knowledge Graph Reduces Google’s Dependence On Keywords

    Earlier this month, we looked at Google’s big list of algorithm changes from April. One of those, referred to as Bi02sw41, indicated that Google may have reduced its dependence on keywords.

    Today, Google announced the Knowledge Graph, which Google says makes it smarter at determining what people mean when they’re searching for things. More on the Knowledge Graph here. It’s also coming to mobile.

    Google is indicating that this is a step away from keywords. In the official announcement, SVP of Engineering Amit Singhal says:

    Take a query like [taj mahal]. For more than four decades, search has essentially been about matching keywords to queries. To a search engine the words [taj mahal] have been just that—two words.

    But we all know that [taj mahal] has a much richer meaning. You might think of one of the world’s most beautiful monuments, or a Grammy Award-winning musician, or possibly even a casino in Atlantic City, NJ. Or, depending on when you last ate, the nearest Indian restaurant. It’s why we’ve been working on an intelligent model—in geek-speak, a “graph”—that understands real-world entities and their relationships to one another: things, not strings.

    Google’s head of webspam, Matt Cutts, tweeted about the feature:

    Big search news: http://t.co/ZMiB88BV Moving from keywords toward knowledge of real-world entities and their relationships.

    On Google+, Cutts said, “Google just announced its Knowledge Graph. It’s another step away from raw keywords (without knowing what those words really mean) toward understanding things in the real-world and how they relate to each other. The knowledge graph improves our ability to understand the intent of a query so we can give better answers and search results.”

    Keywords have, of course, been a major avenue for spam, which Google is working hard to eliminate (see the Penguin update). The less Google has to rely on keywords to deliver relevant results, the less susceptible to spam it should be.

    I don’t think the Knowledge Graph has done anything to diminish the value of using relevant keywords in your content, and it doesn’t seem to affect the regular, organic web results, but who knows if this will change somewhere down the line.

    It is interesting to see Google continue to clutter up its search results pages, given that its clean design was one of the big differentiators of the search engine in its early days.

  • Rand Fishkin’s Negative SEO Challenge: 40K Questionable Links And Ranking Well

    Last month, we reported that SEOmoz CEO Rand Fishkin issued a negative SEO challenge. He challenged people to take down SEOmoz or RandFishkin.com using negative SEO tactics.

    “I’ve never seen it work on a truly clean, established site,” Fishkin told us at the time. He is confident enough in his sites’ link profiles and reputation. He also said, “I’d rather they target me/us than someone else. We can take the hit and we can help publicize/reach the right folks if something does go wrong. Other targets probably wouldn’t be so lucky.”

    We had a conversation with Fishkin today about the Penguin update, and about a new SEOmoz project related to webspam. We also asked for an update on how the challenge is going, and he said, “On the negative SEO front – I did notice that my personal blog had ~40,000 more links (from some very questionable new sources) as of last week. It’s still ranking well, though!”

    It sounds like the challenge is working out so far, which certainly looks good on Google’s part, especially in light of the Penguin update, and the opinions flying around about negative SEO. Just peruse any comment thread or discussion forum on the topic and there’s a good chance you’ll run into some of this discussion.

    I’m guessing the challenge is still on the table, but so far, Fishkin doesn’t seem to be having any problems.

    Of course, most people don’t have the link profile or reputation that Fishkin has established, but that also speaks to the need for content producers to work on building both.

  • Google Penguin Update: SEO And Marketing Services Feel The Effects

    There’s been a great deal of talk about the Google Penguin update since it launched last month, and a lot of webmasters are still trying to sift through the rubble and determine if their sites were even impacted by Penguin or some other Google algorithm change. In addition to Penguin, there were two Panda refreshes last month, and over 50 other changes, which Google finally listed on Friday.

    SEOmoz CEO Rand Fishkin tells WebProNews, “It’s done a nice job of waking up a lot of folks who never thought Google would take this type of aggressive, anti-manipulative action, but I think the execution’s actually somewhat less high quality than what Google usually rolls out (lots of search results that look very strange or clearly got worse, and plenty of sites that probably shouldn’t have been hit).”

    SEOmoz, by the way, has launched an interesting project aimed at tackling webspam on its own.

    Fishkin actually posted a new video discussing the Penguin update today, which is worth the watch, particularly if you’ve been affected. There are six main points he discusses, but one in particular that I found interesting is that there are a lot of sites in the marketing industry that appear to have been hit.

    Fishkin says, “There appears to be a very disproportionate level of sites in the marketing/services field affected by this. What I mean is, we have seen more people write in about keywords like, ‘seo services,’ ‘seo company, you know, some particular city name’, or ‘web design services, some particular city name’. Those types of results seem to be hit heavily.”

    “Now, I’m gonna throw out two things I think may be to blame here,” he continues. “One is: a lot of people who operate in these marketing services fields are also likely to have a lot of correlation with the people who are potentially getting the kinds of link spam to their web pages that Google hit in this update. So, it’s not necessarily [that] Google focused on these. It could be the types of spam they focused on and the types of links that these people had just happened to be correlated and connected. The other thing is, this could merely be a leading indicator…we’re obviously in the marketing and SEO field, and so it could be that we’re just getting a disproportionate number of those types of folks talking about it in Q&A, emailing, tweeting at us…all those kinds of things.”

    “That’s also possible, though usually we see more balance across the board, typically,” he notes.

    Beyond the obviously spam-heavy topics, like making money online and pharmaceuticals, we’d be interested to hear more about what kinds of sites have been impacted most by Penguin. Do you believe you were hit by Penguin? What industry is your site part of?

  • Google Penguin Update Punishes WordPress Theme Creators?

    James Farmer at WPMU.org wrote a very interesting Penguin-related article, which doesn’t make the update look too great, despite its apparently honorable intentions.

    The update hit WPMU.org, sending it from 8,580 visits from Google on one day pre-Penguin to 1,527 a week later. Farmer shares an Analytics graph illustrating the steep drop:

    Penguin drop

    Farmer maintains that WPMU.org engages in no keyword stuffing or link schemes, and has no quality issues (presumably Panda wasn’t an issue).

    According to Farmer, the Sydney Morning Herald spoke with Matt Cutts about the issue (which may or may not appear in an article), and he provided them with three problem links pointing to WPMU.org: a site pirating their software, and two links from one spam blog (splog) using an old version of one of their WordPress themes with a link in the footer. According to Farmer, Cutts “said that we should consider the fact that we were possibly damaged by the removal of credit from links such as these.”

    That raises a significant question: why were pirate sites and splogs getting so much credence to begin with? And why did losing them make such an impact on this site, which, with a reasonably sized, loyal audience and plenty of social followers, appears to be a legitimate, quality site?

    Farmer wonders the same thing. He writes, “We’re a massively established news source that’s been running since March 2008, picking up over 10,400+ Facebook likes, 15,600+ Twitter followers and – to cap it all 2,537 +1s and 4,276 FeedBurner subscribers – as measured by Google!”

    “How could a bunch of incredibly low quality, spammy, rubbish (I mean a .info site… please!) footer links have made that much of a difference to a site of our size, content and reputation, unless Google has been absolutely, utterly inept for the last 4 years (and I doubt that that’s the case),” he adds.

    Farmer concludes that the site was punished for distributing WordPress themes. That is, specifically, for creating themes that people wanted to use, and being penalized because spammers also used them and linked to the site. He suggests to others who may have this issue that they remove or add nofollow to any attribution link they put in anything they release.
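
    For theme and widget authors, that suggestion is concrete enough to illustrate. This is a minimal sketch, assuming a made-up footer credit and the BeautifulSoup library; in a real theme you would simply edit the template (for example, the footer file) rather than rewriting markup in Python, but it shows the rel="nofollow" attribute the advice is about.

    ```python
    # Minimal sketch: add rel="nofollow" to a theme's footer credit link.
    # The footer HTML and the link target are invented for illustration.
    from bs4 import BeautifulSoup

    footer_html = (
        '<p class="credit">Theme by '
        '<a href="http://example-theme-shop.com">Example Theme Shop</a></p>'
    )

    soup = BeautifulSoup(footer_html, "html.parser")
    for link in soup.find_all("a", href=True):
        rel = set(link.get("rel") or [])
        rel.add("nofollow")          # keep any existing rel values, add nofollow
        link["rel"] = sorted(rel)

    print(soup)  # the credit link now carries rel="nofollow"
    ```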

    Hat tip to SEOmoz CEO Rand Fishkin for tweeting the article. Fishkin, by the way, has acknowledged that Penguin hasn’t been Google’s greatest work. He recently told WebProNews, “It’s done a nice job of waking up a lot of folks who never thought Google would take this type of aggressive, anti-manipulative action, but I think the execution’s actually somewhat less high quality than what Google usually rolls out (lots of search results that look very strange or clearly got worse, and plenty of sites that probably shouldn’t have been hit).”

    The whole thing speaks volumes about what many have been saying about Penguin’s effects on negative SEO practices – the kind that Fishkin has challenged the web with. For Fishkin, however, everything seems to be going well so far.

    Google is usually quick to admit that “no algorithm is perfect,” and I’m guessing they know as much about Penguin. It will be interesting to see if sites that shouldn’t have been hit recover in a reasonably timely fashion, although at this point, it’s hardly timely anymore.

  • Some Free Directories Go Missing From Google, Some Paid Directories Doing Well

    There is some discussion going on in the webmaster/SEO community that Google may have de-indexed some free web directories. Barry Schwartz at Search Engine Roundtable points to a WebmasterWorld forum thread on the subject.

    The thread begins with a post from user Sunnyujjawal, who says:

    While checking some sites links I found 50% free submission directories are out of G now. Will Google count such links in negative SEO or unnatural linking?

    Schwartz concurs that about 50% of the ones he searched for did not have listings.

    He points to one example: global-web-directory.org. Indeed, I’m getting no results for that site:

    global web directory

    I’m not sure about the 50% thing though. I’ve looked at a number of others, and haven’t come across many that were not showing listings (though I have no doubt that there are more out there). Either way, there are still a lot of these sites in Google’s index. We do know, however, that quite a few of them received PageRank reductions with the recent update.

    This discussion happens to come at a time when we’ve been analyzing Google’s quality guidelines, and its treatment of a certain directory, Best Of The Web, which sells reviews for potential listings, which appear with links that pass PageRank.

    Other directories that follow a similar model may be experiencing similar treatment from Google. In that same WebmasterWorld thread, user Rasputin writes:

    I have a paid directory that I haven’t touched for about 3 years, only gets about 25 submissions ($10) a year – strange thing is, I just looked and not only is it well indexed but all the internal pages are now showing page rank – for a very long time they were all ‘greyed out’ after the google clamp-down on directories a couple of years ago.

    No idea when it came back, certainly nothing I’ve changed and pretty unlikely it’s attracted natural links.

    That’s pretty interesting.

    User Netmeg adds:

    I don’t think free or paid makes anywhere near as much of a difference as to whether or not the directory is actually curated for quality. Because if it isn’t, what other reason is there for it to exist other than to create links?

    That’s a very relevant point, and that seems to be Google’s reasoning, based on this video from Matt Cutts from several years ago:

    “Standard directory listings remain in our editors complete editorial control, and as such do not need the nofollow tag,” Best Of The Web President Greg Hartnett told WebProNews. “An editor looked at those listings (pay for review or not) and decided that they meet editorial guidelines and as such merit a listing. We vouch for that listing, so why would we nofollow it?”

    If you go to global-web-directory.org’s submission page, it would appear that they violate Google’s quality guidelines. There is a pricing structure as follows:

    Express Reviews – $2
    Regular Reviews – Free
    Regular Reviews with reciprocal – Free

    While they advertise a paid review process, it’s clearly much different than how Best Of The Web operates. The only payment is for speeding up the review process, from the looks of it. Otherwise it’s free, and they’ll even throw in a reciprocal link for free. That could be the part that Google has a problem with. If sites are really being “reviewed” for quality, perhaps that is one thing, but if you’re saying flat out that you’ll give a link back, that might fall under Google’s “link schemes” criteria, discussed in the quality guidelines.

    Those guidelines do list “links intended to manipulate PageRank” as the first example, and it does look like the site attempts to show the listings’ PageRank right with the listings:

    If you really look around the site, however, you’ll find many category pages without listings, just displaying ads. It’s not hard to see why Google wouldn’t want this site in its index.

    Update: There’s an interesting post about this issue at Search News Central, from Terry Van Horne. Terry writes:

    Directories that would be candidates for this kind of “draconian” action were as good as de-indexed ages ago. We sent out our super staffer Mike, with our vetted list of directories to see what he could find. From that (top end list) we found 65 no change, 2 domains parked and 1 de-indexed site; roughly 1.3% were de-indexed.

    Next we went to our friends at Steam Driven Media for the last 100 (based on TBPR) from a list of 1500. From this group we found 1 with low indexation and 9 deindexed/gone – roughly 10% affected. Keep in mind, we have no idea how long these sites were out of the Google index.

    Van Horne questions whether directories are really “getting nuked or not”.

    So far, we’ve not really seen anything indicating it’s as big a change as the original poster in the WebmasterWorld thread made it out to be.

    Have you seen paid directories rising in Google? Free ones disappearing? Let us know what you’re seeing.

  • Watch Google’s Matt Cutts Give Some “Advice” On Ranking #1 (Humor)

    Google’s Matt Cutts has put out hundreds of videos as part of his webmaster help series. I’ll assure you that nothing like what you’re about to hear has ever appeared in any of them.

    Call it the anti-SEO help video of the decade, and if you’re a webmaster you can call it site suicide. You can laugh at this Matts Cutts parody video all you want, just don’t take any of its advice seriously.

    “In addition to keyword stuffing, we look at links to porn sites. Not that many people tend to link that much to sites within the porn industry. That’s the sort of thing that’s going to be really rewarding for users, so link to porn sites. Could it be annoying? Yes, it could be annoying, but that’s perfectly fine.”

    That’s one of the gems from this clever mashup from SEO guy Sam Applegate. He took (probably way too much) time to organize and analyze Cutts’ many videos and came up with this video on how to rank #1 in Google search. Except, as you may have derived from the last quote, this guide won’t have you ranking anywhere near #1.

    “I do think that Bing or Blekko or Duck Duck Go are potentially doing illegal things like hacking sites,” says Cutts in fragments. Check it out below:

    For his part, Cutts is aware of the video and his concern was with how much time it had to have taken its creator:

    @seosammo wow, how much time did that take?

    [h/t Search Engine Roundtable]

  • Want To Tell Google How To Improve? Tell Amit Singhal.

    Matt Cutts fields a whole lot of questions about Google. He often offers helpful advice via his blog, comments on other blogs, Twitter, and of course through his Webmaster Help videos, but Google Fellow Amit Singhal is the guy that leads the team that looks at all the messed up search results.

    Singhal spoke at SMX London this morning, in an on-stage interview with Danny Sullivan and Chris Sherman. While he didn’t delve into Penguin too much, other than to indicate that it has been a success, he did talk a little bit about dealing with flawed search results. Daniel Waisberg liveblogged the discussion at SMX’s sister site Search Engine Land. Here’s the relevant snippet:

    Chris asks Amit how is the evolution process at Google with so many updates; how does Google decide about which update goes live? Google has an internal system where every flawed search result is sent to Amit’s team. Based on that engineers are assigned to problems and solutions are tested on a sandbox. Then the engineer will show how the results will show after and before the update and the update is tested using an A/B test. They discuss the results and this loop runs several times until they find a change that is better in all aspects. After this process the change is send to a production environment for a very low percentage of real user traffic and see how the CTR is changed. Based on this, an independent analyst (that works for Google) will generate a report. Based on that report the group discuss and decides if the change is going to be launched or not. That’s how scientific the process is.

    As Waisberg notes, Google has recently shared several videos discussing how Google makes changes. You can watch these if you’re interested:

    This one has Cutts talking about Google’s experimentation process (among other things):

    According to Sullivan, who tweeted after the keynote discussion, Singhal wants user feedback:

    Think you know how Google Search should run better? @theamitsinghal asked for advice. Leave your comments here http://t.co/fJFbe1QI

    On Twitter, he’s @theamitsinghal. Here’s his Google+ profile.

    Don’t forget, Google has a feedback link at the bottom of every search results page. Of course, there are always spam reports as well.

    Image: Amit’s Google+ Profile Pic

  • Best Of The Web: The End Goal Of A Submission Is Not A Link

    We recently ran an article called “The Blurry Lines Of Google’s Paid Links Policy“. Much of the article talked about web directory Best Of The Web, which has a submission process requiring submitters to pay a fee to have their sites reviewed. If BOTW deems the site worthy, it will list a link in the appropriate category.

    Since initially running the article, we’ve had an ongoing dialogue with Best Of The Web President Greg Hartnett, the initial part of which was included in an updated version of the article. Since then, he’s responded to a few more questions.

    BOTW considers any payment to be for the review process, rather than the link itself. However, when you are the webmaster paying for this review, isn’t the end product you are expecting to receive a listing and a link? The review consists of BOTW choosing yes or no on whether to include the submission. Can you imagine anybody paying $150 for a submission to BOTW thinking they are paying for a review rather than inclusion in their directory?

    What BOTW Says

    “The end goal of the submissions is not a link – it’s a review,” Hartnett tells WebProNews. “Obviously we are not privy to the inner thoughts and motivations of the site owners who pay for review.”

    “While one may argue that those owners are in it for a link, maybe their motivation is simply a listing in a relevant category,” he adds. “Maybe it’s additional exposure for their site.”

    Google, according to its guidelines, frowns upon paid links that pass PageRank. As illustrated in the first article, the regular directory listings we looked at were made up of links that did not include the nofollow link attribute, which would keep them from passing PageRank.

    We asked Hartnett if the links on BOTW are passing PageRank.

    “Not all of them, no,” he tells us. “Our spots that are influenced by the site owner, our Ads or sponsorships, are all nofollowed. We do this because even though those are still reviewed by an editor, the site owner has more control of the anchor text, description (which may be more marketing-centric than directory listings) and category placement. Those listings are not fully controlled by our editors, so they get the nofollow tag.”

    The listings marked as ads include nofollow. This reiterates what we said in the first article.

    “Standard directory listings remain in our editors complete editorial control, and as such do not need the nofollow tag,” Hartnett adds. “An editor looked at those listings (pay for review or not) and decided that they meet editorial guidelines and as such merit a listing. We vouch for that listing, so why would we nofollow it?”

    BOTW’s submission page touts the search visibility benefits of getting listed in the directory. Hartnett points out that such language is only used on one page out of over 110,000. However, that one page is the submission page where you would expect to see the benefits of paying for submission to BOTW. Clearly, the reason one would submit a site to BOTW is as BOTW’s submission page says, to “increase your website’s visibility in major search engines”.

    What Google Says

    We asked Google about the issue generically (not mentioning BOTW by name): Just to be clear, if a web directory charges people for links, and advertises these listings as a way to help the submitter’s site gain visibility in major search engines, that would be a violation of Google’s quality guidelines, correct?

    Well, the advertising part may not be a violation, but on the topic of paid links that pass PageRank, the company simply referred us to its guidelines. So, here’s the exact text of Google’s Paid Links page linked to from its quality guidelines:

    Google and most other search engines use links to determine reputation. A site’s ranking in Google search results is partly based on analysis of those sites that link to it. Link-based analysis is an extremely useful way of measuring a site’s value, and has greatly improved the quality of web search. Both the quantity and, more importantly, the quality of links count towards this rating.

    However, some SEOs and webmasters engage in the practice of buying and selling links that pass PageRank, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. Buying or selling links that pass PageRank is in violation of Google’s Webmaster Guidelines and can negatively impact a site’s ranking in search results.

    Not all paid links violate our guidelines. Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results. Links purchased for advertising should be designated as such. This can be done in several ways, such as:

    • Adding a rel=”nofollow” attribute to the <a> tag
    • Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file

    Google works hard to ensure that it fully discounts links intended to manipulate search engine results, such as excessive link exchanges and purchased links that pass PageRank. If you see a site that is buying or selling links that pass PageRank, let us know. We’ll use your information to improve our algorithmic detection of such links.
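
    Both designation methods in that excerpt can be checked mechanically. The sketch below is an illustration only, not a Google-sanctioned audit: all URLs and the sample anchor are placeholders, and it assumes the BeautifulSoup library is available. It tests whether an anchor carries rel="nofollow", and whether an intermediate redirect URL is disallowed by the destination site’s robots.txt, using Python’s standard robotparser.

    ```python
    # Illustrative check of the two "designate a paid link" methods quoted
    # above: rel="nofollow" on the anchor, or a redirect URL blocked by
    # robots.txt. All URLs and the sample anchor are placeholders.
    from urllib import robotparser
    from bs4 import BeautifulSoup

    def has_nofollow(anchor_html):
        link = BeautifulSoup(anchor_html, "html.parser").find("a")
        return link is not None and "nofollow" in (link.get("rel") or [])

    def blocked_by_robots(redirect_url, robots_url):
        parser = robotparser.RobotFileParser()
        parser.set_url(robots_url)
        parser.read()  # fetches and parses the robots.txt file
        return not parser.can_fetch("Googlebot", redirect_url)

    print(has_nofollow('<a href="http://example.com/ad" rel="nofollow">ad</a>'))
    print(blocked_by_robots("http://example.com/out/partner-link",
                            "http://example.com/robots.txt"))
    ```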

    What Do We Make Of It?

    Could the second paragraph be exactly where the lines are blurred? In one sentence, it says, “However, some SEOs and webmasters engage in the practice of buying and selling links that pass PageRank, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites.” Note the phrase “disregarding the quality of the links, the sources, and the long-term impact.”

    The next sentence is: “Buying or selling links that pass PageRank is in violation of Google’s Webmaster Guidelines and can negatively impact a site’s ranking in search results.”

    That second part seems pretty clear, but the “disregarding” clause in the first sentence could be where BOTW is able to take advantage. The thinking seems to be that they exercise some editorial judgment, so it’s OK.

    So, by that logic, would it be OK for a publication like BusinessWeek to post an article called “Top 10 SEO Services On The Web,” but charge different SEO services for the ability to be considered for the list? And then at that publication’s own discretion, possibly include some of those paying services on the list, with nice, PageRank-passing links?

    For some reason, it doesn’t seem like that would be cool with Google.

    BOTW is a nicely organized directory, and is not your typical list of spammy links. However, the question is not how great a directory BOTW is. It’s whether Google interprets the selling of a “review” that leads to inclusion in the directory as equivalent to selling a link that passes PageRank. If not, Google ought to be more clear about this in its webmaster guidelines.

  • Penguin Update Will Come Back (Like Panda), According To Report

    Danny Sullivan put out a new article with some fresh quotes from Matt Cutts. From this, we know that he has deemed the Penguin update a success. In terms of false positives, he says it hasn’t had the same impact as the Panda or Florida updates, though Google has seen “a few cases where we might want to investigate more.”

    Sullivan confirmed what many of us had assumed was the case: Penguin will continue into the future, much like the Panda update. Cutts is even quoted in the article: “It is possible to clean things up…the bottom line is, try to resolve what you can.”

    The Good News

    Depending on your outlook, this could either be taken as good or bad news. On the good side of things, it means you can come back. Even if your site was destroyed by Penguin, you still have a shot at getting back in Google’s good graces – even without having to submit a reconsideration request. Google’s algorithm, assuming that it does what it is supposed to, will detect that you are no longer in violation of Google’s guidelines, and treat your site accordingly.

    The Bad News

    The bad news is that there is always the chance it won’t work like it’s supposed to. As I’m sure you’re aware, there are many, many complaints about the Penguin update already. Here’s an interesting one. Many feel like it’s not exactly done what it is supposed to. Another perhaps not so positive element of the news is that sites will have to remain on their toes, wondering if something they’ve done will trigger future iterations of the Penguin update.

    Remember when Demand Media’s eHow was not hit by the Panda update when it first launched, but was then later hit by another iteration of it, and had to delete hundreds of thousands of articles, and undergo a huge change in design, and to some extent, business model?

    But on the other hand, eHow content is the better for it, despite a plethora of angry writers who no longer get to contribute content.

    There’s always the chance that some sites have managed to escape Penguin so far, but just haven’t been hit yet. Of course, Danny makes a great point in that “for any site that ‘lost’ in the rankings, someone gained.”

    It will be interesting to see how often the Penguin update gets a refresh. There were two Panda refreshes in April alone (bookending the Penguin update). It might be even more interesting to see how many complaints there are when the refreshes come back, and how often they’re noticed. Even the last Panda update went unconfirmed for about a week.

    Either way, be prepared for Penguin news to come peppered throughout the years to come. Just like Panda. We’ll certainly continue to cover both.

  • Google Penguin Update Recovery: Matt Cutts Says Watch These 2 Videos

    Danny Sullivan at Search Engine Land put up a great Penguin article with some new quotes from Matt Cutts. We’ve referenced some of the points made in other articles, but one important thing to note from the whole thing is that Cutts pointed to two very specific videos that people should watch if they want to clean up their sites and recover from the Penguin update.

    We often share Google’s Webmaster Help videos, which feature Cutts giving advice based on user-submitted questions (or sometimes his own questions). I’m sure we’ve run these in the past, but according to Sullivan, Cutts pointed to these:

    Guess what: in both videos, he talks about Google’s quality guidelines. That is your recovery manual, as far as Google is concerned. Here are some articles we’ve posted recently specifically on different aspects of the guidelines:

    Google Penguin Update: Don’t Forget About Duplicate Content

    Google Penguin Update: A Lesson In Cloaking

    Google Penguin Update Recovery: Hidden Text And Links

    Recover From Google Penguin Update: Get Better At Links

    Google Penguin Update: 12 Tips Directly From Google

    Google Penguin Update Recovery: Getting Better At Keywords

    Google Penguin Update: Seriously, Avoid Doorway Pages

    Google Penguin Update And Affiliate Programs

    So, in your recovery plan, take all of this into account, and these tips that Cutts lent his seal of approval to.

    And when all else fails, according to Cutts, you might want to just start over with a new site.

  • New Bing Is More Than Just A Pretty Face

    Have you seen the new Bing yet? We reported yesterday that Microsoft pretty much recreated Bing with the launch of its new format. The new format features three columns: traditional search results; Snapshot, which provides the most relevant information to the user; and a social column that shows friends’ recommendations. Underneath this massive change is another new Bing that hopes to change the way we search.

    Bing director Stefan Weitz spoke to Fast Company yesterday about the change. He says that yesterday’s change was in response to their need to “reinvent search.” Weitz says that Google’s method of “indexing” and “ranking” pages is no longer good enough. Their new method is focused on finding the answers in the search results themselves.

    This new method of search is like bringing instant answers to the entire Web. Weitz says that their goal is to “model every object on the planet.” He further clarifies by saying that Bing no longer indexes text, but rather associates “data that exists on the Web in all forms with the physical object that spawned it in the first place.” The hope is that they can use this data to create apps that provide instant answers so that users won’t have to click through to a Web page to find the information they’re looking for.

    This method of search is way more ambitious than the current instant answer solutions on the Internet. Google’s instant answers currently extend to things like weather and sports. The only search engine that could be considered comparable is the open source search engine DuckDuckGo. Its new service, DuckDuckHack, allows users to build their own plugins that provide instant answers in the search results for things like Twitter handles and song lyrics.

    Of course, as Fast Company points out, this change has the potential to shake the SEO industry to its very core. Google is issuing update after update to its algorithm, and webmasters are doing what they can to keep up. What happens when Google, Bing and the rest provide the answers you need right from the search results? People aren’t going to click through to the Web page anymore, so normal search concepts just aren’t going to be as important.

    Bing’s redesign is just the first step towards this new method of search. It will still be a while before they start to offer an instant answer service that will make a lot of the normal tropes of search obsolete. Who knows? Maybe Google or the rumored Facebook search engine could come out first and beat Bing to the punch.

    Do you think abandoning conventional search tactics is good for users and Webmasters? How will Webmasters adapt to the new rules of search? Will the new rules of search even catch on? Let us know in the comments.

  • Post-Google Penguin Update Content Tips Endorsed By Matt Cutts

    Marc Ensign published a good blog post about staying on good terms with Google, in the post-Penguin world. There are plenty of posts out there on this topic. I’ve seen a fair amount of pretty good ones, but this one might be worth paying particular attention to.

    The post, titled “Google Shakeup: Coming To A Website Near You,” has a bullet list of steps to a sound content strategy. There are certainly plenty of good posts on this subject out there too, but Google’s head of webspam Matt Cutts gave something of an endorsement to this list on Twitter in a conversation with Ensign.

    @mattcutts You have a sense of humor, right? Picturing you with black hair and a nose ring seemed like a good idea http://t.co/yBlnCdFQ

    @MarcEnsign over the years I’ve grown a pretty thick skin. 🙂

    @mattcutts C’mon, you know we all love you! We really don’t have a choice! 🙂 Would love to hear your thoughts on my post if you have time.

    @MarcEnsign the bullet points looked solid. I haven’t seen Happy Feet 2, so I can’t vouch for that part. 😉

    So the bullet points from Ensign’s post that Cutts says “looked solid” include:

    • Create a blog and consistently build up your site into a wealth of valuable content.
    • Work with a PR firm or read a book and start writing legitimate press releases on a regular basis and post them on your site.
    • Visit blogs within your industry and leave valuable feedback in their comments section.
    • Link out to other valuable resources within your industry that would benefit your visitors.
    • Share everything you are creating on 2 or 3 of your favorite social media sites of choice.
    • Position yourself as an expert.

    I should make a point about that second-to-last one. Sharing EVERYTHING you are creating on 2 or 3 social networks. In another article, we looked at a Webmaster Help video Cutts posted in response to a user submitted question about using your Twitter account like an RSS service for every article you post.

    While Cutts indicated that doing that isn’t going to be a problem as far as Google’s quality guidelines, he said it can be annoying if you do it with every post, and you post a whole lot of content. I made the case for why it depends on how the user is using Twitter.

    Just seemed worth pointing out.

    Note: I know I’ve written a whole lot about Matt Cutts lately. I’m not stalking him. I promise. It’s just that webmasters want to rank in Google, and he’s obviously the go-to guy for advice, so it seems appropriate that people know about what he’s saying on these topics. Hence, our extensive Matt Cutts coverage. By the way, perusing that coverage is advised. On our Matt Cutts page, you’ll find a plethora of great advice right from Cutts.

  • Google’s Matt Cutts Talks Search Result Popularity Vs. Accuracy

    Google’s head of webspam, Matt Cutts, posted a new Webmaster Help video today, discussing accuracy vs. popularity in search results. This video was his response to a user-submitted question:

    Does Google feel a responsibility to the public to return results that are truly based on a page’s quality (assuming quality is determined by the accuracy of info on a page) as opposed to popularity?

    “Popularity is different than accuracy,” says Cutts. “And in fact, PageRank is different than popularity. I did a video that talked about porn a while ago that basically said a lot of people visit porn sites, but very few people link to porn sites. So the Iowa Real Estate Board is more likely to have higher PageRank than a lot of porn sites, just because people link to the official governmental sites, even if they sometimes visit the porn sites a little bit more often.”

    Here’s that video, by the way:

    “So I do think that reputation is different than popularity, and PageRank encodes that reputation pretty well,” Cutts continues. “At the same time, I go to bed at night sleeping relatively well, knowing that I’m trying to change the world. And I think a lot of people at Google feel that way. They’re like trying to find the best way to return the best content. So we feel good about that. And at the same time, we do feel the weight, the responsibility of what we’re doing, because are we coming up with the best signals? Are we finding the best ways to slice and dice data and measure the quality of pages or the quality of sites? And so people brainstorm a lot. And I think that they do feel the weight, the responsibility of being a leading search engine and trying to find the very best quality content.”

    “Even somebody who has done a medical search, the difference between stage four brain cancer versus the query grade four brain cancer, it turns out that very specific medical terminology can determine which kinds of results you get. And if you just happen not to know the right word, then you might not get the best results. And so we try to think about how can we help the user out if they don’t necessarily know the specific vocabulary?”

    Interesting example. We’ve pointed to the example of “level 4 brain cancer” a handful of times in our Panda and pre-Panda coverage of content farms’ effects on search results. The top result for that query, by the way, is better than it once was, though the eHow result (written by a freelance writer claiming specialities in military employment, mental health and gardens – who has also written a fair amount about toilets), which was ranking before, is still number two.

    level 4 brain cancer results

    It’s worth noting that Google’s most recent list of algorithm updates includes some tweaks to surface more authoritative results.

    “So I would say that at least in search quality in the knowledge group, we do feel a lot of responsibility,” says Cutts. “We do feel like we know a lot of people around the world are counting on Google to return good quality search results. And we do the best we can, or at least we try really hard to think of the best ways we can think of to return high-quality search results.”

    “That’s part of what makes it a fun job,” he says. “But it definitely is one where you understand that you are impacting people’s lives. And so you do try to make sure that you act appropriately. And you do try to make sure that you can find the best content and the best quality stuff that you can. But it’s a really fun job, and it’s a really rewarding job for just that same reason.”

    Cutts then gets into some points that the antitrust lawyers will surely enjoy.

    “What makes me feel better is that there are a lot of different search engines that have different philosophies,” he says. “And so if Google isn’t doing a good job, I do think that Bing, or Blekko, or DuckDuckGo, or other search engines in the space will explore and find other ways to return things. And not just other general search engines, but people who want to do travel might go specifically to other websites. So I think that there’s a lot of opportunities on the web.”

    “I think Google has done well because we return relatively good search results. But we understand that if we don’t do a good job at that, our users will complain,” he says. “They’ll go other places. And so we don’t just try to return good search results because it’s good for business. It’s also because we’re Google searchers as well. And we want to return the best search results so that they work for everybody and for us included.”

    Well, users do complain all the time, and certainly some of them talk about using other services, but the monthly search market reports don’t appear to suggest that Google has run too many people off, so they must be doing something right.

  • Google Makes Some Local Search Adjustments

    On Friday, Google put out its monthly list of algorithm changes for the month of April. We’ve taken a closer look at various entries on that list – there were over 50. Here’s our coverage so far:

    Google Algorithm Changes For April: Big List Released
    Google Increases Base Index Size By 15 Percent
    Google Makes More Freshness Tweaks To Algorithm
    Bi02sw41: Did Google Just Make Keywords Matter Less?
    Google Should Now Be Much Better At Handling Misspellings
    Google Tweaks Algorithm To Surface More Authoritative Results
    Google Launches Several Improvements To Sitelinks

    The list, along with the Penguin update and two Panda refreshes in April, is a lot for webmasters to take in. If local search is an area of focus for you, you should find the following entries on the list among the most interesting:

    • More local sites from organizations. [project codename “ImpOrgMap2”] This change makes it more likely you’ll find an organization website from your country (e.g. mexico.cnn.com for Mexico rather than cnn.com).
    • Improvements to local navigational searches. [launch codename “onebar-l”] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location.
    • More comprehensive predictions for local queries. [project codename “Autocomplete”] This change improves the comprehensiveness of autocomplete predictions by expanding coverage for long-tail U.S. local search queries such as addresses or small businesses.
    • Improvements to triggering of public data search feature. [launch codename “Plunge_Local”, project codename “DIVE”] This launch improves triggering for the public data search feature, broadening the range of queries that will return helpful population and unemployment data.

    The first one on the above list is interesting. Subdomains for various locales may be a better idea than ever now. However, the implementation and delivery of content will no doubt be incredibly important. Here’s a bit about duplicate content and internationalizing.

    We actually referenced the second one on the list in a different article about how Google treats keywords. It appears that key phrases may carry less weight, at least for some searches. The local examples Google gives here indicate that this is particularly the case when you’re talking local.

    With regards to the third item, it will be interesting to see just how local predictions behave. It’s certainly something local businesses will want to pay attention to and analyze as it pertains to them.

    I’m not sure the fourth one will have many implications for most businesses, but it’s interesting from the user perspective, as Google looks to provide more data directly in search results.

    For some more insight into local search, check out this study from a couple months back, which attempted to identify local ranking factors.

  • Google Launches Several Improvements To Sitelinks

    We’re still digging into Google’s big list of algorithm changes released on Friday. You can read about some of the noteworthy changes in the following articles:

    Google Algorithm Changes For April: Big List Released
    Google Increases Base Index Size By 15 Percent
    Google Makes More Freshness Tweaks To Algorithm
    Bi02sw41: Did Google Just Make Keywords Matter Less?
    Google Should Now Be Much Better At Handling Misspellings
    Google Tweaks Algorithm To Surface More Authoritative Results

    There were over 50 changes announced for April, and 4 of them had to do specifically with sitelinks:

    • “Sub-sitelinks” in expanded sitelinks. [launch codename “thanksgiving”] This improvement digs deeper into megasitelinks by showing sub-sitelinks instead of the normal snippet.
    • Better ranking of expanded sitelinks. [project codename “Megasitelinks”] This change improves the ranking of megasitelinks by providing a minimum score for the sitelink based on a score for the same URL used in general ranking.
    • Sitelinks data refresh. [launch codename “Saralee-76”] Sitelinks (the links that appear beneath some search results and link deeper into the site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We’ve recently updated the data through our offline process. These updates happen frequently (on the order of weeks).
    • Less snippet duplication in expanded sitelinks. [project codename “Megasitelinks”] We’ve adopted a new technique to reduce duplication in the snippets of expanded sitelinks.

    That “dig deeper” link, by the way, links to Inception on Know Your Meme. You might find the other link from the list a bit more useful though. It goes to a blog post from Google’s Inside Search blog from last summer, talking about the evolution of sitelinks, when they launched full-size links (with a URL and one line of snippet text) and an increase to the maximum number of sitelinks per query (from 8 to 12).

    Mega Sitelinks

    At that time, they also combined sitelink ranking with regular result ranking to “yield a higher-quality list of links” for sitelinks. Presumably, it is that aspect which Google considers to be “megasitelinks,” as that is the codename of the change in the new list that talks about better ranking of expanded sitelinks. The change, as noted, provides a minimum score for the sitelink based on a score for the same URL used in general ranking.

    One of the changes was a data refresh, so the sitelinks gathered should be based on fresher information.