WebProNews

Tag: SEO

  • Who Likes Wikipedia More – Bing Or Google?

    Wikipedia is the juggernaut of the online information world. The site that popularized the idea of letting internet users update information on any topic has grown to the point where only a handful of sites can be considered larger in scope. One of those sites, google.com, clearly likes Wikipedia when it displays results to searchers. But how much does it like Wikipedia? And does it like Wikipedia more than its rival, Bing?

    “Wikipedia is the best thing ever. Anyone in the world can write anything they want about any subject. So you know you are getting the best possible information.” – Michael Scott (The Office)

    A few months ago, Intelligent Positioning released a study stating that Wikipedia appeared on page one of Google UK’s SERPs for 99% of noun-related searches. The percentage was based on 1,000 unique searches, generated with a random noun generator, with the search settings at the standard ten results per page. They also found that of that 99%, 56% of the words returned a Wikipedia link in the first position.

    This study raised a red flag in the SEO community, as “99%” is a staggering claim to make. Many experts questioned Intelligent Positioning’s methods, which led Conductor to release its own study using stricter parameters. Here are a few key takeaways from that study.

    – Wikipedia appears on the first page of Google SERPs 60% of the time for informational queries, 34% for transactional, for a combined 46%.

    – The longer the keyword string, the less likely Wikipedia is to appear on the first page, as demonstrated by their graph below.

    Google Wikipedia graph

    – For single word queries, 80% of the time Wikipedia will appear on the first page of Google’s SERPs.
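
    Neither firm published code, but the core measurement both studies describe is simple to express. Below is a minimal sketch, assuming a hypothetical fetch_top_results() helper that returns the top ten result URLs for a query in rank order (how you obtain those, via an API or otherwise, is left abstract here and is your responsibility).

    ```python
    from urllib.parse import urlparse

    def fetch_top_results(query):
        """Hypothetical helper: return the top ten result URLs for `query`,
        in rank order. The studies pulled these from Google/Bing SERPs;
        plug in whatever source you have."""
        raise NotImplementedError

    def wikipedia_stats(queries):
        """How often wikipedia.org shows up on page one, and how often it
        holds the #1 spot, across a list of queries."""
        page_one = top_spot = 0
        for query in queries:
            hosts = [urlparse(u).netloc.lower() for u in fetch_top_results(query)[:10]]
            hits = [i for i, h in enumerate(hosts) if h.endswith("wikipedia.org")]
            if hits:
                page_one += 1
                if hits[0] == 0:
                    top_spot += 1
        n = len(queries)
        return page_one / n, top_spot / n

    # e.g. nouns pulled from a random noun generator, as in the original study:
    # page_one_rate, top_spot_rate = wikipedia_stats(["aardvark", "bridge", "carousel"])
    ```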

    Those studies were published in March. Fast forward to May, and Conductor has released a follow-up study comparing how Google and Bing treat Wikipedia in terms of SERP presence. Below, you can see their graph indicating Google features Wikipedia on its first page 15% more often than Bing.

    Google Bing Wikipedia

    Reading further into the study, the results become even more interesting. While Google features Wikipedia on its first page more often, when Wikipedia does make the first page, Bing puts it in the top overall spot 18% more often.

    Matt McGee of Search Engine Land made some interesting points and raised valid criticisms of the studies surrounding Wikipedia and Google/Bing. First, he notes the irony that Matt Cutts has stated at different conferences that Wikipedia is featured more on Bing than on Google. McGee also references yet another study, from Search Engine Watch, stating that Bing actually favors Wikipedia more. That study was done around the same time as the Intelligent Positioning study and the first Conductor study.

    So, across the four sources, Intelligent Positioning, Conductor, Search Engine Watch, and Matt Cutts, two say Google favors Wikipedia more than Bing, and two say the opposite.

    One point McGee makes in his article, which I’m going to unequivocally support and harp on, is that none of these studies can truly say who features Wikipedia more. Search engines are continually changing their results: hourly, daily, monthly. So how can any so-called study make empirical statements about which search engine features Wikipedia more?

    If I search for Jeffrey Jones (the actor who played the principal in Ferris Bueller’s Day Off), I’m going to get a specific set of results based on the current state of a search engine’s algorithm. However, if Jeffrey Jones were to smoke meth, get in his car and crash into Harrison Ford’s front yard, the entire list of results on an engine’s SERPs is going to change within hours, and in some cases, minutes. So, again, how can anyone be so certain who’s favoring Wikipedia more?
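
    That churn is easy to quantify for yourself: snapshot a query’s top ten at two different times and measure how much the list changed. A small sketch, reusing the same hypothetical fetch_top_results() helper as above:

    ```python
    def serp_overlap(results_then, results_now):
        """Share of URLs two top-ten snapshots have in common (0.0 to 1.0),
        ignoring order. Low overlap means page one has churned."""
        then, now = set(results_then), set(results_now)
        if not then and not now:
            return 1.0
        return len(then & now) / len(then | now)

    # before = fetch_top_results("jeffrey jones")
    # ...hours later...
    # after = fetch_top_results("jeffrey jones")
    # print(f"page-one overlap: {serp_overlap(before, after):.0%}")
    ```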

    If you’re an SEO expert, a webmaster, or anyone with a disdain for Wikipedia, it can be irritating to always see the site featured above another for a specific keyword, especially when it’s your website that has to sit below Wikipedia even though your site is completely dedicated to that keyword. However, this shouldn’t lead you to simply believe a study that claims to have the answers you’re seeking, because most likely someone else is claiming something entirely different.

  • Google Tweaks Algorithm To Surface More Authoritative Results

    Who you are matters more in search than ever. This is reflected in search engines’ increased focus on social signals, and especially with authorship markup, which connects the content you produce with your Google profile, and ultimately your Google presence.

    Late on Friday, Google released its monthly list of search algorithm changes, and among them was:

    More authoritative results. We’ve tweaked a signal we use to surface more authoritative content.

    Google has tried to deliver the most authoritative content in search results for as long as I can remember, but clearly it’s been pretty hard to get right all the time. The Panda update, introduced in February 2011, was a huge step in the right direction – that is if you think Panda has done its job well. Perhaps to a lesser extent, the Penguin update is another step, as its aim is to eliminate the spam cluttering up the search results, taking away from the actual authority sites.

    About a year ago, Google released a list of questions that “one could use to assess the quality of a page or an article.” This was as close as we got to a guide on how to approach search in light of the Panda update. There were 23 questions in all. Some of them are directly related to authority.

    Would you trust the information presented in this article?

    Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?

    Does this article have spelling, stylistic, or factual errors?

    Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?

    Does the article provide original content or information, original reporting, original research, or original analysis?

    Does the page provide substantial value when compared to other pages in search results?

    How much quality control is done on content?

    Does the article describe both sides of a story?

    Is the site a recognized authority on its topic?

    For a health related query, would you trust information from this site?

    Would you recognize this site as an authoritative source when mentioned by name?

    Does this article provide a complete or comprehensive description of the topic?

    Does this article contain insightful analysis or interesting information that is beyond obvious?

    Would you expect to see this article in a printed magazine, encyclopedia or book?

    Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?

    Google’s Matt Cutts gave something of an endorsement to a list of tips to consider post-Penguin update, written by Marc Ensign. One of those was “Position yourself as an expert.”

    Of course, we don’t know what exactly Google did to the signal (one of many, I presume) it uses to surface more authoritative content. It’s worth noting that they made a change to it, however, and it will be interesting to see if there’s a noticeable impact in search results.

    It’s one thing for Google to preach about quality content and say that’s what it wants to deliver to users, but we continue to see Google cite specific actions it has taken to make good on that, even if we can’t know exactly what they are (Google is vague when it lists its changes). Panda and Penguin are obviously major steps, but Google seems to be doing a variety of other things that cater to that too.

    I mentioned authorship. That’s a big one, and one you should be taking advantage of if you want to be seen as an authority in Google’s eyes. It really means you should be engaging on Google+ too, because it’s tied directly to it. For some authors, Google will even show how many people have them in Circles right in the search results. It’s hard to dispute your authority if you manage to rack up a substantial follower count.
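
    If you want to check that your authorship markup is actually in place, the pattern at the time was a rel="author" link pointing at your Google+ profile. Here is a small sketch using BeautifulSoup; the profile-URL pattern is an assumption based on how Google+ profile links typically looked, not an official spec.

    ```python
    import re
    from bs4 import BeautifulSoup

    def find_authorship_links(html):
        """Return the href of every rel="author" link in the page, the markup
        Google used at the time to tie content to a Google+ profile."""
        soup = BeautifulSoup(html, "html.parser")
        found = []
        for tag in soup.find_all(["a", "link"], rel=True):
            rels = tag.get("rel") or []
            if isinstance(rels, str):
                rels = rels.split()
            if "author" in [r.lower() for r in rels]:
                found.append(tag.get("href", ""))
        return found

    def looks_like_google_profile(href):
        # Assumed pattern: profile URLs of the form https://plus.google.com/<id>
        return bool(re.match(r"https?://plus\.google\.com/", href))

    # html = open("post.html").read()
    # for href in find_authorship_links(html):
    #     print(href, "->", "Google+ profile" if looks_like_google_profile(href) else "other")
    ```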

  • How Google Handles Font Replacement

    Google’s Matt Cutts put up a new Webmaster Help video, discussing how Google handles font replacement. The video was created in response to a user-submitted question:

    How does Google view font replacement (i.e. Cufon, sIFR, FLIR)? Are some methods better than others, are all good, all bad?

    “So we have mentioned some specific stuff like SIFR that we’re OK with. But again, think about this,” says Cutts. “You want to basically show the same content to users that you do to Googlebot. And so, as much as possible, you want to show the same actual content. So we’ve said that having fonts using methods like SIFR is OK, but ideally, you might concentrate on some of the newer stuff that has been happening in that space.”

    “So if you search for web fonts, I think Google, for example, has a web font directory of over 100 different web fonts,” Cutts says. “So now we’re starting to get the point where, if you use one of these types of commonly available fonts, you don’t even have to do font replacement using the traditional techniques. It’s actual letters that are selectable and copy and pastable in your browser. So it’s not the case that we tend to see a lot of deception and a lot of abuse.”

    “If you were to have a logo here and then underneath the logo have text that’s hidden that says buy cheap Viagra, debt consolidation, mortgages online, that sort of stuff, then that could be viewed as deceptive,” he adds.

    In fact, that’s exactly the kind of thing that can get you in trouble with Google’s Penguin update, even if Google doesn’t get you with a manual penalty. To avoid this, here’s more advice from Google, regarding hidden text.

    “But if the text that’s in the font replacement technique is the same as what is in the logo, then you should be in pretty good shape,” Cutts wraps up the video. “However, I would encourage people to check out some of this newer stuff, because the newer stuff doesn’t actually have to do some of these techniques. Rather, it’s the actual letters, and it’s just using different ways of marking that up, so that the browser, it looks really good. And yet, at the same time, the real text is there. And so search engines are able to index it and process it, just like they would normal text.”
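
    Cutts’ rule of thumb, that the replaced text should match what the logo or image actually says, suggests a simple self-check. The sketch below is my own heuristic rather than anything Google provides: it compares the text inside elements that contain an image with that image’s alt attribute, and you should expect false positives for captions and similar legitimate markup.

    ```python
    from bs4 import BeautifulSoup

    def normalize(text):
        return " ".join(text.lower().split())

    def replacement_mismatches(html):
        """Flag elements that contain an image plus text, where the text does not
        match the image's alt attribute. A big mismatch is the 'hidden text under
        the logo' pattern described above; small mismatches may be harmless."""
        soup = BeautifulSoup(html, "html.parser")
        problems = []
        for tag in soup.find_all(["h1", "h2", "h3", "a", "div", "span"]):
            img = tag.find("img")
            if img is None:
                continue
            alt, text = normalize(img.get("alt", "")), normalize(tag.get_text())
            if alt and text and text != alt:
                problems.append((tag.name, alt, text))
        return problems

    # for name, alt, text in replacement_mismatches(open("index.html").read()):
    #     print(f"<{name}> image alt is '{alt}' but the element's text is '{text}'")
    ```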

  • Bi02sw41: Did Google Just Make Keywords Matter Less?

    Google is often tight-lipped about its ranking signals. It makes sense, as they don’t want you to be able to game the results and get your content to rank when it shouldn’t. That’s why it is still somewhat surprising that Google decided to start putting out these monthly lists of algorithm changes, such as the one for April they released late on Friday.

    While Google does provide us with these lists of changes (surely not ALL of the changes it makes – it makes over 500 a year), Google also tends to send mixed signals, telling users not to focus on SEO trends. Trends tend to start when signals are discovered, so it seems odd for Google to release these lists, but the company has indicated it is an effort to be more transparent, without giving away the secret sauce in its entirety.

    But if you look at a signal like this one, they’re clearly not giving much away, even though they’re telling you changes have been made with regards to this particular signal:

    Improvements to how search terms are scored in ranking. [launch codename “Bi02sw41”] One of the most fundamental signals used in search is whether and how your search terms appear on the pages you’re searching. This change improves the way those terms are scored.

    So, from this, we know that Google has changed how it scores key phrases. They don’t say whether they have a greater or smaller impact on how content ranks, though I’d be inclined to speculate that it’s smaller.
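
    Google doesn’t reveal how it scores terms, so any example here is guesswork about the general technique rather than Google’s signal. For context only: in classic information retrieval, “whether and how your search terms appear on the pages” is scored with term-weighting schemes such as TF-IDF or BM25. A toy TF-IDF scorer, purely for illustration:

    ```python
    import math
    from collections import Counter

    def tokenize(text):
        return [w for w in text.lower().split() if w.isalnum()]

    def tf_idf_score(query, document, corpus):
        """Toy TF-IDF: score how strongly `document` matches the query terms,
        given a corpus used to estimate how rare each term is. This illustrates
        classic term scoring, not Google's actual signal."""
        doc_terms = Counter(tokenize(document))
        doc_len = sum(doc_terms.values()) or 1
        score = 0.0
        for term in tokenize(query):
            tf = doc_terms[term] / doc_len                       # how often the term appears here
            df = sum(1 for d in corpus if term in tokenize(d))   # how many documents contain it
            idf = math.log((len(corpus) + 1) / (df + 1)) + 1     # rarer terms count for more
            score += tf * idf
        return score

    docs = [
        "penguin update targets webspam and keyword stuffing",
        "recipe for roast chicken with lemon",
        "google algorithm change improves term scoring",
    ]
    for d in docs:
        print(round(tf_idf_score("google algorithm", d, docs), 4), "-", d)
    ```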

    Google is always talking about how it is getting better at understanding content, so it seems unlikely that the algorithm would have to rely on search terms more for ranking. As Google says, this is one of the most fundamental signals used in search. It’s always been an obvious signal. It seems like it would be a step backwards if search terms appearing on a page had a greater impact. That would go against the whole “SEO matters less” message Google has been sending lately (particularly with the Penguin update). It doesn’t get any more SEO than keywords.

    In the last paragraph of Google’s announcement of the Penguin update, Matt Cutts wrote, “We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites.” Emphasis added.

    For that matter, keyword stuffing, a classic black hat SEO technique, was one of the focal points of the Penguin update. This would effectively render keywords less significant in that regard. On that note, there’s another change on the new list related to keyword stuffing:

    Keyword stuffing classifier improvement. [project codename “Spam”] We have classifiers designed to detect when a website is keyword stuffing. This change made the keyword stuffing classifier better.

    There’s another entry on Google’s new list of changes, which would also seem to support the theory of a lessened weight on keywords:

    Improvements to local navigational searches. [launch codename “onebar-l”] For searches that include location terms, e.g. [dunston mint seattle] or [Vaso Azzurro Restaurant 94043], we are more likely to rank the local navigational homepages in the top position, even in cases where the navigational page does not mention the location. Emphasis added.

    Google is saying outright that it’s going to return results that don’t have the exact search terms the user used. Plus, the Bi02sw41 entry appears directly after that on the list.

    There’s another entry, which could be related. It’s certainly noteworthy either way:

    Better query interpretation. This launch helps us better interpret the likely intention of your search query as suggested by your last few searches.

    This is the type of thing that could very well cause Google to rely less on exact key phrases.

    I wouldn’t advise that you stop using keywords in your content, and I’ve yet to see any real evidence that Google isn’t relying on my exact queries to return results. Keywords obviously still matter a great deal – just maybe not quite to the extent that they used to.

    Even Google itself, in a recent list of SEO DOs and DON’Ts said: “Include relevant words in your copy: Try to put yourself in the shoes of searchers. What would they query to find you? Your name/business name, location, products, etc., are important. It’s also helpful to use the same terms in your site that your users might type (e.g., you might be a trained “flower designer” but most searchers might type [florist]), and to answer the questions they might have (e.g., store hours, product specs, reviews). It helps to know your customers.”

    So, again, I’m not in any way saying keywords don’t matter. They do. Honestly, I’m not really sure what you’re supposed to do with the information in this article, but if Google is giving any less weight to keywords, it’s worth knowing about.

  • Google Makes More Freshness Tweaks To Algorithm

    Google has clearly placed a lot of focus on freshness in recent months, and that continues with the company’s big list of algorithm changes for the month of April. It will be interesting to see if there is a noticeable improvement in results following these changes.

    Have you noticed freshness-related improvements yet? Let us know in the comments.

    Here are the changes Google listed today for the month of April, related to freshness:

    • Smoother ranking changes for fresh results. [launch codename “sep”, project codename “Freshness”] We want to help you find the freshest results, particularly for searches with important new web content, such as breaking news topics. We try to promote content that appears to be fresh. This change applies a more granular classifier, leading to more nuanced changes in ranking based on freshness.
    • Improvement in a freshness signal. [launch codename “citron”, project codename “Freshness”] This change is a minor improvement to one of the freshness signals which helps to better identify fresh documents.
    • No freshness boost for low-quality content. [launch codename “NoRot”, project codename “Freshness”] We have modified a classifier we use to promote fresh content to exclude fresh content identified as particularly low-quality.
    • UI improvements for breaking news topics. [launch codename “Smoothie”, project codename “Smoothie”] We’ve improved the user interface for news results when you’re searching for a breaking news topic. You’ll often see a large image thumbnail alongside two fresh news results.
    • No freshness boost for low quality sites. [launch codename “NoRot”, project codename “Freshness”] We’ve modified a classifier we use to promote fresh content to exclude sites identified as particularly low-quality.

    Notice that two of those (the ones about no freshness boost for low quality) are pretty much identical. One says “content” and the other says “sites”, but the descriptions are the same. I’m not sure if that is a mistake or if there is a subtle difference.

    Either way, it’s a noteworthy change, and it will be interesting to see if there is a clear impact.

    As I’ve written about recently, I have found freshness to be outweighing relevancy in results sometimes, but I don’t necessarily think it’s been in relation to actual poor quality content – just when an older result makes more sense than a newer result, even if the newer one is high quality too.
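
    Google doesn’t say how the freshness boost is computed, so the following is purely my own illustration, not anything from the change list: a toy model in which freshness acts as a time-decayed boost blended with a relevance score, with a quality gate reflecting the “no freshness boost for low-quality content” change.

    ```python
    import math

    def blended_score(relevance, age_days, quality,
                      freshness_weight=0.3, half_life_days=7.0, quality_floor=0.4):
        """Toy model only: blend a relevance score with a freshness boost that
        decays with age. Pages below the quality floor get no boost at all,
        mirroring the 'no freshness boost for low-quality content' change."""
        decay = math.exp(-math.log(2) * age_days / half_life_days)  # 1.0 when brand new, halves weekly
        boost = decay if quality >= quality_floor else 0.0
        return (1 - freshness_weight) * relevance + freshness_weight * boost

    # An older page that matches the query strongly vs. a fresher, weaker match:
    print(blended_score(relevance=0.9, age_days=250, quality=0.8))  # ~0.63
    print(blended_score(relevance=0.6, age_days=1, quality=0.8))    # ~0.69
    ```

    In this toy model the day-old, weaker match edges out the older, more relevant page, which is exactly the kind of outcome I’m describing.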

    Image: Parents Just Don’t Understand (via Fade Theory)

  • Google Panda Update Strikes Again (Really. Again. As In Since Penguin Update)

    Google has already launched another Panda update. By already, I mean since the Penguin update.

    After the Penguin update was announced, and Searchmetrics put out its lists of winners and losers, Google revealed that there had actually been a Panda update a few days prior, and that this was strongly influencing those lists.

    The update reportedly hit on Friday, April 27. With all the Penguin chaos out there, one has to wonder how much the Panda update has skewed webmaster analysis.

    Barry Schwartz over at Search Engine Land reports that he has confirmed as much with Google, sharing the following statement from the company:

    We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the over 500 changes we make to our ranking algorithms each year.

    According to Schwartz, it was a limited update that didn’t affect very many sites.

    Google’s monthly list of algorithm updates is due out anytime now. Soon, we should see what all Google did in April – at least the stuff they’ll share (which is usually quite a bit). It will be interesting to see if any new Penguin or Panda info comes from that list. We’ll certainly be covering the list no matter what’s on it. I’ll also be interested to see if there are more freshness-related tweaks.

    View all Panda coverage here. Penguin coverage is here.

    Image: @mattcutts

  • Matt Cutts: Excessive Blog Updates To Twitter Not Doorways, But Possibly Annoying

    Google’s head of webspam took on an interesting question from a user in a new Webmaster Help video:

    Some websites use their Twitter account as an RSS like service for every article they post. Is that ok or would it be considered a doorway?

    I know he shoots these videos in advance, but the timing of the video’s release is interesting, considering that it’s asking about doorways. Google’s Penguin Update was unleashed on the web last week, seeking out violators of Google’s quality guidelines, and dealing with them algorithmically. One of Google’s guidelines is:

    Avoid “doorway” pages created just for search engines, or other “cookie cutter” approaches such as affiliate programs with little or no original content.

    There is no shortage of questions from webmasters wondering what exactly Google is going after with the update, which will likely see future iterations, not unlike the Panda update. For more on some things to avoid, browse our Penguin coverage.

    Using your Twitter feed like an RSS feed, however, should not put you in harm’s way.

    “Well, I wouldn’t consider it a doorway because a doorway is typically when you make a whole bunch of different pages, each page is targeting one specific phrase,” he says. “And then when you land there, usually it’s like, ‘click here to enter.’ And then it takes you somewhere and monetizes you, or something along those lines. So I wouldn’t consider it a doorway.”

    Cutts does suggest that such a practice can be annoying to users, however.

    “Could it be annoying?” he continues. “Yes, it could be annoying, especially if you’re writing articles like every three minutes or if those articles are auto-generated somehow. But for example, in FeedBurner, I use a particular service where, when I do a post on my blog, it will automatically tweet to my Twitter stream, and it will say New Blog Post, colon, and whatever the title of the blog post is. And that’s perfectly fine.”

    “That’s a good way to alert your users that something’s going on,” he adds. “So there’s nothing wrong with saying, when you do a blog post, automatically do a tweet. It might be really annoying if you have so many blog posts, that you get so many tweets, that people start to ignore you or unfollow you. But it wouldn’t be considered a doorway.”
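
    The setup Cutts describes, one tweet per new blog post prefixed with “New Blog Post:”, is simple to reproduce yourself. A rough sketch using the feedparser library; the posting step is left as a stub because it depends on whichever Twitter client and credentials you use.

    ```python
    import feedparser

    def new_post_tweets(feed_url, already_tweeted):
        """Turn unseen feed entries into 'New Blog Post: <title> <link>' messages,
        the same pattern Cutts describes using via FeedBurner."""
        messages = []
        for entry in feedparser.parse(feed_url).entries:
            if entry.link in already_tweeted:
                continue
            messages.append(f"New Blog Post: {entry.title} {entry.link}")
            already_tweeted.add(entry.link)
        return messages

    def post_tweet(message):
        # Stub: send `message` with whatever Twitter API client you use.
        print("would tweet:", message)

    # seen = set()
    # for msg in new_post_tweets("https://example.com/feed/", seen):
    #     post_tweet(msg)
    ```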

    OK, so you’re safe from having to worry about that being considered a doorway in Google’s eyes.

    I’m not sure I entirely agree with Cutts’ point about it being annoying, however. Yes, I suppose it can be annoying. That really depends on the user, and how they use Twitter. I’m guessing that it is, in fact, annoying to Cutts.

    Just as some sites treat their Twitter feed like an RSS feed, however, there are plenty of Twitter users who use it as such. A lot of people don’t use RSS, and would simply prefer to get their news via Twitter feed. Some users in this category (I consider myself among them) follow sites on Twitter because they want to follow the content they’re putting out. It’s really about user preference. Not everybody uses Twitter the same way, so you have to determine how you want to approach it.

    Cutts is definitely right in that some may unfollow you, but there could be just as many who will follow you because they want the latest.

    Either way, it doesn’t appear to be an issue as far as Google rankings are concerned.

  • Google Penguin Update: A Lesson In Cloaking

    There are a number of reasons your site might have been hit by Google’s recent Penguin update (formerly known as the Webspam update). Barring any unintended penalties, the algorithm has wiped out sites engaging in webspam and black hat SEO tactics. In other words, Google has targeted any site that is violating its quality guidelines.

    One major thing you need to avoid (or in hindsight, should have avoided) is cloaking, which is basically just showing Google something different than you’re showing users. Google’s Matt Cutts did a nice, big video about cloaking last summer. He calls it the “definitive cloaking video,” so if you have any concern that you may be in the wrong on this, you’d better watch this. It’s nearly 9 minutes long, so he packs in a lot of info.

    Cloaking is “definitely high risk,” Cutts says in the video.

    With Penguin, there’s been a lot more talk about bad links costing sites. Link schemes are specifically mentioned in Cutts’ announcement of the Penguin update, and Google has been sending webmasters a lot of messages about questionable links recently. That’s definitely something you don’t want to ignore.

    But while Google didn’t mention cloaking specifically in the announcement, it did say the update “will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.” Cloaking fits that bill. Google divides its quality guidelines into basic principles and specific guidelines. Cloaking appears in both sections.

    “Make pages primarily for users, not for search engines,” Google says in the Basic Principles section. “Don’t deceive your users or present different content to search engines than you display to users, which is commonly referred to as ‘cloaking.’”

    In the Specific Guidelines section, Google says, “Don’t use cloaking or sneaky redirects.” This has its own page in Google’s help center. Specific examples mentioned include: serving a page of HTML text to search engines, while showing a page of images or flash to users, and serving different content to search engines than to users.

    “If your site contains elements that aren’t crawlable by search engines (such as rich media files other than Flash, JavaScript, or images), you shouldn’t provide cloaked content to search engines,” Google says in the help center. “Rather, you should consider visitors to your site who are unable to view these elements as well.”

    Google suggests using alt text that describes images for users with screen readers or images turned off in their browsers, and providing textual contents of JavaScript in a noscript tag. “Ensure that you provide the same content in both elements (for instance, provide the same text in the JavaScript as in the noscript tag),” Google notes. “Including substantially different content in the alternate element may cause Google to take action on the site.”
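
    If you want to sanity-check your own site, one rough approach (my suggestion, not something Google prescribes) is to request a page twice, once with a normal browser user-agent and once with a Googlebot-style user-agent, and compare the visible text. This only catches user-agent based cloaking; IP-delivery cloaking won’t show up this way, and the user-agent strings below are illustrative.

    ```python
    import requests
    from bs4 import BeautifulSoup

    BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"  # illustrative strings
    GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

    def visible_text(html):
        soup = BeautifulSoup(html, "html.parser")
        for tag in soup(["script", "style", "noscript"]):
            tag.decompose()
        return " ".join(soup.get_text().split())

    def cloaking_check(url):
        """Fetch the same URL as a browser and as Googlebot and compare the words.
        Identical text is a good sign; big differences are worth investigating."""
        as_browser = visible_text(requests.get(url, headers={"User-Agent": BROWSER_UA}).text)
        as_bot = visible_text(requests.get(url, headers={"User-Agent": GOOGLEBOT_UA}).text)
        browser_words, bot_words = set(as_browser.split()), set(as_bot.split())
        overlap = len(browser_words & bot_words) / max(len(browser_words | bot_words), 1)
        return overlap, sorted(bot_words - browser_words)[:20]

    # overlap, bot_only = cloaking_check("https://example.com/")
    # print(f"word overlap: {overlap:.0%}; words served only to the bot: {bot_only}")
    ```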

    Also discussed in this section of Google’s help center are sneaky JavaScript redirects and doorway pages.

  • The Importance of Online Brand Reputation

    The kind folks at DKC News just sent over an infographic covering some stats on the importance of online reputation for emerging small businesses, to complement Yahoo! Small Business’ launch of its new Marketing Dashboard today.

    Yahoo’s data points out that the number one source of small business marketing is the internet, and that 83% of consumers are affected by social proofing, which underscores the relevance of a merchant’s online reputation. Social proofing is the practice of using testimonials and a sort of online word of mouth. Eighty percent of internet shoppers likewise say that a bad online reputation can turn them away from a sale.

    As for e-commerce statistics, the Yahoo study points out that 17% of web searches are conducted to seek out a specific product, and that 30% of marketers need help with SEO and product keywords. This is where Yahoo’s new Marketing Dashboard comes into play. Advanced features of the new platform include:

    Local Visibility Pro: Small businesses can increase their online visibility by submitting their business information to over 100 search engines and directories
    Reputation Management Pro: Users get more comprehensive data, plus the ability to track their competitors, receive email alerts, and share positive customer comments via social channels or email
    Integrated campaign tracking: Small businesses can also attract new customers by taking advantage of marketing services from featured, best-in-class third party marketing vendors, including Constant Contact and OrangeSoda, and display and monitor results from campaigns with those vendors from within the Yahoo! Marketing Dashboard

    An added perk to the new dashboard is that the entire service is free.

  • Google Penguin Update Recovery: Hidden Text And Links

    There’s been a lot of discussion about Google’s Penguin update since it was launched. The update, if you haven’t been following, is about decreasing the rankings of sites that are violating Google’s quality guidelines. With that in mind, it seems like a good idea to take a good look at these guidelines.

    Here are some articles about:

    Cloaking
    Links
    Keyword Stuffing

    The guidelines have been around for a long time, and Google has enforced them for just as long. In that regard, the Penguin update is nothing new. It’s just that Google thinks it has a new way to better enforce the guidelines. You should expect that Google will only continue to improve, so your best bet is to simply abide. That is, if you care about your Google rankings.

    Another one of Google’s guidelines is: Avoid hidden text or hidden links. This means don’t use white text on a white background. It means don’t hide text behind images. Don’t use CSS to hide text. Don’t set the font size to 0. These are specific examples of what not to do, straight from Google’s help center page on the topic. When you do these things, Google will deem your site untrustworthy and will go out of its way not to rank your site well, or, more likely, will de-index it completely.

    “If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages,” Google says. “When evaluating your site to see if it includes hidden text or links, look for anything that’s not easily viewable by visitors of your site. Are any text or links there solely for search engines rather than visitors?”

    Google defines hidden links as links intended only to be crawled by Googlebot and unreadable to humans, whether because they consist of hidden text, because CSS has been used to make them as small as one pixel high, or because the link is hidden in a small character (such as a hyphen in the middle of a paragraph).
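
    A crude first pass at auditing a page for the patterns Google names can be done by scanning inline styles. A real audit needs the external stylesheets and a rendering engine, so treat this sketch as a starting point only.

    ```python
    import re
    from bs4 import BeautifulSoup

    # Inline-style patterns that commonly hide text or links from visitors.
    HIDING_PATTERNS = [
        r"display\s*:\s*none",
        r"visibility\s*:\s*hidden",
        r"font-size\s*:\s*0",
        r"text-indent\s*:\s*-\d{3,}px",  # pushed far off screen
    ]

    def find_hidden_text(html):
        """Flag elements whose inline style matches a hiding pattern and which
        still contain text or links. External CSS and JavaScript tricks are
        not checked here."""
        soup = BeautifulSoup(html, "html.parser")
        flagged = []
        for tag in soup.find_all(style=True):
            style = tag.get("style", "").lower()
            if any(re.search(p, style) for p in HIDING_PATTERNS):
                text = " ".join(tag.get_text().split())
                if text or tag.find("a"):
                    flagged.append((tag.name, style, text[:80]))
        return flagged

    # for name, style, text in find_hidden_text(open("page.html").read()):
    #     print(f"<{name} style='{style}'> hides: {text}")
    ```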

    “If you’re using text to try to describe something search engines can’t access – for example, Javascript, images, or Flash files – remember that many human visitors using screen readers, mobile browsers, browsers without plug-ins, and slow connections will not be able to view that content either,” Google says. “Using descriptive text for these items will improve the accessibility of your site. You can test accessibility by turning off Javascript, Flash, and images in your browser, or by using a text-only browser such as Lynx.”

    Here are a couple of relevant videos from Matt Cutts that you should probably watch, if you have any questions about how Google handles hidden text. They’re short, so don’t worry.

    Probably the best takeaway from Google’s advice on hidden text and links is that you should either remove them or make them more easily viewable (in the case that they’re actually relevant).

    More Penguin Update coverage here.

    Image: Batman TV Series (ABC)

  • Google Penguin Update Recovery: Getting Better At Keywords

    Last week, Google unleashed its Penguin update upon webmasters. The update, as you may know, was designed to decrease the rankings of sites engaging in black hat SEO tactics and webspam. One of the classic black hat tactics is keyword stuffing, so if you’ve been doing this and getting away with it in the past, there’s a good chance the update took you down a notch.

    Specifically, Google’s Matt Cutts said the update “will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.” Avoiding keyword stuffing has long been one of these guidelines. The guideline says, “Don’t load pages with irrelevant keywords.”

    Google has a page about this in its help center, where it elaborates a little more. Here’s what Google says, verbatim, about keyword stuffing there:

    “Keyword stuffing” refers to the practice of loading a webpage with keywords in an attempt to manipulate a site’s ranking in Google’s search results. Filling pages with keywords results in a negative user experience, and can harm your site’s ranking. Focus on creating useful, information-rich content that uses keywords appropriately and in context.

    To fix this problem, review your site for misused keywords. Typically, these will be lists or paragraphs of keywords, often randomly repeated. Check carefully, because keywords can often be in the form of hidden text, or they can be hidden in title tags or alt attributes.
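
    Google doesn’t publish how its keyword stuffing classifier works, but the manual review described above, looking for repeated terms in body copy, title tags, and alt attributes, can be partly automated. A rough sketch; the thresholds that should worry you are a judgment call.

    ```python
    from collections import Counter
    from bs4 import BeautifulSoup

    STOPWORDS = {"the", "and", "a", "an", "of", "to", "in", "for", "on", "is", "with"}

    def term_density(html, top_n=10):
        """Report the most repeated terms and their share of the page's words,
        plus the title and alt attributes, the places Google says to check."""
        soup = BeautifulSoup(html, "html.parser")
        words = [w for w in soup.get_text().lower().split()
                 if w.isalpha() and w not in STOPWORDS]
        counts = Counter(words)
        total = max(len(words), 1)
        top = [(term, n, n / total) for term, n in counts.most_common(top_n)]
        title = soup.title.get_text() if soup.title else ""
        alts = [img.get("alt", "") for img in soup.find_all("img")]
        return top, title, alts

    # top, title, alts = term_density(open("page.html").read())
    # for term, n, share in top:
    #     print(f"{term}: {n} times ({share:.1%} of all words)")
    ```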

    Unlike some of the other black hat tactics advised against in the guidelines, such as cloaking, Google specifically named keyword stuffing in its announcement of the Penguin update. Cutts even provided the following image in the announcement, highlighting this particular tactic:

    Penguin Announcement: Keyword Stuffing

    Cutts has spoken out about the practice plenty of times in the past. Here’s a humorous example of when he called out one site in particular about five years ago.

    More recently – last month, in fact – Cutts talked about a related violation in a Google+ update. He discussed phone number spam, which he essentially equates to keyword stuffing.

    “I wanted to clarify a quick point: when people search for a phone number and land on a page like the one below, it’s not really useful and a bad user experience. Also, we do consider it to be keyword stuffing to put so many phone numbers on a page,” he wrote. “There are a few websites that provide value-add for some phone numbers, e.g. sites that let people discuss a specific phone number that keeps calling them over and over. But if a site stuffs a large number of numbers on its pages without substantial value-add, that can violate our guidelines, not to mention annoy users.”

    Here’s the image he was referring to:

    Phone Number Spam

    Getting Better At Keywords

    Cutts has advised that you not spend any time worrying about the keywords meta tag (though Google does use the meta description tag).

    In March, Google released a video about 5 common SEO mistakes and 6 good ideas:

    One of the “good ideas” was:

    Include relevant words in your copy: Try to put yourself in the shoes of searchers. What would they query to find you? Your name/business name, location, products, etc., are important. It’s also helpful to use the same terms in your site that your users might type (e.g., you might be a trained “flower designer” but most searchers might type [florist]), and to answer the questions they might have (e.g., store hours, product specs, reviews). It helps to know your customers.

    I’d suggest including them in your titles as well.

    Matt Cutts has talked about keywords a lot in various Webmaster Help videos. If you want to make sure you’re getting keywords right, I’d advise watching some of these discussions (straight from the horse’s mouth). They’re generally short, and won’t require a lot of time.

  • Google Gives More Details On Human Raters

    Google has people that it pays to rate the quality of search results. They’re called raters. Google mentioned them last year in a widely publicized interview with Wired – the interview, in fact, in which the Panda update’s name was revealed.

    Given that the Panda update was all about quality, many webmasters became very interested in these raters and their role in the ranking process.

    Google talked about them a little at PubCon in November, and then in December, Google’s Matt Cutts talked about them some more, saying, “Even if multiple search quality raters mark something as spam or non-relevant, that doesn’t affect a site’s rankings or throw up a flag in the url that would affect that url.”

    Cutts posted a new video about the raters today, giving some more details about how the process works.

    Very in-depth video today about how Google uses human eval rater data in search: http://t.co/9Nhn44TP Please RT!

    “Raters are really not used to influence Google’s rankings directly,” says Cutts in the video. “Suppose an engineer has a new idea. They’re thinking, oh, I can score these names differently if I reverse their order because in Hungarian and Japanese that’s the sort of thing where that can improve search quality. What you would do is we have rated a large quantity of urls, and we’ve said this is really good. This is bad. This url is spam. So there are 100s of raters who are paid to, given a url, say is this good stuff? Is this bad stuff? Is it spam? How useful is it? Those sorts of things.”

    “Is it really, really just essential, all those kinds of things,” he continues. “So once you’ve gotten all those ratings, your engineer has an idea. He says ‘OK, I’m going to change the algorithm.’ He changes the algorithm and does a test on his machine or here at the internal corporate network, and then you can run a whole bunch of different queries. And you can say OK, what results change? And you take the results that change and you take the ratings for those results, and then you say, overall, do the results that are returned tend to be better, right? They’re the sort of things that people rated a little bit higher rather than a little bit lower? And if so, then that’s a good sign, right? You’re on the right path.”

    “It doesn’t mean that it’s perfect, like, raters might miss some spam or raters might not notice some things, but in general you would hope that if an algorithm makes a new site come up, then that new site would tend to be higher rated than the previous site that came up,” he continues. “So imagine that everything looks good. It looks like it’s a pretty useful idea. Then the engineer, instead of just doing some internal testing, is ready to go through sort of a launch evaluation where they say how useful is this? And what they can do is they can generate what’s called a side by side. And the side by side is exactly what it sounds like. It’s a blind taste test. So over here on the left-hand side, you’d have one set of search results. And on the right-hand side you’d have a completely different set of search results.”

    Google showed the raters in a video last year, which actually showed a glimpse of the side-by-side:

    “If you’re a rater, that is a human rater, you would be presented with a query and a set of search results,” Cutts continues. “And given the query, what you do is you say, “I prefer the left side, ” or “I prefer the right side.” And ideally you give some comments like, ‘Oh, yes, number two here is spam,’ or ‘Number four here was really, really useful.’ Now, the human rater doesn’t know which side is which, which side is the old algorithm and which side is the new test algorithm. So it’s a truly blind taste test. And what you do is you take that back and you look at the stuff that tends to be rated as much better with the new algorithm or much worse with the new algorithm.”

    “Because if it’s about the same then that doesn’t give you as much information,” he says. “So you look at the outliers. And you say, ‘OK, do you tend to lose navigational home pages? Or under this query set do things get much worse?’ And then you can look at the rater comments, and you can see could they tell that things were getting better? If things looked pretty good, then we can send it out for what’s known as sort of a live experiment. And that’s basically taking a small percentage of users, and when they come to Google you give them the new search results. And then you look and you say OK, do people tend to click on the new search results a little bit more often? Do they seem to like it better according to the different ways that we try to measure that? And if they do, then that’s also a good sign. ”

    Cutts acknowledges that the raters can get things wrong, and that they don’t always recognize spam.
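
    As an illustration of the side-by-side process (this is a toy harness, not Google’s tooling), the aggregation boils down to a blind preference count: randomly assign the old and new result sets to the left and right panes, record which side each rater prefers, and see how often the new algorithm wins.

    ```python
    import random

    def run_side_by_side(queries, old_results, new_results, rate):
        """Blind side-by-side: for each query, randomly assign the old and new
        result sets to the left and right panes, ask the rater which side they
        prefer, and count how often the preference lands on the new algorithm."""
        wins = ties = 0
        for q in queries:
            new_on_left = random.random() < 0.5
            left, right = (new_results[q], old_results[q]) if new_on_left \
                          else (old_results[q], new_results[q])
            choice = rate(q, left, right)  # rater returns "left", "right", or "same"
            if choice == "same":
                ties += 1
            elif (choice == "left") == new_on_left:
                wins += 1
        decided = len(queries) - ties
        return wins / decided if decided else 0.5

    # `rate` would be a human in practice; a stand-in for testing the harness:
    # rate = lambda q, left, right: "same"
    ```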

  • SEOmoz Raises $18 Million in VC Funding

    SEOmoz, a startup that develops Search Engine Optimization (SEO) software, today announced that the company has raised $18 million in venture capital funding. The announcement came in a post on the company’s Daily SEO blog by SEOmoz CEO Rand Fishkin. The $18 million in series B funding was provided by The Foundry Group and Ignition Partners. The company’s only other round of funding was in 2007, when it raised $1.1 million from Ignition Partners and Curious Office. This round of funding brings Foundry up to a 17% share in the company, with Ignition having a 15% share. A recent WebProNews interview with Fishkin about a past failure to raise funds for SEOmoz can be viewed here.

    “SEOmoz is one of those companies that you just know is going to do big things,” said Brad Feld, Managing Director of The Foundry Group. “I tend to judge the organizations I invest in based on character, culture and leadership. I believe Moz has exceptional depth in all these areas and the financial growth trajectory to back them up. The relationship is a great fit from all angles and I’m positive we have a very successful future ahead of us.”

    The software that SEOmoz creates crawls websites to find errors or missed opportunities for SEO and then makes recommendations based on SEO best practices. The startup also helps websites battle negative SEO. SEOmoz has 15,000 current paying subscribers and predicts it will take in $18-20 million in 2012.

    In addition to the highly detailed blog announcement that tells the entire story of the funding, SEOmoz has put out an official press release that is littered with internet memes. One example can be seen below, and represents the tenuous explanation for their less-than-serious release. The rest of the memes, which, I must warn you, are not all winners, can be seen here.

    SEOmoz meme press release

    Despite the silly press release, SEOmoz is sincerely hoping this investment will allow their startup to grow and flourish. “In that first phone call with Brad, I knew we’d found someone special,” said Fishkin. “I was, honestly, scared of starting another fundraising process after our previous two attempts, but the chemistry between Foundry and Moz was instant – we couldn’t ask for a better fit. This new partnership coupled with the continued support of our original investors, Ignition Partners, gives us the ability to achieve some remarkable milestones in the years to come.”

    How do you feel about companies that cast their SEO magic on websites? Should all companies be less serious about their press releases? Leave a comment below and let us know.

  • Google’s Fresh Results: Irrelevancy In Action

    Google continues to place a certain emphasis on the freshness of search results. Even with its latest monthly list of algorithm changes (which reminds me, another one should be coming out any day now), Google had five different changes related to freshness.

    Do you think Google’s increased emphasis on freshness has made results better? Let us know what you think in the comments.

    I’ve hinted at it several times while writing about Google, but I’ve never come out and written an article specifically about this. Google’s emphasis on freshness is often burying the more relevant results. While I run into this problem fairly often, I ran into it while I was working on my last article, so I decided to go ahead and point out an example of what I’m talking about.

    WebProNews puts out a lot of content. I put out a fair amount myself, and sometimes I simply find it easiest to go to Google to search for past articles I know we’ve written, when I want to reference something we’ve talked about in the past. When I do this, I’ll usually search for “webpronews” and a few keywords I know are relevant to the article I’m looking for. Sometimes Google will give me exactly what I need immediately. Sometimes, however, freshness gets in the way, and this example proves that.

    In this case, I was looking for the article I wrote back in August called “Does Google Need Twitter?” So I searched, “webpronews, does google need twitter”. I can’t imagine what else could be more relevant to that query than that article. According to Google (and this is with or without Search Plus Your World turned on, mind you), two more recent stories I wrote about the Penguin update (both from today) were more relevant to that search.

    Fresh isn't always more relevant

    The only mention of Twitter in either of the two articles ranking above the one I was actually looking for, comes in the author bio sections, where it says to follow me on Twitter. I’m not sure what signals Google was looking at to determine that these results would be relevant to me for that query, but clearly freshness was given too much weight.

    This is just one example, of course, but I see this all the time. I’ve seen others mention it here and there as well. We had a comment from Matt, on a past article, for example, who said:

    “I find that recency is often given more credence than relevancy. Sometimes the content I’m looking for is older. Not all of the best content on the web happened in the last week.” Exactly! I thought it was just me. Freshness over relevancy was driving me nuts, I started using Bing it was getting so bad. Turns out Bing is actually pretty awesome.

    Google may be looking to compensate for its lack of realtime Twitter data, which it lost as the result of a deal between the two companies expiring last year (in fact, that’s what “Does Google Need Twitter” was about).

    We get it. Google can index new, fresh content. That’s good. I wouldn’t have it any other way. However, when Google had realtime search, it came in the form of a little box in the results, much like other universal search results appear – like when you get results from Google News. The latest tweet wasn’t presented as the top, most relevant result, just because it was indexed a minute ago.

    Realtime search was Google’s best example of freshness, in my opinion, and that went away with the Twitter deal, although Google has hinted that it could return, with Google+ and other data. I don’t think it would work as well without Twitter though. But this is one important area of search where Google isn’t cutting it anymore. If you want the latest, up-to-the-second info or commentary on something, where are you going? Google or Twitter?

    Interestingly enough, the fact that Twitter is better in this case, gives Google one line of defense against antitrust accusations. There is competition. In fact, verticals like this, with efforts from different companies (including Twitter) that have the potential to chip away at various pieces of Google dominance, may just be Google’s biggest weakness. I’ve had a conversation with one Googler, which leads me to believe the company tends to agree.

    We saw how Google was falling short in the area of realtime search, in particular, when Muammar Gaddafi died.

    Google continues to make changes to its algorithm every day, and a focus on quality, with both the Panda update and the Penguin update, is a good thing, even if these updates may not be entirely perfect. It’s also good to have content that’s as fresh as possible, so I don’t want to say that Google’s focus on improving freshness is bad either. But I do feel that Google may be giving a little too much weight to this signal in its ranking process, just as it may be giving a little too much weight to social signals for some types of queries.

    Either way, it clearly pays to keep putting out fresh content.

    Have you noticed relevancy being sacrificed for freshness in Google results? Let us know in the comments.

    Image: The Fresh Prince Of Bel-Air (via)

  • Google Panda Update: Tips To Stay On Its Good Side

    Martin Panayotov recently wrote a post for SEOmoz about how a site (MyMovingReviews.com) he works for managed to benefit from the Panda updates. It’s an interesting article because usually you see articles about people who have been hit by the Panda update. Those who actually gained from the updates don’t have much to complain about, so they’re not as vocal.

    We reached out to Panayotov for more on the site’s Panda success, so perhaps you can learn a thing or two (if your site was negatively impacted) from what he has to say.

    Some will no doubt dispute this, but Panayotov believes Google’s search quality has gotten better with Panda.

    “Until the last update, Google emphasized on quality with the Panda updates,” he tells us. “We saw a lot of content farms going down to make space for the niche websites. Usually the niche websites will be more informative since they are operated by experts in the niche. General websites have less in-depth knowledge on the subject and given that they should be below the experts. Also duplicate content is gone and the thin pages are nowhere to be found on the 1st page.”

    So what did the site do right?

    “With My Moving Reviews we started with improving the quality,” he says. “We thought what would someone need to know before moving. Since we are in the moving and relocation niche, we try to be the most user friendly and informational source on the subject. Also we wanted to improve the visibility of all the great articles within the website.”

    “This helped keeping the visitor more engaged and at the same time improving the metrics that Google monitors so closely within the Panda algorithm – the bounce rate for example,” he says. “We also worked on the CTR to make sure we are preferred within the SERPs. We also included some rich snippet markups as well as authorship markup as our authors are experts in the niche. We wanted to make sure the readers know who created the article and to be able to interact and connect.”
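
    The two metrics he singles out, bounce rate and click-through rate, are straightforward to compute from your own data; whether and how Google actually folds them into Panda is not something Google has confirmed. A quick sketch of the definitions:

    ```python
    def bounce_rate(sessions):
        """Share of sessions that viewed exactly one page before leaving.
        `sessions` is a list of page-view counts, one per visit."""
        if not sessions:
            return 0.0
        return sum(1 for pages in sessions if pages == 1) / len(sessions)

    def click_through_rate(clicks, impressions):
        """Clicks divided by impressions, e.g. from Webmaster Tools query data."""
        return clicks / impressions if impressions else 0.0

    print(f"bounce rate: {bounce_rate([1, 3, 1, 5, 2]):.0%}")  # 40%
    print(f"CTR: {click_through_rate(120, 4800):.1%}")         # 2.5%
    ```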

    “We love user generated content,” he adds. “Starting from the moving company reviews, the company responses to the blog comments and Facebook comments – we are working on making it even easier for the visitors to share, comments and interact. We want to cover every side of the story by making the process easier and more user friendly. I think every major website and brand should try to utilize more and more UGC within their websites. This will also help with the search engines since they love fresh content.”

    They do love fresh content. Google, in particular, has displayed an increased emphasis on the freshness of search results in recent months. Since the Freshness update in November, Google has continued to make more subtle improvements here and there related to fresher results.

    I say improvements, though in all honesty, I think they go too far with freshness sometimes. The right answer isn’t always from this week. Though, Google certainly had to do something to help fill the void left by the expiration of its deal with Twitter.

    “We analyzed our traffic closely,” continues Panayotov. “It turned out that people are searching for moving services directly from their phones more and more. First we created iPhone and Android apps to cover that, but it wasn’t enough. This is why we created a mobile website. This was a great move because we instantly increased conversions, increased the apps downloads, reduced the bounce rate and we made our visitors happier.”

    By the way, Google just released some new mobile AdWords features that could help in this department as well.

    Panayotov offers five points of advice for those who have struggled with Panda:

    1. Produce great content on a regular basis. Make sure you have a plan on content marketing. Don’t go for the keywords articles only. There are a lot of great content opportunities out there – make sure you utilize that.

    2. Sometimes the most unexpected articles get the most shares and re-tweets, so make sure you try different approach with every article to find the right spot.

    3. Improve social metrics and especially Google Plus and Twitter. Be active there and connect with your industry leaders. Share great stuff on a regular basis not only from your websites, but from other great sources in the niche. This will help with SEO too and also will give you ideas about new content.

    4. Work on your website and make sure you can receive user generated content. This will help your engagement metrics and will boost your rankings. Also try to markup all content properly. If you do, you may get rich snippets which will increase your click-through rate. If applicable, go with the authorship markup.

    5. Improve bounce rate and make sure visitors won’t leave your website seconds after being there. Make them read what you have to say. You can do that by making your articles easy to scan before reading. Use a lot of H tags, bullet points and great images. Save your visitors’ time by structuring your information better. There are also some other tricks you can do like adding a fly-box on your blogs or by having a news section visible from within the posts.

    With regards to recent Google updates, Panayotov says, “There is another factor introduced that I think have something to do with keyword density and synonyms. It affects some of the heavily mentioned main keywords as well as long tails. As it is a pretty fresh update there is still not a lot of information out there.”

    “On the other hand, the new Penguin update might be looking after the links,” he adds. “There are some recent speculations about penalizing footer links and also too many links from blog websites. There are some interesting signals from Google that we are just starting to analyze.”

    “As Google tries to make sure it gets harder for SEOs to manipulate the SERPs, an idea would be to structure your website and information as if there was no Google,” says Panayotov. “Focus on the visitor. Make sure you lead them the way they were supposed to and not for the best SEO benefit. I believe this would be the best long-term advice.”

    Given that Google always says it wants to deliver the best content for the user, and that Google’s Matt Cutts said Google wants people doing no SEO at all to be “free to focus on creating amazing, compelling web sites,” that’s probably not bad advice.

  • Recovering From Google’s Penguin Update

    First, before you start your campaign for Penguin recovery, you should probably determine whether you were actually hit by the Penguin update, or by the Panda update (or even some other Google algorithm change).

    Shortly after the Penguin update rolled out, Google’s Matt Cutts revealed that Google had implemented a data refresh for the Panda update several days earlier. This threw off early analysis of the Penguin update’s effects on sites, as the Panda update was not initially accounted for. Searchmetrics put out a list of the top losers from the Penguin update, which was later revised to reflect the Panda refresh.

    Google also makes numerous other changes, and there’s no telling how many other adjustments they made between these two updates, and since the Penguin update. That said, these two would appear to be the major changes most likely to have had a big impact on your site in the last week or two.

    According to Cutts, the Panda refresh occurred around the 19th. The Penguin update (initially referred to as the Webspam Update) was announced on the 24th. The announcement indicated it could take a “few days”. Analyze your Google referrals, and determine whether they dropped off before the 24th (and around or after the 19th), and you should be able to determine if you are suffering the effects of Panda or Penguin, at least in theory.
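
    That timeline check is easy to script if you can export daily Google referral counts from your analytics package. A minimal sketch; the dates come from the article above, and the drop detection is a crude threshold rather than a proper statistical test.

    ```python
    from datetime import date

    PANDA_REFRESH = date(2012, 4, 19)
    PENGUIN_ANNOUNCED = date(2012, 4, 24)

    def drop_date(daily_google_referrals, threshold=0.6):
        """Return the first day whose referrals fell below `threshold` times the
        average of the preceding seven days. Input is a dict of date -> visits,
        exported from your analytics package."""
        days = sorted(daily_google_referrals)
        for i in range(7, len(days)):
            baseline = sum(daily_google_referrals[d] for d in days[i - 7:i]) / 7
            if baseline and daily_google_referrals[days[i]] < threshold * baseline:
                return days[i]
        return None

    def likely_update(drop):
        if drop is None:
            return "no clear drop"
        if drop < PANDA_REFRESH:
            return "earlier than either update"
        if drop < PENGUIN_ANNOUNCED:
            return "Panda refresh window (around the 19th)"
        return "Penguin window (the 24th or later; the rollout took a few days)"

    # referrals = {date(2012, 4, day): visits for day, visits in exported_rows}
    # print(likely_update(drop_date(referrals)))
    ```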

    If it looks more likely to be Panda, the best advice is probably to focus on making your content itself better. Also, take a look at Google’s list of questions the company has publicly said it considers when assessing the quality of a site’s content. We’ve written about these in the past, but I’ll re-list them here:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    Penguin is different. Penguin and Panda are designed to work together to improve the quality of Google’s search results. Whether or not you think this is actually happening is another story, but this does appear to be Google’s goal, and at the very least, that’s how it’s being presented to us.

    Google’s announcement of the Penguin update was titled: “Another step to reward high-quality sites”.

    “The goal of many of our ranking changes is to help searchers find sites that provide a great user experience and fulfill their information needs,” Cutts wrote in the post. “We also want the ‘good guys’ making great sites for users, not just algorithms, to see their effort rewarded. To that end we’ve launched Panda changes that successfully returned higher-quality sites in search results. And earlier this year we launched a page layout algorithm that reduces rankings for sites that don’t make much content available ‘above the fold.’”

    If your site was hit by Penguin, you should, again, focus on quality content, and not trying to trick Google’s algorithm. All that Penguin is designed to do is to make Google better at busting you for abusing its algorithm. It’s designed to target those violating Google’s quality guidelines. The guidelines are not new. It’s not some new policy that is turning SEO on its ear. Google just found a way to get better at catching the webspam (again – at least in theory).

    So, with Penguin, rather than a list of questions Google uses to assess content, as with the Panda list, simply look at what Google has to say in the Quality Guidelines. Here they are broken down into 12 tips, but there is plenty more (straight from Google) to read as well. Google’s guidelines page has plenty of links talking about specific things not to do. We’ll be delving more into each of these in various articles, but in general, simply avoid breaking these rules, and you should be fine with Penguin. If it’s too late, you may have to start over, and start building a better link profile and web reputation without spammy tactics.

    Here’s a video Matt Cutts recently put out, discussing what will get you demoted or removed from Google’s index:

    Assuming that you were wrongfully hit by the Penguin update, Google has a form that you can fill out. That might be your best path to recovery, but you really need to determine whether or not you were in violation of the guidelines, because if you can look at your own site and say, “Hmm…maybe I shouldn’t have done this particular thing,” there’s a good chance Google will agree, and determine that you were not wrongfully hit.

    By the way, if you have engaged in tactics that do violate Google’s quality guidelines, but you have not been hit by the Penguin update, I wouldn’t get too comfortable. Google has another form, which it is encouraging people to fill out when they find webspam in search results.

    To report post-Penguin spam, fill out https://t.co/di4RpizN and add “penguin” in the details. We’re reading feedback.

    Google has had this form for quite a while, but now that some people are getting hit by the Penguin update, they’re going to be angry, and probably eager to point out spam that Google missed, in an “If what I did was so bad, why wasn’t this site hit too?” kind of mentality.

    Another reason not to be too comfortable would be the fact that Google is likely to keep iterating upon the Penguin update. We’ve seen plenty of new versions and data refreshes of the Panda update come over the past year or so. Penguin is already targeting what Google has long been against. I can’t imagine that they won’t keep making adjustments to make it better.

  • Webspam And Panda Updates: Does SEO Still Matter?

    It’s been a crazy week in search. While not entirely unexpected, Google launched its new Webspam update (which should still be in the process of rolling out, as Google said it would take a few days). This update, according to the company, is aimed at black hat SEO tactics and the sites engaging in them, to keep them from ranking over content that is just better and more relevant. While most that don’t engage in such tactics would agree that this would be a good thing, a lot of people are complaining about the effects of the update on the user experience, and on results in general.

    Do you think Google’s results have improved or gotten worse with this update? Let us know in the comments.

    The Webspam update, as it’s officially been dubbed by Google’s Matt Cutts, is really only part of the equation though. Cutts also revealed that Google launched a data refresh of the Panda update around April 19th. So it would appear that a mixture of these two updates (along with whatever other tweaks Google may have made) have caused a lot of chaos among webmasters and in some search results.

    What The Panda Update Is About

    I’m not going to spend a lot of time talking about Panda here. I feel I’ve done that enough for the past year. If you’re not familiar with Panda, I’d suggest reading through our coverage here. Essentially, it’s Google’s attempt to make quality content rise to the top. There are a lot of variables, opinions and speculation throughout the Panda saga, but in a nutshell, it’s just about Google wanting good, quality content ranking well.

    What The Webspam Update Is About

    Interestingly enough, the Webspam update is about quality content as well. In fact, Google’s announcement of the update was titled: Another Step To Reward High-Quality Sites. It can be viewed as a complement to Panda. A way for Google to keep spammy crap from interfering with the high quality content the Panda update was designed to promote. That is, in a perfect world. But when has this world ever been perfect? When has Google ever been perfect?

    When Matt Cutts first talked about this update, before it had a name or people even really knew what to expect, he said Google was going after “over-optimization”. He said, at SXSW last month, “The idea is basically to try and level the playing ground a little bit, so all those people who have sort of been doing, for lack of a better word, ‘over-optimization’ or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little more level.”

    At the time, we wrote an article about it, talking about how Google was working on making SEO matter less. This week, Cutts aimed to clarify this a bit. Danny Sullivan quotes Cutts as saying, “I think ‘over-optimization’ wasn’t the best description, because it blurred the distinction between white hat SEO and webspam. This change is targeted at webspam, not SEO, and we tried to make that fact more clear in the blog post.”

    Well, it’s clear that black hat webspam is a target, because the post says those exact words. “The opposite of ‘white hat’ SEO is something called “black hat webspam” (we say ‘webspam’ to distinguish it from email spam),” Cutts says in the post, later adding, “In the next few days, we’re launching an important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines. We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content.”

    OK, so as long as you abide by Google’s quality guidelines, this update should not impact you negatively, right?

    The part that isn’t quite as clear is how much SEO tactics really matter. While he has clarified that they’re more concerned with getting rid of the black hat stuff, he also said something in that post which would seem to indicate that Google wants content from sites not worried about SEO at all to rank better too (when it’s good, of course).

    “We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites,” says Cutts. Emphasis added.

    To me, that says that Google is not against white hat SEO (obviously – Google promotes plenty of white hat tactics), but they also would like to have it matter less.

    While I’m sure many in the SEO industry would disagree (because it could cost them their businesses), wouldn’t it ultimately be better for users and webmasters alike if they didn’t have to worry about SEO at all? If Google could just determine what the best results really were?

    Don’t worry, SEOs. We don’t live in that fantasy land yet, and while Google (and its competitors) would love to be able to do this, there is little evidence to suggest that will happen in the foreseeable future. In fact, I’d expect the nature of how we consume information from the web to evolve so much by that point, that it may not even be a relevant discussion.

    But rather than talk about what the future may bring (though Google’s certainly thinking about it), let’s focus on the here and now.

    Who Has Felt The Effects Of Google’s Updates?

    You can browse any number of forum threads and blog comments and see plenty of personal stories about sites getting hit. Searchmetrics, as it usually does following major Google updates, compiled some preliminary lists of the top winners and losers. Before we get to those lists, however, there are some caveats. First, the firm was clear that these are preview lists. Second, the update has probably not finished rolling out yet. Third, they were put out before the Panda refresh was made public, and Matt Cutts says the list isn’t indicative of the sites impacted by the Webspam update.

    He told Sullivan, “There’s a pretty big flaw with this “winner/loser” data. Searchmetrics says that they’re comparing by looking at rankings from a week ago. We rolled out a Panda data refresh several days ago. Because of the one week window, the Searchmetrics data include not only drops because of the webspam algorithm update but also Panda-related drops. In fact, when our engineers looked at Searchmetrics’ list of 50 sites that dropped, we only saw 2-3 sites that were affected in any way by the webspam algorithm update. I wouldn’t take the Searchmetrics list as indicative of the sites that were affected by the webspam algorithm update.”

    OK, so the lists are apparently more indicative of the latest Panda victims and winners. We still don’t really know who the biggest losers and winners on the Webspam front are. Perhaps Searchmetrics will release another list soon, with this new information taken into account.

    Here are the lists:

    Searchmetrics list

    Searchmetrics list

    Note that Demand Media’s eHow.com is not on the list. If you’ve followed the Panda saga all the way, you’ll know that it has always been in the conversation. Thought of as a content farm, it was the kind of site many thought Panda was designed to target. While it managed to escape unscathed for a while, Panda eventually caught up with it, and Demand Media made a lot of changes, which seem to have helped tremendously. They deleted a lot of articles and implemented some other things designed to keep quality up.

    During the company’s most recent earnings call (there’s another one coming in May), Demand Media said it hadn’t been affected by a Google update since July. It will be interesting to see what they say on the next call.

    There is some speculation that eHow may have benefited from recent Google updates, whether Panda or Webspam. Here’s a tweet from WebmasterWorld/PubCon Founder Brett Tabke:

    Did ‘ehow’ just make a comeback in the serps? hmmm – ran into them in 4 searches in last hour.

    We asked Demand Media if they’ve seen any increase in Google referrals. The company won’t comment because they’re in a quiet period ahead of their results announcement.

    Are Google Results Better?

    There is never a shortage of criticism of Google’s search results, yet it has managed to steadily dominate the market, so clearly they’ve remained good enough not to alienate the majority of users. There do, however, seem to be some very identifiable flaws in some search results right now.

    For example, there are all kinds of weird things going on with the SERP for “viagra”. The official site, viagra.com, was not on the first page, when it should have been the first result. Just as I was writing this piece, viagra.com reappeared at number one. More on the other viagra page issues (some of which are still there) here.

    For the query, “make money online,” the top result was a page without any content on it whatsoever. Not what Google had in mind in terms of quality, I assume. Looking now, it actually appears Google has fixed this one too.

    A couple of things webmasters have repeatedly mentioned as possible reasons their Google rankings were hit are exact-match domains and sites with a lot of links from spun content sources. Of course, not every exact-match domain was hit, but it could be a factor for topics that tend to generate a lot of spam. Viagra would certainly fit that bill, and viagra.com may have just been an innocent casualty, which Google had to correct. I wonder how many more of those there are, and whether Google will correct them.

    From what Google says, it’s more about things like keyword stuffing, link schemes and other things that violate its quality guidelines. You may want to go read those carefully.

    Update: Apparently, the Webspam update is now called the Penguin update, even though Cutts already called it the Webspam update. Sigh. I guess I have some re-tagging to do.

    What do you think? Did Google get its Webspam update right? As Panda continues to march on, is that making results better? Share your thoughts in the comments.

  • Reconsideration Request Tips From Google [Updated]

    If you think you’ve been wrongfully hit by Google’s Penguin update, Google has provided a form that you can fill out, in hopes that Google will see the light and get your site back into the mix.

    The update is all about targeting those in violation of Google’s quality guidelines. It’s an algorithmic approach designed to make Google better at what it has been trying to do all along. For those Google has manually de-indexed, there is still a path to redemption, so it seems likely that those impacted by the update can recover as well.

    For example, if you were busted participating in a link scheme, you’re not necessarily out of Google forever. Google says once you’ve made changes to keep your site from violating Google’s guidelines, you can submit a reconsideration request.

    To do so, go to Webmaster Tools, sign into your Google account, make sure you have your site verified, and submit the request.

    Google’s Rachel Searles and Brian White discuss tips for your request in this video:

    “It’s important to admit any mistakes you’ve made, and let us know what you’ve done to try to fix them,” says Searles. “Sometimes we get requests from people who say ‘my site adheres to the guidelines now,’ and that’s not really enough information for us, so please be as detailed as possible. Realize that there are actually people reading these requests.”

    “Ask questions of the people who work on your site, if you don’t work on it yourself,” she suggests, if you don’t know why you’re being penalized. Obviously, read the quality guidelines. She also suggests seeking help on the Google Webmaster forum, if you’d like the advice of a third party.

    “Sometimes we get reconsideration requests, where the requester associates technical website issues with a penalty,” says White. “An example: the server timed out for a while, or bad content was delivered for a time. Google is pretty adaptive to these kinds of transient issues with websites. So if you sometimes misread the situation, as ‘I have a penalty,’ and seek reconsideration, it’s probably a good idea to wait a bit, see if things revert to their previous state.”

    “In the case of bad links that were gathered, point us to a URL-exhaustive effort to clean that up,” he says. “Also, we have pretty good tools internally, so don’t try to fool us. There are actual people, as Rachel said, looking at your reports. If you intentionally pass along bad or misleading information, we will disregard that request for reconsideration.”

    “And please don’t spam the reconsideration form,” adds Searles. “It doesn’t help to submit multiple requests all the time. Just one detailed concise report and just get it right the first time.”

    Google says they review the requests promptly.

    Update: Apparently reconsideration requests don’t do you a lot of good if you were simply hit by the algorithm. A reader shares (in the comments below) an email from Google in response to such a request:

    Dear site owner or webmaster of http://www.example-domain.com/,

    We received a request from a site owner to reconsider http://www.example-domain.com/ for compliance with Google’s Webmaster Guidelines.

    We reviewed your site and found no manual actions by the webspam team that might affect your site’s ranking in Google. There’s no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team.

    Of course, there may be other issues with your site that affect your site’s ranking. Google’s computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users.

    If you’ve experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site’s content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you’ve changed the URLs for a large portion of your site’s pages. This article has a list of other potential reasons your site may not be doing well in search.

    If you’re still unable to resolve your issue, please see our Webmaster Help Forum for support.

    Sincerely,

    Google Search Quality Team

    Anyhow, should you need to submit a reconsideration request (I assume Google will still take manual action as needed), these tips might still come in handy.

    Image: Batman Returns from Warner Bros.

  • Google Penguin Update: Petition Calls For Google To Kill It

    Last week, Google gave frustrated webmasters a place to complain if they felt they were unjustly hit by the Penguin update. While I’m sure Google has received plenty of feedback through that, some are gravitating towards a petition to get Google to kill the Penguin update.

    For those of you who haven’t been following, Google announced the Penguin update (formerly known as the “Webspam update”) on April 24th, to target sites violating Google’s quality guidelines.

    Here’s what the petition says:

    Please kill your Penguin update!

    With the recent Google Penguin update, it has become nearly impossible for small content based websites to stay competitive with large publishers like eHow, WikiHow, Yahoo Answers and Amazon.

    Countless webmasters have seen their livelihoods vanish with this update. Sergey Brin recently came out against “Walled Gardens” of the likes of Facebook. However, the Penguin update has created a similar garden that only admits multimillion dollar publishing platforms.

    I’ll sign off with the words of someone who has lost everything in this update:

    “I got stuffed by it. I have a 7 year old website with SEO work done on it several years ago. No real SEO done in the past 3 years. So I have been penalised for SEO work done 3 years ago is all I can think.

    My website “was” top of its niche, with several hundred multi million pound clients. In the past day we have had a 90% drop in traffic and all but a bare few keywords left with rankings. Over 250 rankings we did have that we monitor each day have gone. These were top 3 rankings, now not even in the top 200.

    We have never done any bad SEO, we need to compete, but we have never done black hat. Saying that, what we did do was borderline, but then so does everyone else so we were left with little choice.

    Overnight my business which supports my 5 children, 3 employees, pays for my mortgage and debts etc has been wiped out.

    Thanks Google. At a time where almost every country in the world is suffering, way to go with applying a little more hardship to people whom have just tried to play the game as does everyone.

    The petition, which seeks 500 signatures, has 289 so far. There are also plenty of comments from webmasters leaving their reasons for signing.

    We saw plenty of stories about people losing their businesses and having to get rid of employees when Google launched the Panda update, and it appears that the Penguin update is having a similar effect.

    It’s still the early days for Penguin. My guess is that we’ll continue to see more adjustments on Google’s part. It’s hard to gauge how well Google’s update did from the outside looking in, in terms of getting rid of webspam and not penalizing the innocent. We have seen some examples where Google results were quite questionable, though Google quickly made adjustments. Of course, examples are always out there waiting to be pointed out, independent from the Penguin update.

    Comic image courtesy: DC Comics: Batman Annual #15 (via alternity)

  • Here’s What Matt Cutts Says to Sites That Have Been Hacked

    Google’s head of Webspam, Matt Cutts, has been in the news a lot this week, thanks to Google’s big webspam update, which has become officially known as the Penguin update. As Cutts says, Google has to deal with more types of spam than just the black hat SEO tactics the update targets. They also have to deal with sites that have been hacked.

    It’s not uncommon to stumble across compromised sites in Google’s search results. In fact, we saw a few with the “This site may be compromised” tag on the SERP for “viagra” this week, when we were analyzing the effects of the Penguin update. While Google addressed some issues with that SERP (viagra.com is ranking at the top again), there are still some compromised results on the page, even today.

    On his personal blog, Cutts posted an example email of what he tells site owners who have sites that Google has identified as hacked. The email (minus identifying details) says:

    Hi xxxxxxx, I’m the head of Google’s webspam team. Unfortunately, example.com really has been hacked by people trying to sell pills. I’m attaching an image to show the page that we’re seeing.

    We don’t have the resources to give full 1:1 help to every hacked website (thousands of websites get hacked every day–we’d spend all day trying to help websites clean up instead of doing our regular work), so you’ll have to consult with the tech person for your website. However, we do provide advice and resources to help clean up hacked websites, for example
    http://support.google.com/webmasters/bin/answer.py?hl=en&answer=163634
    https://sites.google.com/site/webmasterhelpforum/en/faq-malware-and-hacked-sites
    http://googlewebmastercentral.blogspot.com/2008/04/my-sites-been-hacked-now-what.html
    http://googlewebmastercentral.blogspot.com/2007/09/quick-security-checklist-for-webmasters.html
    http://googlewebmastercentral.blogspot.com/2009/02/best-practices-against-hacking.html

    We also provide additional assistance for hacked sites in our webmaster support forum at https://groups.google.com/a/googleproductforums.com/forum/#!forum/webmasters. I hope that helps.

    Regards,
    Matt Cutts

    P.S. If you visit a page like http://www.example.com/deep-url-path/ and don’t see the pill links, that means the hackers are being extra-sneaky and only showing the spammy pill links to Google. We provide a free tool for that situation as well. It’s called “Fetch as Googlebot” and it lets you send Google to your website and will show you exactly what we see. I would recommend this blog post http://googlewebmastercentral.blogspot.com/2009/11/generic-cialis-on-my-website-i-think-my.html describing how to use that tool, because your situation looks quite similar.

    Cutts says the best advice he can give to site owners is to keep their web server software up to date and fully patched. If you want Google’s advice on the other kind of spam, read this.
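    The “Fetch as Googlebot” scenario in Cutts’ P.S. can also be approximated with a crude do-it-yourself check: request the same URL once with a normal browser user agent and once with Googlebot’s user agent, and see whether spammy terms show up only in the “Googlebot” version. The sketch below is only a rough approximation, not Google’s tool; the URL and the list of suspicious words are placeholders, and sophisticated hacks cloak by IP address rather than user agent, which is why Fetch as Googlebot remains the more reliable check.

    # Quick-and-dirty cloaking check (a rough approximation, not a substitute
    # for Google's "Fetch as Googlebot"): fetch a page with a browser user
    # agent and with Googlebot's user agent, and flag spammy terms that are
    # only served to the "Googlebot" request.

    import urllib.request

    BROWSER_UA = "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"
    GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

    def fetch(url, user_agent):
        """Return the page body served to the given user agent."""
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req, timeout=15) as resp:
            return resp.read().decode("utf-8", errors="replace")

    def hidden_spam_terms(url, suspicious=("viagra", "cialis", "casino")):
        """Terms visible to the Googlebot user agent but not to a browser."""
        as_browser = fetch(url, BROWSER_UA).lower()
        as_googlebot = fetch(url, GOOGLEBOT_UA).lower()
        return [w for w in suspicious if w in as_googlebot and w not in as_browser]

    if __name__ == "__main__":
        url = "http://www.example.com/deep-url-path/"  # placeholder from the email above
        found = hidden_spam_terms(url)
        if found:
            print(f"Possible cloaked spam on {url}: {', '.join(found)}")
        else:
            print(f"No obvious user-agent cloaking detected on {url}")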

  • Google Penguin Update Gets Fresh Losers List From Searchmetrics

    Earlier this week, Searchmetrics put out lists of winners and losers from Google’s Penguin update (when it was still called the Webspam update). After the lists were released, Google’s Matt Cutts spoke out about them, saying that they were inaccurate, because there had also been a Panda update, and that the lists were likely more indicative of that.

    Searchmetrics has now updated the lists, acknowledging what Cutts had to say.

    “I took nearly a huge set of keywords from short-head to medium and low search volume and looked at the current rankings from position 1 to 100 and compared the rankings to April 20th,” Searchmetrics Founder Marcus Tober writes. “In the data were also some glitches from the Panda 3.5 update which was going live from April 19th to 20th, Matt Cutts mentioned. But overall you see a trend of those domains which really lost visibility within the Google Penguin update.”

    Penguin Losers

    “A lot of these losers are database-driven websites – they mainly aggregate information and use large database systems to create as many pages as possible. Sites such as songlyrics.com, cubestat.com, lotsofjokes.com or merchandtcircle.com fall into this pattern. It makes sense that these sites will lose visibility,” says Tober. “Press portals and feed aggregators such as pressabout.us, newsalloy.com and bloglines.com were also affected, which makes sense from a Google point of view since these are the website types that are very often created by very aggressive (possibly overly aggressive) SEOs and often contain similar content.”

    He notes that ticketnetwork.com and ticketcity.com fit the profile of what Google is targeting with automatically generated, and possibly spun, content.
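    For readers curious what a before-and-after comparison like Tober’s looks like in practice, here is a minimal sketch. It assumes two hypothetical ranking snapshots (CSV columns: keyword, domain, position 1-100), one from April 20th and one taken after Penguin; the simple 1/position weighting is a stand-in, since Searchmetrics’ actual visibility formula isn’t public.

    # Minimal sketch of a winners/losers comparison in the spirit of the
    # Searchmetrics analysis. Assumes two hypothetical ranking snapshots
    # (CSV columns: keyword, domain, position). The 1/position weighting is
    # illustrative only; Searchmetrics' real visibility score is not public.

    import csv
    from collections import defaultdict

    def visibility(snapshot_path):
        """Sum a simple position weight per domain across all tracked keywords."""
        scores = defaultdict(float)
        with open(snapshot_path, newline="") as f:
            for row in csv.DictReader(f):
                position = int(row["position"])
                if 1 <= position <= 100:
                    scores[row["domain"]] += 1.0 / position  # higher ranks weigh more
        return scores

    before = visibility("rankings_2012-04-20.csv")   # hypothetical file names
    after = visibility("rankings_post_penguin.csv")

    changes = {d: after.get(d, 0.0) - before.get(d, 0.0)
               for d in set(before) | set(after)}
    biggest_losers = sorted(changes.items(), key=lambda kv: kv[1])[:50]

    for domain, delta in biggest_losers:
        print(f"{domain}: {delta:+.2f} visibility change")

    Even a crude score like this makes the earlier caveat concrete: any domain whose drop began before April 24th is more likely a Panda casualty than a Penguin one.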

    If you need to know exactly how to avoid getting caught by Google’s Penguin update, I’d start with the tips they give you. If you think you were unfairly hit by it, you can let Google know with a new form they’re providing.

    Image: Batman Returns From Warner Bros.