WebProNews

Tag: SEO

  • Google’s Matt Cutts: I Was Worried We’d Be Crushed By Altavista

    Google posted a video this week of a presentation from Matt Cutts at the 2012 Korea Webmaster Conference. He talks a bit about “the evolution of search”.

    He starts off talking about Yahoo in the early days, which he says is “a little strange” to call a search engine, “because Yahoo started out as a hand-compiled list of links. So, an individual person would decide what category to put things in, and they would decide whether it deserved to be in a certain category or not. The problem with that is that it doesn’t scale very well. You need to find a search engine that can work across the breadth of the entire web, or else it isn’t going to be as useful for every kind of query that people get.”

    The whole first section is really a history lesson in search, but it’s interesting to hear him talk about his early days with Google.

    “Whenever I joined Google, we were a start-up, so there was less than 100 people, whenever I joined Google,” he said. “And at the time, I was worried that we would be crushed by Altavista. Google was a tiny company. Altavista was a huge company. But Google has something that the other search engines at that time did not do. We looked at the links pointing to web pages.”

    He later said, “I remember whenever I started out at Google, I went and I talked to another company. And they had a list of results that they called featured, and they had a list that they called partnered. And I said, ‘What’s the difference between a featured result and a partner result?’ And the company said, there’s no difference at all. Everything is paid for. And that didn’t seem fair at the time.”

    “I’m proud that even to this day, you can’t pay to get a higher ranking on Google,” he said.

    It’s a 45-minute-long presentation, so you may want to check it out if you have a bit of time to kill.

  • Google Update Suspected By Webmasters Losing Traffic

While unconfirmed, it is possible that Google rolled out a semi-major update over the past weekend, as webmasters have taken to the forums to complain about lost traffic.

    Barry Schwartz points to one WebmasterWorld thread in particular, in which a bunch of webmasters seem to think Google has indeed made a significant change.

    The person who started the thread says some webmasters are experiencing a 30-40% drop in traffic. Another user says they saw a 30-40% drop for a ten-year-old “authority site”.

    “One of my sites was hit on the 23rd.. others (ww members) the 30th… now you guys on the 6th. This update is hitting people in waves on a Thursday / Friday but it doesn’t affect everyone at once. The update appears to be isolated to a selected group each time,” says forum member petehall. “Wonder how long it takes for a Panda refresh to complete, as this seems to be the main connection with the 23rd of March.”

    That is indeed the day Google tweeted:

Panda refresh rolling out now. Only ~1.6% of queries noticeably affected. Background on Panda: http://t.co/Z7dDS6qc

Other forum members chimed in to report similar traffic drops.

Here are the changes Google made in March (at least the ones they have disclosed to the public). Unfortunately, we’ll have to wait a while to see what Google has done in April.

    We’ve reached out to Google to inquire about an update over the weekend, and will update accordingly, though I don’t expect much more than a “we make changes every day” kind of response.

  • Google: If You Care About Your Standing in Search, Don’t Wait Out Penalties

    As previously reported, webmasters with links from paid blog networks that Google recently de-indexed have been receiving letters from Google Webmaster Tools.

    Google’s John Mueller talked a little about such letters and the reconsideration process in a Google Groups thread that you might find interesting (Another good find from the Google Forums by Barry Schwartz).

    “While we have just recently started sending out these messages, they may apply to issues that were already known (and affecting your site’s standing in our search results) for a while,” said Mueller. “If you receive a message like this, and you wish to resolve those issues, then I’d always submit a reconsideration request after having done so. In some cases, you may not be able to resolve all of the issues – if that’s the case, then it’s important to us that you document your efforts (you might even link to a Google Docs file if needed). It’s important to our team that it’s clear that you have taken significant effort to resolve all of the problems in that area, and that they can trust that these kinds of issues will not come back in the future.”

    “In situations where an algorithmic adjustment might have been made, you’re still welcome to submit a reconsideration request. It doesn’t cause any problems to do that, so especially if you’re unsure, submitting one is a good way to be certain,” he says. “Regarding the age of the unnatural links, I’d work to have them all removed, regardless of the age. For instance, in the general case where a site has been buying links for 2 years, it would be a good idea to go back that far.”

He goes on to say that you shouldn’t try to wait a penalty out if you’re serious about your site’s standing in search. “These are generally not issues that expire after a few days, they can affect your site’s standing for quite some time,” he says.

    Google’s own Chrome landing page recently had a 60-day penalty, which may have even hurt the web browser’s market share.

  • Google Algorithm Changes For March Finally Released

Google just posted its big monthly list of “search quality highlights” or, in other words, changes to its algorithm. I was starting to wonder if Google had forgotten about this. For February, they put out the list before the month was even over.

    “We’re starting to get into a groove with these posts, so we’re getting more and more comprehensive as the months go by,” says Johanna Wright, Director of Product Management at Google.

    Some significant points from the list:

    Google Panda Update: Google Mentions Dependence On Offline Processing
    New Google Algorithm Changes Continue To Focus On Freshness
    Google Favors Google+ Less In Search Results
    Google Is Being More Careful With Your Password

    Here’s the full list, as provided by Google:

• Autocomplete with math symbols. [launch codename “Blackboard”, project codename “Suggest”] When we process queries to return predictions in autocomplete, we generally normalize them to match more relevant predictions in our database. This change incorporates several characters that were previously normalized: “+”, “-”, “*”, “/”, “^”, “(”, “)”, and “=”. This should make it easier to search for popular equations, for example [e = mc2] or [y = mx+b].
    • Improvements to handling of symbols for indexing. [launch codename “Deep Maroon”] We generally ignore punctuation symbols in queries. Based on analysis of our query stream, we’ve now started to index the following heavily used symbols: “%”, “$”, “\”, “.”, “@”, “#”, and “+”. We’ll continue to index more symbols as usage warrants.
    • Better scoring of news groupings. [launch codename “avenger_2”] News results on Google are organized into groups that are about the same story. We have scoring systems to determine the ordering of these groups for a given query. This subtle change slightly improves our scoring system, leading to better ranking of news clusters.
    • Sitelinks data refresh. [launch codename “Saralee-76”] Sitelinks (the links that appear beneath some search results and link deeper into the respective site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We’ve recently updated the data through our offline process. These updates happen frequently (on the order of weeks).
    • Improvements to autocomplete backends, coverage. [launch codename “sovereign”, project codename “Suggest”] We’ve consolidated systems and reduced the number of backend calls required to prepare autocomplete predictions for your query. The result is more efficient CPU usage and more comprehensive predictions.
    • Better handling of password changes. Our general approach is that when you change passwords, you’ll be signed out from your account on all machines. This change ensures that changing your password more consistently signs your account out of Search, everywhere.
    • Better indexing of profile pages. [launch codename “Prof-2”] This change improves the comprehensiveness of public profile pages in our index from more than two-hundred social sites.
    • UI refresh for News Universal. [launch codename “Cosmos Newsy”, project codename “Cosmos”] We’ve refreshed the design of News Universal results by providing more results from the top cluster, unifying the UI treatment of clusters of different sizes, adding a larger font for the top article, adding larger images (from licensed sources), and adding author information.
    • Improvements to results for navigational queries. [launch codename “IceMan5”] A “navigational query” is a search where it looks like the user is looking to navigate to a particular website, such as [New York Times] or [wikipedia.org]. While these searches may seem straightforward, there are still challenges to serving the best results. For example, what if the user doesn’t actually know the right URL? What if the URL they’re searching for seems to be a parked domain (with no content)? This change improves results for this kind of search.
    • High-quality sites algorithm data update and freshness improvements. [launch codename “mm”, project codename “Panda”] Like many of the changes we make, aspects of our high-quality sites algorithm depend on processing that’s done offline and pushed on a periodic cycle. In the past month, we’ve pushed updated data for “Panda,” as we mentioned in a recent tweet. We’ve also made improvements to keep our database fresher overall.
• Live results for UEFA Champions League and KHL. We’ve added live-updating snippets in our search results for the KHL (Russian Hockey League) and UEFA Champions League, including scores and schedules. Now you can find live results from a variety of sports leagues, including the NFL, NBA, NHL and others.
    • Tennis search feature. [launch codename “DoubleFault”] We’ve introduced a new search feature to provide realtime tennis scores at the top of the search results page. Try [maria sharapova] or [sony ericsson open].
    • More relevant image search results. [launch codename “Lice”] This change tunes signals we use related to landing page quality for images. This makes it more likely that you’ll find highly relevant images, even if those images are on pages that are lower quality.
    • Fresher image predictions in all languages. [launch codename “imagine2”, project codename “Suggest”] We recently rolled out a change to surface more relevant image search predictions in autocomplete in English. This improvement extends the update to all languages.
• SafeSearch algorithm tuning. [launch codenames “Fiorentini”, “SuperDyn”; project codename “SafeSearch”] This month we rolled out a couple of changes to our SafeSearch algorithm. We’ve updated our classifier to make it smarter and more precise, and we’ve found new ways to make adult content less likely to appear when a user isn’t looking for it.
    • Tweaks to handling of anchor text. [launch codename “PC”] This month we turned off a classifier related to anchor text (the visible text appearing in links). Our experimental data suggested that other methods of anchor processing had greater success, so turning off this component made our scoring cleaner and more robust.
    • Simplification to Images Universal codebase. [launch codename “Galactic Center”] We’ve made some improvements to simplify our codebase for Images Universal and to better utilize improvements in our general web ranking to also provide better image results.
    • Better application ranking and UI on mobile. When you search for apps on your phone, you’ll now see richer results with app icons, star ratings, prices, and download buttons arranged to fit well on smaller screens. You’ll also see more relevant ranking of mobile applications based on your device platform, for example Android or iOS.
    • Improvements to freshness in Video Universal. [launch codename “graphite”, project codename “Freshness”] We’ve improved the freshness of video results to better detect stale videos and return fresh content.
    • Fewer undesired synonyms. [project codename “Synonyms”] When you search on Google, we often identify other search terms that might have the same meaning as what you entered in the box (synonyms) and surface results for those terms as well when it might be helpful. This month we tweaked a classifier to prevent unhelpful synonyms from being introduced as content in the results set.
    • Better handling of queries with both navigational and local intent. [launch codename “ShieldsUp”] Some queries have both local intent and are very navigational (directed towards a particular website). This change improves the balance of results we show, and helps ensure you’ll find highly relevant navigational results or local results towards the top of the page as appropriate for your query.
    • Improvements to freshness. [launch codename “Abacus”, project codename “Freshness”] We launched an improvement to freshness late last year that was very helpful, but it cost significant machine resources. At the time we decided to roll out the change only for news-related traffic. This month we rolled it out for all queries.
    • Improvements to processing for detection of site quality. [launch codename “Curlup”] We’ve made some improvements to a longstanding system we have to detect site quality. This improvement allows us to get greater confidence in our classifications.
    • Better interpretation and use of anchor text. We’ve improved systems we use to interpret and use anchor text, and determine how relevant a given anchor might be for a given query and website.
    • Better local results and sources in Google News. [launch codename “barefoot”, project codename “news search”] We’re deprecating a signal we had to help people find content from their local country, and we’re building similar logic into other signals we use. The result is more locally relevant Google News results and higher quality sources.
    • Deprecating signal related to ranking in a news cluster. [launch codename “decaffeination”, project codename “news search”] We’re deprecating a signal that’s no longer improving relevance in Google News. The signal was originally developed to help people find higher quality articles on Google News. (Note: Despite the launch codename, this project has nothing to do with Caffeine, our update to indexing in 2010).
    • Fewer “sibling” synonyms. [launch codename “Gemini”, project codename “Synonyms”] One of the main signals we look at to identify synonyms is context. For example, if the word “cat” often appears next to the term “pet” and “furry,” and so does the word “kitten”, our algorithms may guess that “cat” and “kitten” have similar meanings. The problem is that sometimes this method will introduce “synonyms” that actually are different entities in the same category. To continue the example, dogs are also “furry pets” — so sometimes “dog” may be incorrectly introduced as a synonym for “cat”. We’ve been working for some time to appropriately ferret out these “sibling” synonyms, and our latest system is more maintainable, updatable, debuggable, and extensible to other systems.
    • Better synonym accuracy and performance. [project codename “Synonyms”] We’ve made further improvements to our synonyms system by eliminating duplicate logic. We’ve also found ways to more accurately identify appropriate synonyms in cases where there are multiple synonym candidates with different contexts.
    • Retrieval system tuning. [launch codename “emonga”, project codename “Optionalization”] We’ve improved systems that identify terms in a query which are not necessarily required to retrieve relevant documents. This will make results more faithful to the original query.
    • Less aggressive synonyms. [launch codename “zilong”, project codename “Synonyms”] We’ve heard feedback from users that sometimes our algorithms are too aggressive at incorporating search results for other terms. The underlying cause is often our synonym system, which will include results for other terms in many cases. This change makes our synonym system less aggressive in the way it incorporates results for other query terms, putting greater weight on the original user query.
    • Update to systems relying on geographic data. [launch codename “Maestro, Maitre”] We have a number of signals that rely on geographic data (similar to the data we surface in Google Earth and Maps). This change updates some of the geographic data we’re using.
    • Improvements to name detection. [launch codename “edge”, project codename “NameDetector”] We’ve improved a system for detecting names, particularly for celebrity names.
    • Updates to personalization signals. [project codename “PSearch”] This change updates signals used to personalize search results.
    • Improvements to Image Search relevance. [launch codename “sib”] We’ve updated signals to better promote reasonably sized images on high-quality landing pages.
    • Remove deprecated signal from site relevance signals. [launch codename “Freedom”] We’ve removed a deprecated product-focused signal from a site-understanding algorithm.
• More precise detection of old pages. [launch codename “oldn23”, project codename “Freshness”] This change improves detection of stale pages in our index by relying on more relevant signals. As a result, fewer stale pages are shown to users.
    • Tweaks to language detection in autocomplete. [launch codename “Dejavu”, project codename “Suggest”] In general, autocomplete relies on the display language to determine what language predictions to show. For most languages, we also try to detect the user query language by analyzing the script, and this change extends that behavior to Chinese (Simplified and Traditional), Japanese and Korean. The net effect is that when users forget to turn off their IMEs, they’ll still get English predictions if they start typing English terms.
    • Improvements in date detection for blog/forum pages. [launch codename “fibyen”, project codename “Dates”] This change improves the algorithm that determines dates for blog and forum pages.
    • More predictions in autocomplete by live rewriting of query prefixes. [launch codename “Lombart”, project codename “Suggest”] In this change we’re rewriting partial queries on the fly to retrieve more potential matching predictions for the user query. We use synonyms and other features to get the best overall match. Rewritten prefixes can include term re-orderings, term additions, term removals and more.
    • Expanded sitelinks on mobile. We’ve launched our expanded sitelinks feature for mobile browsers, providing better organization and presentation of sitelinks in search results.
    • More accurate short answers. [project codename “Porky Pig”] We’ve updated the sources behind our short answers feature to rely on data from Freebase. This improves accuracy and makes it easier to fix bugs.
    • Migration of video advanced search backends. We’ve migrated some backends used in video advanced search to our main search infrastructure.
    • +1 button in search for more countries and domains. This month we’ve internationalized the +1 button on the search results page to additional languages and domains. The +1 button in search makes it easy to share recommendations with the world right from your search results. As we said in our initial blog post, the beauty of +1’s is their relevance—you get the right recommendations (because they come from people who matter to you), at the right time (when you are actually looking for information about that topic) and in the right format (your search results).
    • Local result UI refresh on tablet. We’ve updated the user interface of local results on tablets to make them more compact and easier to scan.

  • Google Favors Google+ Less In Search Results

    As you may know, Google released a huge list of algorithm changes it made in the month of March. We’ve already looked at what Google had to say about the Panda update, and Google’s increased focus on freshness.

    Another noteworthy change from the list is:

    Better indexing of profile pages. [launch codename “Prof-2”] This change improves the comprehensiveness of public profile pages in our index from more than two-hundred social sites.

    When Google launched Search Plus Your World (with increased Google+ integration), Twitter and Facebook threw respective fits. We documented the back and forth the companies exchanged. Basically, Twitter and Facebook (especially Twitter) weren’t happy that Google was showing Google+ profiles over Twitter profiles for certain queries, even though the Twitter profiles were more popular.

Their complaints were hard to argue with, though one can also see the logic behind Google’s move, as it tries to establish Google and Google+ as one great big network. If you were searching on Facebook for someone’s name, you would expect to get a Facebook profile first. So why not with Google?

    Hint: the answer is that Google is known more as a web search engine, not a social network.

    It appears that Google has remedied this, at least in some cases. Twitter specifically referenced an “@WWE” query when it was complaining. Twitter now shows up ahead of Google+ for “@WWE” (and just “WWE” for that matter).

    We (and others) have also pointed out in the past that Google was showing Mark Zuckerberg’s Google+ page (which he does not use) ahead of his Facebook profile in a Google search for “mark zuckerberg”. Obviously, the Facebook profile (which he does use) is much more relevant. It appears that Google has remedied this as well.

    As a matter of fact, Zuckerberg’s Google+ profile is completely gone from the front page for me now, where it was once the top result. That’s with Search Plus Your World toggled on, mind you. And I even have him in my Circles, so it would actually make sense for the profile to turn up somewhere on the page. Maybe they’ve gone too far in the opposite direction.

    We’re not positive these particular instances were fixed with the aforementioned update from today’s list, but it’s worth noting in general that Google is getting better at this. Twitter and Facebook (and other networks) should be a little happier.

  • New Google Algorithm Changes Continue To Focus On Freshness

Google has been all about freshness lately. In November, the company launched its freshness update, which it said built upon the momentum of Caffeine in getting fresher results into Google.

    It seems as though Google is trying to make up for the lack of realtime search – a void left by the expiration of Google’s deal with Twitter.

    When Google released its monthly list of algorithm changes for January, it was clear that freshness was a major focus. As was the case with February’s list.

    Today, Google released its list for March, and yet again, freshness is mentioned a lot. Here are some changes from the list specifically related to freshness of results:

    High-quality sites algorithm data update and freshness improvements. [launch codename “mm”, project codename “Panda”] Like many of the changes we make, aspects of our high-quality sites algorithm depend on processing that’s done offline and pushed on a periodic cycle. In the past month, we’ve pushed updated data for “Panda,” as we mentioned in a recent tweet. We’ve also made improvements to keep our database fresher overall.

    Fresher image predictions in all languages. [launch codename “imagine2”, project codename “Suggest”] We recently rolled out a change to surface more relevant image search predictions in autocomplete in English. This improvement extends the update to all languages.

    Improvements to freshness in Video Universal. [launch codename “graphite”, project codename “Freshness”] We’ve improved the freshness of video results to better detect stale videos and return fresh content.

    Improvements to freshness. [launch codename “Abacus”, project codename “Freshness”] We launched an improvement to freshness late last year that was very helpful, but it cost significant machine resources. At the time we decided to roll out the change only for news-related traffic. This month we rolled it out for all queries.

More precise detection of old pages. [launch codename “oldn23”, project codename “Freshness”] This change improves detection of stale pages in our index by relying on more relevant signals. As a result, fewer stale pages are shown to users.

Hopefully some of the tweaks will help relevance, because in my experience, freshness has become too strong a signal. Since Google’s Freshness update, I find that recency is often given more credence than relevance. Sometimes the content I’m looking for is older. Not all of the best content on the web happened in the last week.

  • Going From Black Hat To White Hat SEO Doesn’t Mean Google Will Like You

Much of the discussion in the SEO community of late has been related to Google’s efforts to “level the playing field” for mom and pops vs. those with bigger marketing budgets, and comments to this effect made by Matt Cutts at SXSW recently. He indicated that Google is working on things that would make it so people who “over-optimize” their sites don’t necessarily rank better than others who didn’t worry about SEO, but have great content.

    Are black hat SEO tactics worth the risk? Tell us what you think.

To a great extent, Google has been working on these kinds of things for a long time. The Panda update was certainly designed to make content quality matter more, but Google also regularly gives tips about how to optimize your site better and releases lists of algorithmic changes, which practically beg webmasters to try and exploit them. Google, of course, doesn’t take this stance, but when it releases the signals, people pay attention and try to play to them. Why wouldn’t they?

    Google knows this, of course, which is why they won’t release their entire list of signals, let alone talk about how much weight certain signals have compared to others, although if you pay close enough attention, you’ll sometimes catch hints at this too.

    You might say Google sends mixed signals to webmasters. Danny Sullivan asks if Google’s over-optimization penalty is its “jump the shark” moment. He makes the case that it’s more about PR for Google to indicate they’re actively working on making results more relevant.

The whole de-indexing of paid blog/link networks plays into the concept of making over-optimization matter less, though based on Google’s webmaster guidelines, de-indexing those networks would seem to have always fit the company’s policy.

    When you play the black hat, or even gray hat game, you’re taking a big risk of being dealt a damaging penalty. Google didn’t even hesitate to penalize its own site for violating guidelines (at least after they were called out on it), which may have even cost Chrome some browser market share.

    Going white hat after playing it at a darker shade in the past isn’t necessarily going to help your rankings either though, as one blogger indicated in a recent post at SEOBullshit:

    I did paid links, paid reviews, and never, ever did any shit like “cloaking”, “spam”, or “stuffing.” Hence, the “grey” hat campaign type. I had awesome content. I had a crawlable site. It was perfect in every way. I used paid links and reviews to scream at GoogleBot, “Hey, notice me! I’m right here! I have killer content and reputable sites link to it.” The results were great. The money. Terrific. I left the competition scratching their heads since my site was HTTPS, it was hard to reverse engineer as most link-finding tools couldn’t really find my backlinks.

    However, the stress of running a grey-hat campaign eventually wears on you and you long for the peace of a white hat campaign. So, I hatched a plan to wean my site from grey and pray that the results weren’t too bad. I expected a 15-25% drop in SERPS and traffic which I could then recover by getting a big relevant, content piece linked up to the pages where I removed the TLA’s.

    Fucking failure. Total and monstrous failure.

He goes on to say that his total traffic drop was 72.5%.

    Every time Google makes big adjustments to its algorithm, sites pay the price. Sometimes that price is deserved, and sometimes it’s not. I find that often, people tend to think they didn’t deserve to lose their rankings. Even with the latest Panda refresh, we had sites telling us about ranking declines.

    The intro of a recent Entrepreneur article sums up the conundrum of the small business perfectly: “As a small business owner using the web to reach customers, you’ve surely been implementing search engine optimization tactics to make sure your site turns up high in web searches. But just when you might feel like you’re starting to get the hang of this SEO thing, it appears that search giant Google might start penalizing websites that are over-optimized.”

We understand that there are plenty of white hat SEO tactics that Google not only is OK with, but encourages. However, most people simply don’t know what SEO even is. Matt Cutts himself shared results this week from a survey he conducted, finding that only one in five people in the U.S. has even heard of SEO.

It’s not surprising that sites would be tempted to go for the darker hat techniques. But as Google continues on this new (same) path of leveling the playing field, those techniques may be more like playing with fire than ever. And once you start engaging in SEO’s dark arts, you may have a hard time returning to the lighter side, should you ever choose to do so.

    Have you ever been helped or hurt by using black hat SEO tactics? Let us know in the comments (you don’t have to use your real name).

  • Should You Block Google From Crawling Your Slower Pages?

Google’s head of web spam, Matt Cutts, put out a new video discussing site speed’s impact on rankings. This is not the first time Cutts has addressed the issue, but it’s a somewhat different take than we’ve seen before, as it’s in direct response to the following user-submitted question:

You mentioned that site speed is a factor in ranking. On some pages, our site uses complex queries to return the user’s request, giving a slow pagetime. Should we not allow Googlebot to index these pages to improve our overall site speed?

    “I would say, in general, I would let Googlebot crawl the same pages that users see,” says Cutts. “The rule of thumb is this. Only something like 1 out of 100 searches are affected by our page speed mechanism that says, things that are too slow rank lower. And if it’s 1 out of a 100 searches, that’s 1 out of roughly 1,000 websites. So if you really think that you might be in the 1 out of 1,000, that you’re the slowest, then maybe that’s something to consider.”

    “But in general, most of the time, as long as your browser isn’t timing out, as long as it’s not starting to be flaky, you should be in relatively good shape,” he continues. “You might, however, think about the user experience. If users have to wait 8, 9, 10, 20 seconds in order to get a page back, a lot of people don’t stick around that long. So there’s a lot of people that will do things like cache results and then compute them on the fly later. And you can fold in the new results.”

“But if it’s at all possible to pre-compute the results, or cache them, or do some sort of way to speed things up, that’s great for users,” Cutts says. “Typically, as long as there is just a few number of pages that are very slow or if the site overall is fast, it’s not the kind of thing that you need to worry about. So you might want to pay attention to making it faster just for the user experience. But it sounds like I wouldn’t necessarily block those slower pages out from Googlebot unless you’re worried that you’re in one of those 1 out of a 1,000, where you’re really, really the outlier in terms of not being the fastest possible site.”
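For reference, the kind of blocking the question describes amounts to a couple of lines in robots.txt. The sketch below is purely illustrative – the /reports/ path is a made-up stand-in for a site’s slow, query-heavy pages – and, per Cutts’s advice, most sites shouldn’t need it:

    # Hypothetical robots.txt: keep Googlebot away from a slow, query-heavy section.
    # The /reports/ path is a placeholder, not a recommendation.
    User-agent: Googlebot
    Disallow: /reports/

    # Every other crawler keeps full access to the site.
    User-agent: *
    Disallow: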

    In November, we referenced another video Cutts did talking about page speed, where he also dropped the “1 out of 100 searches” stat. He said basically not to overly stress about speed as a ranking factor. Both the new video and that video were actually uploaded to YouTube in August, so this advice is already older than it appears. Today’s video, however, was just made public by Google, so it stands to reason that the advice from the company remains the same.

  • Google On How A Lot Of Your Links Don’t Count

Google has over 200 signals it uses to rank results, but its legendary PageRank algorithm, which is based on links, has led a lot of people to worry about links way too much. That’s not to say quality links aren’t still important, but just because you have a whole bunch of links, it doesn’t mean your site is going to rank well.

    Google’s Matt Cutts posted an interesting webmaster help video under the title: “Will Google Provide More Link Data For All Sites?” It’s Cutts’ response to the user-submitted question:

    In the wake of the demise of Yahoo Site Explorer, does Google Webmaster Tools plan to take up the reigns this product once provided to SEO’s everywhere?

    Cutts responds, “What I think you’re asking is actually code for ‘will you give me a lot of links?’ and let me give you some context about Google’s policies on that. I know that Yahoo Site Explorer gave a lot of links, but Yahoo Site Explorer is going away. Microsoft used to give a lot of links. And they saw so much abuse and so many people hitting it really, really hard that I think they turn that off so that people wouldn’t be tempted to just keep pounding them and pounding their servers.”

    “So our policy has been to give a subsample of links to anybody for any given page or any given site– and you can do that with a link colon command–and to give a much more exhaustive, much more full list of links to the actual site owner,” says Cutts. “And let me tell you why I think that’s a little bit more of a balanced plan. Yahoo Site Explorer, they were giving a lot of links, but they weren’t giving links that Google knew about. And certainly, they don’t know which links Google really trusts. And so I think a lot of people sometimes focus on the low-quality links that a competitor has, and they don’t realize that the vast majority of times, those links aren’t counting.”

    “So, for example, the New York Times sent us a sample of literally thousands of links that they were wondering how many of these count because they’d gotten it from some third party or other source of links,” he adds. “And the answer was that basically none of those links had counted. And so it’s a little easy for people to get obsessed by looking at the backlinks of their competitors and saying, ‘oh, they’re doing this bad thing or that bad thing.’ And they might not know the good links. And they might not know that a lot of those links aren’t counted at all.”

“So I also think that it’s a relatively good policy because you deserve to know your own links,” he continues. “I think that’s perfectly defensible. But it doesn’t provide that much help to give all the links to a competitor site unless you’re maybe an SEO, or you’re a competitor, or something along those lines. So for somebody like a librarian or a power searcher or something like that, using link colon and getting a nice sample, a fair fraction of links to a particular page or to a particular website, is a very good policy.”

    “I think that’s defensible, but I don’t expect us to show all the links that we know of for all the different sites that we know of, just because people tend to focus on the wrong thing,” he concludes. “They don’t know which links really count. So they tend to obsess about all the bad links their competitors have and only look at the good links that they have. And it’s probably the case that surfacing this data makes it so that you’re helping the people who really, really, really want to try to get all their competitors backlinks or whatever. And I just think it’s a little bit more equitable to say, OK, you’re allowed to see as many of the backlinks as we can give you for your own site, but maybe not for every other site. You can get a sampling, so you can get an idea of what they’re like, but I wouldn’t expect us to try to provide a full snapshot for every single site.”
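As a point of reference, the “link colon command” Cutts mentions is simply the link: search operator: querying [link:example.com] on Google returns the sampled subset of backlinks he describes (example.com is just a stand-in for whatever site you want to check), while the fuller list he refers to is only available to the verified site owner in Webmaster Tools.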

    Links obviously aren’t everything, and if you follow Google’s changes, it’s easy to see that other signals have been given a lot more significance in recent memory. This includes things like content quality, social signals and freshness. If you’re that worried about the number of links you have, you’re living in the wrong era of search.

    Granted, links have value beyond search ranking. They still provide more potential referrals to your site, but in terms of Google, the search engine is moving more and more away from the traditional 10 organic links anyway, with more personalized results, fresher results, blended (universal search) results, and more direct answers.

  • What The U.S. Search Market’s Been Up To Since 2008 [Infographic]

    On the Web, search is king. Even though a new Google Consumer Survey found that only 1 in 5 Americans knows what SEO is, just about everybody in the Web content industry not only knows about search engine optimization, but also formats and titles content in an effort to get noticed by Google’s complex algorithms.

    You can be white hat, black hat, or some shade of grey in the search world, but no matter your strategy, you know that there are a lot of page views (and consequently digital clout, ad revenue, and/or product sales) riding on your making it to the top of the search results pile. Google, of course, is king of the mountain, while Microsoft’s Bing recently surpassed Yahoo as the distant second place holder.

Here’s an infographic from Statista that describes the search market in the United States. Search queries handled by the five major search engines are up 68% since January 2008, to some 17 billion per month today.

    [Statista; Image Source: ThinkStock]

  • Matt Cutts: 1 In 5 People In U.S. Have Heard Of SEO

    As you may know, Google launched a new product today called Google Consumer Surveys. Googlers are certainly hyped up about it.

Google’s head of web spam, Matt Cutts, used the product to put out his own survey about SEO, in which he determined that 1 in 5 people in the U.S. have heard of SEO.

    “In my world, everyone I talk to has heard of search engine optimization (SEO),” he says on Google+. “But I’ve always wondered: do regular people in the U.S. know what SEO is? With Google’s new Consumer Surveys product, I can actually find out. I asked 1,576 people ‘Have you heard of ‘search engine optimization’?”

    “It turns out only 1 in 5 people (20.4%) in the U.S. have heard of SEO!” he says.

    Matt Cutts SEO Survey

    “The survey also turned up an interesting gender difference: almost 25% of men have heard of SEO, but only about 16% of women have,” Cutts notes. “Doing this sort of market research in the past would have been slow, hard, and expensive. Asking 1,500 people a simple question only costs about $150.”

    Matt Cutts SEO Survey

The survey may cover only a small set of people compared to the actual population of the country, but my guess is that the results aren’t far off. In my experience, outside of work, most people have no idea what SEO is.

That’s probably one reason that Google wants to level the playing field in search rankings when it comes to “over-optimized” content. But that’s a whole other discussion.

  • Are You Surprised That Google Doesn’t Like Paid Blog Networks?

Google has been cracking down a great deal over the past year on lesser-quality content littering its search results – probably more than at any other time in the search engine’s history. Obviously, to those who follow the search industry, the Panda update has been leading the charge in this area.

    Google has been de-indexing blog networks that webmasters have essentially been paying to get links. Do you think this will improve Google’s results? Share your thoughts in the comments.

One way that sites, including some with lesser-quality content, have been able to manipulate Google’s algorithm is through paid links and linking “schemes”. Google has long had policies against these things, and has not hesitated to penalize sites it busted. See the JC Penney and Overstock.com incidents from last year for a couple of examples (not necessarily the best examples of low quality, but of getting busted). Google even penalized its own Chrome landing page after paid links set up by a marketing firm were discovered.

    Penalties like these can greatly hurt sites. There was talk that Chrome’s share of the browser market was impacted by that penalty, and that’s Google’s own property. Overstock blamed Google for its ugly financials when it reported its earnings earlier this month.

    If such penalties can have such an impact on brands like these, think what they could do to lesser-known brands.

Google is now cracking down on blog networks, which have added sites to their networks in exchange for fees. BuildMyRank, in particular, has received a lot of attention.

    Build My Rank

    The site posted a message about it recently:

    On a daily basis, we monitor our domain network to check metrics like page rank, indexed pages, etc. As with any link-building network, some de-indexing activity is expected and ours has been within a permissible range for the past two years. Unfortunately, this morning, our scripts and manual checks have determined that the overwhelming majority of our network has been de-indexed (by Google), as of March 19, 2012. In our wildest dreams, there’s no way we could have imagined this happening.

    It had always been BMR’s philosophy that if we did things a bit different from other networks, we would not only have a better quality service to offer our users, but a longer life in this fickle industry. Sadly, it appears this was not the case.

    In case you’re not familiar with how BMR actually works, it essentially sells link juice. In the “how it works” section, it explains that the backlinks it helps you build “help add extra link juice and added indexing speed”. This comes at prices up to $400/month. Here’s their video overview:

Word throughout the SEO community is that other blog networks have been getting de-indexed as well. Meanwhile, webmasters with links from these networks have been getting messages from Google’s Webmaster Tools. SEOmoz shares a message from Google Webmaster Tools that some webmasters have received:

    Dear site owner or webmaster of http://example.com/,

    We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines.

    Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes.

    We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results.

    If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request.

    If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

    Sincerely,

    Google Search Quality Team

    Is any of this really a surprise? If you’re paying a blog network, is this not basically paying for links? The most surprising thing is that sites have been getting away with it for so long, without facing the wrath of Google. That’s damn amazing, really.

    “Don’t participate in link schemes designed to increase your site’s ranking or PageRank,” Google says in its Webmaster Guidelines. “In particular, avoid links to web spammers or ‘bad neighborhoods’ on the web, as your own ranking may be affected adversely by those links.”

    It’s pretty clear.

Internet marketer Jennifer Ledbetter (otherwise known as PotPieGirl) wrote a fantastic article on this whole ordeal. “Let’s face it and be real,” she writes. “If we’ve used any of these services, we know exactly WHY we use them, don’t we? We use them to get the in-content links to help our web pages rank better. Yes, we use them to manipulate Google rankings. We all know what we’re doing – we know Google frowns on that (ok, totally HATES that), but we do it anyway. So, please – no whining about how this isn’t ‘fair’, ok?”

    SEOmoz CEO Rand Fishkin had some helpful advice on Twitter:

If you’ve been affected by Google’s recent link penalties, disclosing the details of how you acquired the links can speed up reconsideration

Perhaps this is how webspam intends to fight the more underground/private link manipulation schemes

@LukeyG28 Google’s shockingly good at knowing when spam’s been built by you vs. others; I wouldn’t sweat it.

@randfish getting a reply from the main man awesome! – although I have to disagree, if it’s a authority website yes, new website no dm me

@LukeyG28 I tried recently to “bowl” a few small sites out of Google (using some black hat friends’ advice/networks) but they stayed fine

@randfish those friends wernt blackhat enough lol. Ive been trialing it on some of my old sites and there dropping like flys.msg me for info

@LukeyG28 Like I said, I suspect there’s some footprints of those sites that make G more apt to allow for link penalties

There has been a lot of discussion from webmasters worried that competitors will be able to hurt their sites by pointing bad links at their content, and the general consensus, as it has been for years, is that if you get good links, it should counter the bad. Barry Schwartz at Search Engine Roundtable points to a quote from Google saying, “Our algorithms are pretty complex, it takes more than a handful of bad links to sway their opinion of a website. Even if Webmaster Tools shows a million links, then that’s not going to change things if those links are all ignored for ranking purposes.”

    According to Google, you really shouldn’t be focusing on the number of links you have anyway. Matt Cutts put out a video last week talking about how Google doesn’t count a lot of your links.

    “I think a lot of people sometimes focus on the low-quality links that a competitor has, and they don’t realize that the vast majority of times, those links aren’t counting,” Cutts said. “So, for example, the New York Times sent us a sample of literally thousands of links that they were wondering how many of these count because they’d gotten it from some third party or other source of links, and the answer was that basically none of those links had counted. And so it’s a little easy for people to get obsessed by looking at the backlinks of their competitors and saying, ‘oh, they’re doing this bad thing or that bad thing.’ And they might not know the good links. And they might not know that a lot of those links aren’t counted at all.”

    It’s getting to be about time for Google to announce its monthly list of algorithm changes, but in last month’s list, one of the changes was “Link Evaluation”.

    “We often use characteristics of links to help us figure out the topic of a linked page,” the company said. “We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.”

    While links are the foundation of PageRank, it seems to me that links have become less and less important in search visibility altogether. Don’t get me wrong. Links matter. Good links are great. Links from sources Google thinks are great are still great, but just having a bunch of inbound links won’t get you very far if they’re not significant links.

    Search visibility these days is much more about who’s sharing/discussing your content (especially on Google+), who you are as an author, how fresh your content is, and how in-depth it is compared to your competition. This is of course simplifying things a great deal (Google has over 200 signals), but if you consider these things more than just chasing meaningless links, not only will you likely do better in search, you will avoid getting a destructive penalty from Google.

All of that said, you may be spending too much time obsessing over search in general, and would do better to consider other means of traffic. How dependent do you really want to be on an ever-changing algorithm? Expanding upon your social strategy is likely to pay off much better, and thankfully, the better you do in social channels, the better you’re likely to do in search.

    Should Google be penalizing blog/link networks? Are links as important as they once were? Tell us what you think.

  • Google: Blocking Javascript, CSS May Be Hurting Your Rankings

If you’re blocking Google from crawling your javascript and CSS, you may be hurting your own search rankings. It’s not that using javascript and CSS will necessarily make you rank better, but if you don’t let Google crawl them, you’re not giving Google the entire picture of what’s on your page.

    Matt Cutts posted a new webmaster help video, but this time, instead of responding to a user-submitted question like he usually does, he provides what he refers to as a public service announcement.

“If you block Googlebot from crawling javascript or CSS, please take a few minutes and take that out of the robots.txt and let us crawl the javascript,” says Cutts. “Let us crawl the CSS, and get a better idea of what’s going on on the page.”

“A lot of people block it because they think, ‘Oh, this is going to be resources that I don’t want to have the bandwidth or something,’ but Googlebot is pretty smart about not crawling stuff too fast, and a lot of people will do things like, they’ll check for Flash, but then they’re including some javascript, and they don’t realize that including that javascript – the javascript is blocked, and so we’re not able to crawl the site as effectively as we would like,” he says.

    “In addition, Google is getting better at processing javascript,” he continues. “It’s getting better at things like looking at CSS [to] figure out what’s important on the page, so if you do block Googlebot, I would ask: please take a little time, go ahead and remove those blocks from the robots.txt so you can let Googlebot in, get a better idea of what’s going on with your site, get a better idea of what’s going on with your page, and then that just helps everybody in terms of if we can find the best search results, we can return them higher to users.”

“So thanks if you can take the chance. I know it’s kind of a common idiom for people to just say, ‘Oh, I’m gonna block javascript and CSS,’ but you don’t need to do that now, so please, in fact, actively let Googlebot crawl things like javascript and CSS, if you can.”
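To put Cutts’s request in concrete terms, the pattern he’s describing usually looks something like the sketch below (the directory names are illustrative, since every site organizes its assets differently). Deleting the Disallow lines is all it takes to let Googlebot fetch those files again:

    # A common robots.txt pattern that hides javascript and CSS from crawlers.
    # /js/ and /css/ are placeholder paths for wherever your assets live.
    User-agent: *
    Disallow: /js/
    Disallow: /css/

    # Cutts’s advice: remove those Disallow lines (or never add them) so Googlebot
    # can render the page the way users see it.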

  • Google Panda Update Gets Another Refresh, Affecting 1.6% Of Queries

    Google has pushed out another Panda Update. The company tweeted about it as the weekend got underway, saying that about 1.6% of queries are “noticeably affected”.

Panda refresh rolling out now. Only ~1.6% of queries noticeably affected. Background on Panda: http://t.co/Z7dDS6qc

    The tweet links to the original announcement about the update (from before the public even knew it by the name Panda). Given that Google pointed to this article, it might be worth stepping back, and revisiting Google’s own explanation of the update.

    The post was from Google’s Matt Cutts and Amit Singhal. “Our goal is simple: to give people the most relevant answers to their queries as quickly as possible,” the post began. “This requires constant tuning of our algorithms, as new content—both good and bad—comes online all the time.”

    “Many of the changes we make are so subtle that very few people notice them,” it continued. “But in the last day or so we launched a pretty big algorithmic improvement to our ranking—a change that noticeably impacts 11.8% of our queries—and we wanted to let people know what’s going on. This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.”

    Obviously, many algorithmic changes have been made since then, including quite a few to Panda itself. How much do you think Google’s results have improved over that time? Are they better?

    “We can’t make a major improvement without affecting rankings for many sites,” the post went on. “It has to be that some sites will go up and some will go down. Google depends on the high-quality content created by wonderful websites around the world, and we do have a responsibility to encourage a healthy web ecosystem. Therefore, it is important for high-quality sites to be rewarded, and that’s exactly what this change does.”

    Many, many sites did indeed go down. Some clearly deserved to do so, but for others, this was questionable. Granted, some were able to make recoveries, and Google admitted that the algorithm was not perfect.

    For most of the time since Panda initially launched, Google had one main help thread, where webmasters could vent their frustration and state their claims as to why they felt their site was unjustly penalized by the algorithm. Google made it clear that they were reading the thread. Earlier this month, however, the thread got split up, though the company still encourages posting and finding old posts via search. Things just might not be as convenient as they were under one centralized thread.

Prior to the new Panda refresh, as tweeted by Google, the last Panda update, in February, improved how Panda interacts with Google’s indexing and ranking systems. Google said it was “more integrated into our pipelines”. Google also said it was made “more accurate and more sensitive to recent changes on the web.”

    View our extensive library of Panda Update coverage here.

    Image credit: Rick Bucich

  • New Site Speed Reports In Google Analytics

    Google announced today that it has released a new Site Speed report, with “all the key metrics” in an easy-to-read Overview report.

    “The Overview report provides an at-a-glance view of essential information for measuring your site’s page loading metrics: Avg. Page Load Time by Browser, Country/Territory, and Page,” explains Google’s Mustafa M. Tikir. “Plus you can compare your site’s average performance over time to forecast trends and view historical performance. All of these tools can help you identify where your pages may be underperforming and adjust so more visitors land on your site instead of waiting in frustration or leaving.”

    “Previously there was only one Site Speed report, this has been renamed to ‘Page Timings’”, adds Tikir. “On the Page Timings report, you can view your site’s load times in three ways: use the Explorer tab to explore average load time across dimensions, use the Performance tab to see how the load times break down by speed ranges, or use the Map Overlay tab to see how the load times breakdown by geography.”

    Image: Site Speed Overview report in Google Analytics

    Google notes that it has also updated the Intelligence Reports to include average site load times and all Page Timings metrics.

    In addition to all of this, sites with fewer than 10,000 visits per day can increase the site speed sample rate up to 100% and get full samples for page load time.
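    Here’s a rough sketch of what that looks like in the tracking code, assuming the classic asynchronous ga.js snippet and its _setSiteSpeedSampleRate method; the property ID below is a placeholder, so check Google’s current Analytics documentation before relying on this.

    ```typescript
    // Minimal sketch (not official Google code): raising the Site Speed sample rate
    // via the classic asynchronous ga.js command queue. "UA-XXXXX-Y" is a placeholder
    // property ID, and the standard ga.js loader snippet is assumed to be installed
    // on the page separately.
    const _gaq: any[] = (window as any)._gaq || [];
    (window as any)._gaq = _gaq;

    _gaq.push(['_setAccount', 'UA-XXXXX-Y']);
    // Sample 100% of pageviews for load-time data instead of the small default
    // sample; per the note above, this is intended for lower-traffic sites.
    _gaq.push(['_setSiteSpeedSampleRate', 100]);
    _gaq.push(['_trackPageview']);
    ```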

    Note that speed is now a ranking factor in Google.

    These aren’t the only improvements Google announced for Analytics this week. On Tuesday, the company announced that new Social Reports are on the way.

  • Google Webmaster Hangouts: 2 You Can Join In The Near Future

    Google often does Webmaster Central Hangouts on Google+. This gives webmasters an opportunity to connect with Googlers and learn valuable tips about how they can get more out of their sites, and out of Google.

    Google’s Pierre Far announced a couple of upcoming hangouts for Tuesday, March 27, and Wednesday, March 28. Both begin at 2PM UK time, and last for an hour. Far writes:

    US-based webmasters: please be careful with the time difference for these as Europe would have switched to summer time by then!

    Where: Right here on Google+. It works best with a webcam + headset. You can find out more about Hangouts and how to participate at http://goo.gl/k6aMv

    Topic: Anything webmaster-related: Webmaster Tools, Sitemaps, crawling, indexing, duplicate content, websites, web search, etc.

    To join, you obviously need a Google+ account. The thing is, hangouts are limited to just 10 participants, but people tend to come and go, so even if you can’t immediately get in, you might be able to squeeze in sometime within the hour. It’s a chance to get some direct advice about your site from Google, so depending on how pressing your issue is, it may be worth waiting to get in.

  • Google Will Need Time To Learn About How To Rank New TLDs

    We recently talked about a post Google’s Matt Cutts made to Google+ discussing how Google will handle the new TLDs. He referenced a blog post talking about how the new TLDs will be “automatically favoured by Google over a .com equivalent,” which Cutts said is “just not true.”

    He has now put out a new video talking about how Google will treat the TLDs, in response to the user-submitted question:

    How will Google treat the new nTLDs where any Top Level Domain is possible e.g. for corporations eg. www.portfolio.mycompanyname regarding influence on ranking and pagerank?

    “Well we’ve had hundreds of different TLDs, and we do a pretty good job of ranking those,” says Cutts. “We want to return the best result, and if the best result is on one particular TLD, then it’s reasonable to expect that we’ll do the work in terms of writing the code and finding out how to crawl different domains, where we are able to return what we think is the best result according to our system.”

    “So if you are making Transformers 9, and you want to buy the domain transformers9.movie or something like that, it’s reasonable to expect that Google will try to find those results, try to be able to crawl them well, and then try to return them to users.”

    “Now there’s going to be a lot of migration, and so different search engines will have different answers, and I’m sure there will be a transition period where we have to learn or find out different ways of what the valid top level domains are, and then if there’s any way where we can find out what the domains on that top level domain are,” he says. “So we’ll have to explore that space a little bit, but it’s definitely the case that we’ve always wanted to return the best result we can to users, and so we try to figure that out, whether it’s on a .com, or a .de, or a dot whatever, and we’ll try to return that to users.”

    Cutts also put out another new webmaster video talking about why you shouldn’t be obsessing over your link numbers.

  • Google Referrals To Get Even More Mysterious

    Last fall, Google launched encrypted search (via SSL) as the default setting for signed-in users, expanding the feature worldwide earlier this month. The number of encrypted searches may soon go up.

    Webmasters, SEOs and marketers haven’t been entirely thrilled with the whole thing, because with encrypted search, much of the Google referral data in Google Analytics is now marked as “not provided”. WebProNews talked to several SEO professionals about the changes last fall, and they expressed their discontent.

    Christopher Soghoian at the blog Slight Paranoia first spotted that Mozilla was experimenting with Google’s SSL search in Firefox. Danny Sullivan at Search Engine Land then received a statement from Johnathan Nightingale, the Director of Firefox Engineering, who said that Mozilla is testing using SSL for built-in Google searches, and that if no issues are uncovered, it will ship to all Firefox users (after going through the Aurora and Beta channels). That would include non-English versions of Firefox, by the way.

    This is particularly significant given that Google and Mozilla recently renewed their deal to keep Google the default search in Firefox.

    If encrypted search were turned on by default for searches performed from the Firefox search box, the number of “not provided” referrals would increase tremendously, given the popularity of the browser. Barry Schwartz at Search Engine Roundtable makes the point that Google could easily add this to Chrome at some point as well. And why wouldn’t it, if the company feels this is the search experience that is best for users? Given all the privacy concerns that constantly circulate around Google’s practices, this is one area where Google could put people more at ease, even if SEOs, webmasters and marketers aren’t huge fans.

    There could be similar issues with Internet Explorer as well if users set their default search to Google, which, given Google’s share of the search market, many are still likely to do.

    The point is, for those keeping up with their analytics, those Google referrals might become even more mysterious if encrypted search is expanded at the browser level, which appears to be about to happen with Firefox.
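    For those wondering what this looks like mechanically, here is a rough, hypothetical sketch of how a site’s own referrer processing might end up labeling these visits. It simply assumes, as described above, that clicks from Google’s encrypted search arrive without a usable query (q) parameter in the referrer; the helper and its field names are invented for illustration and say nothing about Google Analytics’ internals.

    ```typescript
    // Hypothetical sketch: how a site's own log or referrer processing might label
    // visits once Google strips the search query from referrers. This is NOT how
    // Google Analytics is implemented; it just illustrates why the keyword shows up
    // as "(not provided)".
    interface ReferralInfo {
      source: string;
      keyword: string; // "(not provided)" when the query has been withheld
    }

    function classifyReferrer(referrer: string): ReferralInfo {
      let url: URL;
      try {
        url = new URL(referrer);
      } catch {
        return { source: '(direct)', keyword: '(not set)' };
      }

      // Non-Google referrers: just record the referring host.
      if (!/(^|\.)google\./.test(url.hostname)) {
        return { source: url.hostname, keyword: '(not set)' };
      }

      // Google referrer: the keyword is only available if a q= parameter survives.
      const q = url.searchParams.get('q');
      return {
        source: 'google / organic',
        keyword: q && q.trim() !== '' ? q : '(not provided)',
      };
    }

    // An encrypted-search click arriving without a usable query parameter:
    console.log(classifyReferrer('https://www.google.com/url?sa=t&rct=j'));
    // -> { source: 'google / organic', keyword: '(not provided)' }
    ```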

  • Yandex Offers Search Refiners To Help Achieve Search Goals

    Yandex now offers search refiners to help web users achieve their search goals. The new functionality appears right below Yandex’s search bar in response to unspecified queries, helping users instantly refine their searches. Those searching for “blueberry”, for instance, could be looking for blueberry recipes, or could be interested in the health benefits or nutritional value of blueberries. Now, with a single click, they can narrow the results to exactly what they are looking for.

    Unspecified queries currently represent about 20% of all searches on Yandex. In response to these unspecified queries, Yandex’s search engine now offers options for users to choose from. Users looking for “Charlie Chaplin”, for instance, can click on one of the search-refining options suggested by Yandex: bio, photo, video, film or quote.

    Search refiners are powered by Yandex’s proprietary technology, Spectrum, launched in December 2010. This technology enables Yandex’s search engine to determine users’ possible search goals and offer results specific to each of these goals. Spectrum includes in its search results links to web documents belonging to different user-intent categories, based on how popular these categories are. The appearance of search refiners on Yandex Search is another step in Spectrum’s development.

    “Any search system needs to know what it is exactly that the user is looking for and help them find it,” says Elena Gruntova, head of the Intent-based Search Program at Yandex. “This is what Yandex has been doing for the past fifteen years, starting with the word-form sensitive search engine and continuing with vertical search results from our own services in 2000. The launch of search refiners is yet another milestone on this road.”

    Intent-based search, which understands users’ needs and helps them attain their goals, is one of Yandex’s key priorities in 2012. Yandex intends to launch other products as part of its Intent-based Search Program in the near future.

  • SEO DOs And DON’TS According To Google: Mixed Signals?

    Google is talking a lot about SEO these days. In a recent webmaster discussion at SXSW, Google’s Matt Cutts spoke about some changes Google is working on that would seem to make SEO matter less, in that sites with good, quality content that don’t do a lot of SEO could potentially rank just as well as, or better than, bigger sites with bigger SEO budgets and a lot of SEO tactics implemented. The whole thing appears to be more about Google getting better at not helping sites just because they employ a lot of grey hat/borderline black hat tactics. Google has always tried to do this, but based on what Cutts said, it sounds like they’re about to get better at it.

    Changes to Google’s algorithm have the ability to make or break businesses. Google is sending out the signal that you should worry less about the current SEO trends, and more about producing great content, and that they’re “leveling the playing field” for sites that don’t pay as much attention to SEO. Obviously great content is a positive, but at the same time, Google is showing us each month all of the changes it is making, and all the while, providing tips about how to do certain SEO things better. Is Google sending mixed signals? Just how much should webmasters worry about optimization? Share your thoughts in the comments.

    Google Changes To Come

    WebProNews spoke with former Googler and Google Webmaster Central creator Vanessa Fox about it, after she wrote her own blog post, sharing her thoughts about Google’s approach to SEO. In her post, she wrote, “Some are worried that Google will begin to penalize sites that have implemented search engine optimization techniques. My thoughts? I think that some site owners should worry. But whether or not you should depends on what you mean by search engine optimization.”

    “Matt talked about finding ways to surface smaller sites that may be poorly optimized, if, in fact, those sites have the very best content,” she said in the post. “This is not anything new from Google. They’ve always had a goal to rank the very best content, regardless of how well optimized or not it may be. And I think that’s the key. If a page is the very best result for a searcher, Google wants to rank it even if the site owner has never heard of title tags. And Google wants to rank it if the site owner has crafted the very best title tag possible. The importance there is that it’s the very best result.”

    There has been a lot of discussion about it in the SEO community, and there will no doubt be plenty around SES New York this week. Some of the talk has been blown out of proportion, and Cutts appears to feel that the press has contributed to this. For the record, when we first reported on it, we linked to the full audio from the panel, as Cutts provided, and since then, he’s linked to the full transcript for those who don’t have time to listen to an hour’s worth of audio. We’ve also pointed to this in previous coverage. Cutts seems to have given his seal of approval to Fox’s take on the whole thing:

    Rob Snell did a full transcript of the recent #sxsw session with Danny Sullivan, Duane Forrester, & me: http://t.co/RCGR99Ff 21 hours ago via Tweet Button

    @mattcutts ah thanks! That might come in useful against the press who are taking some quotes WAY out of context. 21 hours ago via Osfoora for Mac

    @yoast yup, totally agree. Vanessa did a good write up too. 16 hours ago via Twitter for Android

    Following is a snippet from our previous article, discussing the Google changes with Fox, because it’s highly relevant to the larger story:

    If you’ve listened to or read what was said, you’ll notice that the whole thing was in response to a question about mom and pops, which might make you wonder if brand is a significant part of what’s at play.

    “I don’t think it’s about just mom and pop vs. big brands,” Fox tells WebProNews. “Lots of big brands don’t know the first thing about SEO. I think (total guess on my part) the sites that will be negatively impacted are those that focus on algorithms and build content/sites based on the things what they think the algorithms are looking for. The kind of sites where someone didn’t say ‘I want this page to rank for query X. How can this page best answer what the searcher is asking about X’ but instead said ‘I want this page to rank for query X. How many times should I repeat X in my title, heading, content on the page, internal links…”

    “I think it’s still useful (and not negative) to make sure the words that searchers are using are on the page, but some sites go well beyond this and get so caught up in what they think the algorithms are doing that they forget to make sure the content is useful,” she adds.

    “As far as sites that will see a positive from this, I think it will likely be both small sites (B&B in Napa that titles their home page ‘home’ vs. an affiliate site that sells wine gift baskets) and large brands (sites that use a lot of Flash),” says Fox. “I think foundational SEO practices (like those I describe in my article) will continue to be beneficial for sites.”

    When she talks about SEO in her article, by the way, she says she’s talking about “using search data to better understand your audience and solve their problems (by creating compelling, high-quality content about relevant topics to your business)” and “understanding how search engines crawl and index sites and ensuring that your site’s technical infrastructure can be comprehensively crawled and indexed.”

    Interestingly, though Google always puts out webmaster tips and videos, there seem to have been quite a few nuggets making their way out of the company’s blogs and YouTube channels over the past week or so – the time since the SXSW session took place.

    Last week, for example, Google’s Developer Programs Tech Lead Maile Ohye talked about Pagination and SEO, complete with a 37-page slideshow.

    In fact, it looks like this might be part of a new series of SEO tips from Ohye, as another one has come out about SEO mistakes and “good ideas”:

    SEO DOs And DON’TS, According To Google

    According to Google, these are some things you should not do in your SEO efforts:

    1. Having no value proposition: Try not to assume that a site should rank #1 without knowing why it’s helpful to searchers (and better than the competition 🙂).

    2. Segmented approach: Be wary of setting SEO-related goals without making sure they’re aligned with your company’s overall objectives and the goals of other departments. For example, in tandem with your work optimizing product pages (and the full user experience once they come to your site), also contribute your expertise to your Marketing team’s upcoming campaign. So if Marketing is launching new videos or a more interactive site, be sure that searchers can find their content, too.

    3. Time-consuming workarounds: Avoid implementing a hack rather than researching new features or best practices that could simplify development (e.g., changing the timestamp on an updated URL so it’s crawled more quickly instead of easily submitting the URL through Fetch as Googlebot).

    4. Caught in SEO trends: Consider spending less time obsessing about the latest “trick” to boost your rankings and instead focus on the fundamental tasks/efforts that will bring lasting visitors.

    5. Slow iteration: Aim to be agile rather than promote an environment where the infrastructure and/or processes make improving your site, or even testing possible improvements, difficult.

    On the flipside, this is what Google says you should do:

    1. Do something cool: Make sure your site stands out from the competition — in a good way!

    2. Include relevant words in your copy: Try to put yourself in the shoes of searchers. What would they query to find you? Your name/business name, location, products, etc., are important. It’s also helpful to use the same terms in your site that your users might type (e.g., you might be a trained “flower designer” but most searchers might type [florist]), and to answer the questions they might have (e.g., store hours, product specs, reviews). It helps to know your customers.

    3. Be smart about your tags and site architecture: Create unique title tags and meta descriptions; include Rich Snippets markup from schema.org where appropriate. Have intuitive navigation and good internal links. (See the sketch after this list for one way to keep titles and descriptions unique.)

    4. Sign up for email forwarding in Webmaster Tools: Help us communicate with you, especially when we notice something awry with your site.

    5. Attract buzz: Natural links, +1s, likes, follows… In every business there’s something compelling, interesting, entertaining, or surprising that you can offer or share with your users. Provide a helpful service, tell fun stories, paint a vivid picture and users will share and reshare your content.

    6. Stay fresh and relevant: Keep content up-to-date and consider options such as building a social media presence (if that’s where a potential audience exists) or creating an ideal mobile experience if your users are often on-the-go.
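    To ground item 3 above in something concrete, here is a minimal sketch of generating a unique title tag and meta description for each page from that page’s own data, rather than repeating one boilerplate string across the site. Everything in it is hypothetical; the structure, field names and rough character limit are assumptions for illustration, not Google requirements.

    ```typescript
    // Purely illustrative sketch for item 3: derive a unique title tag and meta
    // description from each page's own data instead of repeating one site-wide
    // string. The interface, field names and ~160-character limit are assumptions
    // made for this example, not Google requirements.
    interface PageData {
      name: string;      // e.g. business or product name
      category: string;  // what a searcher would call it
      city: string;
      summary: string;   // one human-written sentence about the page
    }

    function titleTag(page: PageData, siteName: string): string {
      // Unique and descriptive, leading with terms a searcher might actually type.
      return `${page.name} - ${page.category} in ${page.city} | ${siteName}`;
    }

    function metaDescription(page: PageData): string {
      // Readable and specific; search engines may show it as the result snippet.
      const text = `${page.summary} Find ${page.category.toLowerCase()} details, hours and reviews.`;
      return text.length > 160 ? text.slice(0, 157) + '...' : text;
    }

    const page: PageData = {
      name: 'Napa Valley Bed & Breakfast',
      category: 'Bed and Breakfast',
      city: 'Napa, CA',
      summary: 'A four-room inn two blocks from downtown Napa.',
    };

    console.log(titleTag(page, 'Example Inns'));  // unique per page
    console.log(metaDescription(page));           // unique per page
    ```

    The point of the sketch is simply that titles and descriptions come from page-specific facts a searcher might actually type, which echoes item 2 on the list as well.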

    Of course, Google has continued to put out the usual Webmaster videos from Matt Cutts. He did one, for example, on meta tags, talking about how “you shouldn’t spend any time on the meta keywords tag,” but how Google does use the meta description tag.

    In that video, Cutts says, “So if you’re a good SEO, someone who is paying attention to conversion and not just rankings on trophy phrases, then you might want to pay some attention to testing different meta descriptions that might result in more clickthrough and possibly more conversions.” Emphasis added.

    “So don’t do anything deceptive, like you say you’re about apples when you’re really about red widgets that are completely unrelated to apples,” he adds. “But if you have a good and a compelling meta description, that can be handy.”

    The More Things Change, The More They Stay The Same

    This advice is basically in line with the position Google has had for years, which is also in line with what Fox had to say. It doesn’t sound like much has changed, but Google is getting better at distinguishing the good from the bad. Or at least that’s what they want SEOs to believe.

    I’m not saying they don’t have things in the works that are improvements, but Google has a broader issue with relevancy in results, and it would certainly be inaccurate to say that nothing has changed. Google makes changes to its algorithm every single day, and these days they are even going so far as to list at least some of the changes publicly each month. These lists are invaluable to webmasters looking to boost their Google presence, because while Google may say to not chase specific changes, they also show webmasters the areas where Google actually is changing how it does things. Ignoring them is foolish. That doesn’t mean you have to exploit them in a black hat kind of way, but you can certainly be aware of them, and look for tweaks that may have a direct effect on your current strategy.

    For example, if Google says it is putting fresher image results in image searches, perhaps you should consider how visual your content is.

    It will be interesting to see what this month’s changes are, as well as the changes Cutts discussed at SXSW. Will they make Google’s results more relevant? If enough sites follow the advice Google is giving, will the results get better? On the other hand, how much will it matter if you’re following all of Google’s advice if Google’s getting better at “leveling the playing field” for those who aren’t paying attention to SEO at all? Those who aren’t paying attention to SEO probably aren’t reading articles like this or following Google’s webmaster blogs and videos. All of that said, doing the things Google says to do probably won’t hurt.

    What do you think? Should you spend less time worrying about SEO trends, like Google suggests? Let us know in the comments.

  • Google’s Matt Cutts: Good SEOs Pay Attention To Conversion

    Earlier, we posted a story about a new video Google uploaded, where Matt Cutts talks about how Google uses meta tags. Given some of the recent discussion in the SEO community, it seemed like a couple of things Cutts said in the video were worth a separate article.

    As you may know, Cutts spoke at SXSW recently, and mentioned some changes Google is working on, which would “level the playing field” for mom and pops, compared to bigger companies with big SEO budgets. The thinking is that as long as you have good, relevant content, it doesn’t matter that it’s not optimized for search engines – or at least not as much.

    We’ll have to wait and see just how Google is approaching this. Former Googler Vanessa Fox, who created Google Webmaster Central, shared some interesting insight into Google’s plans, and suggested it’s likely not so much about brands.

    It’s interesting that Cutts would put out a video about the SEO implications of meta tags right in the middle of all of this. As a matter of fact, in the video, he says, “So if you’re a good SEO, someone who is paying attention to conversion and not just rankings on trophy phrases, then you might want to pay some attention to testing different meta descriptions that might result in more clickthrough and possibly more conversions.”

    Note: the video was actually uploaded all the way back in August, but Google chose today to make it “today’s webmaster video”:

    Today’s webmaster video: “How much time should I spend on meta tags, and which ones matter?” http://t.co/na6KlEHU 2 hours ago via Tweet Button

    “So don’t do anything deceptive, like you say you’re about apples when you’re really about red widgets that are completely unrelated to apples,” he adds. “But if you have a good and a compelling meta description, that can be handy.”

    Google has always been against the more deceptive, black hat SEO tactics, and the signals we’re getting are basically that Google is trying to get better at what it’s already tried to be good at for years.