WebProNews

Tag: SEO

  • Google: Actually, Meta Tags Do Matter.

    Google posted a new Webmaster Help video from Matt Cutts today. The question at hand this time is: How much time should I spend on meta tags, and which ones matter?

    This one is also significant because Cutts submitted the question himself. That means he felt the issue was important enough that it needed to be addressed even though it wasn’t submitted by a user.

    “So the conventional wisdom a few years ago was that meta tags mattered a whole lot,” says Cutts. “You really had to tweak them and spent a lot of time to get your keywords right, and did you have a space, or a comma between your keywords, and all that kind of stuff. And we’ve mostly evolved past that, but the pendulum might have gone a little bit too far in the other direction, because a lot of people sometimes say, don’t think at all about meta tags. Don’t spend any time whatsoever on them, and so let me give you a more nuanced view.”

    “You shouldn’t spend any time on the meta keywords tag,” he says. “We don’t use it. I’m not aware of any major search engine that uses it these days. It’s a place that people don’t really see when they load the browser, and so a lot of webmasters just keyword stuff there, and so it’s really not all that helpful. So we don’t use meta keywords at all.”

    This is actually not the first time Cutts has posted a video about this topic. There was one from several years ago, where he basically said the same thing about the keywords meta tag. At the time, Google talked about how it used the description meta tag, as well as the meta tags “google,” “robots,” “verify-v1,” “content-type,” and “refresh”.

    Here’s a chart from Google Webmaster Tools, which breaks down how Google understands different meta tags:

    [Chart: Google meta tags]

    “But we do use the meta description tag,” Cutts continues in the new video. “The meta description is really handy, because if we don’t know what would make a good snippet, and you have something in the meta description tag that would basically give a pretty good answer–maybe it matches what the user typed in or something along those lines, then we do reserve the right to show that meta description tag as the snippet. So we can either show the snippet that might be the keyword in context on the page or the meta description.”

    “Now, if the meta description is really well written and really compelling, then a person who sees it might click through more often,” he says. “So if you’re a good SEO, someone who is paying attention to conversion and not just rankings on trophy phrases, then you might want to pay some attention to testing different meta descriptions that might result in more clickthrough and possibly more conversions. So don’t do anything deceptive, like you say you’re about apples when you’re really about red widgets that are completely unrelated to apples. But if you have a good and a compelling meta description, that can be handy.”

    “There are a lot of other meta tags,” he says. “I think in the metadata for this video, we can link to a really good page of documentation that we had, that sort of talks about which stuff we pay attention to and which stuff we don’t pay attention to. But at a 50,000-foot level, don’t pay attention to the keywords meta tag. But the description meta tag is worth paying attention to.”
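    To make the distinction concrete, here’s a small sketch in Python using only the standard library’s html.parser. The page, the tag values, and the parser class are hypothetical (nothing here comes from Google); it simply extracts meta tags the way a crawler might, so you can see which ones are worth writing carefully.

```python
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    """Collects <meta name="..." content="..."> pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if "name" in attrs and "content" in attrs:
                self.meta[attrs["name"].lower()] = attrs["content"]

# A hypothetical page following the advice in the video: a well-written
# description (which Google may show as the search snippet) and a
# keywords tag (which no major engine uses anymore).
page = """
<html><head>
  <title>Apple Orchard Tours</title>
  <meta name="description" content="Guided tours of our heirloom apple orchard, with tastings every weekend.">
  <meta name="keywords" content="apples, orchard, tours, fruit">
  <meta name="robots" content="noarchive">
</head><body></body></html>
"""

parser = MetaTagParser()
parser.feed(page)
print(parser.meta["description"])  # the one tag worth spending time on
print("keywords" in parser.meta)   # present on the page, but ignored
```

    Per Cutts’ advice, only the description in this sketch earns any writing effort; the keywords tag is dead weight.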

    It sounds like SEO still matters.

  • New Google Changes: Really A Matter Of Mom And Pop?

    In a recent webmaster Q&A session at SXSW, Google’s Matt Cutts briefly discussed some changes Google is making that will “level the playing field” between smaller, mom and pop sites and “overly optimized” sites, as bigger companies have a lot more money to spend on SEO.

    Former Googler Vanessa Fox, who happens to be the creator of Webmaster Central, wrote an interesting blog post about it, which we discussed in another article. As we noted there, the changes sound like they fall in line with Google’s greater philosophy of surfacing high-quality sites (which is what the Panda update was all about).

    We reached out to Fox for some additional insight, as her perspective is particularly valuable given her background.

    “I don’t think this is part of Panda,” Fox tells WebProNews. “Google makes hundreds of algorithm changes/introduces new signals/etc. every year. Panda is just one of many. Google just doesn’t name each one (and of course, not all of them are as impactful).”

    She notes, as she hinted at in her own post, that Cutts may have been simplifying things for a non-search audience (SXSW isn’t a search conference like SES or SMX), and says that “it’s possible this isn’t a new anything, but instead is just tweaking of existing signals that look for things like keyword stuffing and link exchanges.”

    Last week, Cutts pointed to the audio. Today he points to a full transcript:

    Today’s webmaster *audio* is a recording of our #sxsw panel: “Dear Google & Bing: Help Me Rank Better!” http://t.co/ddIH6VX5

    Rob Snell did a full transcript of the recent #sxsw session with Danny Sullivan, Duane Forrester, & me: http://t.co/RCGR99Ff

    If you’ve listened to or read what was said, you’ll notice that the whole thing was in response to a question about mom and pops, which might make you wonder if brand is a significant part of what’s at play.

    “I don’t think it’s about just mom and pop vs. big brands,” Fox says. “Lots of big brands don’t know the first thing about SEO. I think (total guess on my part) the sites that will be negatively impacted are those that focus on algorithms and build content/sites based on the things they think the algorithms are looking for. The kind of sites where someone didn’t say ‘I want this page to rank for query X. How can this page best answer what the searcher is asking about X’ but instead said ‘I want this page to rank for query X. How many times should I repeat X in my title, heading, content on the page, internal links…’”

    “I think it’s still useful (and not negative) to make sure the words that searchers are using are on the page, but some sites go well beyond this and get so caught up in what they think the algorithms are doing that they forget to make sure the content is useful,” she adds.

    “As far as sites that will see a positive from this, I think it will likely be both small sites (B&B in Napa that titles their home page ‘home’ vs. an affiliate site that sells wine gift baskets) and large brands (sites that use a lot of Flash),” says Fox. “I think foundational SEO practices (like those I describe in my article) will continue to be beneficial for sites.”

    When she talks about SEO in her article, by the way, she says she’s talking about “using search data to better understand your audience and solve their problems (by creating compelling, high-quality content about relevant topics to your business)” and “understanding how search engines crawl and index sites and ensuring that your site’s technical infrastructure can be comprehensively crawled and indexed.”

    Whether or not the new changes are directly related to Panda, Google’s Panda-related quality guidelines will probably still be worth keeping in mind with regard to what Cutts is talking about.

  • Google Webmaster Central Creator Talks Google’s “New” Changes

    Perhaps “anti-SEO” is a little strong, but as previously reported, Google is working on making SEO matter less. At a recent SXSW session, Google’s Matt Cutts discussed (without a lot of details) some changes Google is going to be making to “level the playing field” for mom and pops, in terms of how sites can gain visibility in search.

    “Normally, we don’t sort of pre-announce changes, but there is something we’ve been working on in the last few months, and hopefully in the next couple months or so, or you know, in the coming weeks, we hope to release it,” said Cutts. “And the idea is basically to try and level the playing ground a little bit, so all those people who have sort of been doing, for lack of a better word, ‘over-optimization’ or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little more level.

    Update: Vanessa Fox offered some additional observations to WebProNews.

    “So that’s the sort of thing where we try to make the website…the Googlebot smarter, we try to make our relevance more adaptive, so the people who don’t do SEO, we handle that, and then we also start to look at the people who sort of abuse it, whether they throw too many keywords on the page or whether they exchange way too many links, or whatever they’re doing to sort of go beyond what a normal person would expect in a particular area,” he continued. “So that is something where we continue to pay attention, and continue to work on it…we have several engineers on my team working on that right now.”

    Naturally, many webmasters and SEOs are wondering just what all of this will mean for SEO going forward. Combine that with Google’s reported strategy of greatly expanding its direct answer results, which could also slow traffic to some sites.

    Vanessa Fox, the former Googler who built Webmaster Central, offers some perspective in a blog post.

    My thoughts on Google’s upcoming “over optimization” algorithm change: http://t.co/qsG4j0rM

    “A lot of people have asked me what this means for those who include search engine optimization as part of their marketing mix,” says Fox in the post. “Some are worried that Google will begin to penalize sites that have implemented search engine optimization techniques. My thoughts? I think that some site owners should worry. But whether or not you should depends on what you mean by search engine optimization.”

    Interestingly, she compares Google’s approach to what the company has been doing with the Panda update, in that it’s about “separating high-quality, useful pages from pages that were just a collection of words about a particular topic.”

    “Matt talked about finding ways to surface smaller sites that may be poorly optimized, if, in fact, those sites have the very best content,” Fox says. “This is not anything new from Google. They’ve always had a goal to rank the very best content, regardless of how well optimized or not it may be. And I think that’s the key. If a page is the very best result for a searcher, Google wants to rank it even if the site owner has never heard of title tags. And Google wants to rank it if the site owner has crafted the very best title tag possible. The importance there is that it’s the very best result.”

    One great point she brought up is that Cutts was not speaking at a search conference when he discussed this. It’s a different audience, one he may not have gotten as specific with as he would have at a conference like SMX Advanced.

    The way Fox talks about it, it almost sounds like he could have even been talking about Panda-related offerings. Remember how Google has made Panda more a part of “its pipelines” recently. Wouldn’t that be part of “making Googlebot smarter,” as Matt put it?

    Fox tells us, however, that she doesn’t think what Matt was talking about is part of Panda, though I doubt we’re going to get much more out of Google on the subject beyond the usual monthly lists of changes. Either way, it does seem to fit with the greater philosophy behind Panda, which is really just about returning the best content anyway. More on this topic to come.

  • Google Is Working On Making SEO Matter Less

    At SXSW there was a session called “Dear Google & Bing: Help Me Rank Better!” As previously reported, Matt Cutts was supposed to be there, but couldn’t make it due to his wife having foot surgery. He was still able to appear remotely, and during the session, he happened to mention that Google is working on some things that will “level the playing field” for people who just have good content and don’t focus much on SEO. Bing’s Duane Forrester also participated as Danny Sullivan moderated.

    Today’s webmaster *audio* is a recording of our #sxsw panel: “Dear Google & Bing: Help Me Rank Better!” http://t.co/ddIH6VX5

    Here’s the official description for the session:

    If you build it, they might not come, if you haven’t thought about how search engines view your web site. Forget testing for Internet Explorer, Firefox, Chrome and Safari. Search engines are the common browser that everyone uses. The good news is that search engine optimization (SEO) doesn’t mean terrible design or some type of black-magic trickery. Rather, there are good, sensible things that everyone should do that please both search engines and human visitors. In this session, representatives from Google and Bing provide this type of advice. They’ll even get you up to speed on the impact that social media is playing on search results. Even better, it’s all Q&A. Bring your top questions about how they rank sites and get answers directly from the source.

    The official SXSW page has the audio for the entire session. Hat tip to Barry Schwartz for pointing to this specific part of it.

    During the Q&A, one webmaster asked how a mom and pop doing its own optimization can stand a chance against all of those who are spending thousands of dollars on SEO.

    “The way that I often think about SEO is that it’s like a coach,” said Cutts. “It’s someone who helps you figure out how to present yourself better. In an ideal world though, you wouldn’t have to think about presenting yourself and whether search engines can crawl your website, because they’d just be so good that it could figure out how to crawl through the Flash, how to crawl through the forms, how to crawl through the javascript, how to crawl through whatever it is. And for the most part, most search engines have made a lot of progress on being able to crawl through that richer content.”

    Regarding the people that are optimizing “really hard” and doing a lot of SEO, Matt says, “Normally, we don’t sort of pre-announce changes, but there is something we’ve been working on in the last few months, and hopefully in the next couple months or so, or you know, in the coming weeks, we hope to release it.”

    “And the idea,” he says, “Is basically to try and level the playing ground a little bit, so all those people who have sort of been doing, for lack of a better word, ‘over-optimization’ or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little more level. So that’s the sort of thing where we try to make the website…the Googlebot smarter, we try to make our relevance more adaptive, so the people who don’t do SEO, we handle that, and then we also start to look at the people who sort of abuse it, whether they throw too many keywords on the page or whether they exchange way too many links, or whatever they’re doing to sort of go beyond what a normal person would expect in a particular area. So that is something where we continue to pay attention, and continue to work on it…we have several engineers on my team working on that right now.”

    Duane Forrester joked that Bing has some hamsters working on this in the back room, spinning some wheels. He suggested having a great product and being engaged socially. These are strong signals for Bing, he said.

  • Chrome Comes Out Of The Penalty Box, Following Paid Link Fiasco

    Remember when Google was involved in that controversy regarding paid links on blog posts about the company’s Chrome browser? As people had caught wind of what was going on, Google implemented a PageRank penalty on Chrome’s landing page, which knocked it down in search results.

    “In response, the webspam team has taken manual action to demote www.google.com/chrome for at least 60 days,” Google’s Matt Cutts said at the time. “After that, someone on the Chrome side can submit a reconsideration request documenting their clean-up just like any other company would. During the 60 days, the PageRank of www.google.com/chrome will also be lowered to reflect the fact that we also won’t trust outgoing links from that page.”

    This was Google’s effort to show others that it would treat its own properties the same as others’. It looked like this had a direct impact on Chrome’s market share, at a time when Microsoft’s Internet Explorer was showing signs of improvement.

    The penalty has now been lifted, as the 60-day mark has come. Barry Schwartz reports that Google has confirmed this.

    It will be very interesting to see how Chrome’s market share numbers look after being out of the penalty box for a while.

  • Google Panda Update Not Happening

    Every time Google makes some change to its algorithm that webmasters notice, lots of people jump to the conclusion that it’s another Panda update. It’s usually not the case, and it’s not the case this time.

    Barry Schwartz at Search Engine Roundtable is pointing to some discussion in WebmasterWorld where people talk about the possibilities of a new Panda update.

    Luckily, Schwartz has already confirmed with Google that it was not Panda or a Panda data refresh.

    Google makes changes to its algorithm every day. It makes roughly 500 of them each year.

    In either late March or early April, we should see a big list of changes that Google has made this month. A Panda update was included in February’s list, which Google said refreshed the data in the Panda system, “making it more accurate and more sensitive to recent changes on the web.”

    It’s worth noting that while Google may not have a new Panda update happening right now, the last one was designed to make Panda more integrated into Google’s pipelines.

    Soon, webmasters may have a whole new round of Google changes to deal with as the company reportedly gets ready to expand its direct answer results.

  • Will Google Rank New TLDs Better Than .com Domains?

    Google’s head of web spam, Matt Cutts, took to Google+ to bust yet another myth (there’s been a lot of Matt Cutts myth busting lately, it seems).

    He points to an article from Adrian Kinderis, CEO of ARI Registry Services (described as “a top-level domain specialist”), which claims that the new top-level domains will “trump .com in Google search results”. Kinderis writes:

    Will a new TLD web address automatically be favoured by Google over a .com equivalent? Quite simply, yes it will. I’ve been researching this topic since development of the new TLD program first began (around 6 years ago) and have closely followed the opinions of the many search industry experts who have taken a great deal of interest in the introduction of these new domains and the impact they will have.

    The more I research, the more I have no doubt that a new TLD address will trump its .com equivalent.

    Followers of Cutts may have some doubt. Here’s what he said about it on Google+:

    Sorry, but that’s just not true, and as an engineer in the search quality team at Google, I feel the need to debunk this misconception. Google has a lot of experience in returning relevant web pages, regardless of the top-level domain (TLD). Google will attempt to rank new TLDs appropriately, but I don’t expect a new TLD to get any kind of initial preference over .com, and I wouldn’t bet on that happening in the long-term either. If you want to register an entirely new TLD for other reasons, that’s your choice, but you shouldn’t register a TLD in the mistaken belief that you’ll get some sort of boost in search engine rankings.

    In the comments on Matt’s post, one reader suggested that Google doesn’t rank good content, but ranks popular content. Matt responded to that, pointing to a post we did on a video where he discussed porn sites and PageRank.

  • Matt Cutts Gives A Google Algorithm History Lesson

    Most webmasters these days are probably more concerned with more recent Google algorithm updates. While you certainly want to acknowledge the things Google has had in place for years, Google puts out big lists of changes on a monthly basis, and if you want to stay on the cutting edge of what the search giant is up to, it’s good to follow these lists.

    That said, it’s also interesting to jump in the time machine and look back at how Google has handled changes in the past. This video doesn’t have any real SEO value to webmasters of today, as far as I can tell, in terms of providing fresh ideas for how to rank better in Google in 2012, but again, it’s interesting if you want to learn more about the inner workings of Google.

    Matt Cutts has posted one of his Webmaster Help videos, but this time it addresses a question about Google’s history (submitted by a user):

    According to “In the Plex,” the last Google Dance and everflux switch came with update “Bart” but, in an earlier post you said it was “Fritz”. Did the last Google Dance and switch to everflux come with update Bart or Fritz?

    Today’s webmaster video covers ancient history: Google dances, BART, and Update Fritz: http://t.co/gyjV01Of

    “It’s critical that we nail down all these last little bits of ancient search engine history,” says Cutts.

    “Bart was the internal code name. It was actually, if I remember correctly, named after a particular salesperson, who was especially fresh, so if somebody comes to a Halloween party dressed as Barry Bondage, that’s pretty fresh, right?” he says. “So…named after a salesperson internally, Bart. It was known as Fritz externally, because whenever the Google Dance would happen, it would happen about once a month, and basically you’d have several data centers, and each night we would take one data center out of the rotation, and we would put new data on it.”

    “So for about a week, we were swapping now old data versus new data, and so for that week, you’d have the Google Dance, because you’d hit either old data centers or new data centers,” he continues. “So once a month, people would look for the Google Dance to happen. They would name them alphabetically like hurricanes. You start with A early in the year and then B the month after that, and so summer, which was F, you’d have gotten to Fritz. So they called it Update Fritz.”

    “And I remember, Fritz lasted all the way through the summer of, I believe, 2003, because it was really Everflux,” Cutts says. “That is, it was changing to an incremental update system. So rather than a batch system that would update once per month, it was, OK, we’ll update a certain percentage of our index every night, and so the index was always changing. So internally, that system to have very fresh results was called Bart. Externally, people called it Update Fritz. So I hope that explains the difference between those two names.”
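    The shift Cutts describes, from a monthly batch swap to an incremental nightly refresh, can be sketched in a few lines. This is a toy model for illustration only, not a description of Google’s actual infrastructure:

```python
def batch_update(index, fresh):
    """Pre-everflux 'Google Dance' model: about once a month, swap in an
    entirely new index all at once."""
    return dict(fresh)

def everflux_update(index, fresh, fraction=0.10):
    """'Bart'/everflux model: each night, refresh only a certain percentage
    of the index, so it is always changing but never swapped wholesale."""
    docs = sorted(fresh)
    batch_size = max(1, int(len(docs) * fraction))
    for doc in docs[:batch_size]:
        index[doc] = fresh[doc]
    return index

old = {f"doc{i}": "stale" for i in range(10)}
new = {f"doc{i}": "fresh" for i in range(10)}

nightly = everflux_update(dict(old), new)
print(sum(v == "fresh" for v in nightly.values()))  # prints 1: one slice refreshed tonight
```

    The batch model is why rankings “danced” for a week each month; the incremental model spreads the same churn across every night.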

    The story does illustrate that internal and external names can sometimes cause confusion. Google, in its monthly lists these days, always shares its internal names alongside the changes. This will likely help keep the names straight as points of reference for commentators as time goes on.

  • Google Search Quality Meeting Uncut [Video]

    Google tweeted out a link to a video of some footage from one of its search quality meetings. The company says this is part of its efforts to be more transparent about how its search engine works.

    Watch uncut video from one of our search quality meetings (for the first time ever!) http://t.co/y53oAAx0

    This particular video deals with spelling for long queries. The meeting, held in Mountain View, is where the search team decides to launch specific algorithm changes.

    Google says in the YouTube description:

    “As part of our continued effort to be more transparent about how search works, we’re publishing video footage from our internal weekly search meeting: ‘Quality Launch Review.’ We hold the meeting almost every Thursday to discuss possible algorithmic improvements and make decisions about what to launch. This video is from the meeting that happened on December 1st, 2011, and includes the entire uncut discussion of a real algorithmic improvement to our spell correction system.”

    Google has been talking about these transparency efforts for months, and recently started a monthly series of blog posts outlining various algorithmic changes they’ve made. From the sound of it, we can expect more of these videos as well, which should give us an even deeper look at Google’s strategies and mindset.

    If you’re a webmaster looking to get the most out of your SEO efforts, I’d suggest keeping an eye out for those monthly posts, these videos and the Webmaster Help videos Google’s Matt Cutts puts out. In fact, add Duane Forrester’s videos and Bing’s search quality series to the list as well.

  • Matt Cutts Won’t Be At SXSW

    For the search crowd, the SXSW session taking place on Saturday, “Dear Google & Bing: Help Me Rank Better!”, is no doubt on the list of those to attend. It was supposed to have Google’s head of web spam Matt Cutts, Bing Sr. Product Manager Duane Forrester (who has kind of become known as Bing’s Matt Cutts) and Search Engine Land Editor in Chief Danny Sullivan, who has established himself as one of the leading voices in the search industry.

    View our SXSW coverage here.

    Matt Cutts announced, however, that due to his wife having surgery, he will be unable to attend. He tweeted early this morning:

    My wife has foot surgery tomorrow, so I won’t be able to make it to SXSW in person: http://t.co/CRRAulpC I’ll try to Skype in for the panel.

    He actually wrote about the news on his blog a few days ago, but at that point thought he would still be able to do the panel:

    Every so often real life catches up with you in ways you didn’t expect. My wife broke her foot a few days ago. She took an unfortunate spill off a stepstool, but she’s telling everyone it was a ninja fight. Those ninjas pack a wallop: she’ll wear a cast for up to 6-8 weeks, and the doctor said she can’t drive with her current cast. Overall, the broken foot has been a good reminder that having your bike stolen, while annoying, isn’t too horrible in the grand scheme of things.

    One wrinkle is that my wife and I were going to spend about a week together at South by Southwest, and I was scheduled to participate on a panel. She’s not going now for obvious reasons (ninja fight). I’ve rejiggered my travel so I’m only away from my wife for a day but I believe I can still do the panel.

    That now has an update on it, reflecting what he said in the tweet.

    Fans will no doubt be disappointed. I’ve seen this guy walk the halls at conferences, constantly being surrounded by people who want to talk to him. Just like a rock star. The Twitterverse is understanding, however.

    @mattcutts You are a great husband. Our philosophy is family first. Best of luck to your wife (and you too!) 🙂

    @mattcutts sounds painful. Hope she feels better. Please send her my best.


    Cutts, Forrester and Sullivan did a session together at last year’s SXSW as well. Here’s our coverage of that. It could give you an idea of the kinds of things to expect, though a lot has certainly happened in search in a year’s time. I’m sure Search Plus Your World, for example, will be a topic of discussion this year.

    You can find plenty of advice from Matt Cutts on various topics here.

  • SEO, Analytics, Financials, All In One Place

    How’s business? It’s a question small business owners are asked a lot, but one that is increasingly difficult to answer because of the complexities of running a modern business. WebControlRoom.com is a free tool developed to help small businesses answer it by providing a real-time performance report with data from various sources in one place.

    In order to thrive, small business owners need to keep their eye on many different measures to know not only if they are on track now but whether things are heading in the right direction for the future.

    Because of the myriad of different services used by small business owners these days, particularly web-savvy ones, it’s not easy to get an overall picture of how things are tracking without logging into a lot of different places and manually piecing together the puzzle.

    After battling with this issue himself, Australian small business web expert Dan Norris developed a free tool, WebControlRoom.com, to do exactly that. By talking to popular small business services like Xero, MailChimp and Google Analytics, WebControlRoom.com presents a one-page chart showing the key stats in one place, giving business owners the important information in seconds.

    But unlike other dashboard-type services, WebControlRoom.com is not just about pretty charts, Dan explains. “Big companies love pretty charts. But small business wants the right information, quickly. By focusing in on month-by-month comparison data, the tool makes it clear which areas of your business are going well and which ones need attention”.

    “Small business owners can open their report each day from their computer or their mobiles and within a few seconds identify issues or see areas that are performing well”.

    Examples of the charts include current Google rankings, so business owners can stay on top of how their site ranks for their chosen keywords, and newsletter opens from MailChimp or Klout score to measure the engagement of their audience. Revenue charts from Xero show how this month’s revenue compares to last month’s, and color and size coding on all charts makes it quick to pinpoint problem areas or see spikes in results.
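    The month-by-month comparison Norris describes could be sketched like this. The metric names, figures, and the -10% threshold here are hypothetical, invented for illustration rather than taken from WebControlRoom:

```python
def month_over_month(this_month, last_month):
    """Flag which metrics improved and which need attention, based on the
    percentage change from last month (the -10% threshold is arbitrary)."""
    report = {}
    for metric, current in this_month.items():
        previous = last_month.get(metric)
        if not previous:
            report[metric] = "new metric"
            continue
        change = (current - previous) / previous * 100
        status = "needs attention" if change < -10 else "going well"
        report[metric] = f"{change:+.1f}% ({status})"
    return report

# Hypothetical figures in the style of the sources the article lists.
report = month_over_month(
    this_month={"revenue": 8200, "newsletter_opens": 410, "klout_score": 51},
    last_month={"revenue": 9500, "newsletter_opens": 380, "klout_score": 50},
)
for metric, summary in report.items():
    print(metric, summary)
```

    The point of the comparison view is exactly this kind of triage: a glance separates the metrics trending down from the ones doing fine.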

    In total there are nine services, with many more on the way, and there is a slimmed-down version for mobile that loads whenever the service is accessed from a phone.

    Other than supporting more services, plans for the future include a smart messaging system that provides users with sponsored improvement messages related to their performance.

    WebControlRoom.com is available for free to the public as a beta release now.

  • Bing Webmaster Tools Gets Organic Keyword Research Tool, API

    At SMX West today, Microsoft’s Bing revealed some updates to Bing Webmaster Tools. There’s a new organic keyword research tool and an API.

    On the keyword research tool, Bing’s Duane Forrester explains, “This tool allows you to perform keyword research on any phrase you enter. It resides within your WMT account and offers the ability to see query volume data on the phrase you enter, and related phrases, across many different countries and languages. You can easily explore query volumes on keywords by simply clicking on any related keyword. All data within the tool is exportable, and we hold a history of up to 6 months for all phrases. This means you can select a date range covering up to the previous six months to see query volume data for the time period you select. Query data shown in the results within this tool are based on organic query data from Bing and is raw data, not rounded in any way. This tool is found when you login, on the Keyword tab.”

    Bing Keyword Research

    “Upon login, you see a simple interface with a few options to help you target country and language. You can also select “strict” to ensure results are restricted to the exact phrase to word you entered,” he continues. “Entering a phrase or keyword and clicking the Search button will bring back organic keyword query data for the phrase entered, as well as for related phrases. Here we have selected one filter for the United States, but left the language open, and strict unchecked to see what the keyword ecosystem looks like around our topic, in the US. Not surprisingly, our example of fly fishing, during these winter months, nets us lower query volumes. The graph clearly shows a run-up on query volume coming into the holiday season, and trending lower afterwards. ”

    The API, of course, lets you access the data in other places. The documentation for that is here.
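    The six-month date-range selection Forrester describes can be sketched as follows; the record layout here is an illustrative assumption, not Bing’s actual API schema.

    ```python
    # Hedged sketch: filtering exported keyword query-volume data of the kind
    # the tool described above returns, down to a chosen date range.
    from datetime import date

    def volume_in_range(records, start, end):
        """Sum query volumes for records whose date falls within [start, end]."""
        return sum(volume for day, volume in records if start <= day <= end)

    # Illustrative (made-up) export: (date, raw query volume) pairs.
    records = [
        (date(2012, 1, 10), 480),
        (date(2012, 2, 1), 510),
        (date(2012, 3, 5), 620),
    ]
    print(volume_in_range(records, date(2012, 2, 1), date(2012, 3, 31)))  # -> 1130
    ```

    Because the tool exports raw, unrounded data, this kind of post-processing is straightforward once the data is out of the Keyword tab or the API.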

    Forrester goes more in depth into the new changes in this blog post.

    The features are currently available to everyone with a Bing Webmaster Tools account.

  • SEO Solution, InferClick, New From SearchDex

    SearchDex, a digital marketing service and solutions provider, today announced at eTail Palm Springs the release of InferClick, a new solution designed to give online retailers actionable insights into consumer behavior patterns.

    InferClick, the behavioral data analytics technology from SearchDex, provides online retailers with revenue tracking by keyword, improved keyword valuation, enhanced merchandising, intelligent product recommendations, keyword expansion based on user behavior, and the ability to create new pages around newly discovered keywords.

    “As online retailers strive to recognize how buyers find and interact with their ecommerce sites, e-tailers need the necessary tools to discover the relevant actions and behaviors of consumers in order to determine critical revenue drivers,” said David Chaplin, CEO at SearchDex. “InferClick is our answer to the demands of online marketers who would benefit from these capabilities, offering improved keyword targeting and additional enhancements focused on SEO program expansion through the analysis of view-stream and shopping cart data.”

  • Google Changes How It Evaluates Links

    Google announced a bunch of changes it made to its algorithm over the course of February, and some of those changes are more interesting than others.

    So far, we’ve taken a closer look at the increased sensitivity of the Panda update, some location-based changes to YouTube suggestions, and the increased importance of image search optimization. Another very interesting entry to Google’s list is:

    Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.

    It would, of course, be helpful to know more specifics about this method of link analysis, but that’s probably one of those things Google would rather play a bit closer to the chest than some of its other signals. Google can’t have people going out and exploiting that information and gaming the results, now can it? That could be a big “bug” that could end up hurting the search quality it’s trying so hard to maintain.

    I’m sure there will be plenty of theories and speculation regarding how Google is analyzing links, just as there has been since the dawn of PageRank.

    I doubt this new change will bring about any major findings in SEO, but it’s interesting to know that such a change was made – one that removes something Google had been using for “several years”. One has to wonder whether this will have a major impact on the PageRank of sites around the web.

  • Google Image Search Optimization Now More Important To SEO

    Google announced forty changes it has made to its algorithm/search quality efforts over the past month. One of them was the increased sensitivity of Panda. Another was “more locally relevant predictions in YouTube”. I’m not going to run down every single one of them again, but there are a few that stuck out to me.

    Another of these would be an increase in the number of queries that will return image results in Universal Search. Here’s what Google listed for that:

    Expand the size of our images index in Universal Search. [launch codename “terra”, project codename “Images Universal”] We launched a change to expand the corpus of results for which we show images in Universal Search. This is especially helpful to give more relevant images on a larger set of searches.

    To me, this means images should probably be a bigger part of your SEO strategy. Images were already important to SEO, but if Google is going to inject more images into the first-page organic results mix for more queries, this is another opportunity to get your content in front of searchers’ eyeballs.

    Last year, we ran a good article by Michael Gray about optimizing images for search traffic. That would be a good place to start for some solid advice on taking advantage of this.

    Let’s not forget, however, that Search Plus Your World plays a major role these days in what images Google shows users in such results, so perhaps this should also be considered yet another reason to expand your Google+ presence, and engage with others on Google’s social network.

    This other entry to Google’s list is certainly worth considering as well:

    Fresher images. [launch codename “tumeric”] We’ve adjusted our signals for surfacing fresh images. Now we can more often surface fresh images when they appear on the web.

  • Google Algorithm Updates Announced: Panda Gets More Sensitive

    I wasn’t expecting this to come until early March, since the month isn’t even over yet, but Google has gone ahead and released its monthly list of updates: 40 changes for February.

    While we’ll take a deeper look into the list soon, it’s worth noting right off the bat that there is a Panda update listed. Late last week, in light of Panda’s one-year anniversary, I asked Google if the Panda adjustment from January’s list had been the most recent adjustment to Panda. The response I received from a spokesperson was:

    “We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda.”

    This was basically what the company said in January. Now, in today’s list for February, Google says:

    “This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.”

    So between January’s and February’s Panda news, it sounds like Panda is more ingrained into how Google indexes the web than ever before, and may even be pickier about quality.

    Here’s the full list in Google’s words:

    • More coverage for related searches. [launch codename “Fuzhou”] This launch brings in a new data source to help generate the “Searches related to” section, increasing coverage significantly so the feature will appear for more queries. This section contains search queries that can help you refine what you’re searching for.
    • Tweak to categorizer for expanded sitelinks. [launch codename “Snippy”, project codename “Megasitelinks”] This improvement adjusts a signal we use to try and identify duplicate snippets. We were applying a categorizer that wasn’t performing well for our expanded sitelinks, so we’ve stopped applying the categorizer in those cases. The result is more relevant sitelinks.
    • Less duplication in expanded sitelinks. [launch codename “thanksgiving”, project codename “Megasitelinks”] We’ve adjusted signals to reduce duplication in the snippets for expanded sitelinks. Now we generate relevant snippets based more on the page content and less on the query.
    • More consistent thumbnail sizes on results page. We’ve adjusted the thumbnail size for most image content appearing on the results page, providing a more consistent experience across result types, and also across mobile and tablet. The new sizes apply to rich snippet results for recipes and applications, movie posters, shopping results, book results, news results and more.
    • More locally relevant predictions in YouTube. [project codename “Suggest”] We’ve improved the ranking for predictions in YouTube to provide more locally relevant queries. For example, for the query [lady gaga in ] performed on the US version of YouTube, we might predict [lady gaga in times square], but for the same search performed on the Indian version of YouTube, we might predict [lady gaga in India].
    • More accurate detection of official pages. [launch codename “WRE”] We’ve made an adjustment to how we detect official pages to make more accurate identifications. The result is that many pages that were previously misidentified as official will no longer be.
    • Refreshed per-URL country information. [Launch codename “longdew”, project codename “country-id data refresh”] We updated the country associations for URLs to use more recent data.
    • Expand the size of our images index in Universal Search. [launch codename “terra”, project codename “Images Universal”] We launched a change to expand the corpus of results for which we show images in Universal Search. This is especially helpful to give more relevant images on a larger set of searches.
    • Minor tuning of autocomplete policy algorithms. [project codename “Suggest”] We have a narrow set of policies for autocomplete for offensive and inappropriate terms. This improvement continues to refine the algorithms we use to implement these policies.
    • “Site:” query update [launch codename “Semicolon”, project codename “Dice”] This change improves the ranking for queries using the “site:” operator by increasing the diversity of results.
    • Improved detection for SafeSearch in Image Search. [launch codename “Michandro”, project codename “SafeSearch”] This change improves our signals for detecting adult content in Image Search, aligning the signals more closely with the signals we use for our other search results.
    • Interval based history tracking for indexing. [project codename “Intervals”] This improvement changes the signals we use in document tracking algorithms.
    • Improvements to foreign language synonyms. [launch codename “floating context synonyms”, project codename “Synonyms”] This change applies an improvement we previously launched for English to all other languages. The net impact is that you’ll more often find relevant pages that include synonyms for your query terms.
    • Disabling two old fresh query classifiers. [launch codename “Mango”, project codename “Freshness”] As search evolves and new signals and classifiers are applied to rank search results, sometimes old algorithms get outdated. This improvement disables two old classifiers related to query freshness.
    • More organized search results for Google Korea. [launch codename “smoothieking”, project codename “Sokoban4”] This significant improvement to search in Korea better organizes the search results into sections for news, blogs and homepages.
    • Fresher images. [launch codename “tumeric”] We’ve adjusted our signals for surfacing fresh images. Now we can more often surface fresh images when they appear on the web.
    • Update to the Google bar. [project codename “Kennedy”] We continue to iterate in our efforts to deliver a beautifully simple experience across Google products, and as part of that this month we made further adjustments to the Google bar. The biggest change is that we’ve replaced the drop-down Google menu in the November redesign with a consistent and expanded set of links running across the top of the page.
    • Adding three new languages to classifier related to error pages. [launch codename “PNI”, project codename “Soft404”] We have signals designed to detect crypto 404 pages (also known as “soft 404s”), pages that return valid text to a browser but the text only contain error messages, such as “Page not found.” It’s rare that a user will be looking for such a page, so it’s important we be able to detect them. This change extends a particular classifier to Portuguese, Dutch and Italian.
    • Improvements to travel-related searches. [launch codename “nesehorn”] We’ve made improvements to triggering for a variety of flight-related search queries. These changes improve the user experience for our Flight Search feature with users getting more accurate flight results.
    • Data refresh for related searches signal. [launch codename “Chicago”, project codename “Related Search”] One of the many signals we look at to generate the “Searches related to” section is the queries users type in succession. If users very often search for [apple] right after [banana], that’s a sign the two might be related. This update refreshes the model we use to generate these refinements, leading to more relevant queries to try.
    • International launch of shopping rich snippets. [project codename “rich snippets”] Shopping rich snippets help you more quickly identify which sites are likely to have the most relevant product for your needs, highlighting product prices, availability, ratings and review counts. This month we expanded shopping rich snippets globally (they were previously only available in the US, Japan and Germany).
    • Improvements to Korean spelling. This launch improves spelling corrections when the user performs a Korean query in the wrong keyboard mode (also known as an “IME”, or input method editor). Specifically, this change helps users who mistakenly enter Hangul queries in Latin mode or vice-versa.
    • Improvements to freshness. [launch codename “iotfreshweb”, project codename “Freshness”] We’ve applied new signals which help us surface fresh content in our results even more quickly than before.
    • Web History in 20 new countries. With Web History, you can browse and search over your search history and webpages you’ve visited. You will also get personalized search results that are more relevant to you, based on what you’ve searched for and which sites you’ve visited in the past. In order to deliver more relevant and personalized search results, we’ve launched Web History in Malaysia, Pakistan, Philippines, Morocco, Belarus, Kazakhstan, Estonia, Kuwait, Iraq, Sri Lanka, Tunisia, Nigeria, Lebanon, Luxembourg, Bosnia and Herzegowina, Azerbaijan, Jamaica, Trinidad and Tobago, Republic of Moldova, and Ghana. Web History is turned on only for people who have a Google Account and previously enabled Web History.
    • Improved snippets for video channels. Some search results are links to channels with many different videos, whether on mtv.com, Hulu or YouTube. We’ve had a feature for a while now that displays snippets for these results including direct links to the videos in the channel, and this improvement increases quality and expands coverage of these rich “decorated” snippets. We’ve also made some improvements to our backends used to generate the snippets.
    • Improvements to ranking for local search results. [launch codename “Venice”] This improvement improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal.
    • Improvements to English spell correction. [launch codename “Kamehameha”] This change improves spelling correction quality in English, especially for rare queries, by making one of our scoring functions more accurate.
    • Improvements to coverage of News Universal. [launch codename “final destination”] We’ve fixed a bug that caused News Universal results not to appear in cases when our testing indicates they’d be very useful.
    • Consolidation of signals for spiking topics. [launch codename “news deserving score”, project codename “Freshness”] We use a number of signals to detect when a new topic is spiking in popularity. This change consolidates some of the signals so we can rely on signals we can compute in realtime, rather than signals that need to be processed offline. This eliminates redundancy in our systems and helps to ensure we can continue to detect spiking topics as quickly as possible.
    • Better triggering for Turkish weather search feature. [launch codename “hava”] We’ve tuned the signals we use to decide when to present Turkish users with the weather search feature. The result is that we’re able to provide our users with the weather forecast right on the results page with more frequency and accuracy.
    • Visual refresh to account settings page. We completed a visual refresh of the account settings page, making the page more consistent with the rest of our constantly evolving design.
    • Panda update. This launch refreshes data in the Panda system, making it more accurate and more sensitive to recent changes on the web.
    • Link evaluation. We often use characteristics of links to help us figure out the topic of a linked page. We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years. We often rearchitect or turn off parts of our scoring in order to keep our system maintainable, clean and understandable.
    • SafeSearch update. We have updated how we deal with adult content, making it more accurate and robust. Now, irrelevant adult content is less likely to show up for many queries.
    • Spam update. In the process of investigating some potential spam, we found and fixed some weaknesses in our spam protections.
    • Improved local results. We launched a new system to find results from a user’s city more reliably. Now we’re better able to detect when both queries and documents are local to the user.

    More analysis to come.

  • Are Google’s Results Better After A Year Of Panda Updates?

    It’s hard to believe that it’s already been a year since Google first launched the Panda update. OK, who am I kidding? It feels like an eternity ago. But it’s been a year. How much work have you done on your site to comply with Panda in that amount of time?

    Now that you’ve had a year to get to know it, how do you think Google has done with the Panda update? Was your site affected? For better or for worse? Do you think Google did a good job in making search results higher in quality and relevancy? Let us know in the comments.

    Earlier this month, when Google ran down its publicly known algorithmic changes for the month of January, it mentioned what still appears to be the most recent change to Panda. It said:

    “We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda.”

    This change had actually been confirmed in January, but was spelled out one more time (as much as Google will in fact spell it out). Just to make sure this was in fact the most recent Panda-related adjustment, we asked Google. A spokesperson for the company responded: “As mentioned in January, we’re continuing to improve how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines.”

    So, it sounds like the improvements are still ongoing, but no major Panda update since that particular announcement.

    This list has been referenced plenty of times by me and others discussing the Panda update, but as Google tweaks it, these things will continue to be important to keep in mind – possibly more than ever, considering that Panda is now so much more “integrated into the pipelines”. It’s the list of questions that provides “guidance” on how Google looks at the issue of search quality.

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    For a comprehensive look back (and forward) at the ongoing Panda saga, you may find our Panda page useful. It basically looks like our homepage, but is dedicated to Panda-related stories. Sure, in essence, it’s basically a tag page, but with our recent redesign, it kind of takes on a life of its own. It points to all of our coverage, all the way back to the beginning. It includes the important info, as well as some of the more fun things, like Panda bread, parody videos and infographics.

    Here are a few of the classic videos:

    And speaking of infographics, Search Engine Land and BlueGlass put this one together:

    The Google Panda Update, One Year Later

    It’s a well put-together infographic for sure, and provides a nice visible timeline of the various iterations of the Panda update, but it really only scratches the surface of the effects the update has had on the web – the struggles of webmasters who felt their sites were unjustly impacted for the worse. We heard a whole lot of stories over the last year. We covered some of them, but only a fraction. For some sites, it was clear that their quality was lacking and that they didn’t really deserve to rank over higher quality sites, but for others, we had to wonder if Google was making the right call. Some sites were able to recover (fully or partially), while many, no doubt, just gave up and started over. Some had to make huge adjustments to their entire content strategies.

    No better example of this exists than Demand Media, widely considered the poster child for the content farm concept – a concept which ultimately led to the Panda update’s existence in the first place, and to its early pre-Panda nickname, the “farmer update”. Demand Media’s eHow property, specifically, was the main culprit, though others of the company’s properties were named from time to time in the visibility reports from third parties referenced in the above infographic.

    Amazingly, the initial Panda update a year ago didn’t have any impact on eHow, but that would change in future iterations. It ultimately led to a huge shift in strategy for Demand Media’s content arm, which included the addition of a new user feedback system, the deletion of thousands of articles, and a reduction in new article assignments. All the while, the company has been expanding its social presence and forming content partnerships to boost the quality and reputation of its eHow brand, which is a top 20 domain in the U.S. Demand Media’s properties get 100 million visitors per month.

    The Panda update has had such an impact on this company that it still has to talk about it in its earnings calls. It just had one a couple weeks ago, and declared that the last Google algorithm update to affect eHow was in July. Panda 2.3, as it’s referred to in the infographic, was on July 23. There have been five Panda updates since then, so it would appear that the company has learned how to cater to it.

    I’m not going to dig back through all of the Panda stories of the past year, because frankly, there are just too many of them. A lot of anecdotes, a lot of theories, and a lot of analysis. There’s enough to write a sizable book. Maybe one day.

    One important thing to note, which is also referenced in the infographic, is that Panda is only one of over 200 signals Google uses. It’s an important one, but there are a lot of others. A lot of other important ones. These days, the big controversial Google signal is “Search Plus Your World“. There’s a lot of criticism about how it’s damaging relevancy. I’ve seen examples where Google’s “freshness” update has hurt relevancy. The lack of realtime search isn’t helping things either.

    In a week or two, Google will likely give us a look at the changes it has made to its algorithm since that January list. Then webmasters and SEOs will have even more factors to consider in the elaborate quest for visibility in the world’s largest search engine as it continues to become more personalized to each user. Nobody said it would be easy, but these things are worth paying attention to.

    Now that we’ve had an entire year to digest the Panda update, while being thrown new curveballs from Google along the way, how do you think Google is doing with search quality? Is Google showing more relevant results than it was a year ago? Tell us what you think in the comments.

    Lead Image Credit: yosoybeezel on Photobucket

  • SES London: Google Talks Bounce Rate, Social Signals You Should Be Measuring

    Search Engine Strategies London is going on right now, and Googler Avinash Kaushik gave a keynote address that appears to have left attendees inspired.

    State of Search’s Louis Venter has an extensive account of Kaushik’s speech. One interesting part is what he says about bounce rate. I’m not sure if these were Kaushik’s exact words, but Venter writes, “Bounce rate shows you how much you suck.”

    One issue that’s been debated in the industry is how much of a signal bounce rate is in Google’s algorithm. If this is the message Kaushik is sending, however, it stands to reason that this is the message Google is sending. He is, after all, the digital marketing evangelist for the company, and the analytics guru.
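    For reference, bounce rate is conventionally computed as the share of sessions that view only a single page; here is a minimal sketch with illustrative data (not any particular analytics product’s implementation).

    ```python
    # Hedged sketch: the standard bounce-rate calculation.
    def bounce_rate(sessions):
        """sessions: list of pageview counts, one entry per session.

        Returns the percentage of sessions that bounced (viewed one page).
        """
        if not sessions:
            return 0.0
        bounces = sum(1 for pageviews in sessions if pageviews == 1)
        return bounces / len(sessions) * 100

    print(bounce_rate([1, 3, 1, 5, 2]))  # -> 40.0
    ```

    However Google may or may not weigh it as a ranking signal, it is a simple, unambiguous number to track for your own site.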

    Another highlight was the following list of “things you should measure for social contribution” as Venter conveys:

    1) Conversation rate – if you talk, does anyone care? The number of audience comments per social contribution measures this.

    2) Amplification rate – 70,000 people follow Avinash, and his second level of reach is over 1 million people. The number of forwards per social contribution is a key metric to cover here.

    3) Applause rate – the number of positive clicks per social contribution.

    4) Economic value – the sum of your macro and micro conversions and how they contribute to your overall picture.
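    The four metrics above reduce to simple per-post averages plus a conversion sum; here is a sketch with illustrative field names and figures, not tied to any particular analytics API.

    ```python
    # Hedged sketch of the four social-contribution metrics listed above.
    def social_metrics(posts, macro_value, micro_value):
        """posts: list of dicts with per-post engagement counts."""
        n = len(posts)
        return {
            "conversation_rate": sum(p["comments"] for p in posts) / n,  # comments/post
            "amplification_rate": sum(p["shares"] for p in posts) / n,   # forwards/post
            "applause_rate": sum(p["likes"] for p in posts) / n,         # positive clicks/post
            "economic_value": macro_value + micro_value,                 # conversion value
        }

    # Two made-up posts, plus illustrative conversion values.
    posts = [
        {"comments": 4, "shares": 10, "likes": 25},
        {"comments": 2, "shares": 6, "likes": 15},
    ]
    print(social_metrics(posts, macro_value=500.0, micro_value=120.0))
    ```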

    Here’s some Twitter reaction to the event so far:

    Avinash was awesome, here is the post i wrote on it http://t.co/UKHi4lcq #seslondon

    Great Opening Keynote of #seslondon 2012 by @avinash ! Hope to see you again soon again in Japan. http://t.co/bYQO6Vcs

    HITS. How Idiots Track Success. Great quote from Avinash Kaushik at #seslondon

    Some pictures of day 1 #seslondon http://t.co/bWyVFUNN @sesconf – more people in the pictures tomorrow!

    is re-targeting right for you? great session at #seslondon http://t.co/MbvOE44K

    Thx! RT @willquick: I pray to God that @LeeOdden is putting those slides online. Amazing information. #seslondon

    A ranking in position 3 or 4 with review stars can generate as much traffic as number 1 position, says @guylevine #seslondon

    Key Linkbuilding Strategies Presentation from SES London #seslondon http://t.co/wNS0KPt7 via @patrickaltoft

    @matt_mcgowan Thanks Matt, it was great to open the conference. Great audience, great Incisive staff! #seslondon

    SEO Migration Plan: A Failure to Plan is a Plan to Fail @SESLondon http://t.co/TAcosYVY

    PPC Tools of the Trade #seslondon http://t.co/p2umTS96

    53 Best Tips from #SESLondon Day 1: http://t.co/2xnLf3l2 by @kevgibbo (via @seoptimise)

    #SESLondon Day1, 3 Panels moderated, @guylevine @daxhamman @andymihalop @refinedlabs @monetate Karl Blanks @alistairdent top speakers!

    Pleased with how my #seslondon presenting is shaping up now 🙂

    audit & benchmark, preserve URLs, update backlinks/301, submit new sitemap & work w/ SEO agency from day 1 @russosullivan #seslondon

    Attending the event? What has been the best part for you?

  • Bing: Here’s How To Become An Authority

    It looks like Bing’s counterpart to Google’s Matt Cutts, Duane Forrester, is now putting out Matt Cutts-style webmaster videos for Bing Webmaster Tools.

    He posted this one about becoming an authority by building quality content and sharing properly:

    <a href='http://video.msn.com/?vid=4f90e5ae-fa68-433a-b8a1-534f98bd888d&#038;mkt=en-us&#038;src=SLPl:embed::uuids' target='_new' title='Bing Webmaster Tools: Duane Forrester on Establishing Authority in Bing'>Video: Bing Webmaster Tools: Duane Forrester on Establishing Authority in Bing</a>

    “You being an authority means you’re an expert. You rank better,” says Forrester. “You get more traffic. That just leads to better business success for you, which is what you want.”

    “The first thing is, you need to focus on fundamentals here,” he says. “What we’re really talking about is the quality you have – the quality of content you build and the quality of sharing you do socially. Those are really two critically important points.”

    He gives an example of “how to build quality content” using eBay.

    “Let me give you two scenarios,” he continues. “One: you’re going to sell a cordless drill on eBay, and you’re just going to take the standard information, images and such. Now, the second example, we’re going to sell the same product – the same cordless drill – but we’re actually going to take videos of that cordless drill in use. We’re going to show that cordless drill in its packaging, in its wrapper, in every way possible. We are going to amplify it. Lots of extra pictures. We’re going to do this all on our own. We’re going to write up descriptions. We’re going to put all of that together.”

    “It’s pretty clear to see here by these descriptions that we’re going to have a standard view of an item for sale, and a really deep, rich, immersive view of an item for sale,” says Forrester.

    The second version, he says, is the “quality”.

    “That is what people are looking for to answer their questions,” he continues. “So when that comes to your content, you have to think of it in terms of, ‘Have I answered all of the questions this searcher has? Have I done it to a depth that satisfies them?’ If you can do that, you need to move on to the next step, which is sharing properly.”

    “You get out there, and you’re sharing things on Facebook, or you’re putting it on Twitter. Any of the social media spaces that you like and you frequent, you’re putting this stuff out there.”

    He says that before you submit this stuff, you have to ask yourself: Will my tweet or my post bring quality to my followers or my friends?

    “That is a critical step,” he emphasizes. “They want you to bring them quality. They need you to bring them quality. You need to bring them quality. If you don’t bring them quality, they’re going to stop following you. If you bring them good quality links either to your content or to related content, they will continually engage with you. They will share you. They will like you. They will amplify that for you. That amplification – that signals that you’re becoming an authority socially.”

    “Pull all of that together,” he says. “Now you’re starting to see things as the search engine sees it.”

    More on Forrester’s thoughts about search and social from a presentation he gave at BlogWorld in November can be found here.

  • Googler: SEO Is A Bug, And Google Is Trying To Fix It

    Six days ago, a Google engineer named Rockway, who joined the company just this year, posted to Hacker News to ask, “What is so evil about adding social networking features to everyone’s account?”

    “You have a Docs account and a Picasa account too, even if you don’t use them, and nobody complains about that,” he wrote. “What’s the difference between Docs and Google+?”

    It’s a fair point.

    He goes on to bash SEO as a practice, however, and this has raised some eyebrows within the industry. Here’s the relevant portion:

    Instead of being able to SEO the entire Internet, businesses can now only affect the search results for a tiny percentage of users. That’s a good thing because SEO can’t scale, and SEO isn’t good for users or the Internet at large.

    If you look at the Google experience from the standpoint of customers, it’s pretty good. Users get relevant search results and ads. Advertisers get their content on top of everything else. It’s a good compromise between advertising and usability, and it works really well. It’s a bug that you could rank highly in Google without buying ads, and Google is trying to fix the bug. Manipulating Google results shouldn’t be something you feel entitled to be able to do. If you want to rank highly in Google, be relevant for the user currently searching. Engage him in social media or email, provide relevant information about what you’re selling, and, generally, be a “good match” for what the user wants.

    Aaron Wall, who deserves credit for shining a spotlight on the comments, raises an interesting point. “You can learn a lot more about what Google really thinks by reading what their new hires say,” he writes. “They are not yet skilled in the arts of public relations & make major gaffs like this one.”

    I don’t know that it’s fair to say that what one new guy at Google says is “what Google really thinks”. Google has thousands of employees (over 32,000 at last headcount), and I’m quite sure that many of them have different opinions about things. For one, we hear that not all of them are thrilled with Google’s “Search Plus Your World”.

    Then, of course, there was the incident where one Googler went so far as to call Google+ a “knee-jerk reaction” and a “study in short-term thinking” in a post on Google+ itself (granted, it was meant to be an internal post).

    But Googlers are all over Google+ speaking their minds and sharing what they find interesting, not to mention connecting with the public. They do it every day. A lot of them. This speaks to Wall’s point about learning how Google really thinks: pay attention to what Googlers say collectively, whether on Google+, on Twitter, in forums, or in personal blog posts.

    In the end, I would say that, in general, the opinions or actions of individual Googlers, especially outside the confines of Google’s official promotional vehicles (whether company blogs, Google+ pages, Twitter accounts or YouTube channels), don’t necessarily represent the company’s stance. The company is made of people. Real humans who have real opinions. PR blunders aside, sometimes people are going to say how they really feel.

    Rockway did later follow up on his initial comments on Hacker News (hat tip: Barry Schwartz):

    Since people are taking what I’ve said out of context, I thought I’d clarify this statement:

    It’s a bug that you could rank highly in Google without buying ads

    I shouldn’t have mentioned ads here. Position on the results page should only depend on the quality of your content; if your site has the best content on the Internet for the user’s search terms, you should be the top result. You shouldn’t be able to change your position in the organic results any other way, like by exploiting bugs in Google’s ranking algorithm. The specifics of the ranking algorithm may change, but if your site is the best, you won’t have to worry about it.

    The follow-up doesn’t seem to indicate any significant change of heart with regard to SEO in general.

    Rockway’s last update on Google+: “I’m starting to feel high from my neighbors’ second-hand pot smoke.”

  • Google’s Latest Algorithm Changes (Freshness Update Gets Updated)

    Google rolled out “Search Plus Your World” in January, but that’s not the only change they made to how they deliver search results. Not even close.

    If you’ve been following, you may know that Google has been putting out blog posts the last few months highlighting some of the various algorithm changes they’ve made (without giving away the secret sauce of course). Here’s our coverage of last month’s updates.

    Here’s what Google now lists for January:

    • Fresher results. [launch codename “nftc”] We made several adjustments to the freshness algorithm that we released in November. These are minor updates to make sure we continue to give you the freshest, most relevant results.
    • Faster autocomplete. [launch codename “Snappy Suggest”, project codename “Suggest”] We made improvements to our autocomplete system to deliver your predicted queries much faster.
    • Autocomplete spelling corrections. [launch codename “Trivial”, project codename “Suggest”] This is an improvement to the spelling corrections used in autocomplete, making those corrections more consistent with the spelling corrections used in search. This launch targets corrections where the spelling change is very small.
    • Better spelling full-page replacement. [launch codenames “Oooni”, “sgap”, project codename “Full-Page Replacement”] When we’re confident in a spelling correction we automatically show results for the corrected query and let you know we’re “Showing results for [cheetah]” (rather than, say, “cheettah”). We made a couple of changes to improve the accuracy of this feature.
    • Better spelling corrections for rare queries. This change improves one of the models that we use to make spelling corrections. The result is more accurate spell corrections for a number of rare queries.
    • Improve detection of recurrent event pages. [launch codename “neseda”] We made several improvements to how we determine the date of a document. As a result, you’ll see fresher, more timely results, particularly for pages discussing recurring events.
    • High-quality sites algorithm improvements. [launch codenames “PPtl” and “Stitch”, project codename “Panda”] In 2011, we launched the Panda algorithm change, targeted at finding more high-quality sites. We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda.
    • Cross-language refinements. [launch codename Xiangfan] Previously, we only generated related searches based on the display language. With this change, we also attempt to auto-detect the language of the original query to generate related search queries. Now, a user typing a query in French might see French query refinements, even if her language is set to English.
    • English on Google Saudi Arabia. Users in Saudi Arabia can now more easily choose an English interface to search on google.com.sa.
    • Improved scrolling for Image Search. Previously when you scrolled in Image Search, only the image results would move while the top and side menus were pinned in place. We changed the scrolling behavior to make it consistent with our main search results and the other search modes, where scrolling moves the entire page.
    • Improved image search quality. [launch codename “endearo”, project codename “Image Search”] This is a small improvement to our image search ranking algorithm. In particular, this change helps images with high-quality landing pages rank higher in our image search results.
    • More relevant related searches. Sometimes at the bottom of the screen you’ll see a section called “Searches related to” with other queries you may want to try. With this change, we’ve updated the model for generating related searches, resulting in more useful query refinements.
    • Blending of news results. [launch codename “final-destination”, project codename “Universal Search”] We improved our algorithm that decides which queries should show news results, making it more responsive to realtime trends. We also made an adjustment to how we blend news results in Universal Search. Both of these changes help news articles appear in your search results when they are relevant.
    • Automatically disable Google Instant based on computer speed. [project codename “Psychic Search”] Google Instant has long had the ability to automatically turn itself off if you’re on a slow internet connection. Now Instant can also turn itself off if your computer is slow. If Instant gets automatically disabled, we continue to check your computer speed and will re-enable Instant if your performance improves. We’ve also tweaked search preferences so you can always have Instant on or off, or have it change automatically.

    I thought it seemed like Google was placing a great deal of emphasis on recency. Now, we find out that they’ve made adjustments to the freshness algorithm to make results even fresher and “more relevant”. I’m not sure the “more relevant” part is always working, however. Sometimes the freshest result isn’t the most relevant, and sometimes I think Google is showing results that would be better with less emphasis on freshness.

    Sometimes.

    Other times, it can be helpful. I guess it does, to some extent, make up for Google’s lack of realtime search, which went away with the expiration of the company’s agreement with Twitter last year.

    Again, to some extent. Not the full extent.

    The “final-destination” update, which deals with how Google blends news results into the mix, is worth noting as well. The fact that this is based on “realtime trends” seems to be another area where Google is attempting to fill the void left by realtime search.

    Don’t expect Google to get Twitter-based realtime search back anytime soon. The two companies apparently won’t even talk to each other.

    Note that the Panda update was also addressed.

    Do you think Google’s recent changes have made results better? Let us know in the comments.