WebProNews

Tag: SEO

  • Are You Getting Your Content in Front of News Seekers?

    Getting press coverage can mean a great deal for gaining traffic and overall exposure for your business. That said, there are also ways to take some initiative yourself in getting some exposure from news search.

    Is news search part of your strategy? Real-time? Social Search? Press releases? Discuss here.

    News Search Optimization

    As Lisa Buyer of the Buyer Group talked about with WebProNews at SES last week, news search optimization is getting more powerful with social media and real-time search. Add these to older tactics like blogs and press releases, and there have never been more opportunities to get news-related content discovered.

    Press Releases

    Press releases can still be a great way to spread the word about any announcements your business might have. They can also drive traffic, particularly from search engines.

    Back in the summer, PRWeb shared a case study with us, involving a firm that typically sees a boost in search engine rankings and a 50% spike in web traffic after they issue a release. In fact, for one release in particular, the firm saw a spike of 400% on two different Web sites, and the firm doesn’t believe they were from the same users. They also incorporate social media tools like Twitter to extend the "shelf life" of press releases, and say that drives additional traffic.

    "When we included a link to our press releases on Twitter and other social media networks, we saw these both expanded the scope of distribution and extended the longevity of the announcement," the CEO of the company behind the case study said. "With other news releases we saw an initial spike in Web site traffic on the first two days and then it dropped off. With these features we’ve seen increases in traffic up to five days after the news release was issued."

    Remember, Google News indexes press releases as well.

    Real-Time Search

    You’re probably already using social media in some capacity at this point. Real-time search adds a benefit to talking about timely topics through channels you already use (Facebook, MySpace, Twitter, blogs, etc.). That doesn’t mean spam. Spamming won’t get you very far here anyway, because Google is pretty good at filtering it. We went over a few basic tips for real-time search optimization a while back. The recap is below, but you can find them elaborated on here.

    1. Use keywords
    2. Talk about timely events
    3. Have a lot of followers (who can share your content)
    4. Promote Conversation
    5. Include Calls to Engagement

    Real-time search is much more than just Google. There are an increasing number of players in this space, and with the rise in smartphone usage, mobile apps are giving consumers a lot of choices in how to obtain their information.

    Social Search

    Another great benefit of using social media is that you get to show up in your friends’/followers’ personalized social search results for numerous queries on Google. Newsy topics are frequently the ones that trend, and that means lots of people searching. If something big happens, there’s a chance that some of your social network contacts will search for something related to that, and if you have something to say about it, there’s a good chance they’ll see it in their results.

    Of course people search with the social networks themselves as well. Facebook search queries were on the rise last time I checked.

    Google News

    Last September, we ran down a number of Google News SEO tips here. Google shared some tips of their own on the subject as well.

    Optimizing for news search means more shots at showing up in search results, period. Do you have other ideas about getting in front of news seekers? Share here.

  • Longtail SEO For Ecommerce

    The significance of longtail keywords can be exemplified by thinking about the following two people:


    Bill is a cafeteria worker who spends his spare time fishing and has heard that his favorite TV shows will look even better on this new-fangled technology called “HDTV”. His friends tell him he might as well upgrade from his 20” to something a little larger while he’s at it (though they don’t know much more about it than he does). He sits at his computer and enters “hdtv” into the Google search box.

    Steve also works in a cafeteria but is a bit more tech-savvy. He has and uses a Facebook account, watches videos on YouTube and looks up information on Google when he’s looking for an answer to one of his questions. He too is interested in HDTV but decides to check out a few review sites first before making the leap. He reads a great review on CNET, likes the specs of the “Panasonic Viera TC-P50G10” and decides to look around for pricing. He heads back to Google and searches for “panasonic viera tc-p50g10” or perhaps even “buy panasonic viera tc-p50g10 online”.

    The difference between these two? Other than the fact that one has a dismal likelihood of conversion and the other a high likelihood – the difficulty in attaining top rankings for the two phrases is very different as well. Now, I’m not saying there isn’t a place for going after the generic, high-traffic phrases, but ignoring the higher-converting, less-work-per-conversion phrases that are easier to attain rankings for – well – that just doesn’t make good business sense, does it?

    So – how do you rank for the longtail?

    We all understand that the factors of SEO are the factors of SEO. Just like any other phrases – your ability to rank is quite simply based on a combination of page strength and relevancy (yes there are tons of signals Google uses but they essentially break down to these two points). To affect these areas we use a combination of onsite optimization and link building. Sounds easy so far? Perfect. So let’s take a look first at onsite optimization.

    Optimizing your site for the longtail

    I can’t possibly cover all the different technologies and how to make sure your site is crawlable here. Let’s just say the first step is to make sure that the crawlers can get to your internal pages and that strength passes down. If the crawlers can’t get through to the internal pages, then you’ve got bigger problems than tweaking your content and building some links. Contact a developer immediately and get that sorted out first – then continue reading.
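    As a quick sanity check on the robots.txt side of crawlability, here is a minimal sketch using Python’s standard-library robot parser. The sample rules and URLs are hypothetical – swap in your own site’s robots.txt and key internal pages:

    ```python
    from urllib.robotparser import RobotFileParser

    # A sample robots.txt (hypothetical) -- in practice, fetch your site's own.
    robots_txt = """\
    User-agent: *
    Disallow: /cart/
    Disallow: /search
    """

    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())

    # Hypothetical internal pages worth verifying -- category and product URLs.
    pages = [
        "https://www.example.com/category/netbooks",
        "https://www.example.com/cart/checkout",
    ]
    for url in pages:
        ok = rp.can_fetch("Googlebot", url)
        print(("crawlable " if ok else "BLOCKED   ") + url)
    ```

    This only checks robots.txt directives; a real crawlability audit would also look at internal link paths, meta robots tags, and whether strength-passing links actually reach the deep pages.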

    Once you know that the crawlers are getting through and strength is passing, we move on to the actual optimization. The first thing one wants to look at is how to push the items with the highest ROI potential up in the hierarchy of your site. Let’s use Amazon as an example of how that should be done (they know a thing or two about ranking for products).

    Amazon uses one of my personal favorite tactics in that they automate the process, though automation isn’t strictly necessary. You probably don’t have the same number of products, so you can likely do manually what they have to automate, but let’s look at what they’re doing and you can apply the strategy as you see fit.

    If I were Amazon and I wanted to rank my site for longtail phrases, I’d want to rank for the phrases that had the highest search volume and highest chances of conversion. I’d have to apply global rules to a massive site (you don’t have to – you can likely do things on a case-by-case basis, but I’m sure we can all agree Amazon cannot). So to keep the most profitable phrases high in the hierarchy without ignoring the other longtail phrases, they have created a hierarchy that puts the top product categories one hop from the homepage (Laptops & Netbooks, for example), and on that page they have links to all the major brands and uses. My favorite tactic, though, is the bestsellers list. This information is easily created from their database and ensures that the more popular products are two hops from the homepage and linked to with the brand and model number. At the time of this writing they have a link to the “ASUS Eee PC Seashell 1005PE-MU17-BK 10.1-Inch…”. If I search “asus eee pc 1005pe-m”, who do you think shows up first? Amazon.

    So step one – make sure you’re linking to the product pages with the brand and model number of the item, and put the more important items higher in the hierarchy of your site. Now this doesn’t mean cramming all your products onto one page. You have to apply the same principles to onsite links as you do with offsite optimization. A page has a vote. If you have a page with 10 products listed on it, each product gets 1/10 of the weight passed to it. If the page has 500 products listed on it – well, you get my point. Figure out what matters and focus there.
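    The arithmetic here is simple enough to sketch. This is a deliberate simplification – real link weight involves damping factors and many other signals – but it illustrates the even-split principle behind keeping your product lists focused:

    ```python
    # Simplified sketch of how a page's "vote" is divided among the products
    # it links to -- an even split, ignoring damping and other real-world factors.
    def weight_per_link(page_strength: float, num_links: int) -> float:
        """Each linked product receives an equal share of the page's strength."""
        if num_links == 0:
            return 0.0
        return page_strength / num_links

    # A page listing 10 products passes 10x more weight per product
    # than the same page listing 100 products.
    print(weight_per_link(1.0, 10))   # 0.1 per product
    print(weight_per_link(1.0, 100))  # 0.01 per product
    ```

    The takeaway: every extra link on a category page dilutes what each product page receives, which is why curated bestseller lists beat dumping the whole catalog onto one page.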

    Of course – you don’t want to ignore the other potential phrases. You’ll notice that as well as linking to the top products in each category they link to sub pages with brands, specs, etc. This is why they rank so well for so many phrases. Well – that’s part one.

    Once you’ve got the internal linking sorted out you need to follow that up with some onsite relevancy. Here we’re referring to optimized titles, descriptions, H1 tags, content, etc. I’m going to have to leave a full breakdown of onsite optimization for another article but I can discuss some of the differences you’ll encounter with longtail optimization with ecommerce sites.

    With “traditional” optimization we visit a page and adjust the relevant aspects (titles, content, etc.) manually. With large ecommerce sites we need to come up with rules that apply site-wide. Developing titles, descriptions and content for each and every page one by one is likely not an option. If you look at Amazon again you’ll see that they automate the process by using the brand, model and categories in the title, description, keywords and H1 tag. Easily automated. Through their use of automated elements (“Customers bought with…”, specs, descriptions, reviews, etc.) they are also able to ensure that the brand and model number appear on the page.
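    A rule set like that is straightforward to automate. Here’s a minimal sketch – the field names and store name are hypothetical placeholders, not anything Amazon actually uses – that builds the title, H1 and meta description from a product record:

    ```python
    # Minimal sketch: generate title, H1 and description tags site-wide from
    # product data. Field names and "Example Store" are hypothetical -- adapt
    # the template rules to your own catalog schema.
    def build_tags(product: dict) -> dict:
        name = f"{product['brand']} {product['model']}"
        return {
            "title": f"{name} | {product['category']} | Example Store",
            "h1": name,
            "description": f"Buy the {name} online. {product['category']} at Example Store.",
        }

    tags = build_tags({
        "brand": "Panasonic",
        "model": "Viera TC-P50G10",
        "category": "Plasma HDTVs",
    })
    print(tags["title"])  # Panasonic Viera TC-P50G10 | Plasma HDTVs | Example Store
    ```

    Once a rule like this is in your templates, every product page automatically carries the brand and model number in the places that matter, with no page-by-page editing.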

    Now that works well for Amazon. They have millions of links and huge site strength. But what if you don’t have that behind you? They can build a page, put it on their site and rank. You may need to invest some of your time in link building.

    Link building for longtail optimization

    There are two primary aspects of link building that one needs to address when we’re looking at longtail optimization. The first is to the homepage for site strength and the other is to specific internal pages. The reason that we’ll want to link to specific internal pages is that like it or not, you’re not as strong as Amazon and so you need to build links to compete where they do not.

    I’ll leave the discussion of how to build links to other articles (you know – one of the 800,000 written on the subject) however we will discuss the purpose of the links and thus you’ll understand the pattern of the link building.

    The homepage links are in place to simply build overall site strength and should be geared to your generic, homepage phrases – it’s the internal links that are specifically geared to brands and models. So we’ll focus on those links in this article.

    How to build links to internal pages

    Building links to internal pages is virtually identical to building links to the homepage. True, you can’t use directories, but that’s about the only link building tactic that doesn’t apply. There are two points that you’re going to want to direct links to:

    1 – the category/brand main page.

    The first point you’re going to want to direct links at is the main category page and the main sub-category points of the ecommerce site. You’ll want to direct these links in with anchor text that suits the brand and/or category subject. Let’s use Amazon as an example again.

    For the purpose of longtail optimization, the links we’d direct to http://www.amazon.com/Netbooks-Computers/b?ie=UTF8&node=679517011 would primarily be geared to strengthening the page. Sure, I’d use anchor text geared to “netbooks” for the link, but the main point is to make that page stronger and, in turn, the pages it links to. These links will also get the page spidered more often.

    What this will do is make the links to the brands stronger, but most importantly, the links to the top sellers stronger and more quickly picked up. This is why they rank for new products in a matter of hours.

    The individual brand and usage pages are the same from this perspective. You’ll want to optimize the pages and focus the links for long-term gain, but the short-term purpose is to pass strength to the product pages.

    2 – the product pages.

    On top of building links to pages one level up (as we’ve just discussed) you’ll also want to build links to the individual product pages. Amazon can build a page, link to it and have it rank – you probably cannot. For products and models you know will stand the test of time – building links can be a long term strategy but not my favorite (due mainly to the fact that it’s not exciting). Personally I like building links to “Coming soon” product pages and getting them spidered before there’s any competition and then adding in the product the day it launches giving you a one-up over your competitors in both timing and strength. Heck, you might even win out over Amazon for a while. 🙂

    Don’t overdo it in the link building. You’ve got a lot of products. Unless you know a specific product is going to be HUGE you’ll want to just build a few links and move on. You’ve got a lot of products to cover.

    Moving forward

    Obviously I can’t cover all the various aspects of ranking for the longtail in a single 1,800-word article – in fact, if I turned this into a 180-page book I still couldn’t cover all the variables – but my hope is that I’ve given you food for thought on the tactics and timing you’ll find helpful in moving forward and ranking your website for the longtail phrases that convert so well, and for which you can rank so quickly if you do it right.

  • Examine Your Site’s Text, Reduce Chances of Search Engine Confusion

    Has it ever occurred to you that you may have keywords on your site that are misleading to search engines? Take a look at all of the keywords you are trying to rank for and think about the different meanings and contexts those could be taken in that are unrelated to your actual product. Then eliminate other seemingly unrelated words that a search engine could misconstrue as an indication of one of those other contexts.

    At SMX West last week, WebProNews sat down with Bruce Clay of Internet Marketing firm Bruce Clay, Inc. who made some interesting points about understanding searcher behavior, intent-based search, and how that should affect keyword research.

    Note: We talked to Bruce about quite a few search-related topics, but this subject is focused on more toward the end of the video (about 20 minutes in).

    Clay talks about Google delivering more personalization in search results, taking into consideration things like how prior queries influence future queries. "Ranking is going to be less of a measurement," he says. "We’re going to be focused on more the traffic."

    "When I decide I’m selling a hammer, I have to actively go out of my way not to have certain things appear in my site, because the search engines could be confused about what I’m talking about….I don’t mean the Armand Hammer Art Museum at UCLA. I don’t mean a bowling ball…you know, the things that show up for hammer are all over the board," says Clay.

    "One of the things that I think is important, and that we’ve been working on is how do we actually do keyword research without knowing the behavioral aspects our personas that are actually going for our product? You have to understand personas now a little bit better – what kinds of things are they likely to search on, in sequence – before they type in hammer…so if they’re on an arts and crafts site, and then they type in hammer, I ought to understand that behavior in sequence, so that I can better do my keyword research and determine how I’m gonna put the words on my page. I don’t see a lot of people even thinking that way."

    Personalized search is nothing new. Google’s been personalizing search results for some time, based on various indicators, and it appears that Google is looking for more ways to deliver users a personalized experience (whether they want that or not).

    Between personalized search and other sources of information infiltrating search results pages, traditional SEO is becoming harder to accomplish, and Bruce says, even ineffective. That’s why it may become increasingly important to focus on relevant elements of the SERP for queries you hope to be found for.

  • As SERPs Get More Complicated, Focus on Relevant Elements

    At SES Chicago last year, Yahoo VP of Consumer Products, Larry Cornett suggested that blended search results bring businesses a broader range of SEO opportunities, a chance to take control of their brand, and a potential increase in qualified clicks. While these blended results can tend to divert users away from organic listings, as SEO Dave Naylor pointed out at that same conference, Cornett does have a point.

    Blended search results offer ways to get to the front page of search results beyond just the more highly competitive organic rankings. Sites have opportunities to show up for:

    – real-time results
    – news results
    – image results
    – video results
    – shopping results
    – local results (customers don’t even need to go to your site in some cases)

    At the recent Online Marketing Summit in San Diego, WebProNews spoke with Conductor CEO Seth Besmertnik, who says companies should still build a foundation in organic rankings before trying to conquer other areas.

    That said, you can break these different elements of blended results down one by one, and look at ways to have your site perform well in each particular one. Here are tips for image search optimization, for example. Here are some for video. Here are some for real-time search. Here are some for news search.

    Back to Cornett’s point about qualified clicks – focus on what makes the most sense for your site. Is focusing on real-time search worth your time? With Google, at least, even if you show up here, your presence will quickly give way to the next in line, and you will soon be off the page (although there still may be times when it makes sense to be seen here).

    If you don’t have quality video content, video search optimization is not bound to be a very practical use of your time. However, if you do have some good stuff, perhaps you should be heavily focused in this area. I think you get the point.

    Of course there are plenty of other factors of today’s search results page that drive users away from the "ten blue links" of organic results. It’s not just the blended search elements discussed above. You’ve also got search suggestions, related search links, location, mobile use, paid listings, search options, and various other elements of the user experience that compete for user attention. This is one reason why the lines between search marketing and other types of marketing continue to blur (consider that users of Google or Yahoo can customize their home pages to accommodate many of their favorite sites, making those just a click away).

    Still, that foundation in natural search that Besmertnik mentioned is definitely a big part of the overall picture. I suggest taking advantage of your listings here, and maximizing those, regardless of how well you rank. Things like site links and breadcrumbs come to mind.

  • Site Speed Tips for When Google Uses That as a Ranking Factor

    Last year, Google’s Matt Cutts dropped the bomb (to put it in the exaggerated tone that many took the news in), that Google was considering taking site speed into consideration as one of many potential ranking factors for search results.

    Is your site’s performance up to snuff? Comment here.

    This of course freaked a lot of people out, but as Matt and Google as a whole have maintained, this would not trump relevance. It would be taken more into consideration when there are two sites of relatively equal relevance, but one site loads faster and delivers a better user experience. Matt reiterated this point in an interview we did with him this week at SMX.

    WebProNews also chatted with Maile Ohye, Senior Developer Programs Engineer for Google at SMX, about website performance (speed), how that pertains to search rankings and the user experience, and some tips for making sure your site is up to speed, so to speak.


    As far as site speed as a ranking factor, Ohye pretty much makes the same point as Cutts, and it’s probably not going to be something where all of a sudden all of the faster sites are ranking better and the slower ones are doing worse. But it does enhance the user experience, and she refers to a study that found that an optimized site actually increased conversions by 16%. So if you’re not optimizing your site’s performance for Google, maybe that’s a good enough reason on its own.
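    If you want a rough baseline before digging into the tips in the video, here is a minimal sketch for timing a page’s server response. This measures only time-to-first-response, not full page render with assets, which real performance work would also cover:

    ```python
    # Minimal sketch: measure how long a URL takes to respond.
    # This captures server response time only -- render time, asset sizes,
    # compression, and caching all matter too and are not measured here.
    import time
    from urllib.request import urlopen

    def response_time(url: str, fetch=urlopen) -> float:
        """Return the seconds taken to fetch the URL.

        `fetch` is injectable so the timing logic can be exercised
        without network access.
        """
        start = time.perf_counter()
        fetch(url)
        return time.perf_counter() - start

    # Usage (requires network access):
    # print(f"{response_time('https://www.example.com/'):.3f}s")
    ```

    Running a check like this against your key pages over time gives you a simple trend line to watch while you apply the heavier optimizations.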

    Watch the video to get some specific advice regarding some simple adjustments you can make to your site that can make a big difference.

    If you’re one of those freaking out about getting your site performance optimized, you may feel better after hearing what she has to say, and realize that it might not be as big a deal as you thought.

    By the way, Cutts also mentioned that the speed thing is completely independent of Caffeine.

    Do you think site performance is a manageable attribute of your search engine marketing strategy? Discuss here.

  • SEO and Social Media Matter for Press Coverage

    When businesses think about search and social media, a great deal of the time, they are thinking about traffic, customer engagement, and brand awareness. While these are all good things to consider, there may be more to that last one than you have spent much time thinking about.

    Brand awareness goes beyond just having a random customer find your site in a set of search results or through a link from their Facebook news feed. Have you considered how channels like search and social media are used by media outlets and journalists? The fact of the matter is that journalists and bloggers alike utilize both to a great extent while covering their beats.

    Do you take press coverage into consideration? Comment here.

    Search and social both play significant roles in PR. This is a topic that WebProNews recently discussed with TopRank Online Marketing CEO Lee Odden. Odden calls journalists customers, and in many ways they should be treated as such when it comes to getting your product or site in front of their eyeballs.

    Odden says to look at what it is you can do as a marketer to make it easier for the journalist to do their job. Optimize your content for what a journalist is looking for. This is one way you can potentially increase your media coverage, which can obviously increase brand awareness.

    Odden makes a great point about online journalists often having tighter deadlines and turning to blogs and social networks for sources and quotes. For example, the real-time nature of a Twitter search might be just what a journalist or blogger needs to find someone who’s talking about the subject they’re writing about, at nearly the moment they’re looking for it.

    For that matter, Google’s real-time search can help for the same reason, and most journalists and bloggers frequently use Google to search for what they’re looking for. If what they’re looking for happens to be related to a newsy topic, they just might see Google’s real-time results literally before anything else. If that topic happens to be related to something you’re talking about, you just might end up in those results too. Google is also indexing updates from Facebook Pages here now, by the way.

    The point is, if you are looking for increased media coverage, there are ways to increase your chances of getting in front of the right people, and it is certainly not limited to real-time search. Sometimes journalists/bloggers will simply tap their contacts within their social networks (or email of course) to find sources. This is as good a reason as any to engage in social media on a regular basis and network with lots of relevant people.

    If attracting media attention is what you’re after, consider these five tips I offered in a SmallBusinessNewz article last year:

    1. Do something that’s different – Simply do something that makes you stand out: something that gets people talking. If it creates enough buzz, the media coverage will likely follow.

    2. Look for niche publications – the more niche the publication, the more likely they are to cover you.

    3. Personalize your message – When you’re writing an email to a publication to talk about your business, for example, personalize the message for the specific person you’re contacting, so they know it’s not just a manufactured piece that you’re sending all over the web. Journalists like exclusivity.

    4. Find multiple contacts – If you can find more than one contact for a particular publication, it may be wise to send your story pitch to them. This will increase the potential visibility among the publication’s staff.

    5. Provide plenty of details – When sending such a pitch, it’s a good idea to include as many details about the product/story as possible. The more details available, the less research is required, and time is more valuable than ever, especially for a journalist.

    Another piece of advice I would give is to not let your press center hold back your marketing opportunities. I’ve seen a lot of companies fail to keep their own press centers up to date with the latest news, even as big announcements are made, and even if they have issued press releases. Oftentimes, these releases won’t even be available on the site until later. If you want to increase your chances of more media coverage, you should always have your latest news readily available in your press center, or via your blog – wherever you make announcements. And always provide contact info.

    Share your tips for increasing press coverage.

  • Liveblogging: The State Of The Search Union (Google, Yahoo & Experts)

    Watch the Keynote live at live.dev.webpronews.com.

    At SMX West in Santa Clara, the State of the Search Union keynote is taking place today. It’s moderated by Chris Sherman, Executive Editor of Search Engine Land, and features SEL Contributing Editor Vanessa Fox, Google Analytics Evangelist Avinash Kaushik, Yahoo Director of Search Marketing David Roth, and Misty Locke, President, Range Online Media and Chief Strategy Officer of iProspect. The official description for this keynote says:

    We’ve just come through the most turbulent period in history for search marketers. Economic disruption, massive algorithm updates, the disappearance of a major player through consolidation with one of its former competitors… these events and others have reshaped the search landscape, creating both challenges and opportunities for search marketers. On this panel we’ve assembled some of the sharpest minds in search to discuss where things stand and where we’re going – you won’t want to miss the insights and recommendations from this group of super-savvy panelists.

    I will liveblog the event below, when it starts at 9:00am Pacific/12:00pm Eastern (please forgive typos):

    Liveblogging starts:

    12:00 EST: should be starting anytime now…

    12:03 People are taking the stage…getting set up with audio…

    12:04 Sherman: An interesting year in search. Often not a whole lot has happened, usually just Google, Google, Google. In the past year, we’ve seen more radical change than in the past 15 years or so. No sign of change letting up…

    12:05: A few questions. Key question: when we were here last year, we were in the early stages of an economic meltdown…everybody uncertain…what’s going to happen…search itself was still relatively young. What was going to happen to the industry? So now, how are we doing?

    Sound problems…Dave says as a search marketer, it gave opportunities to show stuff and shift strategies. Support business goals in a shifting landscape. Maybe used to optimize for ROI, now a different metric…shift back to SEO. Not just the paid side.

    Misty: ecommerce still did well in some areas. Some clients, due to search, had record-breaking months at times, even in the downturn. Some marketers utilizing different techniques, driving revenue and recurring customer loyalty. Combined search with other marketing channels…flexible companies saw growth.

    12:09: Vanessa: Super Bowl – with Pepsi, they decided to spend their money on social. Interesting that some companies think online is a better way to go… one thing from Super Bowl ads…so many large brands seem to only just now understand that search is important. Across the board, it was better than last year as far as big brands in search during the Super Bowl. A lot of work left though.

    12:10 Avinash: emboss your brand on somebody’s brain (branding)…search can do this. At the end of the day, when people want to run a branding campaign…what do you want out of it? One night stand? Long term relationship? Depending on what you want, search is a massively effective way to get to the right kind of people…

    12:13: Microsoft/Yahoo deal: Dave: since we got regulatory approval, the integration is on. Huge project. Lot of resources from both companies. Proof will be in the pudding. Advertisers start to migrate…I’m Yahoo the advertiser. We’re going to continue to innovate around the customer experience around search. Products that live outside of the index…

    12:14 Sherman: Animosity? Integration of cultures? Dave: just beginning. Large amount of resources at Yahoo to be moved over and work with MS. There will be a portion of Yahoo that stays at Yahoo. A lot remains to be seen. Clearance still too new…people hard at work. Everybody on the project understands that this is critical. It absolutely MUST work.

    12:15: Misty: clients excited about the deal. Allows a viable number 2. May not drastically change how they upload campaigns, but it does allow a shift in focus strategy for Bing…60/40 time split between Google and Bing…excited about volume…reach…

    12:16: Some changes in showing results will open up some new ways to utilize Bing to advertise. New customers…cashback is a big driver. Marketers in general have been slow to adopt…

    12:17: Sherman: 2 major players: shrinking? Couple of giants? No, 2 strong players. Market growing. Avinash: competition is a good thing. Gets everybody to innovate. Important to realize…prudent to have a portfolio strategy with acquisitions. Think about the content network, YouTube, search, DoubleClick. People get far too obsessed between Microsoft, Yahoo, and Google. You should have already had a very effective strategy across all search engines.

    12:18: Avinash says a blog post of his got way more traffic from Bing for the word “analytics” than from Google.

    12:19: You will find new customers and use dollars more effectively with the portfolio strategy.

    12:19 Vanessa: waiting to see how things shake out with integration. SearchMonkey? BOSS? Waiting to see how it works out…Yahoo did try to make a play for innovation. Don’t know how much they’ll be motivated…it will be great if they do. Reserving judgment.

    12:20 Vanessa: hopefully we won’t lose all the yahooness.

    12:21: Caffeine: rolled out after holidays? No. Still just one data center. Sherman talking to Vanessa: what impact is Caffeine going to have on SEO? Is Google going to continue in the spirit it always has to provide tools/insight? Vanessa: changes – social, real-time, local, etc. Things will ramp up more and more. Caffeine specifically – I don’t know that it’s going to impact SEO that much. Just back-end. On their side, a better way to crawl the web. Their hope is just to do it better. That’s a benefit for site owners. It’s not a rankings impact, except in more of an indirect way. I don’t think from an SEO perspective there’s much you need to do. I hope they keep reaching out. When I was there I loved being able to go out and see what people needed. Don’t see a reason why they would stop doing that. They have kept doing it.

    Avinash: if every googler woke up and for the entire day, they would answer questions for webmasters, it wouldn’t answer all questions. how can we help people at scale? webmaster tools. a number of tools that google puts out. just a few more releases for wm tools over the last six months. you’ll continue to see google keep putting out tools that allow this kind of self help at scale. help you make better decisions with search data. orgasmic about amount of data google has put out there in terms of your ability to make better decisions. love access you have to google’s organic search data…insights for search, ad planner…etc.
     
    12:26 with google ad planner you can look at certain demographics who have done a particular search and sell to them…target with very relevant display ads using search data. we’ll continue to put tools like this in your hands.

    12:27 Sherman: back to dave on microsoft/yahoo: Dave: yahoo’s staying committed in search. in sales side, very experienced in search and display. maintain high touch with big customers. small businesses and self service customers more directly managed on microsoft side.

    The goal is to work on the adCenter platform and make it the platform of choice on the Microsoft side as well as Yahoo.

    12:29 Sherman: Yahoo search in dna? Dave: rich data from search, now idea is what can we do with the data we have and the assets to create a better ad for the consumer…behavioral, targeting, etc. teams of people focused on new ad products…

    12:30: Sherman: social media – replacing search as the way people interact on the web? i don’t see it, but Facebook has huge stats. what’s happening with that and what can search marketers do? Vanessa: reporter last week said search is so old news, so why still do search? she said people are still searching and they’re searching more and more. and they’re going to keep doing so. doesn’t mean don’t think about social media, but it’s about audience. it’s not an either/or thing. misty agrees.

    12:32 Misty: with social you can do traditional things outside of search. boundaries are dissolving between different marketing strategies…

    12:33: Misty: social/real-time can drive search volume. marketers will find new ways…it’s a new beginning for search.

    12:34: Dave: big companies look for search marketers for expertise in this. Avinash: media loves all stories, facebook/google/twitter…world is all about one thing or the other…video killed the radio star…it’s not like that here. once said twitter was the dumbest thing on earth…now he uses it and thinks it’s the coolest thing since sliced bread. important to realize that as you think about different elements, you use them for what they’re good at. the worst strategy is the tv strategy…to shout at people…that’s why most big brands have pathetic number of followers…they’re not having conversations like Danny Sullivan.

    12:36: Sherman: Managing info overload? how to make advertising legitimate business? will search be absorbed? siloing? Avinash: what we do today is try to influence people…there are many ways now to do that…one emerging way is to have these conversations (is it going to survive)…single greatest reason for google’s success is relevance…advertisers (madmen style)…that way is dead.

    12:37: Avinash: when i work with some of the largest companies…the small ones will use search to get people to raise money, brand awareness…very broad range…Dave: Siloing? exactly the opposite. social media is finally…we used to talk about the promise of engaging with a consumer. social media is the first channel that’s accomplished this. it’s forcing the breaking down of the silos. who should own social media in your organization? Who owns the paper in your organization? It’s breaking down the silos.

    Misty: everybody wants to own a piece of that because it can be so influential. PR, marketing, brands, search all involved. all can communicate more effectively. in the end we’ll win, because we’re tapping into what the consumer is truly looking for. customers can tell stories for us…make the pieces of our brand that they love more accessible to them…

    12:41 Vanessa: we have to go the opposite way of silos. If you think about the data side, if all the areas of marketing can share data, there’s going to be so much more engagement…

    12:42: Sherman, people are engaging and being social…but we’re still in the young, naive days in terms of black hat use…unethical marketers using data? What’s gonna happen if gov. steps in and says privacy is an issue…we don’t understand the issues, but will legislate it anyway:

    12:43: Misty: thanks the gov. for paying attention to privacy (over healthcare, etc.). Avinash: talks about spam comments in egyptian tombs…spam has been a problem for a very long time and will continue to be…just try to use intelligent ways to suppress it as much as possible and provide incentives to do the right thing.

    12:44: Misty: yes there will always be spam….marketers are always going to find a way to use/exploit media…years ago it was different people screaming "you’re a black hat"…now it’s the users who are policing the good/bad. consumers can sniff out the authenticity. the hard part is being so authentic that you don’t get called out by the consumer…

    12:46: Dave: regulation – a fair degree of risk…double edged sword. potential to do unethical and criminal things…mentions recent product from Google that raised privacy concerns (doesn’t blame google necessarily)…not enough understanding in regulation to deal with it…some degree may be needed, but Capitol Hill’s understanding of the internet is frightening…education is required…fears that legislators aren’t up to speed.

    12:48: Sherman: shift from US perspective to global. what is the opportunity for search marketers to go global, and how do you deal with restrictions in other countries like China…Vanessa: you’ve always had to understand your audience where they are. it’s not just about geography/translating…understanding the culture. government stuff is a whole other set of issues…starts with understanding the market before you go into it.

    12:49: Avinash: there tend to be very sophisticated marketers in other countries. those that are doing web are sophisticated…just not enough of them. many countries extremely young. opportunities in these countries…like vanessa’s point…you have to truly engage and understand the market. Misty: some other countries are doing more cool and unique things in social….

    12:52:  Dave: a lot to be learned from other countries. some are leapfrogging…some not even using email, but going straight to Facebook, etc.

    12:52: Mobile is here, but maybe it’s not what we thought it was gonna be….we’re about to see things change very drastically…iphone was a game changer…heading very quickly in a direction we didn’t anticipate a few years ago…

    12:53 sherman: changes for search marketers? Avinash: story about being with his kids wanting to see something and using his nexus one….used google search and transcribed what he said through voice search, used location, gave him driving directions fast….i wonder as SEOs/marketers, if we’ve thought of this as the use case. sites optimized so they can do these things….for their business….not just on google….i really think…i have to rethink my search strategy…not even a fragment of marketers in search business are thinking about that as search. encourages marketers to think of mobile like that…not just a WAP version of a web page.

    Misty: are all your local listings up to date, ready for navigation, products easy to access? then think of advertising things…usability of site in mobile…

    12:56: Vanessa: Avinash is right. ubiquity in mobile opens door for a lot of new opportunities. it’s not that new ways of searching will replace google, but it’s just presenting different ways. people don’t think about using some things as searching, but it is….urbanspoon, etc.  Look at where the new opportunities are.

    12:57 Sherman and audience thanks panel. It’s over.

  • Google SEO Report Card Scores Company’s Own SEO Efforts

    Google is looking to improve upon its own internal SEO efforts. The company has created what it calls an "SEO Report Card," designed to improve the user experience and visibility of some of its own properties. The company says it aims to identify potential areas for improvement in Google’s product pages, which could help users find them more easily in search engines, and fix bugs that annoy visitors and hurt the pages’ performance in search engines.

    Google is making this report card publicly available, though, which means other businesses and webmasters can study it themselves and use what they learn to improve their own sites. It may come as a surprise to some, but Google appears to have a great deal of room for improvement when it comes to search engine optimization, the irony of course being that Google operates the world’s most dominant search engine.

    "Simple steps such as fixing 404s and broken links, simplifying URL choice, and providing easier-to-understand titles and snippets for our pages can benefit both users and search engines," says Google’s Search Quality team. "From the start of the project we also wanted to release the report card publicly so other companies and webmasters could learn from the report, which is filled with dozens of examples taken straight from our products’ pages."

    Here’s a quick look at their scoring:

    Google's SEO Report Card shows Google search engine optimization efforts

    The whole document is about 50 pages (though much of that is graphical), and is available to download in PDF format. Google began by reviewing the main pages of 100 of its different products across a number of common SEO topics, and says it will go deeper into the sites in future versions of the report card.
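    The sort of fixes the report card calls out, such as 404s and broken links, are straightforward to audit programmatically. As a rough illustration only (this is not Google's tooling, and the sample page and status codes are hypothetical), a minimal Python sketch might extract anchor targets from a page and flag 4xx/5xx responses as broken:

    ```python
    from html.parser import HTMLParser

    class LinkExtractor(HTMLParser):
        """Collect href targets from anchor tags on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def extract_links(html):
        parser = LinkExtractor()
        parser.feed(html)
        return parser.links

    def is_broken(status_code):
        # For audit purposes, treat any 4xx/5xx response (404s included) as broken.
        return status_code >= 400

    # Hypothetical page fragment, for illustration only.
    page = '<p><a href="/pricing">Pricing</a> and <a href="/old-page">docs</a></p>'
    print(extract_links(page))  # -> ['/pricing', '/old-page']
    print(is_broken(404))       # -> True
    ```

    A real audit would fetch each extracted URL and record its response code; the sketch just separates the two steps.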

    What do you think about Google’s SEO scores? Do you find the information within the report card helpful?

  • SEO and Quality Key to Competing in the Long Tail

    A while back, WebProNews had a conversation with RateItAll President Lawrence Coburn about how the long tail of search is getting more competitive. Companies like AOL and Demand Media are working on dominating long tail searches with content across a broad scope of article subject matter. We had another conversation with another company that is doing this, called Suite101, which is placing an increased amount of emphasis on SEO to up the competition in this space even more. Suite101 President and CEO Peter Berger took a break from Olympics mania in Vancouver (home of the company’s headquarters) to tell us about it.

    Peter Berger, CEO of Suite101, talks about SEO, quality, and the long tail. "Making sure well-written articles get found online involves continuous hard work and search engine knowledge," says Berger. "We know that in order to help our writers get their stories found, we need to increase our expertise in the area of search." That’s why the company just hired search strategist Aaron Bradley as its SEO Director to implement new SEO tactics across its articles.

    Berger tells WebProNews Suite101 attracts over 25 million unique monthly visitors. The company’s revenue comes from advertising – mainly AdSense, though other networks have been integrated as well. They don’t charge writers fees, but they have a strict submission process: only 20% of writers are accepted, and applicants are required to submit work samples and resumes. A writer’s first article must be approved before it goes live; after that, articles go live immediately and are then reviewed by editors.

    Berger says "quality is key," and is the reason he doesn’t seem too worried about competition from big name brands like AOL. That, and he says most writers want to write for numerous publications, so even if a writer does work for AOL, there’s a good chance they’ll submit to Suite101 as well.

    Presumably Berger is hoping the hiring of Bradley will help with the competition in terms of search engine traffic, the company’s biggest traffic source (though they do see spikes from social media as well). One writer for Suite101 achieved a monthly earnings record of $5,000 for articles published at the site, which splits revenue with its writers. It will be interesting to see how quickly that record is surpassed with the company’s new SEO efforts.

    Naturally, the more quality articles the site is able to obtain, the more content it will have out there in the search engines, and if their SEO efforts are as effective as they hope, they will be getting a lot more eyeballs and clicks on their ads. Berger thinks writers like Suite101 because it’s the "closest" they can get to "actual professional editors" in a lot of cases. Quality, he says, is the "key differentiator" between Suite101 and its competitors.

    There has been a lot of talk about how SEO practices can hinder quality, because you should write for people, and not search engines. Berger thinks they can achieve both.

  • Will Bing Powering Yahoo Make SEO Easier?

    There is an interesting discussion going on in our WebProWorld forum about search engine optimization post Microsoft-Yahoo deal. For those unfamiliar with the topic, Microsoft and Yahoo recently gained regulatory approval on a search and advertising deal announced last year, which will see Yahoo using Bing’s algorithm in its search results. The discussion is about whether or not this means businesses and webmasters will only have to worry about optimizing for 2 search engines (Google/Bing) rather than 3 (Google, Yahoo, and Bing).

    Will you focus your efforts more heavily on Bing? Discuss.

    What Bing Coming to Yahoo Means

    It’s important to note that Microsoft and Yahoo still have plenty of details to work out before anyone knows just how the product of this deal will function. We know that Bing will be used in the back-end of searches on Yahoo, but we don’t know what other elements Yahoo will still be incorporating into the search experience. For example, Yahoo said last week that the companies will still be discussing how SearchMonkey and BOSS figure into the mix.

    Optimizing for Yahoo is not going to be limited to showing up in Bing’s results. That’s not to say that showing up in Bing’s results won’t have its advantages for Yahoo search, but there is a lot more going on at Yahoo than that. The company has been stressing that it is still very much focused on search, and under the deal with Microsoft, Yahoo will still be controlling the user experience at Yahoo.com.

    Right now, Yahoo.com has plenty of elements to consider, from news and trending topics, to a whole slew of "applications" that users can customize on their Yahoo homepage. Among these are Facebook and Flickr. If you want to get in front of Yahoo users, it’s not limited to Yahoo search results. That said, Yahoo search results also have their own thing going on. Keep an eye on the box that appears under the search box after you enter a query. It contains related queries, and "related concepts". This is one area that could conceivably be independent from Bing (although that remains to be seen at this point). Yahoo is not shy about putting brands in these "related concepts" either. You can find WebProNews in there for a query like "ebusiness news".

    eBusiness News suggestions on Yahoo

    The point is, Yahoo has made it clear that it will continue to control the user experience, and that means there should be plenty of areas within Yahoo that are out of Bing’s control. This leads me to presume that Yahoo will not be something you’ll want to ignore, just because Bing is integrated into it. Remember that at this point, Yahoo controls a much greater percentage of the search market than Bing.

    All of that said, you may want to pay closer attention to your Bing rankings if you haven’t done so in the past, because while Yahoo will still be Yahoo to its users, the deal also means there will be significantly more eyeballs on what Bing determines to be the most relevant results to searches.

    Why Stop at Google, Yahoo, and Bing?

    These may be the biggest three search engines in terms of market share in the United States, but there are still plenty of people using others. For one thing, YouTube is number 2. Not Yahoo or Bing. If you are concerned about simply being found where people are searching, you should have a YouTube presence. That of course means having a video strategy, but that doesn’t necessarily mean you have to have a huge video budget.

    There are still people using Ask as well. In search industry coverage, it often gets overshadowed by the others, but there are still a lot of people using it. In fact, the Ask Network’s market share grew by 6% from December to January. Ask.com’s market share grew by 1%. A lot of people search with AOL. AOL’s search is powered by Google, but it doesn’t always return the same results as Google.

    Search Query Report

    Facebook’s search market share grew by 13% in that same period of time. You may not think about Facebook for search as much, but people are spending more and more time on Facebook, and it stands to reason that they’ll be conducting more and more searches from Facebook. Granted, Facebook’s web search feature is powered by Bing, but that’s only a piece of the Facebook Search puzzle. If you don’t have a Facebook strategy, you may be missing out on a lot more searches. By the way, did you know that Facebook recently passed Yahoo as the 2nd most visited site (just under Google)?

    These are just a few examples. People are searching from a lot more places. Rather than just optimizing for Google, Yahoo, and Bing, perhaps you should think about all of the places where your site/business would make sense when a user searches (consider niche sites as well).

    Does the Yahoo/Bing deal make optimization easier? Weigh in with your thoughts.

  • Links Not Always the Best Indicator of Relevance

    In a recent video uploaded to Google’s Webmaster Central YouTube channel, Matt Cutts talks about creating tags and categories on blogs for SEO purposes. Rather, he discusses how there’s not much point in creating them for this reason.

    On average, how many tags do you include with your articles/blog posts? Let us know.

    "Google is pretty good at saying, ‘You know what? The first time you say a phrase, it’s interesting, and the second time you say a phrase, it’s still a little bit useful,’" says Cutts. "After a while, we sort of realized, ‘okay, you’ve said that phrase, you don’t have to keep repeating it 8, 9, 10 different times.’ So there are certainly some blogs (including some really popular blogs) who have like an entire paragraph full of tags. And they have clearly spent a lot of time, almost as many, you know, minutes writing tags out as they have the actual content of the post. And I always laugh at that because it’s not really that needed."

    He notes that a lot of the time, the tags are already words that are used in the post, so it won’t make that much difference.

    Matt appears to be discussing how much the tags will benefit the page the actual content appears on. However, he doesn’t really go into the pages that contain listings of the articles within those tags, at least in relation to SEO (he does point out that tag pages can be useful because they can provide a feed for just that category). This is probably because they don’t do particularly well in search engines either, which could be because they aren’t linked to particularly often.

    Google is all about providing users with the most relevant results for the best user experience. The fact that these kinds of pages aren’t often featured near the top of results could be considered an area where Google isn’t necessarily delivering the best results.

    For example, if I wanted to find all WebProNews SEO articles, there is no better place than our tag page for "SEO" at dev.webpronews.com/tag/seo. There, any user looking for WebProNews SEO articles would find all of them arranged by date. If I wanted to see all of the Facebook articles Mashable has, I can do that by going to mashable.com/tag/facebook. Yet neither of these pages is returned anywhere near the top for queries like "webpronews SEO articles" or "mashable facebook articles", at least in the results I get (they can vary from user to user). Instead, you might find individual articles and results from other sites, with what I would consider to be the most relevant pages nowhere in sight.

    Links are only one of the many factors Google takes into consideration for its rankings, but they are commonly known to be one of the biggest. These tag pages simply highlight the fact that links may not always be the best indicator of relevance.

    Note: Our SEO tag page is crawled, and is even featured as one of our "site links" seen by searching for "WebProNews" on Google.

    Would you consider there to be a more relevant result for a query like those mentioned above than such tag pages? Do you think Google’s algorithm could be improved in this area? Are links always the best indicator of relevance? Share your thoughts.

  • Where Does Yahoo Fit Into Your Search Strategy?

    In search, a lot of what Yahoo has done has been overshadowed by what Google and Bing have done, simply because Google controls such a huge piece of the search pie, and Bing is still a relatively fresh entity. All eyes are still on Bing as it grows. That leaves Yahoo somewhere in the middle, where it technically sits in terms of market share.

    How important is Yahoo to your search strategy? Let us know.

    Yahoo has done quite a bit over the past six months, and has a lot more going on in the coming ones. Regardless of whether or not Yahoo’s deal with Microsoft finally goes through, and Bing takes over the algorithm side of things, Yahoo is still very much focused on search.

    "Yahoo has been in search, is in search, and will continue to be in the future," says Yahoo’s new senior VP of search products, Shashi Seth. "We’ll continue to drive innovation. It’s our stake in the ground."

    According to the latest data from Experian Hitwise, Yahoo’s market share in the U.S. declined by 2 percentage points from December to January as Bing and even Ask grew by 5% and 4% respectively. Regardless of this data, there are still plenty of people using Yahoo, and that means businesses shouldn’t ignore it. In fact, businesses should do all they can to understand the audience they are reaching with each individual search engine.

    An interesting study from Wunderman, ZAAZ, and Compete suggests that the demographic and psychographic profile of each loyal search engine user is different. Bing users, for example, tend to be mostly from the tip of the adoption curve (innovators and early adopters), where Yahoo and Google’s users tend to be middle majority, according to the report from these firms.

    Search-Engines

    The point is that it is easy to get wrapped up in specific search engines, but Yahoo is still a key player, and it is worth paying attention to everything it is doing to improve its users’ experience. You might find ways to reach Yahoo users that are slightly different from the ways you might try to reach Google or Bing users.

    What Yahoo Has Done Lately

    In September, Yahoo launched a completely new version of Yahoo Search. In addition to being faster, this new version included things like:

    – SearchMonkey structured data, which enables richer results from an increasing number of sites

    – Search Scan and Safe Search, which help protect users from viruses, spyware and spam

    – Search Pad, which lets users take notes for research as they search

    – Query assistance, which has been extended in the left-hand column to let users browse concepts related to their queries

    – Image and video search refiners

    Yahoo has expanded its coverage for enhanced results to formats like video, documents, games, products, local businesses, events, discussions, and news. In November, Yahoo extended its Search Assist features from the web search box to the search box on every Yahoo property. They also began including photos, videos, and tweets about news stories in search results.

    "We’re focused on making it easier to search for local businesses," says Larry Cornett, Vice President, Consumer Products, Yahoo Search. "Starting in December 2009, we display more Yahoo! local business shortcuts when you search for a business, even if you don’t include your location in your query. We also began providing new functionality directly within the local shortcut to refine results by neighborhood or nearby city right on the search results page. This further enhances an already great shortcut that provides more of the information you care about most directly on the search results page; including ratings, reviews, photos, and directions." (emphasis added)

    In December, Yahoo started integrating tweets in the form of a shortcut from search results pages (separate from the news tweets) when users search for "buzzy" topics. Finally, Yahoo added more entertainment refiners within its image/video search products.

    What Yahoo Will Be Doing

    One new innovation that Yahoo unveiled this week is called "Sketch-a-Search". It’s a mobile app that lets users pull up a map and use their fingers to search by tracing a line around the area they want to search. An image of it can be seen here.

    As far as advertising goes, Yahoo says it’s focused on three key areas: better value, transparency and control, and innovation. The company says it is making pricing adjustments, and will allow advertisers to pay different rates for different traffic sources across Yahoo’s network. The company said it knows most people focus most of their campaigns on Google, so it has created an import campaign tool and a new desktop tool for Yahoo Search Marketing, which will be available next month. They are putting ads into Search Assist, and they’re doing re-targeting of ads based on users’ search history.

    Do you like the direction Yahoo’s headed in? Do you use Yahoo for search? How important is it to your marketing strategy? Comment here.

  • Ways to Get Fresh Links to Old Content for Better Search Rankings

    You may have gotten some good links in the past, but don’t count on them helping you forever. Old links go stale in the eyes of Google.

    Do you still get links to old content? Tell us why you think that is.

    Google’s Matt Cutts responded to a user-submitted question asking if Google removes PageRank coming from links on pages that no longer exist (for example, GeoCities pages that have been shut down). The answer to this question is unsurprisingly yes, but Cutts makes a statement within his response that may not be so obvious to everybody.

    "In order to prevent things from becoming stale, we tend to use the current link graph, rather than a link graph of all of time," he says. (Emphasis added)

    Now, this isn’t exactly news, and to the seasoned search professional, probably not much of a revelation. However, to the average business owner looking to improve search engine performance (and not necessarily adapting to the ever-changing ways of SEO), it could be something that really hasn’t resonated. Businesses have always been told about the power of links, but even if you got a lot of significant links a year or two ago, that doesn’t mean your content will continue to perform well based on that.  WebProNews has discussed the value of "link velocity" and Google’s need for freshness in the past:

    Link velocity refers to the speed at which new links to a webpage are formed, and by this term we may gain some new and vital insight. Historically, great bursts of new links to a specific page have been considered a red flag, the quickest way to identify a spammer trying to manipulate the results by creating the appearance of user trust. This led to Google’s famous assaults on link farms and paid link directories.

    But the Web has changed, become more of a live Web than a static document Web. We have the advent of social bookmarking, embedded videos, links, buttons, and badges, social networks, real-time networks like Twitter and Friendfeed. Certainly the age of a website is still an indication of success and trustworthiness, but in an environment of live, real time updating, the age of a link as well as the slowing velocity of incoming links may be indicators of stale content in a world that values freshness.
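    The excerpt defines link velocity as the rate at which new links to a page are formed. As a rough illustration only (no search engine publishes its actual formula, and the backlink dates below are hypothetical), a minimal Python sketch of the idea might count links discovered inside a trailing window:

    ```python
    from datetime import date

    def link_velocity(link_dates, window_days=30, today=None):
        """New links acquired per day over a trailing window of days."""
        today = today or date.today()
        recent = [d for d in link_dates if (today - d).days <= window_days]
        return len(recent) / window_days

    # Hypothetical discovery dates for backlinks to a single page.
    backlinks = [date(2010, 1, 5), date(2010, 2, 20), date(2010, 2, 25)]
    print(link_velocity(backlinks, window_days=30, today=date(2010, 3, 1)))
    # Only the two February links fall inside the 30-day window, so the
    # velocity works out to roughly 0.067 new links per day.
    ```

    A page whose velocity trends toward zero is, by the excerpt's argument, accumulating "stale" links relative to a page still being shared today.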

    Do you think link freshness should play a role in search engine rankings? Let us know.

    So how do you keep getting "fresh" links?

    If you want fresh links, there are a number of things you can do. For one, keep putting out content. Write content that has staying power. You can link to your old content when appropriate. Always promote the sharing of your content. Include buttons to make it easy for people to share your content on their social network of choice. You may want to make sure your old content is presented in the same template as your new content so it has the same sharing features. People still may find their way to that old content, and they may want to share it if encouraged.

    Go back over old content, and look for stuff that is still relevant. You can update stories with new posts adding a fresher take, linking to the original. Encourage readers to follow the link and read the original article, which they may then link to themselves.

    Leave commenting on for ongoing discussion. This can keep an old post relevant. Just because you wrote an article a year ago, does not mean that people will still not add to it, and sometimes people will link to articles based on comments that are left.

    Share old posts through social networks if they are still about relevant topics. You don’t want to just start flooding your Twitter account with tweets to all of your old content, but if you have an older article that is relevant to a current discussion, you may share it, as your take on the subject. A follower who has not seen it before, or perhaps has forgotten about it, may find it worth linking to themselves. Can you think of other ways to get more link value out of old content? 

    Do you get fresh links for old content? Why do you think that is? Share your thoughts.

     

    Related Articles:

    > How Google Rates Links from Facebook and Twitter

    > How Press Releases Can Be Great For Search

    > Link Building for Bing Rankings: Dos and Don’ts

  • How Many Spiders Does Google Have?

    Google has posted a short but interesting video to its Webmaster Central YouTube channel. A user asked the question, "How many bots/spiders does Google currently have crawling the web?" and Google’s Matt Cutts gave his answer.

    "It’s important to realize that it’s not really actual robots or actual spiders out there…instead, it’s banks of machines …at Google’s data centers who open up an HTTP connection and request a page and then get it back," he says. "So any bank of machines (even 50 machines) could easily be requesting a bunch of different content."

    "We try to refresh a large fraction of the web every few days," he adds. "So it turns out you really don’t need a ton of machines. Even a relatively small number of machines operating in parallel and fetching pages in parallel can crawl or find new pages on the web in a very quick way."

    Matt says that Google doesn’t give out the exact number, but that it’s somewhere between 25 and 1,000. I’m not sure what you can really do with that information, but it’s worth hearing a quick rundown of how it works for those who aren’t very familiar with how Google indexes content.
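    Cutts's point is that a modest pool of machines fetching in parallel covers a lot of ground. As a loose illustration (nothing like Google's actual infrastructure; the URLs are placeholders and the fetch function is a stand-in rather than a real HTTP client), a minimal Python sketch of parallel fetching might look like this:

    ```python
    from concurrent.futures import ThreadPoolExecutor

    def fetch(url):
        # Stand-in for an HTTP GET; a real crawler would open a connection
        # and request the page here, as Cutts describes.
        return f"<html>content of {url}</html>"

    def crawl(urls, workers=50):
        """Fetch many pages concurrently; even a modest pool of workers
        can cover a large list of URLs quickly."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            return dict(zip(urls, pool.map(fetch, urls)))

    pages = crawl([f"http://example.com/page{i}" for i in range(5)])
    print(len(pages))  # -> 5
    ```

    The same pattern scales from threads on one machine to, as Cutts puts it, "banks of machines" in a data center each requesting pages in parallel.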

    Related Articles:

    > Google Rolls Out Breadcrumb Display in SERPs

    > Google Makes it Easier to Tell Where Results Originate From

    > Get More Links in Your Actual Google Results

     

  • Google Sets Record Straight on Page Speed as Ranking Factor

    Late last year, in a conversation about the Caffeine update, Google’s Matt Cutts told WebProNews that page speed could become a factor Google looks at for ranking search results. His comments received a lot of attention, because Google has never taken this into consideration for ranking websites in the past. The notion that they would do so riled a lot of people up, because a lot of site owners out there simply don’t have incredibly fast sites. That could pose a big problem if it suddenly damages their search rankings.

    Do you count speed among the priorities for your site? Comment here.

    Despite the fact that Cutts never said page speed would become any more important a ranking factor than anything else, many around the web and the blogosphere jumped to conclusions. While many more have remained sensible about the concept, not expecting page speed to trump relevant content, Cutts has now provided a video setting the record straight. The video is a response to the following user-submitted question:

    Since we’re hearing a lot of talk about the implications of Page Speed, I wonder if Google still cares as much about relevancy? Or are recentness and page load time more important?

    Matt’s answer is simply, "No. Relevancy is the most important. If you have two sites that are equally relevant (same backlinks…everything else is the same), you’d probably prefer the one that’s a little bit faster, so page speed can be an interesting theory to try out for a factor in scoring different websites. But absolutely, relevance is the primary component, and we have over 200 signals in our scoring to try to return the most relevant, the most useful, the most accurate search result that we can find. That’s not going to change." (emphasis added)

    "If you can speed your site up, it’s really good for users, as well as potentially down the road, being good for search engines," he says. "So it’s something that people within Google have thought about."

    It is interesting that anyone would ever assume page speed would become more important than relevance to Google, just because Matt Cutts indicated that page speed may become one of the many factors Google uses. If it were more important than relevance, Google probably would have been placing emphasis on page speed for a long time.

    That said, it is worth pondering just how big a role page speed would play. If there are over 200 factors, on a scale of one to two hundred, where would Google rank the importance of page speed? That question might not be so easy to answer, particularly since Google isn’t really keen on the idea of giving away its secrets, and frankly, that’s probably in the best interest of the web.

    Just as with any other SEO tactic, it is up to individuals and the industry at large to speculate, analyze, and test. It’s no easy feat, but there are plenty of educated guesses out there about just what Google’s "over 200 ranking factors" are. Once you get into how much weight each one carries, it gets even more difficult to speculate.

    I think the real takeaway here is simply to make your site as fast and user-friendly as possible, within reason. If it means you have to spend less time producing relevant content that is likely to get you good search engine placement, then maybe it’s not worth it. However, if it means providing a better user experience on top of relevant content, and it’s within your means to do so, it will only have good implications for the future of your site.

    Google offers webmasters a lot of different tools to help them make their sites faster. In fact, they have a list of such tools here, and it doesn’t just contain Google tools. They also point to tools from third-party developers. It’s all part of Google’s initiative to "make the web faster."
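    Many of the common fixes such tools flag can be applied right in a page’s markup. Here is a generic sketch, not a Google-specific recommendation (the file names are hypothetical):

    ```html
    <head>
      <!-- Combine and minify stylesheets to cut down on HTTP requests -->
      <link rel="stylesheet" type="text/css" href="/css/site.min.css" />
    </head>
    <body>
      <!-- Declaring dimensions lets the browser lay out the page before images load -->
      <img src="/images/logo.png" width="200" height="60" alt="Logo" />
      <!-- Scripts at the bottom of the body don't block the initial render -->
      <script type="text/javascript" src="/js/site.min.js"></script>
    </body>
    ```

    None of this is a substitute for measuring your own pages, which is exactly what the tools mentioned above are for.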

    On a scale of 1 to 200, where would you place the importance of page speed? Discuss here.

    Related Articles:

    Google: Page Speed May Become a Ranking Factor in 2010

    Google Tracks User Data to Monitor Load Times

    Google Introduces Page Speed Tool

    Things to Consider if Page Speed is to Become a Ranking Factor

    Google Provides Tool for Speeding Up Web Pages

    Google Launches Site Performance Feature

    Google Announces SPDY Application-Layer Protocol

  • A Markup That Could Have Big Implications for SEO

    RDFa, which stands for Resource Description Framework in attributes, is a W3C recommendation that adds a set of attribute-level extensions to XHTML for embedding rich metadata within web documents. While not everyone believes W3C standards are strictly necessary to operate a successful site, some see a great deal of potential for search engine optimization in RDFa.

    In fact, this is the topic of a current WebProWorld thread, which was started by Dave Lauretti of MoreStar, who asks, "Are you working the RDFa Framework into your SEO campaigns?" He writes, "Now under certain conditions and with certain search strings on both Google and Yahoo we can find instances where the RDFa framework integrated within a website can enhance their listing in the search results."

    Lauretti refers to an article from last summer at A List Apart by Mark Birbeck, who said that Google was beginning to process RDFa and Microformats as it indexes sites, using the parsed data to enhance the display of search results with "rich snippets". This produces Google results like this:

    RDFa in play

    "It’s a simple change to the display of search results, yet our experiments have shown that users find the new data valuable — if they see useful and relevant information from the page, they are more likely to click through," Google said upon the launch of rich snippets.

    Google says it is experimenting with markup for business and location data, but that it doesn’t currently display this information, unless the business or organization is part of a review (hence the results in the above example). But when review information is marked up in the body of a web page, Google can identify it and may make it available in search results. When review information is shown in search results, this can of course entice users to click through to the page (one of the many reasons to treat customers right and monitor your reputation).
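    To give a sense of what marked-up review content looks like, here is a minimal sketch in the style of Google’s rich snippets examples; the product, reviewer, and rating are invented, and the exact property names should be checked against Google’s documentation:

    ```html
    <!-- Hypothetical review block; the v: prefix maps to data-vocabulary.org terms -->
    <div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Review">
      <span property="v:itemreviewed">Blast 'Em Up</span>
      reviewed by <span property="v:reviewer">Bob Smith</span> on
      <span property="v:dtreviewed" content="2009-04-15">April 15</span>.
      <!-- The star rating that can surface in the rich snippet -->
      Rating: <span property="v:rating">4.5</span>
    </div>
    ```

    To a human visitor this renders as ordinary text; the `property` attributes are what let a crawler identify each piece as structured review data.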

    Currently, Google uses RDFa for reviews, but as Lauretti points out, this search also displays the date of the review, the star rating, the author, and the price range of an iPod.

    Best Buy’s lead web development engineer reported that by adding RDFa, the company saw improved rankings for the respective pages. They saw a 30% increase in traffic, and Yahoo evidently observed a 15% increase in click-through rates (via Steven Pemberton).

    Implications for SEO

    I’m not going to get into the technical side of RDFa here (see resources listed later in the article), but I would like to get into some of the implications that Google’s use of RDFa could have on SEO practices. For one, rich snippets can show specific information related to products that are searched for. For example, a result for a movie search could bring up information like:

    – Run time
    – Release Date
    – Rating
    – Theaters that are showing it

    "The implementation of RDFa not only gives more information about products or services but also increases the visibility of these in the latest generations of search engines, recommender systems and other applications," Lauretti tells WebProNews. "If accuracy is an issue when it comes to search and search results then pages with RDFa will get better rankings as there would be little to question regarding the page theme." (Source) He provides the following chart containing examples of the types of data that could potentially be displayed with RDFa:

    RDFa Implications

    "It is obvious that search marketers and SEOs will be utilizing this ability for themselves and their clients," says Lauretti. Take contact information specifically. "Using RDFa in your contact information clarifies to the search engine that the text within your contact block of code is indeed contact information." He says in this same light, "people information" can be displayed in the search results (usually social networking info). You could potentially show manufacturer information or author information.
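    As a rough sketch of what a marked-up contact block might look like (the name, title, and company are made up, and the property names are based on my reading of Google’s people examples, so verify them before relying on this):

    ```html
    <!-- Hypothetical "people information" block using person vocabulary terms -->
    <div xmlns:v="http://rdf.data-vocabulary.org/#" typeof="v:Person">
      <span property="v:name">Jane Doe</span>,
      <span property="v:title">Owner</span> of
      <span property="v:affiliation">Example Widgets</span>
    </div>
    ```

    The point Lauretti makes holds either way: the markup removes the guesswork about whether a block of text is contact information.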

    RDFa actually has implications beyond just Google’s regular web search. With respect to Google’s Image search, the owner of images can also use RDFa to provide license information about the images they own. Google currently allows image searchers to filter images by license type, and using RDFa with your images lets the search bots know under which licenses you are making your images available (via Mark Birbeck). There is also RDFa support for video.
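    A sketch of how that can look, with the image path and license URL as placeholders; RDFa’s `about` attribute makes the image itself the subject of the license statement:

    ```html
    <div about="/images/photo.jpg">
      <img src="/images/photo.jpg" alt="Sample photo" />
      <!-- rel="license" attaches the license to the resource named in "about" -->
      Licensed under a
      <a rel="license" href="http://creativecommons.org/licenses/by/3.0/">Creative
      Commons Attribution 3.0 license</a>.
    </div>
    ```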

    Following are some resources where you can learn more about RDFa and how to implement it:

    Google Introduces Rich Snippets
    Introduction to RDFa
    RDFa Primer
    About RDFa (Google Webmaster Central)
    RDFa to Provide Image License Info
    RDFa Microformat Tagging For Your Website
    For Businesses and Organizations
    About Review Data (Google Webmaster Central)

    Google’s Matt Cutts has said in the past that Google has been kind of "white listing" sites for rich snippets as it deems them appropriate, but as Google grows more confident that such snippets don’t hurt the user experience, it will likely roll the ability out more broadly. This is one thing to keep an eye on as the year progresses, and it is why those in the WebProWorld thread believe RDFa will become a bigger topic of discussion in 2010.

    WebProNews would like to thank Dave Lauretti, who contributed some findings to this piece.

    Update: As I pieced together this article, Google coincidentally announced support for rich snippets for Events.


    Related Articles:

    > Get Your Breadcrumbs in Google for More Links in Results

    > Google Makes it Easier to Tell Where Results Originate From

    > Get More Links in Your Actual Google Results

  • Has Google Begun Changing How it Indexes the Web?

    Last summer Google announced a new project called "Caffeine", which was described as a re-write of Google’s web search architecture. Around that time, Matt Cutts discussed Caffeine with WebProNews, comparing it to the "Big Daddy Update" of 2005, which consisted of changes to the way Google crawls and indexes websites. It appears that more people are now seeing the effects from Caffeine out in the wild.

    Have you seen possible Caffeine effects in use? Tell us about it.

    Back before the holidays, Google made it a point to assure everybody that Caffeine would not be rolled out (except for at one data center) until after the holidays were over – January at the earliest. The reason for this was that Google didn’t want to shake everything up during a key time for businesses (they didn’t want a repeat of the Florida update).

    The company let everyone know about its intentions at PubCon in November. In fact, a few days ago, Google’s Matt Cutts posted a video running through his presentation from that event on his blog. He also provided the slideshow. It covers much more than just Caffeine, but if you missed it, you may want to consider watching it anyway (Caffeine discussion starts at about 22:10 in the video and at slide 29 in the presentation).

    "It’s a re-write of our indexing infrastructure. It’s taking the old way that we used to index things that we’d crawled around the web, and we’re replacing that with new architecture that’s fresh and that had been written to be more scalable, more flexible, [with] the ability to attach different types of data, and in the process of indexing, the ability to do more documents for a more comprehensive version of the web, and the ability to do it faster," Cutts says of Caffeine.

    But enough background. Barry Schwartz at Search Engine Roundtable points to a WebmasterWorld forum thread where administrator Tedster claimed to have seen Caffeine in action at a number of IP addresses. He wrote:

    I’m seeing the Caffeine data-set being served via this set of IP addresses: 64.233.169.147, 64.233.169.105, 64.233.169.103, 64.233.169.104, 64.233.169.99,64.233.169.106

    It seems to take 5 IP addresses to build the complete SERP, where in the past it often took only 3.

    Schwartz also pointed to another member’s post (Whitenight), who said:

    Well, just tripled checked with offices/employees in Texas, Colorado, and Indiana. All 5 "control" keywords/sites showed live Caffeine.

    That member’s latest post says that the Caffeine Dataset is also on http://66.102.7.99 and http://66.102.7.104.

    We don’t know for sure if this is all really Caffeine in action though. Google hasn’t commented on it, and has not made any announcements regarding Caffeine since what Matt said above. Some people don’t believe this is Caffeine at all. As Schwartz notes, we’ll have to wait for Google to say something.

    Still, January is almost over, and Google said it would wait until after the holidays, specifically mentioning the month of January. It’s about time for this to be rolling out to some extent. Speed has been emphasized a great deal in Caffeine discussion, and Cutts told us that page speed would likely become a ranking factor. Regardless of whether you are witnessing Caffeine in action yet, rest assured that it will be here sooner or later, and any edge you can give yourself in the meantime is for the good of your own site’s performance. Not only may speed help you in search going forward; it also makes for a better user experience.

    Share your thoughts about Google’s Caffeine update.

    Related Articles:

    > Matt Cutts Talks Google Caffeine Update

    > New Details on Google Caffeine Update

    > A Markup That Could Have Big Implications for SEO

  • Get Your Breadcrumbs in Google for More Links in Results

    Last summer it was discovered that Google was testing breadcrumbs in search results (breadcrumbs being the hierarchical display commonly used in site navigation. For example: Home Page>Product Page>Product A Page). Then in mid-November, Google announced that it was rolling out the use of breadcrumbs in search results on a global basis. What this means for webmasters is that if you can get your breadcrumbs into Google’s results, you essentially have more links on the results page. You have a separate link for each page in the breadcrumb trail.

    Do your site’s breadcrumbs show up in Google’s results? Comment here.

    The company said they would only be used in place of some URLs, mainly ones that don’t give the added context of a link the way that breadcrumbs do. Interestingly, there seems to be an incentive for those who go the breadcrumb route because of the multiple links that you just don’t get with regular search results.

    Google Breadcrumbs display

    Google’s move was generally well received. This was reflected in the comments from WebProNews readers on our past coverage. For example, a commenter going by the handle Stupidscript said, "It’s definitely a good time to start wrapping your head around the notion of ‘providing context’, because the web is heading into its "semantic" period … where each link will be more or less valuable based on its relationships with and context to information found behind other links."

    Google’s use of breadcrumbs in search results is the focus of a recently submitted question to the Google Webmaster Central team. The question was, "Google is showing breadcrumb URLs in SERPs now. Does the kind of delimiter matter? Is there any best practice? What character to use is best? > or | or / or???" Google’s Matt Cutts responded:

    Matt says you should have a set of delimited links on your site that accurately reflect your site’s hierarchy. He also notes, however, that it is still in the "early days" for breadcrumbs.

    "Think about the situation with sitelinks," he says. "Whenever we started out with sitelinks, it took a while before…for example, we added the ability in Google Webmaster Tools where you could remove a sitelink that you didn’t like or that you thought was bad. So we started out, and we did a lot of experiments, and we’ve changed the way that sitelinks look several times. And we have different types of sitelinks (within a page, and the standard ones you’re familiar with). So we’ve iterated over time."

    In this same way, he says, Google is in the early stage with breadcrumbs and he has seen different experiments with them. For example, there have been prototypes where the breadcrumbs were in the rich snippet gray line, above the regular snippet. "Having it in the URL is kind of nice, but it could still change over time," he says.

    He says the best advice he can give is to make sure you have a set of delimited links that accurately reflect your site’s hierarchy, and that will give you the best chance of getting breadcrumbs to show up in Google, but Google will continue to work on ways to improve breadcrumbs. He says any new announcements about it will likely be made on the Google Webmaster blog.
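    In markup terms, that advice boils down to a plain set of delimited links that mirrors the site’s hierarchy. A minimal sketch (the pages, the class name, and the ">" delimiter are just an illustration):

    ```html
    <div class="breadcrumbs">
      <a href="/">Home</a> &gt;
      <a href="/products/">Products</a> &gt;
      <!-- The current page sits at the end of the trail -->
      <a href="/products/widgets/">Widgets</a>
    </div>
    ```

    Each link resolves to a real page in the hierarchy, which is what gives Google something it can turn into the multi-link breadcrumb display.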

    While Matt doesn’t lean one way or the other on which character to use, as asked about in the submitted question, all of the examples I have seen highlighted show the ">" character. That includes examples from Google’s original announcement on the inclusion of breadcrumbs (if you see other ways, please point them out in the comments). Based on that, if I were going to choose one, I’d go with that.

    There are three types of breadcrumbs (as described here): path, location, and attribute. Path breadcrumbs show the path that the user has taken to arrive at a page, while location breadcrumbs show where the page is located in the website hierarchy. Attribute breadcrumbs give information that categorizes the current page. Obviously, location breadcrumbs would be the ones Google is using (although with personalized search becoming more of a factor, who knows in the future?).

    Update: 
    In the comments, one reader says:

    My site breadcrumb is seperated by |. Somehow, Google seems to put the > character in of their own accord. I’ve seen many results with breadcrumbs in the SERPS, and I havn’t seen any with a seperating character other than >. I do think Google puts in the > character regardless of your site’s seperating delimiter.

    Have you seen an increase in clickthrough from breadcrumbs in Google results? Discuss here.


    Related Articles:

    > Google Rolls Out Breadcrumb Display in SERPs

    > Google Makes it Easier to Tell Where Results Originate From

    > Get More Links in Your Actual Google Results

  • 10 Details About How Google Handles Natural Language Search

    Google has posted a thought-provoking piece to the Official Google Blog, discussing at length, Google’s system for understanding synonyms in search. As author Steven Baker says, "An irony of computer science is that tasks humans struggle with can be performed easily by computer programs, but tasks humans can perform effortlessly remain difficult for computers."

    Google considers understanding human language to be one of the hardest problems in artificial intelligence, and the key to returning the best possible search results. While it is far from perfect now, Google has invested a great deal of time into this (5 years of research to be exact).

    To cut to the chase, here are some things pertaining to Google’s handling of synonyms that you should keep in mind:

    1. Google constantly monitors its system for handling synonyms with regard to search result relevance.

    2. Google says synonyms affect 70% of user searches across over 100 languages.

    3. For every 50 queries where synonyms significantly improve search results, Google has only found one "truly bad" synonym.

    4. Google does not normally fix bad synonyms by hand, but rather makes changes to its algorithms to try and correct the problem. "We hope it will be fixed automatically in some future changes," Baker says.

    5. Google has recently made a change to how its synonyms are displayed: in SERP snippets, synonym terms are bolded, just like the actual words you searched for.

    6. Google uses "many techniques" to extract synonyms. Its systems analyze petabytes of data to build "an intricate understanding of what words can mean in different contexts."

    7. Some words or initials can have tons of different meanings, and Google uses other words in the query to help determine the correct ones. For example, there are over 20 possible meanings for the term "GM" that Google’s system knows something about.

    GM Synonyms

    8. Google includes variants on terms (such as singular and plural versions) within its "umbrella of synonyms".

    9. Google still makes mistakes with synonyms.

    10. You can turn off a synonym in a search by adding a "+" before the term or by putting the words in quotation marks.

    Google wants feedback on algorithm mistakes. They’ll take it through the web search help center forum, or through a Twitter hashtag: #googlesyns.

    It will be interesting to see how far Google progresses in the area of natural language search, because Baker is absolutely right in that it is a key to providing more relevant results. If they can understand exactly what we want from our language, without us having to tweak it too much, that will be a tremendous stride for search. Instead of us trying to figure out what Google wants us to say, Google would just understand what we say. Luckily people have gotten much better at searching over the years, learning to enter longer, more specific queries.

    Related Articles:

    Google Launches Social Search Experiment

    Optimizing for Mixed Media Search Results

    Succeeding In SEO Requires Change

  • How Google Rates Links from Facebook and Twitter

    The first Matt Cutts Answers Questions About Google video of the year has been posted, and in it Matt addresses links from Twitter and Facebook, after talking about his shaved head again. Specifically, the submitted question he answers is:

    Links from relevant and important sites have always been a great way to get traffic & acceptance for a website. How do you rate links from new platforms like Twitter, FB to a website?

    Do you rely on links from Facebook and Twitter updates? Discuss here.

    Essentially, Matt says Google treats links from Facebook or Twitter the same as it would links from any other site. It’s just an extension of the pagerank formula, where it’s not the number of links that matters, but how reputable those links are (the company uses a similar strategy for ranking tweets themselves in real-time search).

    While Facebook and Twitter links may be treated like any other links, they do still come with things to keep in mind. For one, with Facebook, you have to keep in mind that a lot of profiles are not public. When a profile is not public, Google can’t crawl it, and it can’t assign pagerank on the outgoing links if it can’t fetch the page to see what the outgoing links are. If the page is public, it might be able to flow pagerank, Matt says. With Twitter, most links are nofollowed anyway.

    "At least in our web search (our organic rankings), we treat links the same from Twitter or Facebook or, you know, pick your favorite platform or website, just like we’d treat links from WordPress or .edus or .govs or anything like that," says Cutts. "It’s not like a link from an .edu automatically carries more weight or a link from a .gov automatically carries more weight. But, the specific platforms might have issues, whether it’s not being crawled or it might be nofollow. It would keep those particular links from flowing pagerank."

    There you have it. Matt’s response probably doesn’t come as much of a surprise to most of you, but it’s always nice to hear information like this straight from Google.

    Do you like the way Google handles links from Facebook and Twitter? Would you do it differently? Share your thoughts.

    Related Articles:

    > Tips for Getting Found in Real-Time Searches

    > Google Makes a Second Real-Time Search Announcement

    > Yahoo Rolling Out Something Kind of Like Real-Time Search

  • Google Reveals Factors for Ranking Tweets

    It’s OK to say "no" to Twitter if that’s your thing. There’s a chance that it just doesn’t fit into your strategy or help you achieve your goals. That’s cool. However, if it is your thing, you may be interested in how Google ranks tweets. That is, if search marketing is also your thing.

    Do you see Twitter as important to an effective search marketing campaign? Share your thoughts here.

    Google and Microsoft almost simultaneously announced deals with Twitter a few months back that would give the companies access to tweets in real time to fuel their respective search engines’ real-time results. Microsoft immediately launched its version, but it was separate from the regular Bing search engine. Google waited a while, but eventually started incorporating real-time results right into regular Google SERPs (including not only tweets but various other sources).

    After the Twitter deals were announced, Bing came out and said, "If someone has a lot of followers, his/her Tweet may get ranked higher. If a tweet is exactly the same as other Tweets, it will get ranked lower."

    Google was not as vocal about how it would rank tweets and other real-time results, but the company has now shed a bit of light on that via an interview with MIT’s Technology Review. David Talbot interviewed Google "Fellow" Amit Singhal, who has led development of real-time search at the company. According to him, Google also ranks tweets by followers to an extent, but it’s not just about how many followers you get. It’s about how reputable those followers are.

    Singhal likens the system to the well-known Google system of link popularity. Getting good links from reputable sources helps your content in Google, so having followers with that same kind of authority theoretically helps your tweets rank in Google’s real-time search.

    "One user following another in social media is analogous to one page linking to another on the Web. Both are a form of recommendation," Singhal says. "As high-quality pages link to another page on the Web, the quality of the linked-to page goes up. Likewise, in social media, as established users follow another user, the quality of the followed user goes up as well."

    But that’s only one factor.

    Do you commonly use hashtags in your tweets? If your goal is to rank in Google’s real-time search index, you may want to cut down on that practice, because according to Singhal, that is a big red flag for a lower quality tweet. This seems to be part of Google’s spam control strategy.

    Another noteworthy excerpt from the interview:

    Another problem: how, if someone is searching for "Obama," to sift through White House press tweets and thousands of others to find the most timely and topical information. Google scans tweets to find the "signal in the noise," he says. Such a "signal" might include a new onslaught of tweets and other blogs that mention "Cambridge police" or "Harry Reid" near mentions of "Obama." By looking out for such signals, Google is able to furnish real-time hits that contain the freshest subject matter even for very common search terms.

    Well, we certainly know more about Google’s strategy for tweet ranking now, but there are still plenty of questions about it. What is Google’s stance on Ghost Tweeting? Are Google’s ranking factors a good reason to create and follow more Twitter lists in hopes of gaining more reputable industry followers?

    The factors mentioned aren’t the only ones Google employs. It’s not like Google is going to tell us everything. It also helps to keep in mind that real-time search spans far beyond just tweets. Still, Twitter is clearly a big part of it, and even the significance of tweets themselves will evolve in time.

    Google says it hopes to factor geo-location data (with regard to tweets) into the real-time search results at some point. Google and Twitter engineers frequently collaborate on real-time search, which Google itself says is evolving.

    By the way, it stands to reason that Google’s strategy for ranking tweets probably shares similarities with how it ranks content from other sources drawn on for real-time search.

    Is ranking in Google’s real-time search important to your strategy? Discuss here.