WebProNews

Tag: SEO

  • Yahoo Now Including Bing Results – Tips for Optimizing

    Yahoo has begun testing organic and paid search listings from Microsoft. Up to 25% of its search traffic in the U.S. may see organic listings from Microsoft, and up to 3.5% may see paid listings from Microsoft adCenter. I guess you could say that the early stages of the Search Alliance’s transition have begun.

    Will you place more emphasis on Bing optimization as it integrates with Yahoo Search?
     Let us know.

    "The primary change for these tests is that the listings are coming from Microsoft," says Yahoo’s VP of Search Product Operations, Kartik Ramakrishnan. "However, the overall page should look the same as the Yahoo! Search you’re used to – with rich content and unique tools and features from Yahoo!. If you happen to fall into our tests, you might also notice some differences in how we’re displaying select search results due to a variety of product configurations we are testing."

    Yahoo provides the following example, in which the Microsoft-powered parts are represented by the boxes:

    Yahoo Starts including Microsoft search results - paid and organic

    As far as SEO is concerned, the Yahoo Search Marketing Team provides the following tips for organic search:

    1. Compare your organic search rankings on Yahoo! Search and Bing for the keywords that work best for you.
    2. Decide if you’d like to modify your paid search campaigns to compensate for any changes in organic referrals that you anticipate.
    3. Review the Bing webmaster tools and optimize your website for the Microsoft platform crawler, as Bing listings will be displayed for approximately 30% of search queries after this change, according to comScore.
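    The first of those tips – comparing your rankings across the two engines – lends itself to a small script. This is a hypothetical sketch: the position numbers are placeholders, and in practice they would come from whatever rank-tracking tool you use.

```python
# Compare a site's organic positions on Yahoo! Search vs. Bing for a set of
# keywords, flagging the phrases where the two engines disagree the most.
# All position numbers below are made-up placeholder data.

yahoo_ranks = {"mountain bike": 4, "bike frames": 12, "full suspension": 7}
bing_ranks = {"mountain bike": 9, "bike frames": 3, "full suspension": 7}

def rank_gaps(a, b):
    """Return (keyword, gap) pairs sorted by position difference, largest first."""
    gaps = {kw: abs(a[kw] - b[kw]) for kw in a if kw in b}
    return sorted(gaps.items(), key=lambda item: item[1], reverse=True)

for keyword, gap in rank_gaps(yahoo_ranks, bing_ranks):
    print(f"{keyword}: positions differ by {gap}")
```

    The phrases at the top of that list are the ones where a Bing-specific look at your optimization (or at your paid campaigns, per tip 2) is most likely to pay off.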

    Microsoft’s Satya Nadella also says that "now is a good time for you to review your crawl policies in your robots.txt and ensure that you have identical policies for the msnbot/Bingbot and Yahoo’s bots. Just to note, you should not see an increase in bingbot traffic as a result of the transition."
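    Following that advice, a robots.txt that keeps the crawl policies identical across the relevant bots might look like the sketch below. The disallowed path is just an example; Slurp is Yahoo's crawler.

```text
User-agent: msnbot
Disallow: /private/

User-agent: bingbot
Disallow: /private/

User-agent: Slurp
Disallow: /private/
```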

    The Bingbot is designed to crawl non-optimized sites more easily. The new Bingbot will replace the existing msnbot in October. More on this here.

    Also note that the new Bing Webmaster Tools experience is live. This has been completely redone with a bunch of new features (and more features to come). Bing Webmaster Tools Senior Product Manager Anthony M. Garcia summarizes:

    The redesigned Bing Webmaster Tools provide you a simplified, more intuitive experience focused on three key areas: crawl, index and traffic. New features, such as Index Explorer and Submit URLs, provide a more comprehensive view as well as better control over how Bing crawls and indexes your sites. Index Explorer gives you unprecedented access to browse through the Bing index in order to verify which of your directories and pages have been included. Submit URLs gives you the ability to signal which URLs Bing should add to the index. Other new features include: Crawl Issues to view details on redirects, malware, and exclusions encountered while crawling sites; and Block URLs to prevent specific URLs from appearing in Bing search engine results pages. In addition, the new tools take advantage of Microsoft Silverlight 4 to deliver rich charting functionality that will help you quickly analyze up to six months of crawling, indexing, and traffic data. That means more transparency and more control to help you make decisions, which optimize your sites for Bing.

    WebProNews spoke with Janet Driscoll Miller of Search Mojo out at SMX a while back. She had presented on the topic of Bing SEO vs. Organic SEO. As she notes, some businesses actually see better results from Bing than they do from Google, and when Yahoo starts fully using Bing for search, Bing’s share of the search market is going to grow dramatically (it also powers search in Facebook, let’s not forget).

    Yahoo will be integrating Microsoft’s mobile organic and paid listings in the U.S. and Canada in the coming months. The company anticipates that U.S. and Canada organic listings in both the desktop and mobile versions of its search will be fully powered by Microsoft as early as August or September. This of course depends on how the testing goes.

    Yahoo and Microsoft have created new joint editorial guidelines for advertisers that will become effective in early August. These can be found here.

    As we’ve discussed, Bing optimization is about to get more important, and now is the time to really look at your Bing strategy if you haven’t already been doing so.

    Are you prepared for the transition?
    Comment here.

  • BusinessWire to Give Businesses More Ways to Increase Press Coverage

    BusinessWire, the popular newswire service, is set to launch some new services for businesses and organizations next month. These are the NewsHQ online newsroom and the InvestorHQ investor center.

    BusinessWire says these microsites were developed specifically to help corporate communicators and investor relations officers house and maintain information for journalists, bloggers, investors, consumers, analysts, key influencers, etc.

    "There are many benefits to employing these content management solutions," explains BusinessWire VP of Web Communications, Ibrey Woodall. "One of the main reasons the online newsroom came into existence was because public relations representatives were having a hard time getting cooperation from their technical, or IT department, when they needed a press release posted quickly. Investor relations officers also needed a means by which they could get the most recent financial news and data to their company’s investors and analysts."

    "So, control was an initiating factor for this technological evolution," she adds. "Communicators needed to be able to post and organize content in a timely manner. They needed to be able to get their message on their website, and delivered directly to those who were interested in their organization."

    Getting press coverage can go a long way toward gaining traffic and overall exposure for your business. Providing the givers of press (journalists, bloggers, and even everyday consumers via social media) with as many resources for finding information as possible will only increase the likelihood of coverage. Businesses may find these new services from BusinessWire quite helpful.

    It also helps that the services come with some basic SEO features, which should further increase visibility.

  • Google Eyes Mouse Movement as Possible Search Relevancy Signal

    Google was granted an interesting patent today. The title is "System and method for modulating search relevancy using pointer activity monitoring". Here is how the abstract for the patent describes it:

    A method and system of modulating search result relevancy use various types of user browsing activities. In particular, a client assistant residing in a client computer monitors movements of a user controlled pointer in a web browser, e.g., when the pointer moves into a predefined region and when it moves out of the predefined region. A server then determines a relevancy value between an informational item associated with the predefined region and a search query according to the pointer hover period. When preparing a new search result responsive to a search query, the server re-orders identified informational items in accordance with their respective relevancy values such that more relevant items appear before less relevant ones. The server also uses the relevancy values to determine and/or adjust the content of an one-box result associated with a search query.

    "The patent presents a couple of assumptions about how mouse pointer movements can be interpreted," explains Bill Slawski at SEO by the Sea, who presents a much more readable explanation of the patent. "For example, a longer hover over a result may indicate a positive opinion about how relevant a listing on the results page might be to a query. And, if someone moves their mouse pointer across a snippet line by line at a normal reading speed, it may indicate a higher level of attention to that result than if the pointer was kept in a static position or moved randomly."

    "So, the speed and movement of a mouse pointer as well as where it is placed on a search result page might be tracked to see how much attention a searcher pays to different search results," he adds. "If someone hovers over one sponsored listing, or ad, but not another, that might indicate more attention and interest in the ad hovered over. If a local map is shown, or a definition, or some other OneBox result, and the searcher viewing the page hovers over those OneBox results for a while, that could be an indication that the map or the definition or other OneBox listing was helpful."
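    As a rough illustration of the idea in the abstract – hover time nudging the ordering of results – here is a sketch. The blend weight and the hover normalization are my own inventions; the patent describes the concept, not a formula.

```python
# Re-order search results using hover time as a secondary relevancy signal.
# The 0.2 blend weight and max-normalization are illustrative guesses only.

def rerank(results, hover_seconds, weight=0.2):
    """results: list of (url, base_score); hover_seconds maps url -> seconds hovered."""
    max_hover = max(hover_seconds.values(), default=0) or 1
    def blended(item):
        url, base = item
        return base + weight * (hover_seconds.get(url, 0) / max_hover)
    return [url for url, _ in sorted(results, key=blended, reverse=True)]

results = [("a.com", 0.90), ("b.com", 0.85), ("c.com", 0.80)]
hovers = {"b.com": 12.0, "c.com": 1.0}  # users lingered over b.com's snippet
print(rerank(results, hovers))  # b.com overtakes a.com
```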
     
    The patent was filed all the way back in 2005, and as Slawski notes, there’s no telling if Google will actually utilize it. A lot can change in 5 years, especially in this industry. Either way, they’ve been granted the patent. You can read it here.

  • New “Bingbot” Will Crawl Non-optimized Sites More Easily

    Microsoft has announced that it will be bringing the Bing web crawler out of beta on October 1st. It will be rebranded as "the Bingbot" and replace the existing msnbot. "It will still honor robots.txt directives written for msnbot, so no change is required to robots.txt file(s)," a Bing representative tells WebProNews.

    "Improvements to the bot enable more efficient crawling, and increase the ability to crawl content on sites not optimized for search," he says.
     
    Rick DeJarnette has more about the change on the Bing Webmaster Blog:

    Instead of the old msnbot 2.0b showing up in your server logs, the updated user agent will be:

    Mozilla/5.0 (compatible; bingbot/2.0 +http://www.bing.com/bingbot.htm)

    The HTTP header From field will also change as shown below:

    From: msnbot(at)microsoft.com

    will become

    From: bingbot(at)microsoft.com

    If Bing finds separate sets of directives for Bingbot and for other crawlers, directives for bingbot will take precedence, the company says.
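    That precedence rule is standard robots-exclusion behavior – a crawler that finds a group naming it ignores the generic "*" group – and you can verify it with Python's built-in parser (the paths here are made up):

```python
from urllib import robotparser

# A robots.txt where bingbot has its own group alongside a generic one.
rules = """\
User-agent: *
Disallow: /private/

User-agent: bingbot
Disallow: /no-bing/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("bingbot", "http://example.com/no-bing/"))  # blocked by its own group
print(rp.can_fetch("bingbot", "http://example.com/private/"))  # generic rule ignored
print(rp.can_fetch("msnbot", "http://example.com/private/"))   # falls back to *
```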

    I find the part about increasing the ability to crawl content on sites not optimized for search to be particularly interesting. I wouldn’t exactly call this an invitation to ignore SEO. Obviously Google is still the biggest search engine anyway, but even as far as Bing is concerned, good SEO practices will likely still help your rankings.

    Also keep in mind that optimizing for Bing is becoming increasingly important. Not only is Facebook giving more reason for people to search (where Bing provides the web results), but the Yahoo/Bing integration will be here (likely) before the holidays.

  • What’s More Important in Search? Freshness or Quality?

    It’s been a while since we looked at one of the Google Q&A webmaster videos that Matt Cutts does, but I found this recent one particularly interesting, considering the emphasis that has been put on freshness in search engines lately.

    How important is freshness to you as a search engine user? Share your thoughts here.

    The user question in this particular video says:

    Some people are under the impression that blogs are good for SEO only if they’re updated frequently. How much does frequency play into PageRank for blogs & other dynamic sites? Isn’t the content more important than the simple # of posts per day/week?

    Matt’s response is that it is indeed much more important to have quality content, but frequency can be a nice thing to have for the users.

    Essentially, if you post more frequently, people have more of a reason to keep coming back. That can be good for page views. However, as Matt says…

    "Whenever you’re thinking about search engines, it’s much, much, much more important to think about the quality of your content. For example, on my blog, I don’t post every day. Sometimes I don’t post every week. But I try to make sure that each post has something useful about it…"

    Matt implies that you’ll be better off in terms of search if you wait until you can deliver some value with a post, rather than just cranking out stuff that isn’t much different from what’s already out there. This strategy is likely to attract a lot more links, he says.

    Quality is always priority one, but I don’t think that’s to say that freshness doesn’t count. For example, as we looked at recently, Google’s time filters (which are more readily available to searchers, courtesy of the recent redesign of the SERPs), not to mention the realtime results Google often displays, can add some benefit to providing fresh content. Brian Klais, General Manager and VP of Product Management at Covario, had a very interesting post at Search Engine Land looking at how the time filter may even help smaller brands get some visibility.

    Of course Google has gone out of its way with Caffeine to increase the speed at which it indexes content so it can provide the freshest results possible.

    Do you take freshness into account for your search engine marketing strategy? Comment here.

  • Webmasters Cry Mayday for Google Rankings Again

    A lot of people had something to say about Google’s Mayday algorithm update from the beginning of May. A lot of people felt that it was costing them rankings and revenue.

    Google’s Matt Cutts talked more about Mayday at SMX Advanced a couple weeks ago. He said that it was designed to try and spot signals of quality on pages and sites that would be good for users, and that auto-generated pages and content farms tend to get hit the most by Mayday.

    Barry Schwartz at Search Engine Roundtable is pointing to a WebmasterWorld thread indicating that there may have been another tweak on June 23rd and 24th that had a big impact on some sites’ rankings again.

    Cutts’ advice to webmasters affected by Mayday in the first place, was basically to improve quality. I’m going to go out on a limb and suggest that this advice would probably still apply.

    Have you experienced a dramatic change in Google rankings this week? Let us know.

  • Likes Mean Relevance in Facebook Search

    Nick O’Neill at All Facebook reports that Facebook has confirmed that "all Open Graph-enabled web pages will show up in search when a user likes them." He also calls this Facebook’s "war on Google."

    While utilizing likes and the open graph as a ranking factor in search should help Facebook improve its internal search, it doesn’t represent much of a threat to Google search. Google indexes the web. Facebook indexes activity from Facebook users. There’s a pretty big difference, regardless of how big Facebook is.

    There is certainly something to be said for Facebook search, however. There’s no question that a lot of people are using Facebook and spending a lot of time there, so having some kind of search strategy for Facebook is not a bad idea. Naturally, the Open Graph will play a huge role in this, and that means taking advantage of Facebook’s social plugins. As I’ve written about before, Facebook likes (as well as Twitter retweets) are like the new links in some ways.
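    Plugging into the Open Graph starts with the protocol's meta tags in a page's head. A minimal set looks like the following; the values are placeholders, and the full list of properties is defined by the Open Graph protocol itself.

```html
<head>
  <meta property="og:title" content="Full Suspension Mountain Bikes" />
  <meta property="og:type" content="product" />
  <meta property="og:url" content="http://www.example.com/bikes/full-suspension" />
  <meta property="og:image" content="http://www.example.com/images/bike.jpg" />
</head>
```

    With tags like these in place, a Like button ties the page into the graph, and the page becomes findable through likes in Facebook search.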

    Facebook is definitely making a lot of moves to keep users getting the info they want from within Facebook. Fan pages essentially turn Facebook into a news reader. They’re working on a Q&A product. They’re launching content destinations themselves (like this politics page). However, no matter how much information Facebook is able to give users, that amount will always be limited, and will not be able to deliver the web in the way Google can. Of course, that’s why they have Bing results for web search.

    As far as search market share, it is probably Bing that stands to gain the most out of improved Facebook search. I don’t know how often people are going to go to Facebook for web searches, but the more people do search on Facebook, the more they are going to see those web results from Bing, when the actual (limited) Facebook results don’t deliver what they want. If Bing can deliver what they want in the top three results (the amount that is commonly displayed in Facebook search results), Bing only stands to gain.

    Optimizing for Bing is very connected to optimizing for Facebook and soon optimizing for Yahoo.

  • Google Shares Its Viewpoint on Earning Quality Links

    SEO changes all the time as search engines make adjustments to their algorithms and user interfaces, users adopt new technologies, etc. Still, some things never change, like Google’s view on spammy links.

    Do you agree with Google’s philosophy on link-building? Share your thoughts here.

    In a new post to the Google Webmaster Central blog, the company has expressed its most recent viewpoint on earning quality links.

    The first piece of advice Google gives is to get involved with the community around your topic. If you were still not convinced that social media plays a very big role in search, consider that this is coming straight from Google. Now, the networks your community hangs out in may vary, but engaging with the community is simply a good way to get links and build credibility, which will also most likely lead to more links. Engaging is good for increasing visibility outside of search anyway. Nothing new. Just reiterated by Google.

    Sidenote: Listen to what Arnel Leyva of Covario has to say about search and social media from this recent interview WebProNews did with him at SMX Advanced:

    Another tip Google suggests is to create content that solves problems for your users – things like tutorials, videos, and tools, surveys, research results, etc. Users who find helpful content are likely to pass it on.

    Google notes that humor and other link-bait tactics can work for the short term, but does not recommend counting on such tactics. "It’s important to clarify that any legitimate link building strategy is a long-term effort," says Google Search Quality Strategist Kaspar Szymanski. "There are those who advocate for short-lived, often spammy methods, but these are not advisable if you care for your site’s reputation. Buying PageRank-passing links or randomly exchanging links are the worst ways of attempting to gather links and they’re likely to have no positive impact on your site’s performance over time. If your site’s visibility in the Google index is important to you it’s best to avoid them." (emphasis added)

    "Directory entries are often mentioned as another way to promote young sites in the Google index," says Szymanski. "There are great, topical directories that add value to the Internet. But there are not many of them in proportion to those of lower quality. If you decide to submit your site to a directory, make sure it’s on topic, moderated, and well structured. Mass submissions, which are sometimes offered as a quick work-around SEO method, are mostly useless and not likely to serve your purposes."

    Szymanski also suggests looking to similar sites in other markets for inspiration – not to copy them, but to see the things that they have done to be successful and see if there is a way to apply that to your own site.

    Finally, probably the most obvious tip offered here is to make it easy for people to share your content. Things like Facebook "likes" and Twitter retweets can go a long way in creating new links to your content. Granted, these won’t necessarily boost your PageRank, but they will boost your visibility, which can lead to more quality links and, simply, traffic – which is ultimately the goal anyway, right?

    Have more link-building tips? Share them with WebProNews readers in the comments.

  • Time to Start Placing More Emphasis on Bing SEO

    Google SEO vs Bing SEO has been a topic of discussion throughout the industry since Bing was launched. The topic got some heavy play last week at the SMX Advanced conference, and with Yahoo and Bing coming together sometime this year, online marketers are going to want to start thinking harder about incorporating Bing into their strategies if they are not already doing so.

    Do you have a strategy for Bing SEO? Yahoo? Discuss here.

    WebProNews spoke with Janet Driscoll Miller of Search Mojo out at SMX, who presented on this topic. As she notes, some businesses actually see better results from Bing than they do from Google, and when Yahoo starts using Bing for search, Bing’s share of the search market is going to grow dramatically (it also powers search in Facebook, let’s not forget).

    Janet discusses a tool in Bing’s Webmaster Tools that lets you see the types of links pointing to your site and look at their value, so you can go after similar links.
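    You can do a similar review offline against any backlink export. The CSV layout below is hypothetical – column names will vary by tool – but summarizing links per referring domain is a quick way to see where your strongest sources are:

```python
import csv
from collections import Counter
from io import StringIO
from urllib.parse import urlparse

# Count inbound links per referring domain from a (hypothetical) CSV export.
# The "source_url" column name is an assumption; adjust to your tool's export.
sample_export = """\
source_url,anchor_text
http://bikeblog.example.com/review,mountain bike frames
http://bikeblog.example.com/gear,full suspension bikes
http://news.example.org/cycling,mountain bikes
"""

def links_per_domain(csv_text):
    reader = csv.DictReader(StringIO(csv_text))
    return Counter(urlparse(row["source_url"]).netloc for row in reader)

for domain, count in links_per_domain(sample_export).most_common():
    print(f"{domain}: {count}")
```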

    Bing is actually redesigning its Webmaster Tools, however. WebProNews also spoke with Bing’s Eric Gilmore about this.

    The point is, Google’s Webmaster Tools have been very helpful to site owners over the years in their quest for better rankings. Now that Bing is growing in significance, its tools are going to be helpful as well.

    Have you used Bing’s webmaster tools? Did they help your rankings? Comment here.

  • Freshness May Count More in Google These Days (Not Just Because of Caffeine)

    Google’s new SERP design (you know, with the left-hand panel) has created more areas for webmasters to focus their SEO efforts on. While most of the options available here have been available for quite some time, they are now front and center, and they will get used more.

    Has the new user interface affected your traffic? For better or for worse? Tell us.

    Opportunities

    WebProNews spoke with Matt Cutts, head of Google’s Webspam team, at Google I/O recently. He talked a bit about the new Google SERP redesign, and the opportunities it creates for businesses to reach users beyond that one "trophy phrase".

    "The trick – the thing that’s really important – is that it’s different depending on what you’re searching for," he says. "So if you’re searching for Tom Cruise, you’re more likely to see images and options for the different types of image search – you can say, ‘show me the color images’ or ‘show me the large images’, whereas if you’re searching for Obama, you’re more likely to get real-time results, and updates, and stuff like that. So, what I like about it is it surfaces more ways to slice and dice your data, and even if you’re not a power user, if you’ve got those options on the left-hand side, you’re more likely to kind of try to explore a little more. So that’s a good opportunity for webmasters and SEOs. You know, you don’t have to be number one. Maybe you can be number one in a slightly different area that people will find by exploring."

    "Some people get really obsessed with their one trophy phrase, or their one number one ranking, for their number one web search, and they don’t think about things like image search or video search, blog search, or book search – you know, people, it’s not that hard to write a book," he continues. "It can be done. And people also tend to think about search engines, when they might want to think about social media – things like Twitter, things like Facebook…because you want to go where the people are. And people aren’t ONLY searching. They’re hanging out online. You know, they’re on forums. So I think a lot of the time, you can think about ‘where can I show up besides just number one? What are the phrases people are gonna type in that show buying intent’ – where they really would like to get your product? And if you’re paying attention to those sorts of areas, and not just the trophy phrases, then you can find a lot more opportunities."

    Matt also talks a little more about the new UI and the speed at which Google indexes content here.

    Which of the options available in the new user interface have you found most helpful? Share your thoughts.

    Where Freshness Comes In

    One of the most important elements of this new user interface is the time filter – you can filter results by anytime, latest, past 24 hours, past week, past month, past year, or by a custom range. This isn’t just about real-time search, it’s about having regularly updated content, and staying fresh (though real-time search has its place within that).

    Brian Klais, General Manager and VP of Product Management at Covario, has a really good post over at Search Engine Land talking about the time dimension in Google and its effect on sites’ rank – in other words, how content appears when a user adjusts the time filter in the left-hand panel.

    "Here’s the bottom line: By institutionalizing a search time dimension with their UI, Google has introduced a new opportunity for all brands to steal (or have stolen) search marketshare from (or by) the competition," says Klais. "Brands that focus on dynamic site content with fresh social media output stand to gain searchers, at the expense of those brands who stay stagnant, one query at a time. The speed at which the gains and losses occur will be magnified by the availability (or lack) of content within each time filter. Now the ‘recency’ of social media will begin to matter in search."

    To illustrate what he’s talking about, he points to a query for "men’s jeans" and looks at the results (which are quite different) for each time constraint. Based on this example, the smaller brand sites have a better shot at showing up the more frequently they are updated. The bigger brands tend to rank higher, the wider the range of time selected.

    "The issue is a classic chicken or egg problem: unless you are present in the ‘fresh’ results now (aka ‘recency’), you cannot accurately predict what percentage of searchers are shifting to time-filtered results in order to make the business case for action," Klais says.  "Most analytic systems will not yet parse out this traffic either; it is just lumped in with all Google organic results."

    "Ask yourself this question," he adds. "Is there a way to estimate whether any of your current keyword markets are time-sensitive and if you are currently getting organic traffic from time-filtered results?"

    Of course, you don’t want to obsess over the time filter either. It’s just one of many options to consider, but to me, a blog could be a good way to keep offering fresh and relevant content around the keywords you are targeting. Plus, it can help you in the blog search option, not to mention provide useful content and engage an audience.

    Note: Interestingly enough, not long after I wrote this article, Google announced that Caffeine is now complete. This is the new version of Google’s indexing system. It does not affect how Google ranks content whatsoever. What it does is speed up how fast Google indexes content. You can learn more about it here, and in this other interview with Cutts from SMX this past week. Essentially, it’s all about providing users with the freshest content possible, which is why it is somewhat related to the topic discussed above.

    Do you have a strategy to provide fresh content? Tell us about it.

  • Competition Analysis Basics for SEO

    In my last article, “Keyword Research Basics for SEO,” I discussed keyword research and the basics of keyword selection. Of course – you can’t solidify your targets until you understand what you’re up against. All the keyword research in the world won’t help you rank for the keyword phrase “windows” in 6 months with a brand new site. So understanding how to analyze your competitors, and getting a feel for who you can compete with in a reasonable period of time, is paramount to creating a solid strategy. I’ll also be flashing back a bit on keyword strategy.

     

    In the last article we closed with a list of potential keyword phrases, the idea that we needed to divide our phrases into major phrases and longtail phrases and also a new domain (just to keep things realistic). So where do we go from there?

    Generally I start at the top. From the highest searched phrases to the lowest – I do a quick analysis of the major phrases to determine the long term goals and the short term. I also like to look for what I call “holes”. These are phrases that have competition levels lower than one would expect when looking at the search volume. So let’s use the example I was using in the last article and imagine a US-based downhill mountain bike company. And let’s begin with the major targets.

    The phrases we’ll examine for the purposes of this article are the top 10 phrases as ordered by search volume. They are:

    • mountain bike
    • mountain bikes
    • specialized mountain bike
    • trek mountain bike
    • mountain bike frame
    • full suspension mountain bike
    • cannondale mountain bike
    • giant mountain bike
    • mountain bike parts
    • mountain bike reviews

       

    So what are we looking for? It’s obviously not feasible to do incredibly thorough competition analysis at this stage. I’ve listed 10 phrases here, but in reality there are hundreds to consider, and so we need a quick(ish) way to determine the competition levels of phrases. First, let’s install a couple of tools to help you make some quick decisions. You’ll need to install the Firefox browser and the SEOquake add-on. Now when you run a search, you’ll be able to quickly pull the competitor stats. I like to look at the PageRank, links to the ranking page, and sitelinks. Remember now – this is the basic competitor analysis here.

    Here are the stats for the top 10 ranking sites across the 10 top phrases (I’ll leave out the URLs so there’s no promotion):

    Phrase: mountain bike

    Site 1 – PR6, 70,268 page links, 71,177 domain links
    Site 2 – PR6, 262,609 page links, 290,281 domain links
    Site 3 – PR5, 0 page links, 604 domain links
    Site 4 – PR6, 101,136 page links, 206,397 domain links
    Site 5 – PR5, 741 page links, 118,791,902 domain links

    Phrase: mountain bikes

    Site 1 – PR5, 33,097 page links, 40,747 domain links
    Site 2 – PR6, 42,010 page links, 91,385 domain links
    Site 3 – PR6, 262,609 page links, 290,281 domain links
    Site 4 – PR6, 101,136 page links, 206,397 domain links
    Site 5 – PR5, 25,059 page links, 38,132 domain links

    Phrase: specialized mountain bikes

    Site 1 – PR6, 101,136 page links, 206,397 domain links
    Site 2 – PR1, 1 page links, 206,397 domain links
    Site 3 – PR4, 2,001 page links, 2,095 domain links
    Site 4 – PR5, 734 page links, 738 domain links
    Site 5 – PR2, 4 page links, 230 domain links

    Phrase: trek mountain bikes

    Site 1 – PR6, 65,464 page links, 178,712 domain links
    Site 2 – PR4, 108 page links, 178,712 domain links
    Site 3 – PR4, 127 page links, 523 domain links
    Site 4 – PR4, 2,001 page links, 2,095 domain links
    Site 5 – PR0, 0 page links, 3,854,233 domain links

    Phrase: mountain bike frame

    Site 1 – PR4, 6,348 page links, 44,535 domain links
    Site 2 – PR2, 6 page links, 4,303 domain links
    Site 3 – PR4, 196 page links, 523 domain links
    Site 4 – PR0, 28 page links, 35 domain links
    Site 5 – PR1, 0 page links, 294,361,703 domain links

    Phrase: full suspension mountain bike

    Site 1 – PR4, 58 page links, 178,712 domain links
    Site 2 – PR4, 20 page links, 1,729 domain links
    Site 3 – PR3, 7 page links, 9,959,894 domain links
    Site 4 – PR5, 240 page links, 290,281 domain links
    Site 5 – PR3, 0 page links, 294,362,703 domain links

    Phrase: cannondale mountain bikes

    Site 1 – PR6, 62,614 page links, 91,301 domain links
    Site 2 – PR6, 410 page links, 91,301 domain links
    Site 3 – PR4, 0 page links, 2,056 domain links
    Site 4 – PR3, 3 page links, 80,580 domain links
    Site 5 – PR2, 3 page links, 9,959,894 domain links

    Phrase: giant mountain bikes

    Site 1 – PR3, 7 page links, 136,232 domain links
    Site 2 – PR4, 2,001 page links, 2,095 domain links
    Site 3 – PR0, 6 page links, 6 domain links
    Site 4 – PR4, 2,262 page links, 2,392 domain links
    Site 5 – PR2, 1 page links, 60,131 domain links

    Phrase: mountain bike parts

    Site 1 – PR4, 610 page links, 2,366 domain links
    Site 2 – PR4, 851 page links, 4,303 domain links
    Site 3 – PR4, 6,348 page links, 44,535 domain links
    Site 4 – PR5, 4,612 page links, 20,931 domain links
    Site 5 – PR6, 4,612 page links, 20,931 domain links

    Phrase: mountain bike reviews

    Site 1 – PR6, 262,609 page links, 290,281 domain links
    Site 2 – PR5, 240 page links, 290,281 domain links
    Site 3 – PR6, 560 page links, 361,873 domain links
    Site 4 – PR5, 0 page links, 604 domain links
    Site 5 – PR4, 22 page links, 90,123 domain links

    Now, I’d definitely look further down my keyword list than this, but for the purposes of this article let’s assume this is all we have. If that’s the case – what do you suppose would be the primary choice(s)? Were it up to me, I’d go with:

    mountain bike frame – we have a range of PageRank, a range of links and a range of sites. Basically – we’re not up against a wall of high competition and the search volume is solid.

    full suspension mountain bike – a full range of sites. Higher competition than “mountain bike frame,” but we’re looking at a phrase that would sell a whole bike, which needs to be considered, and slightly higher competition is thus acceptable.

    So of these two phrases what would I do? Well – if this was all we had to work with I’d select “full suspension mountain bike” as the main phrase and follow that up with “mountain bike frame” as a major secondary phrase and thus a prime target for proactive internal page link building and optimization.
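    The shortlisting logic above – favoring phrases where the ranking sites show a spread of PageRank and lower link counts – can be sketched as a quick scoring pass. This is a hypothetical heuristic, not a formula from the article; the weighting is an illustrative assumption.

```python
from statistics import mean, median

# Top-5 ranking sites per phrase: (PageRank, page links, domain links),
# taken from the tables above (two phrases shown for brevity).
serps = {
    "mountain bike frame": [
        (4, 6348, 44535), (2, 6, 4303), (4, 196, 523),
        (0, 28, 35), (1, 0, 294361703),
    ],
    "mountain bike reviews": [
        (6, 262609, 290281), (5, 240, 290281), (6, 560, 361873),
        (5, 0, 604), (4, 22, 90123),
    ],
}

def competition_score(sites):
    """Lower is easier: combines average PageRank with the median
    page-level link count (median blunts outlier link farms)."""
    avg_pr = mean(pr for pr, _, _ in sites)
    med_links = median(page for _, page, _ in sites)
    return avg_pr + med_links / 1000  # illustrative weighting

# Easiest phrases first
for phrase, sites in sorted(serps.items(), key=lambda kv: competition_score(kv[1])):
    print(f"{phrase}: score {competition_score(sites):.2f}")
```

    Under this toy scoring, “mountain bike frame” comes out well ahead of “mountain bike reviews,” matching the eyeball analysis above.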

    So now let’s look at whether there are any good longtail phrases. In this industry we’ll be looking for specific parts. Since going through all the different types of parts would be a nightmare in an article, I’ll focus on a couple of parts I ordered recently: a new handlebar and a new rim. To keep things simple I’m going to focus on just a couple of brands in the research BUT in reality we’d take the extra time and look into all the part types and all the brands that we’d be able to sell on our site.

    So for handlebars, here’s the long and short of the numbers and competition:

    Brands researched – Origin and Easton

    “easton handlebars” with 1,000 estimated searches/mth and low competition outside of the manufacturer is a great start. Further, when we look up the manufacturer we see that the Easton EA70 and EA90 models are both sought after as well.

    When we build our site we obviously want to build a structure and hierarchy that are conducive to longtail rankings overall, but what we’re looking for here are ideas as to where to put our energies when it comes to content creation and link building. Handlebars look good by search volume. The average sale per item would be around $25.

    And now to rims:

    Brands researched – Mavic and Sun

    “mavic rims” and “sun rims” both come in at 1,900 estimated searches, but the competition for “sun rims” is significantly lower, with lower link counts and lower-PageRank sites ranking. The average sale here is also going to be in the $40 to $45 range.
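    A rough back-of-the-envelope way to weigh the two part types against each other is monthly searches times average sale, discounted by click-through and conversion rates. The search volumes and sale values come from the research above; the CTR and conversion figures below are illustrative assumptions, not numbers from the article.

```python
# Monthly search estimates and average sale values from the research above;
# CTR and conversion rate are hypothetical placeholders.
candidates = {
    "easton handlebars": {"searches": 1000, "avg_sale": 25.0},
    "sun rims":          {"searches": 1900, "avg_sale": 42.5},  # midpoint of $40-$45
}

ASSUMED_CTR = 0.20         # share of searchers who click our listing (assumption)
ASSUMED_CONVERSION = 0.02  # share of visitors who buy (assumption)

for phrase, d in candidates.items():
    monthly_revenue = d["searches"] * ASSUMED_CTR * ASSUMED_CONVERSION * d["avg_sale"]
    print(f"{phrase}: ~${monthly_revenue:.2f}/month")
```

    Under these assumed rates, rims come out ahead of handlebars on estimated revenue, which lines up with picking “sun rims” as the first longtail target.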

    Based on this, my first efforts for the whole site would be “full suspension mountain bike” for the homepage, “mountain bike frame” as a major internal page, and I’d focus my first longtail efforts on rims (“sun rims” specifically).

    Now – we’d of course look further than this, but what we can see is the direction we’d go if all we had to go on was the above data. As noted – were we launching this site, we’d look into every brand and every part type and research further than the top 10 phrases, but that would have made for a book, not an article, and let’s be honest – it would have been a very boring book unless you were planning on launching a mountain bike site.

    So now you’ve done enough competition analysis (remember – it’s basic research we’re talking about) to figure out what direction to head in. In my next article I’m going to cover more advanced competition analysis. We’ll go in knowing what we want to accomplish in the way of keywords and be working to map out how to take the top spots.

    Until then – get your campaigns sorted out for potential keywords and keep reading … this is where it gets really interesting.

  • Will Dmoz Continue to Have a Place in Search?

    Nearly a year ago, we looked at what Dmoz (aka: The Open Directory Project) was up to, and if it still had a place in search. The directory was talking about how it was looking for "a little respect" as it prepared to celebrate its 11th birthday (on June 5).

    Has Dmoz earned any more of that respect going into its 12th year?
    Tell us what you think.

    Dmoz has been brought back into the discussion as Google’s Matt Cutts appeared in a new Google Webmaster Help Video answering the following user question:

    Why is Google still taking notice of DMOZ? Many have alleged that the editors are corrupt. It’s impossible to get them to list a site even if it is very relevant to a specific area.

    "I know that people do have complaints about Dmoz, and we don’t show it in our one-Google-sort of tabs at the top of the page like we used to in previous years, but in some countries, it can be very hard to type in queries. It can take a lot of time," says Cutts. "For example in something like Chinese or Japanese or Korean, sometimes it might be easier to browse by clicking, rather than typing in the query, and so especially in those sorts of countries, it can be very helpful to show Dmoz."

    "But we don’t use Dmoz in a lot of the ways that we used to. We don’t show the Dmoz categories or the Open Directory categories beneath the snippet, and we used to do that," he adds. "We don’t show it on the main page like we used to anymore. So if you’re frustrated, you can always try a different category that you also think is relevant. You can always go to editors up the chain. But in general, if you can’t get into Dmoz, I wouldn’t necessarily worry about it. There are a lot of other great places to get links across the web."

    Dmoz continues down the slope it’s been on for quite some time in terms of unique visitors. Google giving it less prominence certainly plays at least some role in this. It does still get over 18% of its referrals from Google.


    Dmoz on its Own Future

    Dmoz swears it still has plenty of life left in it, so if you believe the editorial department, there may be new opportunities from Dmoz down the road. In a post earlier this year, reflecting upon the last decade, Bob Keating, Dmoz editor-in-chief said, "Over the ’00 decade, DMOZ has grown to be one of the most successful collaborative projects on the web. It has outlasted its commercial counterparts, and continues to be relevant in the search industry. The keys to its longevity and usefulness are its dedicated community, its open, collaborative editorial model, its non-commercial nature, and open data distribution channel."

    "While DMOZ receives hundreds of editor applications, and lists thousands of websites each week, it needs a new Plan – a new blueprint for the future of how the web is organized, and how human organized data is consumed," he says. "Using traditional web directories as a means for information discovery is a thing of the past. However, the need for organized web-based content continues to grow exponentially. The future of DMOZ does not lie merely in improving its toolset, making it more SEO friendly, or convincing others of its collective brilliance. Its future lies in turning the entire thing on its head."

    Keating went on to list some goals for this decade, including the development of an API for Dmoz data to allow editors and developers to write new apps using it. He also wants to transform Dmoz from a fixed-path directory to "the largest faceted system for organizing information on the web," have it become a "major influencer" for bringing the semantic web out of the lab/enterprise and into the entire web, and transforming Dmoz into a "suite of products with multiple levels of participation and engagement."

    Things have been pretty quiet on the Dmoz front since then. The only updates on the Dmoz blog have been from editors talking about their experiences editing specific categories. Perhaps that is because some of the aforementioned goals are in the process of being realized behind the scenes.

    Note: With a great deal of talk in the comments about corruption, you may be interested in hearing from a former editor on the topic. Read here.

    Do you think Dmoz has a place in the future of the web? In the future of search? What kinds of apps would you like to see built upon a Dmoz API? Share your thoughts in the comments.

     

  • Why New Google SERPs Might Mean More Traffic for You

    Now that the masses have access to Google’s newly redesigned results pages, it’s time to consider this in an SEO light if you haven’t already.

    How do Google’s New SERPs Affect SEO? Comment here.

    Google has had its search options available for about a year, but they have not been in the face of the user like the newly redesigned SERP is. With this new design, users don’t have any choice but to notice the options that are available. It’s not too different from Bing or Yahoo in that respect (Danny Sullivan notes that Ask pioneered this design). The difference is that way more people search with Google on a regular basis (in fact, last month Google reportedly dominated the search market by even more than usual).

    SEO Strategies and Increased Engagement from Searchers

    The new SERPs may shake up SEO efforts, simply because users will start going to the different options Google provides them, taking them to different sets of results. Now that the options are in the limelight, users are more likely to use them.

    Yahoo tells us when they added features to their left-hand navigation bar, engagement increased. "We’ve been steadily adding more filtering options and relevant search suggestions to our left-hand navigation bar…and have seen engagement and click-throughs for those features double over the past seven months." I can’t imagine why Google wouldn’t also see an engagement increase for certain features that are now more visible.

    It’s going to come down to evaluating the different options for any given query that you wish to rank for, and focusing efforts upon those. I’ll refer back to the article I posted shortly after Google launched its search options in the first place; you can find some tips in that. The same general thinking still applies, but it just got more important.

    New SERPs Make Social Even More Important

    The options in the left panel pull from "everything" – classic Google results (universal, organic, paid, etc.), blogs from Google Blog Search, Books from Google Books (which includes magazines), Images from Google Image Search, News from Google News, Maps from Google Maps, Shopping from Google Product Search, Videos from Google Video (which includes videos from YouTube and other sources), and Updates from Google’s real-time search.

    That last one is of particular note, because before, users generally only saw Google’s real-time search in action on select newsy queries unless they hunted them down. Real-time search for any query is now much more accessible, which makes real-time search a bigger deal for search marketing (here are some tips for getting found in real-time search). Here’s how Google ranks tweets.

    Social interactions are becoming more important. The new SERPs also place much more emphasis on social search results. The same goes for location; you’ll notice "nearby" is one of the options. Discussions is another option. Google appears to draw from a variety of sources for this one, but it stands to reason that engaging in conversation throughout the web has some value to Google’s results. There are definitely a lot of forum results here – another reason forum participation can be a valuable use of your time. Forums and Q&A are actually a couple of sub-options, but I’ve seen blog posts in the discussions results too.
     
    Emphasis on Diversification of Where You’re Ranking in Google

    What it boils down to is that ranking in all of Google’s different search engines has become even more important for getting traffic from Google. Here are some tips for that. I expect traffic for sites listed in any of these to increase as a result of Google’s New SERP. Keep in mind that Google has been testing this for a significant amount of time. If you think Yahoo was seeing increased engagement, imagine what Google will attract.

    I would watch for Google to add more options to the left-panel at any given time. Though they have already experimented a great deal with this layout, I expect we’ll see a lot more tweaking as time goes on.

    Do you think Google’s new SERPs will increase your traffic? Tell us what you think.

  • Are Likes and Retweets the New Links?

    Search has been evolving for years, and it looks as though it’s really starting to enter a new era entirely. While search and social media may be two different animals, it is becoming clearer that they’re directly related, and they will continue to blend into one another.

    We’re already seeing search engines attempt to place some kind of ranking on social updates. For example, we already know that search engines take things like follower quality into account in how they rank tweets (see more on that from Google and Bing).

    There has been a lot of talk of Facebook "likes" and Twitter retweets taking the place of links. Nobody’s saying that links are dying exactly. There is obviously plenty of room for link sharing on either of these services, but these kinds of sharing are replacing links in many cases. Before Facebook even announced its plans to take over the web, WebProNews talked with Rand Fishkin of SEOmoz about how Twitter is "cannibalizing the Web’s link graph":

    Now that Facebook’s Open Graph and social plugins are devouring the web, suddenly liking is taking the place of linking in some speculative scenarios. We talked about some implications Facebook’s initiative has for search in a recent article.

    While I don’t think anyone specifically saw the Open Graph stuff coming too long before it was announced (maybe somewhat in the days leading up to it), it’s really still reflective of what we’ve known for some time. The way people are obtaining information online is diversifying. I feel like I’m beating a dead horse (as I’ve written about this repeatedly), but it’s just what the big picture is about. Google’s real competition isn’t coming from other search engines. It’s coming from different avenues of information access.

    The biggest threat to Google the search engine (as opposed to the company, which offers a lot more) is people not having to rely on the traditional search engine. While I don’t think Google has anything to truly worry about in terms of losing users, it has to worry more about users just not using it as often, because they’re getting their information from apps…from friends via social networks…even when they’re not necessarily at Facebook.com itself, but on any given site or app, via things like social plugins (Twitter has its own @anywhere platform, and we’ll probably see more ways networks are penetrating sites. Hell, Google already has its Friend Connect and Buzz…I would not count the company out in expanding into more of this kind of stuff).

    Style Coalition CEO Yuli Ziv has an interesting article at Mashable about "5 Reasons Google and Search Won’t Dominate the Next Decade". Her reasons include:

    1. The search process is inefficient
    2. Mobile GPS Eliminates the need for location-based search
    3. Social Matching Could Create Valuable Connections
    4. Content Recommendations to Replace Search
    5. Suggestions Will Be the Core of Our Shopping Experience

    She elaborates on each of these of course, and some of them are debatable, but really, the diversification of how people obtain information has already begun.

    Facebook likes may not translate to better Google rankings, but so what? They may translate to a better Facebook ranking. After all, the more people that "like" your brand, the greater the visibility within Facebook. With over 400 million users and counting, and Facebook expanding its presence, that means more visibility period, and at a more meaningful level of personalization. It’s not about choosing between likes and links. Both are ideal.

    WebProNews recently stopped by comScore’s New York offices, and had a chat with search evangelist Eli Goodman who made some good points about where search is headed, and how not only the technology of search engines changes over time, but the habits of users, and the relationship between the two.

    As far as optimizing for search, it seems pretty clear that social and mobile will continue to play larger roles. It also seems clear that if you want social success, you need to work at your relationships with others within your networks. Look at Twitter’s Promoted Tweets strategy around "resonance." Look at tools like Trst.me, which uses a PageRank-like strategy to score Twitter users.

    Look at the implications of Facebook likes. Regardless of what Facebook chooses to do with this data itself, they’re already being utilized in other places, like in search via OneRiot. The whole point of Facebook’s Open Graph is to connect the web. It stands to reason that Facebook likes will be of influence in plenty more places.

    The point of all of this is, it’s not just about getting links anymore. Links will always be of use, but social interactions may equal them in importance, and in some cases may be of greater use to your visibility, and ultimately getting people to your site, your content, your store, or your shopping cart.

  • Where Does Location Fit into the SEO Equation?

    We’re living in an increasingly open and revealing world where people are eager to tell you where they are, where they’ve been, and where they’re going. Not everyone is so eager, but location-sharing is a rising trend that is not to be ignored. Naturally, the phenomenon will have a growing impact on search.

    There is still plenty of room for conversation about what location means to search.  Tell us what you think.

    Remember when the industry was still trying to make sense of how social media and search fit together? It’s now fitting together in a variety of ways, and now we’re at a similar point with location and search.  

    Google Has Its Own Significant Amount of Location Sharers

    At the Web 2.0 Expo this week, Google Product Manager Steve Lee revealed some interesting info about Google Latitude, the company’s location-sharing service, which has been around since long before location-sharing became such a huge trend. Foursquare – the location-sharing service you hear about most these days – has a million users. Latitude has 3 million active users, and it has grown 30% per month so far this year.

    MG Siegler at TechCrunch says Lee hinted that Latitude would soon have a check-in component, something that has made services like Foursquare so popular, and of great use to local businesses. He also said that Latitude has taken some time to gain ground because the iPhone lacks the ability to run services in the background (so there isn’t a Latitude iPhone app), but the iPhone OS will have that ability, and Android usage is on the rise (apparently BlackBerry has been big for the service as well). Over 10% of all Android users are using Latitude.

    Location as a Search Signal

    Google has been very open about how much emphasis it is placing on mobile, and mobile and location-sharing go together like corn flakes and milk. Smartphone usage will continue to grow. Therefore location-sharing will continue to grow. Android usage in particular is growing rapidly.

    Diana Pouliot, Director of Mobile Advertising at Google, recently said a third of all Google searches via the mobile web pertain to some aspect of the searcher’s local environment. The company has also been quoted as saying it thinks of location as a "hugely important signal."

    With Google’s newly redesigned SERPs, location-based searches will increase, or rather filtering searches by location will. With the "nearby" option more visible, it stands to reason more people will use it. At this point, I’m not seeing real-time location-based info here, but that may change in the future. Google will continue making tweaks and adding features, and having real-time info here may begin to make sense.

    Best Pizza in Nearby Results on Google

     

    Of course you have the Updates option as well, where you get the real-time info. There’s not a "nearby" sub-option under this option at this point, but with Twitter enabling location info, Facebook launching such a feature soon, and of course Google’s own Buzz, it would also make sense for that sub-option to appear here soon. Don’t be surprised if it does.

    Best Pizza in Updates Results on Google

    According to Siegler, Google has been working "heavily" on location history with regard to Latitude, with updates to this feature expected in the coming weeks. "This will allow people who run Latitude in the background to get interesting information and data about where they’ve been," he says.

    Facebook Will Likely Have Location Info This Month

    According to AdAge, Facebook will be launching its location-sharing feature as early as this month. McDonald’s is already building a campaign around it, and others are waiting to do the same.

    Users will be able to share their location in status updates, the report says. With Facebook taking over the web in general, this will likely have huge implications, but for search specifically, it may play a significant role as well. Google of course has its real-time search, which includes publicly accessible status updates from Facebook.

    With Google’s new SERPs, this feature is highlighted to a much greater extent. Before, users would generally only see real-time results for newsy queries that were seeing a great deal of current updates. Now, for any query, a user can simply go to the updates option, readily visible from the left panel, and see the results.

    Wrapping Up

    It remains to be seen just how important location will truly be. Despite its popularity and the rushing of companies and services to take advantage of the technology, it still freaks a lot of people out. Not everyone is going to share their info, or at least willingly. For this reason, there may always be a large part of the market MIA for any location-based campaigns. However, for search, as long as there is a substantial amount of location-related data out there (and it appears that this will only grow rapidly from here on out), local businesses stand to benefit, and so do consumers looking for location-based relevance.

    It’s still unclear just how location-sharing is going to impact search exactly, and just how search marketers will specifically be able to take advantage as far as results go, but all signs point to new opportunities for targeting customers on a very relevant level – where they are, where they have been, and where they are going.

    Either way, it might be a good idea to start looking for ways to reach consumers through their location-sharing habits. Without the benefit of search, there are still tremendous opportunities. As it gets more integrated into search, you’ll be ahead of the game.

    Are you currently using or planning location-based strategies in your marketing efforts? Comment here.

  • New Data From Google Can Help You Optimize Your Site for Conversions

    Google has just started sharing more detailed data for each individual search query in the Top search queries feature in Webmaster Tools.  Google used to just report the average position at which your site’s pages appeared in the search results for a particular query. Now users can click on a given search query to see a breakdown of the number of impressions (number of times your site’s pages appeared in the results for the query), as well as the amount of clickthrough (number of times searchers clicked on that query’s search results to visit a page from your site) for each position your site’s pages appeared at in the results associated with that query. Google also shows a list of your site’s pages that were linked to from the search results for that search query.
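    The per-position breakdown lends itself to a simple click-through-rate calculation. The numbers below are made up for illustration – the Webmaster Tools feature supplies impressions and clicks per position for each query, and CTR is just clicks divided by impressions:

```python
# Hypothetical per-position data for one query, shaped like the
# Webmaster Tools breakdown: position -> (impressions, clicks).
positions = {
    1: (12000, 3100),
    3: (8500, 700),
    7: (4200, 90),
}

def ctr(impressions, clicks):
    """Click-through rate as a fraction; 0 when there were no impressions."""
    return clicks / impressions if impressions else 0.0

for pos, (imp, clk) in sorted(positions.items()):
    print(f"position {pos}: {ctr(imp, clk):.1%} CTR ({clk}/{imp})")
```

    Tracking this per position over time is what makes the data useful for conversion optimization: if CTR at a given position is low, the snippet (title and meta description) is the lever to pull.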

    Is the new data being provided by Google of use to you? Tell us what you think.

    How This New Data Can Help Site Owners

    WebProNews spoke with industry veteran Jill Whalen of HighRankings about how this new data can help site owners. "In the past, I haven’t found the data in Webmaster Tools all that helpful other than the occasional finding of a crawl error," she says. "Some of the information they provide isn’t quite accurate, such as when they say that certain Meta descriptions are duplicates when they actually aren’t. These inaccuracies cause people to wonder what they’re doing wrong and in some cases they even panic or waste time ‘fixing’ things that were not broken in the first place, just because they believe everything that comes out of Google."

    "This new data–assuming it’s accurate–provides a new layer of information beyond that which we can typically get elsewhere," Whalen continues. "As far as I know, there’s no other way to know the actual number of times an organic listing in Google is shown to people for a given keyword phrase. That’s pretty interesting and important information!"

    Google Offers New Query Data for Impressions and Conversions

    "Where I see some real value, however, would be in conversion optimization–trying to increase the clickthroughs for your existing organic listings. Just knowing what your clickthrough conversion rate actually is, is a whole new set of data that we never had before."

    Another industry veteran, Aaron Wall of SEOBook, tells me, "For years Google has provided some mystery meat data of marginal value and so I typically have not recommended registering with their webmaster tools. But this is the first tool they have offered which flips that recommendation on its head, as these stats give you new insights into how you are doing in search – data that is not easy to get anywhere else." He’s got an interesting post up about it himself.

    How Accurate is the Data?

    Google’s addition of the new data has been met with a great deal of enthusiasm. Comments on Google’s announcement are overwhelmingly positive. That’s not to say, however, that there isn’t some amount of skepticism.

    "As I said, this data will be very useful if it is indeed accurate. There’s been some Twitter buzz from other SEOs whose data doesn’t match up with their Google Analytics," says Whalen. "For our High Rankings website, the clickthroughs for any given keyword phrase didn’t exactly match what my Google Analytics showed for the same keyword phrases, but it was fairly close. For instance, my top two Google organic keyword phrases showed 3,020 and 1,193 visits when using Google Analytics. Via Webmaster Tools, the same keyword phrases show 2900 and 1300, respectively. That’s pretty close. Perhaps they’re sort of just rounding off (in a strange kind of way!).  Other phrases had similar differences in the numbers."
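    Whalen’s comparison – 3,020 vs. 2,900 and 1,193 vs. 1,300 – amounts to checking the relative difference between the two sources, which is easy to automate. The phrase labels below are placeholders, since she doesn’t name the keywords:

```python
# (Google Analytics visits, Webmaster Tools clicks) for the two phrases
# Whalen cites; the phrase names are placeholders.
pairs = {
    "phrase A": (3020, 2900),
    "phrase B": (1193, 1300),
}

def relative_diff(a, b):
    """Absolute difference as a fraction of the larger value."""
    return abs(a - b) / max(a, b)

for phrase, (ga, wmt) in pairs.items():
    print(f"{phrase}: {relative_diff(ga, wmt):.1%} apart")
```

    Both pairs land within roughly 4–8% of each other, which is consistent with her "pretty close, perhaps rounded" read of the data.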

    Regardless of how precise the information is, webmasters have some new numbers to sink their teeth into, and assuming that many more share similar views to Whalen’s this might make Webmaster Tools a great deal more useful to a lot of site owners. In fact, a lot more site owners may soon be using Webmaster Tools for the first time. Google also just announced a new deal that will insert Google Services for Websites into the latest version of the Plesk Panel, which is said to be used by millions of site owners. Webmaster Tools is part of that Services for Websites package.

    Will you find this new data from Google useful? Let us know.

  • Yahoo: We’ve Got a Left-Hand Navigation Bar Too

    It seems like Yahoo has been reminding people a lot lately that it is still a search company. In what I’m guessing is a response to Google launching its new search features, Yahoo has once again reminded us of its own features.

    "As you know, Search innovation continues to be a top priority at Yahoo! – that’s why we have been working hard behind the scenes to continuously evolve the search experience for our users," Meagan Busath, Yahoo’s Director of Global Product Communications tells WebProNews.

    "When we launched our New Yahoo! Search back in September 2009, we introduced a number of handy tools and a new interface that helps people find and explore what matters to them most through sites they know and love," she says. "We’ve been steadily adding more filtering options and relevant search suggestions to our left-hand navigation bar since then, and have seen engagement and click-throughs for those features double over the past seven months."

    Yahoo Has Left-hand side navigation too

    Features she refers to include:

    · Offering related suggestions modules, featuring thumbnail images for very relevant people, places or things related to the results you’re viewing – i.e. for entertainment searches, we show related people, movies, songs; for travel searches, we show related points of interest, etc.

    · Filtering results by favorite sites, for example if you enter the search term “how to make salsa,” we display filters for sites like eHow or Allrecipes.com on the left

    · Narrowing results by different types of structured data, i.e. for news searches you can filter by images, video, time

    · Links to “trending now” topics that highlight the most popular things people across the Web are searching for at any given time

    · Incorporating Yahoo!’s powerful SearchAssist technology for more generally related keywords – if you search for “digital camera,” you will also see links to related search results for Nikon, Canon, photography, etc.

    As Yahoo often says, search is not just about "ten blue links" anymore. Really, it’s been that way for quite some time, but we continue to see different deliveries of results across all of the major search engines, and it makes the SEO game all the more interesting.

  • Is Your Content Getting As Much Out of YouTube as it Could Be?

    YouTube still claims to be the second largest search engine in the world. Just think about that for a minute. If you produce online video and it’s not on YouTube, you’re probably missing out on a great deal of potential viewers. If you’re not producing video at all, you’re missing out on a lot of searches.

    Do you consider YouTube important to search marketing?
     Let us know.

    However, just uploading content to YouTube is not going to be enough. Like with any other form of search engine, content needs to be optimized to be found. At SXSW in Austin back in March, WebProNews spoke with Margaret Gould Stewart, who leads YouTube’s user experience team. She talked about some reasons a lot of content producers are missing out on some tremendous opportunities when they use the world’s most popular online video site.

    "When you’re building a sustained audience, you have to continually create great content that connects with your audience," says Stewart. "I think the secondary part is understanding your audience – understanding who you want to reach, and proactively cultivating a relationship with the people in your audience. And on YouTube that means not just creating great content and uploading it to the site, but actively building your subscriber base, so that you can be in direct and regular interaction and conversation with those people."

    "We find that video producers who are really active in the conversation, whether it’s comments or uploading ‘how this video was made’ – you know, kind of the behind-the-scenes – people are really fascinated by that stuff, and we see some of our most successful partners really having that, again, kind of ongoing conversation – not an arm’s length relationship to the audience, but very engaged," she adds.

    "We sometimes see content producers not investing enough time in attaching great meta data to their content, because like I said, YouTube is the second largest search engine in the world, and we all know that for Google, it’s important to think seriously about search engine optimization, because you can have the great content, and ideally the cream will float to the top, but there’s definitely things you can do to help yourself along, right?"

    "Good clear, direct titling of your content, putting the right kinds of tags…because the fact is initially when content goes viral, people may discover it through search engines, or embed it in blogs, but then it reaches that really exciting word-of-mouth status, where I just may mention it to you person-to-person, and then what most people do is just go to YouTube.com and they search for it," she continues. "So if you’re not indexed well in the search engine because you haven’t attached great meta data to your content, you’re going to miss out on that audience."
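    Stewart’s advice about titles, tags and descriptions lends itself to a simple pre-upload checklist. Here’s a minimal sketch – the specific thresholds (a 100-character title, at least three tags) are illustrative assumptions of mine, not official YouTube limits:

```python
def check_video_metadata(title, description, tags):
    """Flag common metadata gaps before uploading a video.

    The thresholds here are illustrative assumptions, not
    official YouTube limits.
    """
    problems = []
    if not title:
        problems.append("title is missing")
    elif len(title) > 100:
        problems.append("title is too long")
    if not description:
        problems.append("description is empty")
    if len(tags) < 3:
        problems.append("fewer than 3 tags")
    return problems
```

    Run it before every upload; an empty list means the basics Stewart mentions – clear title, real description, useful tags – are at least present.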

    "The other thing that is really important is enabling embedding," notes Stewart. "It’s probably the number one most important thing, because what we see in videos that become very popular, very quickly and take on that kind of life of its own, a lot of that initial traffic in the first 48 hours happens actually off-site."

    Note: This actually plays to a point I made about Twitter embeds as well.

    If you want more success from your online video endeavors, read 35 Ways to Improve Your Online Video Performance, and Tips For Ranking Higher On and With YouTube, which features an interview with YouTube Product Manager Matt Liu. If real-time, live video is your thing, check out 8 Tips for Real-Time Video Blogging.

    By the way, YouTube is renting movies now. While it’s not exactly taking over Netflix at this point, I would expect this to grow significantly and get more people spending more time at YouTube – where a YouTube search box is always close by, and relevant related video suggestions are constantly served to viewers.

    Is YouTube a significant part of your marketing strategy? Comment here.

  • Keyword Research Basics For SEO

    I’ve said it before and I’ll say it again … there is no more important step in the SEO process than keyword research. One could make a compelling argument for link building, architecture or copywriting, but at the end of the day, ranking highly for keywords that either don’t convert, or that take so long to rank for that you close up shop waiting, isn’t going to help too terribly much – so in my opinion, keyword research ranks higher in importance. In fact, when I’m building affiliate sites, my first step is to look up keywords and competition levels; then I look into products and websites, and this method has worked very well indeed. It ensures that I choose keywords that will both convert and that I can rank for in a period of time, and with an effort level, that matches the return.


    So – if you’re doing keyword research, where should you begin? Unless you’re an affiliate marketer, you already have a product, and since you’re the target audience of this article, I’m going to assume that’s the case. For the purposes of this article, I’m going to pick a hobby of mine – and an area where I don’t have a client – and imagine I’m doing keyword research for the imaginary online downhill mountain biking store DH Mountain Bikes.

    So Where To Begin …

    The first thing one needs to do is try to think up all the possible phrases that might apply. I call this my seed list … it’s the list of phrases that my research starts with and is generally based on brainstorming. In this case the list would be:

    • downhill mountain bike
    • dh mountain bike
    • mountain bike

    The keyword tool I generally use first is Google’s keyword suggestion tool. There are other great tools but I’ve found Google’s tool to be as accurate as any other, the price is definitely right (free), and they’re very good about providing the information required to know just how wrong the data is if you know where to look. So let’s do just that.

    Before we begin you’ll need to head over to Google’s keyword tool at https://adwords.google.com/select/KeywordToolExternal. In the top left (for now) you’ll see a link to a beta version of the tool. Click on the link and you’ll be at the new version of the tool which will provide you easy access to much more information – as long as you know what to look for. So let’s begin with our three seed phrases.

    When you see the list, you’ll first have to know what the numbers are. This tool is designed for AdWords, and the default number is the Broad match, which means it includes every phrase containing the term. For example, the term “mountain bike” has a broad match total of 2,740,000, which includes “downhill mountain bike”, “mountain bike parts”, “kona mountain bike”, etc. What we want to know is how many searches are for “mountain bike” itself. Down the left-hand side you’ll see a set of check boxes. Deselect “Broad” and select “Exact” and you’ll get the Exact match numbers – the number of searches for the exact phrase. You’ll quickly see that 2,740,000 drop to 450,000.

    This is how many people searched the GOOGLE SEARCH NETWORK for “mountain bike”. Why is this in caps? Because it’s so commonly misunderstood that I definitely want your attention brought to it. This isn’t the number of searches on Google.com – it’s the number of searches on all sites whose search is powered by Google. From YouTube to Beanstalk’s blog search, it’s all in there, so the data starts to get skewed from the start. Then add in all the automated queries from rank-checking tools, plus the manual searches from you and your competitors, and the data gets further skewed. This skewing will exist in all data – the thing I like about using Google is that at least we know more about what’s adjusting the data.
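    The broad-versus-exact distinction is easy to demonstrate on a sample query log of your own. This hypothetical helper only roughly approximates how AdWords match types behave:

```python
def match_counts(queries, phrase):
    """Roughly mimic AdWords match types over a sample query log:
    broad counts every query containing all of the phrase's words,
    exact counts only queries that are exactly the phrase."""
    words = phrase.lower().split()
    broad = sum(1 for q in queries if all(w in q.lower().split() for w in words))
    exact = sum(1 for q in queries if q.lower() == phrase.lower())
    return broad, exact
```

    On a five-query sample log containing “mountain bike”, “downhill mountain bike”, “mountain bike parts”, “kona mountain bike” and “road bike”, this returns a broad count of 4 but an exact count of only 1 – the same kind of gap as 2,740,000 vs. 450,000 above.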

    OK – so from there we need to organize the data into a more useful set of information. To do this, one needs to understand the columns of data. The first column is the keyword; the second, you’ll see, is a link to the term on Google Insights (we’ll get into this later). The next is Global Monthly Searches – the average number of searches per month worldwide. This can be helpful in some industries, but in ours I’m only concerned with the US market, which is where my imaginary store ships to, so I’m more interested in the next column, Local Monthly Searches – the number of searches in the US (or whatever region I’ve specified when entering my keyword phrases). This is the data I’m interested in. The last column is the search trend. This is extremely important but often overlooked; it’s a column that wasn’t visible by default in the old/current version of the tool.

    OK – let’s organize our data by search volume. Click on “Local Monthly Searches” and you’ll see the keywords ordered by descending search volume. With this data in front of me, I then typically look over to the Trend data to see what I can find there. In our case we’re going to see an increase in search volume in the spring and summer. This makes sense, of course. Think of your industry and see if the trends reflect what makes sense.

    I’m also looking for anomalies. Often I’ll see phrases that jump for a single month. Unless there was a news story or other event that would spark interest in a single term or brand, a rank-checking tool or some other such incident is likely inflating the data. You need to look at these trends and see if they make sense. If not, you need to either test the phrases with PPC or just skip over them and select different phrases. There’s little worse as an SEO than focusing your energies on a phrase only to find that the search volume is not what was expected based on the estimates delivered.
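    One crude way to flag those single-month jumps is to compare each month against the median of the series. A sketch – the 3x threshold is an arbitrary assumption of mine, not an industry standard:

```python
from statistics import median

def spike_months(monthly_volumes, factor=3.0):
    """Return the indices of months whose search volume exceeds
    `factor` times the median month -- a crude flag for the
    single-month anomalies worth double-checking."""
    typical = median(monthly_volumes)
    return [i for i, v in enumerate(monthly_volumes)
            if typical > 0 and v > factor * typical]
```

    A flagged month isn’t proof of bad data – it’s a prompt to look for the news story, seasonality or tool traffic that explains it before you commit to the phrase.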

    So now what?

    So what do you do once you’ve filtered your data down to just the phrases whose competition levels you want to look into? Well, the first thing I do is look at the trends to see if there are any phrases that obviously need to be filtered out. In this case there really aren’t any high in the search volume column. So the only thing left is to look at the competition levels to see what makes sense. For our purposes we’ll be dividing the list and research into two categories:

    Major phrases – We need to decide what the long-term goals are going to be and the targets for the main pages. These will be the totally generic phrases such as “mountain bike” and “downhill mountain bike” as well as brand or type specific phrases such as “specialized mountain bike” and “full suspension mountain bike”.

    Longtail phrases – We also need to look into the types of longtail phrases we’re going to want to target. In this case I know I’ll want to target specific parts, which will require new research. I’ll spare you the details there, but I’ll end up with specific models of components, such as “hayes mx2”. You don’t need to know what that is – you need to know the makes and models in your industry (or other longtail opportunities such as “new york hotel with jacuzzi”, etc.).

    I generally would gather together a list of 15 or 20 major phrases and 50 or 60 longtail phrases and would then head into the competition analysis to determine which phrases to move forward with.
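    The major/longtail split above can be expressed as a simple rule of thumb. In this sketch the word-count and volume cutoffs are arbitrary illustrations – you’d tune them to your own market:

```python
def categorize(keywords):
    """Split (phrase, monthly_searches) pairs into 'major' head terms
    and 'longtail' phrases, using crude illustrative cutoffs."""
    major, longtail = [], []
    for phrase, volume in keywords:
        # head terms: short and high-volume; everything else is longtail
        if len(phrase.split()) <= 3 and volume >= 5000:
            major.append(phrase)
        else:
            longtail.append(phrase)
    return major, longtail
```

    Feeding it our example data puts “mountain bike” and “downhill mountain bike” in the major bucket, while “hayes mx2” and “new york hotel with jacuzzi” land in longtail.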

  • What Facebook “Likes” Mean for Search & Reputation

    It’s been nearly a week since Facebook rocked the world with its Open Graph announcements, and many of us are still wrapping our heads around all of the implications they have. I don’t think there’s any dispute that it’s a huge move, and that it’s important to pay attention to from a business perspective, but just what it means for businesses is still up in the air in some regards. Like Facebook itself, or even social media in general, we’re going to see more benefits (and possibly negatives) as time goes on, and more sites and applications harness the power of said Open Graph.

    As those wheels turn in our heads, there is plenty of discussion already happening around the subject – not just the Open Graph and the issues related to it (open web ramifications, privacy, etc), but how we can indeed take advantage of it.

    Traffic

    In a recent article we talked about why Facebook’s Open Graph, and particularly its social plug-ins, will be good for driving traffic. It’s pretty straightforward. The like and recommendation buttons are essentially different versions of the share buttons that people have been using to drive traffic for quite some time. The main difference is that instead of only showing up in the news feed and disappearing shortly thereafter, they will remain on the user’s profile page for people to see in the future – a fixed link to your content.

    Have you seen more traffic from Facebook’s buttons since they’ve launched? Let us know.

    Search/SEO

    Search Engine Land contributing editor Greg Sterling makes some interesting points about the search implications of the whole thing:

    However, the vision here is a network of discovery tools and information that operate higher up in the funnel than search: what are my friends doing, where are they eating, what do they recommend? This clearly doesn’t eliminate the need for search. But it does represent an alternative way in many cases to discover information.

    Yet the mountains of data that Facebook will gain could improve Facebook search results and potentially the coming, new and improved Bing integration. At a simple level, if Facebook knows the most “Liked” sushi restaurants in New York and those liked by my social network it can show me that information in search results. That hypothetically makes Facebook search much more social and more of a “recommendations engine” than Google at this point.

    Nobody’s saying Facebook is poised to replace Google, but the whole thing falls in line with the diversification of search we’ve been talking about a lot lately – people are using more and more applications to find the information they’re looking for. Facebook obviously plays a huge role in this. Also consider that Facebook’s search market share has been on the rise, and it stands to reason that will continue as more data becomes available while this Open Graph expands.

    Do you see Facebook’s own search becoming more of a go-to place for finding information? Comment here.

    Local

    Assuming that every business rushes to get like/recommend buttons from Facebook the way they would rush to claim a listing in Google’s Local Business Center (now called Google Places), Facebook may become a very valuable place to find the best businesses in any given category.

    As Sterling says on his Screenwerk blog, "It could do nothing in particular or it could build the single most effective local directory and search site that exists. This data will be more valuable than anything Google has or any individual local publisher-partner possesses. That includes Yelp, YPG or anyone else that joins the Open Graph and implements these new Facebook platform tools."

    This is mostly forward thinking, and we don’t know what Facebook is going to do. It’s definitely something to keep an eye on. Either way, local businesses are likely to attract fans from their own areas, who may in turn pass them along to friends in the area. Facebook has already been a great marketing tool for local businesses, and the Open Graph will only help in that regard.

    Do you think Facebook is going to become increasingly important for local businesses? Share your thoughts.

    Reputation

    Facebook’s latest changes have plenty of implications for reputation management. Likes and recommendations are potentially great for building a good reputation, but even though there is no dislike button (at least not yet), a lack of likes/recommendations may reflect poorly on your brand, particularly when your competition is getting all kinds of love from Facebook users.

    On the other side of things, you may want to be careful what you like and recommend yourself. The wording of likes and recommendations can come off as inappropriate, but the bigger issue may be liking and recommending things that paint you in a non-professional light. Depending on what you do and the image you are trying to portray, this may or may not be a problem, but for those who wish to be careful about how others perceive them, it’s something to think about.

    Should you be concerned about likes/recommendations from a reputation standpoint? Tell us what you think.

    Another thing worth mentioning about all of this is that Facebook is showing suggestions for things to like and recommend to new users. Facebook has posted something of an FAQ for the new features that aims to clarify how it all works for users.

  • If Google Indexing Goes Real-Time, What Will it Mean for Ranking?

    Last year, we saw the emergence of the technology PubSubHubbub, which provides real-time notifications to subscribers when new content or updates are published. There has recently been talk of Google developing a system that would use this technology in its indexing process.

    Do you want your content indexed instantly? Share your thoughts.

    In fact, Google’s Matt Cutts spoke with WebProNews about this, among other things:

    "Maybe some small site, you might only find a chance to crawl its pages once a week, but if that site is blogging like every 20 minutes – boom, you hit the submit button, and the search engines can find out about it," explained Cutts.

    "Now the tension is that more spammers would use this as well, so you can’t just say, ‘I’m gonna index everything that everybody pushes to me.’ So finding the right balance there is tricky, but the potential is really, really exciting," he said.

    "You can definitely imagine the reputable blogs getting very fast updates – the ones that we think are trustworthy, and then over time, maybe ramping that up, so that more and more people have the ability to do…just like, instant indexing," he says.

    And here we see another way Google may end up looking at the trust factor, with regards to ranking.

    Can We Learn from How Google Does Real-Time Search?

    Liz Gannes at GigaOm recapped a few things Google senior product manager Dylan Casey said at SMX last month:

    Casey said perhaps the most complex project in real time is to determine when to trigger the appearance of real-time results in search results. "We have huge internal debates on: Is this a good answer to this question, or are we just creating a tool for low-quality content?" he said.

    Casey spent some effort justifying Google paying to include Twitter’s real-time firehose of tweets, saying it was an intensive technical integration on both sides, and that tweets are a fundamentally different form of communication due to the restrictions of their form. For example, Google has developed a ‘complex system’ for removing users’ public tweets that are later deleted or marked private.

    Earlier this year, Amit Singhal, who has led development of real-time search at Google, talked about how Google ranks tweets. According to him, Google ranks tweets by followers to an extent, but it’s not just about how many followers you have. It’s about how reputable those followers are.

    Singhal likens the system to the well-known Google system of link popularity. Getting good links from reputable sources helps your content in Google, so having followers with that same kind of authority theoretically helps your tweets rank in Google’s real-time search.

    "One user following another in social media is analogous to one page linking to another on the Web. Both are a form of recommendation," Singhal says. "As high-quality pages link to another page on the Web, the quality of the linked-to page goes up. Likewise, in social media, as established users follow another user, the quality of the followed user goes up as well."
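    Singhal’s links-to-followers analogy can be illustrated with a toy PageRank-style calculation over a follower graph. To be clear, this is a sketch of the general idea, not Google’s actual algorithm:

```python
def follower_authority(follows, damping=0.85, iterations=50):
    """Toy PageRank-style score over a follower graph.

    `follows` maps a user to the list of users they follow; a follow
    acts like a link, passing a share of the follower's own authority
    to each account they follow."""
    users = set(follows) | {u for followed in follows.values() for u in followed}
    n = len(users)
    rank = {u: 1.0 / n for u in users}
    for _ in range(iterations):
        new = {u: (1 - damping) / n for u in users}
        for follower in users:
            followed = follows.get(follower, [])
            if followed:
                share = damping * rank[follower] / len(followed)
                for u in followed:
                    new[u] += share
            else:
                # users who follow nobody spread their weight evenly
                for u in users:
                    new[u] += damping * rank[follower] / n
        rank = new
    return rank
```

    On a tiny graph where two users follow “celeb” and “celeb” follows “expert”, both “celeb” and “expert” end up with higher scores than either plain follower – the reputable-followers effect Singhal describes.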

    Now, Google’s current real-time search product is separate from the PubSubHubbub-based system, which isn’t in place yet, but Matt’s comments about trustworthy blogs indicate to me that trust is going to be key to being able to push content to Google’s index in real time. So I wonder if a strategy similar to how Google ranks its current real-time and Twitter results will be employed in determining this kind of trust.

    Does This Mean If You’re Not Trusted You Won’t Get Indexed?

    "PuSH wouldn’t likely replace crawling, in fact a crawl would be needed to discover PuSH feeds to subscribe to, but the real-time format would be used to augment Google’s existing index," says Marshall Kirkpatrick, who spoke in a session on the real-time web at SXSW, which also included Google’s Brett Slatkin, one of the guys responsible for PuSH (he’s in the following video explaining the technology in simple terms).

    Lots of sites out there already have PuSH technology in place. For example, WordPress and Typepad blogs have the ability to "PuSH" their content. That’s a lot of content in itself – and a lot of user-generated content, which means the potential for spam is huge. That’s why the trust factor is so important.
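    For the curious, a PuSH subscriber registers interest by POSTing a small set of `hub.*` form fields to the hub a feed advertises. This sketch only builds those parameters (field names from the PubSubHubbub 0.3 spec) without sending anything; the URLs here are placeholders:

```python
def build_subscription(callback_url, topic_url, mode="subscribe"):
    """Build the form parameters a PubSubHubbub subscriber POSTs to a
    hub. Field names come from the PuSH 0.3 spec; the URLs passed in
    are whatever your subscriber and feed actually use."""
    if mode not in ("subscribe", "unsubscribe"):
        raise ValueError("mode must be 'subscribe' or 'unsubscribe'")
    return {
        "hub.callback": callback_url,  # where the hub delivers new content
        "hub.mode": mode,
        "hub.topic": topic_url,        # the feed being subscribed to
        "hub.verify": "async",         # let the hub verify intent via the callback
    }
```

    After the POST, the hub calls back to verify the subscription, then pushes new entries to your callback as the publisher pings it – that’s the real-time part.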

    If PuSH is to be heavily utilized by the search engines, and you want your content indexed as quickly as possible, you’re going to want to do what you can to build community trust and a solid reputation. It’s one more reason to engage in meticulous online reputation management, put out great content, and engage with the community.

    Do you want to see Google index the web in real-time? Discuss here.