WebProNews

Tag: Search

  • SERP Alert: Google Social Search Goes Global


    Google announced via its new official Search Blog that it is rolling out Social Search around the globe. This comes just days after Bing upped the ante in the social search game by integrating Facebook data in much more elaborate ways. Google’s social search, however, may prove useful in some cases, but you may see more content from strangers than you do from your real friends.

    Does Google’s Social Search make results less relevant? Comment here.

    Google has been doing social search since 2009, and earlier this year the feature was updated to be more useful, with social results appearing throughout the results page rather than just in a cluster at the bottom. Google says they’re mixed in based on relevance.

    “For example, if you’re looking for information about low-light photography and your friend Marcin has written a blog post about it, that post may show up higher in your results with a clear annotation and picture of Marcin,” says Google software engineer Yohann Coppel.

    “Social Search can help you find pages your friends have created, and it can also help you find links your contacts have shared on Twitter and other sites. If someone you’re connected to has publicly shared a link, we may show that link in your results with a clear annotation,” says Coppel. “So, if you’re looking for information about modern cooking and your colleague Adam shared a link about Modernist Cuisine, you’ll see an annotation and picture of Adam under the result. That way when you see Adam in the office, you’ll know he might be a good person to ask about his favorite modern cooking techniques.”

    How Google Determines What to Show In Social Search Results

    First of all, users must be logged into Google to get the benefits of social search. “If you’re signed in, Google makes a best guess about whose public content you may want to see in your results, including people from your Google Chat buddy list, your Google Contacts, the people you’re following in Google Reader and Buzz, and the networks you’ve linked from your Google profile or Google Account. For public networks like Twitter, Google finds your friends and sees who they’re publicly connected to as well,” explains Coppel.

    Google deserves credit for giving users a great deal of control over whose content appears here, though it could still go further. You can go to your Google Dashboard, find the Social Circle and Content section, and edit accordingly. If you follow the “view social circle” link, you can see every single person, listed by:

    • Direct connections from your Google Chat buddies and contacts. It even shows you which of these people have content and which don’t. For the ones that do, it shows you which sites they have content on.

      One important thing to note: it actually does include Facebook Page content. I’m connected to Danny Sullivan in my social circle, for example, and Google will show me updates from his Facebook Page, as he has it linked to his Google Profile. What’s missing, however, is your personal Facebook network of friends (which, in my opinion, is the most valuable social data currently on the web, if you’re a Facebook user).

    • Direct connections from links through Google Profiles or Connected Accounts. “For example, if you listed your Twitter account on your profile or if your Twitter posts appear in your public Buzz stream, then relevant content from people you follow on Twitter will show up in your search results,” Google explains in that section. “You can change these relationships by visiting the corresponding services and adding or removing connections.”
    • Secondary connections that are publicly associated with your direct connections. In other words – friends of friends (at least public friends of friends). There is a little less control here, unfortunately. You can’t remove these people from your social circle unless you remove the friend that’s connecting you to them.

      To me, this actually seems like a step backward for the relevancy of social search. You’re probably a lot less likely to care about what someone thinks just because they know someone you know than you are if you actually know them. A lot of people don’t even care what the people they actually do know think.

      Naturally, this is the biggest list and potential source of material for Google to draw from, making it more likely that you see results from people you don’t know than people you do.

    A cool thing about the entire list is that you can click “show paths” next to any name that has content, and it will show you exactly how you’re connected. You can be linked to someone via Twitter, and if that person links their Twitter account to their Quora account, you might see their Quora content too. If that Quora account links to their Facebook account, you might see stuff from their Facebook account as well, provided you have permission to see that content (if it’s set to public or you’re Facebook friends, you should be able to see it).

    Where are my friends?

    I notice one gaping hole in Google’s social search strategy besides the lack of comprehensive Facebook integration (though it’s certainly connected to that): the absence of a substantial number of my actual closest friends. I can only assume that many users have a similar issue.

    That’s exactly why Bing’s Facebook integration is a very important factor in its competition with Google. Bing, unlike Google, does tap into your actual Facebook friends for search relevancy (though there is plenty of room for improvement on Bing’s part as well). The Wajam browser extension is still currently a better solution to the problem, if you ask me. It will add your Facebook and Twitter friends to your results on both Google and Bing.

    It is also for this reason (at least partially) that Google is competing more directly with Facebook now in social. Google wants users to develop the kinds of relationships among friends that people currently have on Facebook, but on Google’s own network – one that runs throughout various products but is ultimately anchored by the Google account, which is at the center of nearly everything: Gmail, YouTube, Buzz, Docs, Chrome OS, and so on.

    As long as Google and Facebook aren’t going to play nice together, Google needs to succeed in social to have the best search relevancy in the social part of search. And that part of search is clearly becoming more and more important. That’s simply one competitive advantage Bing has over Google right now. It’s also why Facebook itself is a threat to Google search in some ways.

    It will be very interesting to see how far Google takes social search over time. We know Google is currently working on increasing its presence as a force in social, and the upcoming +1 button should play a significant part in that. As search gets more social, however, it presents new challenges for search engine optimization, and perhaps makes algorithm updates (like Panda) less significant from the webmaster point of view.

    Social can not only be a signal of relevance on a personalized level, but if content is shared a lot, it can also be seen as a signal of quality, because people don’t share content that sucks, unless they’re doing it as a joke or using it as an example of what not to do (like I said, it’s just a “signal”). This is nothing new, but it shows the importance of diversifying your traffic sources.

    If you rely heavily on search, as many of the big victims of the Panda update have, you will always be at the mercy of the search engines. If you can find ways to get more love from social networks and links from others, it’s bound to help you in search as well.

    Is Google’s social search helpful or does it miss the mark? Tell us what you think.

  • J.C. Penney Sees Some Google Visibility Recovery After Paid Link Scandal


    Earlier this year, J.C. Penney was caught gaming Google. A New York Times article exposed that the company had been benefiting enormously from excessive paid links, which is obviously against Google’s rules. They had ranked number one or close for some very prominent search queries like “skinny jeans,” “home decor,” “comforter sets,” “furniture,” “tablecloths,” etc.

    As the news came out, Google took action to penalize the site for its practices. A J.C. Penney spokesperson stated, “J. C. Penney did not authorize, and we were not involved with or aware of, the posting of the links that you sent to us, as it is against our natural search policies.”

    Searchmetrics has uncovered data which shows that J.C. Penney has seen its search visibility rise again. “Searchmetrics’ Organic Performance Index recorded a dramatic drop in visibility at the time for the site but now it sees that there has been a significant increase in visibility,” a representative for SearchMetrics tells WebProNews.

    J.C. Penney gets a second chance http://ow.ly/4XjmT

    In a post on the SearchMetrics blog, the company shows a couple of graphs: one showing general organic performance, and the other showing a specific keyword, “jewelry”:

    J.C. Penney SearchMetrics data


    “What happened? We cannot see a massive reduction/change in their link structure – this also would be way to fast and require more time,” SearchMetrics says on the blog. “So it might be that the people at J.C.Penney have managed to convince Google that they really had no clue about what was going on at their agency – or the algorithm is giving them another chance. We have observed this happening for algorithm penalties many times before: after a couple of weeks or months the penalty has been taken back – at least partially. What’s surprising in this case is that the reinstatement did happen for what clearly was a manual adjustment.”

    SearchMetrics, which also shared a lot of widely publicized data about the Panda update and its victims, is careful to note that none of this is actually Panda-related.

    J.C. Penney isn’t the only site recently penalized by Google to bounce back. In February, Google penalized Overstock.com after the site’s pages had ranked near the top of results for dozens of common searches. The site had been encouraging college and university sites to post links to Overstock pages for discounts, though by the time the penalty hit, the site had already discontinued the program.

    Late last month, Overstock announced (even putting out a press release) that it was no longer being penalized by Google. Overstock’s CEO Patrick Byrne was quoted as saying, “Google has made clear they believe these links should not factor into their search algorithm. We understand Google’s position and have made the appropriate changes to remain within Google’s guidelines.”

    Google, of course, would not comment on any specific site, so why should that be any different with J.C. Penney? Granted, this policy seems to only apply in certain cases, as Google’s Matt Cutts did tweet about J.C. Penney after the New York Times story came out, saying, “I really wish that our algorithms or other processes had caught this much faster – I’m definitely not celebrating.”

  • Mississippi Flood Photos from Google Maps Satellite Imagery

    Google has put together a whole bunch of data and imagery from the Mississippi floods on Google Maps.

    On Google’s Lat Long Blog, Pete Giencke of the Google Crisis Response Team writes:

    Emerging as one of the worst flooding events along the U.S. waterway in the past century, the Mississippi River floods of April and May 2011 have caused widespread destruction along the 2,300 mile river system. Historically high water levels from heavy rains and springtime snowmelt have provided no shortage of dramatic scenes — levees breached, downtown areas completely submerged, spillways opened, and more.

    The Google Crisis Response team has assembled a collection of flood data including satellite imagery for impacted cities along the river from GeoEye, flood extent and crest data forecasts from the US Army Corps of Engineers (kml) and NOAA’s National Weather Service (kml), and shelter locations from the American Red Cross (kml).

    The image at the top is Morganza, Louisiana on May 15. The following image is from Cairo, Illinois on May 8.

    Mississippi flood imagery

    Google Crisis Response is a project of Google.org, the company’s philanthropic arm. Its stated goal is to make critical info more accessible around natural disasters and humanitarian crises. Just this year it has provided data and resources for the earthquake/tsunami in Japan, the Christchurch earthquake, the Brazil floods and landslides, and Australian floods.

    Last year, it provided resources for the Pakistan floods, the Gulf oil spill, the Qinghai earthquake, the Chile earthquake, and the Haiti earthquake. In 2009 it provided resources for Typhoon Morakot, the Lockheed Wildfire in Santa Cruz, the L’Aquila earthquake, and the Red River floods.

    To see all available data for the Mississippi floods, simply search for “Mississippi flooding” on Google Maps. The data is also accessible in Google Earth via the “places” layer.

  • Google Dominates Search in Latin America


    It is well known that Google handles the most search queries in the world. Its dominance is no doubt being challenged by the growth of Bing and the persistence of Yahoo search, but Google is still the king of search, by a long shot.

    If you missed it, last week comScore came out with its United States search engine rankings for April. The rankings showed that Google still dominates the search market, holding 65.4% of the share. Yahoo came in a distant second with 15.9% and Bing rounded out the top three with 14.1%.

    Google dropped 0.3% from March to April, while Yahoo and Bing each increased their share by 0.2%.

    So, while Google holding almost 2/3 of the share of search in America is dominant, it’s nothing compared to what’s going on in Latin America.

    comScore reports that over 90% of searches in Latin America happen through Google (including YouTube). Of the 18.5 billion search queries performed in March 2011, 16.7 billion went through Google. Facebook garnered the next highest share at 2.8%, and Microsoft rounded out the top three just under Facebook’s volume.

    The study also found that search in general surged in Latin America, up 21% year-over-year. They found that the average searcher in Latin America conducts 167 queries per month. The study reported a 14% increase in unique searchers and 6% increase in search volume per person.

    Brazil accounted for the largest volume of queries with 6 billion in March and also had the strongest growth year-over-year at 34%. Mexico and Colombia followed with 3.2 billion and 2.9 billion respectively.

    Google’s level of dominance in the States pales in comparison to its dominance in Latin America. And that dominance is a global thing – according to a report earlier this year, there are only five countries in the world where Google isn’t the top search engine.

    In a move that may help its cause in challenging Google’s supremacy, Bing just announced Facebook integration in search. This move allows friends’ “likes” to become a factor in search rankings. Bing says that this move toward “social search” is what people want, and is the model of the future.

  • Blekko Powers Search for Flipboard


    Blekko announced today that it is now powering RSS search for the popular iPad app Flipboard. The functionality is designed to weed out spam as blekko’s slashtag method of searching does, so Flipboard users can discover good content based on keywords.

    “It’s a tremendous opportunity to partner with a great brand and an outstanding product that delivers such an awesome experience for its readers,” commented blekko CEO Rich Skrenta. “We hope we can make the experience even better by pointing Flipboard readers to the absolute best sources of content via RSS feeds on any topic, without spam.”

    “Flipboard’s goal is to bring everything that you care about together into one place,” says Flipboard co-founder Evan Doll. “By adding blekko’s RSS search to Flipboard, you can now also find your favorite RSS feeds and enjoy them in a beautiful visual format within your social magazine.”

    Blekko powers flipboard search

    Blekko has been pretty busy. A couple weeks ago, the company announced that it was powering search on local news site Topix (which Skrenta co-founded). Since then, they announced new privacy features, including a 48-hour data retention policy.

    Partnerships with other widely used sites and apps could be just what blekko needs to really gain some exposure and ground in the search space. It will be interesting to see what other companies partner with blekko going forward.

    Even Google’s Matt Cutts is encouraging users to check the search engine out, saying that the competition helps keep Google on its toes. I don’t think Google is really on its toes too much because of blekko at this point (Bing might be a slightly different story), but blekko has only been around for a little more than half a year.

  • Google Search App for iOS Gets Update


    Google today introduced some new changes to its search app for iOS, including some speed adjustments and a new look and feel for search results. The app itself was launched back in March, aimed at presenting a faster search experience and making it easier to find apps.

    With the new changes, Google says it has made the app more responsive and made search results easier to read.

    “This version of Google Search app is up to 20% more responsive as you type search queries and interact with it,” says software engineer Nirav Savjani on Google’s mobile blog. “As part of the speed improvements, a feature called ‘Just Talk’ will now be off by default. Just Talk allowed you to search via voice just by bringing the phone to your ear and speaking rather than tapping the microphone icon. Turning off this feature may improve app performance, though you can easily re-enable it under the Settings > Voice Search menu.”

    Just Talk Switched to off on Google Search App for iOS

    “When searching on a phone, the small screen sometimes makes it difficult to read small fonts or to tap precisely on a link,” adds Savjani. “To help you read and tap with ease, we’ve made the font of our search results bigger and the entire search result is now a tap target rather than just the link.”

    Google iOS Search Results Page

    When browsing search results in the app, users can swipe down to view the search bar or change their settings. The redesign a couple of months ago introduced a toolbar that makes it easier for users to filter results. Users can access the toolbar by swiping left to right, either before searching or from within the search results. There is an image-only option users can access by tapping “Images.”

    This app is available for devices on iOS 3.0 and above, and can be downloaded from Apple’s App Store.

    Thanks to your feedback, the updated Google Search for iOS gets faster & more responsive. Find out more http://goo.gl/POlwC

  • Do Bing’s New Facebook Features Make it a Better Search Engine Than Google?

    Bing has been steadily increasing its integration with Facebook, and while that’s likely far from over, it has launched some significant new features. We’ve written plenty about social search in the past, and from the comments we’ve received, it’s clear that there are a lot of people out there who don’t think there is any value in it. Others acknowledge that there might be value there, but still have a hard time finding it. Bing says half of people (based on its own research) say seeing their friends’ “likes” with search results could help them make better decisions.

    Is there value to having info from your Facebook friends in search results? Comment here.

    Microsoft Corporate Vice President Yusuf Mehdi talks about the company’s line of reasoning on the Bing Search Blog:

    “Research tells us that 90% of people seek advice from family and friends as part of the decision making process. This ‘Friend Effect’ is apparent in most of our decisions and often outweighs other facts because people feel more confident, smarter and safer with the wisdom of their trusted circle. A movie critic may pan the latest summer block buster, but your friends say it’s the feel good movie of the year, so you ignore the critic and go (and wholeheartedly agree). Historically, search hasn’t incorporated this ‘Friend Effect’ – and 80% of people will delay making a decision until they can get a friend’s stamp of approval. This decision delay, or period of time it takes to hunt down a friend for advice, can last anywhere from a few minutes to days, whether you’re waiting for a call back, text, email or tweet.”

    With the new update, users will get more personalized search results on Bing based on the opinions of Facebook friends. You have to be signed into Facebook. “New features make it easier to see what your Facebook friends ‘like’ across the Web, incorporate the collective know-how of the Web into your search results, and begin adding a more conversational aspect to your searches,” says Mehdi.

    What Exactly is Bing Doing?

    • Displaying “likes” from news stories, celebrities, movies, bands, brands, etc. in search results, where applicable
    • Displaying actual sites your friends have “liked” – not just individual pieces of content. Bing says if you’re looking for a TV, and you have a friend that has “liked” overstock.com, you might see that in your results.
    • A very important element of this update is that it is actually influencing the rankings of content (on a personalized basis). Mehdi says, “Bing will surface results, which may typically have been on page three or four, higher in its results based on stuff your friends have liked. And, how often do you go beyond page one of the results?”
    • Bing is using Facebook data to show “well-liked content, including trending topics, articles and Facebook fan pages, from sites across the web”.
    • Bing is showing Facebook posts from brands when the brand is searched for. Search for Avis and you’ll see recent updates from the Avis Facebook page (in theory, at least; I couldn’t get that to actually work).
    • Bing now has a feature that will let you have conversations with Facebook friends who live where you’re traveling.
    • They also recently launched a feature that lets you share shopping lists with friends.
    • When you search for a specific person, Bing will use Facebook to provide location, education, and employment details.
    • A “Travel Wishlist” feature lets you compare trips with Facebook friends, suggest new destinations, and learn more about locations. When you pick a travel destination, Bing will show you friends that live or have lived there.
    • If you “like” a city on Bing, Bing will send deals for flights to that city to your Facebook news feed.

    Turning it on/off

    The beauty of the feature is that if you don’t like it, you don’t have to use it. Just don’t sign into Facebook. It’s as simple as that.

    For the first five times you use Bing this way, you’ll see a note at the top right of the screen saying that it is using your Facebook friends, with a link to “learn more” and a “disable” button. You can always connect to Facebook again under the sign-in menu.

    Will it deliver better search results?

    There are plenty of questions that surround the execution of social search, which is probably why nobody has really gotten it 100% right yet. For example, should Bing be focusing on friends that have similar interests to you rather than your whole body of friends? Perhaps it depends on the query.

    There’s no question that most Facebook users have friends they interact with more and some they don’t even really know that well. Maybe you’re friends with someone you went to middle school with and haven’t talked to since. Without measuring the level of friendship or common interests, can data from these more obscure “friends” really be valuable? If Bing found a way to identify the people you really interact with and/or have common interests beyond just being your Facebook friend, search results could improve for certain queries.

    What’s missing?

    As with some past Bing announcements, the execution doesn’t seem to quite live up to the hype. That doesn’t mean it won’t get better, but the features are not perfect by any means.

    I do notice that “like” information is incomplete. For example, if I search for the band Converge, Bing shows me that I have two friends who like it, when in fact Facebook shows that I have four friends who like it. This has to improve, because which friends like certain things can make all the difference in the world. This is a critical element of social search.

    Facebook Likes in Bing

    Facebook Likes in Facebook - Different than Bing

    I think I still prefer the Wajam approach to social search. They add all of the stuff from your friends right at the top, so it’s always easy to distinguish it from the natural results. It’s also easy to get a friend-by-friend breakdown on any given query, and see which friends have mentioned certain things.

    In fact, that’s a big element still missing in Bing’s experience, as far as I can tell. Conversations happen on Facebook itself. It’s not all just people liking content around the web. My friend that lives in Chicago may have mentioned a great hot dog shop in casual dialog, without “liking” it on the web or “liking” its Facebook page. Will Bing show me that when I search for a place to eat in Chicago?

    If I’m thinking about buying a new album, will it show me the comment my friend made about how much it sucks? Facebook is a treasure trove of data, and while these new features may be an improvement to the experience, there is a lot more that can be done (much of which Wajam, for one, has already made significant strides in).

    Challenging Google

    Google has made no secret of the fact that it considers Microsoft’s Bing to be its main competitor. Bing, while it still has a ways to go before it reaches Google territory, has been steadily increasing its search market share since it launched. The latest comScore data had both Bing and Yahoo gaining a little ground in April (with Bing, of course, powering the back end of Yahoo’s search results).

    Bing has things in motion that should increase its share significantly. These include deals with Nokia and RIM, which will make Bing the default search engine on a great many devices. While this is only speculation, I still expect Microsoft to eventually integrate Bing into the Xbox in a major way as the web and the living room become more integrated. Google is not shying away from this area, and Microsoft already has a significant edge with its gaming console. The recent follies of Sony’s PlayStation (the Xbox’s main rival) can’t hurt either.

    Google has been doing social search for quite some time, but really how social is it? How many conversations does it start? How often do the results influence your decisions? There has long been one major hole in Google’s offering, and that is Facebook data. This is simply because most people online that do any kind of social networking use Facebook. If they used Google Buzz, Google would have an enormous edge, but they use Facebook. As long as that’s the case, and Google is not tapping into that, its social-based results simply can’t be as good as they would be otherwise.

    The Facebook Like vs. the Google +1 Button

    Google has of course unveiled its strategy of using friends to influence search results with the +1 button, which is set to be rolled out in the coming weeks. There is a great deal of skepticism around this, however, and Bing has upped the ante. The strategies are similar in that both require friends hitting a button to influence the search rankings of content.

    Where Google is starting from the ground up, Bing is harvesting the data from a very well established system that we know works. Frankly, Google is going to have a hard time topping this.

    For one thing, people aren’t clicking the “like” button with the intent to influence search rankings (at least not the average person, though I suspect we’ll see people trying to game this). They’re clicking it because they use Facebook and they genuinely like things. That works.

    To most users, Google is still a search engine. It’s not where their friends are. Sure, maybe they use all kinds of Google services, but it’s still not their main social network of choice. We’re still waiting for Google to tie this whole social strategy together in a more cohesive way (that’s a whole other conversation), but until that gets accomplished, the average user is just going to consider Facebook the place where their friends are going to see their “liking.” Who’s going to see their “+1ing”? Are they just going to click that button because they want other people to have a better chance of finding content for some search query that they may or may not ever enter?

    Less of the Same

    All of that said, it might be best that Google and Bing remain significantly different in their strategies. It is a good thing for Bing to differentiate itself more as a search engine. The less alike Bing and Google are, the more options users have. It’s even possible to use both. I know. Crazy, right?

    Google’s Matt Cutts is even encouraging users to check out other search engines like Blekko and DuckDuckGo. “I love when new search engines launch. I think competition is great,” he said in a recent webmaster video. “It keeps us on our toes. It makes sure that we’re doing the right things. I highly encourage people to check out both Blekko and DuckDuckGo. See what you like, see what you don’t like.”

    He has a point about Google “making sure it’s doing the right things”. We’ve certainly seen Google borrow some ideas from Bing in the past. We’ll see if Google and Facebook can ever come to an understanding. Don’t forget, Microsoft is an investor in Facebook.

    From a marketing perspective, Bing needs to find ways to stand out by leveraging its business relationship with Facebook. I wonder if we’ll start seeing more about this in Bing commercials. Microsoft is certainly spending a lot more on marketing Bing than Google is on its search engine. Perhaps that will change if Bing’s market share keeps growing.

    Which is the better search engine: Google or Bing? Tell us what you think.

  • Google’s Matt Cutts Encourages Users To Check Out Blekko and DuckDuckGo


    Google’s Matt Cutts used a new webmaster video to share his thoughts about alternative search engines Blekko and DuckDuckGo. Long story short, he thinks you should check them out, and see what you like and don’t like about them.

    The discussion was spurred by a question submitted to him, asking what he thinks about Blekko. “In general, I love when new search engines launch,” he says. “It’s always cool to run a few queries, and see how do they score things differently than we would score things.”

    “I think it’s fantastic to have a lot of competition,” he says. “I think it’s good for users, as long as people are competing on a level or fair playing ground.”

    I can hear the critics of Google’s own competitive practices getting ready to chime in already.

    On Blekko’s slashtag strategy, Cutts says, “It’s unclear it will catch on, because it is some amount of work to build those individual restricts or groups or collections, but they do a relatively good job of showing auto complete, and sort of suggesting tags you might want to add.”

    “I love when new search engines launch. I think competition is great,” he reiterates. “It keeps us on our toes. It makes sure that we’re doing the right things. I highly encourage people to check out both Blekko and DuckDuckGo. See what you like, see what you don’t like. Certainly with power users, it will certainly have some amount of appeal, and then time will tell.”

    “The wonderful thing is that everyone has different philosophies about how to improve search, and how to make search better,” he adds. “So this is another company trying out their idea – their philosophy, and we’ll just see how well it works.”

    Below are a couple of interviews we did earlier this year with Blekko co-founder and CEO Rich Skrenta and DuckDuckGo founder Gabriel Weinberg about their search engines. They give you an idea of their search philosophies:

    Last week, Blekko announced some new privacy features, such as a new 48-hour data retention policy. Privacy-conscious users, in particular, are likely to appreciate that.

  • Nokia to Replace Bing Maps Infrastructure?

    As you may know, Microsoft and Nokia signed a deal last month, aimed at creating a “third horse” in the smartphone race (alongside Apple’s iOS and Google’s Android platforms [kind of insulting to RIM, no?]). As part of the deal, the two companies indicated that Nokia Maps would become a core part of Microsoft’s mapping services, and would be integrated with Bing.

    “Maps would be integrated with Microsoft’s Bing search engine and adCenter advertising platform to form a unique local search and advertising experience,” the press release said.

    That integration might be much bigger than anyone realized, however.

    Greg Sterling at Search Engine Land says he had lunch with a “person with close connections to Nokia,” who told him that Nokia Maps would “effectively replace almost everything that Microsoft had developed over the past several years in terms of the Bing Maps infrastructure”. Sterling writes:

    I said I couldn’t believe Microsoft would agree to swap in Navteq for the guts of its own system. Yet my lunch guest argued that Microsoft’s role would mostly center on the Bing Maps UI — ironically not unlike Yahoo’s relationship to Microsoft search results — everything else would be powered by Nokia.

    And there was another very interesting remark. He asserted that Google’s unwillingness to agree to a co-mingling of Google Maps and Nokia Maps or substitution of Nokia Maps on the back end was one of the sticking points that prevented Nokia and Google from coming to terms.

    Navteq is a GIS data provider, owned by Nokia. It counts plenty of big brands (including Microsoft) among its customers.

    Based on the circumstances, we can only file this one under rumors at this point, but as Sterling points out, Microsoft and Nokia were indeed quite vague on the details about any Map integration resulting from the partnership.

    At this point, it’s unclear what would become of much of the progress Bing Maps has made on its own, and its own integrations – Bing Maps apps, for example. We’ve reached out to the Bing team for comment, and will follow up with any additional details.

    Update: Bing simply gave us the following canned response: “Bing Maps has utilized Nokia content for road data, geo-coding and routing services for several years, through Nokia’s Navteq vector data business, relying on the quality of its data for core location services. The Nokia/MS partnership will enable deeper collaboration and an improved experience for our customers in the future.”

  • Despite New Panda Guidelines, Google Still Burying Authoritative Results


    There are a lot of elements of Google’s Panda update to discuss, and we’ve certainly discussed many of them over the last few months, but let’s not lose sight of the reason the update was launched to begin with – to improve search quality.

    Do you think Google’s search results are better now? Tell us what you think.

    While quality is often in the eye of the beholder, there are certain kinds of queries where the information being retrieved is simply more important than others. We’ve talked about this before, as it’s been a problem in some Google results.

    One example we’ve looked at a few times is an eHow article, written by a freelance writer with no clear authority on cancer (and whose body of work includes a lot of plumbing-related articles), that was ranking at the top of Google’s results for the query “level 4 brain cancer”, above numerous other sources that would seem to be of greater authority on such a subject.

    Level 4 Brain Cancer in Google

    In fact, the article did get bumped down after the Panda update, but it does still rank number 2, followed by another result from eHow. Granted, this is just one example, and Demand Media has efforts in motion to improve its own content quality, but you get the point.

    Queries related to things like health or law demand authoritative advice. Not SEO’d content.

    We had a conversation with Mark Britton, founder and CEO of Avvo, about this subject. Avvo is a site that offers Q&A forums where consumers can ask medical or legal questions and get responses from qualified doctors and lawyers. It provides apparently authoritative content in these two areas from certified professionals.

    This seems like the kind of content that should be ranking well for a lot of these types of queries. Does it not? Britton thinks it’s “very important” for commentary from experts in the medical and legal fields to surface high in search results for relevant topics.

    “There is a lot of noise both online and offline regarding health and legal issues,” he tells us. “This comes in the form of lay people, professional commentators and even celebrities who often offer advice that is well-intentioned but inherently inferior to that of a doctor or lawyer trained in the area. However, it is not always easy to get doctors and lawyers to speak. Some still look down on the Internet as a publishing or marketing vehicle. Others just downright fear it, as they have seen too many movies where someone says something on the Internet and they are subsequently hunted and killed by terrorist hackers.”

    “There is always room for improvement — especially with our newer pages,” he says of Avvo’s own search rankings. “We just launched our doctor ratings directory and our free medical question and answer forum in November, and it will take some time for those pages to rank as well as our legally related pages.”

    Look at the results for a query like “Does type 2 diabetes shorten life expectancy?” Avvo’s page on the subject ranks on the second page, while eHow ranks at the top of the first. The Avvo result has actually fallen since I began writing this article. It used to be right below the number one result from eHow and the number 2 from Yahoo Answers.

    Diabetes Results in Google

    eHow’s is an article (not very long by any means) by a guy whose bio says he “has been a freelance writer since 2007. He writes extensively in the fitness, mental health and travel sectors and his work has appeared in a range of print and online publications including Scazu Fitness and USAToday Travel Tips…[and] holds a Master of Arts in community psychology.”

    Keep in mind that USA Today has a deal with Demand Media for travel tips. So that presumably means his Demand Media content is simply published by USA Today. Does “Master of Arts in community psychology” indicate more authority to answer a life/death question about type 2 diabetes than say a licensed and practicing MD? That’s who provided an answer on Avvo’s page, which just got pushed further down in the search results.

    If you change the query to something simpler like “type 2 diabetes life expectancy,” eHow still ranks close to the top, and Avvo’s result slips to… get ready for it… page 18! That’s with various articles from places like eHow, EzineArticles and Suite101 (all victims of the Panda update) ranking ahead of it. Now, I’m not saying that Avvo’s result is necessarily the one ultimate result for this query and should be ranked highest, but come on. Interestingly enough, the result was on page 3 for this query when I started writing the article (yesterday), and it’s slipped that much further into obscurity just since then. I wonder where it will be in another day.

    Google has given publishers a list of questions to ask themselves about their content, as guidelines the company goes by as it writes its algorithms. The very top one is “Would you trust the information presented in this article?”

    While neither of the articles provide any helpful links to sources of information, the Avvo article comes from a medical doctor. I think most people would find that slightly more trustworthy, even if the article isn’t as long or as well SEO’d. Here’s the eHow article. Here’s the Avvo one.

    The second question on Google’s list is, “Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?”

    While Google makes it clear that these questions aren’t actual ranking signals, they presumably inform the signals it does use, and you have to wonder just how much weight authority on a topic carries.

    Britton maintains that ALL of the site’s advice comes from qualified professionals, claiming that this is one of the site’s “greatest differentiators.”

    “We CERTIFY every doctor and lawyer offering free advice on the site in two principal ways: First, we verify with the state licensing authorities that the answering doctors or lawyers are licensed and in good standing,” he explains. “Second, we rate the professionals from 1 (“Extreme Caution”) to 10 (“Superb”), which was unheard of prior to Avvo’s entry into the professional ratings arena. We are big believers that not every doctor or lawyer is ‘Super’ or ‘Best,’ which was the steady-state in professional ratings for decades.”

    “This was really just an extension of the Yellow Pages model, where the ‘recommended’ professional is the one paying the most money to advertise,” he continues. “But consumers are getting wise and demanding greater transparency regarding the qualifications of their doctors and lawyers.”

    “We have three ratings that speak to the expertise of our contributors: The Avvo Rating, client/patient ratings and peer endorsements,” says Britton. “For the Avvo Rating, we start with the state licensing authorities and collect all the information we can regarding a professional. We then load that information into our proprietary web crawler, which we call ‘Hoover.’ Hoover goes out and finds all the additional information it can regarding the professional. We match the licensing data with the Hoover data and then we score it. The scoring is based on those indicators of the professional’s reputation, experience and quality of work.”
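    Britton’s description suggests a pipeline along these lines. The sketch below is purely hypothetical (Avvo’s actual “Hoover” crawler and scoring model are proprietary); the field names, weights and thresholds are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Professional:
    license_in_good_standing: bool  # verified against the state licensing authority
    years_experience: int           # from licensing + crawled data
    peer_endorsements: int          # reputation signal
    disciplinary_actions: int       # quality-of-work signal

def rating(p: Professional) -> float:
    """Hypothetical 1-10 score combining licensing and crawled reputation data."""
    if not p.license_in_good_standing:
        return 1.0  # "Extreme Caution"
    score = 5.0
    score += min(p.years_experience / 10, 2.5)   # experience, capped
    score += min(p.peer_endorsements / 20, 2.5)  # reputation, capped
    score -= 2.0 * p.disciplinary_actions        # quality-of-work penalty
    return max(1.0, min(10.0, round(score, 1)))
```

The point of the structure, not the numbers: license verification gates everything, and the remaining signals only adjust the score within the 1-10 band.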

    Britton says Avvo was not really affected by Google’s Panda update. “We saw a small dip, but things came back fairly quickly.”

    “While I understand the intent of Google’s latest update, I’m not sure they entirely hit their mark,” he says. “We noticed a number of pure lead-generation sites – i.e., sites that are selling leads to the highest bidder — jump ahead of us in certain key terms, which is not good for consumers.”

    Avvo encourages people to ask questions on the site, claiming its Q&A boasts a 97% response rate.

    Avvo asked us to let readers know that in support of Skin Awareness Month, it is donating $5 to the Melanoma Research Foundation for every doctor review during the month of May.

    Should authority and certification of expertise carry greater weight in Google’s search rankings? Comment here.

  • SEO Isn’t A Fairy Tale


    There are many reasons companies invest in Search Engine Optimization ranging from a desire to attract new customers through online marketing channels to diversifying customer acquisition to ego.  That’s right, ego. Not every marketer makes SEO investment decisions based on pulling in prospects and customers to brand content for engagement and conversions.

    Oftentimes, brands think of themselves as the leader in their category and therefore think their website should top Google’s list for queries on generic industry terms. The trouble is, leading an industry offline isn’t the same thing as being the BEST answer for a search query online. Chasing after such terms is very much driven by ego, not unlike a fairy tale of chasing unicorns, where there’s an expectation that being #1 on a single word will magically solve the brand’s problems.

    However, going after broad industry terms isn’t a complete waste of time. When ego-driven SEO is productive, it’s geared towards building brand reputation and PR value. Of course, by “PR” I mean public relations, not PageRank. The affinity and credibility that comes from being in a top position for a generic industry term can add a lot of value to online public relations efforts, recruiting and investor relations.

    Achieving top placement on broad keywords can certainly drive a substantial amount of website traffic. In fact, TopRank Marketing has quite a few clients that hold top spots for generic industry phrases, and some with single-word terms, sending a good portion of their organic search visitors.

    In terms of buying cycle, broad queries tend to be “tire kickers” and have value for creating awareness and education but not conversions. And that’s ok, because the search experience isn’t just a single event – especially in B2B or with more sophisticated buying decisions. But brands that want those top spots need to understand what it takes to translate their offline industry dominance to search engines like Google and Bing.

    A while back I had a client who said he wanted to be #1 on Google for the word “brain”. This client had a blog with a few thousand uniques per month. While many SEO consultants will talk about how tough that will be and suggest options, my first response is always to ask “Why?” Understanding motivation (chasing unicorns vs. a fighting chance at achieving goals) is essential for assessing the value and contribution to business goals.

    The client wanted to have top visibility for “brain” because it was a fairly relevant and highly popular search term. Top placement for such a word would send a significant amount of traffic and hopefully sales.  A few things to consider in such a situation include:

    • What is the potential contribution to website goals in what timeframe for a first page or top of fold position for the phrase?
    • What resources in what timeframe might it take to achieve this goal?
    • What are the current brand content and digital assets available to work with?
    • What is the current inbound link profile for the brand site?
    • What is the current position for brand content on the desired keyword(s)?
    • How many search results pages (SERPs) are there for the keyword(s)?
    • How many of those SERPs contain the exact match keyword(s) in title tags, on-page titles, in URLs?
    • How many inbound links are there to the top ranking pages for the target keyword(s)?
    • How many inbound links contain the exact match keyword(s)?
    • What is the distribution of website types as link sources? (news, blogs, web pages, .edu, .gov, etc)
    • How often are the top webpage URLs mentioned in Tweets, FB updates and other social streams?
    • What is the link acquisition growth over time for the current top pages for the target keyword(s)?
    • How many pages on the current websites showing well for the target keyword(s) are specifically optimized for those terms?
    • How old are the sites currently showing well for the target keyword(s)?
    • How much content is dedicated to the target keyword(s) on and offsite for top pages?
    • What is the difference on key metrics like quantity/quality of optimized pages, inbound links and social mentions of brand content vs. pages that occupy the top 5-10 positions for the target keyword(s)?
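    One way to use a checklist like this is to roll a few of its metrics into a rough difficulty tier before committing budget. A hypothetical sketch; the thresholds are illustrative and not from any real study:

```python
def keyword_difficulty(exact_match_titles: int, avg_inbound_links: int,
                       avg_domain_age_years: float, social_mentions: int) -> str:
    """Fold a handful of competitive-assessment metrics into a rough tier.
    All cutoffs are invented for illustration."""
    score = 0
    # pages competing with the exact keyword in their title tags
    score += 2 if exact_match_titles > 1_000_000 else 1 if exact_match_titles > 10_000 else 0
    # average inbound links to the current top-ranking pages
    score += 2 if avg_inbound_links > 10_000 else 1 if avg_inbound_links > 500 else 0
    # age of the sites currently showing well
    score += 1 if avg_domain_age_years > 10 else 0
    # how often the top URLs appear in tweets and other social streams
    score += 1 if social_mentions > 1_000 else 0
    return "unicorn" if score >= 5 else "competitive" if score >= 3 else "achievable"
```

A term like “brain,” dominated by old, heavily linked hospital and university sites, would land squarely in the “unicorn” tier under any reasonable cutoffs.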

    A competitive assessment plus a forecast of resources, timeframe and business impact can paint a clearer picture for brands that want to chase after “unicorn” keywords and SEO.  When budget is not an issue at all, then by all means, satisfy basic business case requirements and go for it. But unlimited budget is rarely the situation.  Most SEO programs operate within a scope of work and resources must be allocated according to the SEO strategy.

    In the case of the “brain” client, a presentation of the numerous hospitals, universities and government websites, plus the websites that had thousands of pages and many years’ head start with link building, led to the conclusion that going after “brain” would be a losing proposition, especially within the scope of available hours. The decision was made to go after a mix of keyword phrases representative of the interests potential customers might have in the client’s offering. Better to go after keyword phrases that are achievable within a shorter time frame, resulting in business outcomes like sales, than to allocate a substantial portion of the program to a keyword that might take a year or more to achieve a first page placement on. This client’s blog has now achieved upwards of 350,000 uniques per month focusing on long tail phrases, and opened up a new business model for advertising.

    Does this mean going after broad industry keyword terms is always chasing keyword unicorns? No. Go after the broad phrases or word(s) if:

    • There are substantial resources for content creation (creativity and diversity), link building, online PR, social media and networking and reverse link engineering.
    • The brand site is nearly the online leader in content and links for the desired keyword(s) and simply needs SEO refinement, targeted link building and process adjustments internally
    • The acquisition of top placement for the broad phrases is forecast within a reasonable time period and with a desirable outcome in comparison to resources and budget necessary.

    Companies that expect to drive customer acquisition and ongoing engagement through search should be focusing on customer-centric keywords anyway, not on ego phrases that give them a warm fuzzy with little chance of returning business value. We’ve found a focus on keywords that represent consideration and purchase buying-cycle behaviors to be achievable more quickly. The interesting thing is, over time, broad phrase visibility can still occur.

    The fork in the eye of my logic is when a senior executive with the brand simply wants the unicorn, period. They want that trophy, and the internal marketer/SEO vendor is charged with finding a way to make it happen. If budget and resources can allow for success – great. If not and logic fails, there’s not much more you can do.

    What’s your decision process for going after broad or single terms in a keyword mix? Do you dismiss in favor of long tail? Do you see it as a challenge and go after it anyway? Do you evaluate on the criteria I’ve listed above? What additional criteria would you include?

    Originally published on Lee Odden’s Online Marketing Blog


  • Does the Number of Ads on Your Website Affect Your Linkability?

    Does the number of ads on your website affect its linkability? IMHO, yes it does. I do most of my reading on my iPad, using either Ziteapp or Instapaper. Both of these tools do a good job of stripping the content down to its most essential elements and displaying it in a readable format. Unfortunately for publishers, this includes stripping out the advertising (see Advertising and Usability).

    Often, I will come across something I enjoy reading or am inspired by and want to link to, so I email the link to myself. When I get to the full page version, I am often shocked. Recently I had this happen on a post called How To Write A Killer Article in 30 Minutes. Compare the stripped-down version in Instapaper on the left to the ad-saturated version on the right:


    IMHO, the number of advertisements on the website might be negatively affecting the number of links the post gets. Now this may seem a bit hypocritical because anyone who views my site will see a similar number of ads in the sidebar and integrated into the content. That said, I think the SEO space in general is a lot more tolerant of advertising than many other vertical markets, but I have made some conscious decisions about implementation.

    Post Age – Ads don’t appear on posts when they are published. They only appear on posts that are more than 7 days old.
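    The post-age rule is trivial to implement; a minimal sketch (the function and constant names are my own, not from any particular blogging platform):

```python
from datetime import datetime, timedelta
from typing import Optional

# Posts younger than this stay ad-free.
AD_DELAY = timedelta(days=7)

def should_show_ads(published: datetime, now: Optional[datetime] = None) -> bool:
    """Show ads only on posts more than seven days old, so fresh posts
    greet readers (and potential linkers) without advertising."""
    now = now or datetime.utcnow()
    return now - published > AD_DELAY
```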

    Ad integration – I can’t tell you how many websites I visit where the first thing on the page (sometimes even before the post title) is an advertising banner or an AdSense block. I made a very conscious decision to show the title, picture, and byline before showing any ads. IMHO nothing screams MFA like a block of ads before content (yes, images are content – see my posts on image optimization for more information). I operate many websites not in the SEO space and have avoided linking to, tweeting about, sharing on Facebook, and social bookmarking many related sites because of their overly aggressive advertising implementation.

    Finding a Balance – I think it’s important that sites monetize themselves (see Adsense: Why Bloggers Don’t Get it). I also think it’s important to integrate ads and thank your advertisers (see Blog Advertising is Broken). It’s something I do every month. However, I think that having too many ads can work against you. The extra pennies you make don’t offset the links and social signals you are giving up. Google recently filed a patent about ad detection … just sayin’ …

    Dealing with Ad Blockers – Ad blocking plugins and integrated browsing/reading technology like Instapaper and Readability are on the rise. In fact, Apple will be including the technology in an upcoming browser version. Publishers need to find ways to display ads in a format that allows them to remain financially viable.

    The takeaways from this post:

    • Look at your site from a user’s perspective. Does your site have so many ads it turns readers off?
    • Look at your pages from a long-term linkability angle. Is your ad placement too aggressive, and is it turning off the linkerati?
    • Try to find a balance that allows you to make money without turning off the linkerati or discouraging social sharing.
    • Look at your website using ad blocking technology. Find a workaround that shows your ads but isn’t offensive.

    Originally published on Graywolf’s SEO Blog

  • Blekko Privacy Features Announced, 48-Hour Data Retention


    Blekko announced some new privacy settings this morning. Personal info (like IP addresses) will now be retained for a maximum of 48 hours. That compares to the 18-month policy employed by Google (and Yahoo), and the 6-month policy employed by Bing.

    Blekko’s policy applies even to logged-in users.

    Other new privacy-related announcements from Blekko include:

    – A new HTTPS Preferred system, which automatically points searchers at HTTPS (secure) websites in many cases

    – SuperPrivacy and NoAds opt-out privacy settings allowing users to suppress ads and reduce logging of search keywords
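    Blekko hasn’t detailed exactly how HTTPS Preferred decides when to rewrite a link. A minimal sketch of the idea, assuming the engine keeps a list of hosts known to serve HTTPS (the allow-list below is hypothetical):

```python
# Hypothetical allow-list of hosts known to support HTTPS.
HTTPS_CAPABLE = {"en.wikipedia.org", "twitter.com"}

def prefer_https(url: str) -> str:
    """Rewrite a result link to HTTPS when the host is known to support it;
    otherwise return the URL unchanged."""
    if url.startswith("http://"):
        rest = url[len("http://"):]
        host = rest.split("/", 1)[0]
        if host in HTTPS_CAPABLE:
            return "https://" + rest
    return url
```

So a Wikipedia result clicked from a café’s open WiFi would go over HTTPS, while links to hosts without HTTPS support are left alone.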

    “Blekko provides HTTPS on our own website, but that’s not enough to keep our users’ information private,” said Blekko CTO Greg Lindahl. “If you’re searching a website using insecure WiFi at a café and click on a non-HTTPS Wikipedia link in a search result, anyone nearby could observe what Wikipedia page you are accessing. They could also easily see the search terms you used. But this same search on blekko would ensure that when you click on the Wikipedia link you’ll be protected and your private information secure.”

    “Search engines know too much about their users,” he added. “Our goal at blekko is to find a balance between retaining information to improve our search engine, and not retaining information that a user prefers to keep private.”

    When Blekko first launched, CEO Rich Skrenta expressed bold aspirations for it to become “the third search engine” alongside Google and Bing. While it has a long way to go before it gets to that kind of status (in terms of both users and quality of search results, based on the image above), it’s clear that Blekko is aiming to differentiate itself from its competitors in more ways than one.

    Human curation, the backbone of the search engine, is obviously one way of doing that, and Blekko continues to make other announcements like this one: smaller things that some users may find appealing when comparing search engines.

    Another example would be the recently launched deeper integration with Facebook, or the banning of domains deemed to be content farms from search results – a hot issue considering the controversy that has accompanied the Google Panda update.

  • Not Every Google Tweak Is Still Panda

    There’s talk going around of a “Panda 3.0” or a “Panda 2.1” in reference to recent tweaks to the Google algorithm. The fact is that Google makes many adjustments to its algorithm on an ongoing basis, and generally only feels the need to officially comment on the really big ones.

    Don’t expect guidance from the company every time it makes a tweak. It’s actually somewhat surprising they’ve discussed the Panda update as much as they have.

    On May 6, Google Fellow Amit Singhal wrote in a post on the Google Webmaster Central blog, “Some publishers have fixated on our prior Panda algorithm change, but Panda was just one of roughly 500 search improvements we expect to roll out to search this year. In fact, since we launched Panda, we’ve rolled out over a dozen additional tweaks to our ranking algorithms, and some sites have incorrectly assumed that changes in their rankings were related to Panda. Search is a complicated and evolving art and science, so rather than focusing on specific algorithmic tweaks, we encourage you to focus on delivering the best possible experience for users.”

    Search Engine Land Editor-in-Chief Danny Sullivan says, “Google won’t release the percentage of queries impacted but says this is far less than in the other updates. Changes were made in the past few days.”

    In other words, not every tweak Google makes needs a name. Nor should every tweak since Panda be considered part of the Panda update.

    “We’re continuing to work on additional algorithmic iterations to help webmasters operating high-quality sites get more traffic from search,” said Singhal. “As you continue to improve your sites, rather than focusing on one particular algorithmic tweak, we encourage you to ask yourself the same sorts of questions we ask when looking at the big picture.”

    Those would be the questions we looked at here.

  • What Would Google Search Quality Be Like Without AdSense?


    I don’t think many people will argue that Google’s AdSense program has been a major catalyst in increasing the amount of content/search spam on the web. This may not have been Google’s intention for the service, but it has clearly contributed. I’d love to see the ratio of sites that were hit by the Panda update that displayed AdSense ads to sites that were hit and didn’t display these ads.

    That’s not to say that simply using AdSense will get you penalized. Of course Google doesn’t want that. It makes money from these ads, but it is interesting to see how AdSense publishers of all kinds have been impacted by the update.

    One can’t help but wonder what Google’s search results would look like if sites using AdSense ads were removed. Would the quality be better? Maybe. Maybe not. It would be interesting to see either way. Obviously that will never happen, unless Google one day pulls the plug on AdSense, which is also highly unlikely.

    Google recently released a list of questions “that one could use to assess the ‘quality’ of a page or an article,” in light of the Panda update. How many sites do you come across regularly that meet all of these criteria and run Google AdSense ads? To recap, here’s the full list:

  • Would you trust the information presented in this article?
  • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
  • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
  • Would you be comfortable giving your credit card information to this site?
  • Does this article have spelling, stylistic, or factual errors?
  • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
  • Does the article provide original content or information, original reporting, original research, or original analysis?
  • Does the page provide substantial value when compared to other pages in search results?
  • How much quality control is done on content?
  • Does the article describe both sides of a story?
  • Is the site a recognized authority on its topic?
  • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
  • Was the article edited well, or does it appear sloppy or hastily produced?
  • For a health related query, would you trust information from this site?
  • Would you recognize this site as an authoritative source when mentioned by name?
  • Does this article provide a complete or comprehensive description of the topic?
  • Does this article contain insightful analysis or interesting information that is beyond obvious?
  • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
  • Does this article have an excessive amount of ads that distract from or interfere with the main content?
  • Would you expect to see this article in a printed magazine, encyclopedia or book?
  • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
  • Are the pages produced with great care and attention to detail vs. less attention to detail?
  • Would users complain when they see pages from this site?
    It is certainly possible to have a “quality” site and use AdSense ads. There are plenty of examples out there, but is that the norm?

    Interestingly enough, Google is reportedly turning away some advertisers that were hit by the Panda update from advertising with AdWords. Aaron Wall of SEOBook tells an interesting story about a guy this has happened to. Here’s the situation as Wall presents it (pulling no punches):

  • Google algorithmically penalizes your site
  • Google won’t say why it is penalized, other than some abstract notion of “quality”
  • Google offers no timetable on when things can improve, but suggests you keep spending increasing sums of capital to increase “quality”
  • Google pays scraper sites to steal your content & wrap it in AdSense ads
  • Google ranks the stolen content above your site (so the content has plenty of “quality” but it is just not “quality” on your website)
  • Google ignores your spam reports & DMCA notifications about how they are paying people to steal your content
  • Google tells you that you can’t even buy AdWords ads, because you are now duplicate content for your own content!

    That’s not everybody’s experience, but it’s also not the only such complaint we’ve seen. It’s not hard to find a similar analysis in any webmaster forum or comment section on a related article.

    http://www.youtube.com/watch?v=yBrbgh3CO-I Don’t Be Evil, just be corporate

    We previously reported that another Panda victim, Xomba, had its AdSense ads completely removed following a bogus takedown notice, though Google did restore them shortly thereafter.

    For another Panda victim, HubPages, a Googler went so far as to write a guest post on the company blog telling writers how to produce better content for AdSense. Granted, that was before the global roll-out of the update.

    One thing regarding Panda and AdSense seems pretty clear: don’t overdo it on the ads. Don’t “have an excessive amount of ads that distract from or interfere with the main content.”

    [Image credit: kawanet]

  • Google Aims to Improve Estimates for Clicks, Costs, Positions in AdWords

    Google just announced that it had made an adjustment to the algorithm it uses to provide traffic estimates in AdWords. The change, the company says, should improve stats for estimated clicks, cost, and ad position.

    This data can be seen in places like the Traffic Estimator and the AdWords Keyword Tool, as well as on the Keywords Tab in your AdWords account.

    Google Looks to improve traffic estimates

    While Google doesn’t share much in the way of how the algorithm has been adjusted, Dan Friedman writes on Google’s Inside AdWords blog, “One of the most common uses of traffic estimates is to evaluate potential keywords and decide whether you should add them to your account. Traffic estimates are also useful in determining if your bids and budgets are appropriate for these new keywords.”

    “In order to determine if you’re setting an appropriate target bid, try entering a few different values in the Max CPC field the next time you use the Traffic Estimator,” he adds. “Look at how these different bids affect your statistics, and then decide which bid gives you the best return on investment. You can use the same process for trying out new budgets.”

    The change to the algorithm is currently live, and affects AdWords accounts all over the world.

    In other AdWords-related news, Google is adding new targeting options that allow advertisers to target users of tablets beyond just the iPad. Be careful to get the settings the way you want them, because as this goes into effect, you may see costs rise due to the increase in tablets being targeted.

  • Google Lets Advertisers Target Tablets Beyond the iPad (Watch Out for Increased Costs)

    As one of about a hundred announcements from the company today, Google said it is developing new targeting options for advertisers, including a way to target tablet users.

    In a post on Google’s Inside AdWords blog, Nathania Lozada writes, “In the next couple of weeks, the ‘Networks and Devices’ section of your Settings tab within your AdWords account will include a new targeting option titled ‘Tablets with full browsers.’ While you’ve been able to specifically target Apple iPad devices in the past, the new capability will enable you to easily target your ads to the entire tablet device category.”

    “In addition, you’ll be able to select more precisely the types of devices and operating systems on which your AdWords ads will show,” adds Lozada. “For example, to display your ads on the Apple iPad, you’ll be able to choose ‘Tablets with full browsers’ as your device targeting setting and ‘iOS’ as your operating system setting. Tablet targeting will be available initially for Apple devices only, but we’ll expand ad serving to other specific devices in the near future.”

    Ads will automatically start running on tablet devices once the option becomes available in advertisers’ accounts. Google warns that if you were targeting iPads before, you might start seeing more impressions and costs as more tablets are included in the serving options, so if you don’t want your ads to appear on other tablets, you’ll have to go into the settings and specify this.

    How good of them to note.

    Only standard text and image ads can be shown on tablets at this point. Nothing fancy for your landing pages either, because Google will limit the ads it shows on tablets if the landing pages have significant amounts of Flash and can’t render properly on the devices. Here is what Google recommends for mobile landing pages in the help center:

    • Task-oriented, simple site design
    • One-column layout
    • Compatible browser plug-ins. For example, Flash is currently not supported on iPhones and has only limited support on Android and other high-end mobile devices.

    According to Google, 165 million tablets are expected to ship over the next two years. Google will certainly be doing everything it can to fuel that with Android. Just today, the company gave all attendees of its sold-out Google I/O conference new tablets to play with.

  • Can Skype Help Microsoft Beat Google?

    You’ve probably heard by now that Microsoft is buying Skype (pending regulatory approval). At $8.5 billion, this is Microsoft’s biggest acquisition to date, and the second time Skype has been acquired (it has already been bought and sold by eBay). Since Skype’s release from eBay, it has been quite busy adding features and functionality, and even making some acquisitions of its own, such as that of live streaming video service Qik.

    Was this acquisition a good idea? Comment here.

    Skype has a reported 663 million registered users and 145 million average connected users. The company recently announced a record of 30 million users online at the same time.

    The deal has enormous implications, not only for Microsoft’s own offerings, but for the industry at large. There are also plenty of concerns. Let’s get to those first.

    Concerns

    Clearly, Skype has a big user base, and users have the right to be worried about what will become of their beloved service in the hands of a giant like Microsoft, especially considering Microsoft’s track record with acquisitions (laid out in all its graphic detail here).

    Marshall Kirkpatrick at ReadWriteWeb brings up some reasonable fears, such as product neglect and malware issues. “Will Skype in 14 years look like Hotmail does today?” he asks. “Malware is already an issue for Skype and of course it’s a well known part of the Microsoft landscape,” he also notes.

    How will it affect use across various platforms? Microsoft says it will continue to invest in and support Skype clients on non-Microsoft platforms. Still, this is a little vague, and considering how much head-butting goes on between Microsoft and Google, it wouldn’t be an enormous shock to see some issues raised in this area in the future.

    On the question of continued support for other platforms, Steve Ballmer said at the press conference, “I said it and I mean it. We will continue to support non-Microsoft platforms.”

    Steve Ballmer Talks Skype

    “We’re one of the companies that has a track record of doing this,” he added. Still, does that mean all platforms?

    The fact that this is such a huge acquisition for Microsoft, however, should be an indication that the company will take it very seriously, as it has so much invested in Skype’s future success.

    Mobile

    Skype, which has more users than Twitter, should help Microsoft on numerous strategic levels. Mobile would be a major one. Skype will support Windows Phone, of course, and while it remains to be seen what kinds of integrations we can expect, there’s little doubt that it will be an integral part of the Microsoft mobile strategy as it tries to gain ground against Google’s Android and Apple’s iOS.

    Also consider that Microsoft has recently made deals with Nokia and RIM that will see Microsoft services heavily integrated on these companies’ mobile devices. It stands to reason that Skype will play a major role here as well.

    It doesn’t seem out of the realm of possibility that Microsoft would at some point create a Skype-branded phone.

    The Living Room

    The living room is one area where Microsoft already has a tremendous edge over competitors like Google and Apple. While the jury’s still out on the future success of Google TV and Apple TV, it’s been pretty well established that Microsoft’s Xbox line is a smashing success. Kinect is doing pretty well too. Guess what will be integrated with both of these.

    In its announcement, Microsoft points out its “long-standing focus and investment in real-time communications across its various platforms” including Xbox Live. It also says Skype will support Xbox and Kinect, and will connect Skype users with Xbox Live (in addition to Lync, Outlook and other communities).

    PayPal is also coming to Xbox Live. That can’t hurt either.

    The Enterprise

    Let’s not forget about the implications for businesses. Microsoft says the acquisition will increase accessibility of real-time video and voice communications for enterprise users and generate “new business and revenue opportunities”.

    Plenty of businesses are already using Skype. How many are using Microsoft products? This could be a huge blow to Google, which is aggressively going after the enterprise market with Google Apps, and soon with Chrome OS. Skype may give businesses another reason to stick with MS. Of course, it remains to be seen what kinds of integrations we’ll see.

    Competition and Google

    There are plenty of areas where Microsoft and Google compete with one another, and Skype could go a long way in helping Microsoft with maybe all of them. That includes the areas we’ve already discussed – mobile, the living room, and the enterprise. It also includes the communication services Skype provides on its own.

    Google has been doing more and more in this area, whether it be in the form of Google Voice or video chat via Google Talk and Gmail (email being another prime example of where Google and Microsoft already compete). How about live streaming video? Skype recently bought Qik for this, and YouTube recently announced its own YouTube Live (both a viewing destination and a platform for streaming live video).

    YouTube is also doing plenty of other things to cement its position of being THE online video destination. This week, the company announced new partnerships with movie studios, the doubling of its catalog of movie offerings (including new releases), and increased investments in original content from partners. This comes back to the living room discussion, but I’m guessing we will continue to see overlap in the offerings from these two companies here.

    Bing

    And then there’s Bing. What in the world could Skype possibly have to do with search? Well, everything we’ve talked about up until now is all about Microsoft expanding its presence and user base. The more people using Microsoft products (now including Skype), the more opportunities Microsoft has to push Bing on people. The more businesses using Microsoft products, the more opportunities for Bing integration. The more consumers using Microsoft in the living room (where Microsoft is already heavily pushing Bing via television commercials), the more opportunities for Microsoft to push Bing on users through products.

    We’ve had the mobile conversation more than once – both when Microsoft announced its partnership with Nokia, and its partnership with RIM. They both equate to Bing search being the default search on more mobile devices, and getting Bing into more consumers’ hands (literally). These things can only help Bing’s continued growth.

    Last week, we asked, “Will Bing catch Google?” The Skype acquisition can’t hurt. Much of this is simply about opportunity. We don’t know all of the details about Microsoft’s plans for Skype, but there’s no question that there are an incredible number of possibilities that could give the company some much-needed boosts.

    Kirkpatrick brings up another good point about developer opportunities, making the case that “social graph and address books, presence, file sharing, Instant Messaging, [and] mobile” elements of Skype are all things developers salivate over, and that with Microsoft behind it, developers could get a great deal more access to build more useful applications and integrations on top of Skype.

    The social element was played up in the press conference about the acquisition.

    The Facebook Factor

    As long as we’re talking about how much of a strategic buy this could turn out to be for Microsoft, in its ongoing competition with Google, let’s not leave out the implications for Facebook – another company that not only has a partnership with Microsoft, but increasingly competes with Google in numerous areas.

    Om Malik brings up some good points about how the acquisition relates to Google’s competition with Facebook, which he says could be the biggest winner of the deal.

    “The Palo Alto-based social networking giant had little or no chance of buying Skype. Had it been public, it would have been a different story. With Microsoft, it gets the best of both worlds — it gets access to Skype assets (Microsoft is an investor in Facebook) and it gets to keep Skype away from Google,” he says. “Facebook needs Skype badly. Among other things, it needs to use Skype’s peer-to-peer network to offer video and voice services to the users of Facebook Chat. If the company had to use conventional methods and offer voice and video service to its 600 million plus customers, the cost and overhead of operating the infrastructure would be prohibitive.”

    “Facebook can also help Skype get more customers for its SkypeOut service, and it can have folks use Facebook Credits to pay for Skype minutes,” he adds. “Skype and Facebook are working on a joint announcement and you can expect it shortly.”

    Also, while Google continues to struggle in social, Skype makes Microsoft more social by default, with or without Facebook (MUCH more so with any Facebook integration).

    The New York Times says Microsoft analysts see the acquisition as a move to block Google from “gaining greater ground in Internet communications”. Google was said to have been in talks with Skype about a potential partnership. It may or may not be the entire basis for the acquisition, but it’s not hard to see this logic.

    To put it simply, it’s all about products that people use, and Microsoft just added another major one to its list.

    Google is just kicking off its Google I/O developer event. It will be interesting to see what news comes out of it, and how it might pertain to this discussion. Also keep in mind the ongoing regulatory scrutiny over competition that Google continues to attract.

    Do you think the acquisition will be good for Microsoft? Good for Skype? Tell us what you think.

  • Website Siloing: An SEO Strategy For Optimal Results

    Relevancy in Google’s eyes begins with strong content. Furthermore, that content needs to be well structured. One of the best ways to boost your organic campaign is to organize your website symmetrically through silo structuring.

    A disjointed website will often wreak havoc on your site’s keyword performance. On the other hand, if you tightly theme your content, search engines will index your site better and give you a boost in keyword rankings.

    A clean site structure is clearly one of the most powerful and underrated SEO strategies today.

    Latent Semantic Indexing (LSI)

    Before we delve into silo structures, let’s touch on Latent Semantic Indexing (LSI). The definition of LSI is complex, and the benefits many. As it relates to search engines, it is used as an indexing and retrieval method to identify patterns in the relationships between the terms and concepts contained in an unstructured collection of text.

    How does LSI work? In addition to determining which keywords a web page or document contains, LSI examines other pages and documents for the same words. In essence, it attempts to determine relevancy on a grand scale.

    As you can see, LSI plays a direct role in your silo strategy. Your goal is to help your website play nicely within the LSI confines.
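    To make this concrete, here is a minimal LSI sketch in Python using NumPy. The five-term, three-document matrix is entirely made up for illustration: truncated SVD projects the documents into a small “concept” space, where the two guitar-themed documents land close together and the cooking document lands far apart.

```python
import numpy as np

# Hypothetical term-document matrix: rows are terms, columns are documents.
# Terms: guitar, acoustic, electric, recipe, cooking
# Docs:  0 and 1 are guitar-themed, 2 is cooking-themed.
A = np.array([
    [2.0, 1.0, 0.0],  # guitar
    [1.0, 2.0, 0.0],  # acoustic
    [1.0, 1.0, 0.0],  # electric
    [0.0, 0.0, 2.0],  # recipe
    [0.0, 0.0, 1.0],  # cooking
])

# LSI = truncated SVD: keep only the k strongest "concept" dimensions.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_vectors = (np.diag(s[:k]) @ Vt[:k]).T  # one row per document, in concept space

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

sim_guitar = cosine(doc_vectors[0], doc_vectors[1])  # same theme
sim_cross = cosine(doc_vectors[0], doc_vectors[2])   # different themes
print(sim_guitar > sim_cross)  # → True
```

    In concept space, the two guitar documents come out nearly identical (cosine near 1) while the cooking document is nearly orthogonal (cosine near 0). That grouping of themed content is the effect a tightly siloed site plays to.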

    Silo Structure to the Rescue

    Silos are used to categorize themed content in an effort to help the search engines determine relevancy. The tighter the focus, the more relevant Google will consider each page of your website to be.

    Figure 1 shows a typical website silo structure:

    Silo Structuring

    For example, suppose you have a website on guitars. If you offer up relatively dissimilar topics such as how to play acoustic guitar, how to play electric guitar, and even how to play classical guitar, all on the same page, the relevancy will be diluted. Even though your content may be far better and more useful than your competitors’, poor structure will hurt you at some level.

    To aid the LSI process, the webmaster can structure the site in one of two ways: physical or virtual. Both silo types are highly effective for SEO because each creates a working ecosystem for the content.

    Physical Silo Structures

    A physical silo is a means of theming or grouping your website content into like categories. Also referred to as directory silos, physical silos are the easiest to set up and maintain. Think of a directory silo as a filing cabinet: everything in a file must be distinctly associated with its category to remain relevant.

    The directory-style silo structure is the easiest for both search engines and visitors to follow, and most times should be the starting point when designing your site.
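    As a hypothetical sketch in Python (the URL paths are invented, extending the guitar example above), a directory silo simply means every page lives under the top-level directory for its theme:

```python
from collections import defaultdict

# Invented URL paths for a guitar site organized as physical (directory) silos.
pages = [
    "/acoustic-guitar/",
    "/acoustic-guitar/how-to-tune/",
    "/acoustic-guitar/best-strings/",
    "/electric-guitar/",
    "/electric-guitar/amp-settings/",
    "/classical-guitar/",
    "/classical-guitar/fingerpicking-basics/",
]

def silo_of(path: str) -> str:
    """The top-level directory is the page's silo (its theme)."""
    return path.strip("/").split("/")[0]

# Group pages by silo: each theme stays self-contained, like a filing cabinet.
silos = defaultdict(list)
for page in pages:
    silos[silo_of(page)].append(page)

print(sorted(silos))  # → ['acoustic-guitar', 'classical-guitar', 'electric-guitar']
```

    Because the theme is baked into the URL path itself, both search engines and visitors can tell at a glance which silo any given page belongs to.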

    Virtual Silo Structures

    Once a website becomes established, the structure can break down over time. The virtual silo model can enforce the structure once again through a highly targeted internal linking strategy.

    Virtual silos use a drill-down cross-linking structure to enforce distinct categories. In other words, the top landing page of each silo is supported by the pages linking to it. Figure 2 shows how you can prop up content using a virtual silo model:

    Virtual Silo Model

    Internal linking is a major component of virtual silos. Linking should be done between like-topics, avoiding unrelated categories as much as possible.

    Obviously, there will be times when you need to link to different silos to make a point or improve an article. When cross-linking between unlike silos, use the rel="nofollow" attribute to reinforce the structure.
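    A hypothetical helper makes that rule concrete: same-silo links are emitted as plain anchors, while cross-silo links carry rel="nofollow" so they don’t blur the theme boundary. The paths and function names here are purely illustrative:

```python
def silo_of(path: str) -> str:
    """The top-level directory is the page's silo."""
    return path.strip("/").split("/")[0]

def internal_link(from_path: str, to_path: str, text: str) -> str:
    """Emit a plain anchor within a silo, a nofollowed one across silos."""
    if silo_of(from_path) == silo_of(to_path):
        return f'<a href="{to_path}">{text}</a>'
    return f'<a href="{to_path}" rel="nofollow">{text}</a>'

# Same silo: a normal, followed link.
print(internal_link("/acoustic-guitar/how-to-tune/",
                    "/acoustic-guitar/best-strings/", "best strings"))
# → <a href="/acoustic-guitar/best-strings/">best strings</a>

# Cross-silo: the link is nofollowed to preserve the structure.
print(internal_link("/acoustic-guitar/how-to-tune/",
                    "/electric-guitar/amp-settings/", "amp settings"))
# → <a href="/electric-guitar/amp-settings/" rel="nofollow">amp settings</a>
```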

    For new websites, a physical silo structure should almost always be used. Directory silos are much easier to set up and manage. Established sites, on the other hand, can find value in virtual silos if the physical structure is nonexistent or breaks down over time.

    Get on Track using a Silo Structure

    Many websites never reach their potential due to a lackluster theming strategy. You can have a good-looking website with great content. However, if your site lacks clarity in terms of topical relevance, your targeted keywords will be discounted, which can devastate even the most well-meaning campaign.

    Siloing your content is a form of on-page SEO that deserves special attention throughout the life of your website.

    The main objective of siloing is to create a website that ranks well for both short and long-tail keywords. Get on track by using sound silo structuring. A symmetrical organization is one sure-fire way to propel your keyword rankings and improve visitor experience.

  • Helium Raises $10 Million After Being Victimized by Google Panda Update

    Helium is showing that life can go on for victims of Google’s Panda update. Helium is a user-generated content site, often compared to other known Panda victims like Demand Media, HubPages, Suite101, Associated Content, etc.

    Of course, Demand Media (now a publicly traded company) posted better-than-expected earnings, but Helium has managed to secure a new $10 million in financing. It would appear that a commitment to improved quality, an increased focus on local, and/or dialogue with Google has been enough to convince somebody that Helium is here to stay. VatorNews points to an SEC form that indicates as much.

    “Helium has engaged in an on-going dialogue with Google for the last three years or more. Google understands the Helium business and content model and agrees that the Helium site publishes quality content,” Helium VP of Architecture and Technology Tracy Flynn recently said.

    The main way writers earn money from Helium comes from views, which are largely driven by search. Clearly, the site’s performance in Google results plays a key role here. However, there are other ways writers can make money from Helium. These include payments from Helium when third-parties purchase articles for use elsewhere, and one-time incentive payments through various programs run by the site, such as contests, up-front payments, customer sponsorships, etc.

    Of course, like many other big victims of the Panda update, Helium is doing numerous things to adjust its content strategy to better comply with what Google is seeking in terms of higher-quality (and less shallow) content. Among other things, Helium is asking writers to submit their articles to Helium only, to avoid duplicate content issues, and to use social media to promote articles (which Google, in turn, can see and factor into its own rankings).

    Over the months, Helium has been providing writers with various tips and guidelines on its blog. For example, a recent post entitled, “Why your article or blog posts just aren’t making the cut” lists:

    1. You didn’t cite your resources
    2. You didn’t proofread or use spell-check on your article
    3. You don’t format the article to your advantage
    4. You don’t include simple SEO techniques
    5. You neglect to add it to your social networking realms like Twitter, Facebook and even your own blog.
    6. You posted it in more place[s] than one.

    Helium also pointed to some do’s and don’ts for writer bios, which is probably a good idea, as bios can be indicative of authority on a given subject. Keep in mind that one of the top questions Google is asking itself as it tweaks its algorithm is, “Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?”

    Helium has also made adjustments to its assignment system. “A highlight of the new system is the ability to tailor assignments by writing skills and expertise, as well as allowing all writers to pick up general assignments,” the company explains. “As we learn more about your strengths, we can provide more opportunities that are targeted for your favorite subjects and writing style.”

    In April, Helium encouraged writers to get more involved with local-based writing, as the company has filled positions for local writers for city guide websites, a national real estate web site, a regional newspaper, and a neighborhood profiler for a “major daily newspaper” in LA. “Helium Content Source staffers are constantly on the lookout for writers for these types of assignments,” the company said.

    Google has been placing a great deal more emphasis on local these days, no question. Local results seem to have even been helped by the Panda update.

    Last week, Helium launched a new mobile version of its assignment system for Android and iPhone.

  • Don’t Expect a Lot of eHow Content to Be Removed from YouTube

    Demand Media shared with WebProNews some further insights into its content strategy on the video side of things. In light of the recent discussions surrounding Google’s Panda update, and its impacts on YouTube and Demand Media, we thought it would be worth taking a closer look at the relationship these enormous web entities have with one another.

    YouTube was a clear winner (along with other Google properties and other video sites) after the Panda update. Various Demand Media properties were impacted negatively, and the company announced last week that its flagship eHow property experienced a 20% decline in search referrals following the update. It’s interesting that YouTube would go up, and eHow would go down, considering that earlier this year (as the company’s IPO approached), Demand Media CEO Richard Rosenblatt told All Things Digital’s Peter Kafka, “We’re the largest supplier of all video to YouTube, over two billion views…”

    Quinn Daly, Demand Media’s SVP, Corporate Communications, tells WebProNews that while it is Demand Media (not just eHow) that is the biggest supplier of video to YouTube, it is the company’s Expert Village brand that has the largest number of videos from Demand Media on YouTube.

    Of course the Expert Village brand has been rolled into eHow. If you go to ExpertVillage.com, you’ll be greeted with an eHow header, and the following message:

    Liked Expert Village? You’re gonna love eHow.

    All of the Expert Village videos you’ve come to count on now live at eHow.com – and that’s only the beginning. With two million articles and videos, plus a supportive community, eHow empowers you with the kind of help and advice you need to accomplish your goals each day.

    Expert Village YouTube channel

    On the Expert Village YouTube channel, it shows 1,898,439,921 total upload views. The eHow channel boasts 112,350,956 total upload views. Based on my own experiences, eHow-branded videos seem to appear more frequently in Google search results. This is just an observation, however, and is inconclusive.

    Last week, Demand Media held its quarterly earnings call, and announced some new clean-up efforts around its content strategy, and eHow in particular. These efforts include the deletion of some articles, and further editing of others. This content comes from Demand Media’s writers’ compensation program, a user-generated content effort, which the company has now completely shut down. Much of this content is/was on eHow.

    Given that Demand Media is the biggest supplier of video to YouTube, and that Google has taken some criticism on how YouTube has performed following the Panda update (criticism namely from HubPages CEO Paul Edmondson), we wondered if Demand Media was pulling any videos in these new efforts.

    Daly tells us, “The efforts around UGC content on eHow.com were primarily articles. There was a short time that people could also upload videos as well but we removed that function. None of that UGC content was distributed on YT.”

    “There are no UGC videos from the WCP program on YT; so in that context, there is no content coming down,” she says. “That said, the body of work on YT is always changing; we are adding new video and working to review older content that may no longer meet our quality standards, so I can’t say that there will be NO content coming down from YT because that wouldn’t be accurate.”

    YouTube recently dropped an interesting stat: 30% of YouTube videos account for 99% of views. Edmondson is bringing Google’s competitive practices into the Panda conversation at a time when Google faces regulatory scrutiny over them.