WebProNews

Tag: Search

  • AdWords Express (Formerly Boost) Launched for U.S. Businesses

    Google announced the launch of AdWords Express, which was tested with a small number of local businesses under the name Boost last fall.

    The product is aimed at local businesses that aren’t already using AdWords. “AdWords Express helps potential customers find your website or Place page, and gives you a quick and straightforward way to connect with them and grow your business,” explains AdWords Express Product Manager Kiley McEvoy. “You simply provide some basic business information, create your ad, and your campaign is ready to go.”

    “After you sign up, the campaign will be automatically managed for you,” continues McEvoy. “AdWords Express will figure out which searches should trigger your ad to appear and displays it when these searches happen. Your ad will be shown in the Ads section of search results pages—on the top or right hand side—and in Google Maps with a distinctive blue pin. Customers can see your ad whether they’re searching on laptops or mobile phones.”

    AdWords Express – a fast/simple way for local businesses to start advertising online in <5 minutes http://t.co/PbDIcRQ

    According to the company, AdWords Express automatically optimizes ads to get the most out of the campaign and the budget.

    To use the product, the only thing you need is a mailing address, which you can make private. You don’t even need a website, as your Google Place Page can serve as your landing page.

  • Should Bing Make Paid Search Ads Blend Into Organic Results?

    Emily Kirk, an analyst at the Rimm-Kaufman Group, spotted a feature Microsoft is testing on Bing: paid results in the middle of organic results. Yep.

    Ordinarily on Bing, paid ads appear above and below organic results, and they’re easily identifiable in a blue box:

    Bing Search Ads

    The test feature eliminates the blue box and sticks them in the middle of the results, making them far less distinguishable as ads, though there is still an “ads” label off to the right.

    Bing Ad Test

    Image credit: RKG

    Barry Schwartz at Search Engine Land obtained the following statement from Microsoft, “We’re constantly testing and experimenting on Bing, and with that, we carefully measure user engagement and reaction to these changes. We have nothing further to share at this time.”

    I’d be very surprised if this moved from testing to become an actual feature, because there would be a fair amount of negative publicity I think. It’s become pretty well established throughout the industry that paid search results should be very clear to the user. That said, perhaps Microsoft considers the small “ad” label to be sufficient.

  • Help Google Crawl Your Site More Effectively, But Use Caution

    Google has introduced some changes to Webmaster Tools – in particular, handling of URLs with parameters.

    “URL Parameters helps you control which URLs on your site should be crawled by Googlebot, depending on the parameters that appear in these URLs,” explains Kamila Primke, Software Engineer with the Google Webmaster Tools Team. “This functionality provides a simple way to prevent crawling duplicate content on your site. Now, your site can be crawled more effectively, reducing your bandwidth usage and likely allowing more unique content from your site to be indexed. If you suspect that Googlebot’s crawl coverage of the content on your site could be improved, using this feature can be a good idea. But with great power comes great responsibility! You should only use this feature if you’re sure about the behavior of URL parameters on your site. Otherwise you might mistakenly prevent some URLs from being crawled, making their content no longer accessible to Googlebot.”

    Do you use URL parameters in Webmaster Tools? What do you think of the changes? Comment here.

    Google Webmaster Tools - URL Parameter page

    Google is now letting users describe the behavior of parameters. For example, you can let Google know if a parameter changes the actual content of the page.

    “If the parameter doesn’t affect the page’s content then your work is done; Googlebot will choose URLs with a representative value of this parameter and will crawl the URLs with this value,” says Primke. “Since the parameter doesn’t change the content, any value chosen is equally good. However, if the parameter does change the content of a page, you can now assign one of four possible ways for Google to crawl URLs with this parameter.”

    Those would be: let Googlebot decide, every URL, only URLs with value=x, or no URLs.

    Users can tell Google if a parameter sorts, paginates, determines content, or does other things. For each parameter, Google will also “try” to show you example URLs from your site that it has already crawled that contain that parameter.

    To bring up the use of caution again, Primke warns about the responsibilities that come with using the No URLs option. “This option is the most restrictive and, for any given URL, takes precedence over settings of other parameters in that URL. This means that if the URL contains a parameter that is set to the ‘No URLs’ option, this URL will never be crawled, even if other parameters in the URL are set to ‘Every URL.’ You should be careful when using this option. The second most restrictive setting is ‘Only URLs with value=x.’”
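
    To make the interaction of those settings a little more concrete, here is a minimal Python sketch of the decision logic as described above. The parameter names, settings, and helper function are hypothetical, and this only illustrates how the options and their precedence interact, not how Googlebot is actually implemented.

    from urllib.parse import urlparse, parse_qs

    # Hypothetical per-parameter settings, mirroring the options described above:
    # "no_urls", "only_value:<x>", "every_url", or "let_googlebot_decide".
    PARAM_SETTINGS = {
        "sessionid": "no_urls",          # never crawl URLs carrying this parameter
        "sort": "only_value:price_asc",  # crawl only when sort=price_asc
        "page": "every_url",             # crawl every value of this parameter
    }

    def should_crawl(url):
        # 'No URLs' is the most restrictive option and wins over any other setting;
        # 'Only URLs with value=x' must match; everything else leaves the URL crawlable.
        params = parse_qs(urlparse(url).query)
        for name, values in params.items():
            setting = PARAM_SETTINGS.get(name, "let_googlebot_decide")
            if setting == "no_urls":
                return False
            if setting.startswith("only_value:"):
                if setting.split(":", 1)[1] not in values:
                    return False
        return True

    print(should_crawl("http://example.com/shoes?page=2&sort=price_asc"))    # True
    print(should_crawl("http://example.com/shoes?page=2&sessionid=abc123"))  # False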

    She runs through some examples in this blog post, and there is more related information in Google’s Webmaster Help forum.

    Webmasters & SEOs: here’s *tons* of great info on our improved tool to handle url parameters better: http://t.co/TtBs8tp

    Be Careful About Selling the Same Stuff From Multiple Domains

    As long as we’re discussing webmaster issues for Google, I’ll also point to the latest Webmaster Help video from Matt Cutts, who discusses selling products on multiple domains. The user question he sought to answer was:

    “I manage 3 websites that sell the same products across 3 domains. Each site has a different selling approach, price structure, target audience, etc. Does Google see this as spammy or black hat?”

    Cutts says, “On one hand, if the domains are radically different – different layout, different selling approach, different structure – like, essentially completely different, and especially the fact that you said it’s only 3 domains, that might not be so bad. Clearly if it were 300 domains or 3,000 domains – you can quickly get to a fairly large number of domains that can be crowding up the search results and creating a bad user experience…by the time you get to a relatively medium-sized number of sites.”

    “The thing that was interesting about the question is that you said it’s the same products, as in identical. So it’s a little weird if you’re selling identical products across 3 domains. If you were selling like men’s sweaters on one, and women’s sweaters on another, and shoes on a third….I’ve said before, there’s no problem with having different domains for each product, and a small number of domains (2, 3, or 4) for very normally separable reasons can make perfect sense, but it is a little strange to sell the same products, so if they’re really identical, that starts to look a little bit strange – especially if you start to get more than 3 domains.”

    “Definitely, I have found that if you have one domain, you’ve got the time to build it up – to build the reputation for that domain…in my experience, when someone has 50 or 100 domains, they tend not to put as much work – as much love into each individual domain, and whether they intend to or not, that tends to show after a while. People have the temptation to auto-generate content or they just try to syndicate a bunch of feeds, and then you land on one domain vs. another domain, and it really looks incredibly cookie cutter – comparing the two domains, and that’s when users start to complain.”

    Do you think Google takes the right approach to sites selling products from multiple domains? Share your thoughts here.

  • Is Your Paid Search Campaign Cannibalizing Your Organic Clicks?


    In case you’re wondering if your paid campaigns are cannibalizing clicks from your organic search results, the answer is: not so much. That is, if you take Google’s word for it, anyway.

    Google says its statisticians have run over 400 studies on accounts with paused paid search campaigns to gain some insight into how paid search affects organic clicks for websites.

    “In what we call ‘Search Ads Pause Studies’, our group of researchers observed organic click volume in the absence of search ads,” Google’s Quantitative Management team said in a post on the Google Research Blog. “Then they built a statistical model to predict the click volume for given levels of ad spend using spend and organic impression volume as predictors. These models generated estimates for the incremental clicks attributable to search ads (IAC), or in other words, the percentage of paid clicks that are not made up for by organic clicks when search ads are paused.”

    “The results were surprising,” the team added. “On average, the incremental ad clicks percentage across verticals is 89%. This means that a full 89% of the traffic generated by search ads is not replaced by organic clicks when ads are paused. This number was consistently high across verticals.”

    Hmm. Sounds like you should really be spending money paying for Google ads…at least according to Google.

    David X. Chan, Yuan Yuan, Jim Koehler, and Deepak Kumar explain in the report:

    In order to determine the incremental clicks related to search advertising, we quantify the impact pausing search ad spend has on total clicks. Indirect navigation to the advertiser site is not considered. Each study produces an estimate of the incremental clicks attributed to search advertising for an advertiser. To make comparison across multiple studies easier, we express the incremental clicks as a percentage of the change in paid clicks. This metric is labeled “Incremental Ad Clicks”, or “IAC” for short.

    IAC represents the percentage of paid clicks that are not made up for by organic clicks when ads are paused. Conversely, when the campaign is restarted, the IAC represents the fraction of paid clicks that are incremental. Since we do not assume a positive interaction between paid and organic search in our analysis, the IAC estimate is bounded at 100%.
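
    As a simplified illustration of that definition (with invented numbers, not data from the studies), the IAC is just the share of paused paid clicks that organic clicks fail to make up for:

    def incremental_ad_clicks(paid_clicks_lost, organic_clicks_gained):
        # IAC: the percentage of paused paid clicks NOT made up for by organic
        # clicks, bounded at 100% as in the report's definition.
        if paid_clicks_lost <= 0:
            raise ValueError("expected a positive drop in paid clicks after pausing")
        replaced = min(organic_clicks_gained, paid_clicks_lost)
        return 100.0 * (paid_clicks_lost - replaced) / paid_clicks_lost

    # Invented numbers: pausing ads drops paid clicks by 1,000/day while organic
    # clicks rise by only 110/day, so 89% of the paid clicks are not replaced.
    print(incremental_ad_clicks(1000, 110))  # 89.0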

    The team does acknowledge that it has not conducted enough studies to determine the impact of seasonality on the results.

    The full report can be read here (pdf).

  • Google Place Pages Drop Third-Party Reviews

    Google announced some changes to Place Pages as part of its new refocused efforts.

    For one, they’ve added a “write a review” button at the top of the page, encouraging users to talk about your business. I hope your customer service is good.

    “Some of the changes you’ll notice today have been made so you can quickly get a sense for what other people are saying about a place, more easily upload photos of places you’ve been (by using a more obvious ‘Upload a photo’ button), and see reviews in a single section on the page,” said Director of Product Management Avni Shah. “Since the introduction of Google Places’ local rating and review feature last fall — originally called Hotpot — we’ve heard loud and clear that reviews help you find the places that are right for you, especially when you’re able to get recommendations based on your tastes and those of your friends.”

    In fact, Google is relying solely on its own set of user reviews now, and has removed reviews from other sources. This should make Yelp happy. I wonder how their traffic will be affected.

    Google Place Pages

    “Rating and review counts reflect only those that’ve been written by fellow Google users, and as part of our continued commitment to helping you find what you want on the web, we’re continuing to provide links to other review sites so you can get a comprehensive view of locations across the globe,” said Shah.

    Google says its long-term vision for local search includes more personalized results, integrating information from Place pages into web search, and providing more ways to rate, discover, and share places faster and easier.

    Separately, Google says it will open up brand profiles for Google+ in the coming months. They’re making a big deal about how they want businesses to have a different experience than what the regular profiles have to offer, and have been talking a great deal about integrating Google+ with other Google products. It would not be surprising to see Place Pages and business profiles on G+ integrated with one another. That could be quite powerful for businesses.

    Google+ should certainly help with that goal of making results more personalized.

  • SEO Developments You Need to Know About


    Much of the Google talk lately has been centered around Google+, the company’s new social network, and with good reason. It may have a significant impact on how Internet users use other established social sites like Facebook, Twitter, LinkedIn, and even StumbleUpon. However, it is still Google search that drives the majority of web traffic for most site owners, and there is plenty going on in search as well.

    What do you consider to be the most significant recent development in search? Share your thoughts in the comments.

    What Would This New Google Design Mean for SEO?

    First, I want to talk about a new user interface tweak Google is testing, which could have major implications for site owners and their visibility in Google search results.

    The change, seen in the video below, has the search bar and navigation bar sticky at the top of the page and the left panel of search options sticky to the side. In other words, these things stay put as you scroll through search results, rather than disappearing as you scroll down as they do in the regular interface currently.

    In the video, we see that results are still paginated. You still have to click through various pages of search results. How often do you really click past the first page?

    However, the interface change closely resembles the current interface of Google Image Search. Here, the same things are stickied, but instead of paginated results pages, it has infinite scroll, meaning you can keep scrolling down the page to see more results. Eventually, you have to click “show more results,” but it’s not like clicking through multiple pages.

    For all intents and purposes, all of the images appear on page one. It seems likely that if Google switches to this type of interface for regular web search results, it may implement the infinite scroll functionality as well. This would mean, of course, that users wouldn’t have to click to page 2 of the search results to see your site if that’s where you’re currently ranking.

    Users are far more likely, in my opinion, to look at more results if they’re all presented on the page. I know this has been the case for me personally, using Google Image search. Similar functionality is also available in Twitter’s timeline, and I know I take in more results there as well.

    Google has changed its algorithm and interface so much over the years, with added personalization, local results, universal search, etc., that it has become harder and harder to get your content seen by searchers, but if this actually pans out, it may actually help with visibility. Hopefully content quality will also be reflected.

    We don’t know for sure that Google will implement any of this, but would it not make for a better user experience?

    How would these changes impact SEO? Tell us what you think.

    Google is getting more focused.

    As you know, Google has tons of products and services, and constantly experiments with new potential ones. With Larry Page at the helm now, however, the company is getting much more focused. This was a major theme of what Page had to say in the company’s earnings call last week. Since then, Google even made the bold announcement that it is shutting down Google Labs, which holds most of Google’s experimental offerings.

    “While we’ve learned a huge amount by launching very early prototypes in Labs, we believe that greater focus is crucial if we’re to make the most of the extraordinary opportunities ahead,” said Google SVP for Research and Systems Infrastructure Bill Coughran.

    Search items like Google Code Search, Google Trends, Google Suggest, Google Social Search, and even Google Maps started out in Google Labs.

    That doesn’t mean Google is looking to stop innovating. “We’ll continue to push speed and innovation—the driving forces behind Google Labs—across all our products, as the early launch of the Google+ field trial last month showed,” said Coughran.

    “Greater focus has also been another big feature for me this quarter – more wood behind fewer arrows,” Page said in the earnings call. “Last month, for example, we announced that we will be closing Google Health and Google PowerMeter. We’ve also done substantial internal work simplifying and streamlining our product lines. While much of that work has not yet become visible externally, I am very happy with our progress here. Focus and prioritization are crucial given our amazing opportunities. Indeed I see more opportunities for Google today than ever before. Because believe it or not we are still in the very early stages of what we want to do.”

    “Even in search … which we’ve been working on for 12 years there have never been more important changes to make,” he said. “For example, this quarter we launched a pilot that shows an author’s name and picture in the search results, making it easier for users to find things from authors they trust.”

    Who You Are Matters More

    That last point by Page brings me to the next point. Who you are is becoming more important in search. We made note of this when Google announced the authorship markup, which enables the feature Page spoke of. To implement this, by the way, here are Google’s instructions:

    To identify the author of an article or page, include a link to an author page on your domain and add rel="author" to that link, like this:

    Written by <a rel="author" href="../authors/mattcutts">Matt Cutts</a>.

    This tells search engines: “The linked person is an author of this linking page.” The rel="author" link must point to an author page on the same site as the content page. For example, the page http://example.com/content/webmaster_tips could have a link to the author page at http://example.com/authors/mattcutts. Google uses a variety of algorithms to determine whether two URLs are part of the same site. For example, http://example.com/content, http://www.example.com/content, and http://news.example.com can all be considered as part of the same site, even though the hostnames are not identical.
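
    As a rough illustration of the “same site” idea in that passage, here is a small Python sketch that treats two URLs as part of the same site when the last two labels of their hostnames match. This is a deliberately naive heuristic for demonstration only (it mishandles suffixes like .co.uk), not the variety of algorithms Google actually uses.

    from urllib.parse import urlparse

    def same_site(url_a, url_b):
        # Naive check: treat two URLs as one site when the last two labels of
        # their hostnames match (example.com, www.example.com, news.example.com).
        def registered(url):
            host = urlparse(url).hostname or ""
            return ".".join(host.split(".")[-2:])
        return registered(url_a) == registered(url_b)

    print(same_site("http://example.com/content/webmaster_tips",
                    "http://example.com/authors/mattcutts"))       # True
    print(same_site("http://news.example.com/story",
                    "http://www.example.com/authors/mattcutts"))   # True
    print(same_site("http://example.com/post",
                    "http://other-site.com/authors/mattcutts"))    # False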

    I find it interesting that this is the sole feature Page alluded to in the earnings call, with regards to search. This makes me wonder if Google places even more emphasis on this than I thought.

    Watching the Subdomain Impact on Panda Recovery

    We may find out how big a role content author can play in search rankings soon (separate from the actual authorship markup element) thanks to some experimenting by Panda update victim HubPages. We recently reported on HubPages’ strategy of subdomaining content by author to keep content separate, so that poor quality postings by some authors don’t have an effect on the search rankings of those authors who are putting out higher quality work. This also, in theory, is designed to keep the entire site from being pulled down by some less than stellar content.

    This week, HubPages announced that it was rolling out these subdomains. One author told WebProNews, “On one of my accounts at HubPages, I’m already seeing a bit of an increase of traffic and I’m quite sure it is from the subdomain/URL forwarding. HubPages, from what I can make of the update, is definitely heading in the right direction.”

    Definitely something to keep an eye on in the coming weeks/months.

    Do you think subdomains are going to make a significant impact? Tell us what you think.

    PageRank Gets Updated Again

    Several weeks ago, Google launched an update to its PageRank (which displays in the Google toolbar). Google has played down the significance of PageRank, as it is only one of many signals, but it is still a signal, and one worth considering.

    Interestingly, that update caused Google’s own PageRank to drop from a 10 to a 9. This week, PageRank got another update, and sent Google back up to a 10.

    Google doesn’t usually update PageRank that frequently, so the new update raised a few eyebrows. Barry Schwartz at Search Engine Roundtable thinks it’s related to Twitter. “It was because, I believe, Twitter’s PR was a PR 0 and Google didn’t want people to think that Google downgraded Twitter’s PageRank manually because of contract deals breaking between the two,” he writes. He got the following statement from Google:

    Recently Twitter has been making various changes to its robots.txt file and HTTP status codes. These changes temporarily resulted in unusual url canonicalization for Twitter by our algorithms. The canonical urls have started to settle down, and we’ve pushed a refresh of the toolbar PageRank data that reflects that. Twitter continues to have high PageRank in Google’s index, and this variation was not a penalty.

    Twitter’s PR is a 9. Twitter’s wasn’t the only one to change, however. Various webmeisters took to the forums to note that their own PageRank had been changing.
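
    To illustrate why robots.txt changes can shift which URLs a crawler is able to see (and therefore which URL it settles on as canonical), here is a small sketch using Python’s standard robots.txt parser. The rules shown are invented for the example and are not Twitter’s actual file.

    from urllib.robotparser import RobotFileParser

    def allowed(robots_lines, url, agent="Googlebot"):
        # Parse a robots.txt body and ask whether the crawler may fetch the URL.
        rp = RobotFileParser()
        rp.parse(robots_lines)
        return rp.can_fetch(agent, url)

    before = [
        "User-agent: *",
        "Disallow: /search",
    ]
    after = [
        "User-agent: *",
        "Disallow: /search",
        "Disallow: /webpronews",  # invented rule: a profile page suddenly blocked
    ]

    url = "http://twitter.com/webpronews"
    print(allowed(before, url))  # True  -> the page stays visible to the crawler
    print(allowed(after, url))   # False -> the crawler loses sight of it; canonical signals can shift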

    Google is Nixing the Google Toolbar for Firefox

    While we’re on the topic of the Google Toolbar, it’s also worth noting that it’s being discontinued for Firefox.

    “First of all, we’d like to thank all of our loyal users of Google Toolbar for Firefox,” Brittney said on the Google Toolbar Help blog. “We deeply appreciate all of the feedback over the years that helped to make the product so useful. As we all know, over the past few years, there has been a tremendous amount of innovation in the browser space. For Firefox users, many features that were once offered by Google Toolbar for Firefox are now already built right into the browser. Therefore, while Google Toolbar for Firefox works on versions up to and including Firefox 4 only, it will not be supported on Firefox 5 and future versions. Please see our Help Center for additional details.”

    Google’s own Chrome browser has over 160 million users, according to Page.

    Google +1 Button Impressions

    Page also announced that the +1 button is being served 2.3 billion times a day. That means people are consuming a whole lot of content out there that carries this button. The button itself, as you may know, contributes directly to search rankings. The more +1’s a piece of content gets, the more signals Google is receiving that people like this content, which increases its chances of ranking better.

    It’s just one of many signals Google uses, but it’s a pretty direct signal.

    While the button is yet to be integrated directly into Google+, the tremendous momentum of Google+ will likely only serve to fuel clicks of the +1 button. When I say it’s not integrated, I mean that when you click the +1 button on a piece of content, it’s not sharing it to your followers’ streams. It’s not like Facebook’s “like” button, where it promotes that content to your friends’ news feed. At least not yet. It goes to a separate tab on your Google profile that few probably see.

    Still, despite any confusion that may arise from that, people are going to associate that “+1” with Google+. They’re not only seeing it on content on the web, but on Google+ posts from within the social network. Presumably, they’ll click it on the web more too.

    Is Google+ going to greatly impact search? Let us know what you think in the comments.

  • A New, Fun Way to Use StumbleUpon


    StumbleUpon is testing a new feature called the “Explore Box”.

    “For any interest, type a word into the Explore Box, and we’ll pull out great stumbles for you that match that interest,” the company explained in an email.

    It could be viewed as a new way to search the web. As you can see, it looks very much like a search box, only instead of a search results page, you are simply transferred to the StumbleUpon experience of stumbling through various webpages.

    StumbleUpon Explore

    Obviously, you wouldn’t want to use this to search for specific information the way you would use a search engine like Google, but it would be more useful for searching for cool things related to certain keywords – things you may not know exist. In other words: search for when you don’t know what you’re really looking for.

    Stumbleupon Exploring SEO

    What it does more than anything is expand the StumbleUpon experience to encompass a much broader set of topics, while allowing the user to drill down to very specific things using keywords. Perhaps before you had food listed in your topics of interest, but now you can stumble through pot roast (and see some pretty good-looking recipes, I might add).

    The company is asking for feedback from testers, asking them to rate the “Explore an Interest” feature on a scale of 1 to 10 and explain why.

    StumbleUpon Explore Survey

    If the feature makes it out of testing, it may become more important to use keyword tagging when stumbling items.

  • Google Letting Users Know Their Computers are Infected

    Google announced today that some users will see a message at the top of their search results telling them that their computer is infected, after the company discovered some unusual search activity.

    “As we work to protect our users and their information, we sometimes discover unusual patterns of activity,” explains security engineer Damian Menscher. “Recently, we found some unusual search traffic while performing routine maintenance on one of our data centers. After collaborating with security engineers at several companies that were sending this modified traffic, we determined that the computers exhibiting this behavior were infected with a particular strain of malicious software, or ‘malware.’”

    “This particular malware causes infected computers to send traffic to Google through a small number of intermediary servers called ‘proxies,’” says Menscher. “We hope that by taking steps to notify users whose traffic is coming through these proxies, we can help them update their antivirus software and remove the infections.”

    Important: we’ve detected some specific malware. Go to google.com to see if you have it: http://t.co/Jc6WAGw Please RT!

    Matt Cutts said on Google+, “We’re trying this as an experiment to alert and protect consumers that we believe have infected machines. Please share this widely…This is malware that’s specific to Windows. Remember to do an actual search (any search will do) and check the top of the search results page; don’t just go to the home page.”

    Menscher brings up the possibility that the message might not actually reach everyone, but offers users an alternative way to see if their computer is infected. He points to a three step process here.

    Google offers additional security advice here.

  • Google Panda Update: HubPages Enables Subdomains to Help Content Recover

    Is subdomaining the answer to a recovery from the Google Panda update? After some testing, HubPages is convinced that it can play a significant role, at least for a site of its type, which includes numerous articles from numerous authors.

    The company announced new changes to its site on its blog today reflecting this thinking, and saying that as a result, it “should allow each author to be judged by Google separately.”

    Right now, HubPages is letting authors set up their own subdomains. HubPages’ Simone Smith writes:

    To test the success of moving accounts to subdomains, we ported several accounts over with the expectation of some traffic improvement based on an earlier experiment, but with the understanding that there was some risk involved. We have concluded the test, and after 2 weeks of observing Google’s response, we saw a dramatic recovery among many accounts, validating the decision to move each Hubber’s account under his/her own subdomain. We expect that, with the move, some accounts will recover traffic, while others won’t.

    Based on these positive results, we have opened up the option to move Hubber accounts over to subdomains to the entire community.  Moving to your own subdomain will comprise 2 steps:
    1. Selecting your subdomain, and
    2. Activating the move.

    HubPages has a more detailed walkthrough of the process here.
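
    For sites weighing a similar move, the mechanical piece is mapping each author’s existing URLs to the new subdomain and answering the old URLs with permanent (301) redirects. The path pattern below is hypothetical and only sketches the idea; HubPages’ actual URL structure and redirect handling may differ.

    from urllib.parse import urlparse

    def subdomain_redirect(old_url, base_domain="hubpages.com"):
        # Map a hypothetical /hub/<author>/<slug> URL to the author's subdomain,
        # returning the target of a 301 redirect (or None if the URL doesn't match).
        parts = [p for p in urlparse(old_url).path.split("/") if p]
        if len(parts) >= 3 and parts[0] == "hub":
            author, slug = parts[1], "/".join(parts[2:])
            return "http://%s.%s/hub/%s" % (author.lower(), base_domain, slug)
        return None

    print(subdomain_redirect("http://hubpages.com/hub/SomeAuthor/best-pot-roast"))
    # -> http://someauthor.hubpages.com/hub/best-pot-roast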

    The company says that in most cases, the subdomain will be users’ usernames or usernames without spaces, periods, underscores, etc. For those that aren’t available, they’ll present other, similar options or let users suggest other options.

    Users will be able to claim subdomains for about a week; after that, HubPages will automatically assign subdomains based on usernames.

    It will be interesting to watch HubPages over the next couple of months, and see how its overall traffic is affected by this change. If it proves successful, I suspect we’ll see some other victims of the Panda update implementing similar strategies.

  • Google Launches g.co URL Shortener for Google Services

    Google announced the launch of g.co, a new URL shortener that will only link to official Google products and services.

    “The shorter a URL, the easier it is to share and remember,” said Google VP Consumer Marketing Gary Briggs. “The downside is, you often can’t tell what website you’re going to be redirected to. We’ll only use g.co to send you to webpages that are owned by Google, and only we can create g.co shortcuts. That means you can visit a g.co shortcut confident you will always end up at a page for a Google product or service.”

    “There’s no need to fret about the fate of goo.gl; we like it as much as you do, and nothing is changing on that front. It will continue to be our public URL shortener that anybody can use to shorten URLs across the web.”

    In April, Google launched some upgrades to goo.gl like an improved “copy to clipboard” option, a remove from dashboard option, and spam reporting. There were also improvements to speed and stability.

    Given that Google just launched its new social network Google+ a couple weeks ago, I’d have to assume we’ll still be seeing plenty of goo.gl URLs. It will be interesting to see where Google ends up using the new ones.

    Earlier this year we looked at URL shorteners and how they affect SEO, with commentary from Google’s Matt Cutts:

    More on that here.

  • Google May Sticky Search Box, Navigation Bar

    A Google design tweak being tested would see the bar at the top of Google.com, as well as the left panel with the search options remain in place as the user scrolls through search results.

    Alex Chitu at the Google Operating System blog has shared a video from Alon Laudon of what it looks like:

    It’s a subtle, but helpful change in my opinion, and I would not be surprised to see this one become an actual feature. It would also help keep Google+ on the screen at all times (the notifications and share box). This would make it easy to share content at any time while on Google.

    This kind of interface already exists on Google Image Search, as Chitu points out, and that comes with infinite scrolling. One has to wonder if this would open the door to infinite scrolling on regular search results as well. If so, that ought to be good for boosting those first page rankings. In all honesty, I can actually see this benefiting pages that would rank on the second or third page of results, because users wouldn’t be required to click through to another page to see them. It’s much easier to keep scrolling down.

    I really can’t see much negative about the fixed navigation or the possibility of infinite scrolling in search results. How about you?

  • FairSearch Blasts Google with “Searchville” Site

    The FairSearch Coalition, the organization started last year in an unsuccessful attempt to block Google’s acquisition of ITA Software, has launched a new site called Searchville with videos, comic strips and other interactive content, as well as a white paper, designed to illustrate what it considers to be “Google’s anti-competitive business practices,” and how they “harm consumers and advertisers and curb innovation.”

    As you’re welcomed to Searchville, FairSearch tells you:

    Searchville is like a lot of towns where one company dominates.

    Visit Searchville’s local businesses to learn more about the impact that Google’s anticompetitive behavior has on search, companies who advertise and ultimately consumers.

    A trip to the Searchville Lemonade Stand tells the story of Google’s practice of scraping content from other websites.

    Visit Club Droid to learn about Google’s exclusive deals which harm partners and block the introduction of new innovations for consumers.

    The Searchville Travel Agent demonstrates the negative impact of Google’s acquisitions of competitive threats on competition and innovation.

    Searchville

    The Federal Trade Commission recently launched an investigation into Google’s business practices, while the Department of Justice is currently scrutinizing the company’s proposed acquisition of ad optimization firm AdMeld.

    View our past coverage of FairSearch here.

  • Microsoft Doing Something Else with Social Search (Internally)

    What looked like the beginning of a social search engine from Microsoft was spotted at socl.com. Fusible discovered that Microsoft owns the domain socl.com, visited the URL, and was greeted by a landing page for something called Tulalip.

    The screen (pictured below) said: Welcome – With Tulalip you can find what you need and share what you know easier than ever. There was a link for “see how it works” and some boxes with people’s faces in them, reminiscent of the friend/follower display on various social networks. There was also the ability to sign in with either Facebook or Twitter.

    Tulalip from Microsoft Research

    J.B. at Fusible wrote:

    The four letter domain socl.com would complement bing.com.

    Although the site isn’t operational, visitors can get an idea of where Microsoft is going with the service called “Tulalip”, which also happens to be the name of a group of Native American tribes located not far from Redmond, Washington, where Microsoft is headquartered. 

    If you go to socl.com now, you are greeted with the following message:

    Thanks for stopping by. 

    Socl.com is an internal design project from a team in Microsoft Research which was mistakenly published to the web.

    We didn’t mean to, honest.

    So is this simply an experiment the company is testing internally for possible future Bing features? Matt McGee at Search Engine Land shares a screenshot of what it looked like if you tried to sign in with Twitter, and the authorization box said the app would be able to:

    – Read tweets from your timeline
    – See who you follow, and follow new people
    – Update your profile
    – Post tweets for you.

    It’s hard to say just what the company is up to with this. Bing already has social search features of course, and will no doubt continue to improve upon them.

  • Google Now Lets You Search for Recent Images

    Today, Google revealed a new date-filtering feature for Google Images, which lets you search for recent images. On Google’s Inside Search Blog, Google Images software engineer James Synge wrote:

    Back in April, we introduced Google Images with date annotations, a change that added dates on image thumbnails to help you see which images are most recent. Now, you can use the new date filter in the left-hand set of tools to narrow your search to just images from the previous week.

    For example, if you search for [space shuttle], you’ll see images for shuttle launches throughout the years. If you want to see just recent images, like ones of Atlantis, the most recent NASA shuttle to launch, you can click “Past week” in the left-hand panel of tools to see images from the last seven days.

    Google Image Search Filter

    Use our new date filter to see recent images for Harry Potter, the Women’s World Cup or anything else http://t.co/k1epqAB

    We recently ran an article by Michael Gray about how to optimize your images for search engine traffic. This new feature is simply one more thing to take into consideration.

    Last month, Google revealed the most innovative desktop image search feature we’ve seen in quite some time when it announced Search By Image. While it utilizes technology from Google Goggles, the company’s mobile app that lets you take pictures to search for things, it’s quite different to be able to drag an image file into the search box to enter a query.

    It just goes to show that while images have been a key part of search for quite some time, they are becoming an even more interesting part of the picture. I’d expect Google to continue to make progress in this area, particularly as it continues to photograph the world for Google Maps, and even business interiors.

    Now, they’re giving people Instant Upload with Google+ on mobile. People are already sharing images much more rapidly than ever before.

  • Is Google Creating Too Much Confusion Around Google+?

    I was perusing a thread at WebmasterWorld about how people are trying to get more traffic from Google’s +1 button, and one person’s comments kind of drove home a point I’ve been thinking about lately.

    Regarding the +1 button, member Shatner said:

    Here’s a personal anecdote to illustrate the uphill battle here.

    Tonight I was talking to a friend of mine, a friend who I know visits my site religiously, 5 – 10 times a day.

    He IM’d me and said, “Hey have you heard about this new +1 thing Google is doing? I heard about it on the radio. You should add that to your site.”

    To which I responded, “I’ve had the +1 button on my site for a month now!”

    To which he responded, “Oh is that what that is? I saw it, but didn’t know what it was for.”

    Since the guy apparently only recently heard about it on the radio, I have to wonder if he was confusing it with Google+, which has been the subject of much more press coverage since its announcement. To the average user of a site, who doesn’t follow news about Google religiously, it’s quite likely that the +1 button will be assumed to be directly related to Google+.

    Sure, it’s related to some extent. There are +1 buttons on posts in Google+. One could hardly blame someone for assuming it’s like the Facebook “like” button for Google+. This isn’t the case though. If you hit the +1 button on an article on the web, it’s not going to show up in the stream (the Google+ news feed, if you will). It’s going to show up in a separate tab on your Google Profile (and everyone’s rushing to check that out, right?). This could change, but that’s how it works for now.

    I’ve shown skepticism about just how much the average web user would be compelled to click “+1” on any given article, even before the launch of Google+. Google+ hasn’t done much to change this, other than the fact that some might be misled into thinking it’s going to share the content to their Google+ accounts.

    That’s not to say that there won’t be more integrations in the future. Googlers are taking to the new social network to connect with users for feedback on improving the service and finding new and useful ways to implement it.

    Of course plenty still don’t even know what Google+ itself is. When I asked my Facebook friends (many of whom are just people I know in real life, and are not necessarily big followers of the tech and marketing industries) if anyone wanted or needed a Google+ invite, there might as well have been an animated gif of a tumbleweed blowing by. Then someone finally asked, “What’s that?” Eventually one person asked for an invite (not the person that asked what it was, despite my offer of an explanation).

    There are several possible reasons for the lack of response:

    1. The status update didn’t make it into everyone’s news feeds (likely).
    2. They are already on Google+ (Not so much. I haven’t been able to find many of them on there.).
    3. They don’t want to bother with another social network (likely – see obstacle 1 from this article).
    4. They don’t even know what that is, and therefore don’t have much reason to request an invitation.

    Things will probably change in that regard. It’s had strong buzz among early adopters, and Google will continue to push it and integrate it with various products. The branding will come. As the integrations come, it will start to make more sense to more people, I think.

    In terms of the +1 button, I think as more people use Google+, we might see more people clicking it, but right now, it’s missing that “check this out” feel of the like button, simply because nobody’s “checking out” your +1’s. That’s my gut feeling, anyway.

    To me, it feels like the +1 button is likely to only be clicked (for the most part) by search-savvy people, and those trying to game it (see aforementioned WebmasterWorld thread). That’s hardly representative of people who use Google, which means that maybe it shouldn’t necessarily be indicative of quality results.

    Feel free to disagree.

  • Google Panda Update: The Solution for Recovery?

    Many sites are still wondering how they can come back from being hit by the Google Panda update. Google has certainly stressed quality, and victims of the update have been striving to improve it, but have had little luck in terms of boosting their rankings for the most part.

    Have you been able to recover any search traffic after being hit by the Panda update? Let us know.

    When we talked to Dani Horowitz of DaniWeb, she told us about some other things she was doing that seemed to be helping content rank better, but it was hardly a full recovery in search referrals.

    An article ran at WSJ.com about HubPages, one of the victims that we’ve written about a handful of times. CEO Paul Edmondson is claiming that the use of sub-domains is helping its content work its way back up in Google – something he stumbled upon by accident, but also something Google has talked about in the past.

    The article quotes him as saying that he’s seen “early evidence” that dividing the site into thousands of subdomains may help it “lift the Google Panda death grip.” Amir Efrati reports:

    In June, a top Google search engineer, Matt Cutts, wrote to Edmondson that he might want to try subdomains, among other things.

    The HubPages subdomain testing began in late June and already has shown positive results. Edmondson’s own articles on HubPages, which saw a 50% drop in page views after Google’s Panda updates, have returned to pre-Panda levels in the first three weeks since he activated subdomains for himself and several other authors. The other authors saw significant, if not full, recoveries of Web traffic.

    The piece also points to a blog post Cutts wrote all the way back in 2007 about subdomains. In that, Cutts wrote, “A subdomain can be useful to separate out content that is completely different. Google uses subdomains for distinct products such as news.google.com or maps.google.com, for example.”

    HubPages is rolling out subdomains for all authors, which in theory, should help the site’s performance remain tied to the quality of the output by specific authors. This is also interesting given that Google recently launched a new authorship markup, putting more emphasis on authors in search results.

    When that was launched, Google said in the Webmaster Central Help Center, “When Google has information about who wrote a piece of content on the web, we may look at it as a signal to help us determine the relevance of that page to a user’s query. This is just one of many signals Google may use to determine a page’s relevance and ranking, though, and we’re constantly tweaking and improving our algorithm to improve overall search quality.”

    It may be a little early to jump to the conclusion that subdomains are the silver bullet leading to a full Panda recovery, but for those sites with a mix of great quality and poor quality content, this could very well help at least the great stuff rise. It will be interesting to see how HubPages performs over time, once the new structure has been live for a while.

    Google’s statement on the matter (as reported by Barry Schwartz) is: “Subdomains can be useful to separate out content that is completely different from the rest of a site — for example, on domains such as wordpress.com. However, site owners should not expect that simply adding a new subdomain on a site will trigger a boost in ranking.”

    To me, it sounds like if your entire site was hit by the Panda update because of some content that wasn’t up to snuff in the eyes of Google, but some content is up to snuff, you may want to consider subdomains, at least for the stuff that Google doesn’t like – to “separate it out”. You’ll have to do some content evaluation.

    Edmondson’s concept of doing it by author actually makes a great deal of sense. It makes the authors accountable for their own content, without dragging down those who have provided quality content (again, in theory). Not everybody hit by Panda is a “content farm” (or whatever name you want to use) though. For many, it won’t be so much about who’s writing content.

    Content creators will still do well to consider Google’s list of questions and focus on creating content that is actually good. In case you need a recap on those questions, they are as follows:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    Those are, by the way, “questions that one could use to assess the ‘quality’ of a page or an article,” according to the company.

    What do you think of the subdomain theory? Tell us in the comments.

  • Are Things Getting Better for Facebook App Developers?

    Facebook announced that it is removing its app directory, but also creating a new way to get apps into the Facebook search index. The company says the App Directory (in its current form) just doesn’t drive a significant amount of traffic to apps.

    “Many developers have been confused about what it means to submit to the App Directory and frustrated by the length of time it took to get approved,” says Facebook’s Carl Sjogreen. “As we have looked into this issue, we found that the App Directory drove less than 0.5% of all app installs while a significant number of app installs came as a result of Facebook search. Until now, to be visible in search, you had to submit your app to the App Directory.”

    For your app to show up in search, it has to have over 10 monthly active users if it’s not already listed. You can go to the Developer App and click “submit to Search” in the left sidebar.

    Facebook Submit to Search

    “After clicking the link, we will submit your app to our search index,” says Sjogreen. “There is no longer an approval process for getting your app into search. When you make any changes to your app settings, we will automatically update the listing. It can take up to 72 hours for your app to appear in search results. As always, there is no approval process for launching an app on Platform.”

    He adds that Facebook doesn’t expect any noticeable decrease in traffic to apps as a result of the changes.

    The comments on the announcement seem generally positive for the most part, though there is a bit of skepticism about how well it will really work. Michael Robellard, who works at American Greetings Interactive, for example, questions what patterns people are looking for when they search in the search bar, and how the apps are being presented. In other words, are they looking for exact titles or types of apps?

    “If you know exactly what app you are looking for this works great, otherwise not so well,” he comments. “The old app directory didn’t work well either, so I am OK with it going away.”

    “If the search box is a way that users try to find good meaningful apps to use, then I would love to see it improved, to give more relevant results, if the search box is not used much more than the App Directory was used, then we should try and come up with other ways to connect users to our apps,” he adds. “Ads are OK, but it seems that there should be some sort of directory/store type mechanism as that has become very effective and popular on Mobile platforms and is beginning to show up on desktops (Mac App Store) and in browsers (Chrome App Store).”

    Facebook did launch new Insights for developers last week, which should help them take actions to improve their app performance. This comes after Facebook’s “ban bot” sent a tidal wave through the developer community, shutting down apps it deemed too spammy without warning. Had the only victims actually been unwanted spam apps, that would’ve been one thing, but some seemingly legitimate apps were pulled down as well. You can see one case about this here.

    Facebook’s Mike Vernal said, “To prevent spam and other bad user experiences, we have systems in place that constantly monitor user feedback about apps. Historically, if an app crosses a threshold of negative feedback, our systems have automatically disabled the app.”

    “We recently launched some changes to those systems that over-weighted certain types of user feedback, causing us to erroneously disable some apps,” he added. “While we quickly re-enabled those apps, we realize that any downtime has a significant impact on both our developers and users. Many of our developers have chosen to build their businesses on top of Facebook, and we take that responsibility very seriously.”

    The company launched what it called “improvements to its enforcement systems” to provide more user feedback directly to developers. They began rolling out a “News Feed” tab in Insights to show positive feedback like comments, likes and clicks, as well as negative feedback like hides and marks as spam.

    Spam Insights for Facebook developers

    In addition, the company is taking a “granular enforcement” approach. Vernal explains, “When our systems detect an excessive amount of negative user feedback, we will look to disable only the impacted social channel. For example, if an app is generating a lot of negative feedback via chat messages, we will take action only on that app’s ability to publish to chat but otherwise leave the app intact. Developers will be able to appeal these granular enforcement actions.”

    If Facebook still decides it needs to disable an app, it will be placed in a new “disabled mode” rather than just being deleted. This way, users won’t be able to access the app, but developers will be able to edit it, view insights, and continue appealing, presumably in hopes that if enough changes are made, Facebook will re-enable it. We’ll see how that goes.

    One more important thing to note is that Facebook says it will be moving from per-channel enforcements to a “more sophisticated” ranking model where “the amount of distribution that content gets will be a direct function of its quality.”

    According to Facebook, this means good content will be seen by more people, and bad content will be seen by fewer. We’ll see how that goes too.

  • A Holistic Look at Panda with Vanessa Fox

    Vanessa Fox, called a cyberspace visionary by Seattle Business Monthly, is an expert in understanding customer acquisition from organic search. She shares her perspective on how this impacts marketing and user experience and how all business silos (including developers and marketers) can work together towards greater search visibility at ninebyblue.com. She’s also an entrepreneur-in-residence with Ignition Partners, Contributing Editor at Search Engine Land, and host of the weekly podcast Office Hours. She previously created Google’s Webmaster Central, which provides both tools and community to help website owners improve their sites to gain more customers from search and was instrumental in the sitemaps.org alliance of Google, Yahoo!, and Microsoft Live Search. She was named one of Seattle’s 2008 top 25 innovators and entrepreneurs. Her book, Marketing in the Age of Google, provides a blueprint for incorporating search into organizations of all levels.

    Key Interview Points

    I really enjoy speaking with Vanessa about search because of her perspective about how to do things. As readers of mine know, I am a fan of the trite old way of doing it – producing a great web site, making it search friendly, and then promoting it well. Vanessa is truly an industry leader in promoting this type of thinking.

    This is a great interview for you to read if you want to get a strong feeling for the philosophy that drove the Panda algorithm, and the implications of that philosophy going forward. Here are some of the major elements that I extracted (and paraphrased except in those situations which are quoted) from the discussion we had:

    1. Like any business, Google seeks to maximize its profitability. However, Google believes that this is best done by providing maximum value to end users, as this helps them maintain and grow market share. They make more money this way than trying to squeeze extra CPM out of their web pages at the cost of user experience.
    2. The AdWords team does not have access to the organic search team, and as a result the engineers working on organic search are free to focus on delivering the best quality results possible.
    3. (Vanessa) “Panda isn’t simply an algorithm update. It’s a platform for new ways to understand the web and understand user experience”.
    4. Panda is updated on a periodic basis, as opposed to in real time. This is similar to updates to the PageRank displayed on the Google Toolbar, except it is a whole lot more important!
    5. It is easier to reliably detect social spam than link spam.
    6. (Eric) “If you’ve got twelve different signals and someone games two of them and the other ten don’t agree, that’s a flag.”
    7. Don’t focus on artificial aspects of SEO. If it seems like a hokey reason for a web page to rank higher, it probably isn’t true. If by some chance it is true, first, it is most likely a coincidence, and second, and more importantly, you can’t count on it staying that way.
    8. (Vanessa) “I suggest you get an objective observer to provide you feedback and determine if there are any blind spots you’re not seeing.”
    9. (Vanessa) “The question then becomes if someone lands on your site and they like that page, but they want to engage with your site further and click around your site, does the experience become degraded or does it continue to be a good experience?”
    10. Added value is key. Search engines are looking more and more for the best possible answer to users’ questions. Even if your article is original, if it covers the exact same points as hundreds of other articles (or even 5 other articles), there is no added value to it.
    11. Reviews can be a great way to improve web page content provided that they are contextually relevant and useful.
    12. Crowd sourced content is also potentially useful, but must also be relevant and valuable.
    13. One of the challenges facing both UGC and crowd sourced content is the editorial work of making sure it is useful and relevant.
    14. Branding can be very helpful too, as it helps people trust the content more. Search engines recognize this as a differentiator as well.
    15. (Vanessa) “I think social media levels that playing field a bit. In the past, you had to hire a publicist, do press releases, have relationships with reporters, and get on Good Morning America, or something on that order, to get your name recognized.”
    16. SEO is still important! Making sites that are easily understood by search engines is still something you need to do. Effective promotion of your web site remains critical too.
    17. Unfortunately, for many sites that have been hit by Panda, there is no quick fix. There are exceptions, of course, but they will be relatively rare.

    Motivations of Google

    Eric Enge: Let’s talk about what Panda was from a Google perspective and what they were trying to accomplish rather than the mechanics of what they did.

    Vanessa Fox: I like that you addressed it that way, because many people simply want to know mechanically what they did.

    This update took many people by surprise and, certainly, there are things to be worked out. However, Google has never been secretive about what it’s trying to accomplish and, specifically, what it’s trying to accomplish with Panda.

    Ever since Google launched, its primary goal has been to figure out what searchers want and give them that. This encompasses a lot of things. It encompasses answering their question as quickly and as comprehensively as possible. It involves all the things you think about in terms of making the searcher happy and providing a good user experience.

    In the early days of the web, the only way Google knew if people found something valuable was if there was a link to it. Today, the web is more sophisticated and Google has much more information available to it. The bottom line is that Google is trying to provide the best results for searchers and, for them, Panda was a major step forward in accomplishing this.

    Eric Enge: Yes, some people believe that Google made these changes because it favors its advertisers and its objective is to make more money in the short term. I don’t believe this. To me, the value of market share far outweighs the impact you could get by jacking up your effective CPM by a few percent on your pages.

    Vanessa Fox: That’s absolutely right. It is short term and shortsighted to think Google is focused on improving CPMs or is trying to drive people, who lost ranking in the organic results, to advertise via AdWords. Google is looking for long term market share which is the best way for them to maximize profitability.

    The root of their market share is the fact that they get so many people searching all the time. The best monetary decision for the company is to ensure that searchers experience excellent search results. That’s the core that’s going to help Google maintain their market share which, in turn, is what will help them grow.

    Eric Enge: I’ll paraphrase it simply and say they are totally selfish and they are being selfish by working on their market share.

    Vanessa Fox: That is exactly right. Many people don’t believe that there is a wall between the organic search people and everything else at Google. If they didn’t have such a wall you would have a situation where someone on the AdWords team would be approached by a large advertiser saying “I am having problems with the organic results, can you help me?”

    Of course, that person would want to help the advertiser. By having that wall, the AdWords person doesn’t have access to the organic search people. There is this protectiveness around organic search, which enables those engineers to focus on the search experience. They don’t have to think about AdWords, they don’t have to think about how Google is making money, or what the CPMs are. They don’t have to think about any of those things and are able to concentrate on making the best search experience.

    The whole environment was built that way, which is unlike many other companies. In other companies, no matter what part of the organization you work in, you always have to think about how this impacts revenue. At Google this is not part of the search engineers’ focus, which is great. Another reason is that many of the search engineers have been at Google since the beginning. They don’t have to work there anymore.

    Eric Enge: At this point they could easily retire and buy an island.

    Vanessa Fox: They continue to work there because they love data and love working with large amounts of data and improving things. I think if someone said to them, “I know you work on organic search, but we’ve decided it’s really important to either give advertisers preference or hold advertisers down. Could you tweak the algorithms?” they would probably say, “I am going to buy my island now, see you later.”

    That’s not why they are at Google. They are there because they get to do cool things with large pieces of data. I think these two big factors make it basically impossible for anything other than a focus on the search experience to infiltrate what’s going on there.

    Think of Panda as a Platform

    Eric Enge: What is Panda?

    Vanessa Fox: Panda isn’t simply an algorithm update. It’s a platform for new ways to understand the web and understand user experience. There are about four to five hundred algorithm updates a year based on all the signals they have. Panda updates will occur less frequently.

    Eric Enge: Right. In the long run, it will probably be seen as being as significant as the advent of a PageRank update.

    Vanessa Fox: Yes, absolutely.

    Eric Enge: At SMX Munich Rand Fishkin heard from Stefan Weitz and Maile Ohye that it’s a lot easier to recognize gaming of social signals than it is to recognize link spam.

    Vanessa Fox: The social signals have more patterns and footprints around them. Also, the code that search engines use has gotten more sophisticated, and they have access to more data.

    Eric Enge: Another thing I hear people talking about is that over time Google is looking to supplant links with other signals. My take on this is that links are still going to be a good signal, but they are not going to be the only signal.

    Vanessa Fox: Google has been saying that for years. I don’t think the value of links will ever go away. They’ll continue to be augmented with more data, which will make the value of links less important because there are many other signals now in the mix.

    Google never intended to be built solely on links. In the past, we didn’t have social media, Facebook Like buttons, and all these things; we only had links. Google was based on the question of how to build an infrastructure that algorithmically tells us what content people are finding most valuable on the web.

    Google and Bing as black boxes

    Eric Enge: I think another key component of this story is that Google and Bing are increasing the obscurity of the details of the algorithm. That’s not perfect phrasing, but I think you know what I mean.

    Vanessa Fox: I think it becomes harder to reverse engineer for a number of reasons. There are so many moving parts that it’s hard to isolate. People who have systems that attempt to reverse engineer different parts of the algorithm for different signals may come to conclusions that are, or are not, accurate. This is because it’s impossible to isolate things down to a single signal.

    You find cases where people think they have isolated a signal but, in reality, it’s the tip of an iceberg, because you can’t see everything that’s under the surface. By having more signals and knowing so much more about the web, the artificial stuff becomes more obvious.

    Eric Enge: Absolutely. If you’ve got twelve different signals and someone games two of them and the other ten don’t agree, that’s a flag.

    Vanessa Fox: Right. Which is why it’s so disheartening to me to see that some SEOs continue to react to this by saying, “okay, how can we figure out the algorithmic signals for Panda so we can cause our pages to have a footprint that matches a good quality site.” This is very short term thinking because the current signals are in use only during this snapshot in time.

    At this point it’s going to be as difficult to create a footprint of a site with a good user experience as it would be to just create a site with a good user experience. This, of course, is not only a better long term perspective and more valuable, but it will result in a better rate of conversion for most businesses.

    I’ve heard some people say things like, they’ve done some analysis and found that you have to vary the length of your articles on pages, so make sure that all of your articles are variable in length. And this is craziness. Even if it works this minute, next week it won’t work and then they will say the sky is falling again.

    I read an article where a person said Seth Godin writes really short blog posts so he is going to be impacted by Panda, and asked how Google knows that a short article isn’t valuable. But Google’s algorithms are not as simplistic as that. Seth Godin has not said he’s lost ranking because of Panda.

    I commented on the post, and said this is not true. Google isn’t saying that a short article is not a valuable article. Publishers should make blog posts or articles as short or long as they need to be.

    Eric Enge: There will be plenty of cases where the best article is a short article.

    Vanessa Fox: Absolutely and those will continue to rank.

    How Publishers should think about Panda

    Eric Enge: What would you say to a publisher if they believe they were unfairly affected by Panda? This is a tough question because 98% of the people affected by Panda will say they are in this category. They believe they were a drive-by victim rather than something that fell out of the algorithm.

    Vanessa Fox: That is a complicated question. I will not dispute, and I don’t think Google would dispute, that any algorithmic change from any search engine has the potential of causing some collateral damage. If what you are doing as a search engine is asking, “are the search results better?”, then even if the search results are better overall, that doesn’t mean that a site with good content can’t accidentally end up lower.

    That’s going to be the case with any change a search engine makes. From a content-owner perspective that is not good, which we’ll talk about in a second. However, I talked to many people affected by this, and 75% to 80% of the time they said, “I’ve been hit and I shouldn’t have been hit.” There have been only a few occasions where people say, “yeah, I’ve gotten away with it for a long time and they cut me off.”

    Eric Enge: You appreciate their honesty, don’t you?

    Vanessa Fox: Oh, absolutely. But most of the time people say I shouldn’t have been hit. If you’ve been working on a site for a long time, you may not see the areas it can be improved. I suggest you get an objective observer to provide you feedback and determine if there are any blind spots you’re not seeing. I think that would be a good first step.

    Essentially, this has become a holistic thing. It’s not one signal that’s been used. You need to determine does this page answer the question, does this help someone accomplish something?

    As a business you have to make money. You also have to understand that if a site is optimized for making as much money per visitor from ads as possible, as opposed to being optimized at being useful to the searcher, this site is probably not what a search engine wants to show as the best search results.

    You have to balance that. Does it answer a searcher’s question, but also does it answer that question better than any other site, and is the answer easy to find? Look at the quality of what’s being said versus the quality of the other pages that are ranking. Is it better or worse? Then you have to determine if the content is awesome and if that is obvious to the searcher.

    From a user experience perspective, when they land on that page is the content they need buried? The user experience becomes important because Google wants the searcher to be happy and easily find their answer.

    Let’s say the content and the user experience are good for that page. Then you run into the issue of quality ratio of the whole site. The question then becomes if someone lands on your site and they like that page, but they want to engage with your site further and click around your site, does the experience become degraded or does it continue to be a good experience?

    For example, last year Google had this emphasis on speed, because their studies found that people are happier when pages load faster and abandon sites that load slowly. I’ve worked with companies whose pages take fifteen seconds before they load. No one will wait around anymore for fifteen seconds to load a page.

    I don’t think this is a big part of Panda; it is just for illustration purposes.

    If you isolate that as a signal you can have the best content in the world and the best user experience in the world. However, if someone does a search and lands on your page but it takes fifteen seconds for anything to appear, they’ve had a bad experience and they are going to bounce off.

    You have to look holistically at everything that’s going on in your site. This is what you should be doing, as if search engines didn’t exist.

    Eric Enge: Right. There is another element I want to get your reaction to, which I refer to as the “sameness” factor. You may have a great user experience. You may have a solid set of articles that cover hundreds of different topics, and they may all in fact be original. However, it’s the same hundred topics that are covered by a hundred other sites, and the basic points are the same; even though the content is original, there is nothing new.

    Vanessa Fox: Right. I think that’s where added value comes into play. It’s important to look and see what other sites are ranking for. What are you offering that is better than other sites? If you don’t have anything new or valuable to say then take a look at your current content game plan.

    Eric Enge: So, saying the same thing in different words is not the goal. I like to illustrate this by having people imagine the searcher who goes to the search results, clicks on the first result and reads through it. They don’t get what they want so they go back to the search engine, they click on a second result and it’s a different article, but it makes the same points with different words.

    They still didn’t find what they want so they go back to the search engine, they click on the third result and that doesn’t say anything new either. For the search engine it is as bad as overt duplicate content.

    Vanessa Fox: That’s absolutely right.

    Eric Enge: It may not be a duplicate content filter per se, which is a different conversation than this one, but the impact is the same. It’s almost like an expansion of “query deserves diversity,” right?

    Vanessa Fox: Right. These concepts have all been around for a long time, but we are seeing them perhaps played out with different sets of signals, but they are not anything new. The search engines have always said they want to show unique results, diverse results, valuable results, all these things.

    Adding Diversity to your site with User Generated Content

    Eric Enge: One thing I hear people talk a lot about regarding diversity is doing things with user-generated content. In my mind that can be a useful component provided it is contextually relevant and has something useful to say. Do you have some thoughts on that?

    Vanessa Fox: Yes. I agree with you, it could go either way. Since Google’s goal is to provide useful, valuable results then you can certainly find pages where user-generated content provides that. If you look at TripAdvisor, which may have its faults, one benefit is that there are numerous first person accounts of hotels and other experiences.

    Any hotel or vacation destination you are thinking of going to, you will find authentic, real information from people who’ve actually gone there.

    Forums are another example where user-generated content is great. For instance on stackoverflow people are interested in answering questions and having discussions and that’s valuable content. You might have other forums where people aren’t saying anything or are there to spam and put their links.

    I think it depends on both the topic and how much you are moderating things, how much time you are spending in curation, how much time you are spending organizing things in a useful way so it’s easy to find.

    For instance, let’s say you have a recipe site and people tag their recipes with different variations. If you have a curation process that cultivates that and organizes it into topics, so that people can land on a landing page and see all of the recipes about a particular topic, that will be more useful than content scattered everywhere across random tag pages.

    I think there can still be work involved in UGC, although it can be useful and valuable. When you begin looking at health information, for instance, it might become harder. If it’s a site about sharing your experience about an illness, that’s one thing.

    If it’s a site about diagnosing people and telling them what they should do to fix their illness, that’s another thing. If it is a group of people as opposed to doctors, you get into this authoritative issue and how do you know it is credible.

    Crowd Sourced Content

    Eric Enge: There is a related topic that has a different place in the picture, which is the notion of crowd sourced content. Essentially, using crowd sourced data to draw a conclusion, for example, with surveys and polls.

    Vanessa Fox: This boils down to the same thing. Is it useful, valuable, credible, authoritative, and comprehensive? Is it all the things people are looking for and does it answer their question better than anything else out there on the web? We can look to TripAdvisor as an example of a site that’s been able to create valuable content on a large scale.

    At a larger scale you have to move towards automated processes and, at that point, the curation process becomes harder. Wikipedia has editors that are aggressive towards making sure the content is accurate. However, not all sites have that.

    When you do surveys it can be fine, but if you are not manually reviewing the results, because of the large volume of data, that’s when something can potentially go awry, so you have to be careful with it.

    The same thing can happen with aggregating data from different sources. If you look at something like Walk Score, they’ve been able to aggregate data on how close schools, bars, and other facilities are to your house. Of course, you see other examples where it goes poorly, and you look at the page and it doesn’t make any sense.

    Eric Enge: Right. It’s a matter of the context, the effort, and the level at which you are trying to do it.

    Vanessa Fox: Yes. I think ultimately there will be a fair amount of work involved with running a business that adds value for people. With this age of technology, you see many cases where people say, “look at all the cool things I can do with technology and it’s very little work on my part.” This is sort of the four-hour work week syndrome.

    Often, that does not produce the most valuable results. For instance, if we examine travel and look at a site like Oyster, which was started by Eytan Seidman who used to work on the search team at Microsoft, they pay full-time staff writers with a travel background to travel to hotels, write reviews, and take pictures. They aren’t in every city in the world, and they don’t have every hotel in the world.

    That’s a corporate example, but there are travel bloggers, and food bloggers, and other people who only write ten blog posts. However, those ten posts are very comprehensive on the topic.

    At a large scale, if you attempt to cover every topic in the world, you are not necessarily going to be able to compete with someone who has written something manually, gone there, and spent time editing their article. It wouldn’t make sense that your automated content would outrank them.

    Eric Enge: Absolutely. It reminds me of another thread which I am not sure fits in the interview, but I am going to say it anyway. When I grew up I watched the news with Walter Cronkite. He was completely trusted and authoritative. Today we have Fox News, which is entertainment.

    That’s the design of Fox News and more power to them; however, you have to imagine that as a culture we are going to have a drive towards getting news from a source that you can trust.

    Vanessa Fox: Right. Google did a blog post recently where they talked about the trust element. They said it is certainly one of the questions you should ask yourself when you are evaluating a site. Can you trust it?

    Eric Enge: Right. Will you give it your credit card or will you trust it for medical advice?

    Vanessa Fox: Would you follow the instructions to save your life? This is where brand comes in. I don’t think it has to be a huge brand, but brand does help the trust factor. Building a brand that people see over and over makes a difference.

    This is a major reason why I do not recommend microsites. I know many people who want to do a bunch of microsites, but lack of a brand is one reason I tell them it’s probably not a good idea.

    It’s hard to build a brand with a bunch of microsites that aren’t branded in a unified way. If you build one site under one brand you can build brand engagement; however, you can’t do that with a bunch of microsites that are branded separately.

    Social Media and Branding

    Eric Enge: Do you think an effective tactic for beginning to build the brand would involve social media?

    Vanessa Fox: It depends on the topic and audience. Where is your audience, are they on social media? If you can engage that audience and build up authority with them that is great. I think social media levels that playing field a bit. In the past, you had to hire a publicist, do press releases, have relationships with reporters, and get on Good Morning America, or something on that order, to get your name recognized.

    It still takes work but you can go out on social media, see where people are talking about your topic area, answer their questions, and be that authoritative source. I think it can be great but it doesn’t fit every situation.

    SEO still matters

    Eric Enge: One last question, since we’ve been talking about holistic marketing. The search engines still have mechanical limitations because of how they crawl web pages, so being search engine savvy is still important.

    Vanessa Fox: Absolutely. Search engines crawl the web and they index the web. Technical aspects, such as how the server responds, how the page URLs are built, and what the redirects are, make a huge impact. You can have the best content in the world, but if search engines can’t access that content it’s never going to be indexed or rank. So, absolutely, all that stuff is vitally important.
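
    (As a brief aside from the interview itself: below is a minimal, hypothetical sketch of the kind of server-side detail Fox is describing. The hostname, paths, and redirect rules are invented for illustration, and this is not anything discussed in the interview; the only point is that a crawler gets one canonical URL per page and honest status codes.)

    ```typescript
    // Minimal sketch: serve canonical URLs and 301-redirect duplicates so a
    // crawler does not index several copies of the same page. Hypothetical example.
    import { createServer, IncomingMessage, ServerResponse } from "http";

    const CANONICAL_HOST = "www.example.com"; // assumption: the preferred hostname

    function handle(req: IncomingMessage, res: ServerResponse): void {
      const host = req.headers.host ?? CANONICAL_HOST;
      const url = new URL(req.url ?? "/", `http://${host}`);

      // Send duplicate hostnames (e.g. the bare domain) to the canonical one.
      if (host !== CANONICAL_HOST) {
        res.writeHead(301, { Location: `http://${CANONICAL_HOST}${url.pathname}` });
        res.end();
        return;
      }

      // Collapse trailing-slash duplicates: /widgets/ -> /widgets
      if (url.pathname.length > 1 && url.pathname.endsWith("/")) {
        res.writeHead(301, { Location: url.pathname.slice(0, -1) });
        res.end();
        return;
      }

      // Unknown pages get a real 404 rather than a "soft 404" served with a 200.
      if (url.pathname !== "/widgets") {
        res.writeHead(404, { "Content-Type": "text/plain" });
        res.end("Not found");
        return;
      }

      res.writeHead(200, { "Content-Type": "text/html" });
      res.end("<html><body><h1>Widgets</h1></body></html>");
    }

    createServer(handle).listen(8080);
    ```

    Consolidating duplicates this way is one concrete piece of the “how the server responds” and “what the redirects are” work she mentions.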

    Eric Enge: The other component is the promotional component, which is to go out and implement programs to make people aware of your site and draw links to it, along with social media campaigns.

    Vanessa Fox: Yes. That’s absolutely the case. I think it goes with the idea you’ve heard from the search engines for a long time which is what would you do if search engines didn’t exist? You need to build your business and part of that is building awareness about your business.

    I think the web makes it easier but you need to raise awareness so people know that it’s there. Whether it is through social media or other types of PR, there are many things you can do. You can’t think of your audience engagement strategy as simply SEO. All these other components help SEO, but there are things you need to do in business even if you weren’t doing it for SEO.

    The Scope of Panda

    Eric Enge: Any last thoughts on Panda?

    Vanessa Fox: I talk to many people who have sites that have been hit and I certainly sympathize with their plight. However, there is no quick fix in these cases.

    I talked to a site owner two weeks ago who said, “maybe if we change our URLs so that they are closer to the root of the site, instead of having folders in them, that will get us back in.” This is the wrong way of looking at it.

    Eric Enge: Yes. That’s a clear “no”. For sites that have been hit by Panda, I don’t think, for the most part, there is a quick fix.

    Most sites will not be lucky enough to have one section of their site that is a total boat anchor that they can just not index and be done with it. Most sites probably have a real process to go through.

    Vanessa Fox: Yes. It’s hard to hear because this is affecting people’s businesses. I think it is going to be a lot of work to figure out who your audience is, what they are looking for, whether you are engaging them well, and whether you are providing value, beyond all the stuff that we talked about. It is a process.

    Eric Enge: Thanks Vanessa!

    Check out Ramblings About SEO for more articles by Eric Enge.

  • Google’s “Different Kind of Search Experience” Officially Launched

    We recently reported on what appeared to be a new kind of search experience from Google, called “What Do You Love?”

    Google has now officially announced the project, with a bit of explanation about the reason it exists. Google’s Andy Berndt writes on the official blog:

    A while back, a few of us wanted to make a little tool that we could use to show just about anybody more of what Google makes. That led to some simple ideas, and then a few more ideas and ultimately, to a challenge: how we could connect people to products they might not know about and may find useful, but make the discovery relevant to them and keep it fun.

    Playing about with that challenge produced a website—What Do You Love?—that we hope meets at least some of the challenge by demonstrating how different Google products can show you different things about any particular search query. Like always, you’re the judge, so give it a go. Type in something that you love—polar bears, space travel, pickup trucks, Lady Gaga, early Foghat—whatever strikes your fancy (for some reason, the results for cheese always crack us up, so try that if you’re momentarily stumped). No matter what it is, we’ll give you back something that will let you get even more into what you love.

    As I demonstrated previously, results pages will show you a graphical interface with boxes for results in Image Search, Maps, Google Alerts, Patent Search, Google Trends, Product Search, Sketchup, YouTube, Books, Google Translate, Blog Search, Picasa, Google News, Google Earth, Google Mobile Search, etc. It also has boxes for various Google tools.

    For example, if I tell Google I love “hamburgers” it will also show a Gmail box telling me to email somebody about hamburgers, a Google Calendar box telling me to plan an event about hamburgers, a Google Voice box telling me to call somebody about hamburgers, a Google Moderator box telling me to organize a debate about hamburgers (a fantastic idea), a Google Groups box telling me to start a hamburgers discussion group (another great idea), and a Chrome box telling me to access hamburgers stuff on the web, faster.

    What do you love?

    As far as sharing goes, you have the option to share results pages via Gmail, Google Buzz, or the +1 button. I’m still wondering when we’re going to see a straight Google+ share button. I know everybody’s talking about Google Buzz, but I hear that Google+ isn’t too shabby.

  • Will Google+ Maim Twitter?

    There’s been a great deal of talk about Google+ as a Facebook competitor, but how does Google+ stack up as a Twitter competitor? Some think Twitter should be more concerned than Facebook.

    I don’t want to say Google+ will “kill” Twitter. Not a big fan of those “this thing will kill this thing” proclamations. Even MySpace still has millions of users. I think “maim” is a more appropriate word though. Google+ could very well damage Twitter’s momentum.

    Do you think Google+ is a threat to Twitter? Comment here.

    A lot of the Google+ vs. Twitter talk is happening on Google+ itself. These are people who are actually using the service, and actively at that. It’s worth noting that many of these early adopters were also early adopters of Twitter.

    iEntry CEO and WebProNews publisher Rich Ord recently made some comparisons, saying, “To me it’s like Twitter in Facebook form on steroids!” He also said, “Google+ looks a lot like Facebook but actually functions more like Twitter. The key is that on Google+ you follow and get followed without reciprocation, just like Twitter. IMO Google+ is competing more for my Twitter time, rather than my Facebook time.”

    He also made a good point about the Circles feature, which could be thought of in comparison to Twitter lists... if only they were shareable. “Warming up to Google+ as a business tool. Viewing the company people work for on mouseovers is immensely helpful in finding people for circles.”

    Influential social media guy Chris Brogan responded, “Agreed. I’m doing what I can to find people for circles via surfing the interesting folks.”

    Other influential social media guy Jason Falls wrote, “I’m wondering if content sharing on Google+ will be more efficiently used than on Twitter. A) I’d be sharing with people who choose to follow me (similar to Twitter), but the action=content prioritization of G+ would put that content in front of more people?”

    On our Facebook Page, we asked, “What do you like better: Twitter or Google+?”

    We got some interesting responses. One was, “What do you like better: cheese or girls?” I find a few more similarities between Twitter and Google+ than between cheese and girls, but interesting comment nonetheless.

    One said, “Too early to tell.” Perhaps.

    Another said, “Both.”

    Some said Google+. Some said neither. Nobody said Twitter.

    Twitter or Google+?

    Of course there are plenty of takes being expressed on Twitter as well:

    Facebook = Star Wars, Twitter = Empire Strikes Back, Google+ = Return of the Jedi. MySpace = Stupid prequels. 2 days ago via web · powered by @socialditto

    Am I the only person who doesn’t care about Google+? Call me when all my friends have gone there and left Facebook and Twitter. 1 hour ago via web · powered by @socialditto

    So I’m on MySpace, Facebook, Twitter, Gmail & now Google + this shit is getting out of hand if you need me just contact my cell 1 hour ago via web · powered by @socialditto

    Over the past few years, Twitter’s 140-character limit saved me from destroying my career. Google+‘s unlimited will reverse that in a week. 11 hours ago via web · powered by @socialditto

    Why is everyone hating on #Google+ today? My Twitter feed is filled with negativity, lighten up everyone!! 1 hour ago via Twitter for Mac · powered by @socialditto

    Matthew Ingram at GigaOm wrote an article asking, “Is G+ more of a threat to Twitter than Facebook?”

    Steve Rubel of Edelman responded (via Google+): “I really think it is, sorry Twitter.”

    In the article, Ingram notes that Rubel is de-emphasizing Twitter focus in favor of Google+, while Digg founder Kevin Rose has redirected his blog to his Google+ account “because there is better conversation there.”

    Vator News reported on a BrightEdge study finding that Google’s +1 buttons are already being adopted more than Twitter’s share and Instant Follow buttons. Given that they’ve only been around since early June, that’s pretty impressive.

    BrightEdge Research - Social Button Adoption

    There are some things to take into consideration here. +1 buttons, while present throughout the Google+ experience, are not exactly the equivalent of a “share to Google+” button. The main draw for publishers to implement the buttons is the potential of influencing search rankings. Twitter sharing does play into that in a more indirect way, but Google has said flat out that +1s influence rankings. Meanwhile, Google and Twitter have been unable to come to an agreement thus far regarding the use of Twitter’s firehose. It will be very interesting to see how Google+ impacts search over time – particularly real-time search, which for some sites can have a pretty significant impact on pageviews.

    “Twitter has been available to the public for over 5 years. It had a much slower start than Google+, but that’s because of the size of Twitter at the time it launched compared to Google now. Twitter was a small startup with next to no funding; Google is a multi-billion-dollar internet behemoth,” says Lauren Dugan on AllTwitter, an unofficial Twitter resource blog. “However, over the last 5 years, Twitter has gained much media attention and has attracted notable users from the ranks of the celebrities, sports stars, politicians and other icons throughout the world.”

    “But despite this growth in awareness, Twitter still hasn’t broken through into the mainstream in terms of actual use,” she adds.

    She’s writing in response to a Google+ post from Paul Allen of Ancestry.com, who through some interesting analysis has Google+ pegged at around 10 million users, and on pace to hit 20 million by the weekend. Grain of salt encouraged, but you can read more about this here.

    “Now, Twitter touts the fact that it has well over 200 million (and probably more than 300 million) registered users, but in terms of active users, like most of the new signups for Google+ undoubtedly are, the number is likely much less,” writes Dugan. “Think about 10x less… meaning Google+ may have amassed about half of Twitter’s user base in only two weeks.”

    Google+ has come about at an interesting time in Twitter’s lifecycle. 2011 has been a year of big changes at Twitter, with the stepping away of co-founders Evan Williams and Biz Stone and the return of co-founder Jack Dorsey, who also runs the hot mobile payments startup Square.

    Since Dorsey’s return, Twitter has been much more aggressive in its conquest for increased user interest, engagement, and retention. This has manifested itself in subtle ways, like design tweaks and ways of suggesting users, and bigger ways, like the acquisition of TweetDeck and the announcement of native photo sharing.

    Twitter has faced a fair amount of backlash from developers over its apparent strategy of weeding out some of the most popular ones by entering the territory their respective businesses are built upon (photos, mobile apps, etc.). Twitter has even reportedly caught the eye of the FTC over its competitive practices in this regard, though in my opinion, all of these developers have been playing in Twitter’s territory to begin with. When you rely on a third party like this for your entire business, you have to be prepared for this kind of thing.

    Facebook app developers know what I’m talking about. The recent “ban bot” fiasco saw some legitimate Facebook apps get shut down without warning because an algorithm deemed them too spammy. It’s a different scenario, but it drives home the same point: building businesses that rely on bigger businesses to not only thrive, but to operate entirely, is a risky game.

    Of course, Google is no stranger to government scrutiny of competitive practices. The FTC recently launched a broad investigation into the company’s business practices, and that started before the company even announced Google+, which should be seen as a direct competitor to Facebook, Twitter, LinkedIn, StumbleUpon (if this last one seems a little questionable to you, I’m basing this on the Sparks feature – a content discovery tool based on the topics you define as your interests – this is essentially what StumbleUpon is, despite its differences in execution), etc.

    So yes, there is a lot of competition here. It’s not likely that Google+ will hurt the company in the eyes of the government on the competitive level. It’s got some dragons to slay before it gets to that point – obviously the 700 million-user dragon controlled by Mark Zuckerberg (who has a Google+ account).

    That’s not to say Google will never get to that point. It’s way too early to tell just how big this is going to get, but early buzz has been incredibly strong, especially considering the skepticism that usually accompanies Google’s social efforts. Bill Gross thinks it might be the fastest-growing social network ever, and there are still plenty of people just waiting for their invites.

    While there is still a whole lot of room for improvement, Google has done some very smart things in the way it has integrated Google+ with its other properties. The navigation bar across Google properties with Google+ notifications in a bright red box with a number in it will keep people coming back (Lee Odden of TopRank Online Marketing said, “Sheesh, that little red number is like G+ crack!”). As will email notifications and the mobile notifications. The mobile app is also a powerful tool, and one that I believe will be very crucial in driving the continued success and customer retention of Google+. Instant upload (for photos) is very convenient and powerful. Right now, Android users get to use it. Soon, iPhone users will too, and then a whole other large segment of people will be able to see these benefits and conveniences first-hand.

    And that brings us back to Twitter. iOS 5, coming this fall, will have deep Twitter integration, which will be huge for Twitter. Many people will adopt Twitter or simply use it more frequently as their ID. It will be what they’re using throughout many of their iPhone (and iPad) apps.

    However, many people are already using their Google IDs for a lot of things (especially on Android), and the beauty of Google+ is that it’s not a new ID. It’s essentially just new features and integrations built around the ID many people have been using for years, whether from Gmail, Google Docs, Picasa Web Albums, YouTube, Google Reader, Google Calendar, etc.

    While I do believe that Google+ and Facebook are very much in competition with one another (these companies are in more than one area, as we’ve discussed numerous times), Twitter is very much a part of this conversation.

    Can Twitter survive Google+? Tell us what you think.

    Follow me on Google+ here. If you need an invite, send us an email at [email protected], and we’ll see what we can do.

  • Bing Rolls Out New Maps Interface

    After months of testing, Bing has now made available its new interface for Bing Maps. Changes include adjustments to the task and navigation controls. Bing says it’s now easier to find the most common actions to complete your tasks.

    “We’ve consolidated actions that were previously scattered throughout the page, and concentrated them along the top, where you expect to find them,” explains Senior Program Manager Dan Polivy. “We’ve included text labels for most of the buttons. And, most importantly, we’ve focused on making the controls accessible while still allowing the map to be the focus of the page.”

    Here’s a look at the before and after (respectively):

    Bing Maps UI - Before

    Bing Maps UI - After

    “These improvements are being rolled out to all of our international sites with appropriate market-specific functionality,” notes Polivy. “For example, Bing Maps users in the UK will still have access to the London Street Map and Ordnance Survey styles, along with our standard Road map, via the vector style drop-down. The public transport overlay, showing tube, DLR, and tram networks, is also readily available from the navigation bar when the map is centered over the greater London area.”

    Bing has also made its Bird’s eye 45-degree and high-resolution Aerial imagery views accessible from the top of the navigation bar. With that, Bing has made it easier to switch between views in general with a single click.

    Bing Maps interface updates and geolocation support http://binged.it/pJZmFC 16 hours ago via CoTweet · powered by @socialditto

    Finally, there’s a new “locate me” button, which will center the map around your location, if you’re using a browser that supports the W3C Geolocation API.
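
    For readers curious about the mechanics, the W3C Geolocation API mentioned above is exposed to pages through navigator.geolocation. The sketch below is purely illustrative and assumes nothing about Bing’s actual implementation; centerMap and the “locate-me” element ID are hypothetical placeholders for whatever a page’s own map control provides.

    ```typescript
    // Rough sketch of a "locate me" handler using the W3C Geolocation API.
    // centerMap() is a hypothetical stand-in for the page's real map control.
    function centerMap(latitude: number, longitude: number): void {
      console.log(`Centering map at ${latitude}, ${longitude}`);
    }

    function locateMe(): void {
      if (!("geolocation" in navigator)) {
        console.warn("This browser does not support the Geolocation API.");
        return;
      }

      navigator.geolocation.getCurrentPosition(
        (position) => {
          // The browser asks the user for permission before this callback fires.
          const { latitude, longitude } = position.coords;
          centerMap(latitude, longitude);
        },
        (error) => {
          // Fires on PERMISSION_DENIED, POSITION_UNAVAILABLE, or TIMEOUT.
          console.warn(`Could not get location: ${error.message}`);
        },
        { enableHighAccuracy: false, timeout: 10000 }
      );
    }

    document.getElementById("locate-me")?.addEventListener("click", locateMe);
    ```

    Supporting browsers prompt the user for permission before handing coordinates to the success callback; browsers without the API simply skip the feature, which matches the conditional behavior the post describes.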