WebProNews

Tag: SEO

  • You Better Have More Than A Great Site If You Want To Rank In Google

    In a thread in the Google Webmaster Central forum (hat tip: Barry Schwartz), a user claimed to have lost all of their traffic over the weekend, and to have found thousands of “fake backlinks”.

    The user asked what they could do to let Google know the links have nothing to do with them.

    Well, Google’s Matt Cutts recently indicated that Google may soon launch a tool that will let you tell Google to ignore certain links, but also interesting is what Google Webmaster Trends Analyst John Mueller (pictured) said in response to this user’s post.

    Mueller said:

    From what I can tell, your site is still fairly new – with most of the content just a few months old, is that correct? In cases like that, it can take a bit of time for search engines to catch up with your content, and to learn to treat it appropriately. It’s one thing to have a fantastic website, but search engines generally need a bit more to be able to confirm that, and to rank your site – your content – appropriately.

    That said, if you’re engaging in techniques like comment spam, forum profile link-dropping, dropping links in unrelated articles, or just placing it on random websites, then those would be things I’d strongly recommend stopping and cleaning up if you can.

    Emphasis added.

    Google, especially in the last year or two, has talked up the importance of quality content probably above all else, so it is interesting to see Google so openly talking about how that’s not necessarily enough.

    Consider that when Google launched the Penguin update, Google’s Matt Cutts said in the announcement, “We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites.”

    Of course, this is still important, but if that’s all you have, it sounds like you had better have some patience as well, even if Google is all about freshness.

    Watch this video from Google’s Maile Ohye for some good SEO ideas as far as Google is concerned:

    Image: Mueller’s Google+ Profile

  • Matt Cutts: Google’s Updates Are Car Parts, Data Refreshes Are Gas

    Google frequently updates its algorithm, and sometimes these updates have huge effects on numerous sites. Panda and Penguin are two of the most well-known these days. Google also launches regular data refreshes for these updates.

    While even these data refreshes are enough to keep webmasters on their toes, they are much smaller than the updates themselves.

    Google’s Matt Cutts has talked about the difference between an algorithm update and a data refresh in the past. He put out a blog post all the way back in 2006 on the topic. Given that this was years before Panda and Penguin, it seems worth highlighting now, as businesses continue to struggle with these updates (tip of the hat to Search Engine Journal for linking to this post).

    Here are the straightforward definitions Cutts gave:

    Algorithm update: Typically yields changes in the search results on the larger end of the spectrum. Algorithms can change at any time, but noticeable changes tend to be less frequent.

    Data refresh: When data is refreshed within an existing algorithm. Changes are typically toward the less-impactful end of the spectrum, and are often so small that people don’t even notice.

    In that post, Cutts also pointed to a video of himself talking about the differences:

    Algorithm updates involve tweaking specific signals; for instance, PageRank could come to matter more or less, Cutts explains in the video. With a data refresh, it is the input to that algorithm that changes: the data the algorithm works on is updated.

    He uses a car metaphor, saying that an algorithm update is like changing a part in the car, such as the engine. A data refresh, he says, is more like changing the gas.

    Data refreshes happen all the time, he says. PageRank, for example, gets refreshed constantly.

    In the end, I’m not sure how much any of this matters to the average webmaster. If your site was hit by an update, or by a data refresh, you probably don’t care what the technical name for it is, as long as you can identify the update it’s based on, and make the necessary adjustments to gain back your Google traffic.

  • Duane Forrester: More Details on Bing Phoenix Update

    Duane Forrester continues the SEO media tour supporting Bing’s Phoenix update, this time stopping by Stone Temple Consulting to give a detailed explanation of what the new Bing Webmaster Tools can do for users. Forrester’s doing a great job of not just promoting the update but actually explaining in detail how Bing Webmaster Tools opens up a new world of capability for webmasters. Or, as Stone Temple describes it, the Phoenix update was built by SEOs for SEOs.

    The full interview transcript at Stone Temple is a long read so I’m not going to try to summarize it here; if you’re a student or teacher in the school of SEO, I recommend you visit the site, make some tea, and step up your Bing Webmaster Tools vocab. Forrester discusses the data range tools, the new Link Explorer that lets users dive through the internet to find links associated with any domain, some more explanation of the “Fetch as Bingbot” web crawler simulator, and more.

    One key note worth mentioning is how available Forrester has been making himself to feedback from users. In the Stone Temple discussion, he encourages users of Bing Webmaster Tools to reach out to him directly on Twitter at his personal account, @duaneforrester, to offer up some feedback, make some suggestions, or to ask him some specific questions. Or, I’m sure he also enjoys hearing how awesome you think the Phoenix update is.

    The Bing Team originally promised to go into more detail on their official blog about the Phoenix update, and they have been doing that so far, but Forrester’s been beating them to the punch lately when it comes to covering all the new tools. Forrester made a stop last week at SEOmoz to discuss the new Webmaster Tools and offered up a half-hour tutorial on what you can do with the new features.

    Bing Program Management VP Derrick Connell announced the Phoenix Update at SMX earlier this month. So far, most of the response to the update appears to have been positive.

  • Using Video To Recover From A Google Algorithm Update

    Dr. Melody King, VP of marketing at Treepodia, recently wrote an article called “Pushing Back on Google Penguin: How to Improve SEO with Video Links”. It’s not so much about Penguin as it is about generally improving your ranking using video. Really, this is about doing better in search regardless of whether you’ve been hit by an update, but with so many sites hit by updates like Penguin and Panda, webmasters are looking for ways to recover quickly. Done right, video just might be a great way to do so.

    We reached out to King to discuss this a bit further. She says a site can bounce back quickly after being hit by an algorithm update, by using video.

    “Google starts indexing the video sitemap practically immediately,” King says. “I’ve seen immediately many times, but I hate to state it as an absolute. As soon as the videos are indexed they are eligible to start displaying in the universal search results, and in most cases that means video appears at the top of page.”

    On strategies to get maximum SEO value out of video, King tells us, “Put the results of your SEO research into the sitemap creation in a formula structure. I.E. Meta Title = Keyword + Brand + Category + Product name.”

    “The goal is to target the long tail searches that are popular & appropriate for your items,” she says. “Doing it in a formula format makes this task super quick and easy. With Treepodia, the retailer tells us the desired formula and we take care of the rest. The formula can be unique for different segments of the catalog too – this piece is valuable to retailers depending on their unique product set and spread.”
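    For illustration, here’s a minimal sketch of how a title formula like the one King describes might be applied when generating a video sitemap entry. The product data, URLs, and exact formula ordering below are hypothetical (this is not Treepodia’s implementation); the tags simply follow Google’s published video sitemap schema.

```python
# Minimal sketch (hypothetical product data and URLs): build one video sitemap
# entry whose meta title follows a formula such as Keyword + Brand + Category +
# Product name. Wrap entries like this in a <urlset> that declares the video
# namespace (http://www.google.com/schemas/sitemap-video/1.1) before submitting.
from xml.sax.saxutils import escape

product = {
    "keyword": "waterproof hiking boots",
    "brand": "Acme",
    "category": "Footwear",
    "name": "TrailMaster 2000",
    "page_url": "https://www.example.com/footwear/trailmaster-2000",
    "video_url": "https://www.example.com/videos/trailmaster-2000.mp4",
    "thumb_url": "https://www.example.com/thumbs/trailmaster-2000.jpg",
}

# The formula-driven meta title for this video.
title = "{keyword} | {brand} {category} - {name}".format(**product)

entry = (
    "<url>\n"
    "  <loc>{loc}</loc>\n"
    "  <video:video>\n"
    "    <video:thumbnail_loc>{thumb}</video:thumbnail_loc>\n"
    "    <video:title>{title}</video:title>\n"
    "    <video:description>{desc}</video:description>\n"
    "    <video:content_loc>{content}</video:content_loc>\n"
    "  </video:video>\n"
    "</url>"
).format(
    loc=escape(product["page_url"]),
    thumb=escape(product["thumb_url"]),
    title=escape(title),
    desc=escape("Product video for the " + product["name"]),
    content=escape(product["video_url"]),
)

print(entry)
```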

    “You should host your own videos with an e-commerce video platform, which will allow you to add critical elements such as add-to-cart links, cross & up sell, analytics, etc. This also gives you greater flexibility with the sitemap content (thumbnail image, meta title structure, etc.) and peace of mind that the videos are being indexed to your domain,” says King. “The ability to get videos indexed to your domain that are hosted by YouTube is a recent addition, and its reliability is still under debate. However, it is a business decision to decide to ALSO syndicate your videos to YouTube as well.”

    “I’d also say that social backlinks are the best, and people are WAY MORE LIKELY to Facebook share a video than a static image or textual product description,” adds King. “User generated product review videos would be the ultimate social video for an ecom shop, since it is likely to be cute, funny, entertaining, etc. (all great ingredients for a viral or semi-viral product video). An ecommerce video platform can also help filter, manage, and A/B test user generated videos for this purpose.”

    “YT is the second largest search engine & your site will get PageRank from the website link in the About section – but is it worth it to invest the time to manage the channel? Your call,” she says. “I’d say yes (especially since YouTube is an insanely powerful social avenue – see stats here), but many retailers I’ve spoken to are not drinking the Kool-Aid, yet.”

    When asked about the importance of on-page text for video pages, King says, “Actually, we recommend using video to improve the PageRank of existing category, brand, and product pages – especially product pages because of the sheer volume.”

    Another article on the subject of video SEO published this week recommends 9 YouTube tips for better ranking. While Amanda Dhalla at Video-Commerce.com elaborates on each of them, they come down to: Don’t be lazy, optimize your titles, maximize descriptive text areas, use annotations, create playlists, encourage sharing, customize your channel, use calls to action for conversions, and be unique.

    Chris Atkinson at ReelSEO also posted a good video SEO article this week, discussing video metadata and its search benefits.

  • Google Will Soon Ignore Links You Tell It To

    Google’s Matt Cutts gave a keynote “You and A” presentation at SMX Advanced this week, and mentioned that Google is considering offering a tool that would let webmasters disavow certain links.

    Would you find such a tool useful? Let us know in the comments.

    Matt McGee at SMX sister site Search Engine Land liveblogged the conversation. Here’s his quote of Cutts, which was in response to a question about negative SEO:

    The story of this year has been more transparency, but we’re also trying to be better about enforcing our quality guidelines. People have asked questions about negative SEO for a long time. Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.

    Some have suggested that Google could disavow links. Even though we put in a lot of protection against negative SEO, there’s been so much talk about that that we’re talking about being able to enable that, maybe in a month or two or three.

    We recently wrote about Google’s wording change regarding negative SEO, which seemed to be an admission from the company that this practice is indeed possible. These words from Cutts seem to be further confirmation.

    Rand Fishkin, CEO of SEOmoz, recently issued a challenge to people to show that if you have a strong enough reputation and link profile, you can’t be hurt by negative SEO. That seemed to go pretty well, but not everyone has the reputation of SEOmoz, even if they don’t necessarily have a bad one. Such a tool from Google could go a long way in helping combat negative SEO practices.

    As far as people suggesting that Google could disavow links, Search Engine Land editor Barry Schwartz actually had a pretty good article talking about this last month. “The concept is simple,” he wrote. “You go to your link report in Google Webmaster Tools and have an action button that says ‘don’t trust this link’ or something like it. Google will then take that as a signal to not use that link as part of their link graph and ranking algorithm.”

    “What I can’t understand is why hasn’t Google released it yet,” he wrote. “It is a great way for Google to do mass spam reporting by webmasters and SEOs without calling it spam reporting. You will have all these webmasters rush after a penalty to call out which links they feel are hurting them. Google can take that data to back up their algorithms on links they already know are spam but also find new links that they might not have caught.”

    He went on to make the point that Google would find more spam this way.

    Once Google launches this tool, assuming that it actually does, it will be very interesting to see how the rankings shake out. It should be an indication of just how important links actually are these days.

    As you may know, Google has sent out a ton of Webmaster Tools warnings this year, and such a tool would help users take quick “manual action” on links rather than spend a ton of time sending link removal requests to other sites. It might even prevent some lawsuits (and the death of the web as we know it).

    According to Cutts, however, not many of the warnings were actually about links.

     

    @VegasWill that’s the right range. I may pull the stats just to help clarify.
    6 hours ago via web

    Update: Here’s his clarification:

    Matt Cutts, 15 minutes ago:

    Earlier this year, Google revealed that we sent out over 700,000 messages to site owners in January and February 2012 via our free webmaster console at http://google.com/webmasters. I wanted to clarify a misconception about those messages. A lot of people assumed that most or all of the 700K messages were related to "unnatural link warnings" that some site owners received.

    The reason for sending the 700,000 messages via Webmaster Tools was actually because we started sending out warnings about blackhat techniques. The vast, vast majority of manual actions we take are on pages that are engaging in egregious blackhat SEO techniques, such as automatically created gibberish or cloaking.

    In fact, of the messages that we sent out to site owners, only around 3% were for unnatural or artificial links. So just to be clear, of the 700,000 messages we sent out in January and February, well above 600,000 were for obvious blackhat spam, and under 25,000 of the messages were for unnatural links. #smx   #seo  



    By the way, Google only sends those messages when a penalty is involved, and penalties, as far as Google is concerned, are manual actions.

    It will be interesting to see if the new link tool helps a lot of sites recover from algorithm updates like Penguin, and/or prevents a lot of sites from getting hit. Will we see less complaining about Google’s algorithm changes? Somehow, I doubt that. I have no reason to believe we will see less finger pointing.

    Will you use the new link tool if Google provides it? Let us know in the comments.

  • Google Panda Update: Another Claims Recovery

    Another webmaster claims to have recovered from the Google Panda update. Like the rumors about actual updates (or refreshes) occurring, it’s probably best to take this with a grain of salt, because there are so many factors at play, and it’s hard to tell for sure that it’s really Panda.

    Either way, the story is about a webmaster who claims to have recovered Google traffic, so looking at what was done to achieve that could prove useful.

    Barry Schwartz at Search Engine Roundtable points to a Google forum thread where this person shared his story: “Glad to inform that mine site has recovered from Google panda 3.6 in just 35 days and now the ranking are even much better as compared to the past. I can see a traffic jump of around 150%. Awesome and cheers. Thanks for the suggestions and as now I am a perfect Google panda expert.”

    He doesn’t mention what the site actually is, despite being asked. He does share a LinkedIn profile, which includes a URL for Americasjobexchange.com, though it’s not clear if this is the site in question. If the grammar used on the site was anything like the grammar used in the forum post, it’s not hard to see why Google may not have liked the site.

    The webmaster claims to have “deeply analyzed” his site, discussed the problem in Google forums and with industry experts, written “maximum posts” on specific topics, removed pages with little content (noindex, nofollow), modified the URL structure and placed canonical or 301 redirects to old ones, continued link building “with brand names that looks natural,” solved WMT crawl errors to a greater extent and removed some internal duplicate pages.
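    To make the redirect and canonical portion of that cleanup concrete, here’s a minimal sketch. It assumes a Flask app and hypothetical URL paths; any web server or CMS can accomplish the same thing with its own redirect and template mechanisms.

```python
# Minimal sketch (assumes Flask; URL paths are hypothetical): 301-redirect retired
# URLs to their replacements and declare the preferred URL of the new page with a
# rel="canonical" link element, two of the cleanup steps described above.
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping from old slugs to restructured URLs.
OLD_TO_NEW = {
    "widget-123.html": "/widgets/acme-widget",
}

@app.route("/old-category/<path:slug>")
def legacy(slug):
    """Permanently redirect visitors and crawlers to the new location."""
    return redirect(OLD_TO_NEW.get(slug, "/"), code=301)

@app.route("/widgets/<slug>")
def widget(slug):
    """Serve the new page with its canonical URL declared in the <head>."""
    canonical = "https://www.example.com/widgets/" + slug
    return (
        "<html><head>"
        '<link rel="canonical" href="' + canonical + '">'
        "</head><body>Widget page</body></html>"
    )

if __name__ == "__main__":
    app.run()
```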

    Much of this seems like stuff that could help regardless of Panda, kind of like what we saw with DaniWeb, who was also able to recover (before being hit again, more than once).

    Schwartz points out, “He mentions he recovered on June 6th, which is not exactly when Panda 3.7 was released. It was released a couple days later. Although he said he did see even a greater boost on June 8th. So was this really a Panda recovery?”

    Perhaps rather than focusing on specific algorithm updates, webmasters should just follow these types of best practices and Google’s quality guidelines. Chances are, they’ll help in the long run anyway, regardless of which update is rolling out (and don’t worry, there will be more).

    Image: Awesome fat Panda eating=]] (YouTube)

  • Facebook Shares Better Than Links For Google Ranking?

    Searchmetrics released a new study finding that the volume of Facebook shares a web page receives is closely correlated with how high it ranks in Google searches. “At the same time, too many Google AdSense ads on a page are likely to have a negative effect on search visibility,” a representative for Searchmetrics tells WebProNews.

    “The study analyzed search results from Google for 10,000 popular keywords and 300,000 websites in order to pick out the issues that correlate with a high Google ranking,” the representative explains. “The findings come at a critical time when many websites, as you know, try to recover or make sense of the recent Google updates like Penguin and Panda.”

    According to the study, some of the top factor categories that correlate most highly with a successful Google ranking are:

    1. Facebook Shares
    2. Number of Backlinks
    3. Tweets

    Ranking Factors in the US

    Ranking Factors UK

    The firm highlights the following as key findings:

    1. Social media signals show very high correlation with high rankings

    2. Top brands appear to have a ranking advantage

    3. Too much advertising is a handicap

    4. Quantity of links is still important but quality is vital

    5. Keyword domains still frequently attract top results

    Social Media And Search Rankings

    There’s been a lot of talk about social media’s impact on search this week, with industry conference SMX Advanced having taken place. It’s interesting to see this study cite Facebook shares and tweets as major signals. It’s not incredibly surprising, given that these are two of the Internet’s major social networks, but it is interesting that they top Google’s own Google+.

    At SMX, Google’s Matt Cutts spoke briefly about the +1 button and Google+ as they relate to SEO. SMX’s Danny Sullivan asked him about the topic, and he said (according to a liveblog), “When we look at +1, we’ve found it’s not necessarily the best quality signal right now.”

    Sullivan asked Cutts if you have to be on Google+ to rank well in Google. According to the liveblog, his response was, “No!!!! It’s still early days on how valuable the Google+ data will be.”

    That’s not to say that Google+ isn’t important to search, and if Google has its way, it will likely only grow in importance.

    Here’s an excerpt from the study about Google+:

    Social media signals show extremely high correlation: social signals from Facebook, Twitter and Google+ are frequently associated with good rankings in Google’s index.

    A note on Google+: analyzing Google +1s with a Spearman correlation, we found a significant result of 0.41. From this we can assume that the quantity of +1s has the strongest correlation of any of the metrics analyzed in the study.

    However, we have not included this figure in the overview because we consider it to be too unreliable. This is because Google+ does not currently have enough users and the possibility of a +1 leading directly to changes in SERPs follows accordingly, since pages receive +1s in the order that they would already be placed without them. When Google+ has values that are stronger and more independent from SERPs, these values will also be included in the overview. That Google is trying to make Google+ an important player is indisputable and therefore SEOs should be sure to keep an eye on further developments.
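    For the curious, here’s a minimal sketch of the kind of Spearman rank correlation calculation the study describes, using made-up numbers and SciPy’s spearmanr; it is only meant to illustrate the methodology, not reproduce the study’s data.

```python
# Minimal sketch (made-up numbers): the kind of Spearman rank correlation the study
# reports between a social metric (here, +1 counts) and Google ranking positions.
from scipy.stats import spearmanr

ranking_positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]          # position in the results
plus_one_counts = [250, 300, 120, 80, 95, 40, 10, 25, 5, 2]   # hypothetical +1 counts

# Negate positions so "ranks higher" and "has more +1s" point the same direction;
# a rho near 1 would indicate the strong positive correlation the study describes.
rho, p_value = spearmanr([-p for p in ranking_positions], plus_one_counts)
print("Spearman rho = %.2f (p = %.3f)" % (rho, p_value))
```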

    Then there’s the fact that your Google+ profile is still directly tied to authorship in Google, and that helps your search visibility. It’s also heavily used in Google’s personalized results (Search Plus Your World), which we see all the time.

    Just don’t expect +1s to be as valuable as Facebook likes or tweets unless Google+ growth gets to Facebook or Twitter-like numbers.

    In fact, there was also some talk at SMX about Pinterest being a significant signal for Google. Even though the social network is still in its infancy, it’s gained a lot of popularity very quickly.

    Bing’s Duane Forrester, in discussing Penguin recovery, recently wrote, “It didn’t take long for the Pin It button to start popping up on websites. And it didn’t take a passing grade on the MENSA quiz to see it coming, did it? Rapid growth, huge adoption, media buzz, your friends recommending it, and so it goes. An exercise in obviousness that you’d better pay attention to this little gem.”

    Of course, the same session suggested that Google+ was a huge signal, beating out the others. Here are a couple of tweets straight from the session:

    Social Signals Test: G+ most, Facebook like least, twitter over time, Pinterest strong overall. @yerrbo #smx #11a
    1 day ago via Twitter for iPad

    Testing what social signals help with Google surprise, Google+ was huge, but also Pinterest seems to build good links @yerrbo #smx #11a
    1 day ago via Twitter for Android

    Of course different signals will be stronger for different sites. It’s likely that any of them can play a significant role with enough engagement.

  • Facebook: Here’s An SEO Boost For Your Site

    Facebook announced some new WordPress integration today. This includes a new plugin called Facebook for WordPress. It comes with “social publishing features” and the following widgets (as listed on Facebook’s developer blog) which have been around for some time:

    • Activity Feed: Shows readers their friends’ activity on the site, such as likes and comments.
    • Recommendations: Gives readers personalized suggestions for pages on your site they might like, as well as a Recommendations Bar option to give users the option to add content to their Timeline as they read.
    • Customizable Like, Subscribe and Send buttons
    • Comments Box: Makes it easy for people to comment on your site and post back to Facebook, and includes moderation tools. The plugin also features automatic SEO support for Facebook Comments, so search engines can index them to improve your site’s visibility.

    Emphasis added.

    While the comments plugin has been around a while, I find it noteworthy that Facebook is touting the “SEO support” as a selling point.

    If you go to Facebook’s page for the Comments plugin, the company says:

    How can I get an SEO boost from the comments left on my site?

    The Facebook comments box is rendered in an iframe on your page, and most search engines will not crawl content within an iframe. However, you can access all the comments left on your site via the graph API as described above. Simply grab the comments from the API and render them in the body of your page behind the comments box. We recommend you cache the results, as pulling the comments from the graph API on each page load could slow down the rendering time of the page.
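    For illustration, here’s a minimal sketch of the approach Facebook describes: pull the comments for a URL from the Graph API, cache them briefly, and render them as plain HTML behind the comments box so search engines can crawl them. The endpoint shape and response fields here are assumptions based on the Graph API documentation of the era, so treat this as a starting point rather than verified production code.

```python
# Minimal sketch (the Graph API endpoint and response shape are assumptions based on
# Facebook's documentation of the time): fetch the comments left on a page, cache them
# briefly as Facebook recommends, and emit plain HTML to place behind the comments box.
import html
import time
import requests

_CACHE = {}      # page_url -> (fetched_at, list_of_comments)
CACHE_TTL = 300  # seconds

def fetch_comments(page_url):
    """Return the comments for page_url, using a small in-memory cache."""
    cached = _CACHE.get(page_url)
    if cached and time.time() - cached[0] < CACHE_TTL:
        return cached[1]
    resp = requests.get(
        "https://graph.facebook.com/comments",
        params={"ids": page_url},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json().get(page_url, {})
    comments = data.get("comments", {}).get("data", [])
    _CACHE[page_url] = (time.time(), comments)
    return comments

def render_comments_html(page_url):
    """Build crawlable HTML for the comments, to be rendered in the page body."""
    items = [
        "<li>%s: %s</li>"
        % (
            html.escape(c.get("from", {}).get("name", "Anonymous")),
            html.escape(c.get("message", "")),
        )
        for c in fetch_comments(page_url)
    ]
    return "<ul class=\"fb-comments-fallback\">" + "".join(items) + "</ul>"
```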

    As we’ve seen in the past, Facebook comments on your site can show up in Google search results. That said, the search value of comments, in general, is debatable. A while back, we spoke with Shoemoney’s Jeremy Schoemaker, who spoke with a Google engineer friend about blog comments.

    This was pre-Penguin update, when much of the industry focus was still on the Panda update. It was (may still be) worth considering how comments might impact a page’s quality in terms of how Panda looks at content.

    According to Schoemaker, the Google engineer indicated that if anything, it’s “diluting the quality score of my page” by possibly diluting overall keyword density. Another factor could be that the few clearly spammy comments that do get through send signals that the page is not being well maintained.

    “So he said he did not see a positive to leaving indexable comments on my site,” Schoemaker told us.

    Of course, no Google employee knows everything about Google. That’s not to say this person didn’t know what they were talking about, but one Googler recently indicated that Google didn’t have anything called Penguin. It’s just wise to take these types of things with a grain of salt.

  • Local SEO Factors: Survey Attempts To Rank 90 Of Them

    Portland-based David Mihm Web Design put out the results of a big local search survey, attempting to rank the top 90 local search factors that influence a business’ local Google rankings. It’s an interesting list, but it’s hard to say just how accurate it is, as Google plays its rankings signals cards pretty close to its chest. That said, it has some pretty credible contributors. You can see the whole list (as well as all the results) here.

    There’s also the fact that the survey may lose its relevance sooner rather than later, as David Mihm acknowledges in the survey’s introduction.

    “Of course, all of this preceded the colossal sea change represented by the release of Google +Local on May 30,” he writes. “This release actually came just as the responses for this year’s survey started pouring in. Which means that although this year’s version is more likely to be outdated sooner than previous years, it will represent an incredibly valuable historical data point, and I’m already looking forward to looking at the differences in 2013’s survey.”

    Here’s the big list:

    1. Physical Address in City of Search (PLACE PAGE)
    2. Proper Category Associations (PLACE PAGE)
    3. Proximity of Address to Centroid (PLACE PAGE)
    4. Domain Authority of Website (WEBSITE)
    5. Quantity of Structured Citations (IYPs, Data Aggregators) (OFF-SITE)
    6. City, State in Places Landing Page Title (WEBSITE)
    7. Quantity of Native Google Places Reviews (w/text) (REVIEWS)
    8. Quality/Authority of Structured Citations (OFF-SITE)
    9. Local Area Code on Place Page (PLACE PAGE)
    10. HTML NAP Matching Place Page NAP (WEBSITE)
    11. Consistency of Structured Citations (OFF-SITE)
    12. Individually Owner-verified Place Page (PLACE PAGE)
    13. Quality/Authority of Unstructured Citations (Newspaper Articles, Blog Posts) (OFF-SITE)
    14. Quality/Authority of Inbound Links to Domain (OFF-SITE)
    15. Product / Service Keyword in Business Title (PLACE PAGE)
    16. Quantity of Inbound Links to Domain from Locally-Relevant Domains (OFF-SITE)
    17. Quantity of Unstructured Citations (Newspaper Articles, Blog Posts) (OFF-SITE)
    18. Product/Service Keywords in Reviews (REVIEWS)
    19. Page Authority of Landing Page Specified in Places (WEBSITE)
    20. Quality/Authority of Inbound Links to Places Landing Page URL (OFF-SITE)
    21. Product / Service Keyword in Website URL (WEBSITE)
    22. Location Keyword in Business Title (PLACE PAGE)
    23. Quantity of Inbound Links to Places Landing Page URL from Locally-Relevant Domains (OFF-SITE)
    24. Quantity of Third-Party Traditional Reviews (REVIEWS)
    25. Quantity of Inbound Links to Domain (OFF-SITE)
    26. Location Keywords in Reviews (REVIEWS)
    27. Diversity of Inbound Links to Domain (OFF-SITE)
    28. Geographic Keyword in Website URL (WEBSITE)
    29. NAP in hCard / Schema.org (WEBSITE)
    30. GeoTagged Media Associated with Business (e.g. Panoramio, Flickr, YouTube) (OFF-SITE)
    31. Velocity of Native Google Places Reviews (REVIEWS)
    32. City, State in Most/All Website Title Tags (WEBSITE)
    33. Quantity of Inbound Links to Places Landing Page URL (OFF-SITE)
    34. Quantity of Reviews by Authority Reviewers (e.g.Yelp Elite, Multiple Places Reviewers, etc) (REVIEWS)
    35. Product/Service Keywords in Anchor Text of Inbound Links to Places Landing Page URL (OFF-SITE)
    36. Business Title in Anchor Text of Inbound Links to Domain (OFF-SITE)
    37. Association of Photos with Place Page (PLACE PAGE)
    38. Location Keywords in Anchor Text of Inbound Links to Domain (OFF-SITE)
    39. Location Keywords in Anchor Text of Inbound Links to Places Landing Page URL (OFF-SITE)
    40. City, State in Places Landing Page H1/H2 Tags (WEBSITE)
    41. Product / Service Keyword in Place Page Description (PLACE PAGE)
    42. Location Keyword in Place Page Description (PLACE PAGE)
    43. Age of Place Page (PLACE PAGE)
    44. Business Title in Anchor Text of Inbound Links to Places Landing Page URL (OFF-SITE)
    45. Product/Service Keywords in Anchor Text of Inbound Links to Domain (OFF-SITE)
    46. High Numerical Ratings by Authority Reviewers (e.g.Yelp Elite, Multiple Places Reviewers, etc) (REVIEWS)
    47. City, State in Most/All H1/H2 Tags (WEBSITE)
    48. Diversity of Inbound Links to Places Landing Page URL (OFF-SITE)
    49. Overall Velocity of Reviews (Native + Third-Party) (REVIEWS)
    50. Quantity of Third-Party Unstructured Reviews (REVIEWS)
    51. Product / Service Keywords in Place Page Custom Attributes (PLACE PAGE)
    52. Quantity of Native Google Places Ratings (no text) (REVIEWS)
    53. High Numerical Ratings of Place by Google Users (e.g. 4-5) (REVIEWS)
    54. Number of Actions Taken by Searchers on a Place Page (e.g. Driving Directions, Mobile Phone Calls) (PLACE PAGE)
    55. Numerical Percentage of Place Page Completeness (PLACE PAGE)
    56. Marginal Category Associations (PLACE PAGE)
    57. Number of +1’s on Website (SOCIAL/MOBILE)
    58. Bulk Owner-verified Place Page (PLACE PAGE)
    59. Matching Google Account Domain to Places Landing Page Domain (PLACE PAGE)
    60. Velocity of New Inbound Links to Domain (OFF-SITE)
    61. Number of Adds/Shares on Google+ (SOCIAL/MOBILE)
    62. Velocity of Third-Party Reviews (REVIEWS)
    63. Click-Through Rate from Search Results (SOCIAL/MOBILE)
    64. Authority of +1’s on Website (SOCIAL/MOBILE)
    65. Association of Videos with Place Page (PLACE PAGE)
    66. Velocity of New Inbound Links to Places Landing Page URL (OFF-SITE)
    67. KML File on Domain Name (WEBSITE)
    68. Quantity of MyMaps References to Business (OFF-SITE)
    69. High Numerical Third-Party Ratings (e.g. 4-5) (REVIEWS)
    70. Velocity of Adds/Shares on Google+ (SOCIAL/MOBILE)
    71. Loadtime of Places Landing Page (WEBSITE)
    72. Popularity (# of Views) of MyMaps References to Business (OFF-SITE)
    73. Authority of Adds/Shares on Google+ (SOCIAL/MOBILE)
    74. Positive Sentiment in Reviews (REVIEWS)
    75. Location Keywords in Place Page Custom Attributes (PLACE PAGE)
    76. Matching, Public WHOIS Information (OFF-SITE)
    77. Velocity of +1’s on Website (SOCIAL/MOBILE)
    78. Volume of Check-Ins on Popular Services (e.g. Foursquare, Facebook, Twitter) (SOCIAL/MOBILE)
    79. Number of Shares/Likes on Facebook (SOCIAL/MOBILE)
    80. Number of Followers/Mentions on Twitter (SOCIAL/MOBILE)
    81. Authority of Followers/Mentions on Twitter (SOCIAL/MOBILE)
    82. High Numerical Rating of hReview/Schema Testimonials (WEBSITE)
    83. Volume of Testimonials in hReview / Schema.org (WEBSITE)
    84. Velocity of Check-Ins on Popular Services (e.g. Foursquare, Facebook, Twitter) (SOCIAL/MOBILE)
    85. Volume of HTML Testimonials (WEBSITE)
    86. Velocity of Followers/Mentions on Twitter (SOCIAL/MOBILE)
    87. Velocity of Shares/Likes on Facebook (SOCIAL/MOBILE)
    88. Inclusion of Offer on Place Page (PLACE PAGE)
    89. Authority of Shares/Likes on Facebook (SOCIAL/MOBILE)
    90. Participation in Adwords Express or Google Offers (OFF-SITE)

    One thing that strikes me, looking at this list, is that “Velocity of Adds/Shares on Google+” is so far down on it. Considering that Google just made Google+ the backbone of local search, it will be interesting to see how this factor is viewed the next time the survey is taken. In fact, it’s interesting to see social factors in general appearing so far down the list, given this recent study from Searchmetrics showing such factors’ apparent importance to search in general.

    I suggest taking a look at the whole survey. It goes further than just the 90 things listed above, breaking it down by general signals, Place Page factors, off-site factors, on-site factors, review factors, social/mobile factors, and additional factors suggested, as well as numerous comments from experts.

    Local search, in general, may soon be getting turned on its ear. In addition to Google’s new Google+-based local strategy, Apple is doing some interesting things of its own, while dumping Google from its Maps app, and getting much more integrated with Yelp and making search improvements to Siri.

    Regardless of the ordering of the list above, it does make you stop and think about all the potential factors that could go into your local ranking, and many are certainly worth paying attention to.

    Here’s another recent attempt at listing Google local ranking factors, from a study by Bizible (including former Bing staff).

  • Google Talks Showing Multiple Results From The Same Site

    Google’s head of webspam, Matt Cutts, put out a new Webmaster Help Video, responding to the user submitted question:

    Under which circumstances will Google decide to display multiple results from the same website?

    “The answer has changed over the years,” he says. “But the high level answer is, when we think it’s useful and it doesn’t hurt diversity too much.”

    Cutts talks about a strategy Google used for years, called host crowding, where Google would group results from the same site together, but says people would get around this, and game the system by using different subdomains. He also talks about some other limitations of host crowding.

    Discussing how things are these days, Cutts says, “You want to show as many results as you think is useful, and that’s the tricky bit. What the user is looking for can vary depending on what they’re searching for. For example, if they type in something like HP or IBM, probably a lot of pages or a lot of results from HP is a good answer. So several people have noted that it’s possible to get more than two, more than four, lots of results from Hewlett Packard if you search for HP. But that’s OK. The user has indicated that’s their interest by doing that query.”

    He continues, “But in general, what we try to balance is this trade-off between a good diversity of results, because you don’t know exactly what the user was looking for, so you want to give them a little bit of a sampling to say, ‘OK, here’s a bunch of different possible interpretations. Here’s what you might be looking for.’ And then we also want to absolutely give the results that we think match the query well, and sometimes that can be from multiple pages within the same site.”

    “So there’s always a tension,” says Cutts. “There’s always a trade-off in trying to figure out what is the best set of search results to return. There’s no objectively true or perfect way to do it. We’ve varied our scoring. We’ve varied our user interfaces. And if there’s one thing you can count on, it will be that Google will continue to test out ideas. Google will continue to evolve how often we think it’s appropriate to show how many results from how many sites in the search results.”

    Google, as you may know, makes changes to its algorithm every day. Each month, Google puts out a big list of recent changes. Here are the changes Google made in May. Those are just the actual changes. Google also runs 20,000 search experiments a year.

  • Google Panda Update: Google Rolls Out Data Refresh

    Google just announced via Twitter that it started rolling out a Panda refresh on Friday. According to the company, less than 1% of queries are noticeably affected in the U.S. Worldwide, 1% are apparently affected.

    Google told us earlier this month that there had not been another Panda update, after some webmasters suspected one, but that has obviously changed now.

    If you’ve been hit by the Panda update, remember, you can recover. Last year, Google put out this list of 23 questions to ask yourself about the quality of your content:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    Google didn’t say exactly that these are official guidelines, though many of them did reappear in Google’s recently launched Webmaster Academy.

    Were you affected by this update? Let us know in the comments.

    More Panda coverage here.

  • Google Removes Parts Of Penalties If You Make Changes

    Just as you can recover from a Google algorithm update like Penguin, you can bounce back from a penalty as well. In fact, you can even partially bounce back, even if you’re unable to bounce all the way back at once.

    Link buyers, pay attention.

    There’s a thread in Google’s Webmaster Central forum discussing Google partially removing penalties, complete with word from a Google representative (hat tip: Barry Schwartz).

    Member T-Harris says his site was hit with a penalty due to inorganic links, that he “removed a great deal of these links, amended anchor text when we had been participating in guest blog posts,” and received a letter from Google’s search quality team saying that after re-evaluating the site’s backlinks, they were able to revoke a manual action.

    Google only considers manual actions to be actual penalties, so Penguin victims, don’t get your hopes up, though you can still recover.

    “There are still inorganic links pointing to your site that we have taken action on,” the message said, according to T-Harris. “Once you’ve been able to make further progress in getting these links removed, please reply to this email with the details of your clean-up effort.”

    Google Webmaster Trends analyst, John Mueller (pictured), jumped into the discussion to say:

    That usually means that the team has been able to remove a part of the manual actions being taken due to the changes that you’ve made. It sounds like there still are some issues that you might want to review & resolve though. Generally speaking, it can take a bit of time for these kinds of changes to bubble up, and to be visible in search results, it would be rare to see a jump right afterwards. My recommendation (not knowing the specific case/site) would be to follow the advice of the search quality team and to continue working on removing any unnatural links that your site may have collected over time.

    On that note, Google may soon let webmasters tell it specific links to ignore. Last week, Google said such a tool may become available in the next few months.

    Image: John Mueller’s Google Profile pic

  • Google Algorithm Changes For May: Big List Released

    We’ve all been waiting for it, and now it’s here: Google’s monthly list of algorithm changes for May. This time, it’s 39 changes (fewer than last month).

    Of particular note, Google says it made a couple of adjustments to Penguin:

    Improvements to Penguin. [launch codename “twref2”, project codename “Page Quality”] This month we rolled out a couple minor tweaks to improve signals and refresh the data used by the penguin algorithm.

    Also noteworthy:

    Better application of inorganic backlinks signals. [launch codename “improv-fix”, project codename “Page Quality”] We have algorithms in place designed to detect a variety of link schemes, a common spam technique. This change ensures we’re using those signals appropriately in the rest of our ranking. 

    Of course, Google also made more adjustments to freshness.

    We’ll be digging into these much more, but for now, here’s the list in its entirety:

    • Deeper detection of hacked pages. [launch codename “GPGB”, project codename “Page Quality”] For some time now Google has been detecting defaced content on hacked pages and presenting a notice on search results reading, “This site may be compromised.” In the past, this algorithm has focused exclusively on homepages, but now we’ve noticed hacking incidents are growing more common on deeper pages on particular sites, so we’re expanding to these deeper pages.
    • Autocomplete predictions used as refinements. [launch codename “Alaska”, project codename “Refinements”] When a user types a search she’ll see a number of predictions beneath the search box. After she hits “Enter”, the results page may also include related searches or “refinements”. With this change, we’re beginning to include some especially useful predictions as “Related searches” on the results page.
    • More predictions for Japanese users. [project codename “Autocomplete”] Our usability testing suggests that Japanese users prefer more autocomplete predictions than users in other locales. Because of this, we’ve expanded the number of predictions shown in Japan to as many as eight (when Instant is on).
    • Improvements to autocomplete on Mobile. [launch codename “Lookahead”, project codename “Mobile”] We made an improvement to make predictions work faster on mobile networks through more aggressive caching.
    • Fewer arbitrary predictions. [launch codename “Axis5”, project codename “Autocomplete”] This launch makes it less likely you’ll see low-quality predictions in autocomplete.
    • Improved IME in autocomplete. [launch codename “ime9”, project codename “Translation and Internationalization”] This change improves handling of input method editors (IMEs) in autocomplete, including support for caps lock and better handling of inputs based on user language.
    • New segmenters for Asian languages. [launch codename “BeautifulMind”] Speech segmentation is about finding the boundaries between words or parts of words. We updated the segmenters for three asian languages: Chinese, Japanese, and Korean, to better understand the meaning of text in these languages. We’ll continue to update and improve our algorithm for segmentation.
    • Scoring and infrastructure improvements for Google Books pages in Universal Search. [launch codename “Utgo”, project codename “Indexing”] This launch transitions the billions of pages of scanned books to a unified serving and scoring infrastructure with web search. This is an efficiency, comprehensiveness and quality change that provides significant savings in CPU usage while improving the quality of search results.
    • Unified Soccer feature. [project codename “Answers”] This change unifies the soccer search feature experience across leagues in Spain, England, Germany and Italy, providing scores and scheduling information right on the search result page.
    • Improvements to NBA search feature. [project codename “Answers”] This launch makes it so we’ll more often return relevant NBA scores and information right at the top of your search results. Try searching for [nba playoffs] or [heat games].
    • New Golf search feature. [project codename “Answers”] This change introduces a new search feature for the Professional Golf Association (PGA) and PGA Tour, including information about tour matches and golfers. Try searching for [tiger woods] or [2012 pga schedule].
    • Improvements to ranking for news results. [project codename “News”] This change improves signals we use to rank news content in our main search results. In particular, this change helps you discover news content more quickly than before.
    • Better application of inorganic backlinks signals. [launch codename “improv-fix”, project codename “Page Quality”] We have algorithms in place designed to detect a variety of link schemes, a common spam technique. This change ensures we’re using those signals appropriately in the rest of our ranking.
    • Improvements to Penguin. [launch codename “twref2”, project codename “Page Quality”] This month we rolled out a couple minor tweaks to improve signals and refresh the data used by the penguin algorithm.
    • Trigger alt title when HTML title is truncated. [launch codename “tomwaits”, project codename “Snippets”] We have algorithms designed to present the best possible result titles. This change will show a more succinct title for results where the current title is so long that it gets truncated. We’ll only do this when the new, shorter title is just as accurate as the old one.
    • Efficiency improvements in alternative title generation. [launch codename “TopOfTheRock”, project codename “Snippets”] With this change we’ve improved the efficiency of title generation systems, leading to significant savings in cpu usage and a more focused set of titles actually shown in search results.
    • Better demotion of boilerplate anchors in alternate title generation. [launch codename “otisredding”, project codename “Snippets”] When presenting titles in search results, we want to avoid boilerplate copy that doesn’t describe the page accurately, such as “Go Back.” This change helps improve titles by avoiding these less useful bits of text.
    • Internationalizing music rich snippets. [launch codename “the kids are disco dancing”, project codename “Snippets”] Music rich snippets enable webmasters to mark up their pages so users can more easily discover pages in the search results where you can listen to or preview songs. The feature launched originally on google.com, but this month we enabled music rich snippets for the rest of the world.
    • Music rich snippets on mobile. [project codename “Snippets”] With this change we’ve turned on music rich snippets for mobile devices, making it easier for users to find songs and albums when they’re on the go.
    • Improvement to SafeSearch goes international. [launch codename “GentleWorld”, project codename “SafeSearch”] This change internationalizes an algorithm designed to handle results on the borderline between adult and general content.
    • Simplification of term-scoring algorithms. [launch codename “ROLL”, project codename “Query Understanding”] This change simplifies some of our code at a minimal cost in quality. This is part of a larger effort to improve code readability.
    • Fading results to white for Google Instant. [project codename “Google Instant”] We made a minor user experience improvement to Google Instant. With this change, we introduced a subtle fade animation when going from a page with results to a page without.
    • Better detection of major new events. [project codename “Freshness”] This change helps ensure that Google can return fresh web results in realtime seconds after a major event occurs.
    • Smoother ranking functions for freshness. [launch codename “flsp”, project codename “Freshness”] This change replaces a number of thresholds used for identifying fresh documents with more continuous functions.
    • Better detection of searches looking for fresh content. [launch codename “Pineapples”, project codename “Freshness”] This change introduces a brand new classifier to help detect searches that are likely looking for fresh content.
    • Freshness algorithm simplifications. [launch codename “febofu”, project codename “Freshness”] This month we rolled out a simplification to our freshness algorithms, which will make it easier to understand bugs and tune signals.
    • Updates to +Pages in right-hand panel. [project codename “Social Search”] We improved our signals for identifying relevant +Pages to show in the right-hand panel.
    • Performance optimizations in our ranking algorithm. [launch codename “DropSmallCFeature”] This launch significantly improves the efficiency of our scoring infrastructure with minimal impact on the quality of our results.
    • Simpler logic for serving results from diverse domains. [launch codename “hc1”, project codename “Other Ranking Components”] We have algorithms to help return a diverse set of domains when relevant to the user query. This change simplifies the logic behind those algorithms.
    • Precise location option on tablet. [project codename “Mobile”] For a while you’ve had the option to choose to get personalized search results relevant to your more precise location on mobile. This month we expanded that choice to tablet. You’ll see the link at the bottom of the homepage and a button above local search results.
    • Improvements to local search on tablet. [project codename “Mobile”] Similar to the changes we released on mobile this month, we also improved local search on tablet as well. Now you can more easily expand a local result to see more details about the place. After tapping the reviews link in local results, you’ll find details such as a map, reviews, menu links, reservation links, open hours and more.
    • Internationalization of “recent” search feature on mobile. [project codename “Mobile”] This month we expanded the “recent” search feature on mobile to new languages and regions.

  • Google Calls Upon Tom Waits And Otis Redding To Help With Your Site’s Titles

    I’ve been digging through Google’s new list of algorithm changes it made during the month of May, and I couldn’t help but notice that Google launched a change with the codename “tomwaits”. It’s always interesting to see how Google names its updates. Some of the names go on to become legends (Panda and Penguin, for example). Others you just never hear about.

    It just seems worth pointing out that someone at Google cares enough about musician Tom Waits to name an algorithm change after him (at least internally). That’s assuming it isn’t named after some Google engineer who also happens to be named Tom Waits.

    So what is the “tomwaits” update? Here’s the listing:

    Trigger alt title when HTML title is truncated. [launch codename “tomwaits”, project codename “Snippets”] We have algorithms designed to present the best possible result titles. This change will show a more succinct title for results where the current title is so long that it gets truncated. We’ll only do this when the new, shorter title is just as accurate as the old one.

    Have you seen this update in action with your own site? How good is Google at determining the accuracy?

    Tom Waits isn’t the only musician to inspire such codenames. There’s another one on the list under the codename: otisredding. This one, interestingly enough, also has to do with alt titles:

    Better demotion of boilerplate anchors in alternate title generation. [launch codename “otisredding”, project codename “Snippets”] When presenting titles in search results, we want to avoid boilerplate copy that doesn’t describe the page accurately, such as “Go Back.” This change helps improve titles by avoiding these less useful bits of text.

    Some other interesting codenames Google has for its changes in May:

    • Lookahead
    • BeautifulMind
    • TopOfTheRock
    • the kids are disco dancing
    • GentleWorld

    They almost sound like race horses, don’t they?

    Image: TomWaits.com

  • Google Algorithm Gets More Freshness Tweaks

    Google finally released its big list of algorithm changes for the month of May. As it has in other recent months, Google made more adjustments to freshness signals.

    Here are the relevant changes from the list:

    • Better detection of major new events. [project codename “Freshness”] This change helps ensure that Google can return fresh web results in realtime seconds after a major event occurs.
    • Smoother ranking functions for freshness. [launch codename “flsp”, project codename “Freshness”] This change replaces a number of thresholds used for identifying fresh documents with more continuous functions.
    • Better detection of searches looking for fresh content. [launch codename “Pineapples”, project codename “Freshness”] This change introduces a brand new classifier to help detect searches that are likely looking for fresh content.
    • Freshness algorithm simplifications. [launch codename “febofu”, project codename “Freshness”] This month we rolled out a simplification to our freshness algorithms, which will make it easier to understand bugs and tune signals.

    Google also announced a change for May listed as “improvements to ranking for news results”. That’s somewhat related to freshness. “This change improves signals we use to rank news content in our main search results,” says Google. “In particular, this change helps you discover news content more quickly than before.”

    For comparison, here are the freshness tweaks they announced a month ago.

    Do you think Google’s results are getting better in the freshness department? In the past, we’ve talked about how freshness has sometimes hurt relevancy.

  • Is It Google’s Fault You Got Hit By Penguin?

    Since Google unleashed the original Penguin update, there has been a lot of finger pointing at the search giant. That continues to this day, and will likely continue for the foreseeable future, not unlike we’ve seen with the Panda update. There have been plenty of stories about both algorithm updates leading to job cuts.

    Google has said on more than one occasion that it considers the Penguin update a success. While the company has also acknowledged that no algorithm is perfect, they seem pretty satisfied with the results. Of course, they’ll continue to push data refreshes, but it seems that Penguin is doing what Google wants it to do.

    Many webmasters are now scrambling to recover, and we’ve seen proof that it is possible, but still the finger pointing continues. If your site was hit by the Penguin update, was it your own fault or was it Google’s?

    Here’s a conversation Google’s Matt Cutts had on Twitter yesterday:

    Amazing. I’m currently interviewing Filipino workers. So many are now unemployed thanks to Penguin. “Do No Evil?” @mattcutts #seo

    @marshmallocreme we have 2 make changes that we think will improve our search results. Users that are confronted with spam will leave Google

    @mattcutts still… it’s tremendously sad. i wonder how many families lost their primary source of income overnight.

    @marshmallocreme Penguin was just the implementation of things we’ve said clearly many times. If people were ignoring that clear guidance, …

    Danny Sullivan, who spoke with Cutts in a keynote discussion at SMX Advanced, writes in a blog post, “If you were hit by Penguin, don’t want to be hit by it in the future or are serious about winning with Google in the long-term, it’s crucial to understand that easy links will always be vulnerable. It doesn’t matter if easy links worked in the past. It doesn’t matter if easy links still seem to be working now. It doesn’t matter if you think easy links are now some type of potential negative SEO issue that Google isn’t policing well. None of that, valid or not, is going to help you with the winning game of earning the hard links, the links that will matter.”

    “I can’t stress this enough,” he adds. “I’ve read too many comments where people want to blame Google for the fact that the easy links they got before no longer work as well.”

    Penguin was designed to enforce Google’s quality guidelines algorithmically, and Google believes it has done its job. The guidelines have been around for much longer than the update, and Google has always said to follow them. They’ve penalized sites (manually) for the same things for much longer than Penguin has been around. Penguin just makes Google better at doing what it always tried to do.

    Todd Bailey at Search Engine Journal writes, “I’ve talked with many webmasters who have been affected and seen some difficult situations. All of which fall within the community’s assessment of Penguin. The reduction or outright disappearance of spam in search results may be advantageous to practitioners of white hat SEO, but the perilous Penguin has even struck unsuspecting webmasters. In fact, many of the 700,000+ recipients of GWT messages from Google were not previously aware of the spam content and low-quality links associated with their brand online. Since the update first swept the SERPs, some cases have shown recovery at refresh 1.1. However, many businesses and Internet marketing firms are left wondering what must be done to rebuild their rankings.”

    I’d suggest starting with trying to abide by Google’s quality guidelines. You may soon have a tool that helps you tell Google what links you want it to ignore as well. That will make a lot of webmasters happy.

    A new report also suggests that Facebook shares are even more significant than links. You know how to get both? Create good content that people want to share. Then, keep doing that.

  • Bing Unleashes a Ton of New Webmaster Tools for SEO, Links Stats

    As Vice President of Bing Program Management Derrick Connell mentioned at the Search Engine Land roundtable talk earlier today, Bing has launched a bevy of new webmaster tools to give you a better understanding of your site’s data and statistics. The addition of several tools and features, which the Bing Team is calling their Phoenix update, offers up everything from SEO analysis tools to tools for link analysis.

    The new arsenal of analytical tools comes on the heels of Friday’s announcement that the new Bing design has become the standard Bing for users in the United States. Some of these tools are brand new while others are merely getting an update or moving out of beta.

    The Bing Team says you should consider this announcement of the tools a webmaster aperitif, because they’ll be providing a more detailed explanation of each tool over the coming weeks. Since Bing has promised to elaborate on each of these tools in the near future, I’m not going to try to out-Bing them; I’ll just include a brief description of each of the updates below. We’ll bring you further information about each new feature or tool as Bing makes it available.

    Probably the most immediate change you’ll see is that Bing has redesigned the Webmaster Tools dashboard. As seen below in the example taken from the Bing blog post, the new look complements the cleaner look of Bing’s search results page.

    Bing Webmaster Tools Phoenix Update

    And now, on to the catwalk.

    New Tools:

  • Link Explorer (beta) – Go spelunking through the internet to discover links associated with any domain.
  • SEO Reports (beta) – Generate SEO analysis reports directly from Bing. The report uses roughly 15 SEO best practices to generate the analysis and runs once every two weeks for all of the domains you have verified with your Webmaster Tools account.
  • SEO Analyzer (beta) – Similar to the SEO Reports tool, Analyzer will use the same best practices criteria to scan a URL in order to tell you whether or not you’re in compliance with each best practice.
  • Fetch as Bingbot (beta) – Ever curious how Bing’s web crawler, Bingbot, sees your site? Now you can find out with this new tool, which will allow a webmaster to send Bingbot crawling across a specific page and display it as the bot sees it.
  • Canonical Alerts – A new tool to help keep webmasters from erroneously using rel=canonical tags so an entire website doesn’t get mistaken for a single page (see the sketch after this list for a minimal illustration of that mistake).
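
    To make that misconfiguration concrete, here is a minimal, hypothetical Python sketch (standard library only, and emphatically not Bing’s tool): it fetches a few pages you specify and warns if they all declare the same rel=canonical URL, which is the classic mistake that can make a whole site look like one page. The example.com URLs are placeholders, and the regex is a deliberate simplification.

    import re
    import urllib.request
    from collections import Counter

    # Hypothetical sketch, not Bing's tool: warn if every page checked declares
    # the same rel=canonical URL. Swap the placeholder URLs for pages on a site
    # you control. The regex assumes rel appears before href, a simplification.

    CANONICAL_RE = re.compile(
        r'<link[^>]+rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']',
        re.IGNORECASE,
    )

    def canonical_of(url):
        """Return the canonical URL a page declares, or None if it has none."""
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        match = CANONICAL_RE.search(html)
        return match.group(1) if match else None

    def check_pages(urls):
        counts = Counter(canonical_of(u) for u in urls)
        target, count = counts.most_common(1)[0]
        if target is not None and count == len(urls) and len(urls) > 1:
            print("Warning: every page canonicalizes to", target,
                  "- the whole site may be treated as a single page.")
        else:
            print("Canonical targets vary across pages, as expected:", dict(counts))

    check_pages([
        "https://example.com/",
        "https://example.com/about",
        "https://example.com/products",
    ])
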
    The following tools have existed for a bit but received updates with Phoenix:

  • URL Removal Tool – Simple enough: a tool to allow webmasters to easily block a page from appearing in Bing’s search results.
  • Keyword Research Tool (beta) – Previously, users were only able to enter one single keyword or phrase per keyword request, but now webmasters will be able to add multiple entries within the same request.
  • URL Normalization – Updated the interface to clarify how it works.
    Whew. That should be enough to keep Bing Webmaster Tools users busy for a while, at least until Bing begins to share more information about the hows and whys of each of these tools. To start playing with the tools, you’ll need a Bing Webmaster Tools account, so happy webmastering and enjoy.

  • Google: It’s Only A Penalty If It’s Manual Action

    Historically, the word “penalty” has been thrown around pretty loosely for sites that suffer in Google rankings. However, just because you got hit by an algorithm update, it doesn’t mean you’ve been penalized, as far as Google is concerned.

    Google defines penalty as “A punishment imposed for breaking a law, rule or contract.”

    Google Penalty

    So, technically speaking, if you got hit by Penguin (legitimately), you’re being penalized for breaking the rules (Google’s quality guidelines). That’s what Penguin is designed to do. However, Google views penalties as manual action, as opposed to algorithmic action.

    Google’s Matt Cutts spoke with Danny Sullivan in a keynote discussion at SMX last night. Danny asked if Penguin is a penalty, to which Cutts responded (according to Search Engine Land’s liveblog), “We look at it [as] something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam and Penguin was designed to tackle that. It’s an algorithmic change, but when we use a word like ‘penalty,’ we’re talking about a manual action taken by the web spam team — it wasn’t that. We don’t think of it as a penalty. We think of it as, ‘We have over 200 signals, and this is one of the signals.’”

    According to the live blog, Danny asked, “So from now, does ‘penalty’ mean it’s a human thing?”

    To which Cutts responded, “That’s pretty much how we look at it. In fact, we don’t use the word ‘penalty’ much, we refer to things as a ‘manual action.’ Part of the reason why we do that breakdown is, how transparent can we be? We do monthly updates where we talk about changes, and in the past year, we’ve been more transparent about times when we take manual action. We send out alerts via Google Webmaster Tools.”

    Search Engine Land editor Barry Schwartz, who also liveblogged the discussion, writes, “Google’s Matt Cutts made it crystal clear last night. If you get those Google Webmaster Tools notifications or messages, that means you have been hit by a manual penalty done by a person at Google after reviewing your site by hand. Got that?”

    On a side note, Cutts indicated that only about 1-2% of 700,000 Webmaster Tools warnings were about links, and the rest were clear violations.

  • Google Penguin Update Now Better At Splog Detection?

    Some people have been talking about seeing major rankings changes, wondering if Google has released another Penguin update. According to Matt Cutts, who spoke at SMX Advanced last night, Google has not released a new one (or a new Panda update).

    Google did release a Penguin refresh in late May, and that did lead to some recoveries. WPMU has been the one in the spotlight, showing that it is possible to recover (though being in the spotlight couldn’t have hurt). Cutts says it’s possible to recover without talking to Google:

    @mike20 I’m not saying it’s trivial or easy. But it’s definitely possible to recover without even talking to anyone at Google.

    Apparently, based on the following Twitter exchange, the latest iteration of Penguin helped Google deal with splogs (spam blogs) better:

    @csmartphonedeal we’re always working to improve..

    I wonder how long until “spinfographic” detection gets ramped up.

  • Google: We’re Starting To Enforce Paid Links More

    Google’s Matt Cutts has been making light of paid links all week, but in reality, Google isn’t joking when it comes to enforcing this part of its quality guidelines. According to Google, it’s cracking down on this more than ever, and in recent weeks we have indeed seen evidence of that crackdown.

    @kerrydean At the Q&A I should be like “Hey everyone, Kerry Dean is buying links in this session, so please get in touch if you’re selling.”


    @SEOAware “Matt Cutts, Linkbuyer Psychologist.”

    Now, the serious stuff.

    Cutts participated in a keynote discussion at SMX Advanced, and while much of the talk was about Penguin, the subject of paid links also came up. SMX sister site Search Engine Land has a liveblogged account of the discussion. Here’s what Cutts said about paid links, according to that:

    We’re always working on improving our tools. Some of the tools that we built, for example, to spot blog networks, can also be used to spot link buying. People sometimes think they can buy links without a footprint, but you don’t know about the person on the other side. People need to realize that, as we build up new tools, paid links becomes a higher risk endeavor. We’ve said it for years, but we’re starting to enforce it more.

    It makes you wonder how safe those directories that charge for “reviews” to potentially get links are.

    The liveblog continues:

    I believe, if you ask any SEO, is SEO harder now than 5-6 years ago, I think they’d say it’s a little more challenging. You can expect that to increase. Google is getting more serious about buying and selling links. Penguin showed that some stuff that may work short term won’t work in the long term.

    On a semi-related note, Cutts also talked about paid inclusion, given that this has been in the news, as it relates to Google’s new sponsored results and Google Shopping.

    “You call it paid inclusion, but it’s a separately labeled box and it’s not in web ranking,” Cutts told Danny Sullivan, according to the liveblog, which continues: “Google’s take on paid inclusion is when you take money and don’t disclose it. Google’s web rankings remain just as pure as they were 10 years ago. We have more stuff around the edges, that’s true, but that stuff is helpful. Matt mentions using Google Flight Search to book his trip here to Seattle. ‘You can’t buy higher rankings. That hasn’t changed. I don’t expect it to change.’”


    @aschottmuller another way to say it would be: payment should always be clearly disclosed + payment doesn’t affect web search rankings.

    At the conference, Cutts also revealed that Google is considering launching a tool that would allow webmasters to tell Google to ignore certain links. The idea is already attracting a lot of praise from webmasters, many of whom thought Google should have had something like this long ago. Cutts indicated that such a tool would be several months away.

  • Google: +1 Is Not The Best Quality Signal

    Google launched the +1 button even before people knew about Google+, the social network. It was a way to tell Google that you felt a particular piece of content or search result was high quality and deserved to rank well in Google’s search engine. Now, Google’s Matt Cutts has admitted that it’s not really a great quality signal.

    Cutts participated in a keynote discussion at SMX Advanced in Seattle, where he was asked about Google+ and SEO. According to a liveblogged account of the conversation from SMX sister site Search Engine Land, Cutts said, “When we look at +1, we’ve found it’s not necessarily the best quality signal right now.”

    It’s unclear whether he meant this is the case compared to other social signals (like Facebook likes) or whether social signals in general aren’t the best indicators of quality. In reality, it could be both. I’ve made the case here for why social is not always a great indicator of relevancy, but there’s also the matter of how frequently people actually click +1 buttons compared to Facebook likes, tweets, and other social buttons.

    During the keynote, SMX’s Danny Sullivan asked Cutts if you have to be on Google+ to rank well in Google. According to the liveblog, his response was, “No!!!! It’s still early days on how valuable the Google+ data will be.”

    For Google’s sake, I hope Google+ and +1s do evolve into better quality signals, because Google sure seems to be putting a lot of eggs in the Google+ basket, tying it into its various products and making it Google’s “social spine”.

    Clearly Google and Bing both consider social to be a pretty important factor in search, as evidenced by this year’s launches of Search Plus Your World, and Bing’s new social redesign.

    A Google help center article says, “+1’s can help improve Google Search too, since you can see which pages your social connections have +1’d right beneath search results and ads.”

    Google says:

    +1 gets conversations going. Click the +1 button to give something your public stamp of approval. Then, if you want to share right away, add a comment and send it to the right circles on Google+.

    The next time your friends and contacts search on Google, they could see your +1. You’ll help them find the best stuff on the web – and you might just start up another conversation!

    I guess +1s are not the new PageRank just yet. Clearly, your Google+ presence (namely, your Google Profile) still has some important effects on search visibility though.

    Word is (at SMX) that Pinterest is a pretty strong social signal to search engines these days. Bing’s Duane Forrester appears to concur.