WebProNews

Tag: Webmasters

  • If You Experience A Manual Action From Google, You Should Hear About It

    Google’s Matt Cutts, as you may know, spoke at PubCon this week. It’s where he revealed Google’s new Link Disavow tool. That seems to have overshadowed just about everything else from the conference (even the news that PubCon founder Brett Tabke has sold WebmasterWorld), including other things Cutts talked about.

    It’s understandable, as webmasters have been waiting months for the tool to be released, but Danny Sullivan points out another piece of significance from Cutts’ speech. Google now claims to be sending out messages to webmasters for pretty much every manual action it takes on a site. Sullivan reports:

    “We’ve actually started to send messages for pretty much every manual action that we do that will directly impact the ranking of your site,” said Matt Cutts, the head of Google’s web spam team, when speaking at the Pubcon conference this week.

    “If there’s some manual action taken by the manual web spam team that means your web site is going to rank directly lower in the search results, we’re telling webmasters about pretty much about all of those situations,” he continued.

    Cutts said a rare “corner case” might slip through, but that reporting is “practically 100%” and that “the intent is to get to 100%, and as far as I know, we’re actually there.”

    It’s been quite obvious that Google has been sending out many more messages this year than it has historically, but this is good information for webmasters to have, especially since activities that violate Google’s quality guidelines can be hit by either a manual action or an algorithmic one, particularly since Penguin launched.

    I suppose this is all part of Google’s effort to be more transparent, which has also included semi-monthly lists of algorithm changes and more tweeting about major updates in recent weeks.

  • Jim Boykin (Internet Marketing Ninjas) Buys WebmasterWorld Forum

    WebmasterWorld and Pubcon founder Brett Tabke has sold WebmasterWorld to Jim Boykin, who runs Internet Marketing Ninjas. The announcement was made today at PubCon, which Tabke will continue to run.

    “Internet Marketing Ninjas, led by company founder Jim Boykin, is the ideal match for WebmasterWorld,” said Tabke. “I couldn’t have asked for a better situation than a long-time member acquiring WebmasterWorld.”

    WebmasterWorld is just the latest in a series of forums Boykin has acquired. Other recent pick-ups include Cre8asiteforums and the Developer Shed Network.

    “2012 has been a major year for IMN and I feel privileged to be a part of WebmasterWorld,” says Boykin. “I’ve been a member of WMW for over 10 years, and I’ve learned so much from so many there. I have so much respect for the community members and it gives me the uttermost pride to be able to work with the community to move WMW into the future. I am very humbled and ready to listen to, and work with, this amazing community.”

    WebmasterWorld Director of Operations, Neil Marshall, offered the following statement: “I’m delighted to be involved with Jim and his team at Internet Marketing Ninjas. This is an exciting new era for WebmasterWorld and I’m really looking forward to continuing to develop the site for the benefit of its members, and retaining the site’s world-renowned, quality discussion forum for webmasters and their businesses. Thank you to Brett Tabke, for giving webmasters an independent vehicle to meet and to discuss current hot topics, turning the site into a major brand for webmasters. I’m sure the whole team is ready to keep that going forward and to develop new ideas for today’s Internet marketing businesses.”

    WebmasterWorld is a place where there is a great deal of discussion from webmasters dealing with Google’s various algorithm updates. Such discussions often give clues to new major algorithm changes that Google makes before they are officially announced (although there is often a lot of talk that leads to false alarms).

    It will be interesting to see how the forum evolves under new ownership. From the sound of it, the community itself won’t be changing much.

    Tabke says he intends to invest his future efforts into the PubCon conferences.

  • Google Link Disavow Tool Does Not Guarantee Google Will Ignore A Link

    Google finally announced the launch of a Link Disavow tool for webmasters today, after months of anticipation. This is a tool that you can use to tell Google to ignore links if you feel they are hurting your search engine rankings.

    You can see plenty more details about it here (along with a video from Matt Cutts).

    One important thing to note about the tool, however, is that telling Google to ignore certain links is not a 100% guarantee that it will do so. It’s more of a suggestion, and Google will still decide whether or not it wants to follow your instructions.
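    The mechanics are simple: the tool takes a plain text file listing the links you want disavowed, one URL per line, with optional comment lines starting with # and a domain: prefix for disavowing an entire site. Here’s a sketch with invented URLs:

        # Asked the site owner to remove these on 10/1/2012, got no response
        http://spam.example.com/stuff/comments.html
        http://spam.example.com/stuff/paid-links.html

        # Disavow everything on this domain
        domain:shadyseodirectory.example.com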

    In a Q&A section in the official blog post announcing the tool, Google says:

    This tool allows you to indicate to Google which links you would like to disavow, and Google will typically ignore those links. Much like with rel=”canonical”, this is a strong suggestion rather than a directive—Google reserves the right to trust our own judgment for corner cases, for example—but we will typically use that indication from you when we assess links.

    Emphasis added.

    It probably won’t be an issue if you’re using the tool the way it was intended to be used, but it’s something to be aware of.

  • Google Tweaks Rich Snippets Testing Tool, Calls It Structured Data Testing Tool

    Google has launched a new version of its rich snippet testing tool, which it now calls the Structured Data Testing Tool. The company says it has improved how it displays rich snippets in the tool to better match how they appear in search results.

    Google also says the new design makes it clearer what structured data Google can extract from the page (and how it may be shown in search results). It’s also now available in languages other than English.

    The tool works with all supported rich snippets, as well as authorship markup, at least in theory.

    Former Googler Vanessa Fox, who built Webmaster Central, writes, “I’m having a bit of trouble with the tool. For instance, looking at the page they show in the blog post: http://allrecipes.com/recipe/banana-banana-bread/, the rich snippet appears in the tool correctly…However, the image is missing for this page in the actual search results. Why isn’t the image showing up?”

    You can access the tool here.

    Last month, Google introduced the Structured Data Dashboard in Google Webmaster Tools. More on that here.

  • Google May Soon Update Its Webmaster Guidelines

    It looks like Google may soon be changing its Webmaster Guidelines. Patrick Sexton from FeedTheBot claims to have spotted an updated version of the guidelines, and posted about them. However, he says, two Google employees told him that they were put up by mistake and “were not meant to go public yet.”

    Sexton removed his post, but in the comments of the following Google+ post about it (via Search Engine Roundtable), someone posted a link to a screen cap of Sexton’s post about the guidelines.

    Eren Mckay

    Official Google Webmaster Guidelines gets updated
    thanks to +David Harry  for sharing:
    http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769 here's a post about it too:
    http://www.feedthebot.com/blog/official-google-guidelines-get-updated/


    Official Google Guidelines Updated – Webmaster Advice
    update: I have confirmed with two Google employees that these new guidelines were put up by mistake and were not meant to go public yet. I just happened to notice them and I naturally wrote about them…


    Google must have had some reason to pull the guidelines, so it’s hard to say how much of what Sexton spotted is what Google will end up going with. The webmaster guidelines are obviously important, however, in ensuring that a site stays in Google’s good graces and doesn’t face getting penalized or hit by algorithm changes designed to enforce the guidelines. The Penguin update was geared towards enforcing the quality guidelines specifically (part of the Webmaster Guidelines).

    According to the screen cap of Sexton’s post, there is some new stuff about rich snippets, which are not currently mentioned on the Webmaster Guidelines page at all.

    There are things like, “Review our recommended best practices for images, video and rich snippets,” and “Avoid abusing rich snippets markup.”

    Things to avoid include: automatically generated content, participating in link schemes, cloaking, sneaky redirects, hidden text/links, doorway pages, scraped content, participating in affiliate programs without adding sufficient value, loading pages with irrelevant keywords, creating pages with malicious behavior (such as phishing or installing viruses, trojans or other badware), abusing rich snippets markup and sending automated queries to Google.

    It says to “engage in good practices” like: monitoring your site for hacking and removing hacked content as soon as it appears, and preventing and removing user-generated spam on your site.

    Some other quotes from the post:

    “Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.”

    “Don’t deceive your users.”

    “Avoid scraped content.”

    “Avoid automatically generated content.”

    “Monitor your site for hacking and remove hacked content as soon as it appears.”

    “Prevent and remove user-generated spam on your site.”

    This is all pretty basic and common sense stuff, but that’s essentially what the guidelines are about, for the most part, anyway.

    I guess we’ll see if the changes are implemented soon, and whether Google has even more to add.

  • Now Is Your Chance To Ask Matt Cutts A Question

    If you’re reading this, you’ve probably seen some of the Webmaster Help videos Google’s Matt Cutts has done. He regularly takes questions from webmasters about various Google behaviors, and uploads responses to YouTube. Sometimes they’re longer and more in depth, and sometimes they’re quick and to the point. We usually share them here as they become available, as they’re often filled with useful information, even if it’s not always completely new info.

    Cutts took to his blog this afternoon to announce that he’s currently taking questions for the next series. He says he plans to record some videos next week, and asks that people submit their questions via this Google Moderator page.

    Ask Matt Cutts a question

    So, now is the time to submit a question if you want a shot at getting an answer from Google’s Distinguished Engineer and head of web spam. The man can hardly walk through the hallways at an industry conference without being bombarded by people with questions. His answers are clearly in demand. Here’s your chance to get an answer without having to wait in line.

    Or, of course, you can just check out the Moderator page, and just upvote the questions you think are best.

  • Google Gives You More Site Error Details In Webmaster Tools

    Google is now sharing more detailed Site Error info in Webmaster Tools. Site Errors will now display stats for each site-wide crawl error from the past ninety days. It will also show failure rates for category-specific errors.

    “This information is useful when looking for the source of your Site Errors,” Google says in a blog post. “For example, if your site suffers from server connectivity problems, your server may simply be misconfigured; then again, it could also be completely unavailable! Since each Site Error (DNS, server connectivity, and robots.txt fetch) is comprised of several unique issues, we’ve broken down each category into more specific errors to provide you with a better analysis of your site’s health.”

    Webmaster Tools site errors

    Users can hover over any of the entries in the legend to get an explanation of the errors. When you do so, there will be a “more info” link.

    Google recently added alerts for Search Queries data in Webmaster Tools, as well.

  • Links Are The Web’s Building Blocks, And Fear Of Google Has Them Crumbling

    This year, as you may know, Google has been sending out a whole lot of messages to webmasters about problematic links. People are in a frenzy trying to get rid of links that may or may not be hurting their search engine rankings, and this is a frenzy created by Google. It may not be exactly what Google intended, but it’s happening.

    Does Google have you in a frenzy? Let us know in the comments.

    Sure, there are plenty of cases where webmasters have engaged in some suspect linking practices, but there are other cases where links appearing around the web are out of webmasters’ control.

    The fact is that the web is about links. Links are what make it a web. It was that way before Google existed, and it still is that way. However, Google has become such a dominant force on the Internet, that webmasters who rely on Google traffic must bend over backwards to appease the search giant, or risk losing visibility in the search results.

    Competition is just a click away, as Google likes to say, and that’s very true. It is easy for users to simply go to Bing.com or Yahoo.com or any other search engine. But for the most part, people aren’t clicking away. They’re still going to Google. Clearly, Google is doing something right, but it also means webmasters must abide by Google’s rules if they want any significant amount of search traffic.

    Google, of course, launched its Penguin update earlier this year, an update that will continue to be refreshed over time. It targets sites that are violating Google’s quality guidelines. But beyond the update, Google is taking the time to send out thousands of emails warning webmasters about links, and in the process is spreading a great deal of confusion.

    Google recently began sending out a new batch of the link warnings with a somewhat different twist than the ones people were getting pre-Penguin. Whereas the company’s advice in the past was to pay attention to these warnings, Google was (at first) saying that these were not necessarily something webmasters need to worry about. But of course webmasters would worry about them.

    Google’s Matt Cutts aimed to clear up some of the confusion in a blog post over the weekend.

    “When we see unnatural links pointing to a site, there are different ways we can respond,” Cutts said, explaining the original messages. “In many severe cases, we reduce our trust in the entire site. For example, that can happen when we believe a site has been engaging in a pretty widespread pattern of link spam over a long period of time. If your site is notified for these unnatural links, we recommend removing as many of the spammy or low-quality links as you possibly can and then submitting a reconsideration request for your site.”

    “In a few situations, we have heard about directories or blog networks that won’t take links down,” he added. “If a website tries to charge you to put links up and to take links down, feel free to let us know about that, either in your reconsideration request or by mentioning it on our webmaster forum or in a separate spam report. We have taken action on several such sites, because they often turn out to be doing link spamming themselves.”

    Regarding the newer messages, Cutts said, “In less severe cases, we sometimes target specific spammy or artificial links created as part of a link scheme and distrust only those links, rather than taking action on a site’s overall ranking. The new messages make it clear that we are taking ‘targeted action on the unnatural links instead of your site as a whole.’ The new messages also lack the yellow exclamation mark that other messages have, which tries to convey that we’re addressing a situation that is not as severe as the previous “we are losing trust in your entire site” messages.”

    “These new messages are worth your attention,” he said. “Fundamentally, it means we’re distrusting some links to your site. We often take this action when we see a site that is mostly good but might have some spammy or artificial links pointing to it (widgetbait, paid links, blog spam, guestbook spam, excessive article directory submissions, excessive link exchanges, other types of linkspam, etc.). So while the site’s overall rankings might not drop directly, likewise the site might not be able to rank for some phrases. I wouldn’t classify these messages as purely advisory or something to be ignored, or only for innocent sites.”

    “On the other hand, I don’t want site owners to panic,” he added. “We do use this message some of the time for innocent sites where people are pointing hacked anchor text to their site to try to make them rank for queries like [buy viagra].”

    But site owners are panicking. As usual.

    OK, we get that Google has its rules, but there is something about the whole thing that doesn’t feel quite right. It’s not necessarily Google’s stance on any particular kind of linking, but that Google, for all intents and purposes, even gets to tell people how they can and can’t link. How they can and can’t build the web.

    Sure, sites are free to disregard any of Google’s rules. You’re not going to go to prison for engaging in practices that Google doesn’t like, but if you’re running a business, being ignored by Google can have a tremendous impact on your well-being. For that reason, many businesses feel that Google has a boot on their neck.

    This isn’t a call for government regulation of Google, though many would like to see it (in Europe, Google is already facing it). As I said, I do agree that competition is a click away. Nobody’s forcing people to use Google. They’re just using it because they want to.

    But Google could save the web a lot of trouble by handling things differently, or perhaps finding a better way to rank search results, without punishing sites for its own reliance on links.

    People are scrambling to have links removed that may or may not even affect their sites in Google. Some of these links are links that people would be happy to have pointing to their sites, but fear of Google’s wrath has them in a frenzy, and they don’t want anything tarnishing their search rankings.

    I want to include a few samples of what people are saying in link removal requests. WebProNews parent company iEntry owns a number of directories, none of which have ever accepted payment for listings, and many of which are nofollowed, yet like countless other sites, they are receiving link removal requests because of the fear Google has instilled in webmasters. Never mind that directories existed long before Google did, and that Google seems to be OK with some directories. For that matter, Google’s own search results link to some of the very directories that are getting link removal requests.

    Now, let’s look at some samples.

    “We are glad that our website ****.com is Live in your directory. Unfortunately we received a 2 notification letter from Google telling that our website is having unnatural links. Our firm decided to contact all our live links in all web directories and will request to delete it. Please kindly delete this website in your directory. I hope you do understand our concerns.”

    This person was glad to be listed, but feels they have to pull out because of Google.
    —-

    “Thank you so much for your effort to include ******* in your directory. However, due to recent changes in the company’s online marketing strategy, I am humbly requesting for the links to be deleted from your database…Really sorry for any inconvenience that this request will cause/may have caused you. Hoping for your consideration and understanding.”

    That’s another thing. Google is greatly inconveniencing not only those with links posted, but those who have posted the links. Wouldn’t it be easier for Google to just take the actions it feels it needs to, without causing such a stir? This is no doubt costing businesses a great deal of time and money.

    —-

    “Unfortunately we’re facing an important situation right now and we could really use your help. Our website is currently under a Google penalty – basically that means that Google thinks some of our links are unnatural, and they have pushed our site to the back of their search engine results. We are working with consultants to ensure our site meets Google’s Quality Guidelines, and they have advised us to remove any links that might even appear as if they were paid for. Often, these links were naturally placed and are on great sites, but in an effort to be overly cautious, we need to have them removed anyway.

    “Our main goals is to get back to business and ensure we’re creating the best site and resources for our visitors, but until we get this issue taken care of, we’re at a bit of a standstill….”

    Fear of Google is causing people to seek link removal even for naturally placed links on great sites. Naturally placed links on great sites.

    “Because some of our sister stores received a Google penalty, we’ve been working to clean up our backlink profile and want to remove any links that Google may even begin to consider as unnatural or paid. This is absolutely no reflection on the value of your site, and we apologize that it is necessary. However, in an effort to be certain we are complying with changes in Google’s Quality Guidelines, we would be grateful if you could remove the links from your site.”

    So this person is basically saying that even though we may think your site has value, we need to have our link removed because of Google.

    “May I ask that you remove the link to ********** from your website? We do appreciate that the link on your site may not be causing us any problems however we wish to cover all bases as if we get this reconsideration wrong it will have huge implications on the future success of our SEO efforts.”

    So this person appreciates the link that may not even be causing any problems, but just in case, they want the link removed, because of Google.

    “We have received a notice from Google regarding presence of links of our website ******** on your website and they have asked us to get them removed, failing which yours & our sites will be penalized in google search, resulting in loss of business for both of us.

    “Therefore, you are requested to remove all the links as soon as possible, preferably within 72 hours, and confirm to us so that we can inform Google. It is not a reflection of the quality of your / our website, but only an approach to maintain our respective search engine rankings. Waiting for confirmation of removal from your end.”

    Speaking of inconvenience, this person even included a deadline, and still noted that it’s not a reflection of the quality of the site.

    “The following site ********* has links on their website without authorisation from anyone in our company linking back to our website. The website owner needs to remove these ASAP. As the registrar you are also seen responsible to ensure the website owner/ domain host they get all links removed, this is infringement of intellectual property.”

    Then there’s this kind of request. People actually suggesting that linking is somehow an infringement. Linking. You know, that thing that the world wide web is based upon? SEM firms are even advising clients to take such action. Some are advising that clients send cease and desist letters. For linking. Because of Google.

    —-

    Now, this all may not be exactly what Google had in mind. A lot of people are overreacting, to say the least. But that’s what happens when one company has so much power on the Internet. Not that long ago, you might have thought that the more links pointing to your site, the better. That’s more paths to your site, and more chances for people to find it, but with so much reliance on Google, people are getting rid of many of those paths for the all-important one. Many of the things Google does with regard to how it treats certain kinds of links make a lot of sense, but this kind of madness that has people frantically seeking link removals (and even sites charging for link removals) doesn’t seem great for the web.

    It’s understandable that people want to be very careful about not having a negative impact on their search rankings, but this goes to show how much power Google really has over the web, just in its own efforts to try and make its own product better based on its flawed algorithm.

    I say flawed algorithm, because it’s not perfect. That’s not to say it isn’t as good as or better than competitors’ algorithms. There’s no perfect way to rank web content. If there is, nobody, to my knowledge, has implemented it yet.

    When Google started, PageRank and links were a revolutionary way to rank search results, and there’s no question that they have an important place today. However, it seems like Google is indirectly reconstructing the web by sending out all of these messages to webmasters, who will essentially act as pawns in the process of making Google’s own search results better (which may or may not even actually happen). It does suggest that Google is relying on webmasters just as much as webmasters are relying on Google. Perhaps even more so. What would happen to the quality of search results if no webmasters abided by Google’s rules? It’s an interesting scenario to consider, no matter how unlikely. People fear Google too much not to obey the rules. Those who don’t obey are punished one way or another.

    Unfortunately, it’s entirely possible, at this point, that obeying the rules is out of webmasters’ control, as long as negative SEO is able to exist, which Google seems to have recently acknowledged it is.

    Google did recently indicate that it is working on a way for users to tell Google which links they want it to ignore, and webmasters/SEOs will certainly be happy when it gets here, but why doesn’t Google simply ignore the links it decides are problematic, without making webmasters jump through hoops? To some extent, Google seems to be taking the action it deems appropriate on certain links (as in the subject of this most recent round of messages), but people are still getting messages, and Google is still taking it upon itself to dictate which links on the web are valuable, and which are not.

    Google clearly still sees links as an incredibly important signal in ranking content, hence the company’s emphasis on penalizing any manipulation of them.

    “I don’t doubt that in ten years, things will be more social, and those will be more powerful signals, but I wouldn’t write the epitaph for links quite yet,” Matt Cutts recently said at SMX Advanced.

    Smart site owners find ways to diversify their traffic, so they don’t have to rely so much on Google. Social media has been a godsend for a lot of businesses, and the landscape continues to change rapidly. Even Google itself is doing some interesting things to change how we find and consume information, which may actually make search less crucial. We are living in interesting times, indeed. In the meantime, however, it appears that a great deal of the web will bend over backwards to appease Google, so as not to be punished for what Google doesn’t like.

    Are you sore from all of that bending yet? Let us know in the comments.

  • Google Gives Webmasters Just What They Need: More Confusion

    Last week, Google began sending out messages to webmasters, warning them of bad links, much like the ones that many webmasters got prior to the infamous Penguin update. Google said, however, that these messages were different. Whereas the company’s advice in the past was to pay attention to these warnings, Google said this time that they’re not necessarily something you need to worry about.

    Google’s head of webspam, Matt Cutts, wrote on Google+, “If you received a message yesterday about unnatural links to your site, don’t panic. In the past, these messages were sent when we took action on a site as a whole. Yesterday, we took another step towards more transparency and began sending messages when we distrust some individual links to a site. While it’s possible for this to indicate potential spammy activity by the site, it can also have innocent reasons. For example, we may take this kind of targeted action to distrust hacked links pointing to an innocent site. The innocent site will get the message as we move towards more transparency, but it’s not necessarily something that you automatically need to worry about.”

    “If we’ve taken more severe action on your site, you’ll likely notice a drop in search traffic, which you can see in the ‘Search queries’ feature in Webmaster Tools for example,” Cutts added. “As always, if you believe you have been affected by a manual spam action and your site no longer violates the Webmaster Guidelines, go ahead and file a reconsideration request. It’ll take some time for us to process the request, but you will receive a followup message confirming when we’ve processed it.”

    Obviously, this all caused a great deal of confusion, and panic among webmasters and the SEO community. Barry Schwartz, who spends a lot of time monitoring forum discussions, wrote, “It caused a major scare amongst SEOs, webmasters and those who owned web sites, never bought a link in their life, didn’t even know what link buying was and got this severe notification that read, ‘our opinion of your entire site is affected.’”

    Even SEOmoz was getting these warnings. The company’s lead SEO, Ruth Burr, wrote, “We’ve got the best kind of links: the kind that build themselves. Imagine the sinking feeling I got in the pit of my stomach, then, when a Google Webmaster Tools check on Thursday revealed that we’d incurred an unnatural link warning.”

    Cutts eventually updated his post to indicate that Google has changed the wording of the messages it is sending, in direct response to webmaster feedback.


    Google has also removed the yellow caution sign that accompanied the messages in the webmaster console. According to Cutts, this illustrates that action by the site owner isn’t necessarily required.

  • These Are The 10 Videos Webmasters Need To Watch, According To Google

    Google has updated its Webmaster Academy site to feature more videos. Some of the videos on the site are old, but some are brand new. None of them are incredibly long, so if you have a few minutes to spare, I recommend watching all of them.

    Some webmasters are pretty used to videos from Google’s Matt Cutts, and he does appear in some of these, but there are also some other faces in the mix.

    Of course the site itself has complementary information to go along with the videos, but watching the videos themselves is a good start. Here they are:

    Google’s Matt Cutts explains how search works:

    Jen Lee of Google’s search quality team explains how to find your site on Google:

    Cutts talks about snippets:

    Alexi Douvas from Google’s search quality team talks about creating content that performs well in Google search results. It’s worth noting that this one was uploaded just today (post Panda and Penguin):

    Michael Wyszomierski from Google’s search quality team talks about webspam content violations:

    Betty Huang from Google’s search quality team talks about how malicious parties can spam your site:

    A hairless Cutts (for more on that story, see here) discusses how a site that focuses on video or images can improve its rankings:

    Lee talks about using sitemaps to help Google find content hosted on your site:

    An introduction to Google+:

    Cutts and colleague Othar Hansson discuss authorship markup:

  • Google Has New Advice For Mobile SEO

    Google has been pushing its “GoMo” campaign for a while, trying to get sites set up for mobile success, but today, Google posted specific recommendations for smartphone-optimized sites on its Webmaster Central blog.

    Google says it supports the following configurations for sites targeting smartphones:

    1. Sites that use responsive web design, i.e. sites that serve all devices on the same set of URLs, with each URL serving the same HTML to all devices and using just CSS to change how the page is rendered on the device. This is Google’s recommended configuration.

    2. Sites that dynamically serve all devices on the same set of URLs, but each URL serves different HTML (and CSS) depending on whether the user agent is a desktop or a mobile device.

    3. Sites that have separate mobile and desktop sites.

    Google Webmaster Trends analyst Pierre Far also lists two advantages of utilizing responsive web design:

    1. It keeps your desktop and mobile content on a single URL, which is easier for your users to interact with, share, and link to and for Google’s algorithms to assign the indexing properties to your content.

    2. Google can discover your content more efficiently as we wouldn’t need to crawl a page with the different Googlebot user agents to retrieve and index all the content.

    Google “strongly” recommends using the Vary HTTP header to let its algorithms know that the content might change for different user agents. Google says it uses this as a crawling signal for Googlebot-Mobile.
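    For sites using the dynamic serving configuration (option 2 above), this amounts to one extra response header. Here’s a rough sketch of the exchange, with a hypothetical URL and a trimmed user agent string:

        GET /page-1 HTTP/1.1
        Host: www.example.com
        User-Agent: Mozilla/5.0 (iPhone; ...)

        HTTP/1.1 200 OK
        Content-Type: text/html
        Vary: User-Agent

        ...smartphone-optimized HTML here...

    The Vary header tells caches (and Googlebot) that the response body can differ depending on the requesting user agent.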

    The company also notes in a help center article, “Don’t block Googlebot from crawling any page assets (CSS, javascript, and images) using robots.txt or otherwise. Being able to access these external files fully will help our algorithms detect your site’s responsive web design configuration and treat it appropriately.”
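    In robots.txt terms, that means resisting the urge to wall off your script, style and image directories. A hypothetical example (the paths here are invented):

        User-agent: Googlebot
        # Fine: keep genuinely private areas off-limits
        Disallow: /private/

        # Don't do this: blocking page assets keeps Googlebot from seeing
        # how your pages actually render on different devices
        # Disallow: /css/
        # Disallow: /js/
        # Disallow: /images/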

    Google has specific annotations for desktop and mobile URLs that it says will help its algorithms understand your site. There is a whole section about this in Google’s Building Smartphone-Optimized Sites recommendation page.
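    Concretely, the recommendation for separate-site setups pairs a rel=”alternate” link on the desktop page with a rel=”canonical” link on the mobile page. Here’s a sketch, with example.com standing in for a real site (see Google’s recommendations page for the authoritative version):

        <!-- On the desktop page, http://www.example.com/page-1 -->
        <link rel="alternate" media="only screen and (max-width: 640px)"
              href="http://m.example.com/page-1">

        <!-- On the mobile page, http://m.example.com/page-1 -->
        <link rel="canonical" href="http://www.example.com/page-1">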

    A recent study from Adobe found that website visits from tablets grew about 10 times faster than the rate of smartphones within two years of market introduction, and by over 300% in the last year. Part of the reason for this, according to the company, is that the majority of sites are not optimized for mobile, and this is reflected when users view them on smartphones. Tablets tend to handle the sites better, to where the optimization isn’t as much of a factor.

    “Tablets are better for surfing than smartphones,” Adobe Digital Index Director, Austin Bankhead, told WebProNews at the time.

    Perhaps if enough sites take Google’s advice, smartphone web surfing in general will be better for everyone.

  • Google Website Translator Gives Webmasters More Control

    Back in 2009, Google released Website Translator, a plugin powered by Google Translate, which enabled webmasters to make their site’s content available in 51 languages (now it’s over 60). Google says over a million sites have utilized it to date.

    Google has now launched a new feature that lets users customize the translations and make adjustments.

    “Once you add the customization meta tag to a webpage, visitors will see your customized translations whenever they translate the page, even when they use the translation feature in Chrome and Google Toolbar,” explains Google Translate product manager Jeff Chin, in a blog post. “They’ll also now be able to ‘suggest a better translation’ when they notice a translation that’s not quite right, and later you can accept and use that suggestion on your site.”

    To use the new features, webmasters can simply add the plugin and customization meta tag to their site, translate a page into one language, hover over a translated sentence to display the original text, click on “contribute a better translation,” and click a phrase to choose an automatic, alternative translation. You can also double-click to edit the translation directly.
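    Setting this up amounts to adding a small block of markup to your pages. The sketch below is illustrative only; the exact plugin code and your customization ID come from the Website Translator setup pages:

        <!-- Customization meta tag; the content value comes from your Translator settings -->
        <meta name="google-translate-customization" content="YOUR-CUSTOMIZATION-ID">

        <!-- The translator widget -->
        <div id="google_translate_element"></div>
        <script>
          function googleTranslateElementInit() {
            new google.translate.TranslateElement(
                {pageLanguage: 'en'}, 'google_translate_element');
          }
        </script>
        <script src="//translate.google.com/translate_a/element.js?cb=googleTranslateElementInit"></script>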

    Google Website Translator

    “If you’re signed in, the corrections made on your site will go live right away — the next time a visitor translates a page on your website, they’ll see your correction,” says Chin. “If one of your visitors contributes a better translation, the suggestion will wait until you approve it. You can also invite other editors to make corrections and add translation glossary entries.”

    The feature is currently in beta, and Google still considers it experimental, but being able to edit the translations yourself gives you more control over how your text is displayed, which is a clear improvement to the plugin.

    Here, you can find tools and resources to add translation to your site.

  • Google Explains Its Responsive Webpage Design

    Though Google realizes that multiple versions of a website can help tailor that site for display on a specific device, Google uses a different approach for displaying websites on a wide variety of devices. Instead of having separate websites for PCs, iPhones, Androids, feature phones, etc., Google uses dynamic page shaping it calls responsive design to make sure its web pages display properly, or at least legibly, on any type of device. Over at the Google Webmaster Central Blog, the Google Webmaster Team has outlined the process for webmasters who are tired of updating multiple sites for the same content.

    Google developed this process while following three rules: pages should render legibly at any resolution, only one set of content should be marked up and be viewable on any device, and a horizontal scrollbar should never be shown, no matter the window size. Using these guidelines, the team set up a liquid layout that is able to reformat the page dynamically based on the pixel-width of the browser it is displaying in. Instead of giving container elements a fixed width, they specified a max-width. Likewise, they used min-height instead of a set height. From there, media queries are used to rearrange content as the width of the browser changes.
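    In CSS terms, the approach looks something like this sketch (the selector names and breakpoint are invented for illustration; Google’s post has the real code):

        /* Liquid layout: constrain the container, don't fix it */
        #content {
          max-width: 960px;   /* shrinks with the window instead of forcing a horizontal scrollbar */
          min-height: 300px;  /* grows with the content instead of clipping it */
          margin: 0 auto;
        }

        /* Media query: rearrange things once the viewport narrows */
        @media screen and (max-width: 640px) {
          #sidebar { float: none; width: auto; }  /* non-essential content drops below */
        }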

    The examples used are the Google about page and Google’s Cultural Institute page. The about page begins to shift non-essential content down to the bottom of the page when the browser width narrows, while the Cultural Institute page allows larger photos to be cropped until a certain pixel width is reached, and it then transforms into a list-type view. Unfortunately, the blog points out that a quarter of website visits are still made with older browsers that don’t support some of these features, such as media queries:

    It’s worth bearing in mind that there’s no simple solution to making sites accessible on mobile devices and narrow viewports. Liquid layouts are a great starting point, but some design compromises may need to be made. Media queries are a useful way of adding polish for many devices, but remember that 25% of visits are made from those desktop browsers that do not currently support the technique and there are some performance implications. And if you have a fancy widget on your site, it might work beautifully with a mouse, but not so great on a touch device where fine control is more difficult.

    You can find out more, including the CSS code that enables Google’s liquid layout, at Google’s Webmaster Central Blog. Make sure, though, to leave a comment here and let us know your opinion on whether this process would be right for your website.

  • Google Rich Snippet Updates Announced, Author Stats Go Missing

    Update: A Google spokesperson tells WebProNews, “We’ve currently disabled the experimental ‘Author stats’ feature in Webmaster Tools Labs as we work to fix a bug in the way stats are attributed.”

    Google announced that Product Rich Snippets are now supported on a global scale, so businesses around the world can take advantage of them and stand out more in search results for the products they’re selling (the ones searchers are looking for). Product Rich Snippets had only been available in certain locations until now.

    “Users viewing your site’s results in Google search can now preview information about products available on your website, regardless of where they’re searching from,” said product manager Anthony Chavez on the Google Webmaster Central blog.

    Chavez also announced that Google’s Rich Snippets Testing Tool has been updated to support HTML input. “We heard from many users that they wanted to be able to test their HTML source without having to publish it to a web page,” he says.

    Rich Snippet Testing Tool

    There’s some interesting discussion in the comments section of Google’s blog post announcing these changes. Some are clearly happy to see the HTML support for the tool.

    Coincidentally, this is the second time I’ve written about the Rich Snippets Tool in the last 24 hours. I wrote a big piece on Google’s Authorship Markup and what it means for both authors and Google. In that, I referenced a recent interview Google’s Sagar Kamdar did with Eric Enge at Stone Temple Consulting, as Kamdar had suggested using the Rich Snippets Testing Tool to make sure you have authorship set up correctly.

    As mentioned in that other piece, Google has been providing author clicks and impressions data in Webmaster Tools. Now some are finding that author stats have gone missing. “Thanks for the upgrade 😉 But now the author stats are disappeared,” one user commented on Google’s blog post.

    Some are complaining about it in the WebmasterWorld forums. Sally Sitts, who started a thread, writes:

    I went to check my “Author Stats”, under the “Labs” tab in Google Webmaster Tools. GONE!

    Anyone else?

    In the past, they only gave me credit for about 50% of the pages that I have “fixed up with Google-required special authorship tags”, according to their specifications.

    At the bottom of the “Labs” page, their disclaimer prevails –

    “Webmaster Tools Labs is a testing ground for experimental features that aren’t quite ready for primetime (sic). They may change, break or DISAPPEAR AT ANY TIME.”

    Nothing about the probably of return, however. (sic)

    This was followed by a couple of interesting replies. Lucy24 writes:

    They never got as far as crediting me with anything, although the Rich Snippets Testing Tool (under More Resources) still comes through with “verified author markup”.

    :: mopping brow ::

    The author function definitely still exists. Saw it within the last 24 hours while doing a search. (Not, alas, a search for my own pages.)

    Sgt_Kickaxe writes:

    Lots of changes going on still.

    Did you know that after you verify your authorship with G+ you can UNverify it by removing the markup (you can even close your Google+ profile!) but Google will still give you special search results (including picture)? They forget nothing. That’s something to think about if you’re running a plugin of any sort to handle markup, save your resources and shut it down 🙂

    With regard to the Product Rich Snippets, one reader commented, “WHY are adult-related products not supported for rich snippets? What is the problem, since there is no picture displayed? Are loveshops, selling perfectly legal items, not worthy of having nice SERPs displayed too? I find that really unjust.”

    We’ve reached out to Google for comment regarding the missing author stats. We’ll update when we know more.

  • Pagination: Google Goes More In Depth On SEO And The Markup

    Back in September, Google introduced new markup for paginated content in an effort to return single-page versions of content in search results when the content is broken up among multiple pages. This would include things like multiple-page articles and slideshows.

    The markup is rel=”next” and rel=”prev”.
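    As a quick illustration, the middle page of a hypothetical three-page article would carry both links in its head (the URLs here are invented):

        <!-- On http://www.example.com/article?page=2 -->
        <link rel="prev" href="http://www.example.com/article?page=1">
        <link rel="next" href="http://www.example.com/article?page=3">

    The first page would carry only the rel=”next” link, and the last page only the rel=”prev” link.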

    Google says users usually prefer the single-page format, and as a consumer of content, you probably agree in most cases. But it’s worth noting that the markup isn’t an absolute must. Even Google acknowledges that there are times when paginated content makes sense. Sometimes single-page versions can load slowly, for example.

    Google Developer Programs Tech Lead Maile Ohye says, “Remember that if you have paginated content, it’s fine to leave it as-is and not add rel=”next” and rel=”prev” markup at all.”

    She created a video going more in depth on using the markup.

    She has actually put together a 37-page slideshow on the subject as well:

    Keep in mind, if you’re in e-commerce, this all applies to you too. It’s not just about writing articles and creating slideshows. It could very well include product categories that span multiple pages.

    Google does say that using the markup provides “a strong hint” that pages should be treated as a “logical sequence”.

    By the way, notice that in the video, we have another Googler using a Mac.

  • Google Webmaster Tools Gets New Admin Feature

    Google announced the launch of a new Webmaster Tools feature, which lets verified site owners grant limited access to their site’s data and settings to other people.

    You can do this from the home page, by clicking “Manage Site” and going to the “Add or remove users” option, which has replaced the “Add or remove owners” option. This will take you to a new User admin page. From here, you can add or delete up to 100 users. Users can be identified as “full” or “restricted” depending on the rights you want to assign them.

    Full means they can view all data and take most actions. Restricted means they can view most data, but can take only certain actions, such as using Fetch as Googlebot and configuring message forwarding.

    Here’s who can do what:

    Full vs. Restricted on Webmaster Tools

    “You’ve had the ability to grant full verified access to others for a couple of years,” says Google Webmaster Trends analyst Jonathan Simon on the Webmaster Central blog. “Since then we’ve heard lots of requests from site owners for the ability to grant limited permission for others to view a site’s data in Webmaster Tools without being able to modify all the settings. Now you can do exactly that with our new User administration feature.”

    “Users added via the User administration page are tied to a specific site,” he explains. “If you become unverified for that site any users that you’ve added will lose their access to that site in Webmaster Tools. Adding or removing verified site owners is still done on the owner verification page which is linked from the User administration page.”

    Hopefully the new feature will make site management easier for webmasters with a lot of employees and colleagues, and save a lot of hassle when changes are needed, or need to be retracted.

  • Video Markup Hits Schema.org (Google, Bing, Yahoo)

    Last year, Google, Bing and Yahoo teamed up to announce schema.org, an initiative to support a common set of schemas for structured data markup on web pages.

    Schema.org got some rich snippet markup for music a couple of months later, which services like MySpace, Rhapsody and ReverbNation immediately started implementing.

    Google announced today that the trio of companies has now launched new video markup. Google product manager Henry Zhang writes on the Webmaster Central Blog, “Adding schema.org video markup is just like adding any other schema.org data. Simply define an itemscope, an itemtype=”http://schema.org/VideoObject”, and make sure to set the name, description, and thumbnailURL properties. You’ll also need either the embedURL — the location of the video player — or the contentURL — the location of the video file.”

    In the post, he shares an example of what a typical video player with markup might look like.
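    Based on the properties Zhang lists, a marked-up player might look roughly like this (the URLs and text are placeholders; see Google’s post for the authoritative example):

        <div itemscope itemtype="http://schema.org/VideoObject">
          <h2 itemprop="name">Title of the video</h2>
          <meta itemprop="thumbnailURL" content="http://www.example.com/thumbnail.jpg">
          <meta itemprop="embedURL" content="http://www.example.com/player.swf?video=123">
          <!-- the video player embed itself goes here -->
          <span itemprop="description">A short description of what the video shows.</span>
        </div>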

    “Using schema.org markup will not affect any Video Sitemaps or mRSS feeds you’re already using,” says Zhang. “In fact, we still recommend that you also use a Video Sitemap because it alerts us of any new or updated videos faster and provides advanced functionality such as country and platform restrictions.”

    “Since this means that there are now a number of ways to tell Google about your videos, choosing the right format can seem difficult,” he adds. “In order to make the video indexing process as easy as possible, we’ve put together a series of videos and articles about video indexing in our new Webmasters EDU microsite.”

    The relevant section on the Schema.org site is here.

  • Google Webmaster Tools Sitemaps Feature Gets Some Updates

    Google announced that it is including some new information in the Webmaster Tools sitemaps feature.

    This includes details based on content type, with stats for Web, Videos, Images and News featured more prominently.

    “This lets you see how many items of each type were submitted (if any), and for some content types, we also show how many items have been indexed,” explains Webmaster Tools engineer Kamila Primke. “With these enhancements, the new Sitemaps page replaces the Video Sitemaps Labs feature, which will be retired.”

    There is also now the ability to test a sitemap. “Unlike an actual submission, testing does not submit your Sitemap to Google as it only checks it for errors,” says Primke. “Testing requires a live fetch by Googlebot and usually takes a few seconds to complete. Note that the initial testing is not exhaustive and may not detect all issues; for example, errors that can only be identified once the URLs are downloaded are not caught by the test.”

    Google also has a new way of displaying errors, which the company says better exposes what types of issues a sitemap contains. Rather than repeating the same kind of error numerous times for one sitemap, Google will group errors and warnings, giving a few examples.

    For sitemap index files, Google aggregates errors and warnings from the child sitemaps that the sitemap index encloses, so users won’t have to click through each child one at a time.
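    For reference, a sitemap index file is just a small XML document that points at child sitemaps, along these lines (the URLs are placeholders):

        <?xml version="1.0" encoding="UTF-8"?>
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap-pages.xml</loc>
            <lastmod>2012-01-01</lastmod>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap-videos.xml</loc>
          </sitemap>
        </sitemapindex>

    Google’s new grouped error reporting rolls up issues from each of the child sitemaps a file like this encloses.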

    The functionality of the delete button has changed as well. It will now remove the sitemap from Webmaster Tools for both your account and the accounts of the site’s other owners.

  • Google Webmaster Office Hours Hangout on Thursday at 10:30 am EST

    Short notice, everyone, but Google Webmaster Trends Analyst Pierre Far has announced his first 2012 webmaster office hours hangout, and it’s happening at 10:30 am Thursday morning.

    From his Google+ post:

    When: Thursday 26 January 2012 at 3:30pm UK time. Find out the time where you are at: http://goo.gl/Xwh7J

    Where: A hangout here on Google+. It works best with a webcam + headset. You can find out more about Hangouts and how to participate at http://goo.gl/ZH9xZ

    Topic? Anything webmaster related like crawling, mobile, indexing, duplicate content, Sitemaps, Webmaster Tools, pagination, duplicate content, multi-lingual/multi-regional sites, etc.

    Please join us! Hope to see many of you soon! Don’t forget to bring your questions or post them here ahead of time if you can’t make it 🙂

    Google hosts these office hours hangouts around 1-3 times a week. Yesterday, Google’s John Mueller hosted one on similar topics.

  • Google Office Hours Webmaster Hangout On Wednesday

    Google’s John Mueller (of Google Switzerland) announced in a Google+ post that Google will be hosting an “Office Hours” hangout on Google+ for webmasters on Wednesday.

    They do this 1-3 times a week, and there’s a lot of learning opportunity in these hangouts. Mueller says:

    When: Wednesday, 25 January 2012 at 10am CET/9am GMT / http://goo.gl/I0E25 / http://goo.gl/u5Dr9

    Where: A hangout here on Google+. It works best with a webcam + headset. You can find out more about Hangouts and how to participate at http://goo.gl/ZH9xZ

    Topic? Anything webmaster related like crawling, indexing, duplicate content, Sitemaps, Webmaster Tools, pagination, duplicate content, multi-lingual/multi-regional sites, etc.

    Google asks that participants bring questions or post them here on this Google+ update in the comments.

    I’m guessing the page layout algorithm change will be a topic that comes up.

  • How Can Google Help Your Website in 2012?


    Google uses Google+ probably more than anyone else out there (except for maybe Robert Scoble), and regardless of whether or not you have added it to your daily social networking routine, it continues to provide a great channel for getting to know Google better.

    That doesn’t just go for helping your search rankings (which it can); it also provides a direct line of communication with many, many Googlers. It’s a great place to get advice from Google, and to share feedback. And it’s not empty feedback, either. Googlers are actively participating in meaningful conversations with users, and have shown that they take ideas into consideration (for example, see the recent Gmail integration).

    This week, Google Webmaster Trends analyst John Mueller posted the following in a Google+ update:

    “Google has tried a lot of new things this year when it comes to webmaster support — such as the hangouts in a variety of languages. Which parts do you all think we should work on next year? How can we make it easier for you all to make awesome websites, which are easily findable in web-search?”

    “More hangouts? videos? more documentation? more detailed examples?”

    This seems like a good opportunity not only to raise this question with our own readers, but to spread it further, because you can actually participate in this conversation and possibly have an impact on future Google offerings, which can in turn benefit your site in the long run. After a crazy year of algorithm changes, I’m sure many of you are looking for any leg up possible.

    Google has already been hosting a slew of webmaster hangouts on Google+, and if you haven’t been taking advantage of this, why not? You are getting free access to some advice right from the horse’s mouth.

    If you read WebProNews regularly, you should also know that Google puts out a lot of webmaster videos, generally starring Matt Cutts. We cover them fairly frequently, because they’re generally full of helpful knowledge for webmasters. Even when they contain things you already knew, sometimes it helps to be reminded of certain things, or Cutts might present the topic in a slightly different light than you looked at it before. It’s a good idea to watch these videos.

    Here are some of the responses Mueller has received to his question so far:

    Thomas Morffew: More people like you John, that are real faces, and available to help.

    Sandip Dedhia: I agree with +Thomas Morffew, more Googlers who are open to speak about issues which webmasters are facing. In post panda era most of the replies on webmaster forum are so generic that it is hard to make out what is the exact cause of penalty or search traffic drop.

    I would suggest some case studies around those websites who managed to recover from different penalties, like the reasons of penalty and steps they took to recover from that penalty.

    Ramon Somoza: Certainly some assistance for multilingual sites would a great help.

    Lincoln Jaeger: There could be more direct interaction going on through the webmasters console, with regards to flagging up issues, for example.

    Bret Sutherland: When will Google shopping/product search get staff who are open and responsive?

    Do you agree with any of these commenters? Have other ideas?