WebProNews

Tag: webmaster tools

  • Did Google Give Webmasters What They Need This Time?

    Webmasters, many of whom have businesses that rely on search rankings, have long wanted Google to communicate with more specificity about what is hurting their sites in Google search rankings. The search engine can’t seem to do enough to please everybody, but it does continue to launch tools and resources.

    Is Google doing enough to communicate issues it has with sites, or does it still need to do more? What exactly should Google be doing? Let us know what you think.

    Google has added a new feature to Webmaster Tools called the Manual Action Viewer. This is designed to show webmasters information about when Google’s manual webspam team has taken manual action that directly affects their site’s ranking in the search engine.

    To access the feature, simply click on “Manual Actions” under “Search Traffic” in Webmaster Tools. If Google hasn’t taken any action against your site, you should see a message that says “No Manual webspam actions found.” Obviously, this is what you want to see.

    Google notes that fewer than 2% of the domains it sees are actually manually removed for webspam, so the likelihood of seeing anything other than the message above is pretty minimal (that is, of course, if you’re not spamming Google).

    The company will still notify you when you get a manual spam action, but the feature is just giving you another way to check. Here’s what you might see if you did have a manual action taken against you:

    Manual Action Viewer

    “In this hypothetical example, there isn’t a site-wide match, but there is a ‘partial match,’” Google’s Matt Cutts explains in a post on the Webmaster Central blog. “A partial match means the action applies only to a specific section of a site. In this case, the webmaster has a problem with other people leaving spam on mattcutts.com/forum/. By fixing this common issue, the webmaster can not only help restore his forum’s rankings on Google, but also improve the experience for his users. Clicking the “Learn more” link will offer new resources for troubleshooting.”

    “Once you’ve corrected any violations of Google’s quality guidelines, the next step is to request reconsideration,” he adds. “With this new feature, you’ll find a simpler and more streamlined reconsideration request process. Now, when you visit the reconsideration request page, you’ll be able to check your site for manual actions, and then request reconsideration only if there’s a manual action applied to your site. If you do have a webspam issue to address, you can do so directly from the Manual Actions page by clicking ‘Request a review.’”

    As Cutts notes, this new feature is something that webmasters have been requesting for some time. While he emphasizes that only a very small percentage of webmasters will actually see any actions in the viewer, it is at least a new way to know for sure whether Google has indeed taken a manual action.

    Reactions in the comments of Google’s announcement are a little mixed. Most of the visible comments are praising the tool. One person says they’re already putting the feature to good use. Another says, “Finally!”

    I say visible comments because many of them say, “Comment deleted. This comment has been removed by the author.”

    One user says, “If we have followed Matt’s advice and Google’s guidelines, why would we need this tool? Please give us a tool that can really help us, not distract us.”

    In addition to the new WMT feature, Google has put out a series of seven new videos to go with its documentation about webspam, explaining what each type really means. Cutts, with the assistance of a few other Googlers, covers unnatural links, thin content, hidden text, keyword stuffing, user-generated spam, and pure spam. You can find all of them here.

    This is Google’s latest attempt to make its documentation more helpful. A couple of weeks ago, Google updated its Link Schemes page to discuss article marketing and guest posting, advertorials, and press release links.

    Of course this is all only applicable to those who have been hit with manual penalties, and is of little comfort to those hit by algorithm changes. If that’s your problem, you may want to look into the whole authorship thing, which just might be influencing ranking significantly.

    Are Google’s most recent “webmaster help” efforts truly helpful to webmasters? Let us know in the comments.

  • Cutts: We Will Give More Info In Link Messages Over Time

    Google’s Matt Cutts says webmasters can expect the company to expand the amount of info it provides in Webmaster Tools messages related to manual webspam actions. Cutts put out a new Webmaster Help video today discussing the topic, in response to the question:

    Will Webmaster Tools ever tell us what links caused a penalty?

    “First off, remember, algorithmic things are just ranking, so they don’t generate messages in Webmaster Console,” Cutts responds. “However, if you log in to the Webmaster Tools Console, and you see that there’s a message, that means that there has been some direct manual action by the web spam team that is somehow directly affecting the ranking of your website. So in those cases, right now, some of those messages have example links or example URLs that are causing issues for us.”

    He continues, “We wouldn’t necessarily say that those are the only things because if you have a million URLs that are offending things, we couldn’t send all million URLs in an email or even a message, because that’s just gonna take too much storage, but we are going to, over time, give more and more information in those messages, and so I wouldn’t be surprised if you see, you know, 1, 2, 3 – some number of example URLs or links that give you an idea of where to look in order to find the sorts of things that are causing that particular action. So, I think that is really useful. We’re going to keep looking at how we can expand the number of example URLs that we include in messages, and I think that will be a great thing for webmasters because then you’ll have a really good idea about where to go and look in order to help diagnose what the issue is.”

    This is actually the second time Cutts has discussed this topic in a Webmaster Help video this month. Back on the 15th, Google released a video in which he also said they’d try to get more examples of bad links in messages to webmasters.

    You can check that out here, if you want to see exactly what he said then.

  • Google Makes Navigation Changes To Webmaster Tools

    Google has launched a new navigation design for Webmaster Tools in an effort to make frequently used features easier to access.

    The design organizes features into groups that match “the stages of search,” as Google puts it. These are: Crawl, Google Index, Search Traffic and Search Appearance.

    Crawl shows you info about how Google discovers and crawls your content, including crawl stats, crawl errors, URLs that are blocked from crawling, sitemaps, URL parameters and the Fetch as Google feature.

    Google Index shows how many pages you have in Google’s index, and lets you monitor the overall indexed counts for your site and see what keywords Google has found on your pages. From here, you can also request to remove URLs from search results.

    Search Traffic lets you check how your pages are doing in search results, how people find your site, and links to your site. Here, you can also see a sample of pages from your site that have incoming links from other internal pages.

    Finally, Search Appearance includes the Structured Data dashboard, Data highlighter, Sitelinks and HTML improvements.

    Admin tasks (at the account level) are found under the gear icon in the corner. This includes things like Webmaster Tools Preferences, Site Settings, Change of Address, Google Analytics Property, Users & Site Owners, Verification Details and Associates.

    “This is the list of items as visible to site owners; ‘full’ or ‘restricted’ users will see a subset of these options,” says Google Webmaster Trends Analyst Mariya Moeva. “For example, if you’re a ‘restricted’ user for a site, the ‘Users & Site Owners’ menu item will not appear.”

    There’s also a new Search Appearance pop-up, which shows how your site may appear in search, and gives more info about the content or structure changes that could influence each element. This pop-up is accessible via the question mark icon.

  • Google Webmaster Tools API To Soon Let You Retrieve ‘Search Queries’ And ‘Backlinks’ Data

    Today’s Webmaster Help video from Google includes something of an announcement. Matt Cutts reveals that the company is working on some upcoming changes to the Webmaster Tools API.

    The video is a response to the user-submitted question:

    Can you tell us if Webmaster Tools will ever have an update to its API allowing us to retrieve the “Search Queries” and “Backlinks” data?

    “The answer is yes,” says Cutts. “That’s the short answer. The longer answer is, we’re working on it, and we hope to release it relatively soon. In the meantime, right now, there are PHP – there’s Python libraries that you can use to download the data, and so in the description (the meta information for this video) will include a few links where you can go and download toolkits that will allow you to get access to the data right now.”

    Here’s the link the description provides for downloading Search Queries data using Python. Here’s the one for PHP (not an official Google project).
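
    Until that API update arrives, pulling Search Queries data programmatically means scripting the CSV export with one of those toolkits yourself. Here is a rough Python sketch of the general approach; the export URL, parameters, and token handling below are placeholders standing in for whatever the toolkit you download actually provides, not an official Google API.

      import csv
      import io

      import requests

      # Placeholder endpoint and token: substitute whatever the Python/PHP
      # toolkit linked in the video description actually uses.
      EXPORT_URL = "https://www.example.com/webmaster-tools/search-queries.csv"
      AUTH_TOKEN = "your-token-here"

      def download_search_queries(site_url):
          """Fetch the Search Queries CSV export for a verified site and parse it into rows."""
          response = requests.get(
              EXPORT_URL,
              params={"site": site_url},
              headers={"Authorization": "Bearer " + AUTH_TOKEN},
          )
          response.raise_for_status()
          return list(csv.reader(io.StringIO(response.text)))

      for row in download_search_queries("http://www.example.com/"):
          print(row)  # e.g. query, impressions, clicks, CTR, average position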

    “And we’re gonna keep looking at how we can improve our API to get you more and more information over time,” Cutts says in the video’s conclusion.

    No time frame is given for when we might see an update to the API, but it sounds like it’s not too far off.

  • Google Kills ‘Links’ In Its Ranking Message To Webmasters

    Google has a help center article in Webmaster Tools specifically about “ranking”. It’s not incredibly informative, and certainly doesn’t walk you through the more than 200 signals Google uses. It’s just a few sentences of advice, including links to Google’s Webmaster Academy and the “How Google Search Works” page.

    Internet marketer Erik Baemlisberger spotted a change (via Search Engine Land) in what little wording there is, however, and it’s actually somewhat noteworthy.

    The wording used to be: “In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages.”

    Now, it says, “In general, webmasters can improve the rank of their sites by creating high-quality sites that users will want to use and share.”

    Google has removed the word “link,” presumably to play down the importance of links in its algorithm. This doesn’t mean that links are less important. High-quality links are likely still a major signal, but by de-emphasizing the word link (or removing it altogether), Google probably hopes to cut down on people engaging in link schemes and paid links – things Google has been cracking down on more than ever over the past year or so.

  • Google Tests Structured Data Error Reporting In Webmaster Tools

    Google’s Matt Cutts revealed in a Q&A at SMX Advanced on Tuesday night that Google is rolling out a test of a “structured data dashboard,” according to Search Engine Land. Barry Schwartz writes that he announced “a new beta application is testing within Google Webmaster Tools named the Structured Data Dashboard”.

    Google actually announced the launch of the Structured Data Dashboard in Webmaster Tools last July. Our coverage of that is here.

    Search Engine Land provides a link to an application for those who want to test the new tool. It appears that the test is just for “structured data error reporting”.

    Error Reporting

    Presumably, this is part of the dashboard announced last year.

    A couple weeks ago, Google launched some new tools for webmasters to provide it with structured data from their sites. They added support for new types of data with the Data Highlighter and launched the Structured Data Markup Helper.

  • Google Announces Opt-Out Tool To Keep Content Out Of Its Specialized Search Engines

    Google has launched a new way for sites to opt out of having their content show up in Google Shopping, Advisor, Flights, Hotels, and Google+ Local search.

    Matt Cutts announced the feature in a very brief post on the Google Webmaster Central blog, saying, “Webmasters can now choose this option through our Webmaster Tools, and crawled content currently being displayed on Shopping, Advisor, Flights, Hotels, or Google+ Local search pages will be removed within 30 days.”

    This is obviously not a feature that Google would want a ton of people to use, because the less content that appears in these services, the less useful they are. Perhaps that’s why Cutts hasn’t tweeted about the tool (maybe not, but perhaps). At least with the short announcement, they have something they can point to.

    The feature is a direct response to an investigation by the Federal Trade Commission. When Google settled with the FTC, one of the voluntary concessions Google made was a feature that would let sites opt out of Google’s specialized search engines.

    As Danny Sullivan notes, the feature doesn’t let you choose which search engines you wish to opt out of. If you use the feature, you’re opting out of all of those mentioned.

    On a help page, Google says, “This opt-out option currently applies only to services hosted on google.com and won’t apply to other Google domains.”

  • Google Talks WMT Search Queries [Video]

    Google has released a new video featuring Maile Ohye from the webmaster support team, who talks about the “Search Queries” feature in Webmaster Tools, and how you can use it to “improve your site”.

    The video discusses the vocabulary of the feature – things like impressions, average position (only the top-ranking URL for the user’s query is factored in), clicks, and CTR. It also walks through steps for investigating top queries and top pages.

    Last month, Ohye spoke about site verification in another video. Watch that here.

  • Googler Talks About Not Reindexing Pages

    Ever had a problem with Google indexing your pages? If there’s no real content on them, then that’s probably why.

    Barry Schwartz at Search Engine Roundtable points to a Google Webmaster Central help thread, where one person says they had over 2,000 pages indexed on Webmaster Tools, but that the number went down to 60, and eventually back up to 116.

    “When I google: ‘site:www.gamez4you.com’ I see all the indexed pages correctly,” the webmaster says. “I suspect that is the reson my Adsense CPC went down by 60-70%.”

    Google’s Gary Illyes responded, indicating that it was a lack of content on the pages that was the problem. Here’s his full response:

    As we improve our algorithms, they may decide to not reindex pages that are likely to be not useful for the users. I took a look on the pages that were once indexed but currently aren’t and it appears there are quite a few that have no real content; for example http://www.gamez4you.com/car-games/play-crazy-mustang seems to be a soft error page, which means that even though it comes back with 200 HTTP status code, it is in fact an error page that shouldn’t be indexed. Another example would be http://www.gamez4you.com/all-games/page-71 which seems to be an empty page.

    Another reason you may see the number of pages dropping in the Webmaster Tools’ Sitemaps module is that your Sitemap is referencing URLs that are not canonical. For example in your Sitemap you reference
    however the currently canonical URL of that particular URL is
    To fix the indexed urls count, I would recommend fixing canonicalization of the URLs one your site and to set your server to return proper status codes (e.g. 404) for inexistent URLs. You can read more about canonicalization at http://support.google.com/webmasters/bin/answer.py?answer=139394 and about soft error pages at http://support.google.com/webmasters/bin/answer.py?answer=181708
    Hope this helps

    Another webmaster in the thread indicates that he had a similar problem with his Sitemap referencing URLs that aren’t canonical.
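
    To illustrate the fixes Illyes suggests (this snippet is not from the thread, and example.com is a placeholder): duplicate or parameterized URLs should declare one preferred version, for instance with a rel="canonical" link element, and URLs that no longer exist should return a real 404 status rather than a 200 with an empty page.

      <!-- In the <head> of each duplicate or parameterized URL,
           point at the preferred (canonical) version of the page -->
      <link rel="canonical" href="http://www.example.com/all-games/page-7">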

  • Google Gives Webmasters Alternative To Markup With Data Highlighter

    Google announced the launch of Data Highlighter in Webmaster Tools, a tool for event data (and soon other types of data) that gives webmasters an alternative to having to mark up their sites. It lets webmasters show Google pieces of data on a typical event page on their site for use in Google’s structured data offerings in search results (like rich snippets and event calendars), by simply pointing and clicking.

    “If your page lists multiple events in a consistent format, Data Highlighter will ‘learn’ that format as you apply tags, and help speed your work by automatically suggesting additional tags,” explains product manager Justin Boyan. “Likewise, if you have many pages of events in a consistent format, Data Highlighter will walk you through a process of tagging a few example pages so it can learn about their format variations. Usually, 5 or 10 manually tagged pages are enough for our sophisticated machine-learning algorithms to understand the other, similar pages on your site.”

    Data Highlighter

    “When you’re done, you can review a sample of all the event data that Data Highlighter now understands,” he adds. “If it’s correct, click ‘Publish.’”

    After you do all of this, Google will recognize your latest event listings and make them eligible for enhanced search results anytime it crawls your site.

    The tool can be found under “Optimization” in Webmaster Tools. For now, it’s only available in English, but more languages, as well as data types, will be added soon.

  • Google Releases HTML Meta Tag Verification Video

    The latest Webmaster Help video from Google does not come from Matt Cutts, but Maile Ohye. It’s about HTML meta tag verification in Webmaster Tools.

    “Verifying ownership of your site in Webmaster Tools provides you and Google a secure channel for giving and receiving information,” she says. “Including a specific HTML tag is only one of several ways you can verify ownership of your site. It’s a great option if you’re familiar with HTML code like head, title and body tags, and you know where and how to edit this content. Or, if you’re using a content management system (CMS) like Google Sites, be sure to follow any specific instructions from your provider.”

    “If HTML tags look and sound like gibberish, then perhaps give another one of the verification methods a try,” she suggests.
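
    For reference, the tag she is describing is a single line that Webmaster Tools generates for you and that you paste into the page’s head; the content value below is just a placeholder.

      <head>
        <title>Example Site</title>
        <!-- Verification tag from Webmaster Tools; the token is a placeholder -->
        <meta name="google-site-verification" content="your-unique-token-here">
      </head>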

    As much as we love Cutts’ videos, it’s nice to see some of the rest of the team take on subjects from time to time.

  • Cutts Does Customer Support In Latest Webmaster Video

    Google has put out a new webmaster help video with Matt Cutts. The basis of the user-submitted question isn’t even accurate, and Cutts still took the time to make the video and answer it. This goes to show that there is a solid chance that Google will answer your questions when you send them.

    The question was:

    The Webmaster Tools “Fetch as Googlebot” feature does not allow one to fetch an https page, making it not very useful for secure sites – any plans to change that?

    “So, we just tried it here, and it works for us,” said Cutts. “You have to register, and prove that you own the https site, just like you do with an http site. Once you’ve proven that you control or verify that you are able to control that https page, you absolutely can fetch. You need to include the protocol when you’re doing the fetch, but it should work just fine. If it doesn’t work, maybe show up in the webmaster forum, and give us some feedback, but we just tried it on our side, and it looks like it’s working for us.”

    How’s that for customer support?

    They must be getting close to the end of this batch of videos.

  • Hall Of Famer Matt Cutts On Why Google Doesn’t Provide An SEO Quality Calculator

    Google’s Matt Cutts, who on Friday was inducted into the University of Kentucky’s Hall of Fame here in Lexington, has posted a new Webmaster Help video answering a question about why Google doesn’t provide some kind of “SEO quality calculator”. The exact question was:

    Why doesn’t Google release an SEO quality check up calcualtor? This will help people to optimize their websites in the right direction. Is Google willing to do this or it just wants people to keep guessing what’s wrong with their websites?

    He talks about how much Google does to provide Webmasters with help via Webmaster Tools and the notifications it sends out.

    Then, he says, “If we were to just give an exact score…so back in the early days, InfoSeek would actually like let you submit a page, see immediately where it was ranking, and let you submit another page, and there are stories that have lasted since then about how people would just spend their entire day spamming InfoSeek, tweaking every single thing until they got exactly the right template that would work to rank number one. So we don’t want to encourage that kind of experimentation, and we know that if we give exact numbers and say, ‘Okay, this is how you’re ranking on this particular sort of algorithm or how you rank along this axis,’ people will try to spam that.”

    “But what we do want to provide is a lot of really useful tools for people that are doing it themselves – mom and pops – people who are really interested and just want to dig into it – agencies who want to have more information so that they can do productive optimization – all that sort of stuff,” he continues.

    “We don’t want to make it easy for the spammers, but we do want to make it as easy as possible for everybody else,” he adds. “There’s inherently a tension there, and we’re always trying to find the features that will help regular people while not just making it easy to spam Google.”

    Of course, it’s getting harder to get on the front page of results on Google anyway, because of all of the other elements they’re adding to the SERPs and the reduced number of organic results appearing on an increasing number of them.

  • Will Google’s Link Disavow Tool Come Back To Haunt Webmasters?

    Back in June, during the height of the Penguin update freakout, Google’s Matt Cutts hinted that Google would launch a “link disavow” tool, so that webmasters can tell Google the backlinks they want Google to ignore. This means links from around the web that are potentially hurting a site’s rankings in Google could be ignored, and no longer count against the site in question. This is something that many webmasters and SEOs have wanted for a long time, and especially since the Penguin update launched earlier this year. On Tuesday, Google made these dreams come true by finally launching the tool, after months of anticipation.

    Is it what you hoped it would be? Do you intend to use it? Let us know in the comments.

    How It Works

    The tool tells users, “If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site.”

    It is worth noting, however, that using the tool and telling Google to ignore certain links is no guarantee that Google will listen. It’s more of a helpful suggestion. Google made this clear in the Q&A section of the blog post announcing the tool.

    “This tool allows you to indicate to Google which links you would like to disavow, and Google will typically ignore those links,” Google Webmaster Trends Analyst Jonathan Simon says. “Much like with rel=’canonical’, this is a strong suggestion rather than a directive—Google reserves the right to trust our own judgment for corner cases, for example—but we will typically use that indication from you when we assess links.” He adds:

    If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.

    If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page.

    With the tool, you simply upload a .txt file containing the links you want Google to disavow. You add one URL per line. You can block specific URLs or whole domains. To block a domain, use this format: domain:example.com. You can add comments by including a # before them. Google ignores the comments. The file size limit is 2MB.
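
    Here is roughly what such a file might look like (the domains, URLs, and date are placeholders, not taken from Google’s post):

      # Contacted the owner of spamdomain1.com on 10/1/2012 to request removal; no response
      domain:spamdomain1.com

      # Individual pages with paid links we could not get taken down
      http://www.spamdomain2.com/page-with-paid-link.html
      http://www.spamdomain2.com/directory/listing-42.html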

    If you haven’t watched it yet, watch Matt Cutts’ video explaining the tool. If it’s something you’re considering using, it’s definitely worth the ten minutes of your time:

    Cutts warns repeatedly that most people will not want to use this tool, and you should really only use it if you’ve already tried hard to get the questionable links removed, but haven’t been able to get it done. For more details and minutia about how this tool works, there is a whole help center article dedicated to it.

    Negative SEO

    Negative SEO, a practice in which competitors attack a site with spammy links and the like, has been debated for a long time, and many will see this tool as a way to eliminate the effects of such attacks. Google has specifically responded to this.

    “The primary purpose of this tool is to help clean up if you’ve hired a bad SEO or made mistakes in your own link-building,” says Simon. “If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you’re unable to get a few backlinks taken down, that’s a good time to use the Disavow Links tool.”

    “In general, Google works hard to prevent other webmasters from being able to harm your ranking,” he adds. “However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.”

    Cutts also talked about the subject at PubCon, where the tool was announced. Search Engine Roundtable has a liveblogged account of what he said, which reads:

    All the negative SEO complaints he sees, or most of it, is really not negative SEO hurting you. It is a much better use of your time to make your site better vs hurting someone else. At the same time, we’ve seen cases of this as an issue. I.e. buying a new domain and needing to clean up that site. There are people who want to go through this process. Plus SEOs that take on new clients that went through bad SEOs.

    Warnings And Overreaction

    Again, you don’t want to use the tool in most cases. It’s pretty much a last-resort tactic for links you’re positive are hurting you and can’t get removed otherwise. Google has warned repeatedly about this, as over-use of the tool can lead to webmasters shooting themselves in the foot. If you use it willy-nilly, you may be hurting your site by getting rid of links that were actually helping you in the first place.

    It seems like common sense, but ever since the Penguin update, we’ve seen plenty of examples of webmasters frantically trying to get links removed that even they admit they would like to keep, if not for fear that Google might frown upon them (when, in reality, Google likely did not).

    Aaron Wall from SEOBook makes some other interesting points on the warnings front. He writes:

    The disavow tool is a loaded gun.

    If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.

    Could the use of the tool be seen as an admission of guilt? Matt gives examples of “bad” webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.

    Google Wants To Depend More On Social And Authorship

    If overreaction is an issue, and it seems fairly likely that it will be, despite Google’s warnings, this tool could really mess with how Google treats links, which have historically been the backbone of its algorithm.

    “Links are one of the most well-known signals we use to order search results,” says Simon. “By looking at the links between pages, we can get a sense of which pages are reputable and important, and thus more likely to be relevant to our users. This is the basis of PageRank, which is one of more than 200 signals we rely on to determine rankings. Since PageRank is so well-known, it’s also a target for spammers, and we fight linkspam constantly with algorithms and by taking manual action.”

    It will be interesting to see how Google treats the links webmasters tell it to ignore, which are not actually hurting them in the first place. I would not be surprised to see some in the industry test Google on this.

    Google does not like it when people manipulate the way it counts links, yet they’ve just given webmasters a tool to do so, even if it’s kind of the opposite of the black hat techniques Google has always tried to eliminate (link schemes, paid links, etc.). Now (and we’ve seen this even before the tool existed), you potentially have webmasters trying to get rid of links that actually do have value, even in Google’s eyes. I mean, seriously, what are the odds that this tool will be used 100% how Google intends it to be used, which is apparently in rare circumstances?

    Google seems to be grooming other signals to play a greater role in the algorithm. While they’re not there yet, based on various comments the company has made, social signals will almost certainly play an increasingly weighty role. CEO Larry Page was asked about this at a conference this week.

    He responded, “I think it’s really important to know, again, who you’re with, what the community is – it’s really important to share things. It’s really important to know the identity of people so you can share things and comment on things and improve the search ecosystem, you know, as you – as a real person…I think all those things are absolutely crucial.”

    “That’s why we’ve worked so hard on Google+, on making [it] an important part of search,” he continued. “Again, like Maps, we don’t see that as like something that’s like a separate dimension that’s never going to play into search. When you search for things, you want to know the kinds of things your friends have looked at, or recommended, or wrote about, or shared. I think that’s just kind of an obvious thing.”

    “So I think in general, if the Internet’s working well, the information that’s available is shared with lots of different people and different companies and turned into experiences that work well for everyone,” he said. “You know, Google’s gotten where it is by searching all the world’s information, not just a little bit of it, right? And in general, I think people have been motivated to get that information searchable, because then we deliver users to those people with information.”

    “So in general, I think that’s the right way to run the Internet as a healthy ecosystem,” Page concluded. “I think social data is obviously important and useful for that. We’d love to make use of that every way we can.”

    As Google says, links are a direct target for manipulation, and social could be harder to fake (though there are certainly attempts, and there will be plenty more).

    Another difficult signal to fake is authorship, which is why Google is really pushing for that now. In a recent Google+ Hangout, Matt Cutts said of authorship, “Sometimes you’ll have higher click through, and people will say, ‘Oh, that looks like a trusted resource.’ So there are ways that you can participate and sort of get ready for the longer term trend of getting to know not just that something was said, but who said it and how reputable they were.”

    “I think if you look further out in the future and look at something that we call social signals or authorship or whatever you want to call it, in ten years, I think knowing that a really reputable guy – if Dan has written an article, whether it’s a comment on a forum or on a blog – I would still want to see that. So that’s the long-term trend,” he said.

    “The idea is you want to have something that everybody can participate in and just make these sort of links, and then over time, as we start to learn more about who the high quality authors are, you could imagine that starting to affect rankings,” he pointed out.

    So here you have Google (Matt Cutts specifically) telling you that authorship is going to become more important, and that you probably shouldn’t even use the new link-related tool that the company just launched.

    Danny Sullivan asked Cutts, at PubCon, why Google doesn’t simply discount bad links to begin with, rather than “considering some of them as potentially negative votes.”

    “After all, while it’s nice to have this new tool, it would be even better not to need it at all,” he writes. Cutts did not really answer that question.

    Why do you think Google does not do as Danny suggests, and simply ignore the bad links to begin with? Do you think social and authorship signals will become more important than links? Share your thoughts about Google’s ranking strategy and the new tool in the comments.

    Lead Image: The Shining (Warner Bros.)

  • Google’s Link Disavow: Google Answers Domain Related Questions

    Google launched its Link Disavow tool today. If you haven’t read about it yet, you can do so here.

    There are a few things Google mentions at the end of its blog post that I think are worth highlighting, with regard to international domains, subdomains, and www vs. non-www.

    Google ends its announcement with a Q&A section, and the last few are about these items. Here is what Google says:

    Q: Do I need to disavow links from example.com and example.co.uk if they’re the same company?
    A: Yes. If you want to disavow links from multiple domains, you’ll need to add an entry for each domain.

    Q: What about www.example.com vs. example.com (without the “www”)?
    A: Technically these are different URLs. The disavow links feature tries to be granular. If content that you want to disavow occurs on multiple URLs on a site, you should disavow each URL that has the link that you want to disavow. You can always disavow an entire domain, of course.

    Q: Can I disavow something.example.com to ignore only links from that subdomain?
    A: For the most part, yes. For most well-known freehosts (e.g. wordpress.com, blogspot.com, tumblr.com, and many others), disavowing “domain:something.example.com” will disavow links only from that subdomain. If a freehost is very new or rare, we may interpret this as a request to disavow all links from the entire domain. But if you list a subdomain, most of the time we will be able to ignore links only from that subdomain.

    To disavow an entire domain, you’ll want to use a format like: domain:www.example.com.
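
    Putting those answers together, the relevant lines of a disavow file might look like this (all of the names are placeholders):

      # Same company, but each domain needs its own entry
      domain:example.com
      domain:example.co.uk

      # Disavow only links from a single freehost subdomain
      domain:something.freehost-example.com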

    Here’s what Google says about the Link Disavow tool and negative SEO.

  • Google’s Link Disavow Tool And Negative SEO

    In case you haven’t heard yet, Google finally released its long-awaited Link Disavow tool. You can get more details about it here.

    In a blog post about the tool, Google includes a Q&A section. One of the questions in that is: Can this tool be used if I’m worried about “negative SEO”? Here is Google’s official response to that:

    The primary purpose of this tool is to help clean up if you’ve hired a bad SEO or made mistakes in your own link-building. If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you’re unable to get a few backlinks taken down, that’s a good time to use the Disavow Links tool.

    In general, Google works hard to prevent other webmasters from being able to harm your ranking. However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.

    Negative SEO also came up during the PubCon session in which Google’s Matt Cutts revealed the tool. Barry Schwartz at Search Engine Roundtable has a liveblog from that. Here is his account of what Cutts had to say about the subject:

    All the negative SEO complaints he sees, or most of it, is really not negative SEO hurting you. It is a much better use of your time to make your site better vs hurting someone else. At the same time, we’ve seen cases of this as an issue. I.e. buying a new domain and needing to clean up that site. There are people who want to go through this process. Plus SEOs that take on new clients that went through bad SEOs.

    Remember, Google updated the language of a help center article addressing negative SEO, seemingly indicating that it is possible.

  • Google Webmaster Tools Will Now Email You About Critical Site Issues

    Google announced today that Webmaster Tools will start letting you know when it discovers critical issues with your site, by sending you an email with more info. The company says it will only notify you about issues that it thinks have a significant impact on your site’s health or search performance and which have clear actions you can take to address the issue.

    “For example, we’ll email you if we detect malware on your site or see a significant increase in errors while crawling your site,” says Google Webmaster Trends analyst John Mueller.

    “For most sites these kinds of issues will occur rarely,” he adds. “If your site does happen to have an issue, we cap the number of emails we send over a certain period of time to avoid flooding your inbox. If you don’t want to receive any email from Webmaster Tools you can change your email delivery preferences.”

    This is just the latest in a series of new alerts from Webmaster Tools. Last month, Google launched alerts for Search Queries data to complement the Crawl Errors alerts it began sending out before that.

    Webmaster Tools also recently started sharing more detailed Site Error info, such as stats for each site-wide crawl error from the past ninety days. It also shows failure rates for category-specific errors. More on that here.

  • Google Gives You More Site Error Details In Webmaster Tools

    Google is now sharing more detailed Site Error info in Webmaster Tools. Site Errors will now display stats for each site-wide crawl error from the past ninety days. It will also show failure rates for category-specific errors.

    “This information is useful when looking for the source of your Site Errors,” Google says in a blog post. “For example, if your site suffers from server connectivity problems, your server may simply be misconfigured; then again, it could also be completely unavailable! Since each Site Error (DNS, server connectivity, and robots.txt fetch) is comprised of several unique issues, we’ve broken down each category into more specific errors to provide you with a better analysis of your site’s health.”

    Webmaster Tools site errors

    Users can hover over any of the entries in the legend to get an explanation of the errors. When you do so, there will be a “more info” link.

    Google recently added alerts for Search Queries data in Webmaster Tools, as well.

  • Bing Webmaster Tools Changes Site Activity Calculations

    Bing announced a change to its Webmaster Tools related to how it calculates the “Change” figures in the Site Activity widget.

    Vincent Wehren explains on Bing’s Webmaster Center blog, “For the metrics Clicks from search, Appeared in search, Pages crawled, and Crawl errors we will now show you the percent change for the selected date range (as selected in the date selector near the top of the dashboard) compared to the prior date range of the same length. So when you log into your Bing Webmaster Tools account and are looking at changes for say a 30-day period in your Dashboard (which is the default setting), the percentage in the Site Activity widget will reflect your relative increase or decrease compared to the prior 30-day period.”
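
    In other words, for each metric the widget compares the total for your selected date range against the total for the immediately preceding range of the same length. A minimal sketch of that comparison, with made-up numbers (illustrative only, not Bing’s code):

      def percent_change(current_total, prior_total):
          """Relative change of the selected period vs. the prior period of equal length."""
          if prior_total == 0:
              return None  # no baseline to compare against
          return (current_total - prior_total) / prior_total * 100.0

      # e.g. 1,200 clicks in the last 30 days vs. 1,000 in the 30 days before that
      print(percent_change(1200, 1000))  # 20.0 (%)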

    Bing Webmaster Tools will also show you the absolute numbers for the aforementioned metrics for both the current and prior period:

    Bing webmaster tools changes

    “It’s worth noting that since we store up to 6 months’ worth of data in Bing Webmaster Tools, we are — as a consequence — able to show % change information for periods up to three months counting back from the last available date,” says Wehren. “For shorter date ranges we can show change information as long as they and their prior period fall within those 6 months of data we store. In other words, you are not limited to just seeing change information for say, just the last 30 days.”

    Bing Webmaster Tools will only provide data for your site from the day your site was registered, of course.

    Bing notes that the changes to the Site Activity widget also apply to change info shown on the My Sites page.

  • Google Webmaster Tools Gets Alerts For Search Queries Data

    Google announced today that it is adding alerts for Search Queries data to Webmaster Tools to complement its recently rolled out Crawl Errors alerts.

    You can get the alerts forwarded to your inbox if you sign up for email forwarding in Webmaster Tools.

    Search Queries Alerts

    “We know many of you check Webmaster Tools daily (thank you!), but not everybody has the time to monitor the health of their site 24/7,” says Webmaster Tools tech lead Javier Tordable in a blog post. “It can be time consuming to analyze all the data and identify the most important issues. To make it a little bit easier we’ve been incorporating alerts into Webmaster Tools. We process the data for your site and try to detect the events that could be most interesting for you.”

    “The Search Queries feature in Webmaster Tools shows, among other things, impressions and clicks for your top pages over time,” says Tordable. “For most sites, these numbers follow regular patterns, so when sudden spikes or drops occur, it can make sense to look into what caused them. Some changes are due to differing demand for your content, other times they may be due to technical issues that need to be resolved, such as broken redirects. For example, a steady stream of clicks which suddenly drops to zero is probably worth investigating.”

    He also notes that Google is still working on the sensitivity threshold for the messages.

  • Watch This New Google Video About URL Parameters

    Google has released a new video about configuring URL parameters in Webmaster Tools. It’s about 15 minutes long, and features Google Developer Programs Tech Lead Maile Ohye discussing how to help Google crawl your site more efficiently and manage a site with URL parameters.

    “URL Parameter settings are powerful,” says Ohye. “By telling us how your parameters behave and the recommended action for Googlebot, you can improve your site’s crawl efficiency. On the other hand, if configured incorrectly, you may accidentally recommend that Google ignore important pages, resulting in those pages no longer being available in search results.”
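
    As a quick illustration of what she means (these URLs are placeholders, not from the video): URL parameters are the key=value pairs after the “?”, and some of them, such as session IDs, do not change the page content at all. Telling Google how such parameters behave keeps Googlebot from crawling what is effectively the same page over and over.

      # Both URLs serve the same product listing; "sessionid" does not affect the content
      http://www.example.com/products?category=shoes&sessionid=12345
      http://www.example.com/products?category=shoes&sessionid=67890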