WebProNews

Tag: John Mueller

  • Google Webmaster Tools ‘Search Queries’ Feature Gets Some New Tweaks

    Google has announced a couple of changes to the Search Queries feature in Webmaster Tools, improving stats for mobile sites and getting rid of rounding.

    For webmasters who manage mobile sites on separate URLs from the desktop versions (like m.example.com), Google will now show queries where the m. pages appeared in results for mobile browsers and queries where Google applied Skip Redirect.


    “This means that, while search results displayed the desktop URL, the user was automatically directed to the corresponding m. version of the URL (thus saving the user from latency of a server-side redirect),” explains developer programs tech lead Maile Ohye. “Prior to this Search Queries improvement, Webmaster Tools reported Skip Redirect impressions with the desktop URL. Now we’ve consolidated information when Skip Redirect is triggered, so that impressions, clicks, and CTR are calculated solely with the verified m. site, making your mobile statistics more understandable.”
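For sites that serve mobile users from separate URLs, the redirect Ohye refers to is the usual server-side hop from the desktop page to its m. counterpart. Here's a minimal sketch of what that setup looks like (the user-agent tokens, domain names, and WSGI framing are illustrative assumptions, not how Google or any particular site implements it):

```python
# Sketch of a server-side mobile redirect for a separate-URL
# (m.example.com) configuration. The user-agent check and domains
# are assumptions for illustration only.

MOBILE_TOKENS = ("Mobile", "Android", "iPhone")

def app(environ, start_response):
    ua = environ.get("HTTP_USER_AGENT", "")
    host = environ.get("HTTP_HOST", "www.example.com")
    path = environ.get("PATH_INFO", "/")
    if host.startswith("www.") and any(t in ua for t in MOBILE_TOKENS):
        # 302 to the equivalent m. URL; Vary tells caches that the
        # response depends on the requesting user agent.
        start_response("302 Found", [
            ("Location", "https://m." + host[len("www."):] + path),
            ("Vary", "User-Agent"),
        ])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<p>desktop page</p>"]
```

Skip Redirect saves mobile searchers the latency of that round trip by sending them straight to the m. URL from the results page.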

    The other change, which lets users see search query data without rounding, will become visible in Webmaster Tools over the next few days.

    “We hope this makes it easier for you to see the finer details of how users are finding your website, and when they’re clicking through,” says Google webmaster trends analyst John Mueller.

    We wonder if these tweaks are related to Google’s recent call for ideas from users for Webmaster Tools improvements.

    Image: Google

  • Google Admits Link Mistake, Probably Won’t Help Webmaster Link Hysteria

    Google is apparently getting links wrong from time to time. By wrong, we mean giving webmasters example links (in unnatural link warning messaging) that are actually legitimate, natural links.

    It’s possible that the instances discussed here are extremely rare cases, but how do we know? It’s concerning that we’re seeing these stories appear so close together. Do you think this is an issue that is happening a lot? Let us know in the comments.

    A couple weeks ago, a forum thread received some attention when a webmaster claimed that this happened to him. Eventually Google responded, not quite admitting a mistake, but not denying it either. A Googler told him:

    Thanks for your feedback on the example links sent to you in your reconsideration request. We’ll use your comments to improve the messaging and example links that we send.

    If you believe that your site no longer violates Google Webmaster Guidelines, you can file a new reconsideration request, and we’ll re-evaluate your site for reconsideration.

    Like I said, not exactly an admission of guilt, but it pretty much sounds like they’re acknowledging the merit of the guy’s claims, and keeping these findings in mind to avoid making similar mistakes in the future. That’s just one interpretation, so do with that what you will.

    Now, however, we see a Googler clearly admitting a mistake, after Google provided a webmaster with one of those example URLs for a DMOZ link. Barry Schwartz at Search Engine Roundtable, who pointed out the other thread initially, managed to find this Google+ discussion from even earlier.

    Dave Cain shared the message he got from Google, which included the DMOZ link, and tagged Google’s Matt Cutts and John Mueller in the post. Mueller responded, saying, “That particular DMOZ/ODP link-example sounds like a mistake on our side.”

    “Keep in mind that these are just examples — fixing (or knowing that you can ignore) one of them, doesn’t mean that there’s nothing else to fix,” he added. “With that in mind, I’d still double-check to see if there are other issues before submitting a reconsideration request, so that you’re a bit more certain that things are really resolved (otherwise it’s just a bit of time wasted with back & forth).”

    Cain asked, “Because of the types of links that were flagged in the RR response (which appear to be false negatives, i.e. DMOZ/ODP), would it be safe to assume that the disavow file wasn’t processed with the RR?”

    Mueller said that “usually” submitting both at the same time is no problem, adding, “So I imagine it’s more a matter of the webspam team expecting more.”

    It’s a good thing Mueller did suggest that Google made a mistake, given the link in question was from DMOZ. There are a lot of links in DMOZ, and that could have created another wave in the ocean of link hysteria. Directories in general have already seen a great deal of requests for link removals.

    Here’s a video from a couple summers ago with Cutts giving an update on how Google thinks about DMOZ.

    Cutts, of the webspam team, did not weigh in on Cain’s conversation with Mueller (which took place on August 20th).

    Mistakes happen, and Google is not above that. However, seeing one case where Google is openly admitting a mistake so close to another case where it looks like they probably also made a mistake is somewhat troubling, considering all the hysteria we’ve seen over linking over the past year and a half.

    It does make you wonder how often it’s happening.

    Update: Just got a tweet from Cutts on the matter:

    Do you think these are most likely rarities, or do you believe Google is getting things wrong often? Share your thoughts.

    Image: Google

  • Do You Follow Google’s Rules On Guest Posts?

    Google’s view of guest blog posts has come up in industry conversation several times this week. Webmasters and marketers have long engaged in the practice of writing articles for third-party sites as a content marketing strategy. Some have taken it to extremes in the name of “SEO,” but regardless of how hard you’re pushing for a boost in PageRank from these articles, you might want to consider what Google has been saying about the matter.

    Do you write guest posts for other sites? Include guest posts on your site? Are you hoping to just provide good content or are you looking for linkjuice to help your Google rankings? Let us know in the comments.

    As far as I can tell this week’s conversation started with an article at HisWebMarketing.com by Marie Haynes, and now Google’s Matt Cutts has been talking about it in a new interview with Eric Enge.

    Haynes’ post, titled “Yes, high quality guest posts CAN get you penalized!”, shares several videos of Googlers talking about the subject. The first is an old Matt Cutts Webmaster Help video that we’ve shared in the past.

    In that, Cutts basically said that it can be good to have a reputable, high quality writer do guest posts on your site, and that it can be a good way for some lesser-known writers to generate exposure, but…

    “Sometimes it gets taken to extremes. You’ll see people writing…offering the same blog post multiple times or spinning the blog posts, offering them to multiple outlets. It almost becomes like low-quality article banks.”

    “When you’re just doing it as a way to sort of turn the crank and get a massive number of links, that’s something where we’re less likely to want to count those links,” he said.

    The next video Haynes points to is a Webmaster Central Hangout from February:

    When someone in the video says they submit articles to the Huffington Post, and asks if they should nofollow the links to their site, Google’s John Mueller says, “Generally speaking, if you’re submitting articles for your website, or your clients’ websites and you’re including links to those websites there, then that’s probably something I’d nofollow because those aren’t essentially natural links from that website.”

    Finally, Haynes points to another February Webmaster Central hangout:

    In that one, when a webmaster asks if it’s okay to get links to his site through guest postings, Mueller says, “Think about whether or not this is a link that would be on that site if it weren’t for your actions there. Especially when it comes to guest blogging, that’s something where you are essentially placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a linkbuilding point of view. I think sometimes it can make sense to guest blog on other peoples’ sites and drive some traffic to your site because people really liked what you are writing and they are interested in the topic and they click through that link to come to your website but those are probably the cases where you’d want to use something like a rel=nofollow on those links.”
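In markup terms, what Mueller describes simply means adding rel="nofollow" to the anchor tags in a guest article. As a rough illustration, here's a small helper that tacks the attribute onto links that don't already declare a rel value (a regex-based sketch for brevity; the function name is our own, and a production version should use a real HTML parser):

```python
import re

def nofollow_links(html):
    """Add rel="nofollow" to every <a ...> tag that lacks a rel
    attribute. Simplistic regex sketch for illustration only."""
    def fix(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave existing rel attributes alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", fix, html)
```

Run over a submitted article's body, this leaves the links clickable for readers while telling search engines not to pass PageRank through them.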

    Barry Schwartz at Search Engine Land wrote about Haynes’ post, and now Enge has an interview out with Cutts who elaborates more on Google’s philosophy when it comes to guest posts (among other things).

    Enge suggests that when doing guest posts, you create high-quality articles and get them published on “truly authoritative” sites that have a lot of editorial judgment, and Cutts agrees.

    He says, “The problem is that if we look at the overall volume of guest posting we see a large number of people who are offering guest blogs or guest blog articles where they are writing the same article and producing multiple copies of it and emailing out of the blue and they will create the same low quality types of articles that people used to put on article directory or article bank sites.”

    “If people just move away from doing article banks or article directories or article marketing to guest blogging and they don’t raise their quality thresholds for the content, then that can cause problems,” he adds. “On one hand, it’s an opportunity. On the other hand, we don’t want people to think guest blogging is the panacea that will solve all their problems.”

    Enge makes an interesting point about accepting guest posts too, suggesting that if you have to ask the author to share with their own social accounts, you shouldn’t accept the article. Again, Cutts agrees, saying, “That’s a good way to look at it. There might be other criteria too, but certainly if someone is proud to share it, that’s a big difference than if you’re pushing them to share it.”

    Both agree that interviews are good ways to build links and authority.

    In a separate post on his Search Engine Roundtable blog, Schwartz adds:

    You can argue otherwise but if Google sees a guest blog post with a dofollow link and that person at Google feels the guest blog post is only done with the intent of a link, then they may serve your site a penalty. Or they may not – it depends on who is reviewing it.

    That being said, Google is not to blame. While guest blogging and writing is and can be a great way to get exposure for your name and your company name, it has gotten to the point of being heavily abused.

    He points to one SEO’s story in a Cre8asite forum thread about a site wanting to charge him nearly five grand for one post.

    Obviously this is the kind of thing Google would frown upon when it comes to link building and links that flow PageRank. Essentially, these are just paid links, and even if more subtle than the average advertorial (which Google has been cracking down on in recent months), in the end it’s still link buying.

    But there is plenty of guest blogging going on out there in which no money changes hands. Regardless of your intentions, it’s probably a good idea to just stick the nofollows on if you want to avoid getting penalized by Google. If it’s still something you want to do without the SEO value as a consideration, there’s a fair chance it’s the kind of content Google would want anyway.

    Are you worried that Google could penalize you for writing high quality blog posts for third-party sites? Let us know in the comments.

  • Google Penalizes Mozilla For Web Spam [Updated]

    Update: It turns out that Google only penalized a single page from Mozilla. Matt Cutts weighed in on the “penalty” in that same forum thread (hat tip: Search Engine Land).

    Google has penalized Mozilla.org, the nonprofit site of the organization that provides the Firefox browser. This doesn’t appear to be an accident like what recently happened with Digg. This was a real manual web spam penalty.

    Mozilla Web Production Manager Christopher More posted about it in Google’s Webmaster Help forum (hat tip to Barry Schwartz), where he shared the message he got from Google:

    Google has detected user-generated spam on your site. Typically, this kind of spam is found on forum pages, guestbook pages, or in user profiles. As a result, Google has applied a manual spam action to your site.

    “I am unable to find any spam on http://www.mozilla.org,” said More. “I have tried a site:www.mozilla.org [spam terms] and nothing is showing up on the domain. I did find a spammy page on an old version of the website, but that is 301 redirected to an archive website.”

    Google Webmaster Trends analyst John Mueller responded:

    To some extent, we will manually remove any particularly egregious spam from our search results that we find, so some of those pages may not be directly visible in Google’s web-search anymore. Looking at the whole domain, I see some pages similar to those that Pelagic (thanks!) mentioned: https://www.google.com/search?q=site:mozilla.org+cheap+payday+seo (you’ll usually also find them with pharmaceutical brand-names among other terms).

    In addition to the add-ons, there are a few blogs hosted on mozilla.org that appear to have little or no moderation on the comments, for example http://blog.mozilla.org/respindola/about/ looks particularly bad. For these kinds of sites, it may make sense to allow the community to help with comment moderation (eg. allow them to flag or vote-down spam), and to use the rel=nofollow link microformat to let search engines know that you don’t endorse the links in those unmoderated comments.

    For more tips on handling UGC (and I realize you all probably have a lot of experience in this already) are at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=81749

    Also keep in mind that we work to be as granular as possible with our manual actions. Personally, I think it’s good to react to a message like that by looking into ways of catching and resolving the cases that get through your existing UGC infrastructure, but in this particular case, this message does not mean that your site on a whole is critically negatively affected in our search results.

    Let this be a lesson to all webmasters and bloggers. Keep your comments cleaned up.
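Mueller's advice boils down to screening user-generated comments before their links go live. Here's a toy sketch of that kind of first-pass filter (the spam-term list and link threshold are invented for illustration; real moderation tooling is far more involved):

```python
# Illustrative first-pass comment screen: hold a comment for human
# moderation when it trips simple heuristics. The term list and
# threshold below are assumptions, not any real system's values.

SPAM_TERMS = {"cheap", "payday", "viagra"}
MAX_LINKS = 2

def needs_moderation(comment_text):
    words = comment_text.lower().split()
    link_count = sum(1 for w in words
                     if w.startswith(("http://", "https://")))
    has_spam_term = any(t in words for t in SPAM_TERMS)
    return link_count > MAX_LINKS or has_spam_term
```

Combined with rel=nofollow on the links that do get published, this is roughly the two-layer approach Mueller outlines: catch what you can, and don't endorse what slips through.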

    Mozilla still appears to be showing up in key search results like for “mozilla” and for “web browser”. It’s not as bad as when Google had to penalize its own Chrome browser for paid links.

  • Do Your Blog Comments Have Search Ranking Value?

    When Google unleashed the Panda update, it waged war on “thin” content in its search results. Google wants to provide pages that offer information valuable to searchers, as opposed to content that was hastily thrown together.

    It’s easy to hear “thin” content and associate that with content that is simply short. In other words, you might take this to mean that Google does not like short articles, and would favor a longer article in a case where two such pieces of content are competing for rankings.

    Have you seen search ranking success with short content? Let us know in the comments.

    The fact is, Google may very well favor the longer, more in depth piece, but that does not mean Google will not value a short article.

    In a Google forum thread, a webmaster asked the question: “Is short content = thin content?” As Barry Schwartz at Search Engine Roundtable points out, Google Webmaster Trends Analyst John Mueller weighed in on the discussion. Here’s what he said:

    “Rest assured, Googlebot doesn’t just count words on a page or in an article, even short articles can be very useful & compelling to users. For example, we also crawl and index tweets, which are at most 140 characters long. That said, if you have users who love your site and engage with it regularly, allowing them to share comments on your articles is also a great way to bring additional information onto the page. Sometimes a short article can trigger a longer discussion — and sometimes users are looking for discussions like that in search. That said, one recommendation that I’d like to add is to make sure that your content is really unique (not just rewritten, autogenerated, etc) and of high-quality.” Emphasis added.

    Last year, Google shared a set of questions that one could ask himself when assessing the quality of a page or an article. One of these was: “Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?”

    Shallow does not mean short. The beginning part of that, which talks about experts and enthusiasts, is likely to have a stronger bearing on how Google views the content. Who you are matters to Google. That’s why they’re looking to push authorship as a stronger signal in the future. Length of a specific piece of content is not necessarily as much of a factor.

    Still, that doesn’t mean it’s not a factor. If one piece of content is simply more informative, which it may very well be if it is longer, it might still be the better result, regardless of who you are. There’s still something to be said for a well researched, insightful article. Google is not looking to ignore this kind of content, by any means.

    Another of Google’s questions is: “Does the article describe both sides of a story?” Sometimes, it may take more text to answer that with a yes.

    One thing about Mueller’s comments that strikes me as interesting is the part about comments. In an article a while back, we looked at the SEO value of comments. Blogger Michael Gray, who turned off his comments several years ago, told us, “It was one of the best decisions I made, and regret not doing it sooner.”

    “Does Google take a look at factors like time on site and bounce rate?” he said at the time. “IMHO yes, but you should be looking to increase those with good information, and solid actionable content, not comments. The biggest effect comments have is giving Google a date to show in the SERP’s. This is a huge factor who’s importance can’t be unstated. If I’m looking for how to fix the mouse on my computer, or what dress Angelina Jolie wore to an awards show, having the date show up in the SERP has a lot of value for the user. If I’m looking to learn how to structure a website, the date plays almost no role. The author’s expertise and understanding of information architecture trumps the date.”

    It should be noted that Google’s Matt Cutts has reportedly said since then that Google doesn’t use bounce rate.

    Interestingly, according to Shoemoney blogger Jeremy Schoemaker, who we also spoke with for that particular article, a Google engineer said at the time that, if anything, comments were diluting the quality score of a page, by possibly diluting overall keyword density. There is also the possibility that the few comments that go through that are clearly spam, could send poor quality signals to Google.

    “So he said he did not see a positive to leaving indexable comments on my site,” Schoemaker told us at the time.

    But now, here we have Mueller talking up the value of comments.

    Of course, it’s not as if this is the first time that Google has sent mixed signals to webmasters and content creators. But on the other hand, you can’t really hold every person at Google, speaking candidly, accountable for knowledge about every aspect of how Google works, especially when it comes to the search algorithm – Google’s secret recipe.

    It stands to reason that Google would look at comments in similar fashion to how it views the rest of the content on the page. Some comments are obviously of higher quality than others, even if the spammy ones have been cut out. But if quality is there, Google may just see how such comments could be valuable to users.

    Perhaps webmasters should be more stingy with the comments they allow, but then you’re talking about censorship, which is not necessarily a path you want to travel.

    Do you think comments on your blog have helped or hurt you in search? Do you believe they’ve had any effect at all? Should Google take them into consideration? Tell us what you think.

  • Watch This Google Webmaster Hangout From Monday

    Ever wish you had a chance to sit in on one of Google’s Webmaster Central Office Hours hangouts, but just can’t find the time, or they don’t correspond well with your schedule? Luckily, Google sometimes makes them available for later viewing, and you can skip around and find the parts most relevant to you.

    Here’s a hangout Google’s John Mueller hosted on Monday. It’s over an hour long, but there’s also a transcript available on the actual YouTube video page, if you’d rather simply peruse that.

  • Google Doesn’t Care How Many Nofollow Links You Have

    Adding the nofollow attribute to links prevents PageRank from being passed. This is something that Google wants webmasters to do for any links that have been purchased. To do otherwise is strictly against Google’s quality guidelines. Violating these guidelines can either get you hit with a manual penalty in Google, or get you snagged by Google’s Penguin update, which will continue to see regular data refreshes.

    Some webmasters have wondered if having a large amount of links with the nofollow attribute pointing to a page could hurt that page in search. Barry Schwartz at Search Engine Roundtable points to an interesting Google webmaster forum discussion, in which Google Webmaster Trends analyst John Mueller sets the record straight.

    In that thread, user rickweiss writes, “Bloggers have apparently taken the issue of never having a dofollow on any link that is tied to something you are compensated for so seriously that they are putting nofollow on all links in their posts. In other words, the legitimate link to the products page is getting a nofollow.”

    Later, a user going by the name Bens Mom, asks, “I am incorrect in the belief that having too many rel=nofollow links can actually hurt a site? Because that is the impression I’m under.”

    Mueller responded:

    I’d like to back up what others said — having links (even a large number of them) with rel=nofollow pointing to your site does not negatively affect your site. We take these links out of our PageRank calculations, and out of our algorithms when they use links.

    If you’ve been doing this for a longer time, then it might even make sense to work to clean up those older links, and to have them either removed or the rel=nofollow attached (given that those kinds of paid posts would be against our Webmaster Guidelines).

    This isn’t much of a surprise, considering that nofollow is designed to do what its name would imply: keep the search engine algorithms from following these links. That would indicate that these links carry absolutely no weight one way or another.

    Google’s constantly changing algorithm has a lot of people paranoid about their linking strategies, and it seems that some are so worried about Google’s actions that they’re taking unneeded actions of their own, and ironically, possibly hurting SEO in the process.

    Image: John Mueller, from Google+

  • You Better Have More Than A Great Site If You Want To Rank In Google

    In a thread in the Google Webmaster Central forum (hat tip: Barry Schwartz), a user claimed to have lost all of their traffic over the weekend, and to have found thousands of “fake backlinks”.

    The user asked what they can do to make Google know the links have nothing to do with them.

    Well, Google’s Matt Cutts recently indicated that Google may soon launch a tool that will let you tell Google to ignore certain links, but also interesting is what Google Webmaster Trends Analyst John Mueller (pictured) said in response to this user’s post.

    Mueller said:

    From what I can tell, your site is still fairly new – with most of the content just a few months old, is that correct? In cases like that, it can take a bit of time for search engines to catch up with your content, and to learn to treat it appropriately. It’s one thing to have a fantastic website, but search engines generally need a bit more to be able to confirm that, and to rank your site – your content – appropriately.

    That said, if you’re engaging in techniques like comment spam, forum profile link-dropping, dropping links in unrelated articles, or just placing it on random websites, then those would be things I’d strongly recommend stopping and cleaning up if you can.

    Emphasis added.

    Google, especially in the last year or two, has talked up the importance of quality content probably above all else, so it is interesting to see Google so openly talking about how that’s not necessarily enough.

    Consider that when Google launched the Penguin update, Google’s Matt Cutts said in the announcement, “We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites.”

    Of course, this is still important, but if that’s all you’ve got, it sounds like you’d better have some patience as well, even if Google is all about freshness.

    Watch this video from Google’s Maile Ohye for some good SEO ideas as far as Google is concerned:

    Image: Mueller’s Google+ Profile