WebProNews

Tag: SEO

  • Would You Use Google+ More If You Knew It Could Help Your Search Rankings?

    Update: Matt Cutts has responded to Moz’s report. See end for update.

Sure, some of you are already using Google+ a lot, and I’m not one to call it a ghost town, but I don’t think many would argue that it gets the same level of use as Facebook. But if you knew for a fact that Google+ could directly help you rank better in Google’s search results, wouldn’t you dedicate more time to it?

    Would you use Google+ more if you saw direct ranking benefits? Let us know in the comments.

    It seems like only yesterday that Google was telling us that the +1 button had no direct effect on rankings. Actually, it was in October.

    “In the short term, we’re still going to have to study and see how good the signal is, so right now, there’s not really a direct effect where if you have a lot of +1s, you’ll rank higher,” Matt Cutts said in a Google Hangout back then.

    Google has not come out and officially said that +1’s will help you rank higher now, but Moz (formerly SEOmoz) has put out an interesting report suggesting that they will. Each year, they run a scientific correlation study looking at factors that have a strong association with higher Google rankings. This time, it found a very interesting trend.

    “After Page Authority, a URL’s number of Google +1s is more highly correlated with search rankings than any other factor,” writes Cyrus Shepard. “In fact, the correlation of Google +1s beat out other well known metrics including linking root domains, Facebook shares, and even keyword usage.”

Searchmetrics also recently found a significant correlation between +1s and rankings.

    Shepard notes that Moz found similar correlation with Facebook activity in rankings in a past study, but this was generally dismissed as not being a direct relationship, as that content likely had a lot of overlapping factors (like links and high quality content). He says it’s different this time with Google+, because it’s “built for SEO” in that posts are crawled and indexed “almost immediately,” Google+ posts pass link equity, and Google+ is “optimized for semantic relevance.”

    Basically, Google+ posts are very similar to blog posts.

    It doesn’t hurt that Google recently made +1s a lot more visible within Google+ itself, showing +1’d content more (which could lead to even more +1s).

    But +1s aren’t the only Google+ property that could be helping sites’ search rankings. Search Mojo CEO Janet Driscoll Miller recently made a pretty compelling case that Google authorship is substantially impacting rankings.

None of this has been officially acknowledged by Google, though in that Hangout mentioned above, Cutts did talk up the possibilities of authorship.

    Google is clearly doing just about all it can to keep from having to point users to third-party properties. This is most evident with its continuous expansion of the Knowledge Graph, but it also means pointing people to Google+ among other properties.

    As I mentioned in a previous article, Google is already sometimes ranking Google+ URLs for content shared on the social network better than the actual URL of that content.

Whatever the case may be, it doesn’t seem like a bad idea to utilize Google+ as much as possible if you want to improve your rankings. It certainly can’t hurt. That is, unless you find a way to abuse it, in which case Google will surely find a way to make it hurt.

    Have you seen any direct relationship between rankings and Google+ activity? Is this a good direction for Google to go in? Let us know in the comments.

    Update: Matt Cutts responded to the new Moz report on HackerNews (h/t: Search Engine Land). This is what he said:

    Just trying to decide the politest way to debunk the idea that more Google +1s lead to higher Google web rankings. Let’s start with correlation != causation: http://xkcd.com/552/

    But it would probably be better to point to this 2011 post (also from SEOMoz/Moz) from two years ago in which a similar claim was made about Facebook shares: http://moz.com/blog/does-google-use-facebook-shares-to-influ… . From that blog post from two years ago: “One of the most interesting findings from our 2011 Ranking Factors analysis was the high correlation between Facebook shares and Google US search position.”

    This all came to a head at the SMX Advanced search conference in 2011 where Rand Fishkin presented his claims. I did a polite debunk of the idea that Google used Facebook shares in our web ranking at the conference, leading to this section in the 2011 blog post: “Rand pointed out that Google does have some access to Facebook data overall and set up a small-scale test to determine if Google would index content that was solely shared on Facebook. To date, that page has not been indexed, despite having quite a few shares (64 according to the OpenGraph).”

    If you make compelling content, people will link to it, like it, share it on Facebook, +1 it, etc. But that doesn’t mean that Google is using those signals in our ranking.

    Rather than chasing +1s of content, your time is much better spent making great content.

    Shepard, the author of the report responded back:

    Thanks Matt, I think we both agree that Google doesn’t use +1’s directly in your algorithm. But are you implying there are no SEO benefits to posting popular content on Google+? Google does use PageRank and anchor text, 2 things present in Google+ posts that aren’t passed as easily in Facebook and Twitter. It seems to me that a popular post on Google+, shared and linked to by well known authorities, is just like earning a high authority editorial link – and this is a bit different than most other social media platforms.

    Now, if you tell me you treat Google+ differently in a way that blocks link juice, blocks anchor text and doesn’t pass link equity, then I think I would have to rethink my thesis. Regardless, I think we’re both on the same page here. The goal is not to accumulate a massive amounts of +1’s (and I’ll amend my post to make that clear) but to share high quality content on Google+ and build your influence through this channel, and this can lead to real world success.

    My argument is that Google+ as a platform passes actual SEO value, and I don’t think this is a bad thing or something that needs to be debunked. Feel free to disagree if I’m way off base here.

  • Google Has Made People Afraid To Link

    Google has made it so people are scared to link to content. That’s what it has come to.

    I don’t think it’s ever been Google’s intention to scare people away from linking when it’s natural and deserving, but its never-ending advice, warnings, rules and policy re-wordings have simply led to mass confusion, and people being afraid to link to legitimate content in a legitimate way for fear that Google will penalize their site in search rankings.

    Are webmasters being overly paranoid about their linking practices or are they legitimately afraid of what Google might do to their sites? Share your thoughts in the comments.

    We’ve written several articles in the past about how fear of Google has led to people frantically rushing to have external links to their sites removed, in some cases even when these links are totally legitimate (meaning playing by Google’s rules) or creating non-Google-related value. Sometimes, they’ve even considered making natural links unnatural.

    Sure, some of it has been overreaction, but a Google penalty or loss of rankings can be a huge deal for a business. Companies have laid off staff because of it.

    While most of the time, we’re talking about people being afraid of Google not liking the links that are pointing to their own site, people are now also worried about linking to other sites.

    Barry Schwartz at Search Engine Roundtable writes, “I see questions popping up left and right. Can I link to this site? If so, should I nofollow it anyway? Should I make sure to not use keyword rich anchor text when linking?”

    “It is making natural linking unnatural because of the fear of linking is now killing natural links,” he adds. “Publishers and webmasters are less likely to link out because of that fear.”

    He points to a WebmasterWorld thread where people are voicing their concerns.

Simply put, if websites stop linking to each other, the fabric of the web crumbles. Links are what make it a web. Otherwise, it’s just a bunch of silos.

Again, I don’t think Google wants people to stop linking to each other, but people are clearly concerned about what might happen if they do link, and especially without a nofollow. It doesn’t help that Google recently advised that infographic links be nofollowed. Here you have, at least in some cases, legitimate content that people editorially link to because they like that content and want to share it with their readers. Why should these links not count? Why is it so different? People who include others’ infographics on their sites make an editorial decision to do so. I know because I have made that decision editorially on occasion. And I’m happy to give some link love to the creator for taking the time to put together that content that I found valuable enough to share with my readers.

If I created an infographic, and an authoritative site like CNN or The New York Times wanted to use it, I would certainly expect a link and its corresponding PageRank juice.

    But there are bigger problems still with people not linking. For one, credit is often not going to be given when due. Traffic to an original source is not going to happen. Readers are going to be deprived of additional, helpful and contextual information.

    From Google’s perspective, it doesn’t make a lot of sense for sites not to link to one another appropriately, because as far as we know, PageRank still carries weight in Google’s organic rankings. That said, Google does appear to be doing everything it possibly can to not have to point users to other websites.

    There have been numerous reports of Google increasingly showing more of its own stuff and less organic results on more and more SERPs. Hell, I even see Google displaying a Google+ link for an article I’ve written rather than the article page itself on SERPs. You know, I wrote an article, then shared it on Google+, and Google decides to show the Google+ link rather than the real link. This happens fairly often, actually.

    So really, it’s going to be interesting to see how long organic rankings really even matter. But they do still matter for now, and some are probably going to suffer from not getting the links they deserve.

    What do you think of all of this linking fear? Reasonable or not? Let us know in the comments.

    Image: Matt Cutts.com

  • Is Google Going Too Far With The Nofollows?

    Note: This article has been updated from its original form.

    Google, this week, has given webmasters more advice on when they should be using the nofollow attribute. In other words, Google is telling you more types of links you shouldn’t be expecting to get PageRank from, in some cases, even when that means nofollowing links to valuable content.

    If you’re putting out widgets or infographics, you might want to be including nofollows in the embed code. That is according to Google’s Matt Cutts, who addresses the subject in a new Webmaster Help video.

    Do you think Google is going too far with its nofollow “suggestions” or should widgets and/or infographic links be able to count? Let us know what you think in the comments.

    Cutts takes on the following submitted question:

    What should we do with embeddable codes in things like widgets and infographics? Should we include the rel=”nofollow” by default? Advert the user that the code includes a link and give him the option of not including it?

    “My answer to this is colored by the fact that we have seen a ton of people trying to abuse widgets and abuse infographics. We’ve seen people who get a web counter, and they don’t realize that there’s mesothelioma links in there,” Cutts says.

He notes that he did a previous video about the criteria for widgets.

    “Does it point back to you or a third party?” he continues. “Is the keyword text sort of keyword rich and something where the anchor text is really rich or is it just the name of your site? That sort of stuff.”

    “I would not rely on widgets and infographics as your primary way to gather links, and I would recommend putting a nofollow, especially on widgets, because most people when they just copy and paste a segment of code, they don’t realize what all is going with that, and it’s usually not as much of an editorial choice because they might not see the links that are embedded in that widget,” Cutts says.

    “Depending on the scale of the stuff that you’re doing with infographics, you might consider putting a rel nofollow on infographic links as well,” he adds. “The value of those things might be branding. They might be to drive traffic. They might be to sort of let people know that your site or your service exists, but I wouldn’t expect a link from a widget to necessarily carry the same weight as an editorial link freely given where someone is recommending something and talking about it in a blog post. That sort of thing.”
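To make that advice concrete, here is a rough sketch of what a widget embed snippet with a nofollowed attribution link could look like. The script URL, example.com domain and anchor text are placeholders for illustration, not anything from Google’s documentation:

<script src="https://widgets.example.com/weather-widget.js"></script>
<!-- Attribution link marked nofollow so it does not pass PageRank -->
<a href="https://www.example.com/" rel="nofollow">Weather widget by Example.com</a>

The widget creator still gets branding and referral traffic from the link; the rel="nofollow" simply tells Google not to treat it as an editorial vote.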

    Nick Stamoulis from Brick Marketing makes a great point in the comments: “It seems like pretty much any content you create, in whatever form that may be, the new ‘best practice’ is to add a nofollow tag to it.”

“I know plenty of sites have tried to use infographics as link bait, but I do feel like there are a few infographics out there that deserve follow links simply because they are so useful and full of great information,” he adds. “Unfortunately spammers abused the potential of infographics and the rest of us have to play by the new rules.”

As I responded, good content is good content, and quite frankly, there is a lot of good content out there that doesn’t get the credit or links it deserves. Whatever the case may be in any given instance, people often don’t think they should link to sources, don’t think about it at all, or just don’t care. Including a link in the embed code of an infographic gets the link on the sites that use it, and if the sites are using it, isn’t that generally “an editorial decision,” as Google likes to say? Why should these links not pass PageRank?

    Isn’t it on Google to figure out which ones are abusing the practice and determine what the spam really is? Why deprive legitimate content providers (and some infographics can be quite time-consuming to create) of the link juice that others are willing to give them?

    If a high-authority site makes the decision to use an infographic that you created, why shouldn’t that count?

    Do you think Google should count links from widgets or infographics? Tell us what you think.

  • Can Google Really Keep Competitors From Harming Your Business?

    Some webmasters aren’t convinced by Google’s “solution” to negative SEO.

    Wasn’t Google’s Disavow Links tool supposed to be a major help in preventing negative SEO – competitors (or other enemies) associating your otherwise legitimate site with “bad neighborhoods,” by way of links?

    Do you think Google’s tool does its job the way it should? Is it the answer to this problem? What more should Google be doing to help webmasters? Let us know what you think in the comments.

    Perhaps Disavow Links has helped combat negative SEO for some, but it hasn’t stopped the issue from coming up repeatedly since the tool was launched. Google has a new Webmaster Help video out about the topic. Matt Cutts responds to the user-submitted question:

    Recently I found two porn websites linking to my site. I disavow[ed] those links and wrote to admins asking them to remove those links but… what can I do if someone, (my competition), is trying to harm me with bad backlinks?

    Notice that Google rephrased the question for the video title: Should I be worried if a couple of sites that I don’t want to be associated with are linking to me?

Cutts says, “So, you’ve done exactly the right thing. You got in touch with the site owners, and you said, ‘Look, please don’t link to me. I don’t want to have anything to do with your site,’ and then if those folks aren’t receptive, just go ahead and disavow those links. As long as you’ve taken those steps, you should be in good shape. But if there’s any site that you don’t want to be associated with that’s linking to you, and you want to say, ‘Hey, I got nothing to do with this site,’ you can just do a disavow, and you can even do it at a domain level.”

    “At that point, you should be in good shape, and I wouldn’t worry about it after that,” Cutts concludes.
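For anyone who hasn’t used the tool, a domain-level disavow is just a line in the plain-text file you upload to the Disavow Links tool. A minimal sketch, with placeholder URLs and domains rather than anything from a real case:

# Links we asked the site owners to remove, with no response
http://spammy-forum.example/thread/123
# Ignore every link to us from this entire domain
domain:bad-neighborhood.example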

    So, this has basically been Google’s advice since the Disavow tool launched, but is it really the answer? Based on the submitted question, it makes it seem like the webmaster did what he was supposed to do (as Cutts acknowledges). So why submit the question if the issue was resolved? Is it just a matter of time? Is the webmaster overlooking other variables? Is the solution Cutts prescribes really not the solution? Is there even a truly effective solution?

    Some webmasters in the comments on YouTube aren’t convinced by Cutts’ response.

    “What a crock Matt,” writes user jeffostroff. “What about the scammers who have 5000 links pointing to our site from sites in China or Russia, where no one responds, not even the web hosts. Disavow has not worked. When are you going to offer ability to disavow whole countries. I’m sure many Americans don’t want any links coming from other countries if their site is targeted only to Americans.”

That comment has the most YouTube likes of the bunch so far (17).

    “I don’t think simply disavowing links is necessarily the solution Matt,” HighPosition’s Chris Ainsworth comments. “Agreed it will help to disassociate a website from any rogue/malicious links but it doesn’t solve the on-going issue of competitor link spam tactics. In many cases, especially with larger brands, managing link activity can be a time intensive process. Should it be the responsibility of the business to manage their link profile or should Google have the ability to better identify malicious activity?”

    That one got 15 likes.

    Google has been talking about the effects of the Disavow tool on negative SEO from the beginning. In the initial blog post announcing the tool, Google included an FAQ section, and one of the questions was: Can this tool be used if I’m worried about negative SEO?

    The official response from Google was:

    The primary purpose of this tool is to help clean up if you’ve hired a bad SEO or made mistakes in your own link-building. If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you’re unable to get a few backlinks taken down, that’s a good time to use the Disavow Links tool.

    In general, Google works hard to prevent other webmasters from being able to harm your ranking. However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.

So really, it does sound like Google aims to shoulder the responsibility for negative SEO itself, rather than webmasters having to rely on the Disavow tool to battle it. Google wants to do that battling algorithmically, but is it doing a good enough job?

    Comments like the ones above and countless others in various threads around the SEO industry would suggest that it is not. Google is probably right in that “the vast majority of webmasters don’t need to worry about negative SEO,” but what about the minority? How big is the minority? That, we don’t know, but as often as the issue comes up in discussion, it seems big enough.

    Even if Google isn’t doing a good enough job combatting the issue, that doesn’t mean it’s not trying. Google makes algorithm changes on a daily basis, and many of them are certainly aimed at spam-related issues. Perhaps it will get better. Perhaps it has already gotten better to some extent. The concerns are still out there, however. Real people appear to still be dealing with negative SEO. Either that, or they’re just diagnosing their problems wrong.

    What do you think? How common is negative SEO really? What would you like to see Google do to address the issue? Share your thoughts.

  • Did Google Give Webmasters What They Need This Time?

Webmasters, many of whom have businesses that rely on search rankings, have been wanting Google to communicate with more specificity what is hurting their sites in Google search rankings. The search engine can’t seem to do enough to please everybody, but it does continue to launch tools and resources.

    Is Google doing enough to communicate issues it has with sites, or does it still need to do more? What exactly should Google be doing? Let us know what you think.

    Google has added a new feature to Webmaster Tools called the Manual Action Viewer. This is designed to show webmasters information about when Google’s manual webspam team has taken manual action that directly affects their site’s ranking in the search engine.

    To access the feature, simply click on “Manual Actions” under “Search Traffic” in Webmaster Tools. If Google hasn’t taken any action against your site, you should see a message that says “No Manual webspam actions found.” Obviously, this is what you want to see.

Google notes that fewer than 2% of the domains it sees are actually manually removed for webspam, so the likelihood that you see anything other than the message above seems pretty minimal (that is, of course, if you’re not spamming Google).

    The company will still notify you when you get a manual spam action, but the feature is just giving you another way to check. Here’s what you might see if you did have a manual action taken against you:

[Screenshot: Manual Action Viewer]

    “In this hypothetical example, there isn’t a site-wide match, but there is a ‘partial match,’” Google’s Matt Cutts explains in a post on the Webmaster Central blog. “A partial match means the action applies only to a specific section of a site. In this case, the webmaster has a problem with other people leaving spam on mattcutts.com/forum/. By fixing this common issue, the webmaster can not only help restore his forum’s rankings on Google, but also improve the experience for his users. Clicking the “Learn more” link will offer new resources for troubleshooting.”

    “Once you’ve corrected any violations of Google’s quality guidelines, the next step is to request reconsideration,” he adds. “With this new feature, you’ll find a simpler and more streamlined reconsideration request process. Now, when you visit the reconsideration request page, you’ll be able to check your site for manual actions, and then request reconsideration only if there’s a manual action applied to your site. If you do have a webspam issue to address, you can do so directly from the Manual Actions page by clicking ‘Request a review.’”

As Cutts notes, this new feature is something that webmasters have been requesting for some time. While he emphasizes that a very small percentage of webmasters will actually see any actions in the viewer, it is at least a new way to know for sure whether Google has indeed taken a manual action.

    Reactions in the comments of Google’s announcement are a little mixed. Most of the visible comments are praising the tool. One person says they’re already putting the feature to good use. Another says, “Finally!”

    I say visible comments because many of them say, “Comment deleted. This comment has been removed by the author.”

One user says, “If we have followed Matt’s advice and Google’s guidelines, why would we need this tool? Please give us a tool that can really help us, not distract us.”

In addition to the new WMT feature, Google has put out a series of seven new videos to go with its documentation about webspam, explaining what each type really means. Cutts, with the assistance of a few other Googlers, covers unnatural links, thin content, hidden text, keyword stuffing, user-generated spam, and pure spam. You can find all of them here.

    This is Google’s latest attempt to make its documentation more helpful. A couple weeks ago, Google updated its Link Schemes page to discuss article marketing and guest posting, advertorials and press release links.

    Of course this is all only applicable to those who have been hit with manual penalties, and is of little comfort to those hit by algorithm changes. If that’s your problem, you may want to look into the whole authorship thing, which just might be influencing ranking significantly.

    Are Google’s most recent “webmaster help” efforts truly helpful to webmasters? Let us know in the comments.

  • Here’s The Matt Cutts Video We’ve All Been Waiting For

Okay, maybe you weren’t waiting for it, but if you’ve watched a lot of Matt Cutts videos (which I assume you have if you’re reading this), it’s kind of funny.

    The video comes from OnlineMarketing.de, and it’s called, “It’s a party in here!” I’m not sure if it tops the classic Cutts video parody in which he discusses “How to rank #1 in Google,” but still.

For more Matt Cutts-related humor, check out his Halloween costumes and his extended dinosaur video.

    [via Search Engine Roundtable]

  • Google Adds Functionality To ‘Skip Redirect’ For Smartphone SERPs

    Google has made a change to how Skip Redirect on smartphone SERPs works, so it now also uses the rel-alternate-media annotations recommended in Google’s guidelines on separate mobile URLs.

    Skip Redirect is when Google changes the link target it shows in smartphone results if it knows the URL it’s showing redirects users to a different smartphone-optimized page.

    “In this case, even if the desktop page doesn’t automatically redirect to the smartphone page, if we discover valid annotations linking the desktop and smartphone pages, our algorithms may still change the link target shown in the search results to point directly to the smartphone page,” Google’s Pierre Far says in a Google+ post. “I know many webmasters asked me about this exact thing, and now you can be happy.”

    “Skip redirect changed only the URL in the HTML, i.e., the click URL, and didn’t change the URL that was displayed,” he notes. “The rel-alternate-media annotations change both, i.e. we display and link directly to the smartphone URL.”
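For reference, the rel-alternate-media annotations recommended in Google’s guidelines for separate mobile URLs look roughly like this (www.example.com and m.example.com are placeholder domains):

<!-- On the desktop page, http://www.example.com/page-1 -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/page-1">

<!-- On the smartphone page, http://m.example.com/page-1 -->
<link rel="canonical" href="http://www.example.com/page-1">

With those two tags in place, Google can pair the desktop and smartphone versions of a page even when no automatic redirect exists.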

    The Skip Redirect feature was introduced in late 2011. More on this and smartphone Googlebot-mobile in a blog post from the Webmaster Central blog.

    [via Search Engine Roundtable]

  • Here’s A New Google Video About Hidden Text And Keyword Stuffing

Okay, one more. Google cranked out seven new Webmaster Help videos featuring Matt Cutts (and in some cases, other Googlers) talking about various types of webspam.

    So far, we’ve looked at three videos about unnatural links, one about thin content, one about user-generated spam and one about pure spam. You can find them all here.

Finally, on to hidden text and/or keyword stuffing. This, like much of the content found in the other videos, is pretty basic stuff and pretty common SEO knowledge, but that doesn’t mean it’s not valuable information to some.

  • If Google Has Accused Your Site Of ‘User-Generated Spam,’ You’ll Want To Watch This Video

    Google pumped out a batch of new videos about webspam via its Webmaster Help YouTube channel. You can find others from the series here.

    We just looked at one about the “pure spam” manual action label. This one is about “user-generated spam”.

    User-generated spam could include forum spam, spammy user profiles, spammy blog comments, spammy guestbook comments, etc.

“The good thing is that normally when you see this kind of message, it normally means that the manual action we’ve taken is pretty precisely scoped,” Cutts says. “If possible, we try to avoid taking action on the whole domain. We might say something like, ‘Okay, don’t trust this forum. Don’t trust this part of the site,’ and that’s kind of nice because it doesn’t affect the primary part of your site, as long as your site is high quality. It might just affect your forum. So that’s how we try to do it unless we see so many different parts of the site that have been defaced or have been overrun that we end up taking action on the entire site.”

    The advice if you get this message is basically to clean it up. He suggests looking at new users that have been created, finding the spammy ones and kicking them out of your system. Also, deleting threads that are spammy would be a good idea.

    You also want to do preventive maintenance like CAPTCHAs and comment moderation.

    Google is clearly doing more to educate people about its manual actions. The company also just put out a new Webmaster Tools feature that lets users see when they have a manual action against them.

  • Google’s Cutts Explains The ‘Pure Spam’ Manual Action Label

    Google has put out a series of videos discussing various forms of webspam. You can see others from the series here.

    In this one, Google’s Matt Cutts explains the “Pure Spam” manual action label.

    This basically includes scraping, cloaking and automated black hat drivel. This kind of spam is the vast majority of the sites Google takes action on, Cutts says.

    He does talk about the scenario of buying a domain that had earned this label and getting Google to trust it under your ownership, which some people may find helpful.

    Google, in case you haven’t heard yet, has just added a new feature to Webmaster Tools called Manual Action Viewer, which will let webmasters see if Google has taken a manual action against their site. According to Cutts, this only happens for less than 2% of domains.

  • Google Tells You All About What It Means By ‘Thin Content’ In This New Video

    Google has put out a bunch of new videos (seven, to be exact) about various types of web spam. So far, we’ve looked at a trio of them dealing with unnatural links. You can find them here.

This one is about thin content, or content with little or no added value. This is essentially the kind of thing Google’s Panda update was designed to attack, so if Panda has been a problem for you, you may want to pay attention.

    Doorways, thin affiliates (Cutts says this is the most common by volume), thin syndication (which he says is also very common), and scrapers are examples he discusses.

    While talking about affiliates, he notes that the things that Google considers to be adding value include adding original insight, research, analysis, reviews, or videos.

    On what you should do if you get a message about thin content, he says you can remove such content or actually add value to it. Anything original is bound to help.

  • …And Here’s One More Video Of Matt Cutts Talking About Unnatural Links

    Google has put out a new group of videos about various webspam topics. Three of these are specifically about unnatural links. Here’s one on unnatural links from your site, and here’s one on unnatural links to your site.

    While both of these videos featured Matt Cutts with other Googlers, this one is just Cutts himself talking about unnatural links and their impact.

“Over time, we’ve gotten more granular, and our approaches have become more sophisticated, and so as a result, if you think perhaps that your site overall is good, but there might be some bad links (it’s not all bad links, but a portion of bad links), then we might be taking targeted action on just those links. And that can be bad links or links that you might not think of as typically bad.”

    He goes on to talk about various examples. If you’ve got about ten minutes to spare, you’ll probably want to give it a watch.

  • This Google Video Talks About Unnatural Links TO Your Site

    As previously reported, Google has put out seven new videos about various kinds of webspam. Last time, we looked at one about unnatural links from your site. This one is about unnatural links to your site.

    Once again, it’s pretty basic stuff, as Google is including these videos in its documentation about webspam.

“If you’ve gotten this message, it basically means that we have seen enough low-quality or spammy links to your site that it’s affected our opinion of your entire site,” Google’s Matt Cutts says. “We don’t like to take action on sites. We prefer not to, but we have to protect users.”

Google essentially wants you to contact the sites that are linking to you, have them take care of the links (whether that means removing them, nofollowing them, redirecting them or whatever), and submit reconsideration requests.

If that doesn’t work, use the Disavow Links tool. As we recently discussed, not everyone thinks this is working well enough.

  • Google Discusses Unnatural Links (On Your Site) In New Video

    Google has put out seven new Webmaster Help videos about various types of webspam. I’m not sure how long ago they were recorded, but they all just hit the GoogleWebmasterHelp YouTube channel, and feature Matt Cutts, and in some cases, Cutts and other Googlers.

Here’s one on unnatural links.

    It’s mostly pretty basic stuff about the difference between natural and unnatural links, but Google is using these videos in its documentation for its quality guidelines, so that makes a great deal of sense.

    “The good news is that this is something that is fixable,” says Cutts. “It’s fixable by you relatively easily, if you decide to commit to it.”

Sandy, the other Googler in the video, notes that removing all the unnatural links is not always the best option, as Google only asks that the links not pass PageRank. This, of course, means nofollowing them or redirecting them through a URL that’s blocked by robots.txt.
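As a quick illustration of that second option (the /goto/ path and target URL below are placeholders, not a prescription), a site can route the link through a local redirect script:

<a href="/goto/?url=http://link-target.example/">anchor text</a>

and then block that script in robots.txt so crawlers never follow the redirect:

User-agent: *
Disallow: /goto/

Since Googlebot can’t fetch the blocked redirect, the link stops passing PageRank, which is all Google is asking for here.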

Cutts notes that he’s a big fan of the approach of removing all the links.

  • Authorship May Already Substantially Impact Google Rankings

    It’s been pretty clear for quite a while that Google really likes its authorship signal, and aims to improve it, and make it matter more in search. True enough, the feature does have its benefits when it comes to associating content with certain people, and establishing trust while also improving visibility on crowded search results pages.

However, Google hasn’t exactly been rushing to tell people it’s going to help their ranking on the search page. It has not been officially established as a direct ranking signal. A new report from Search Mojo CEO Janet Driscoll Miller makes a pretty compelling argument that authorship is already being used as a direct ranking signal, though she admits it’s only a theory. But really, it is a very convincing theory.

I would urge you to read her detailed account of the events that led her to this theory, but to make a long story short, a client (a health-related association) had apparently been hit by the Panda update. The client, which had plenty of authoritative links, lost its rankings while another site that had plagiarized its content (one of a handful) managed to rank. That offending site was using authorship, despite not being the true author of the content. When they got this site to remove the content, the client’s rankings improved.

    “Some other things to note about this problem include that the offending website is a locally-based business in Texas,” Miller writes. “As a searcher based in Virginia, you wouldn’t normally expect to see this local business high in SERPs based on geographic settings.”

    “However, this site ranked very highly for very popular keyword terms, ranking alongside highly authoritative sites on the given keywords and subjects,” she adds. “The site had few, if any, inbound links. After doing some research using the Wayback Machine, it was also clear that these pages were likely added in the May 2013 timeframe, so they were relatively new pages.”

    Again, you should really read her report for the full story, which makes the argument all the more convincing. She also makes a great point about the potential for abuse if Google is really giving this kind of weight to authorship. Anyone can use authorship and steal content. If that means they’re going to rank over the true authors, that’s obviously a major issue that Google needs to (and surely will) deal with.

    Either way, this pretty much indicates that using authorship is a must. There’s no real reason that I’m aware of not to use it, but after this, I’m wondering if there are harmful consequences of not using it.

    Note that this report doesn’t come from some random conspiracy theorist webmaster, but from a long-time respected voice in the search industry.

    The possible Panda connection to authorship is quite interesting, considering that Google (which had previously indicated that it would no longer confirm Panda updates) recently confirmed a new Panda update, which it said included new, unspecified signals. Authority and trust have always been major indicators of quality to Google and are specifically discussed in Google’s post Panda content advice.

    In June, Matt Cutts was talking about Google finding ways to improve authorship and looking for other ways to use it.

    “I’m pretty excited about the ideas behind rel=’author’,” he said. “Basically, if you can move from an anonymous web to a web where you have some notion of identity and maybe even reputation of individual authors, then webspam, you kind of get a lot of benefits for free. It’s harder for the spammers to hide over here in some anonymous corner.”

    I’m not so sure about that statement in light of Miller’s report.

Cutts continued, “Now, I continue to support anonymous speech and anonymity, but at the same time, if Danny Sullivan writes something on a forum or something like that I’d like to know about that, even if the forum itself doesn’t have that much PageRank or something along those lines. It’s definitely the case that it was a lot of fun to see the initial launch of rel=’author’. I think we probably will take another look at what else do we need to do to turn the crank and iterate and improve how we handle rel=’author’. Are there other ways that we can use that signal?”

He concluded the video by saying, “I do expect us to continue exploring that because if we can move to a richer, more annotated web, where we really know…the philosophy of Google has been moving away from keywords, ‘from strings towards things,’ so we’ve had this Knowledge Graph where we start to learn about the real world entities and the real world relationships between those entities. In the same way, if you know who the real world people are who are actually writing content, that could be really useful as well, and might be able to help you improve search quality. So it’s definitely something that I’m personally interested in, and I think several people in the Search Quality group continue to work on, and I think we’ll continue to look at it, as far as seeing how to use rel=’author’ in ways that can improve the search experience.”

Clearly this is going to be something for webmasters and SEOs to keep an eye on, and in light of Miller’s report, I would imagine that authorship is going to be more scrutinized than ever.

  • Cutts Talks Web Spam Fighting In International Markets

    In today’s Webmaster Help video from Google, Matt Cutts discusses the search giant’s efforts in web spam fighting around the world. Many of us are very used to hearing about the efforts surrounding web spam in the United States, but efforts in other countries aren’t discussed quite so frequently.

    Cutts responds to a question from an anonymous user, who asks:

    Is the Webspam team taking the same measures to counter spam in international markets like India like they do in the US market? It just seems like there are a lot of junk sites that come up in the first page of results when searching on google.co.in.

“Remember, the web spam team has both the engineers who work on the algorithmic spam,” says Cutts. “We also have the manual web spam team, and both of those work on spam around the world. So, Google.co.in, you know, India…we want the algorithms, whether they be link spam or keyword stuffing or whatever, to work in every language as much as we can. And so we do try to make sure that to the degree it’s possible for us to do it, we internationalize those algorithms.”

“At the same time, we also have people, including people in like Hyderabad, who are fighting spam not only in English, and on the .com domains, but also in India, you know, .IN as well,” he continues. “So we have people who are able to fight spam in forty different languages based around the world. At the same time, I would agree that probably English spam in the United States on a .com definitely gets a lot of attention because not every single engineer can speak French or German or a particular language, but it is the case that we put a lot of work into trying to make sure that we do internationalize those.”

    He adds, “Definitely if you see any results that are sub-optimal or that are generally bad, either do a spam report or show up in the webmaster forum, and drop a notice there. Feel free to send a tweet. That’s the sort of thing that we’re interested in, and we’d like to make sure that we do better on.”

    Cutts notes that they use the feedback they get to try to improve future iterations of their web ranking algorithms.

  • Google Tests Design Change To Sitelinks In Search Results

    Google appears to be testing a design change to its sitelinks feature on search results pages, which makes the experience a bit more interactive for users, and lets sites offer deeper pages within the feature.

Moz and Hyperlinkx Media CEO Jon Cooper both shared different instances of the feature on Google+.

    We haven’t been able to replicate the experience here, leading us to believe that Google is just testing this updated experience, but it looks like an improvement to me, so hopefully we’ll be seeing it roll out soon.

Google does run about 20,000 search experiments a year, so you never know.

    Moz

    #MozCast  Feature Alert – Spotted yet another call-out box for the 1st organic listing. This one has boxes around all of the expanded site-links and adds an expansion arrow that shows more links (screenshot is the expanded version).

    Jon Cooper

    Haven't seen anyone talk about this yet, so I'll just come out and say it – Google has just introduced a new brand SERP pack with drop downs for each sitelink/category.

    However, it's not just for big brands, it's being implemented across the board.

    EDIT: There's a lot of testing going on right now and not everyone is seeing this like I am.

    [via Search Engine Roundtable]

  • Cutts: We Will Give More Info In Link Messages Over Time

    Google’s Matt Cutts says webmasters can expect Google to expand the amount of info Google provides in Webmaster Tools messages related to manual web spam actions. Cutts put out a new Webmaster Help video today discussing this topic, when asked:

    Will Webmaster Tools ever tell us what links caused a penalty?

    “First off, remember, algorithmic things are just ranking, so they don’t generate messages in Webmaster Console,” Cutts responds. “However, if you log in to the Webmaster Tools Console, and you see that there’s a message, that means that there has been some direct manual action by the web spam team that is somehow directly affecting the ranking of your website. So in those cases, right now, some of those messages have example links or example URLs that are causing issues for us.”

    He continues, “We wouldn’t necessarily say that those are the only things because if you have a million URLs that are offending things, we couldn’t send all million URLs in an email or even a message, because that’s just gonna take too much storage, but we are going to, over time, give more and more information in those messages, and so I wouldn’t be surprised if you see, you know, 1, 2, 3 – some number of example URLs or links that give you an idea of where to look in order to find the sorts of things that are causing that particular action. So, I think that is really useful. We’re going to keep looking at how we can expand the number of example URLs that we include in messages, and I think that will be a great thing for webmasters because then you’ll have a really good idea about where to go and look in order to help diagnose what the issue is.”

    This is actually the second time Cutts has discussed this topic in a Webmaster Help video this month. Back on the 15th, Google released a video in which he also said they’d try to get more examples of bad links in messages to webmasters.

    You can check that out here, if you want to see exactly what he said then.

  • Google Discusses Its New Official Link Rules

    Google has some new rules for the kinds of links it allows (or doesn’t allow, rather). The concepts are actually not exactly new, but Google has updated its official documentation to reflect its views of certain kinds of links.

    Are you concerned with following Google’s rules for links on the web? Does Google have too much power over how people treat their content? Let us know what you think in the comments.

    As you may know, one of the things Google says in its Quality Guidelines to avoid is participation in link schemes. Google has updated the link schemes page, as Search Engine Land (tipped by Menaseh) recently reported.

    Now included as things that qualify as link schemes are:

    • Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links
    • Advertorials or native advertising where payment is received for articles that include links that pass PageRank
    • Links with optimized anchor text in articles or press releases distributed on other sites.

Guest posts have been discussed numerous times recently. A recent article at HisWebMarketing.com suggested that “high quality guest posts can get you penalized.”

    Google talked about the topic in several videos (which you can watch here if you want to spend the time doing so).

    In one video, Matt Cutts said that it can be good to have a reputable, high quality writer do guest posts on your site.

He also said, “Sometimes it gets taken to extremes. You’ll see people writing…offering the same blog post multiple times or spinning the blog posts, offering them to multiple outlets. It almost becomes like low-quality article banks.”

    “When you’re just doing it as a way to sort of turn the crank and get a massive number of links, that’s something where we’re less likely to want to count those links,” he said.

    “Generally speaking, if you’re submitting articles for your website, or your clients’ websites and you’re including links to those websites there, then that’s probably something I’d nofollow because those aren’t essentially natural links from that website,” Google’s John Mueller said in another video.

    In another video, Mueller said, “Think about whether or not this is a link that would be on that site if it weren’t for your actions there. Especially when it comes to guest blogging, that’s something where you are essentially placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a linkbuilding point of view. I think sometimes it can make sense to guest blog on other peoples’ sites and drive some traffic to your site because people really liked what you are writing and they are interested in the topic and they click through that link to come to your website but those are probably the cases where you’d want to use something like a rel=nofollow on those links.”

    Cutts said in a recent interview with Eric Enge, “The problem is that if we look at the overall volume of guest posting we see a large number of people who are offering guest blogs or guest blog articles where they are writing the same article and producing multiple copies of it and emailing out of the blue and they will create the same low quality types of articles that people used to put on article directory or article bank sites.”

    “If people just move away from doing article banks or article directories or article marketing to guest blogging and they don’t raise their quality thresholds for the content, then that can cause problems,” he said. “On one hand, it’s an opportunity. On the other hand, we don’t want people to think guest blogging is the panacea that will solve all their problems.”

    Advertorials are another thing Google has been cracking down on recently. Cutts put out a video specifically addressing this topic a few months ago.

    “Well, it’s advertising, but it’s often the sort of advertising that looks a little closer to editorial, but it basically means that someone gave you some money, rather than you writing about this naturally because you thought it was interesting or because you wanted to,” he said. “So why do I care about this? Why are we making a video about this at all? Well, the reason is, certainly within the webspam team, we’ve seen a little bit of problems where there’s been advertorial or native advertising content or paid content, that hasn’t really been disclosed adequately, so that people realize that what they’re looking at was paid. So that’s a problem. We’ve had longstanding guidance since at least 2005 I think that says, ‘Look, if you pay for links, those links should not pass PageRank,’ and the reason is that Google, for a very long time, in fact, everywhere on the web, people have mostly treated links as editorial votes.”

    More on all of that here.

    Finally, with regard to the optimized anchor text in articles or press releases thing, Google gives the following example of what not to do:

    There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

    I wonder if that’s a real sample.

Barry Schwartz from Search Engine Roundtable jumped into a hangout to ask Mueller some questions about press releases.

    He recaps:

    John Mueller from Google makes it clear that Google wants all links in these press releases to be nofollowed. He did say having a URL at the end should be okay but when he was grilled about it again, he said it is best to nofollow the links. John even said press releases should be treated as advertisements and thus links in those releases should be nofollowed.

    I asked John why all of a sudden the change in policy for press releases and John said that it is because SEOs were using these more and more in a way to promote their site [artificially in the Google search results] and Google needed to clarify their stance on them.

Google did remove a few items from the list to make room for the new ones. Now gone are “linking to web spammers or unrelated sites with the intent to manipulate PageRank” and “links that are inserted into articles with little coherence”.

    I guess it’s game on with those. Just kidding.

    What do you think of Google’s updated language for links schemes? Do any of the changes concern you? Let us know in the comments.

  • Google Talks Geotargeting And Generic ccTLDs

    Google’s latest Webmaster Help video deals with ccTLDs and geotargeting – specifically Google’s view of a developer grabbing a ccTLD that is generally associated with a country they’re not actually in. Here’s the exact question:

    As memorable .COM domains become more expensive, more developers are choosing alternate new domains like .IO and .IM – which Google geotargets to small areas. Do you discourage this activity?

    “I want you to go in with your eyes open,” Google’s Matt Cutts responds. “Because you can pick any domain you want, but if you pick a domain like .ES or .IT because you think you can make a novelty domain like GOOGLE.IT (‘Google It’), you know, or something like that, be aware that most domains at a country level do pertain to that specific country, and so we think that that content is going to be intended mainly for that country.”

    He does note that there are some ccTLDs that are more generic like .IO, which stands for Indian Ocean, but there are “very few” domains that are actually relevant to that. A lot of startups were using it, and it was something that was more applicable to the entire world, he says. For reasons like this, Google periodically reviews the list of ccTLDs, looking for things that are in wider use around the world. This way, it can view sites with these domains as more generic.

    Here’s a list of the domains Google considers generic.

    Cutts talked about this topic in another video earlier this year, specifically responding to the question:

    We have a vanity domain (http://ran.ge) that unfortunately isn’t one of the generic TLDs, which means we can’t set our geographic target in Webmaster Tools. Is there any way to still target our proper location?

    You can see his response to that one here.

    On a semi-related note, last week, WordPress.com started letting users register .CO domains.

  • Penguin 2.0 Update & Its Effects On Small & Medium Sized Business

On May 22, Google released Penguin 2.0. The main purpose of this update was to provide a more in-depth analysis of websites that benefit from link spam. This update helped keep websites with unnatural link building out of Google users’ view on the search engine results pages.

Older techniques like Google bombing, anchor text spam and link spam have been heavily scrutinized by this update. These types of updates have a larger impact on SMBs than on larger corporations, because SMBs have limited resources and typically less technical staff.

However, SMBs need to be aware of some positive things they can do in order to make the most out of these aggressive updates.

    Local Citations

Local business directories are high-quality sites that are meant to list your business details. This is very similar to how the “yellow pages” works, but these listings are found online. The more high-quality, trusted sites that contain your business name, address and phone number, the better. These types of links will help you rank locally and provide you with a stronger organic presence.

    A page for each location

The recent Penguin update calls for more specific pages. The best way an SMB owner can interpret this is by creating a specific page for each location with unique text. By creating pages that have unique location information, you have a better chance to rank that page locally in your DMA.

Added tip: If you include microdata on these pages, you will get an extra boost!
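As a rough sketch of what that microdata could look like, using schema.org’s LocalBusiness vocabulary (the business name, address and phone number below are made-up placeholders):

<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Bakery</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St.</span>,
    <span itemprop="addressLocality">Richmond</span>,
    <span itemprop="addressRegion">VA</span>
  </div>
  <span itemprop="telephone">(555) 555-0123</span>
</div>

Marking up each location page this way gives search engines an unambiguous name, address and phone number to associate with that location.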

    Claim your authorship

Recently, Matt Cutts made a video talking about authorship tags and their importance for local SEO. Adding this tag helps Google distinguish you from potential spammers or unnatural SEO methods and helps protect you from entering any negative filters associated with this update.
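The basic implementation, if you go this route, is a link from your content to your Google+ profile marked with rel="author" (the profile URL and byline below are placeholders), with the Google+ profile listing your site under “Contributor to” so the two ends verify each other:

<a href="https://plus.google.com/112345678901234567890?rel=author">By Jane Smith</a>

or, in the page head:

<link rel="author" href="https://plus.google.com/112345678901234567890">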

    Power of Local Pages

To rank locally, having powerful local pages on Google, Bing and Yahoo can help get you more organic coverage as well as better rankings in the search results pages. Fill out as much information as you can for these local pages, like hours of operation, types of products and services, etc., to fully optimize them. Push people to provide consumer reviews when possible; having an active and positive customer review section can help you rank better as well as make more sales!

    Social Media

Get involved with your Facebook, Twitter and other social platforms. Penguin takes into account your social influence, and that plays a part in your overall local rankings. Come up with a post schedule and keep to it. Use more imagery in your posts, because studies show there is more engagement when you do. Produce interesting content on your site and share it on your social profiles. This will drive traffic as well as boost your organic rankings.

    Clean up your bad links

    For the first time ever, we have to worry about the links pointing to our sites from years ago. If you have toxic links pointing to your site, they need to be pruned and removed. Your current links could be holding you back. Link Research Tools offers a Link Detox report you can run. Once you run this report, you can log into your Google Webmaster Tools and disavow those links or even contact the website owners directly to remove the toxic links.

If you apply these simple steps, you will see your site start to climb in the search rankings quickly, and you can rest assured that the tactics you are implementing are safe and effective.