WebProNews

Tag: SEO

  • Google Launches New Knowledge Graph Tools, ‘Hummingbird’ Algorithm & More

    Google announced some new features today in celebration of its fifteenth birthday. First off, Knowledge Graph has a new comparison feature and new filters.

    “We keep expanding features of the Knowledge Graph so it can answer more questions—even those that don’t have a simple answer,” says Google’s Amit Singhal. “Let’s say you want to get your daughter excited about a visit to the Met. You can pull up your phone and say to Google: “Tell me about Impressionist artists.” You’ll see who the artists are, and you can dive in to learn more about each of them and explore their most famous works. If you want to switch to Abstract artists, you can do that really easily with our new filter tool:”

    He adds, “Or let’s say you want to compare two things: How much saturated fat is in butter versus olive oil? Now you can simply tell Google: ‘Compare butter with olive oil.’ Our new comparison tool gives you new insights by letting you compose your own answer.”

    knowledge graph comparison

    Knowledge graph

    He says Google will keep adding more data to make this feature available for more types of comparisons.

Google will soon launch an update to the Google Search app for iPad and iPhone that will provide Google Now-based reminder notifications across devices. This follows a similar update for Android last month.

    Google is also launching a new design for mobile search results and ads:

    New mobile search

Google spoke about its new features at the famous Google garage today, where it reportedly also discussed a new algorithm called Hummingbird and a song exploration feature.

    We don’t know much about these yet, but Danny Sullivan reports that Hummingbird affects 90% of searches worldwide, and “helps with complex queries”.

    I’m sure we’ll be finding out more about this soon.

    Image: Google

  • Keyword Planner Alternatives For Keyword Research: Who Will Rise To The Challenge?

With the rise of ‘Not Provided’ and the recent death of the beloved Google Keyword Tool, it’s clear that the biggest player in search is trying to reduce SEOs’ obsession over keywords. These changes also reflect Google’s increasing monetization of its keyword data. Why else would it let you compare organic versus paid traffic on bid keywords in AdWords while leaving Analytics users in the dark?

Even if you aren’t as cynical as I am, it’s easy to see that the new tool was actually designed for PPC specialists. My biggest pet peeve is the removal of the “closely related” filter, which is clearly the biggest blow for SEOs. The tool is definitely still usable for keyword research, especially at the ultra-local level, but it clearly isn’t really intended for that task.

    Keyword Planner

    This doesn’t mean that keyword research is dead; it just forces SEOs to begin using more alternatives to Google’s tools.

    Free Keyword Tool Alternatives

To date, there are only two completely free Keyword Planner alternatives. While they are not optimal for SEOs, these free tools can help you decide on new niches to target or plan your next link building campaign effectively.

    Bing Keyword Tool

    Keyword Research

The biggest disadvantage of the Bing Keyword Tool is that the data is not from Google. For most keywords, the reported volume hovers between 10% and 25% of what Google’s Planner shows. However, unlike the Keyword Planner, you get a “strict” filter that acts like the old “closely related” filter, which is great for increasing your keyword target spread.

    The tool is still in Beta and I look forward to seeing how Bing capitalizes on SEOs’ complaints about the Planner. You’ll also need to claim a website in Bing Webmaster Tools in order to use the tool.

    Google Trends & Webmaster Tools

    The only advantage of both these tools is that the data is straight from Google itself. The numbers are definitely questionable and only a very limited amount of data is available.

Webmaster Tools limits itself to keywords for which your sites are already getting impressions. The data helps you strategize, but it won’t tell you anything about words you might be interested in ranking for.

    Google Trends

Google Trends, on the other hand, can also be useful for planning content for referral traffic, but the data gives you no idea of the actual traffic your article might get if it ranked first for the keyword.

    Freemium Keyword Tool Alternatives

    The problem with a lot of keyword research tools is that they are advertised as free tools, but there are usually some limitations to free access. You most likely will still need to sign up with an account, and even spend some money to get the best keyword data from them. The free versions aren’t always useful as stand-alone software. This is especially true when trying to find new keywords. This can obviously be solved by using a suggestion tool like Ubersuggest.

    Wordtracker

Don’t be fooled by the Wordtracker home page: the free keyword tool lives on a separate subdomain. It will give you access to US, UK and global search volume for the keywords you put in, but not much else. This is where Ubersuggest comes into play. To actually get suggestions and country-level keyword search data, you will have to sign up and pay a small fee.

    Wordtracker

    SEMRush

SEMRush is one of my favorite tools because it’s extremely versatile. The reason I like it so much for keyword research is that it also gives a lot of PPC info, which is great for figuring out the most profitable keywords. However, it suffers from most of the same problems as the other freemium tools in that its broad-range suggestions of new keywords are not very robust.

    Paid Keyword Tool Alternatives

If you’re looking to invest in a paid tool, be careful! There are a lot of tools that will get you to pay and not really be much help at all, because they either require proxies or multiple AdWords accounts, or merely change the user interface and give you the same data you could get for cheaper elsewhere.

    Advanced Web Ranking

If you’re already using Advanced Web Ranking and just hate the UX of the new Keyword Planner, their tool works great and will let you compare Google results to SEMRush, Wordtracker and more. In addition, you’ll also get to see where your site currently ranks during keyword research, which is great!

    All things considered, there are a lot of options out there, but if you want Google’s data, it might be best to just stick to the planner and use a paid tool like Advanced Web Ranking to provide a better user interface. Looking towards the future, I can see a lot more and better free tools being developed by big SEO software companies and I can’t wait to see what they come up with.

  • Don’t Worry About Google Penalties From Invalid HTML (At Least for Right Now)

    Ever wonder how the quality of your HTML is affecting your rankings in Google? Well, at least for the time being, it’s not having any effect at all, regardless of how clean it is. Google’s Matt Cutts said as much in a new Webmaster Help video.

    Cutts was answering the following submitted question:

    Does the crawler really care about valid HTML? Validating google.com gives me 23 errors and 4 warnings.

    “There are plenty of reasons to write valid HTML, and to pay attention to your HTML, and to make sure that it’s really clean and that it validates,” says Cutts. “It makes it more maintainable. It makes it easier whenever you want to upgrade. It makes it much better if you want to hand that code off to somebody else. There’s just a lot of good reasons to do it. At the same time, Google has to work with the web we have, not the web that we want to have. And the web that we have has a lot of syntax errors – a lot of invalid HTML, and so we have to build the crawler to compensate for that and to deal with all the errors and weird syntax that people sometimes mistakenly write in a broken way onto the web.”

    “So Google does not penalize you if you have invalid HTML because there would be a huge number of webpages like that,” he says. “And some people know the rules and then decide to make things a little bit faster or to tweak things here or there, and so their pages don’t validate, and there are enough pages that don’t validate that we said, ‘Okay, this would actually hurt search quality,’ if we said, ‘Only the pages that validate are allowed to rank or rank those a little bit higher’. First and foremost, we have to look at the quality of the information, and whether users are getting the most relevant information they need rather than someone has done a very good job of making the cleanest website they can.”

    “Now, I wouldn’t be surprised if they correlate relatively well,” he adds. “You know, maybe it’s a signal we’ll consider in the future, but at least for right now, do it because it’s good for maintenance. It’s easier for you if you want to change the site in the future. Don’t just do it because you think it will give you higher search rankings.”

    Or maybe you should do it also because Google might decide to use it in the future, and then you’ll have your bases covered.
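    To make that concrete, here is a hedged illustration (not taken from the video) of the kind of markup the W3C validator flags as errors but that browsers and Google's crawler handle without complaint:

        <!-- the obsolete font tag and the img with no alt attribute both fail validation, -->
        <!-- yet the page still renders and gets crawled and ranked like any other -->
        <font color="red">Welcome to my site</font>
        <img src="logo.png">

    Cleaning this up with proper styling and a descriptive alt attribute is better practice, but per Cutts, it is maintenance hygiene rather than a ranking lever right now.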

    Image: Google

  • Google Reportedly Kills Link Network Ghost Rank 2.0

    Google has reportedly penalized link network Ghost Rank 2.0, and sites with links from it. If you were using it, you probably should have expected this to happen sooner or later.

    Cutts appeared to have hinted at this last month, when he tweeted:

    Now, forum watcher Barry Schwartz writes, “I am pretty confident, 99% confident, based on the data I see in the forums and some sources I have that want to remain anonymous, that Ghost Rank 2.0 was hit hard by Google. It seems that at least one of the underground and under the radar networks was severely hurt by Google and many of the sites using them to rank well in Google are now penalized.”

    Google shutting down link networks with the sole purpose of gaming Google rankings is nothing new. Google’s web spam team is constantly working to combat this type of thing. It should come as no surprise when they take such action. This is not a sustainable way to build links for SEO value.

    Here’s a closer look at Ghost Rank 2.0:

    The Ghost Rank site does its best to convince you that it is above Google penalties:

    Ghost Rank 2.0

    Ghost Rank 2.0’s site claims to sell high PR links. Here’s the pricing scale:

    Ghost Rank

    “We have signed up to 35 different Russian exchange networks,” the site explains. “Put all these domains available into one pool and ran them through our custom made algo and filters to find the strongest, most beneficial links. We don’t just look at PR. It’s a lot more complex than that.”

    On how safe you would be using this system, it says, “Well, let’s just put it this way. It’s about as diversified as you are going to get. We aren’t relying on one network for links. One gets hit and you still have 30+ other networks to keep you going strong.”

    You’re taking your site’s destiny into your own hands when you use this approach. Who do you believe more: a link network that promises to get you higher rankings or Google, who vows to penalize link networks that promise to get you higher rankings?

    Image: GhostRank.net

  • Google Cranks Up ‘Not Provided’ Keywords, Says Ads Aren’t The Reason

    It looks like the percentage of keywords that are listed as “not provided” in your Google Analytics account is going to keep going up, as Google is reportedly moving to switch all users to secure search regardless of whether or not they’re signed in.

    Have you noticed an increase in the amount of keywords that are labeled as not provided? Let us know in the comments.

As I’m sure you’ll recall, Google launched SSL Search on Google.com as the default for signed-in users about two years ago, claiming it was a move to protect user privacy. This had an unfortunate side effect for webmasters: searches conducted this way pass along no keyword data. Google masks the search terms these people use under the “Not Provided” label, and for a lot of sites, this tends to account for the majority of their search traffic.
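    Mechanically (a simplified illustration, not something spelled out in the article), the keyword disappears because the secure results page stops passing the query string in the referrer that Google Analytics reads:

        Referrer from regular search:  http://www.google.com/search?q=blue+widgets
        Referrer from SSL search:      https://www.google.com/

    Analytics still sees that the visit came from Google, but with no q= parameter to parse, the keyword gets bucketed as (not provided). The "blue widgets" query above is, of course, a hypothetical example.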

Google still provides this kind of data in AdWords, however, and is often criticized for doing so. Some don’t believe Google’s more honorable-sounding privacy reasoning, suspecting instead that Google is simply doing this to increase its own revenue.

    In the early days of the feature, the percentage of queries labeled not provided was supposed to be somewhere around 1%. Reports shortly thereafter had it closer to 8%, with more recent accounts having the number ranging from 40% to 80%. Everyone pretty much seems to agree that the number has been increasing, and it looks like it may increase even more.

    Danny Sullivan at Search Engine Land reports that Google is making secure search the default for all Google users, sharing this statement from the company:

    We added SSL encryption for our signed-in search users in 2011, as well as searches from the Chrome omnibox earlier this year. We’re now working to bring this extra protection to more users who are not signed in.

    We want to provide SSL protection to as many users as we can, in as many regions as we can — we added non-signed-in Chrome omnibox searches earlier this year, and more recently other users who aren’t signed in. We’re going to continue expanding our use of SSL in our services because we believe it’s a good thing for users….The motivation here is not to drive the ads side — it’s for our search users.

    ClickConsult has a site called NotProvidedCount.com, which tracks the rise of “not provided” queries for sixty sites, and graphs the average (via Sullivan). There’s also a live counter, which as of the time of this writing is floating around 74%.

    Not Provided Count

    “Grouping a large number of keywords under the banner of (not provided) denies site owners fundamental information about how their site is performing in organic search,” the site says. “The percentage of (not provided) traffic Google is sending your site is steadily rising, and will one day hit 100%.”

    It certainly looks that way based on Google’s statement and the obvious trending increase.

    Google does still provide search terms in Webmaster Central, but as Sullivan noted in a recent article, it’s not great for historical data, though Google is increasing the timeframe. Historical data is not an issue in AdWords.

    Regardless of Google’s motive for moving to a full-on encrypted search experience for all users, it’s going to mean that keyword data in Google Analytics is going to become obsolete at worst, and much less helpful at best.

    This also comes after Google killed its popular Keyword Tool to get people to use its newer Keyword Planner product. A lot of webmasters/SEOs have been pretty perturbed by that too.

    A recent report from MarketLive found that merchants saw “significant changes” in the mix of paid/organic traffic. Paid search visits made up about a third of total search engine visits (up from 26% the previous year). Search visit growth slowed in the first six months of the year, but paid was up 30% while organic was down 3%.

    Do you think the “not provided” percentages will hit 100%? Share your thoughts in the comments.

    Image: NotProvidedCount.com

  • Matt Cutts Talks Duplicate Content Once Again

    Google’s Matt Cutts has a new video out about duplicate content, a subject he has discussed many times in the past. If you have a site that you use to sell a product that other sites also sell, and are concerned that pages listing the “ingredients” of said product will be seen as duplicate content, this one’s for you.

    Cutts takes on the following submitted question:

    What can e-commerce sites do that sell products which have an ingredients list exactly like other e-commerce sites selling the same product to avoid Google as seeing it as duplicate content?

    Cutts begins, “Let’s consider an ingredients list, which is like food, and you’re listing the ingredients in that food and ingredients like, okay, it’s a product that a lot of affiliates have an affiliate feed for, and you’re just going to display that. If you’re listing something that’s vital, so you’ve got ingredients in food or something like that – specifications that are 18 pages long, but are short specifications, that probably wouldn’t get you into too much of an issue. However, if you just have an affiliate feed, and you have the exact same paragraph or two or three of text that everybody else on the web has, that probably would be more problematic.”

    He continues, “So what’s the difference between them? Well, hopefully an ingredients list, as you’re describing it as far as the number of components or something probably relatively small – hopefully you’ve got a different page from all the other affiliates in the world, and hopefully you have some original content – something that distinguishes you from the fly-by-night sites that just say, ‘Okay, here’s a product. I got the feed and I’m gonna put these two paragraphs of text that everybody else has.’ If that’s the only value add you have then you should ask yourself, ‘Why should my site rank higher than all these hundreds of other sites when they have the exact same content as well?’”

    “So if some small sub-component of your pages have some essential information that then appears in multiple places, that’s not nearly so bad,” Cutts adds. “If the vast majority or all of your content is the same content that appears everywhere else, and there’s nothing else to really distinguish it or to add value, that’s something I would try to avoid if you can.”

    So, pretty much the same thing you’ve heard before. Got it yet?

    Find other things Cutts has said about duplicate content in the past here.

  • Google: No Duplicate Content Issues With IPv4, IPv6

    Google released a new Webmaster Help video today discussing IPv4 and IPv6 with regards to possible duplicate content issues. To make a long story short, there are none.

    Google’s Matt Cutts responded to the following user-submitted question:

    As we are now closer than ever to switching to IPv6, could you please share some info on how Google will evaluate websites. One website being in IPv4, exactly the same one in IPv6 – isn’t it considered duplicate content?

    “No, it won’t be considered duplicate content, so IPv4 is an IP address that’s specified with four octets,” says Cutts. “IPv6 is specified with six identifiers like that, and you’re basically just serving up the same content on IPv4 and IPv6. Don’t worry about being tagged with duplicate content.”

    “It’s the similar sort of question to having content something something dot PL or something something dot com,” he continues. “You know, spammers are very rarely the sorts of people who actually buy multiple domains on different country level domains, and try to have that sort of experience. Normally when you have a site on multiple country domains, we don’t consider that duplicate content. That’s never an issue – very rarely an issue for our rankings, and having the same thing on IPv4 and IPv6 should be totally fine as well.”
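    In practice (a minimal sketch, not something from the video; example.com and the addresses below are documentation placeholders), serving the same content over both protocols usually just means publishing an A and an AAAA record for the same hostname:

        ; same host answering over IPv4 and IPv6
        www.example.com.    IN  A       203.0.113.10
        www.example.com.    IN  AAAA    2001:db8::10

    Either way, Googlebot fetches one set of pages from one hostname, so there is no duplicate for it to worry about.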

    More on IPv6 here.

    Image: Google

  • Google Webmaster Tools Alters How It Selects Sample Links

    Google announced on Thursday that it has made a change to how it decides what links to show webmasters when they push the “Download more sample links” button. The feature typically shows about 100,000 backlinks.

    “Until now, we’ve selected those links primarily by lexicographical order,” explains Yinnon Haviv, a software engineer with Google’s Webmaster Tools team. “That meant that for some sites, you didn’t get as complete of a picture of the site’s backlinks because the link data skewed toward the beginning of the alphabet.”

    “Based on feedback from the webmaster community, we’re improving how we select these backlinks to give sites a fuller picture of their backlink profile,” Haviv adds. “The most significant improvement you’ll see is that most of the links are now sampled uniformly from the full spectrum of backlinks rather than alphabetically. You’re also more likely to get example links from different top-level domains (TLDs) as well as from different domain names. The new links you see will still be sorted alphabetically.”

    Soon, Google says, when webmasters download their data, they’ll see a more diverse cross-section of links. The goal is for webmasters to more easily be able to separate the bad links from the good.

    More link profile clarity has to be a good thing, because people are freaking out about links these days, and Google itself is even mistakenly telling webmasters that legitimate links are bad in some cases. But like Google’s Matt Cutts said in a tweet, “I think that’s 1 of the benefits of more transparency is that it helps us improve on our side too.”

    Image: Google

  • Here’s A New Panda Video From Matt Cutts


Google has released a new Webmaster Help video about the Panda update. Matt Cutts responds to a user-submitted question asking how she will know whether her site has been hit by Panda now that Google has integrated it into its normal indexing process, and, if her site was already hit, how she will know whether it has recovered.

Cutts begins, “I think it’s a fair question because if the site was already hit, how will she know if she has recovered from Panda? So, Panda is a change that we rolled out, at this point, a couple years ago targeted towards lower quality content. It used to be that roughly every month or so we would have a new update, where you’d say, okay there’s something new – there’s a launch. We’ve got new data. Let’s refresh the data. It had gotten to the point, where Panda – the changes were getting smaller, they were more incremental, we had pretty good signals, we had pretty much gotten the low-hanging wins, so there weren’t a lot of really big changes going on with the latest Panda changes. And we said let’s go ahead and rather than have it be a discrete data push that is something that happens every month or so at its own time, and we refresh the data, let’s just go ahead and integrate it into indexing.”

    “So at this point, we think that Panda is affecting a small enough number of webmasters on the edge that we said, ‘Let’s go ahead and integrate it into our main process for indexing,’” he continues. “We did put out a blog post, which I would recommend, penned by Amit Singhal, that talks about the sorts of signals that we look at whenever we’re trying to assess quality within Panda, and I think we’ve done some videos about that in the past, so I won’t rehash it, but basically we’re looking for high-quality content. And so if you think you might be affected by Panda, the overriding kind of goal is to try to make sure that you have high-quality content – the sort of content that people really enjoy, that’s compelling – the sort of thing that they’ll love to read that you might see in a magazine or in a book, and that people would refer back to or send friends to – those sorts of things.”

    You can read more about that Singhal blog post here.

“That would be the overriding goal, and since Panda is now integrated with indexing, that remains the goal of the entire indexing system,” says Cutts. “So, if you’re not ranking as highly as you were in the past, overall, it’s always a good idea to think about, ‘Okay, can I look at the quality of the content on my site? Is there stuff that’s derivative or scraped or duplicate or just not as useful, or can I come up with something original that people will really enjoy, and those kinds of things tend to be a little more likely to rank higher in our rankings.”

    See all of our past Panda coverage here to learn more.

    Image: Google (YouTube)

  • Google Admits Link Mistake, Probably Won’t Help Webmaster Link Hysteria

    Google is apparently getting links wrong from time to time. By wrong, we mean giving webmasters example links (in unnatural link warning messaging) that are actually legitimate, natural links.

    It’s possible that the instances discussed here are extremely rare cases, but how do we know? It’s concerning that we’re seeing these stories appear so close together. Do you think this is an issue that is happening a lot? Let us know in the comments.

    A couple weeks ago, a forum thread received some attention when a webmaster claimed that this happened to him. Eventually Google responded, not quite admitting a mistake, but not denying it either. A Googler told him:

    Thanks for your feedback on the example links sent to you in your reconsideration request. We’ll use your comments to improve the messaging and example links that we send.

    If you believe that your site no longer violates Google Webmaster Guidelines, you can file a new reconsideration request, and we’ll re-evaluate your site for reconsideration.

    Like I said, not exactly an admission of guilt, but it pretty much sounds like they’re acknowledging the merit of the guy’s claims, and keeping these findings in mind to avoid making similar mistakes in the future. That’s just one interpretation, so do with that what you will.

    Now, however, we see a Googler clearly admitting a mistake when it provided a webmaster with one of those example URLs for a DMOZ link. Barry Schwartz at Search Engine Roundtable, who pointed out the other thread initially, managed to find this Google+ discussion from even earlier.

    Dave Cain shared the message he got from Google, which included the DMOZ link, and tagged Google’s Matt Cutts and John Mueller in the post. Mueller responded, saying, “That particular DMOZ/ODP link-example sounds like a mistake on our side.”

    “Keep in mind that these are just examples — fixing (or knowing that you can ignore) one of them, doesn’t mean that there’s nothing else to fix,” he added. “With that in mind, I’d still double-check to see if there are other issues before submitting a reconsideration request, so that you’re a bit more certain that things are really resolved (otherwise it’s just a bit of time wasted with back & forth).”

Cain asked, “Because of the types of links that were flagged in the RR response (which appear to be false negatives, i.e. DMOZ/ODP), would it be safe to assume that the disavow file wasn’t processed with the RR?”

    Mueller said that “usually” submitting both at the same time is no problem, adding, “So I imagine it’s more a matter of the webspam team expecting more.”

    It’s a good thing Mueller did suggest that Google made a mistake, given the link in question was from DMOZ. There are a lot of links in DMOZ, and that could have created another wave in the ocean of link hysteria. Directories in general have already seen a great deal of requests for link removals.

    Here’s a video from a couple summers ago with Cutts giving an update on how Google thinks about DMOZ.

Cutts, of the webspam team, did not weigh in on Cain’s conversation with Mueller (which took place on August 20th).

    Mistakes happen, and Google is not above that. However, seeing one case where Google is openly admitting a mistake so close to another case where it looks like they probably also made a mistake is somewhat troubling, considering all the hysteria we’ve seen over linking over the past year and a half.

    It does make you wonder how often it’s happening.

    Update: Just got a tweet from Cutts on the matter:

    Do you think these are most likely rarities, or do you believe Google is getting things wrong often? Share your thoughts.

    Image: Google

  • Matt Cutts On When Nofollow Links Can Still Get You A Manual Penalty

    Today, we get an interesting Webmaster Help video from Google and Matt Cutts discussing nofollow links, and whether or not using them can impact your site’s rankings.

    The question Cutts responds to comes from somebody going by the name Tubby Timmy:

    I’m building links, not for SEO but to try and generate direct traffic, if these links are no-follow am I safe from getting any Google penalties? Asked another way, can no-follow links hurt my site?

    Cutts begins, “No, typically nofollow links cannot hurt your site, so upfront, very quick answer on that point. That said, let me just mention one weird corner case, which is if you are like leaving comment on every blog in the world, even if those links might be nofollow, if you are doing it so much that people notice you, and they’re really annoyed by you, and people spam report about you, we might take some manual spam action on you, for example.”

    “I remember for a long time on TechCrunch anytime that people showed up, there was this guy anon.tc would show up, and make some nonsensical comment, and it was clear that he was just trying to piggyback on the traffic from people reading the article to whatever he was promoting,” he continues. “So even if those links were nofollow, if we see enough mass-scale action that we consider deceptive or manipulative, we do reserve the right to take action, so you know, we carve out a little bit of an exception if we see truly huge scale abuse, but for the most part, nofollow links are dropped out of our link graph as we’re crawling the web, and so those links that are nofollowed should not affect you from an algorithmic point of view.”

    “I always give myself just the smallest out just in case we find somebody who’s doing a really creative attack or mass abuse or something like that, but in general, as long as you’re doing regular direct traffic building, and you’re not annoying the entire web or something like that, you should be in good shape,” he concludes.
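    For reference, a nofollowed link is just an ordinary anchor with the rel attribute set (example.com is a placeholder here):

        <!-- dropped from Google's link graph during crawling, per Cutts's comments above -->
        <a href="http://www.example.com/" rel="nofollow">Example anchor text</a>

    Blog platforms and comment systems commonly add that attribute to user-submitted links automatically, which is why comment links generally pass no ranking value in the first place.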

    This is perhaps a more interesting discussion than it seems on the surface in light of other recent advice from Cutts, like that to nofollow links on infographics, which can arguably provide legitimate content and come naturally via editorial decision.

It also comes at a time when there are a lot of questions about the value of links and which links Google is going to be okay with, and which it is not. Things are complicated even further in instances when Google is making mistakes on apparently legitimate links, and telling webmasters that they’re bad.

    Image: Google

  • The Google ‘Not Provided’ Problem Isn’t Getting Any Better

    About two years ago, Google launched SSL Search on Google.com as the default for signed in users, as a measure to protect user privacy. This encrypted search meant not providing keyword search data through analytics to websites that these users visited. As a webmaster, you would see that you were getting this traffic from Google, but the keywords would be unknown, as Google would label this traffic “Not Provided”.

    Yes, the dreaded “not provided” continues to this day to be a hot button issue in the SEO and online marketing community. It’s complicated by the fact that you can still see such data in AdWords. People have been accusing Google of doing this to increase its own revenue since the move was made that October of 2011.

    Do you think Google is doing this to increase its own revenue or is it really about privacy? Share your thoughts.

Search industry vet Danny Sullivan has brought the discussion back to the forefront with an article about what he believes Google’s intentions to be, and what the move looks like to everyone else.

There seems to be pretty much an industry consensus that the “not provided” percentages are increasing. They had already increased significantly a month after Google made the changes. Initially, the percentage was supposedly under 1%, before jumping to something like 8% the following month. More recently, it’s looking like above 40% for some industries and over 50% for tech sites.

    As I write this, about 80% of our own real-time Google traffic is coming from keywords that are “not provided”.

    Sullivan reminds us that Google provides search terms to publishers through Webmaster Central, and of course to advertisers, and that Google recently announced the Paid & Organic report for AdWords.

    We talked about this here. This was aimed at helping businesses get more out of their paid and organic search campaigns by offering new comparison options.

    “Previously, most search reports showed paid and organic performance separately, without any insights on user behavior when they overlap,” says AdWords product manager Dan Friedman. “The new paid & organic report is the first to let you see and compare your performance for a query when you have either an ad, an organic listing, or both appearing on the search results page.”

    Google suggests using the report to discover potential keywords to add to your AdWords accounts.

    “You’ll see your top terms, sortable by clicks, queries and other ways,” Sullivan writes. “The good news is that you don’t have to be a paying AdWords customer to do this. You just need an AdWords account. The bad news is that feels wrong that Google is forcing publishers into its ad interface to get information about their ‘non-paid’ listings. It also suggests an attempt to upsell people on the idea of buying AdWords, if they aren’t already.”

    “I don’t believe things were orchestrated this way, with terms being withheld to push AdWords. I really don’t,” he adds. “I think the search team that wanted to encrypt and strip referrer information had the best intentions, that it really did believe sensitive information could be in referrer data (and it can) and sought to protect this. I think AdWords continued to transmit it because ultimately, the search team couldn’t veto that department’s decision. But regardless of the intentions, the end result looks bad.”

    It does look bad, and a lot of webmasters are not buying it. If they weren’t buying it in the first place, they’re certainly not buying it at this point as the “not provided” percentages have increased, and Google has made it harder and harder for webmasters to use keyword data to their advantage. They recently killed the Keyword Tool, which was also a disappointment to many.

    If this has all been about increasing Google’s revenue, it might be working. We recently looked at a MarketLive report finding that its merchants saw “significant changes” in the mix of paid/organic traffic. Paid search visits made up about a third of total search engine visits (up from 26% the previous year). Search visit growth slowed in the first six months of the year, but paid was up 30% while organic was down 3%.

    We know that Google is clearly trying to move further away from keywords in terms of how it delivers its results, and more and more of what Google is showing users is coming from its own results (Knowledge Graph, Maps, etc.).

    Matt Cutts recently had some interesting things to say about Google trying to extract the “gist” of queries. He was specifically responding to a question about voice search, but Google clearly wants to get to the root of what people are searching for regardless of what input method they’re using, and that means exact keywords will continue to decrease in significance, at least for certain types of queries.

    As Sullivan notes, some fear we’re headed for a “100% not provided” future, but as Google itself moves away from keyword dependence, how much will it matter in the long run?

    Have you noticed the “not provided” percentage increase for your own site? Has it affected your organic/paid search mix? Let us know in the comments.

    Image: Google Analytics

  • Demand Media Launches eHow Crafts, Google’s Algorithmic Effects Remain To Be Seen

    Demand Media has launched a new channel for its popular eHow site – eHow Crafts.

    “We’ve seen explosive consumer demand for craft-related content on eHow, which is why we’re bringing together and adding a wide variety of helpful content in a single destination on eHow,” a spokesperson for the company says. “It will include everything, from tips on making felt flowers to learning how to make a yarn painting. The new channel will offer step-by-step text articles, video demonstrations, original photography, and even templates for printable coloring books.”

    eHow Crafts

“With the launch of the new channel and the acquisition of Creativebug earlier this year, we’ve brought together one of the largest arts and crafts audiences on the Web,” the spokesperson adds. “We believe there will be synergies between both sites, as users go back and forth and enjoy the task-oriented, short-form content on the Crafts Channel and the project-oriented, long-form content on Creativebug.”

    More on the CreativeBug acquisition here.

    Crafts seems like an obvious vertical for how-to content. One might wonder why eHow is just now launching a channel.

“We’ve been focused on growing our dominant position in other categories, and it’s paid off,” Paul Lively, SVP and GM of eHow, tells WebProNews. “We hold a top 10 position in the top categories, including home (#1), personal finance (#2), health (#3 with LIVESTRONG) and pets (#5). We’re now turning our attention to the crafts category and we plan to build up this channel (as we believe that crafting is an essential category for eHow, which is a resource that people turn to every day to learn how to do things). We invested in the multi-billion dollar arts and crafts market when we acquired Creativebug earlier this year. We believe in this opportunity and we plan to dominate the crafts category with a dedicated channel that can serve as an online hub for this passionate community.”

    “Our entire business is built on listening to our consumers and giving them what they want,” he says. “This includes listening to what people want and giving them that content, which is what we’ve done by giving people more crafts content on a dedicated channel on eHow.”

    Last month, Demand Media launched eHow Now, a paid platform where users can chat directly with so-called experts, and get advice and guidance. Crafts seems like a natural vertical for such an offering. It’s not available for Crafts yet, but it’s in the cards.

    “We do have plans to make eHow Now available in the Crafts Channel for users who have questions they want answered by experts,” says Lively. “This will complement the 6 key categories currently available on eHow Now: auto, tech, health, legal, personal finance and pets.”

    No word on when that availability will happen.

    As those who follow the search industry may know, Demand Media and eHow in particular have been key properties of interest in relation to Google’s famous (or infamous if you prefer) Panda update. It was believed by many that “content farm” sites like eHow (at least the eHow of old) were largely responsible for the update to begin with, and the update made a significant impact on the company’s earnings, though it has managed to weather the storm. Last year, it returned to record profitability.

    More recently, however, Google’s algorithms have been affecting Demand Media’s properties again. On the company’s Q2 earnings call, CEO Richard Rosenblatt said that they were impacted by 30 algorithm changes since March, but noted that some were negative and others were positive.

    When asked about the new channel’s vulnerability to Google’s algorithms, Lively tells us, “We don’t break out traffic for individual categories on our sites. We do see fluctuations in traffic across all our sites over time, both up and down. Our primary focus is to provide the best consumer experience.”

    When asked if Demand Media has been affected by any Google algorithm changes since the last earnings call, he said, “We can’t specifically comment on Google’s practices. Our goal is to provide the best consumer experience, and that’s where we’ll continue to focus.”

    Image: eHow

  • Google’s Matt Cutts Talks Auto-Generated Pages

    Google takes action on pages that are auto-generated, and add no value. You probably know that. Google talks about this kind of content in its Quality Guidelines. But that hasn’t stopped Matt Cutts from discussing it further in a new Webmaster Help Video in response to the user-submitted question:

    What does Google do against sites that have a script that automatically picks up search query and makes a page about it? Ex: you Google [risks of drinking caffeine], end up at a page: “we have no articles for DRINKING CAFFEINE” with lots of ads.

    “Okay, so it’s a bad user experience,” says Cutts. “And the way that you have asked the question, we are absolutely willing to take action against those sites.”

    He notes that in the past, he has put out specific calls for sites where you search for a product, and you think you’re going to get a review, and you find a page with “zero reviews found” for that product.

    If you have a site that would have something like this, Cutts says to just block the pages that have zero results found.
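    As a rough sketch of that advice (not Google's own code), an internal search template could emit a robots noindex tag whenever the result count is zero, keeping those empty pages out of the index:

        <!-- served only when the site's internal search finds zero results (hypothetical template logic) -->
        <meta name="robots" content="noindex">

    Disallowing the search results path in robots.txt is another common way to keep crawlers away from these thin pages entirely.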

    Image: Google (YouTube)

  • Is Google Acknowledging Getting A Link Wrong?

Last week, we looked at a webmaster’s claim that Google was calling one of his natural, legitimate links unnatural, and now Google may have realized that it made a mistake.

    The webmaster had said he received a warning earlier in the year, which he deemed “understandable,” as he had worked with SEO agencies in the past that did advertorials, and was spammed with “really bad links” by other sites. He said he spent months contacting webmasters, getting links removed and nofollowed, and had about 500 links disavowed. Essentially, according to him, the site in question should have been in good shape, but when Google responded to his reconsideration request, it gave an example link that appeared to not be violating any guidelines.

    Barry Schwartz at Search Engine Roundtable, who originally pointed out the forum post in which the webmaster shared his story, now points to a comment made by a Googler in that same thread. Google’s Eric Kuan had this to say about the situation:

    Thanks for your feedback on the example links sent to you in your reconsideration request. We’ll use your comments to improve the messaging and example links that we send.

    If you believe that your site no longer violates Google Webmaster Guidelines, you can file a new reconsideration request, and we’ll re-evaluate your site for reconsideration.

    It’s not exactly admitting the mistake, but as Schwartz notes, it’s interesting that they would even respond in this scenario otherwise. They’re using the comments to improve messaging. Does that mean they realize there is merit to what this guy is saying, and will use that to keep from making similar mistakes in the future? That’s what it sounds like.

    Google seems to be all about some feedback. The company even surprised webmasters last week with a form asking about “small sites” that should rank better in Google’s results.

  • Google Wants You To Tell It Specific Sites To Rank Better. Will It Listen?

    Assuming that Matt Cutts’ Twitter account wasn’t hijacked, Google wants you to tell it if you know of any small sites that should be doing better in Google rankings.

    Do you think Google will really listen to this feedback? Do you think many will suggest sites they’re not affiliated with? Let us know what you think in the comments.

    If you’ve ever thought Google is giving too much weight to big brands, I guess this is your chance to weigh in on the better alternatives, wait, and see if your suggestion did any good.

Cutts points to a form, which says, “Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we’re just collecting feedback at this point; for example, don’t expect this survey to affect any site’s ranking.” Emphasis added.

    You simply enter the site, and then in a box, explain to Google what makes it better. Here’s what it looks like:

    Small site survey

    And….go!

    Reactions, unsurprisingly, are a bit skeptical:

    My guess is that a lot of people will be giving votes for their own sites, and few will be submitting others’. Maybe I’m wrong.

    Do you expect Google to obtain valuable information from this effort? Tell us what you think.

    Note: this post has been updated from its original form.

  • Why Google Can’t Answer All Of Your Questions About Your Site

    In a new Google Webmaster Help video, Matt Cutts talks about why Google can’t answer all of your questions. He responds to the following question:

    When will there be official Google support for webmaster questions? I only ever receive automated responses after submitting reconsideration requests despite going to length to write in detail with regards to my issues and what I have done.

    “The problem is fundamentally a scale issue,” Cutts explains. “There’s 250 million domain names. I think the most recent data that we’ve provided says that we took action on 400,000 sites to the degree that we sent them a manual message in January of 2013. And we get about 5,000 reconsideration reports each week, so about 20,000 a month. And the problem is, our primary goal has to be returning the highest quality set of search results, so that’s what we really need to work on. And then our secondary goal is to talk to webmasters about actions that we’ve taken on sites. So the problem primarily is that there’s so many webmasters on the web, and our index is really big, and we get over two billion queries a day, so we don’t really have a great way to talk one on one with individual webmasters.”

    “So we try to come up with scalable ways like webmaster videos like these that can get several thousand views, but it is really tricky to have a conversation – especially a prolonged, detailed conversation – about a particular site,” he continues. “We’ll keep looking for new ways to do better. We’ll keep looking for new ways to communicate scalably, but that’s the fundamental dilemma. That’s the issue that we face. And so the reconsideration request process, for example, you’ll typically get back, ‘Yes, you’re doing okay,’ or ‘No, you still have work to do,’ or in some cases, we process your request, which might mean, ‘Hey, you had multiple issues, and maybe one is now resolved, but there’s still more issues that need to be resolved.”

    This seems like an example of where a one-on-one conversation would be of tremendous help to the webmaster. As we talked about earlier, one webmaster was complaining in the forum that Google warned him about a natural link in a reconsideration request, leaving him wondering what he’s supposed to do with that.

    But it’s hard to argue with Matt’s point about scalability. Google is huge, but do you really think the search team could thoroughly go through every site’s issues with the webmaster individually?

  • Is Google Ever Wrong About Links?

    In case it wasn’t bad enough that fear of Google has kept people from linking to other sites, and got them requesting legitimate links be pulled down, Google is reportedly sending unnatural link warnings to sites based on links that are actually natural.

    Is Google ever wrong about links? Does Google ever really look at legitimate links as bad? Let us know what you think in the comments.

It’s hard to say whether this is happening often, or even to say with 100% certainty that it is happening at all, but Barry Schwartz at Search Engine Roundtable appears to have found at least one example in a Google help forum thread.

    The webmaster says he received a warning in February, noting that this was “understandable” because he’s worked with SEO agencies in the past that did advertorials, and was spammed with “really bad links” by unknown individuals.

    “So we spent the last months, contacting webmasters, getting links removed and nofollowed and we disavowed around 500 Links,” the webmaster writes. “Next to that we stopped the redirection from our old domain to which there are quite some spammy links pointing.”

    “I think we have done everything within our ability, at considerable time and cost to our company, to comply with Googles guidelines,” he adds. “We have completely stopped working with agencies and we pursue a quality approach.”

    He says after his last reconsideration request was declined, Google gave the following URL as an example of one of the bad links:

    http://sustainablog.org/2013/07/furniture-recycling-endangered-animals/

    “This is a completely legitimate post and it was not influenced by us in any way,” he says. “They are writing about a campaign we are running. I have the feeling this sometimes is completely random. I am even unsure if it makes sense to take the time to actually file another reconsideration request under these circumstances.”

    He later notes that there is no relationship between his company and the blog with the “bad link”.

    Another discussion participant suggests that the “money” keyword link “Guide To Recycling” in the article, which points to the webmaster’s page, could be the problem.

    “Well the so called ‘money keyword link’ was chosen by sustainablog itself, probably because they thought it would best describe what we do,” the webmaster responded. “We have no influence on this, and we certainly have no interest in ranking for ‘Guide To Recycling’”.

    So yes, this sounds like a natural link, at least from this side of the story.

    Interestingly, the person who suggested the “money keyword” issue said the same thing happened to one of their clients – also in the furniture space.

    Schwartz suggests the webmaster is “better off disavowing the link, and also finding links like it,” and doing the same for them. This might be good SEO advice, but it also highlights a possible issue in webmasters being forced to have Google ignore legitimate links.
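    For readers who haven’t used it, the disavow file referenced throughout this saga is just a plain text list uploaded through Google’s Disavow Links tool; a minimal sketch with made-up domains:

        # lines beginning with # are comments
        # ask Google to ignore every link from an entire domain
        domain:spammy-directory-example.com
        # or ignore a single URL
        http://link-network-example.com/paid-post/

    Google treats the listed links as a strong suggestion to ignore them when assessing the site, which is exactly why having to disavow legitimate links feels so wrong to webmasters.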

    If this is really what’s going on, it’s pretty sad.

    It does, however, come at a time when independent reports are finding strong correlation between Google+ and authorship and search rankings. You have to wonder if links are simply starting to play less of a role in Google’s algorithm than in the past. Even if they are still playing a role, it’s possible that they’re not being given as much weight. Following a recent Moz (formerly SEOmoz) report about +1s and rankings, Matt Cutts set out to “debunk the idea that more Google +1s lead to higher Google web rankings.” But if you think about +1s like links, it’s not necessarily link quantity that really counts either.

There’s also a question about whether Google is going to continue to update Toolbar PageRank. It’s not the same as pure PageRank, but it’s still a de-emphasis, if they are in fact killing it.

    Either way, Google has been changing its wording related to link guidelines, putting out multiple new videos about “unnatural links” and suggesting webmasters use nofollow on more types of content.

    Do you think Google is capable of making mistakes like this? If so, do you think it happens often? Share your thoughts.

    Note: This article has been updated from its original form.

    Image: ThinkStock

  • Are You Getting More Out Of Paid Search Than From SEO?

    As you’ve probably found out, getting your content seen in Google’s organic listings is not as easy as it used to be. It’s no wonder that businesses are getting more out of paid listings than they are organic search traffic.

Is this the case for your business, or do you get more out of organic SEO? Let us know in the comments.

    Google has launched a new Paid & Organic report in AdWords aimed at helping businesses get more out of their paid and organic search campaigns by offering new comparison options.

    “Previously, most search reports showed paid and organic performance separately, without any insights on user behavior when they overlap,” says AdWords product manager Dan Friedman. “The new paid & organic report is the first to let you see and compare your performance for a query when you have either an ad, an organic listing, or both appearing on the search results page.”

    Google suggests using the report to discover potential keywords to add to your AdWords accounts by looking for queries where you only appear in organic search with no associated ads, as well as for optimizing your presence on high value queries and measuring changes to bids, budgets, or keywords and their impact across paid, organic and combined traffic.

    Paid & Organic Report

    Image: Google

    Digital marketing firm IMPAQT was part of the beta testing, and says, “The paid & organic report has been incredibly useful in understanding the interaction between paid and organic search, and the overall synergy when they are working together. For one of our client’s key branded queries, we saw an 18% increase in CTR when paid and organic work together, as opposed to only having the organic listing.”

    It’s worth noting that Google itself shared this quote.

    To take advantage of the Paid & Organic report, you have to link your AdWords account to Webmaster Tools, and you have to be a verified owner or be granted access by one.

    MarketLive has put out a report finding that its merchants saw “significant changes” in the mix of paid/organic traffic. Paid search visits made up about a third of total search engine visits (up from 26% the previous year), while revenue from paid search grew to 44% of total search engine visit revenue (up from 40% in 2012). Interestingly, search visit growth altogether slowed in the first six months of the year, but paid was up 30% while organic was down 3%.

    Paid/Organic Search Traffic

    Image: Marketlive

    Here’s a side-by-side comparison of conversions, order size, new visits, bounce rate and pages per visit. As you can see, paid performs better across the board, except for new visits, which makes sense if you consider brand familiarity.

    Marketlive: Paid vs. Organic

    Image: Marketlive

    The report delves into performance across verticals, device comparisons and more, if you want to check it out (registration required).

    This is only one study, of course, but the signs are pointing to businesses getting more out of paid search than out of organic search. While Google’s new report feature could help both, it certainly seems geared toward using what you learn from your organic performance to put toward your paid campaigns. And again, Google certainly isn’t making things any easier for those trying to be found in organic results.

For one thing, Google results simply have a lot more types of results than they used to, and on many pages, that means fewer traditional organic results. For another thing, people are afraid to link out, and to have links pointing toward them, which surely can’t be a great thing for traditional SEO, considering that Google’s algorithm (while including over 200 signals) has historically placed a great deal of its confidence in legitimate linking.

    Between webmaster paranoia, Google’s somewhat mixed messaging and ongoing “advice,” and its ever-changing algorithms, many businesses are finding out the hard way that relying too heavily on organic search is just detrimental. Paid search is less risky. It’s also how Google makes the bulk of its money.

The AdWords department lost some trust points this week, however, when an account manager’s accidental voicemail recording gained some attention. Basically, he expressed his distaste that the client had upgraded to Google’s Enhanced Campaigns without consulting him, and that he would now have to pitch call extensions and sitelinks. He also noted that he didn’t care about bridge pages or parked domains.

    As Ginny Marvin at Search Engine Land writes, the implications of that are that AdWords account reps are paid to upsell new products/services that may or may not be in clients’ best interests, an account rep was willing to ignore a breach of Google’s own policies, and that AdWords account managers are “sales people first and foremost.”

    Google indicated that this person was not an actual Google employee, but a contractor, and that they had already removed them from the AdWords team, but as Marvin points out, it’s unclear whether this is potentially a bigger issue or if this one person’s attitude is just a rare case. Either way, it hasn’t been great for advertiser perception.

    But what are you gonna do?

    Obviously, when it comes to paid and organic search, the idea is to get them to work together. It’s not necessarily an “either or” situation, but there is always a question of how to balance your resources.

    Do you get better performance from paid search or organic SEO? Let us know in the comments.

  • Today On The Matt Cutts Show: Page Speed As A Ranking Factor

    In Google’s latest Webmaster Help video, Matt Cutts discusses page speed and whether or not it’s a more important factor for mobile than for desktop.

    The video is a response to the submitted question:

    Is load speed a more important factor for mobile? Is it really something that can change your rankings, all the things being equal?

    “Let’s start with the second part of that question: all things being equal,” Cutts begins. “If your site is really, really slow, we’ve said that we do use page speed in our rankings. And so all of the things being equal, yes, a site can rank lower. Now, we tend not to talk about things in terms of like an absolute number of seconds because websites do work differently in different parts of the world, and there’s different bandwidth and speeds in different parts of the world. However, it’s a good way to think about it to say, ‘Okay, look at your neighborhood of websites. Look at the sites that are returned along with you, and then if you’re the outlier. If you’re at the very bottom end because your site is really, really slow, then yes, it might be the case that your site will rank lower because of its page speed.”

“Now, what’s interesting is that that factor applies across the board,” he continues. “It’s not specific to mobile…it’s not that in mobile we apply that any more or less than we do for desktop search, but if you’re using your mobile phone, you do care a lot about whether it will load in a reasonable period of time. So we’ll continue to look at ways to improve the ways that we find out how fast a site is, the page speed for a particular page, and then try to figure out whether it makes sense…okay, if we want users to be less frustrated, then maybe it does make sense to incorporate that more into our rankings or more for mobile. Something along those lines.”

    About two years ago, Cutts said page speed affects rankings in about one out of a hundred searches, and that you shouldn’t overly stress about it. It’s unclear how much it matters now compared to then.

  • Google Answers Questions About Authorship

    Google posted to its Webmaster Central blog today to address seven questions the company is commonly hearing about authorship, or rel=”author”.

    The post discusses what kinds of pages can be used with authorship, use of company mascots as authors, language issues, multiple authors for a single article, preventing Google from showing authorship, the difference between rel=author and rel=publisher, and use of authorship on property listings and product pages.

Google says it only uses authorship when a page contains a single article (or subsequent versions of the article) or piece of content by one author, rather than a list of articles or an updating feed, and when the page consists primarily of content written by that author. The page also needs a clear byline with the same name as the one used on the author’s Google+ profile.

    “Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic,” writes Maile Ohye, developer programs tech lead for Google. “Since property listings and product pages are less perspective/analysis oriented, we discourage using authorship in these cases. However, an article about products that provides helpful commentary, such as, “Camera X vs. Camera Y: Faceoff in the Arizona Desert” could have authorship.”

    Google only supports one author per article currently, but says it is experimenting with finding “the optimal outcome” when there are multiple authors. Google wants humans for authorship, so don’t use it for your mascot.

    On rel=author vs. rel=publisher, Ohye says, “rel=publisher helps a business create a shared identity by linking the business’ website (often from the homepage) to the business’ Google+ Page. rel=author helps individuals (authors!) associate their individual articles from a URL or website to their Google+ profile. While rel=author and rel=publisher are both link relationships, they’re actually completely independent of one another.”
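    As a rough illustration (the profile URLs below are placeholders, not taken from Google’s post), the two relationships are typically implemented as simple links on the page:

        <!-- rel=author: ties this article to an individual writer's Google+ profile -->
        <a href="https://plus.google.com/111111111111111111111?rel=author">Jane Doe</a>

        <!-- rel=publisher: ties the site (usually the homepage) to the business's Google+ page -->
        <link rel="publisher" href="https://plus.google.com/+ExampleBusiness">

    As noted above, the byline name on the page still has to match the name on the Google+ profile for the annotation to show up in results.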

    If you’re wondering if you should have URLs for content in different languages pointing to two separate Google profiles in different languages, the answer is no. Use one Google+ profile in your language of preference.

    If you don’t want authorship to be displayed in Google results, simply prevent your Google profile from being discoverable in search results. If you don’t want to do that, you can just remove any profile or contributor links to the site or remove the markup so it’s not connected with your profile.