WebProNews

Tag: SEO

  • Google’s OK With This Kind Of Hidden Text

    Today’s Webmaster Help video from Google is interesting. It tackles hidden text, though not the kind that Google has always spoken out against (and addresses in its quality guidelines). Instead, it covers a more legitimate kind.

    In the video, Matt Cutts answers the following submitted question:

    How does Google treat hidden content which becomes visible when clicking a button? Does it look spammy if most of the text is in such a section? (e.g. simple page to buy something and “show details” button which reveals a lot of information about it).

    “I wouldn’t be overly concerned about this, but let’s talk through the different consequences,” begins Cutts. “It’s pretty common on the web for people to want to be able to say, ‘Click here,’ and then ‘show manufacturer details,’ ‘show specifications,’ ‘show reviews,’ and that’s a pretty normal idiom at this point. It’s not deceptive. Nobody’s trying to be manipulative. It’s easy to see that this is text that’s intended for users, and so as long as you’re doing that, I really wouldn’t be too stressed out.”

    He continues, “Now certainly if you were using a tiny little button that users can’t see, and there’s like six pages of text buried in there, and it’s not intended for users, and it’s keyword-stuffing, then that is something that we could possibly consider hidden text or probably would consider hidden text, but in general, if you just have something where you have a nice AJAXy sort of site, and things get revealed, and you’re trying to keep things clean, that’s not the sort of thing that’s going to be on the top of our list to worry about because there’s a lot of different sites that really do that.”

    “It’s pretty common on the web, and a lot of people expect that on the web,” he says. “Take, for example, Wikipedia on your mobile phone – they’ll have different sections, and then if you click, they expand those sections, and there’s good usability reasons for doing that, so as long as you’re not trying to stuff something in in a hidden way that’s deceptive or trying to distort the rankings – as long as you’re just doing that for users, I think you’ll be in good shape.”

    As a user notes in the comments of the video, you have to click a button to reveal the video description on YouTube.
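
    For anyone unfamiliar with the pattern, it can be as simple as a few lines of HTML and JavaScript (a hypothetical sketch; the element names and copy are made up). The details are in the page for users and crawlers alike, just collapsed until clicked:

      <button onclick="var d = document.getElementById('details');
                       d.style.display = (d.style.display === 'none') ? 'block' : 'none';">
        Show details
      </button>
      <!-- The content is present in the HTML and intended for users;
           it is simply hidden until the button is clicked. -->
      <div id="details" style="display:none">
        <p>Manufacturer details, specifications, reviews...</p>
      </div>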

  • Matt Cutts Talks About Duplicate Content With Regards To Disclaimers, Terms/Conditions

    Google’s Matt Cutts has put out a new Webmaster Help video once again discussing duplicate content. This time it’s about how duplicate content relates to legally required content, such as disclaimers and terms and conditions. The exact question Cutts responds to is:

    How does duplicate copy that’s legally required (ie Terms & Conditions across multiple offers) affect performance in search?

    Cutts notes that there was a follow-up comment to the question, saying that some in the financial services industry are interested in the answer.

    “The answer is, I wouldn’t stress about this unless the content that you have is duplicated as spammy or keyword stuffing or something like that, you know, then we might be – an algorithm or a person might take action on – but if it’s legal boiler plate that’s sort of required to be there, we might, at most, might not want to count that, but it’s probably not going to cause you a big issue,” says Cutts.

    “We do understand that lots of different places across the web do need to have various disclaimers, legal information, terms and conditions, that sort of stuff, and so it’s the sort of thing where if we were to not rank that stuff well, then that would probably hurt our overall search quality, so I wouldn’t stress about it,” he says.

    So, long story short: don’t make your disclaimers and terms spammy, just like with any other content. As usual, if you play by the rules (Google’s quality guidelines), you should be fine.

  • Google Panda Update Rolls Out With New Signals

    It’s a day, so Google is updating its algorithm. Apparently there’s a Panda update currently under way.

    Barry Schwartz over at Search Engine Roundtable noticed people at WebmasterWorld talking about “another shuffle taking place in Google,” as happens pretty much all the time. Panda was suspected, and according to Schwartz, Google has confirmed that Panda is indeed the culprit. He shares this statement at Search Engine Land:

    In the last few days we’ve been pushing out a new Panda update that incorporates new signals so it can be more finely targeted.

    Google has indicated in the past that it would no longer be confirming Panda updates, but apparently they changed their minds. Schwartz quotes the company as saying this one is “more finely targeted”.

    Some in the forums are suggesting Panda recoveries.

    Back in May, Google’s Matt Cutts released a video discussing coming changes SEOs and webmasters could expect from Google. In that, he indicated that Panda would ease up a little bit.

    He said, “We’ve also been looking at Panda, and seeing if we can find some additional signals (and we think we’ve got some) to help refine things for the sites that are kind of in the border zone – in the gray area a little bit. And so if we can soften the effect a little bit for those sites that we believe have some additional signals of quality, then that will help sites that have previously been affected (to some degree) by Panda.”

    Either way, Panda has managed to get Hitler riled up again.

    Image: fortherock (Flickr)

  • Google: You Probably Shouldn’t Link Your 20 Domains Together

    You know how Google has everybody afraid of links? People are also afraid to link to their own stuff in certain ways, and Google’s Webmaster Help video today pretty much indicates that this is with good reason.

    If you have a bunch of domains, you need to be careful about linking them to each other, because Google’s not a fan. Matt Cutts took on the following question in the video:

    Should a customer with 20 domain names link it all together or not, and if he links it should he add nofollow to the links not to pass PageRank?

    “Well first off, why do you have 20 domain names?” Cutts begins. “You know, if it’s all ‘CheapOnlineCasinos’ or ‘MedicalMalpracticeInOhio,’ you know, that sort of stuff, having 20 domain names there can look pretty spammy, and I probably would not link them all together. On the other hand, if you have 20 domain names and they’re all versions of your domain in different countries, right – Google.co.za, Google.fr, Google.de – that sort of thing, then it can make a lot of sense to have some way to get from one version of the domain to a different version.”

    “But even then,” he adds, “I probably wouldn’t link all the domains even in the footer, all by themselves, because that’s a little bit strange. I’d probably have one link to a country locator page, which might even be on domain.com, and you might have flags or something like that, so there are ways to get to those other domains. And as long as there’s a good way for users to get there, then search engines will be able to follow those links as well. Just make sure that they’re normal static HTML links, and we’ll be able to follow, and the PageRank will flow, and all of that sort of thing. So if there’s a really good reason for users to do it – maybe you could have a dropdown where you could pick your country or something like that – then it might make sense.”

    “But having the country top-level domains is one of the only areas where I can think of where you’d really need to have twenty different domains,” says Cutts. “In theory, you might have a blog network, but even then, you know, I’ve seen very large blog networks, and you’ve got that footer at the bottom that has a lot of unrelated domains, and at some point it gets pretty big. Even then, you’d probably only have like ten domains, and maybe a few posts on each domain that are linking to each other, so at the point where you have 20, unless there’s a really good reason, I would be a little bit leery of just doing some massive cross-linking scheme between all of them.”

    So it appears that even if you’re not trying to “scheme” per se, Google might view it as a scheme, if you link your own web properties together in a way that it doesn’t like. Of course, there’s always nofollow.
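
    To make Cutts’ suggestion concrete, here is a minimal sketch of the country locator approach, using the plain static HTML links he recommends (all domain names are placeholders):

      <!-- One crawlable static link to a locator page, rather than 20 footer links -->
      <a href="http://www.example.com/countries">Choose your country</a>

      <!-- On the locator page itself, plain links that users and crawlers can follow -->
      <ul>
        <li><a href="http://www.example.fr/">France</a></li>
        <li><a href="http://www.example.de/">Germany</a></li>
        <li><a href="http://www.example.co.za/">South Africa</a></li>
      </ul>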

  • No, Google Still Doesn’t Think Link Building Is Bad

    For more than a year, webmasters have been receiving a great deal of messages from Google about unnatural links pointing to their sites. Sometimes it’s obvious which links Google does not like, but often it’s not so clear. As we’ve discussed repeatedly in the past, people have become so afraid of links that they’ll try to have legitimate links removed from legitimate sites, reversing old link building efforts for fear that Google would disapprove and send a penalty their way.

    Has Google made you afraid to build links or to leave existing links on the web? Let us know in the comments.

    Since this phenomenon really started to run rampant, Google has given webmasters the Disavow Links tool, which lets them tell Google which links to ignore. But Google’s message with that has basically been that most people shouldn’t use it, and that before using it, you should do everything you can to clean up the bad links you have out there. So, while it is perhaps a helpful tool, it hasn’t necessarily put all of the fear of link building to bed.

    But Google wants you to know that it doesn’t consider link building “illegal”. Google’s Matt Cutts did an interview with Stone Temple’s Eric Enge last week, which Cutts tweeted out to his followers as a reading recommendation.

    We discussed some of the things Cutts said, mainly surrounding guest posts, in another article, but link building was another big area of discussion. Enge, introducing the piece, notes that there are people who think link building is illegal.

    “No, link building is not illegal,” says Cutts. “It’s funny because there are some types of link building that are illegal, but it’s very clear-cut: hacking blogs, that sort of thing is illegal.”

    But even beyond actual law, Cutts confirms that “not all link building is bad.”

    “The philosophy that we’ve always had is if you make something that’s compelling then it would be much easier to get people to write about it and to link to it,” Cutts tells Enge. “And so a lot of people approach it from a direction that’s backwards. They try to get the links first and then they want to be grandfathered in or think they will be a successful website as a result.”

    He notes that a link from a press release would “probably not count,” but if the press release convinces an editor or reporter to write a story about it, then the editorial decision counts for something.

    Cutts thinks a great way to build links is to build strong Twitter, Facebook and Google+ presences, and strong, engaged followings, then create great content that you push out to the audience, who will likely share it, and start doing other things that cause visibility and help it rank (these are actually Enge’s words, but Cutts “completely” agrees).

    In essence, you shouldn’t rely completely on Google, and should diversify your way of getting to your audience. If the Panda update taught the web one lesson, that was it. Ask Demand Media.

    When asked about authority as a ranking factor, Cutts tells Enge, “I would concentrate on the stuff that people write, the utility that people find in it, and the amount of times that people link to it. All of those are ways that implicitly measure how relevant or important somebody is to someone else. Links are still the best way that we’ve found to discover that, and maybe over time social or authorship or other types of markup will give us a lot more information about that.”

    On the subject of those link messages Google sends webmasters, people often say they want Google to give them more specific examples of bad links. Google says it will try to give more in the future. This was the subject of a new Webmaster Help video Cutts put out this week.

    “We’re working on becoming more transparent, and giving more examples with messages as we can,” said Cutts. “I wouldn’t try to say, ‘Hey, give me examples in a reconsideration request,’ because a reconsideration request – we’ll read what you say, but we can really only give a small number of replies – basically ‘Yes, the reconsideration request has been granted,’ or ‘No, you still have work to do.’ There’s a very thin middle ground, which is, ‘Your request has been processed.’ That usually only applies if you have multiple webspam actions, and maybe one has been cleared, but you might have other ones left. But typically you’ll get a yes or no back.”

    He continued, “But there’s no field in that request to say – a live amount of text – to just say, ‘Okay, here’s some more examples.’ But we will work on trying to get more examples in the messages as they go out or some way where you…for example, it would be great if you could just log into Webmaster Tools and see some examples there.”

    “What I would say is that if you have gotten that message, feel free to stop by the Webmaster Forum, and see if you can ask for any examples, and if there’s any Googlers hanging out on the forum, maybe we can check the specific spam incident, and see whether we might be able to post or provide an example of links within that thread,” Cutts concludes. “But we’ll keep working on trying to improve things and making them more transparent.”

    I don’t think the audience was completely satisfied with Cutts’ video. The top YouTube comments as of the time of this writing are:

    “Great question. Very unsatisfying answer.”

    and

    “Matty, great non-answer. You should run for office!”

    Those were the two with the most upvotes.

    Have you been affected by Google’s link warnings? Do you think Google provides a sufficient amount of examples of what it considers to be bad links? Have you altered your link building strategy over the past year? Let us know in the comments.

  • Google Will Try To Get More Examples Of ‘Bad Links’ In Messages To Webmasters

    Google says it will try to get more examples of so-called “bad links” in its messages to webmasters who have submitted reconsideration requests after being hit with webspam penalties.

    In a Webmaster Help video today, Google’s Matt Cutts responded to the submitted question:

    Client got unnatural links warning in Sept’ 12 without any example links, 90% links removed, asked for examples in every RR but no reply, shouldnt it be better to have live/cached “list” of bad links or penalties in GWT? Think about genuine businesses.

    “That’s fair feedback. We appreciate that,” says Cutts. “We’re working on becoming more transparent, and giving more examples with messages as we can. I wouldn’t try to say, ‘Hey, give me examples in a reconsideration request,’ because a reconsideration request – we’ll read what you say, but we can really only give a small number of replies – basically ‘Yes, the reconsideration request has been granted,’ or ‘No, you still have work to do.’ There’s a very thin middle ground, which is, ‘Your request has been processed.’ That usually only applies if you have multiple webspam actions, and maybe one has been cleared, but you might have other ones left. But typically you’ll get a yes or no back.”

    He continues, “But there’s no field in that request to say – a live amount of text – to just say, ‘Okay, here’s some more examples.’ But we will work on trying to get more examples in the messages as they go out or some way where you…for example, it would be great if you could just log into Webmaster Tools and see some examples there.”

    “What I would say is that if you have gotten that message, feel free to stop by the Webmaster Forum, and see if you can ask for any examples, and if there’s any Googlers hanging out on the forum, maybe we can check the specific spam incident, and see whether we might be able to post or provide an example of links within that thread,” Cutts concludes. “But we’ll keep working on trying to improve things and making them more transparent.”

    How would you like to see Google approach this issue?

  • Google Warns Sites About Browser History Spam

    Google took to its Webmaster Central blog on Friday to warn webmasters that if they engage in a particular type of deceptive behavior, it “may” take action on guilty sites.

    First off, I’m pretty sure that Google absolutely will take action on a site that it catches engaging in pretty much any kind of deceptive behavior that violates its quality guidelines. That’s pretty standard.

    Apparently, however, Google has been seeing more sites that are tricking users’ browsers into going to new pages full of ads when they push the back button. This is the subject of Google’s warning.

    “Recently, we’ve seen some user complaints about a deceptive technique which inserts new pages into users’ browsing histories,” explains Michael Wyszomierski from Google’s Search Quality Team. “When users click the ‘back’ button on their browser, they land on a new page that they’ve never visited before. Users coming from a search results page may think that they’re going back to their search results. Instead, they’re taken to a page that looks similar, but is actually entirely advertisements.”

    Wyszomierski shows this as an example:

    [Image: example of back button spam]

    “To protect our users, we may take action on, including removal of, sites which violate our quality guidelines, including for inserting deceptive or manipulative pages into a user’s browser history,” says Wyszomierski. “As always, if you believe your site has been impacted by a manual spam action and is no longer violating our guidelines, you can let us know by requesting reconsideration.”

    This is one of those things that is so obviously deceptive, it’s a little surprising that Google even feels the need to warn those engaging in this practice. By doing this, you’re essentially preventing people from getting back to their Google results when they’re unsatisfied with the result they clicked on. Of course Google isn’t going to stand for that.
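
    Google’s post doesn’t name the technique, but the mechanism being abused is presumably the standard HTML5 History API, which lets a page add session history entries without navigating. A minimal sketch of the call involved:

      <script>
      // history.pushState() adds an entry to the session history without
      // loading a new page. Single-page apps use it legitimately to keep
      // the URL in sync with content; the spam pattern described above
      // inserts an extra entry so the Back button lands on an ad page
      // instead of the search results the user came from.
      history.pushState({via: "example"}, "", "/inserted-history-entry");
      </script>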

    You were lucky enough to get the user to click on your result. Wouldn’t your time be better spent giving them what they actually want rather than doing shady stuff like this? Come on.

  • As Google Moves Away From Keywords, Can You Optimize For Gist?

    Today, keywords still play a significant role in search habits, and in how Google and other search engines deliver search results. The trend, however, is moving further and further away from this, especially on Google’s side. Google wants to become less dependent on keywords, and it is gradually doing so.

    Do you see this trend as a problem or a potential problem to your online marketing efforts? Tell us what you think.

    When Google launched the Knowledge Graph, it was clear how proud the company’s engineers and executives were of what they had put together.

    Google’s Matt Cutts proclaimed, “It’s another step away from raw keywords (without knowing what those words really mean) toward understanding things in the real-world and how they relate to each other. The knowledge graph improves our ability to understand the intent of a query so we can give better answers and search results.”

    Since then, Google has made numerous enhancements to the Knowledge Graph, and has tweaked its algorithm in other ways that would seem to indicate a decreased dependence on keywords. In fact, there have probably been a number of changes related to this that we don’t even know about, because Google stopped publishing its monthly lists of algorithm updates for some reason.

    Then there’s search-by-voice and conversational search.

    Google put out a pretty interesting Webmaster Help video this week in which Cutts discusses voice search’s impact on searcher behavior. In response to the question, “How has query syntax changed since voice search has become more popular?” Cutts talks about the trends that Google is seeing.

    “It’s definitely the case that if you have something coming in via voice, people are more likely to use natural language,” says Cutts. “They’re less likely to use like search operators and keywords and that sort of thing. And that’s a general trend that we see. Google wants to do better at conversational search, and just giving your answers directly if you’re asking in some sort of a conversational mode.”

    While search-by-voice is certainly a growing trend on mobile, Google, as you may know, recently launched its conversational search feature for the desktop, and improvements to that shouldn’t be far off.

    Cutts continues, “At some point, we probably have to change our mental viewpoint a little bit, because normally if you add words onto your query, you’re doing an ‘and’ between each of those words, and so as you do more and more words, you get fewer and fewer results, because fewer and fewer documents match those words. What you would probably want if you have spoken word queries is the more that you talk, the more results you get because we know more about it, and so you definitely have to change your viewpoint from ‘it’s an and of every single word’ to trying to extract the gist – you know, just summarize what they’re looking for, and then matching that overall idea.”

    Good luck trying to optimize for gist.

    “If you take it to a limit, you can imagine trying to do a query to Google using an entire document or you know, a thousand words or something like that,” Cutts adds. “And rather than match only the documents that had all thousand of those words, ideally, you’d say, ‘Okay, what is the person looking for? Maybe they’re telling you an awful lot about this topic, but try to distill down what the important parts are, and search for that.’ And so it’s definitely the case that query syntax has changed. I think it will continue to change. You know, we allow people to query by images. You can search for related images by dragging and dropping a picture on Google Image Search. So people want to be able to search in all kinds of ways. They don’t want to think about keywords if they can avoid it, and I think over time, we’ll get better and better at understanding that user’s intent whenever we’re trying to match that up and find the best set of information or answers or documents – whatever it is the user’s looking for.”

    These days, Google is pretty hit and miss on the relevancy front when it comes to voice search, but I have no doubt that it will continue to improve rapidly. It’s already gotten significantly better than it was in earlier days.

    Can you optimize for gist? How will you adjust your SEO strategy as Google moves further and further away from keywords? Let us know in the comments.

  • Do You Follow Google’s Rules On Guest Posts?

    Google’s view of guest blog posts has come up in industry conversation several times this week. Webmasters and marketers have long engaged in the practice of writing articles for third-party sites as a content marketing strategy. Some have taken it to greater extremes of “SEO,” but regardless of how hard you’re pushing for a boost in PR from these articles, you might want to consider what Google has been saying about the matter.

    Do you write guest posts for other sites? Include guest posts on your site? Are you hoping to just provide good content or are you looking for linkjuice to help your Google rankings? Let us know in the comments.

    As far as I can tell, this week’s conversation started with an article at HisWebMarketing.com by Marie Haynes, and now Google’s Matt Cutts has been talking about it in a new interview with Eric Enge.

    Haynes’ post, titled “Yes, high quality guest posts CAN get you penalized!” shares several videos of Googlers talking about the subject. The first is an old Matt Cutts Webmaster Help video that we’ve shared in the past.

    In that, Cutts basically said that it can be good to have a reputable, high quality writer do guest posts on your site, and that it can be a good way for some lesser-known writers to generate exposure, but…

    “Sometimes it gets taken to extremes. You’ll see people writing…offering the same blog post multiple times or spinning the blog posts, offering them to multiple outlets. It almost becomes like low-quality article banks.”

    “When you’re just doing it as a way to sort of turn the crank and get a massive number of links, that’s something where we’re less likely to want to count those links,” he said.

    The next video Haynes points to is a Webmaster Central Hangout from February:

    When someone in the video says they submit articles to the Huffington Post, and asks if they should nofollow the links to their site, Google’s John Mueller says, “Generally speaking, if you’re submitting articles for your website, or your clients’ websites and you’re including links to those websites there, then that’s probably something I’d nofollow because those aren’t essentially natural links from that website.”

    Finally, Haynes points to another February Webmaster Central hangout:

    In that one, when a webmaster asks if it’s okay to get links to his site through guest postings, Mueller says, “Think about whether or not this is a link that would be on that site if it weren’t for your actions there. Especially when it comes to guest blogging, that’s something where you are essentially placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a linkbuilding point of view. I think sometimes it can make sense to guest blog on other peoples’ sites and drive some traffic to your site because people really liked what you are writing and they are interested in the topic and they click through that link to come to your website but those are probably the cases where you’d want to use something like a rel=nofollow on those links.”

    Barry Schwartz at Search Engine Land wrote about Haynes’ post, and now Enge has an interview out with Cutts who elaborates more on Google’s philosophy when it comes to guest posts (among other things).

    Enge suggests that when doing guest posts, you create high-quality articles and get them published on “truly authoritative” sites that have a lot of editorial judgment, and Cutts agrees.

    He says, “The problem is that if we look at the overall volume of guest posting we see a large number of people who are offering guest blogs or guest blog articles where they are writing the same article and producing multiple copies of it and emailing out of the blue and they will create the same low quality types of articles that people used to put on article directory or article bank sites.”

    “If people just move away from doing article banks or article directories or article marketing to guest blogging and they don’t raise their quality thresholds for the content, then that can cause problems,” he adds. “On one hand, it’s an opportunity. On the other hand, we don’t want people to think guest blogging is the panacea that will solve all their problems.”

    Enge makes an interesting point about accepting guest posts too, suggesting that if you have to ask the author to share with their own social accounts, you shouldn’t accept the article. Again, Cutts agrees, saying, “That’s a good way to look at it. There might be other criteria too, but certainly if someone is proud to share it, that’s a big difference than if you’re pushing them to share it.”

    Both agree that interviews are good ways to build links and authority.

    In a separate post on his Search Engine Roundtable blog, Schwartz adds:

    You can argue otherwise but if Google sees a guest blog post with a dofollow link and that person at Google feels the guest blog post is only done with the intent of a link, then they may serve your site a penalty. Or they may not – it depends on who is reviewing it.

    That being said, Google is not to blame. While guest blogging and writing is and can be a great way to get exposure for your name and your company name, it has gotten to the point of being heavily abused.

    He points to one SEO’s story in a Cre8asite forum thread about a site wanting to charge him nearly five grand for one post.

    Obviously this is the kind of thing Google would frown upon when it comes to link building and links that flow PageRank. Essentially, these are just paid links, and even if more subtle than the average advertorial (which Google has been cracking down on in recent months), in the end it’s still link buying.

    But there is plenty of guest blogging going on out there in which no money changes hands. Regardless of your intentions, it’s probably a good idea to just stick the nofollows on if you want to avoid getting penalized by Google. If it’s still something you want to do without the SEO value as a consideration, there’s a fair chance it’s the kind of content Google would want anyway.
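
    Mechanically, sticking the nofollows on just means adding rel="nofollow" to the links pointing back at your site. A hypothetical guest-post byline, for example:

      <!-- A guest-post byline link with PageRank flow switched off -->
      <p>This guest post was written by
        <a href="http://www.example.com/" rel="nofollow">Jane Author</a>.</p>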

    Are you worried that Google could penalize you for writing high quality blog posts for third-party sites? Let us know in the comments.

  • Matt Cutts Talks About Site Downtime’s Impact On Rankings

    Google has released a new Webmaster Help video with Matt Cutts addressing the question: If my site goes down for a day, does that affect my rankings?

    Sound familiar? I thought so too. Earlier this year, Cutts did a similar video addressing the question: How do I get my search rankings back after my site has been down?

    Here’s the new one:

    “Well, if it was just for a day, you should be in pretty good shape,” says Cutts. “You know, if your host is down for two weeks, then there’s a better indicator that the website is actually down, and we don’t want to send users to a website that’s actually down, but we do try to compensate for websites that are transiently or sporadically down, and you know, make a few allowances. We try to come back 24 hours later or something like that, so if it was only just a short period of down time, I wouldn’t really worry about that.”

    He adds that you might want to drop into the Google Webmaster forum and look around a little. He notes that there was recently a day where Googlebot itself was having trouble fetching pages. It usually has “pretty good reliability,” though, he says.

  • Linking Practices That Annoy Matt Cutts, But Don’t Make A Difference To Google

    Google posted an interesting Webmaster Help video today about linking. It’s basically about whether it’s better to link to an original source somewhere at the top of a post or at the bottom. The answer is essentially that it makes no difference as far as Google’s algorithm is concerned. The link will flow PageRank either way, so as far as SEO is concerned, it really doesn’t matter.

    After Cutts answers the question directly, he gets into his personal opinions and discusses what he finds annoying about linking practices.

    “I’ll just say, for my personal preference, I really appreciate when there’s a link somewhere relatively close to the top of the article because I really kind of want to know when someone’s talking about it, you know, hey, go ahead and show me where I can read the original source or let me look up more information,” says Cutts. “There are a lot of blogs that will give one tiny little link all the way at the bottom of a big long story, and by that time, it just doesn’t seem like it’s quite as useful, but that’s just a personal preference. That’s not ranking advice as far as it goes.”

    “The only other thing I hate – this is once again just personal – is whenever you’ve got a regular news report, whether it’s in a mainstream newspaper – New York Times, AP, whatever – and they say, ‘Blah Blah Blah said on a popular webmaster blog that blah blah blah,’ and they don’t link to the source,” he continues. “I mean, come on. Link to your sources, whether you’re a journalist, whether you’re a blogger, let people go and look at the original information themselves so that they can suss out what they think about whatever it is that you’re writing about. So if you just say, ‘Oh, it was discovered on a popular forum that blah blah blah,’ then we have to go look for it. That’s really annoying.”

    “Again, not ranking advice,” he reiterates. “Just asking everybody to be considerate on the web, and share credit, and attribute, so that people can, you know, do the research for themselves if they want to.”

    As if anybody on the web would ever be inconsiderate.

  • Hitler Is Talking About The Google Panda Update Again

    There’s a new Hitler Panda update video out. This isn’t the first time we’ve seen Hitler address the subject of Google’s controversial algorithm update, but it’s been a while. In fact, Hitler seems to have completely reversed his position from this one where he was clearly not a fan of the update.

    This time, Hitler is running Google, and is a huge proponent of the update. The video from FreewareGenius was uploaded on July 4th.

    Samer from FreewareGenius.com tells WebProNews, “Panda killed my site but at least I got to make a Hitler video about it entitled ‘Hitler as Google CEO’”.

    Yeah, I guess there’s always a silver lining.

  • Google Makes Navigation Changes To Webmaster Tools

    Google has launched a new navigation design for Webmaster Tools in an effort to make frequently used features easier to access.

    The design organizes features into groups that match “the stages of search,” as Google puts it. These are: Crawl, Google Index, Search Traffic and Search Appearance.

    Crawl shows you info about how Google discovers and crawls your content, including crawl stats, crawl errors, URLs that are blocked from crawling, sitemaps, URL parameters and the Fetch as Google feature.

    Google Index shows how many pages you have in Google’s index, and lets you monitor the overall indexed counts for your site and see what keywords Google has found on your pages. From here, you can also request to remove URLs from search results.

    Search Traffic lets you check how your pages are doing in search results, how people find your site, and links to your site. Here, you can also see a sample of pages from your site that have incoming links from other internal pages.

    Finally, Search Appearance includes the Structured Data dashboard, Data highlighter, Sitelinks and HTML improvements.

    Admin tasks (at the account level) are found under the gear icon in the corner. This includes things like Webmaster Tools Preferences, Site Settings, Change of Address, Google Analytics Property, Users & Site Owners, Verification Details and Associates.

    “This is the list of items as visible to site owners; ‘full’ or ‘restricted’ users will see a subset of these options,” says Google Webmaster Trends Analyst Mariya Moeva. “For example, if you’re a “restricted” user for a site, the “Users & Site Owners” menu item will not appear.”

    There’s also a new Search Appearance pop-up, which shows how your site may appear in search, and gives more info about the content or structure changes that could influence each element. This pop-up is accessible via the question mark icon.

  • Longtime WebmasterWorld Admin ‘Tedster’ (Warren “Ted” Ulle) Passes Away

    Warren “Ted” Ulle, a longtime forum admin at WebmasterWorld, has passed away, leaving the SEO community in mourning. WebmasterWorld founder Brett Tabke wrote in the forum last night:

    I am deeply saddened to tell you that long time administrator and member of WebmasterWorld, Tedster (Warren “Ted” Ulle) passed away Friday in his sleep with family near…

    Ted was a dear friend of this community and everyone involved. He always made time to talk to people regardless of their circumstances. He was one of the most friendly and approachable people in the entire industry. It was for that very reason that he was voted to be awarded the WebmasterWorld lifetime achievement award last year. While Ted had been sick for quite some time, he never showed it or bothered anyone with it. His strength while facing it was inspirational and trademark Tedster.

    Tabke says the post will be updated with more info on memorial funds/charities in Ted’s name.

    While I didn’t know Ted personally, it’s been virtually impossible to cover search and SEO without seeing his name and posts frequently. I’ve certainly referenced his words numerous times, particularly throughout the Google Panda update era.

    It’s clear that Ted was a very influential, well-liked and respected member of the SEO industry. All you have to do is peruse the WebmasterWorld thread and see what his peers have to say about him.

    image via WebmasterWorld

  • Google Tests Authorship-Like Results For Brands

    Google is reportedly testing authorship-like search results for brands, where it shows company logos and Google+ circle information for the results, similar to how it shows author photos and Google+ circle information with authorship.

    Authorship is based on markup that Google encourages people to use, but these brand results appear to be based simply on having a verified Google+ page. The image Google grabs is whatever image the Google+ page has set as its profile image (so it doesn’t necessarily have to be a logo).
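
    For context, authorship markup and page verification look roughly like this in a page’s head (the Google+ URLs are placeholders):

      <!-- Authorship: the article page points at the writer's Google+ profile -->
      <link rel="author" href="https://plus.google.com/100000000000000000000"/>

      <!-- Brand/publisher verification: the site points at its Google+ page -->
      <link rel="publisher" href="https://plus.google.com/+ExampleBrand"/>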

    [Image: brand results in Google search]

    Siege Media Founder Ross Hudgens spotted the results and blogged about them on Thursday. He writes:

    However, in both examples, Travelstart and Progressive, their Google Plus accounts are verified, which would lead some to believe that Google was circumventing that markup to reward pages they could identify as authoritative enough to give verification status.

    It’s worth noting that these results aren’t only showing up for branded keyword searches, but for generic ones (like “flight” and “car insurance”) as well. If this becomes more than a test, I’m sure Google will announce the feature properly, and encourage webmasters and businesses to get their Google+ pages verified.

    Identity and trust are more important than ever to Google these days, which is the whole reason authorship exists. Look for Google to be doing more with that in the future, as well.

    [via Search Engine Roundtable]

  • ‘Multi-Week’ Google Update Happening Now, Says Cutts

    Google is currently running a “multi-week” rollout of an update that will continue until the week after July 4th. Matt Cutts mentioned the update in response to some questions about “car insurance” spam on Twitter.

    The fact that this is in response to “car insurance” seems to indicate that this is part of the “payday loans” initiative Cutts talked about recently. We haven’t confirmed that, but the term seems to fit the bill. As Alex Graves at David Naylor’s Blog, who pointed out Cutts’ tweet earlier this morning, notes, “car insurance” is one of the big competitive niche areas online.

    Cutts said at SMX earlier this month that Google had started the update to help clean up spammy queries. He had warned about the update in a video in May, when he said:

    “We get a lot of great feedback from outside of Google, so, for example, there were some people complaining about searches like ‘payday loans’ on Google.co.uk. So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including for example, some more pornographic queries, and some of these changes might have a little bit more of an impact on those kinds of areas that are a little more contested by various spammers and that sort of thing.”

    So if Google had already started one update related to this back during SMX, and another one just started within recent days, that could be the “two different changes” Cutts mentioned.

  • Google Webmaster Tools API To Soon Let You Retrieve ‘Search Queries’ And ‘Backlinks’ Data

    Today’s Webmaster Help video from Google includes something of an announcement. Matt Cutts reveals that the company is working on some upcoming changes to the Webmaster Tools API.

    The video is a response to the user-submitted question:

    Can you tell us if Webmaster Tools will ever have an update to its API allowing us to retrieve the “Search Queries” and “Backlinks” data?

    “The answer is yes,” says Cutts. “That’s the short answer. The longer answer is, we’re working on it, and we hope to release it relatively soon. In the meantime, right now, there are PHP – there’s Python libraries that you can use to download the data, and so in the description (the meta information for this video) will include a few links where you can go and download toolkits that will allow you to get access to the data right now.”

    Here’s the link the description provides for downloading Search Queries data using Python. Here’s the one for PHP (not an official Google project).
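
    If you need the data before the API update lands, those toolkits boil down to authenticating and downloading CSV reports. Below is a rough Python sketch of that flow; the endpoint path, parameter names, and the requests dependency are assumptions based on the old ClientLogin pattern, so defer to the linked toolkits for the real details:

      import requests  # an assumption: any HTTP client would work

      LOGIN_URL = "https://www.google.com/accounts/ClientLogin"
      # Placeholder path: the real CSV URLs come from the linked toolkits.
      CSV_URL = "https://www.google.com/webmasters/tools/downloads/TOP_QUERIES.csv"

      def get_auth_token(email, password):
          """Authenticate with the (historical) ClientLogin endpoint."""
          resp = requests.post(LOGIN_URL, data={
              "Email": email,
              "Passwd": password,
              "accountType": "GOOGLE",
              "service": "sitemaps",      # the Webmaster Tools service name
              "source": "example-downloader",
          })
          resp.raise_for_status()
          # The response body is key=value lines; "Auth" carries the token.
          tokens = dict(line.split("=", 1)
                        for line in resp.text.splitlines() if "=" in line)
          return tokens["Auth"]

      def download_search_queries(email, password, site_url):
          token = get_auth_token(email, password)
          resp = requests.get(
              CSV_URL,
              params={"prop": site_url},
              headers={"Authorization": "GoogleLogin auth=" + token},
          )
          resp.raise_for_status()
          return resp.text  # raw CSV of Search Queries data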

    “And we’re gonna keep looking at how we can improve our API to get you more and more information over time,” Cutts says in the video’s conclusion.

    No time frame is given for when we might see an update to the API, but it sounds like it’s not too far off.

  • Should Google Penalize Content For Using Stock Images?

    Use stock images on your site? Soon, you may find that it is hurting your rankings in Google. Maybe.

    Do you think rankings should suffer when content utilizes stock images? Share your thoughts.

    Google’s Matt Cutts discussed stock photos as a ranking signal in a Webmaster Help video this week. Specifically, he responded to the following user-submitted question:

    Does using stock photos on your pages have a negative effect on rankings? Do original photos help you in this regard?

    “‘Does using stock photos on your pages have a negative effect on rankings?’ To the best of my knowledge, the answer is no,” says Cutts. “‘Do original photos help you?’ To the best of my knowledge, it doesn’t really make a difference whether it’s a stock photo versus an original photo.”

    But he doesn’t leave it at that.

    “But you know what?” he adds. “That’s a great suggestion for a future signal we could look at in terms of search quality. Who knows? Maybe original sites – original image sites might be higher quality, where sites that just repeat the same stock photos over and over again might not be nearly as high quality.”

    Interesting.

    “But to the best of my knowledge,” he reiterates, “we don’t use that directly in our algorithmic web ranking right now.”

    Well, even if Google is not using this as a signal currently, it’s hard to imagine why Cutts would make comments like these if he’s not serious about this actually being something Google could add in the future. They are, as you know, making changes to the algorithm every day. Here, he’s pretty much saying that original images are a signal of quality, so that’s worth paying attention to.

    Is this the case though? Should original images always be treated as a signal of quality? It raises some new questions that webmasters and SEOs haven’t necessarily needed to think about in the past.

    Will originality trump actual photo quality? Will an amateur photo in an amateur blog post get a boost over a “professional” post with a re-used image from Getty? Will stock photo providers lose business because people are afraid to use the images in their content? Surely webmasters would never overreact to a ranking change Google makes, right?

    Do you think stock images hurt the quality of a piece of content? Should Google include this as a ranking signal? Let us know what you think in the comments.

  • Google Kills ‘Links’ In Its Ranking Message To Webmasters

    Google has a help center article in Webmaster Tools specifically about “ranking”. It’s not incredibly informative, and certainly doesn’t walk you through Google’s over 200 signals. It’s just a few sentences of advice, including links to Google’s Webmaster Academy and the “How Google Search Works” page.

    Internet marketer Erik Baemlisberger spotted a change (via Search Engine Land) in what little wording there is, however, and it’s actually somewhat noteworthy.

    As you can see, the wording used to be: “In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages.”

    Now, it says, “In general, webmasters can improve the rank of their sites by creating high-quality sites that users will want to use and share.”

    Google has removed the word “link,” presumably to play down the importance of links in its algorithm. This doesn’t mean that links are less important. High quality links are likely still a major signal, but by de-emphasizing the word link (or removing it altogether), Google probably hopes to cut down on people engaging in link schemes and paid links – things Google has been cracking down on more than ever over the past year or so.

  • Google Adds Knowledge Graph-Like Carousel For Local Results

    Google announced today that it is adding a new carousel feature for local search results on the top of some search results pages. It looks essentially like the recently redesigned Knowledge Graph carousel, but includes local businesses and their Zagat scores. Google says in a Google+ post:

    Give it a go—type or say “mexican restaurants,” or try any similar search for restaurants, bars or hotels. Click on one of the places in the carousel to get more details on it, including its overall review-based score, address and photos. If you want to see more places, click the arrow at the right of the carousel. And you can zoom in on the map that appears below the carousel to restrict your search to only places in a specific area.

    While some iPad and Nexus tablet users have seen this new look since December, we’re excited to expand to desktop. The interactive “carousel” is rolling out in English in the U.S.—we’ll add more features and languages over time.

    The feature could be good for local businesses that don’t necessarily rank at the top of local results, as it features more on the page, and encourages users to scroll through them.

  • Wow, A Lot Of Stuff Just Happened In SEO

    It’s been a pretty big week for search and SEO news. There have been a lot of announcements, not only from Google, but from Google competitors. Let’s recap, and discuss in the comments.

    Which of the latest announcements do you believe will have the biggest impact on webmasters? On your SEO strategy? Let us know what you think.

    On Monday, Apple had its big Worldwide Developers Conference keynote, where it unveiled the latest versions of its Mac OS X and iOS operating systems. Within these unveilings were a few pieces of noteworthy search news. For one, it’s adding more search options to Safari, which is significant given that Apple has made moves in recent memory to distance itself further from Google. The big piece of news here, however, was the addition of Bing (Google’s biggest search competitor) to Siri as the web search provider. We discussed the implications of this in more depth here, but suffice it to say, this could lead to a lot more people accessing your content from Bing if you’re ranking there. In other words, you now have more of a reason to optimize for Bing.

    Also on Monday, Google released a video discussing mistakes webmasters are commonly making when using the Disavow Links tool.

    The most common mistake is that people are uploading the wrong kinds of files.
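
    For reference, the tool expects a plain text (.txt) file with one entry per line: a full URL to disavow a single link, a "domain:" prefix to disavow everything from a domain, and "#" for comments. A minimal sketch (the domains are made up):

      # Links we tried and failed to get removed (contacted site owners first)
      domain:spammy-directory.example.com
      http://link-farm.example.net/page-linking-to-my-site.html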

    Yelp, a frequent critic of Google’s (which generates its own share of criticism), is making moves to become a better local search tool. See its newly revamped “Nearby” mobile feature. Local businesses now have even more incentive to be found in Yelp. Speaking of Yelp, Greg Sterling at Screenwerk shares an anecdote in which a plumber claimed that 95% of his leads come from the service. This caught the attention of CEO Jeremy Stoppelman:

    Clearly some are finding Yelp well worth it, despite those decrying the service.

    Google made a major announcement in that it is readying ranking changes for mobile content. Basically, if you’re not providing smartphone users with the relevant content you’re providing them on the desktop, you’re going to be in trouble.

    “Some websites use separate URLs to serve desktop and smartphone users,” explain Google’s Yoshikiyo Kato and Pierre Far. “A faulty redirect is when a desktop page redirects smartphone users to an irrelevant page on the smartphone-optimized website. A typical example is when all pages on the desktop site redirect smartphone users to the homepage of the smartphone-optimized site.”

    “This kind of redirect disrupts a user’s workflow and may lead them to stop using the site and go elsewhere,” they add. “Even if the user doesn’t abandon the site, irrelevant redirects add more work for them to handle, which is particularly troublesome when they’re on slow mobile networks. These faulty redirects frustrate users whether they’re looking for a webpage, video, or something else, and our ranking changes will affect many types of searches.”

    More on all of this here.
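
    To make the faulty-versus-correct distinction concrete, here is a minimal, hypothetical sketch (Flask is used purely for illustration, and the URL scheme is made up) of a redirect that preserves the page the smartphone user actually asked for:

      from flask import Flask, redirect, request

      app = Flask(__name__)

      MOBILE_TOKENS = ("iphone", "android")  # crude UA check, for illustration only

      @app.route("/products/<slug>")
      def product_page(slug):
          user_agent = request.headers.get("User-Agent", "").lower()
          if any(token in user_agent for token in MOBILE_TOKENS):
              # Faulty: redirect("http://m.example.com/") would dump every
              # smartphone user on the homepage, losing the page they wanted.
              # Correct: send them to the *equivalent* smartphone page.
              return redirect("http://m.example.com/products/%s" % slug, code=302)
          return "Desktop product page for %s" % slug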

    In addition to that, Google’s Matt Cutts hinted at SMX Advanced that mobile site speed could soon become a ranking factor. Google made site speed a signal several years ago, and it looks like they’ll be taking that a step further with mobile in mind.

    Cutts revealed quite a few things at SMX Advanced, actually. Here’s the whole discussion he had with interviewer Danny Sullivan:

    One thing he mentioned at the conference was that Google started rolling out a new ranking update to clean up more spammy queries. It’s been unofficially referred to as the “payday loans” update. Google had previously warned about forthcoming efforts in this area, and these efforts are now taking effect.

    In other algorithm update news, Cutts also indicated that Google hasn’t rolled out a Panda data refresh for a month and a half. Panda is apparently being run about once a month, and rolling out slowly over the course of roughly ten days.

    He mentioned a new structured data tool Google is beta testing, which allows webmasters to report structured data errors. Giving webmasters as much control over structured data as possible is going to be increasingly important, as Google turns to this kind of data more and more for its search results. Optimizing structured data could be considered a vital part of your SEO strategy these days, for better or worse. At least Google is providing more and more tools in this area.
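
    If you haven’t worked with it, structured data here generally means schema.org-style markup layered onto your existing HTML. A generic microdata sketch (the product and values are made up, and this isn’t tied to the specific tool Cutts mentioned):

      <div itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Example Widget</span>
        <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
          $<span itemprop="price">19.99</span>
        </span>
      </div>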

    Finally, Cutts announced that Google is now including example links in its messages to webmasters regarding manual penalties. Those who have to deal with these penalties find the addition very welcome. Cutts put out a video discussing this:

    Facebook, as I’m sure you’ve heard, has launched hashtags, which pretty much turn the giant social network into a real-time search engine, for all intents and purposes. That has some pretty big marketing implications. The hashtags, by the way, can be searched via Facebook’s Graph Search. On a separate note, Facebook is killing its sponsored search results.

    So those are some of the biggest stories in a very busy week for search. The mere fact that all of this stuff just happened over the past week really illustrates how rapidly the search game is evolving, and this doesn’t even take into account that Google makes changes to its algorithm every day.

    Out of all that was announced this week, which item are you most concerned about? Which are you most excited about? Let us know in the comments.