WebProNews

Tag: Search

  • Google Penguin Update: More On That Recovery Story From WPMU

    One site that got hit by Google’s Penguin update has managed to make a comeback, thanks to Google’s most recent data refresh of the algorithm update. WPMU’s story has been in the spotlight as an example of where a legitimate non-spammy site was hit by an algorithm update designed to attack webspam. We spoke with James Farmer who runs the site (which distributes WordPress themes) about the situation before the recovery, and now he’s shared some more thoughts about the whole thing with WebProNews.

    “I’m going to stick my neck out here and say I think we’re going to see a recovery *far better* than the traffic we experienced before,” Farmer tells us. “Which makes me wonder if we were being ‘kind’ [of] penalized before, but that the new penguin update has actually used something (like our social signals, for example) to really stamp that out.”

    “Needless to say I’m super happy,” he adds.

    Earlier this week, Ross Hudgens from Full Beaker, who provided some assistance and advice to Farmer, blogged about the recovery at SEOmoz. In that post, Hudgens mentions that Farmer did some other things to improve quality, beyond the Penguin-specific advice he was given.

    “Well, because we didn’t want to appear at all self-promotional we didn’t mention that one of the things we did was to actually eat our own dog food and implement the Infinite SEO plugin we make at DEV (instead of all the other bits and bobs we were using). This gave us a nice clean and unbroken sitemap.”

    “Plus, and this one may be significant, given that we’re a WP site, we were getting a bunch of warnings in Webmaster Tools (and via our SEOMoz report) about dodgy URL parameters (? queries etc.) and so we implemented, finally, canonical URLs from all of the various comment permalinks etc. etc. to the actual posts themselves.”
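    For readers who want to replicate the fix Farmer describes: a canonical link element in a page’s head tells Google which URL is the authoritative version, so parameterized variants and comment permalinks consolidate to the main post. A minimal illustration (the URLs here are placeholders, not WPMU’s actual paths):

```html
<!-- Served on any variant of a post URL, e.g. /my-post/?replytocom=42 -->
<head>
  <!-- Point search engines at the single canonical version of the page -->
  <link rel="canonical" href="https://example.com/my-post/" />
</head>
```

    In WordPress, an SEO plugin (such as the Infinite SEO plugin Farmer mentions) typically emits this tag automatically.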

    In the SEOmoz piece, Hudgens wrote:

    The most perilous piece of WPMU’s link profile came from one site – EDUblogs.org. EDU Blogs is a blogging service for people in the education space, allowing them to easily set up a subdomain blog on EDUblogs for their school-focused site – in a similar fashion to Blogspot, Typepad, or Tumblr, meaning that each subdomain is treated as a unique site in Google’s eyes. Coincidentally, this site is owned by WPMU and Farmer, and every blog on the service leverages WPMU theme packs. Each of these blogs had the “WordPress MU” anchor text in the footer, which meant a high volume of subdomains considered unique by Google all had sitewide “WordPress MU” anchor text. In what might have been a lucky moment for WPMU, this portion of their external link profile was still completely in their control because of WPMU ownership.

    In what I believe is the most critical reason why WPMU made a large recovery and also did it faster than almost everyone else, Farmer instantly shut off almost 15,000 ‘iffy’ sitewide, footer LRDs to their profile, dramatically improving their anchor text ratios, sitewide link volume, and more.

    We asked Farmer whether, in his opinion, the EDUblogs.org situation was the biggest factor in the recovery, and whether he thought the site would still have recovered if that had been the only thing addressed.

    “A commentator on SEOMoz pointed out that WPMU DEV – our main business – is still linked to on Edublogs! And widely too!” he says. “However, as I responded to them, WPMU DEV has actually seen significant growth (12%) in Google traffic in the last month, the second biggest monthly unseasonal bump there in the last couple of years.”

    “So I think that the removal of edublogs links may have been a factor, in fact I reckon it was a factor, but I think that Google identifying the strength of the site through our social presence and tweaking Penguin appropriately was the real deal,” he adds.

    “As we’ve been 100% above board with everything we’ve done and have to offer, I really hope we were a poster child for how Google could improve its results,” Farmer says.

    When WPMU first came into the spotlight, Google’s Matt Cutts had pointed out some specific problem link examples, which came in the footers of some WordPress themes that appeared on spam blogs. Part of Farmer’s post-Penguin strategy has been to request the removal of these links.

    “I don’t think anyone gets good referrals from footer links, apart from possibly great designers who insist on credits,” he says. “To me it was always just a ‘part of the same company’ thing. It just made sense, [but] clearly not!”

    “I think that if we do add them back it’ll be 100% branded and on relevant pages only (ie. the homepage, about page for the company). I think that given the relevance we’ll dofollow them,” he says.

    While he’s certainly a little biased in this department, Farmer says he thinks Google’s results in general have gotten better with the most recent Penguin refresh, saying “it’s like manna from heaven. Thank you for listening Google!”

    In a new post, Farmer says he’s getting record referrals since the latest Penguin update.

    Image: Batman Returns (Warner Bros.)

  • Google Shopping (“Paid Inclusion” Results) To Replace Product Search

    Google is changing Google Product Search to Google Shopping, and building it on product listing ads.

    “When searching for great local restaurants, people want places to eat right there on the results page, not another click or two away. It’s the same with hotels, flight options, directions and shopping,” says Sameer Samat, Vice President of Product Management, Google Shopping. “Organizing these types of data can be very different from indexing the Web, because the information is often not publicly available. It requires deep partnerships with different industries—from financial services and travel to merchants who sell physical goods.”

    This sounds very much like those paid inclusion results Danny Sullivan reported on this week, which we talked about here. Sullivan talked about Google “sponsored” results, which are being found in searches for hotels, flights and financial services – all of which are mentioned in Google’s announcement today. He quoted Google’s Amit Singhal as saying:

    “Fundamentally, time and time again, we started noticing that a class of queries could not be answered based upon just crawled data…We realized that we will have to either license data or go out and establish relationships with data providers…To be super safe, where we have a deal between Google and another party, we didn’t want to call those fully organic results, because they are based on a deal…After much debate, we said “OK, let’s be extra cautious. Let’s call it ‘sponsored’ so that we tell our users that there’s a special relationship that Google has established with someone.”

    In today’s announcement, Samat says, “We believe that having a commercial relationship with merchants will encourage them to keep their product information fresh and up to date. Higher quality data—whether it’s accurate prices, the latest offers or product availability—should mean better shopping results for users, which in turn should create higher quality traffic for merchants.”

    Google talks about how to create a new product listing ad here:

    The transition from Google Product Search to Google Shopping will be complete in the fall, Google says. Merchants who create Product Listing Ads by August 15 will receive a 10% monthly credit on their total Product Listing Ad spend through the end of the year. Current Product Search merchants can also get a $100 AdWords credit toward the ads if they fill out a form before that date.

    Google says ranking in Google Shopping will be based on “a combination of relevance and bid price,” the same as Product Listing Ads today, and those who want to stand out can participate in Google’s Trusted Stores program. The program saw a limited launch last fall:

    Merchants will also be able to stand out using special offers, Google says.

    In Google.com results, the Shopping results will appear as “sponsored,” as discussed by Singhal. Google shows the following example for “telescopes”:

    Telescopes on Google

    It sure seems like there are a lot of ads “above the fold”. I thought Google didn’t care for that much.

    “These new formats are clearly labeled ‘sponsored,’ and take space currently occupied by AdWords,” says Samat.

    But it looks like there will be plenty of AdWords ads on the page too.

    Google is, however, also putting the “sponsored” results in the area where other queries will return Knowledge Graph results:

    Telescope sponsored result

    These types of search results have been described by Google recently as a “third kind of thing” between organic results and ads. Sullivan has made a point of referring to them as “paid inclusion” results, and of pointing out that this kind of thing was considered “evil” by Google back in the IPO days.

    Things change (and Google doesn’t call it paid inclusion).

  • Google Alerts Searchers In China When Queries May Cause Problems

    Google announced today that it will now notify users in mainland China when they enter a keyword that may cause connection issues. Such issues are occurring, according to Google, when users search for a particular subset of queries.

    Google shared a video demonstrating what happens:

    Google says it has not found any problems with its own systems.

    “By prompting people to revise their queries, we hope to reduce these disruptions and improve our user experience from mainland China. Of course, if users want to press ahead with their original queries they can carry on,” explains Alan Eustace, Senior Vice President, Knowledge at Google. “In order to figure out which keywords are causing problems, a team of engineers in the U.S. reviewed the 350,000 most popular search queries in China. In their research, they looked at multiple signals to identify the disruptive queries, and from there they identified specific terms at the root of the issue.”

    “We’ve observed that many of the terms triggering error messages are simple everyday Chinese characters, which can have different meanings in different contexts,” Eustace continues. “For example a search for the single character [江] (Jiāng, a common surname that also means “river”) causes a problem on its own, but 江 is also part of other common searches like [丽江] (Lijiang, the name of a city in Yunnan Province), [锦江之星] (the Jinjiang Star hotel chain), and [江苏移动] (Jiangsu Mobile, a mobile phone service). Likewise, searching for [周] (Zhōu, another common surname that also means “week”) triggers an error message, so including this character in other searches—like [周杰伦] (Jay Chou, the Taiwanese pop star), [周星驰] (Stephen Chow, a popular comedian from Hong Kong), or any publication that includes the word “week”—would also be problematic.”

    Google will show a message to the user when they search for things it thinks might trigger the issues:

    Google Hong Kong message

    Google will still let the searcher proceed with the original query, or edit the search terms and try a different approach.

    Google.cn still redirects to Google Hong Kong, since Google pulled its search engine out of China a couple years ago.

  • Google Knowledge Graph Increasing Number Of Searches, Company Says

    Earlier this month, Google announced Knowledge Graph, which it considers its way of providing results about “things” rather than “strings” or keywords.

    Google has been speaking very highly of the feature, while also making sure people know it’s still a work in progress. Either way, it’s already increasing the number of searches people perform on Google, according to the company. The Wall Street Journal ran an article after interviewing Google’s Amit Singhal. Here’s a snippet:

    “Early indications are that people are interacting with it more, learning about more things…and doing more [search] queries,” said Amit Singhal, a top Google search executive, in an interview Friday. “It’s stoking people’s curiosity.”

    On Tuesday Google spokesman Jason Freidenfelds said the company’s internal data continues to show people are “doing more searches as a result” of the revamp, though he and Singhal declined to share specific figures.

    Google ran a doodle on its homepage this week celebrating the birthday of Fabergé Egg creator Peter Carl Fabergé. Despite an extensive and well-sourced Wikipedia entry on the man, Google did not display “knowledge panels” for him in its search results. This shows that Google has a lot of expansion to do on the Knowledge Graph, and it’s interesting that they would highlight someone that they don’t have knowledge panels for with a clickable doodle, which takes you straight to Google’s search results for him.

    If the Knowledge Graph continues to contribute to increased Google searches, that could do wonders for Google’s search market share, and ultimately mean a lot more AdWords impressions. It will be interesting to see how the market share trends as the Knowledge Graph expands, not only to cover more topics, but to include more data sources.

    More Knowledge Graph coverage here.

  • Google+ Local Reveals Google’s Plan For Zagat

    Google has announced that it’s rolling out Google+ Local, which it bills as “a simple way to discover and share local information”.

    Google is finally revealing just what it’s going to do with its Zagat acquisition, as Google+ Local features Zagat scores, as well as recommendations from Google+ connections. “Since Zagat joined the Google family last fall, our teams have been working together to improve the way you find great local information,” says Director of Product Management, Avni Shah. “Zagat has offered high-quality reviews, based on user-written submissions and surveys, of tens of thousands of places for more than three decades. All of Zagat’s accurate scores and summaries are now highlighted on Google+ local pages.”

    Google+ Local is also integrated into search, Google Maps and mobile (Android now, with iOS on the way). Inside Google+, it has its own tab. From that tab, you can search for specific places or simply browse. When you click on a place, you’ll go to a local Google+ page with photos, Zagat scores, summaries, reviews from friends and other info, like hours of operation and address. Google will show you the same info if you’re using Search or Maps, it says.

    Here’s what it looks like on Google+:

    Google Plus Local

    Maps:

    Google Plus Local on Maps

    Google+ Local on mobile:

    Google Plus Local on Mobile

    “Each place you see in Google+ Local will now be scored using Zagat’s 30-point scale, which tells you all about the various aspects of a place so you can make the best decisions,” says Shah. “For example, a restaurant that has great food but not great decor might be 4 stars, but with Zagat you’d see a 26 in Food and an 8 in Decor, and know that it might not be the best place for date night.”

    Users can share their opinions and photos of places, of course, which will help feed the personalization of your friends’ results.

    Here are a few videos about Google+ Local:

    For Businesses

    Business owners will be able to continue managing their local listings the same way, using Google Places for Business. Businesses can still verify their basic listing data, make updates, and respond to reviews. If you use AdWords Express, Google says your ads will operate as normal, and will automatically redirect customers to the destination you selected, or your current listing.

    Businesses will have to get used to a new layout and design for their listings.

    Google Plus Local Business Listing

    “All your basic business information is still available,” notes Jen Fitzpatrick, VP Engineering. “And by streamlining the layout and putting more focus on photos and reviews, we hope to help you highlight what makes your business truly unique.”

    “With these updates, we’re connecting the millions of people on Google+ to local businesses around the world,” adds Fitzpatrick. “With one listing, your business can now be found across Google search, maps, mobile and Google+, and your customers can easily recommend your business to their friends, or tell the world about it with a review.”

    Google posted the following to its Google+ Your Business Page:

    Google+ Your Business · Posted by +Vanessa Schneider

    Manage a Google Places page? That listing just got a simpler, cleaner look with the introduction of Google+ Local, a new way to discover businesses across Google.

    With Google+ Local, customers can now easily recommend your business to their friends, or tell the world about it with a review. We’ve added +Zagat reviews and updated our scoring system to their 30-point scale, allowing customers to better share what makes your business unique. Google+ Local is integrated into Search, Maps, mobile and as a new tab in Google+ (just look for “Local” along the left to get started). We are rolling out today, so if you don’t see it now, you’ll see it soon.

    Read more about what this means for you as a business owner over on the Google and Your Business Blog (http://goo.gl/bVi9Q), or check out our quick guide to what’s new (http://goo.gl/p1dni).

    Have feedback to share? Use the “Send feedback” link under the Google+ gear icon

    Google says this is just a first step, that we’ll see more updates in the upcoming months, and that it will soon make it easier to manage listings on Google and take full advantage of the social features of Google+ pages, such as Hangouts, and sharing photos/videos/posts.

    Google has already given a handful of businesses access to such functionality. Check out The Meatball Shop for an example.

  • iAcquire Gets Rid Of Paid Link Offerings Following Google De-Indexing

    Blogger Josh Davis recently put out an investigative report exposing the marketing firm iAcquire for engaging in paid links for clients. Once Google caught wind of it, iAcquire was de-indexed from Google’s search results.

    Now, the company has openly admitted to “financial compensation,” though it says it has been transparent about this with its clients. On Tuesday, iAcquire put out a blog post talking about the ordeal. Here’s a snippet:

    There are many methods to develop link relationships. Based on the client strategy we deploy a variety of approaches to link development, and in some cases we’ve allowed financial compensation as a tool. Removing financial compensation from the link development toolset has been a long term goal for us. We are using these recent events to be a catalyst to expedite those plans effective immediately.

    We do not mislead customers nor operate in any manner contrary to their wishes or directives. Every strategy we develop is done in conjunction with knowledgeable online marketing specialists from iAcquire and our clients. Our process is transparent- every aspect of a campaign is available to our customers. In the past, we have responded to the frequent needs for urgency and speed from our clients. We are going to take this opportunity to discuss with our clients the best approaches to ensure a long term strategy and horizon for their program.

    The company has been engaging in a lot of related conversation on Twitter:

    @jonahstein We’re not that concerned with being deindexed–we weren’t driving much traffic through search anyway. But thanks for the…

    @righthatseo not exactly getting rocked, just nudging us in the right direction even quicker http://t.co/6RLQedRg

    @jonahstein but of course we are working to comply with google in order to return to the search results

    @craigaddyman Can’t speak for everyone else but we haven’t stopped working hard to be the best SEOs we can right now

    We may soon see other companies being exposed in similar fashion, as Davis recently told WebProNews, “I have come across some other smaller companies which seem to be doing it (maybe one other large one, but I am still researching that).”

    Google penalties from paid links, as we’ve seen in the past, can have big effects on big companies. Overstock.com even blamed Google’s penalty for an “ugly year”.

    Update: Davis now tells us, “I am not currently researching other companies large or small that may be buying undisclosed links. While my research for the initial piece did unearth what appeared to be other paid links, that was just a byproduct of my initial work. I have not further pursued examining any more links. It took so much time to do the ‘Search Secrets’ piece in thorough manner, I don’t intend to duplicate that amount of work again.”

  • Google Penguin Update Refresh & Recovery Provide Hope For Webmasters

    As previously reported, Google announced its first Penguin update since the original over Memorial Day weekend. Google’s head of webspam, Matt Cutts, tweeted about it, saying, “Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches.” Have you seen search referrals drop or rise since this update? Let us know in the comments.

    The good news, whether you were hit by Penguin the first time or this time, is that you can recover. We’ve now seen that this can happen, and since we know that Google will continue to push data refreshes for Penguin, there should be plenty of chances to do so. Just think about all the Panda refreshes we’ve seen since February 2011.

    We recently reported on WPMU, a seemingly quality site with plenty of fans on social media channels, which got hit by the first Penguin update. The site has now made a full recovery.

    Here’s what the analytics looked like after Penguin:

    WPMU analytics

    Here’s what the analytics look like now:

    WPMU Analytics

    It’s worth noting that Cutts was aware of this site, as James Farmer (the site’s owner) was able to bring it to his attention following the initial Penguin update, via an interview with the Sydney Morning Herald. Cutts had provided some examples of the kinds of links that were likely hurting it. This was all discussed in our previous article, but to summarize: WPMU distributes WordPress themes, and a lot of blogs, including spam blogs, were using some of them, which included links back to WPMU in the footer.

    Ross Hudgens from Full Beaker provided some assistance and advice for Farmer, and blogged about the experience at SEOmoz. He notes that Farmer opted to ask blogs to remove the links, rather than applying nofollow to them, but it was actually an internal change that Farmer was able to make, which ultimately might have had the greatest impact on the recovery. Hudgens writes:

    The most perilous piece of WPMU’s link profile came from one site – EDUblogs.org. EDU Blogs is a blogging service for people in the education space, allowing them to easily set up a subdomain blog on EDUblogs for their school-focused site – in a similar fashion to Blogspot, Typepad, or Tumblr, meaning that each subdomain is treated as a unique site in Google’s eyes. Coincidentally, this site is owned by WPMU and Farmer, and every blog on the service leverages WPMU theme packs. Each of these blogs had the “WordPress MU” anchor text in the footer, which meant a high volume of subdomains considered unique by Google all had sitewide “WordPress MU” anchor text. In what might have been a lucky moment for WPMU, this portion of their external link profile was still completely in their control because of WPMU ownership.

    In what I believe is the most critical reason why WPMU made a large recovery and also did it faster than almost everyone else, Farmer instantly shut off almost 15,000 ‘iffy’ sitewide, footer LRDs to their profile, dramatically improving their anchor text ratios, sitewide link volume, and more. They were also able to do this early on in the month, quickly after the original update rolled out. A big difference between many people trying to “clean up their profile” and WPMU is time – getting everything down and adjusted properly meant that many people simply did not see recoveries at refresh 1.1 – but that doesn’t mean it won’t happen at all if the effort persists.

    Farmer was also able to get one of the blogs Cutts had initially pointed out to remove the links. According to Hudgens, he also did some other things which may have played a role in the recovery, such as implementing canonical URLs to clean up crawl errors and eliminate unnecessary links, fixing some broken sitemaps and submitting them to Webmaster Tools, and fixing some duplicate title tag issues (which Webmaster Tools reported). He also submitted the site to the form Google provides for those who think they’ve wrongfully been impacted by Penguin. Twice.
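    To illustrate the duplicate-title cleanup Hudgens describes, here’s a rough sketch (an assumed approach, not WPMU’s actual tooling) of how one might flag pages that share the same title tag, similar to what Webmaster Tools reports. The sample pages and URLs are stand-ins:

```python
import re
from collections import defaultdict

def extract_title(html):
    """Pull the contents of the first <title> tag, if any."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

def find_duplicate_titles(pages):
    """Map each title to the URLs using it; return only titles used more than once."""
    titles = defaultdict(list)
    for url, html in pages.items():
        title = extract_title(html)
        if title:
            titles[title].append(url)
    return {t: urls for t, urls in titles.items() if len(urls) > 1}

# Stand-in sample pages (hypothetical URLs, not WPMU's)
pages = {
    "/post-1/": "<html><head><title>WordPress Themes</title></head></html>",
    "/post-2/": "<html><head><title>WordPress Themes</title></head></html>",
    "/about/":  "<html><head><title>About Us</title></head></html>",
}
print(find_duplicate_titles(pages))
```

    Any title mapped to more than one URL is a candidate for a rewrite or a canonical tag.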

    It’s also possible that the exposure this site has received in the media, and in front of Matt Cutts, could have helped. We’ve certainly seen penalties come from such exposure.

    Not everyone will be able to get such exposure to make their case to Google, but Google does look at the submissions to that form. If you’ve determined that you’re in compliance with Google’s quality guidelines and still believe you were wrongly hit by Penguin, that’s a good place to start your recovery efforts, though you’ll probably want to keep digging as much as you can.

    Look at all of Google’s quality guidelines. Are there any areas where Google may think you’re in violation? Make the proper changes. Cutts recently pointed to the following videos as recovery advice:

    He also said the following tips from Marc Ensign “looked solid”:

    • Create a blog and consistently build up your site into a wealth of valuable content.
    • Work with a PR firm or read a book and start writing legitimate press releases on a regular basis and post them on your site.
    • Visit blogs within your industry and leave valuable feedback in their comments section.
    • Link out to other valuable resources within your industry that would benefit your visitors.
    • Share everything you are creating on 2 or 3 of your favorite social media sites of choice.
    • Position yourself as an expert.

    Virginia Nussey at Bruce Clay put together an interesting step-by-step guide to “link pruning” which might help you clean up your link profile and ease your way to a recovery. She recommends setting up a spreadsheet with the following headers: Target URL, Source URL, Source Rank, Source Crawl Date, Anchor Text, Image Link, ALT Text, Nofollow, Redirect and Frame. Then, she recommends adding the following columns for webmaster contact info: Owner Name, IP Address, Owner Address, Owner Email, Owner Phone Number, Registrar Name, Technical Contact, Name Servers, Net Name, Created, Updated, Expires, and Data Source (which site or registry the contact info was gathered from).
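    Nussey’s spreadsheet can be scaffolded in a few lines of Python; this sketch simply writes the column headers she lists out to a CSV file (the filename is arbitrary):

```python
import csv

# Column headers from Nussey's link-pruning guide: link data first,
# then the webmaster contact fields used for sending removal requests.
LINK_COLUMNS = [
    "Target URL", "Source URL", "Source Rank", "Source Crawl Date",
    "Anchor Text", "Image Link", "ALT Text", "Nofollow", "Redirect", "Frame",
]
CONTACT_COLUMNS = [
    "Owner Name", "IP Address", "Owner Address", "Owner Email",
    "Owner Phone Number", "Registrar Name", "Technical Contact",
    "Name Servers", "Net Name", "Created", "Updated", "Expires", "Data Source",
]

def create_pruning_sheet(path):
    """Write an empty link-pruning worksheet containing only the header row."""
    with open(path, "w", newline="") as f:
        csv.writer(f).writerow(LINK_COLUMNS + CONTACT_COLUMNS)

create_pruning_sheet("link_pruning.csv")
```

    From there, each row tracks one backlink plus the contact details needed to send its removal request.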

    From there, it’s just about sending removal requests and seeing what happens. Hopefully lawsuits aren’t part of your strategy.

    We’ll have more discussion with Farmer to share soon, and perhaps he’ll be able to shed a bit more light on his own Penguin recovery. In the meantime, if you’ve been hit, perhaps you can view his story as one of hope and inspiration before you go starting over with a new site (which Google has also suggested as a possible option).

    Penguin will be back again. You can recover. Remember, there are always other non-Penguin signals that you can try to improve upon too. You certainly don’t want to forget about our old pal the Panda.

    Google called Penguin a success even before the latest refresh. What are your thoughts about it now that we’ve seen an update to the update? Let us know in the comments.

  • Peter Carl Fabergé Gets Google Doodle

    Fabergé egg creator Peter Carl Fabergé is being honored today with a Google doodle. The man was born on this day in 1846.

    He was a jeweler from Russia whose company was commissioned by Tsar Alexander III to create a jewel-encrusted Easter egg for his wife, the Empress Maria. From then on, he made more eggs over the years for the Tsar and his successor, Nicholas II. That is, according to a well-cited Wikipedia entry.

    Sources for that Wikipedia entry include:

    • Twice Seven: The Autobiography of H C Bainbridge,
    • Fabergé: Goldsmith and Jeweller to the Imperial Court – His Life and Work
    • The History of the House of Fabergé according to the recollections of the senior master craftsman of the firm Franz P. Birbaum
    • The Fabergé Imperial Easter Eggs
    • Peter Carl Fabergé – Goldsmith and Jeweller to the Russian Imperial Court – His Life and Work
    • The Art of Carl Fabergé
    • Geza von Habsburg’s Fabergé
    • Masterpieces from the House of Fabergé
    • Faberge’s Eggs: The Extraordinary Story of the Masterpieces That Outlived an Empire
    • Faberge and the Russian Master Goldsmiths
    • Carl Fabergé: Goldsmith to the Imperial Court of Russia

    Interestingly enough, Google has no “knowledge panel” for Fabergé, despite this Wikipedia entry, which has information aplenty. Google does still deem the Wikipedia entry the most relevant organic result for “Peter Carl Fabergé”.

    Here’s what Google deems to be the most relevant video for a YouTube search for “Fabergé egg”:

    In case the doodle has gotten you in the mood for some egg browsing, a Google image search for “Fabergé egg” makes for a very colorful experience:

    Peter Carl Faberge's eggs on display in Google Image Search

    It’s unlikely that today’s doodle will be as popular as some of Google’s other, more interactive doodles, such as last week’s Bob Moog doodle. Still, given Google’s penchant for easter eggs (as in fun, hidden items), and the nature of this particular doodle, I would not be surprised if some hidden gems were uncovered throughout the course of the day.

  • Google Panda Update Advice Appears In Webmaster Academy

    As you may recall, last year, Google put out a list of 23 questions that one should consider when assessing the quality of their content. This was largely considered to be the types of things Google is considering when it comes to the Panda update, which is supposed to be about surfacing quality content in search results.

    Last week, Google introduced Webmaster Academy, a new guide for helping you perform better in Google results. Earlier, we looked at Google’s advice on influencing your site’s listing in search.

    There’s another section specifically about content quality. The section is called “Create Great Content”.

    “One key element of creating a successful site is not to worry about Google’s ranking algorithms or signals, but to concentrate on delivering the best possible experience for your user by creating content that other sites will link to naturally—just because it’s great,” the guide says. It then provides a couple of lists. The first list is for what to think about when you’re writing a post or an article:

    • Would you trust the information in this article?
    • Is the article useful and informative, with content beyond the merely obvious? Does it provide original information, reporting, research, or analysis?
    • Does it provide more substantial value than other pages in search results?
    • Would you expect to see this in a printed magazine, encyclopedia or book?
    • Is your site a recognized authority on the subject?

    The second list is for problems to keep an eye out for:

    • Does this article have spelling, stylistic, or factual errors?
    • Does the site generate content by attempting to guess what might rank well in search engines?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites?
    • Does this article have an excessive number of ads that interfere with the main content?
    • Are the articles short or lacking in helpful specifics?

    This is all stuff from Google’s post-Panda list, but it’s not everything from that list. Some of the other entries in the initial list kind of go hand in hand with the stuff on these new lists, but some of the things not mentioned include:

    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • Does this article provide a complete or comprehensive description of the topic?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    I do find it interesting that in the new lists, Google talks about the site being a recognized authority, but not so much from the author perspective. I have no doubt that Google considers this greatly, but it’s a little odd that it wasn’t included here. Google, of course, has been pushing authorship in search results, even using it to promote Google+ profiles. It does provide the author with greater visibility in search results by default (with visual, clickable images).

    It’s also interesting that the “does this article describe both sides of a story” entry didn’t make an appearance. Perhaps Google’s increased personalization has made this less of a factor. If you follow a lot of conservative (or liberal) Google+ profiles, for example, it’s possible that you might see more content they’ve shared in your search results, which may or may not show both sides of a story.

  • Is Google Admitting That Negative SEO Is Possible?

    Google has a page in its Webmaster Tools help center addressing the question: Can competitors harm ranking? This has been a topic getting a lot of discussion since the Penguin update. It was getting a pretty good amount before then too, but it seems to have ramped up significantly in recent months.

    Some webmasters have noticed that Google has updated the wording it uses to address this question on the help center page. There are various forum discussions going on, as Barry Schwartz at Search Engine Roundtable has pointed out.

    Back in 2006, Schwartz posted what Google said on the page at the time, which was:

    There’s almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.

    Our search results change regularly as we update our index. While we can’t guarantee that any page will consistently appear in our index or appear with a particular rank, we do offer guidelines for maintaining a “crawler-friendly” site. Following these recommendations may increase the likelihood that your site will show up consistently in the Google search results.

    These days, it just says:

    Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.

    Apparently the wording itself was changed in March, though the page says there was an update made on 05/22. Either way, Google recently changed it, and instead of saying “there’s almost nothing a competitor can do to harm your ranking…” Google now says, “Google works hard to prevent other webmasters from being able to harm your ranking…”

    That’s not incredibly reassuring.

    Google has also added the following video from Matt Cutts to the page:

    This video isn’t exactly an answer to the question though. It’s more about telling on competitors’ black hat tactics, rather than your competitors directly hurting your ranking. Essentially, it equates to: file a spam report.

    Rand Fishkin at SEOmoz has been testing the negative SEO waters, challenging others to hurt his sites’ rankings. He told us a couple weeks ago that despite thousands of questionable links, his sites were still ranking well.

  • Are Google’s Results Better Today Than They Were 5 Years Ago?

    According to Google CEO Larry Page, you’d be astounded by how bad Google search was 5 years ago.

    Do you think Google is significantly better than it was five years ago? How about two years ago? One year ago? Let us know what you think in the comments.

    Google has done a whole lot in the past five years. In the past year and a half or so, they launched two major algorithmic changes in Panda and Penguin, designed to surface higher quality content and reduce the clutter of webspam. There have been a lot of complaints about both updates, but Google seems to think they have been successful.

    Page spoke at Zeitgeist 2012 this week, talking about a number of things, and wearing the famous Google glasses (or glass, if you prefer).

    “I think that’s a really big area of focus for us,” Page said, regarding search. That’s good to know. Google is still focused on search (in case you’ve been distracted by fancy future glasses, cars that drive themselves, and that sort of thing).

    Page spoke about the ways Google is getting better at search (though I’m not sure everyone completely agrees on that, based on many of the comments we see on a daily basis).

    “It’s an area where, you know, I think if you used Google from five years ago, you’d be astounded by how bad it is. Or how bad it was,” Page said. He then talked about things like Google’s Search Plus Your World personalized results and the recently launched knowledge graph.

    Search Plus Your World would be referring to Google’s big personalized search push, launched earlier this year. It draws heavily on the user’s Google+ connections, as well as various other social connections (though missing valuable personal data from networks like Facebook and Twitter).

    Knowledge Graph is what Google launched last week, designed to help users find the things they’re actually looking for without having to click over to other sites (and to distinguish between queries with more than one meaning – such as Tesla the scientist vs. Tesla the car company vs. Tesla the rock band).

    While we’ve seen plenty of complaints about Search Plus Your World, I can’t honestly say I’ve seen many about Knowledge Graph.

    “Search has gotten a lot better,” said Page. “You don’t always see it, because we change it every day, and we try not to distract you too much with changes, but I think one of the things I’m most proud of that we did recently is that I have a friend at Google named Ben Smith, and that’s a very common name in the U.S. You know, Smith’s the most common last name. And it was very difficult to find him before. But now actually, with Google+ and with our understanding of all that, when I search for ‘Ben Smith,’ I actually get the Ben Smith that I know, and he actually appears in the search box. There’s a little picture of him, and if that’s not the Ben Smith I want, I can, you know, delete him, and put a different one in. But I’m actually searching for that person, rather than the string – the combination of letters, and that’s a really big deal for Google.”

    He says they’re calling the Knowledge Graph boxes “knowledge panels.”

    “What we’re really trying to do is get to the point where we can represent knowledge, and we can do much more complicated types of queries,” said Page. “What are the 20 deepest lakes? What are the highest market cap companies? Whatever. Things like that. Things where we really understand what that query means, rather than just give you the exact text that matches best on some webpage somewhere, and so we’re really looking at synthesizing knowledge, and I’m incredibly excited about that.”

    Synthesizing. Perhaps the Moog doodle on Google’s homepage this week was more symbolic than anyone thought.

    Interestingly, since the Knowledge Graph was introduced, there seems to be less emphasis on Google+ content from Google’s SERPs in some cases. For example, before, with Search Plus Your World, a search for “music” might have brought up the Google+ profiles of random artists in a box on the side, but now, that query will bring up knowledge graph results for people. From there, you can click on the artist you want, where you’ll be directed to a different SERP specifically for that artist.

    When you are on the SERP for a particular person, however, you might see Google+ profiles. This is the case with Mark Zuckerberg, for example.

    Some users have complained since SPYW launched that there is too much Google+ in search results now, but Google also made an algorithmic change in March that may have toned that down a bit too.

    Google is tasked with quite the balancing act in trying to use its properties to grow Google+, while not sacrificing search relevancy in the process.

    Do you think Google’s results are the best they’ve ever been? Do you think they’ve improved in the past five years? Let us know what you think.

  • Matt Cutts: Here’s How To Expose Your Competitors’ Black Hat SEO Practices

    Google’s Matt Cutts put out a Webmaster Help video discussing how to alert Google when your competitors are engaging in webspam and black hat SEO techniques. The video was in response to the following user-submitted question:

    White hat search marketers read and follow Google Guidelines. What should they tell clients whose competitors use black hat techniques (such as using doorway pages) and who continue to rank as a result of those techniques?

    Do you think Google does a good job catching webspam? Let us know in the comments.

    “So first and foremost, I would say do a spam report, because if you’re violating Google’s guidelines in terms of cloaking or sneaky JavaScript redirects, buying links, doorway pages, keyword stuffing, all those kinds of things, we do want to know about it,” he says. “So you can do a spam report. That’s private. You can also stop by Google’s Webmaster forum, and that’s more public, but you can do a spam report there. You can sort of say, hey, I saw this content. It seems like it’s ranking higher than it should be ranking. Here’s a real business, and it’s being outranked by this spammer…those kinds of things.”

    He notes that there are both Google employees and “super users” who keep an eye on the forum and can alert Google about issues.

    “The other thing that I would say is if you look at the history of which businesses have done well over time, you’ll find the sorts of sites and the sorts of businesses that are built to stand the test of time,” says Cutts. “If someone is using a technique that is a gimmick or something that’s like the SEO fad of the day, that’s a little less likely to really work well a few years from now. So a lot of the times, you’ll see people just chasing after, ‘OK, I’m going to use guest books’, or ‘I’m going to use link wheels’ or whatever. And then they find, ‘Oh, that stopped working as well.’ And sometimes it’s because of broad algorithmic changes like Panda. Sometimes it’s because of specific web spam targeted algorithms.”

    I’m sure you’ve heard of Penguin.

    He references the JC Penney and Overstock.com incidents, in which Google took manual action. For some reason, he didn’t bring up the Google Chrome incident.

    This is actually a pretty timely video from Cutts, as another big paid linking controversy was uncovered by Josh Davis (which Cutts acknowledged on Twitter). Google ended up de-indexing the SEO firm involved in that.

    “So my short answer is go ahead and do a spam report,” Cutts continues. “You can also report it in the forums. But it’s definitely the case that if you’re taking those higher risks, that can come back and bite you. And that can have a material impact.”

    He’s not joking about that. Overstock blamed Google for “an ugly year” when its revenue plummeted. Even Google’s own Chrome penalty led to some questions about the browser’s market share.

    Cutts notes that Google is also happy to get feedback at conferences, on Twitter, and on blogs and forums, “if you’re seeing sites that are prospering and are using black hat techniques.”

    “Now, it’s possible that they have some low-quality links, and there are some links that people aren’t aware of that we see that are actually high quality,” Cutts notes. “But we’re happy to get spam reports. We’re happy to dig into them. And then we’ll try to find either new algorithms to try to rank the things more appropriately in the future. Or we’re certainly willing to take manual action on spam if it’s egregious or if it violates our guidelines. We have a manual web spam team that is willing to respond to those spam reports.”

    According to Cutts, you can even submit spam reports using Google Docs. Here’s a conversation he had on Twitter recently:

    @mattcutts Can we send a link to a Google Docs spreadsheet when reporting spam? #penguin

    After Google launched the Penguin update, Cutts tweeted the following about post-Penguin spam reports:

    To report post-Penguin spam, fill out https://t.co/di4RpizN and add “penguin” in the details. We’re reading feedback.

    Shortly thereafter, he tweeted:

    @Penguin_Spam yup yup, we’ve read/processed almost all of them. A few recent ones left.

    I’m sure plenty more reports have rolled into Google since then, but it does seem like they process them fairly quickly.

    Do you think Google has done a good job at cleaning up webspam? Share your thoughts.

  • Can Search Save Yahoo?

    To say that Yahoo has had its share of problems in the past few years is an understatement. The most recent news of the now former CEO Scott Thompson and his resume scandal has only added to the disorder surrounding Yahoo. The once highly regarded Internet giant has experienced all types of turmoil including numerous management changes, extensive layoffs, and the closing of multiple properties, all of which have raised a lot of questions about the company’s future.

    Do you think Yahoo can make a comeback in search? Let us know in the comments.

    The company’s “Search Alliance” with Microsoft leaves plenty of questions about the company’s future in search, as well, but there are rumors going around that the deal may not play out as planned. We spoke with Kevin Ryan, the CEO of Motivity Marketing, who says that based on rumors floating around in the industry, the search alliance between Yahoo and Microsoft might not make its projected 10-year tenure.

    “There’s a lot of rumors in the business that [it] isn’t going well, and that it’s not going to make the full-decade run,” he tells us. “So, if I’m Yahoo, I’m spending a little bit of money trying to figure out how we can get that search bucket going.”

    When the agreement was reached, Search Engine Land’s Danny Sullivan spoke with us about the impact of the deal on each company. “[A] big win for Microsoft, a lot of questions for Yahoo,” he summarized.

    “Yahoo effectively threw in the towel with search,” says Ryan of the alliance.

    Despite the many reports on the struggling search alliance, David Pann, the General Manager of Search Networks at Microsoft, spoke with us last year and pointed out that the companies had already experienced success, even in its early stages.

    “It’s easy to say, ‘Okay, well, you didn’t do this, you didn’t that – it’s a failure,’” Pann explained. “I don’t think of that. I think that, given where we are, and given the complexity of the relationship… we’re actually making very good progress.”

    However, if the rumors are true and Yahoo does pull out of its agreement with Microsoft, the company will have to begin thinking about search again. The only other option would be to sell its search business completely.

    According to Ryan, there are several instances that could be credited as catalysts for Yahoo’s downward spiral, but he believes its Panama search ad platform played a large role in beginning the downfall. Panama was Yahoo’s attempt to monetize search, as Google did, but instead, Ryan says it was an “abject failure” for Yahoo.

    Ryan compares Yahoo’s current situation to what happened with Ask. The former search engine pulled out of the search industry in 2010 and has now transformed itself into a questions and answers service with an emphasis on mobile.

    There are some who would likely consider Yahoo to be a former search engine as well. Yahoo is currently in the process of reinventing itself as a media company, but the company’s short-term leaders have had difficulty in making this transition. Ryan, however, doesn’t believe this particular attempt is the best option for Yahoo.

    “Stop trying to reverse engineer HuffPo,” said Ryan, “and create something new.”

    But Yahoo’s performance as a media company has been nothing to sneeze at, and for many, Yahoo’s homepage works very well as a portal to the Web. Just take a look at Yahoo’s realtime homepage view counter on any given day. So far today (at 8:30 AM Pacific), the page has already seen over 73 million views.

    Yahoo Homepage views

    Still, Ryan tells us that Yahoo should reinvest in its core business, in which search plays a very big role. With a renewed focus in this area, he believes Yahoo could better serve consumers and also advertisers, which could help it get back on the right track.

    Ryan told us there could be opportunities for Yahoo in terms of social development. Both Google and Bing have yet to completely succeed in social, which leaves an open door for Yahoo. He believes that a renewed focus on search combined with the opportunities in social could help to begin to turn Yahoo around. In addition, Ryan would like to see Yahoo make drastic changes internally in order to streamline its processes and improve its culture.

    “I hope that the change that Yahoo makes will be very internal,” he said. “I hope that the culture internally will become much more positive, but we’ll see.”

    At this point, Ross Levinsohn is Yahoo’s interim CEO. Although some people believe he will become the permanent CEO, Yahoo has not given any official word on who its next leader will be.

    It’s inevitable that changes will happen at Yahoo in the coming months, but what they will be and whether or not they will be effective in saving the company are both still in question. On the positive side, Yahoo was able to finally reach an agreement with Alibaba Group. The $7 billion deal will require Yahoo to sell back half its stake in the Chinese company, but it brings resolution to one of its many problems.

    Do you think search should play a significant role in Yahoo’s future as a company? Share your thoughts in the comments.

  • Google Penguin Update: Don’t Forget About Duplicate Content

    There has been a ton of speculation regarding Google’s Penguin update. Few know exactly what the update does or how it works with Google’s other signals. Google always keeps its cards close to the vest.

    “While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics,” Google’s Matt Cutts said in the announcement of the update.

    He also said, “The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.”

    “We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings,” he said. To me, that indicates that this is about all webspam techniques – not just keyword stuffing and link schemes, but also everything in between.

    So it’s about quality guidelines. Cutts was pretty clear about that, and that’s why we’ve been discussing some of the various things Google mentions specifically in those guidelines. So far, we’ve talked about:

    Cloaking
    Links
    Hidden text and links
    Keyword stuffing

    Another thing on the quality guidelines list is: “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.”

    Of course, like the rest of the guidelines, this is nothing new, but in light of the Penguin update, it seems worth examining the guidelines again, if for no other reason than to provide reminders or educate those who are unfamiliar. Duplicate content seems like one of those that could get sites into trouble, even when they aren’t intentionally trying to spam Google. Even Google says in its help center article on the topic, “Mostly, this is not deceptive in origin.”

    “However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic,” Google says. “Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.”

    Google lists the following as steps you can take to address any duplicate content issues you may have:

    • Use 301s: If you’ve restructured your site, use 301 redirects (“RedirectPermanent”) in your .htaccess file to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)
    • Be consistent: Try to keep your internal linking consistent. For example, don’t link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.
    • Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We’re more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
    • Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
    • Use Webmaster Tools to tell us how you prefer your site to be indexed: You can tell Google your preferred domain (for example, http://www.example.com or http://example.com).
    • Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. In addition, you can use the Parameter Handling tool to specify how you would like Google to treat URL parameters.
    • Avoid publishing stubs: Users don’t like seeing “empty” pages, so avoid placeholders where possible. For example, don’t publish pages for which you don’t yet have real content. If you do create placeholder pages, use the noindex meta tag to block these pages from being indexed.
    • Understand your content management system: Make sure you’re familiar with how content is displayed on your web site. Blogs, forums, and related systems often show the same content in multiple formats. For example, a blog entry may appear on the home page of a blog, in an archive page, and in a page of other entries with the same label.
    • Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.

    Don’t block Google from duplicate content. Google advises against this, because it won’t be able to detect when URLs point to the same content, and will have to treat them as separate pages. Use the canonical link element (rel=”canonical”).
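To make the first redirect step and the canonical link element concrete, here is a minimal sketch of each. The domain and paths are hypothetical examples, not taken from Google’s guidelines; adapt them to your own site and check your server setup before deploying.

    ```apacheconf
    # .htaccess (Apache) — permanently redirect an old URL to its new home
    # so users, Googlebot, and other spiders all land on one version of the page.
    Redirect 301 /old-page/ http://www.example.com/new-page/
    ```

And for pages that legitimately duplicate another URL (a comment permalink, for instance), the canonical link element goes in the page’s head:

    ```html
    <!-- Points search engines at the preferred version of this content -->
    <link rel="canonical" href="http://www.example.com/page/" />
    ```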

    Note: there are reasons why Google might skip your canonical link elements.

    It’s important to note that Google doesn’t consider duplicate content grounds for a penalty unless it appears to have been used in a deceptive way or to manipulate search results. However, that seems like one of those areas where an algorithm might leave room for error.

    Here are some videos with Matt Cutts (including a couple of WebProNews interviews) talking about duplicate content. You should watch them, if you are concerned that this might be affecting you:

    This one comes from Google’s Greg Grothaus rather than Cutts. Also worth watching:

    If you think you’ve been wrongfully hit by the Penguin update, Google has a form you can fill out to let them know.

    More Penguin update coverage here.

    Tell us about duplicate content issues you’ve run into in the comments.

  • Google’s Amit Singhal: Penguin A Success

    Early this morning, Google Fellow Amit Singhal was interviewed by Danny Sullivan and Chris Sherman on stage at SMX London, the sister conference of Search Engine Land. Singhal discussed a variety of Google search-related topics.

    We were hoping to get some in-depth discussion of Google’s recent Penguin update, but apparently that wasn’t a major point of conversation. Daniel Waisberg liveblogged the discussion at Search Engine Land, and Penguin only came up briefly. Here’s the relevant snippet of the liveblog:

    Danny talks about Penguin and asks how it is going from Google standpoint, are search results better? Amit says that in the end of the day, users will stay with the search engine that provides the most relevant results. Google’s objective was to reward high quality sites and that was a success with Penguin. One of the beauties of running a search engine is that the search engines that can measure best what the users feel is the one that will succeed more.

    From Google’s perspective they use any signal that is available for them, more than 200 of them. They have to make sure they are accurate and good. They will use any signal, whether it is organic or not.

    “Google Penguin’s objective is to reward high quality sites and authors” Amit Singhal #smxlondon

    Panda and penguin update has gone really well… Can someone show amit the results for Viagra #smx

    @dannysullivan please ask Amit if he has any Penguin recovery tips apart from removing links #smx

    Google’s Matt Cutts also recently said that Google has considered Penguin a success, though plenty out there disagree.

    If you want Google’s advice on Penguin recovery, check out these videos Matt Cutts says to watch, these tips he endorsed on Twitter, and of course Google’s quality guidelines.

  • Can You Build A Business Without Google Traffic?

    ClickZ ran one of those good wake-up call type articles about search marketing. It comes with a Penguin headline, but it’s really about much more than Penguin. It’s about how you shouldn’t run your business relying too heavily on how Google is ranking your content.

    How dependent on Google is your business? Let us know in the comments.

    The article was written by Sage Lewis, who made one statement in particular, which I think is worth reflecting on a bit:

    “It is very possible to build a business without Google traffic.”

    It may not seem like it sometimes, but I believe Sage is right. Do you? Can you run a business without Google traffic?

    It may mean doing some things differently than you’re currently doing them. In fact, if you were hit by the Penguin update (legitimately), you’re definitely going to want to rely on tactics that don’t involve gaming search results. It’s just not a sustainable business model. Even if you weren’t hit, and you’re managing to get away with something, it could be only a matter of time. If you think Google launched this Penguin update and that’s the end of it, you’re living in a dream world. If it’s anything like the Panda update, we’ll see numerous iterations of it. Google launched 2 Panda data refreshes in April alone – bookends for the Penguin update.

    It will be interesting to see how often we even hear about new Penguin updates. Since it’s designed to hit spammers, I don’t expect we’ll see the amount of complaints we’ve seen with Panda, which is more about content quality.

    Of course, even if you were not hit, and you aren’t spamming Google, you still shouldn’t be putting all of your eggs in one basket, because Google does make over 500 changes to its algorithm each year. There is always the possibility that Google will make a change that starts ranking other things above you.

    By the way, when we’re talking about Google traffic, we’re talking about organic search. You can always buy AdWords ads. Lewis mentions a handful of other online marketing strategies, like AdCenter, Facebook ads, LinkedIn ads, display, social media marketing, content marketing that brings visitors directly to your site and email marketing (which still has tremendous effects, by the way).

    The good news is that if you run your business and your content as if you’re not worried about Google, you’ll probably find a lot more ways of driving quality traffic. Furthermore, if you are creating the kind of content that does well in other channels, it’s likely that Google will take this into consideration too, and you’ll have a variety of traffic sources, which end up including Google anyway.

    As you’ve read over and over again, it really is a matter of providing worthwhile content and products (or just content, if your content is your product). If it’s high quality, and has something to offer that people aren’t getting elsewhere, it is more likely to be shared across various social networks, talked about, and linked to. These things can drive traffic on their own, but it’s also the content that Google wants to rank well.

    Google’s advice is not to worry about specific algorithm changes so much, and focus on good content. Sure, it’s possible to play to certain signals Google uses, but that piece of advice really is more than just hot air from Google. There really is a great deal of merit to that mentality.

    Still, it never hurts to keep up with Google’s latest algorithm changes and be aware of what’s going on.

    Do you think a business can survive without Google traffic? Let us know in the comments.

  • Should The Google Penguin Update Hit Sites Like WPMU.org?

    We recently told you about WPMU.org apparently getting hit by Google’s Penguin update. The site went from 8,580 visits (pretty standard for the site, having looked through the Analytics myself) to 1,527 a week later. It’s been hovering around similar numbers ever since, with a pretty clear dip right around Penguin time.

    Do you think this site deserved to get hit by Penguin? Let us know in the comments.

    Penguin drop

    We spoke with James Farmer, Founder and CEO of Incsub, which runs the site. Farmer maintains that WPMU.org engages in no keyword stuffing or link schemes, and has no quality issues. In fact, the site has actually done well throughout Google’s series of Panda updates.

    Farmer tells WebProNews, “We did great after Panda, it was like that update recognized we were decent folk… you can’t win them all huh?”

    “Apart from not being able to guess what Google was going to do in April, 3 years ago, we haven’t done anything wrong,” he says.

    Last week, Farmer received some second-hand info from Google’s Matt Cutts, who reportedly spoke with the Sydney Morning Herald about WPMU.org. According to Farmer, Cutts provided three problem links pointing to the site. These included a site pirating their software and two links from one spam blog using an old version of one of their WordPress themes with a link in the footer. Farmer reported that Cutts “said that we should consider the fact that we were possibly damaged by the removal of credit from links such as these.”

    It’s pretty interesting that such links, if they really were the problem, could have such a tremendous impact. It’s no wonder there have been so many discussions about negative SEO (competitors attacking each other with these kinds of tactics) since Penguin launched.

    The site has more than 10,400 Facebook likes, 15,600 Twitter followers, 2,537 +1s and 4,276 FeedBurner subscribers, according to Farmer. Apparently not enough to outweigh some questionable links from third parties.

    “How could a bunch of incredibly low quality, spammy, rubbish (I mean a .info site… please!) footer links have made that much of a difference to a site of our size, content and reputation, unless Google has been absolutely, utterly inept for the last 4 years (and I doubt that that’s the case),” Farmer wrote in his article on the matter.

    When asked how many links he has out there just from footers for WordPress themes, he tells WebProNews, “Given that we stopped adding links years ago, actually not that many at all.”

    “However, the challenge is that given that we provided themes to a lot of multisite installs, which have since become overrun with splogs, there’s an enormous amount of links from not that many actual root domains,” he adds. “I’d guesstimate 1-2K, 99% of clearly low quality sites.”

    We asked if he’s heard from other WordPress theme creators having similar issues.

    “Actually no, although that doesn’t surprise me that much,” he says. “Not many folk are as open as us, and in this field they probably have good reason to be. WordPress [themes] are very, very competitive so I wouldn’t be surprised if 9/10 competitors had something to hide!”

    Like many webmasters, Farmer just doesn’t know what to expect from Google, in terms of whether or not Google will consider the site to be one of the innocent casualties of Penguin.

    “I have no idea, I would love it if they did. I guess the thing I’m begging for is some sort of qualitative mechanism (NOT the manual webspam web, faster approach) that allows quality operators, like us, to survive and carry on providing Google users exactly the kind of helpful content they need!”

    Google does have a form that users can submit if they think they’ve been wrongfully hit by the Penguin update.

    Google’s Matt Cutts recently told Danny Sullivan that Google considers the Penguin update a success, despite the large number of complaints from those commenting on blogs and in forums. Of course, the Penguin update, much like the Panda update, should be periodically coming back around, giving sites a chance to make fixes and recover. That also means, however, that sites will have more chances to get hit.

    We asked Farmer if he thinks Penguin has helped or hurt search results in general, outside of his site’s issues.

    “Especially in the WP field they have gone wild,” he emphasizes. “For example our flagship site WPMU DEV – if you go to search for that now a competitor writing something ridiculous about us and copyright appears above our massively popular Facebook page. It even looks like our YouTube channel has been demoted. Crazy stuff.”

    We’ve certainly seen some other questionable search results following the update, and others have complained aplenty.

    Do you think the search results have improved since Penguin? Should WPMU have been hit by Penguin? Share your thoughts.

  • Google PageRank Applied To Cancer Outcome Prediction

    While PageRank may still be a huge part of Google’s search algorithm, some feel the model is outdated, and are looking for new approaches to web search. That’s not stopping scientists from finding interesting applications for PageRank, however.

    Earlier this year, we looked at a story about Washington State University chemistry professor Aurora Clark who claimed to have adapted Google’s PageRank algorithm for use in moleculaRnetworks, which is designed to enable scientists to determine molecular shapes and chemical reactions “without the expense, logistics and occasional danger of lab experiments.”

    In fact, we also interviewed her:

    More recently, a study, published in the Public Library of Science journal Computational Biology, looked at improving outcome prediction for cancer patients by network-based ranking of marker genes, using Google’s PageRank concept. The abstract for the study says:

    Predicting the clinical outcome of cancer patients based on the expression of marker genes in their tumors has received increasing interest in the past decade. Accurate predictors of outcome and response to therapy could be used to personalize and thereby improve therapy. However, state of the art methods used so far often found marker genes with limited prediction accuracy, limited reproducibility, and unclear biological relevance. To address this problem, we developed a novel computational approach to identify genes prognostic for outcome that couples gene expression measurements from primary tumor samples with a network of known relationships between the genes. Our approach ranks genes according to their prognostic relevance using both expression and network information in a manner similar to Google’s PageRank. We applied this method to gene expression profiles which we obtained from 30 patients with pancreatic cancer, and identified seven candidate marker genes prognostic for outcome. Compared to genes found with state of the art methods, such as Pearson correlation of gene expression with survival time, we improve the prediction accuracy by up to 7%. Accuracies were assessed using support vector machine classifiers and Monte Carlo cross-validation. We then validated the prognostic value of our seven candidate markers using immunohistochemistry on an independent set of 412 pancreatic cancer samples. Notably, signatures derived from our candidate markers were independently predictive of outcome and superior to established clinical prognostic factors such as grade, tumor size, and nodal status. As the amount of genomic data of individual tumors grows rapidly, our algorithm meets the need for powerful computational approaches that are key to exploit these data for personalized cancer therapies in clinical practice.

    The Author Summary says:

    Why do some people with the same type of cancer die early and some live long? Apart from influences from the environment and personal lifestyle, we believe that differences in the individual tumor genome account for different survival times. Recently, powerful methods have become available to systematically read genomic information of patient samples. The major remaining challenge is how to spot, among the thousands of changes, those few that are relevant for tumor aggressiveness and thereby affecting patient survival. Here, we make use of the fact that genes and proteins in a cell never act alone, but form a network of interactions. Finding the relevant information in big networks of web documents and hyperlinks has been mastered by Google with their PageRank algorithm. Similar to PageRank, we have developed an algorithm that can identify genes that are better indicators for survival than genes found by traditional algorithms. Our method can aid the clinician in deciding if a patient should receive chemotherapy or not. Reliable prediction of survival and response to therapy based on molecular markers bears a great potential to improve and personalize patient therapies in the future.
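    The “manner similar to Google’s PageRank” described above is, at its core, power iteration over a graph: a node’s score depends on the scores of the nodes linking to it. The study’s actual method, gene network, and parameters aren’t published in this article, so the following is only a minimal illustrative sketch of PageRank-style ranking — the graph, node names, and damping value are all hypothetical:

    ```python
    # Minimal PageRank-style ranking (illustrative only; not the study's code).
    # Nodes might be genes and edges known interactions; an optional per-node
    # weight vector could bias the restart distribution, loosely analogous to
    # coupling in expression measurements as the abstract describes.

    def pagerank(graph, weights=None, damping=0.85, iters=100, tol=1e-8):
        """graph: dict mapping node -> list of outgoing neighbor nodes."""
        nodes = list(graph)
        n = len(nodes)
        # Restart (teleport) distribution: uniform unless weights are supplied.
        if weights is None:
            restart = {v: 1.0 / n for v in nodes}
        else:
            total = sum(weights.values())
            restart = {v: weights.get(v, 0.0) / total for v in nodes}
        rank = {v: 1.0 / n for v in nodes}
        for _ in range(iters):
            new = {v: (1.0 - damping) * restart[v] for v in nodes}
            for v in nodes:
                out = graph[v]
                if not out:
                    # Dangling node: redistribute its mass via the restart vector.
                    for u in nodes:
                        new[u] += damping * rank[v] * restart[u]
                else:
                    share = damping * rank[v] / len(out)
                    for u in out:
                        new[u] += share
            if sum(abs(new[v] - rank[v]) for v in nodes) < tol:
                rank = new
                break
            rank = new
        return rank

    # Tiny hypothetical network: "C" is pointed to by both "A" and "B",
    # so it ends up with the highest score.
    g = {"A": ["C"], "B": ["C"], "C": ["A"]}
    scores = pagerank(g)
    assert max(scores, key=scores.get) == "C"
    ```

    The same iteration works whether the nodes are web pages or genes; only the graph and the restart weights change, which is what makes PageRank so portable across domains.
    
    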

    I’m not going to pretend like I understand the ins and outs of this complex study, and try to dissect it here, but if you want to dig through it, you can do so here.

    (via txchnologist)

  • Is DuckDuckGo Gaining Ground on Google?

    Is DuckDuckGo Gaining Ground on Google?

    In recent years, the search industry has not changed a lot in terms of large players. Google has maintained the leader position with its ownership of nearly 70 percent of search market share. The #2 and #3 spots have changed slightly after the Microsoft-Yahoo Search Alliance in 2009. According to the most recent Experian-Hitwise statistics, Bing-powered search has risen to 30 percent.

    Hitwise April Search Market Share Report

    In 2010, Ask exited the search business to focus its efforts on a Q&A service and mobile endeavors. Also in 2010, Blekko launched with the goal of becoming the “#3 search engine” by tackling the growing problem of spam with its slashtag technology approach.

    Other than these events, the search player side of the industry has been relatively quiet. There is, however, current talk of Yahoo re-entering the search market and pulling out of its 10-year search deal with Microsoft. This week, the company introduced Yahoo Axis, which could be its first move in this direction.

    Still, there is one more player that we have yet to mention – DuckDuckGo. This search engine launched quietly in 2008 and has stayed somewhat low on the radar until recently. It is getting a lot of attention now though for the bold position it is taking on major issues.

    Gabriel Weinberg, Founder of DuckDuckGo When we talked with the search engine’s founder Gabriel Weinberg last year, he told us that DuckDuckGo was focused on building a search alternative to Google. The search engine separates itself from other search engines through its Zero-Click Info feature that provides instant answers to search queries, its user experience that is free of both spam and clutter, and its privacy protections for users.

    Over the past year, DuckDuckGo has ramped up its efforts in each of these areas, and as a result, users are noticing. In a recent conversation with Weinberg, he tells us that DuckDuckGo receives just under 50 million search queries a month, which translates into about 1.5 million each day. In other words, the search engine has more than doubled its traffic.

    “We’ve grown gradually since the beginning, but we had a major uptick at the beginning of the year when we launched a visual redesign,” said Weinberg.

    He explained that DuckDuckGo made around 100 changes that it rolled out in that redesign. More recently, the search engine announced an effort that encourages developers to add instant answer plugins to DuckDuckGo. The initiative is called DuckDuckHack and is geared toward making the search experience faster and more relevant.

    DuckDuckHack Example

    Local, mobile, and social are also just as important to DuckDuckGo as they are to other search engines. Instead of viewing them as individual products, however, it uses an “umbrella approach” for them. In other words, all these areas are incorporated into finding the best search results.

    “Instead of tailoring results to you personally,” says Weinberg, “we, instead, return results that are generally known to be good.”

    Ultimately, DuckDuckGo is trying to improve in all the areas that Google seems to be lacking in. For instance, Google has had many struggles regarding content farms and spam in the past couple of years. Although it has attempted to address these issues with the ongoing Panda updates, some people, including Weinberg, believe the problems still exist.

    “A lot of it seems opaque to me,” said Weinberg. “I’m sure there’s a ton of changes, but I still see a lot of the same kind of, what I consider, content farms on Google.”

    The privacy issues against Google continue to build as well. The search giant has received scrutiny from both the U.S. and Europe, and after releasing its new privacy policy earlier this year, it has gotten even more criticism.

    For DuckDuckGo, this turn of events creates an opportunity. As users become more dissatisfied with Google, Weinberg is hoping that they’ll look to his search engine as an alternative.

    “We’re making the case that there are certainly some users who would prefer to be tracked a lot less,” he said. “I really think there is a percentage who prefer alternative experiences.”

    The irony in all this is that DuckDuckGo makes money the same way that Google does – through advertising based on search queries. But, “you don’t have to track users to do that,” Weinberg says.

    “The problem is that they want to serve better ads across their sites where you don’t have that search query to serve an ad against,” he further explained.

    While it is possible that DuckDuckGo could begin to pull away some of Google’s search market share, Weinberg tells us that DuckDuckGo has no desire to become a big corporation. Web search is the company’s #1 priority at this point, and he intends to keep it that way.

    “Our goal really is just to build a nice alternative search engine that… a decent percentage of people would prefer as their search engine of choice,” he said.

  • Google Penguin Update: There Hasn’t Been One Since The First One

    As previously reported, there has been some chatter in the forums speculating that Google may have launched another Penguin update. That’s not the first time this has happened since the original one, and it will surely not be the last, but rest assured, there has only been one Penguin update so far.

    A Google spokesperson tells WebProNews: “There hasn’t been an update since the first one.”

    It doesn’t get any clearer than that.

    Of course, one Googler recently said that Google didn’t even have anything called Penguin, so I guess you can never be 100% sure.

    That said, I’m pretty confident that this particular Googler is right. Even the speculation about the possible update has been mixed. Some are attributing traffic dips to the holiday weekend.

    There’s also the fact that Google makes changes every day. We should soon be seeing the big list for the month of May.

    In the meantime, you’d probably do well to focus on making your site and content as good as they can be, and keep it all within Google’s quality guidelines. Also, try to make sure if you hire an agency to do your SEO, that they’re not engaging in any paid linking on your behalf.

    You can still expect Penguin to be coming back around sooner or later.

  • Nielsen Reveals Top Online US Brands & Travel Sites

    Nielsen Reveals Top Online US Brands & Travel Sites

    Nielsen is always slaving away to find out what’s hot and what the latest trends tell us about consumer behavior. This time they were looking at who the top online brands were in the United States.

    They also examined who the top travel brands were from online search. As with many studies on popular culture, the results are not really a surprise.

    Who came out on top? Of course, Google did, and in more ways than one. Obviously they won for search, but they were also the most clicked on travel brand with their Google Maps site. As you can imagine, they were followed by MapQuest in the travel category, with Yahoo Local and TripAdvisor trailing very far behind.

    As for the top US search brands, the list plays out as follows: Google, Facebook, Yahoo, MSN/WindowsLive/Bing, YouTube, Microsoft, AOL Media, Amazon, Wikipedia, then Apple. Again, nothing really comes as a surprise from that list.

    The study was conducted in April of this year, when the United States had 210 million active internet users. On average, people spent 29 hours browsing online during April. So that’s the latest word from Nielsen; we like Google, Yahoo, and Facebook.