WebProNews

Tag: SEO

  • Matt Cutts Gives An Update On Penguin

    Google’s Matt Cutts participated in a keynote discussion at SMX Advanced, and as you might have guessed, the topics of Google’s Penguin and Panda updates came up more than once.

    Matt McGee liveblogged the event for the SMX-affiliated Search Engine Land, and quoted Cutts throughout. Cutts also answered a bunch of questions on Twitter, so some of those tweets will be sprinkled throughout this article.

    The first question was about Penguin. According to this liveblogged account, Cutts said, “We look at it [as] something designed to tackle low-quality content. It started out with Panda, and then we noticed that there was still a lot of spam and Penguin was designed to tackle that. It’s an algorithmic change, but when we use a word like ‘penalty,’ we’re talking about a manual action taken by the web spam team — it wasn’t that.”

     

    @dyksta algo changes can result in abrupt drops, b/c an algo can be launched quickly. To differentiate, check for manual action in console.
    6 hours ago via web · powered by @socialditto

    When asked if Google just did another Penguin update, Cutts said, “No.”

     

    @damienpetitjean no Panda or Penguin updates going out recently, no.
    7 hours ago via web · powered by @socialditto

    Cutts does say both Penguin and Panda updates will happen on roughly a monthly basis:

     

    @GaryLHenderson and that’s a ballpark estimate, but it’s held pretty true for Panda data updates for example.
    6 hours ago via web · powered by @socialditto

    Cutts also confirmed that WPMU.org recovered because of the latest Penguin data refresh and the site’s cleanup efforts. He says there’s no whitelist with Penguin (or Panda).

     

    @RossHudgens and the more general statement is that there is no whitelist for Penguin, just as there is no whitelist for Panda.
    6 hours ago via web · powered by @socialditto

    He also said, “WPMU had a pretty good number of links from lower quality sites.”

    Comments Cutts made in the past about starting over with your site if you were hit by Penguin scared some webmasters, and the subject was brought up again during the keynote. “Sometimes you should. It’s possible to recover, but if you’re a fly-by-night spammer, it might be better to start over,” he’s quoted as saying.

     

    @AndyBeal I think the site that set the record most recently had nine completely different things that we flagged on it. Sheesh. cc @jenstar
    7 hours ago via web · powered by @socialditto

     

    @thompsonpaul I take care of the penguin and he takes care of lots of search results 🙂
    7 hours ago via web · powered by @socialditto

  • SMX Advanced: The Periodic Table Of SEO: 2012 Edition [Live Twitter Reaction]

    SMX Advanced is officially underway in Seattle, and the first session was: The Periodic Table Of SEO: 2012 Edition. SMX’s official description for the session is as follows:

    We introduced the Periodic Table Of SEO last year at SMX Advanced. Since then, new elements have been discovered, such as a penalty for pages top-heavy with ads or boosts for being in Google+. Meanwhile, Google warns that an “over-optimization” penalty may be coming.

    Here’s the speaker list:

    Moderator: Danny Sullivan, Editor-in-Chief, Search Engine Land (@dannysullivan)
    Q&A Moderator: Jonathon Colman, Internet Marketing Manager, REI (@jcolman)
    Speakers:
    Jeff MacGurn, VP of SEO, Covario (@yerrbo)
    Mark Munroe, Senior Director, SEO, Reply (@markemunroe)
    Kristine Schachinger, Founder/Consultant, SitesWithoutWalls.com (@schachin)
    Chris Silver Smith, President, Argent Media (@si1very)

    Here’s what attendees are saying about the session on Twitter (we’ll update as the tweets pour in, so feel free to keep refreshing). It will be almost like being there:

    I have never seen Danny so dressed up! Doesn’t he know he’s in Seattle? #11A #smx
    3 minutes ago via Twitter for iPhone · powered by @socialditto

    @mattcutts is being fed grapes while sacrificing paid links backstage. If @dannysullivan says it. It must be true #smx #11a
    6 minutes ago via Twitter for iPad · powered by @socialditto

    So much Matt Cutts humor. He is keynoting at the event this evening, by the way.

    @si1very about Ranking Factors: Quality/Trust Scores can be assessed separately from prominence #smx #11a
    4 minutes ago via Twitter for iPad · powered by @socialditto

    Over 100+ factors can be assessed in quality scores (though not all will be triggered) @si1very #11A #SMX
    4 minutes ago via TweetDeck · powered by @socialditto

    Someone in the front row has not muted his or her very annoying computer or phone. Please mute! #smx #11a
    3 minutes ago via TweetDeck · powered by @socialditto

    Brands rule, always have and always will #smx #11A
    5 minutes ago via web · powered by @socialditto

    Privacy Policy important for trust!! #smx #11a
    3 minutes ago via web · powered by @socialditto

    @si1very says: the high quality sites out there have good “About Us” pages. Low quality sites don’t #smx #11a
    3 minutes ago via Twitter for iPad · powered by @socialditto

    @si1very quality authorship indicators (bylines, bios, social links, etc) are also quantifiable #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    #smx #11a only about 8% of websites have a privacy policy
    2 minutes ago via Twitter for iPad · powered by @socialditto

    @si1very speculates a low-quality writing penalty? Learn how to write well folks. And don’t buy $3 content! #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    @si1very Content (and links) above the fold and page width given more importance. #smx #11a
    3 minutes ago via Twitter for iPad · powered by @socialditto

    Users spend 80% of their time looking at info above the page fold. #smx #11a
    3 minutes ago via TweetDeck · powered by @socialditto

    Good thing Google is adding its Browser Size tool to Google Analytics, where you can see what content is actually above the fold on your pages.

    SEO may be getting supplanted by User-Centered Design and Usability. We need to do what’s good for users not rankings #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    Great #SEO open by @si1very, simple trust signals & the thought that user-centric design is a growing rank factor. #smx #11A
    2 minutes ago via Twitter for iPhone · powered by @socialditto

    Microformats are a factor. Users click results with snippets more. And understand what they’re clicking more – @yerrbo #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    @yerrbo Talking micro formatting and rich snippets boosting CTR & rankings at #smx #11a
    4 minutes ago via Twitter for iPad · powered by @socialditto

    If you don’t use rich snippets. The online world will turn into a Ben Affleck movie with an Aerosmith ballad. @yerrbo #smx #11a
    3 minutes ago via Twitter for iPad · powered by @socialditto

    Filtering Google’s search results using left nav filters instantly excludes sites that don’t have semantic markup / rich snippets #smx #11a
    3 minutes ago via TweetDeck · powered by @socialditto

    Do-Follow, exact keyword anchor text links are seeing lowest correlation to strong rankings. High benefits to non-match #smx #11a
    3 minutes ago via Twitter for iPad · powered by @socialditto

    If Google offers a search filter and your site does NOT use rich snippet/microformats you will not be in the result #11a #smx via @yerrbo
    3 minutes ago via TweetCaster for Android · powered by @socialditto

    Even Danny’s taking the time to chime in on Twitter:

    Links are the four letter word of the SEO industry even though it has five letters @yerrbo #smx #11a
    2 minutes ago via web · powered by @socialditto

    Social Signals Test: G+ most, Facebook like least, twitter over time, Pinterest strong overall. @yerrbo #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    Using only social signals, a new page with no other links or onsite optimization grew to position 12 quickly. @yerrbo #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    Pinterest strong overall. No wonder Bing’s Duane Forrester recommends it so highly as a Penguin recovery tip.

    Testing what social signals help with Google surprise, Google+ was huge, but also Pinterest seems to build good links @yerrbo #smx #11a
    2 minutes ago via Twitter for Android · powered by @socialditto

    Great analysis of the social explosion of the anti-SOPA “Operation Blackout” by @yerrbo #smx #11a.
    1 minute ago via Twitter for iPad · powered by @socialditto

    Interesting – Over 10k peices of content collected from social platforms on the SOPA blackout. Less then 15% became popular. #smx #11a
    3 minutes ago via TweetDeck · powered by @socialditto

    68% of content that goes viral on reddit is just an image hosted on imgur #smx #11a
    1 minute ago via web · powered by @socialditto

    Almost 70% of what goes viral on reddit is an image. @yerrbo #smx #11a. I’d guess 60% of that are memes!
    1 minute ago via Twitter for iPad · powered by @socialditto

    I’d add that the majority of those images are hosted on Imgur (which hit 2 billion page views per month recently). Read here about how Imgur can drive big traffic to product pages (and they don’t even have iPhone and Android apps yet; those are coming this fall).

    for sure @yerrbo is a #reddit guy. What time does the narwhal bacon? #smx #11a
    2 minutes ago via web · powered by @socialditto

    Need to find this Canadian guy who is responsible for 2% of all viral content #11A #smx
    2 minutes ago via web · powered by @socialditto

    Most content consumed at work hrs. Serious in am. Humorous in pm @yerrbo study of 10k pieces of #some & #SOPA virality #smx #11a
    2 minutes ago via TweetCaster for Android · powered by @socialditto

    @markmunroe 2005 SEO was fat, dumb, and happy. Not anymore 🙂 #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    #SEO 2005 buy some links build some doorway pages … not anymore Google is about relevance & trust @markmunroe #smx #11a
    47 seconds ago via TweetCaster for Android · powered by @socialditto

    Google giving answers right in the results. more knowledge graph can mean less clicks. Be fantastic to get clicked! #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    It just occurred to me to check the session’s hashtag on Google+. Not a lot happening there. Not a good sign for Google+ engagement, when the updates aren’t rolling in from an SEO conference.

    Google+ Engagement

    Google has thresholds for their “quality” metrics. You don’t know where you are, and you could plummet at any time. #smx #11a @markmunroe
    4 minutes ago via Twitter for iPad · powered by @socialditto

    Google Analytics likely not directly used as a quality indicator @markmunroe #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    Bounce rates probably aren’t factored either. Doesn’t mean they aren’t important for UX @markmunroe #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    “If your website sucks, get SUX” (Search UX) “- @markemunroe #smx #11a
    54 seconds ago via TweetDeck · powered by @socialditto

    Overall bounce rate may not be important. But Google sure knows when users immediately go back to Google from your site #smx #11a
    3 minutes ago via Twitter for iPad · powered by @socialditto

    Apparently we are more advanced monkeys @jaredmore #smx #11A
    3 minutes ago via web · powered by @socialditto

    “Google has perfect knowledge of how your site performs in the SERPs. You have none” – @markmunroe #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    “Everything starts with a question on the SERPs” – @markmunroe #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    Engagement tools help if they’re a response to the search query. Use relevant engagement tools that are specific to search queries #smx #11a
    1 minute ago via TweetDeck · powered by @socialditto

    #Content can only be judged in response to a #search query. @markmunroe #smx #11a
    1 minute ago via TweetDeck · powered by @socialditto

    Optimize for search experience. Are users finding on your site what they are searching for in Google? Via @markmunroe #smx #11a
    2 minutes ago via TweetCaster for Android · powered by @socialditto

    optimizing for UX not Company Goals? Problem is you get to be too info based, and not sales based. Balance #SMX #11A
    2 minutes ago via web · powered by @socialditto

    If you aren’t integrating user-testing and surveys into your strategy, you are missing out on major opportunities #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    Site sucks in Google? Maybe SUX issue: bad search user experience, searchers don’t find answer bounce back to results @markmunroe #smx #11a
    1 minute ago via Twitter for Android · powered by @socialditto

    To Google are users’ search needs being met? If yes..good If no..check your goals @markmunroe #smx #11a
    2 minutes ago via TweetCaster for Android · powered by @socialditto

    Are we really talking correlation v causation? Must we have this at every SEO conference? #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    correlation is not causation, sounds like @randfish #SMX #11A
    1 minute ago via web · powered by @socialditto

    @schachin – “Correlation doesn’t equal Causation.” I feel like I’ve heard some great SEO say that before. @randfishkin #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    Anyone at #11a of #smx – it’s FREEZING in here! Can we turn up the heat!
    1 minute ago via web · powered by @socialditto

    @schachin – “Observer perception is often incorrect.” #smx #11a
    2 minutes ago via Twitter for iPad · powered by @socialditto

    @si1very @schachin I feel like I’m back in my philosophy logic class! #11a #smx
    2 minutes ago via web · powered by @socialditto

    SEO changes but what users want does not #SMX #11A
    2 minutes ago via web · powered by @socialditto

    When Penguin hit, a lot of people were looking for Penguin-problems that were actually caused by Panda related issues @schachin #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    @dannysullivan FYI this room is freeeeeeeeeezing #smx #11a
    1 minute ago via Twitter for iPad · powered by @socialditto

    @StephStMartin We’re working on turning the cold air down in this session room 🙂 #smx #11a
    2 minutes ago via TweetDeck · powered by @socialditto

    Trying to understand the Correlative, Causal, Causation argument in #smx #11a? Go here: http://t.co/8eAi0BAx
    4 minutes ago via web · powered by @socialditto

    “Put 10 SEOs in a room and they’ll all have a different opinion” @schachin #smx #11a
    3 minutes ago via Twitter for iPad · powered by @socialditto

    @monicawright we’re getting the same message about Pinterest in #smx #11A
    3 minutes ago via HootSuite · powered by @socialditto

    Fantastic panel on new ranking factors &indicates Google is, or likely is using. Lots of stuff to implement. #smx #11a http://t.co/Wgv9ZmSH
    1 minute ago via Photos on iOS · powered by @socialditto

    People of #SMX — please help. Someone picked up my iPhone in the #11A session. I’m desperate. If you have it, pls turn it in to concierge?
    38 minutes ago via web · powered by @socialditto

    If you want more of a liveblogged account of the session, Barry Schwartz has one at Search Engine Roundtable.

  • Have You Seen This Matt Cutts Parody Twitter Account? [SEO Humor]

    Matt Cutts has quite a following for a search engine engineer, and with good reason. He’s in a pretty powerful position, leading the webspam team at Google, and being a “Distinguished Engineer” at the company and all. When webmasters want to know why the things they’re doing (or aren’t doing) are affecting their search rankings, Cutts (either directly or indirectly) is often the first person they turn to for advice.

    With this popularity comes parody. Sure, we’ve seen the fake Matt Cutts commenting throughout the blogosphere, pretending to be the real Matt Cutts, but other impostors are more transparent, and are clearly just having fun with no ill will intended.

    Take, for example, this Matt Cutts parody video:

    There’s a Matt Cutts parody account on Twitter: @rnattcutts. It’s been around for a while, and has amassed over 600 followers so far. It looks like the account started in mid-April, close to Penguin time. Here are some of the tweets:

    Team (WEBSP)AM-erica – Fuck Yeah!!!
    47 days ago via Echofon · powered by @socialditto

    Now I’ve taken away all your Anchor Text you’ve no choice but to create Great Content! Muahahaha!!
    47 days ago via Echofon · powered by @socialditto

    Remember to add all those 5 star Rich Snippets to your home page 🙂
    47 days ago via Echofon · powered by @socialditto

    SEO’s have you considered a career in PPC?
    47 days ago via web · powered by @socialditto

    Popcorn’s on & time for me to settle down for my favorite soap opera’s – Hello! Black hat forums
    47 days ago via Echofon · powered by @socialditto

    Stupid @Twitter having Avatar issues – wouldn’t happen on Google plus!
    47 days ago via Echofon · powered by @socialditto

    Time for another day of wack-a-mole aka work!
    47 days ago via Echofon · powered by @socialditto

    Dear SEO’s stop moaning & use @blekko if you hate our product!
    46 days ago via Echofon · powered by @socialditto

    .@skrenta got any jobs?
    46 days ago via Echofon · powered by @socialditto

    Am I made of cheese? I drank so much Sprite at Fat Steve’s party last night I think I’m hallucinating
    45 days ago via Echofon · powered by @socialditto

    Ok, SEO’s what’s the weirdest thing in your office? Winner gets an unnatural link warning sent to their competitor of choice.
    45 days ago via Echofon · powered by @socialditto

    Does anyone fancy meeting up later for a nice long conversation about great content?
    45 days ago via Echofon · powered by @socialditto

    Woot!! Finally won the Banana cup on Mario Kart Wii. What are your greatest lifetime achievements?
    44 days ago via Echofon · powered by @socialditto

    Going to feed the Panda again soon, boys & girls. Would appreciate you load some crap content to your site ASAP
    44 days ago via Echofon · powered by @socialditto

    If you’ve been caught up in the webspam updates we want to help you http://t.co/sVT9LfdK
    40 days ago via Echofon · powered by @socialditto

    Sorry for the lack of tweets this weekend. I had food poisoning after visiting a restaurant with over 63,000 five star rich snippet reviews.
    35 days ago via Echofon · powered by @socialditto

    Listening to some Tupac whilst burning down Link Farms…. Gangsta Matt is in the Hiz-zouse!!
    33 days ago via Echofon · powered by @socialditto

    If I have to sit through another one of Sergey’s PowerPoint presentations I’m going all “Falling Down” in this goddamn place!!
    24 days ago via Echofon · powered by @socialditto

    God these SEO’s are bunch of whiny pricks
    3 days ago via Echofon · powered by @socialditto

    As I said, the account is not maliciously impersonating Cutts. The bio says: “I’m the head of the web sparn team at Google. I love fried chicken & hate anchor text (parody).”

    Cutts tends to be a good sport about it all. Someone tweeted him a Google algorithm parody article about the “Scissors Update,” and he acknowledged that he has a pretty thick skin:


    @cstechjoel if I had a thin skin I would have switched to a different job years ago. 🙂
    10 hours ago via web · powered by @socialditto

  • Browser Size Tool Comes To Google Analytics, Lets You Analyze “Above The Fold”

    Google announced the launch of the Browser Size analysis tool in Google Analytics, under the In-Page Analytics report. The tool shades the portions of a page that are “below the fold,” and shows you what percentage of users are seeing how much of the page.

    “What is actually ‘above the fold’ on a web page is a significant factor to conversion rates,” says Gaal Yahas from Google’s Analytics team. “If visitors have to scroll to see an ‘add to cart’ button, or some other critical element, they may never get around to it. Analyzing the percentage of visitors for whom page elements fall beneath the fold or off to one side is difficult, so we’ve created a visualization that lets you quickly determine which portions of your page are visible to which percentages of visitors.”

    Browser Size in Google Analytics
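
    To make the idea concrete, here is a minimal Python sketch of the kind of analysis the tool automates; the viewport heights and element offsets below are made up for illustration, while the real tool derives the distribution from actual visitor traffic:

        # Made-up sample of visitor viewport heights in pixels.
        viewport_heights = [768, 600, 900, 1080, 768, 640, 800, 1050]

        def pct_above_fold(element_offset_px, heights):
            """Share of visitors whose viewport shows the element without scrolling."""
            visible = sum(1 for h in heights if h >= element_offset_px)
            return 100.0 * visible / len(heights)

        # Hypothetical element offsets measured from the top of the page.
        for label, offset in [("headline", 200), ("add-to-cart button", 750), ("footer", 2400)]:
            print(f"{label}: visible to {pct_above_fold(offset, viewport_heights):.0f}% of visitors")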

    This may prove to be a helpful SEO tool, as well, considering Google’s recent “Above the Fold” algorithm update, which penalizes pages with too many ads above the fold. In January, Google’s Matt Cutts wrote, “We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.”

    In fact, in that post on Google’s Webmaster Central blog, Google suggested using the Browser Size tool, which at that point was just part of Google Labs. That standalone version will be sunset in a month, and you’ll have to use Google Analytics instead.

    This is becoming a trend: Google is building its site optimization tools directly into Google Analytics. Last week, Google announced the addition of Content Experiments in Google Analytics, which takes the place of Google’s Website Optimizer:

    Both the Browser Size tool and the Content Experiments tool are rolling out to Google Analytics users over the next few weeks.

  • Matt Cutts To Keynote SMX Advanced, Leaves Panda, Florida Apparel At Home

    SMX Advanced gets started on Tuesday in Seattle. In what many, I’m sure, will consider the highlight, Google’s Matt Cutts will keynote.

    Here’s what the SMX site says about it:

    Google software engineer Matt Cutts will be returning to SMX Advanced in 2012. As the head of Google’s web spam team, Matt’s been dealing with webmaster issues for Google since 2000 and is well known to many advanced search marketers from his blog and public speaking.

    Matt will participate in an always popular and engaging “You&A” format keynote, in which he’ll address questions from the SMX Advanced audience. So bring those questions, and don’t miss out on this SMX Advanced tradition!

    Just don’t expect any Panda or Florida t-shirts. Cutts tweeted:

    Almost done packing for #SMX. Leaving T-shirts with pandas or Florida on them at home for this trip. http://t.co/ASJpwdYS
    22 hours ago via Twitter for Android · powered by @socialditto

    He didn’t say anything about Penguin apparel. There are certainly plenty of fashionable options in the market:

    Google results - Penguin shirts

    Hopefully he’ll show up in the “Come at Me Bro” (Google’s first shopping result):

    Come at Me Bro Penguin shirt

    Either way, I’m sure the subject of Penguin will come up more than once at the event. Google’s Penguin update was designed to tackle webspam. That is Cutts’ department, after all.

  • Matt Cutts Addresses Duplicate Content Issue In New Video

    This week, Google posted a new Webmaster Help video featuring Matt Cutts talking about a potential duplicate content issue. This time, he even broke out the whiteboard to illustrate his points.

    Specifically, Cutts addressed the user-submitted question:

    Many sites have a press release section, or a news section that re-posts relevant articles. Since it’s all duplicate content, would they be better off removing these sections even with plenty of other unique content?

    “The answer is probably yes, but let me give you a little bit of color about the reasoning for that,” Cutts says in the video. “So a lot of the times at Google, we’re thinking about a continuum of content, and the quality of that content, and what defines the value add for a user. So let’s draw a little bit of an axis here and think a little bit about what’s the difference between high quality guys versus low quality guys? Take somebody like The New York Times. Right? They write their own original content. They think very hard about how to produce high quality stuff. They don’t just reprint press releases. You can’t just automatically get into The New York Times. It’s relatively hard. Right?”

    “At the other end of this spectrum is the sort of thing that you’re talking about, where you might have a regular site, but then one part of that site, one entire section of that site, is entirely defined by maybe just doing a news search, maybe just searching for keywords in press releases,” he continues. “Whatever it is, it sounds like it’s pretty auto-generated. Maybe it’s taking RSS feeds and just slapping that up on the site. So what’s the difference between these?”

    “Well, The New York Times is exercising discretion,” Cutts explains. “It’s exercising curation in terms of what it selects even when it partners with other people, and whenever it puts other content up on its site. And most of its content tends to be original. Most of the time it’s thinking about, OK, how do we have the high quality stuff, as opposed to this notion– even if you’ve got high quality stuff on the rest of your site, what is the value add of having automatically generated, say, RSS feeds or press releases, where all you do is you say, OK. I’m going to do a keyword search for Red Widgets and see everything that matches. And I’m just going to put that up on the page.”

    “So on one hand, you’ve got content that’s yours, original content–there’s a lot a curation. On the other hand, you’ve got something that’s automated, something that’s more towards the press release side of things, and it’s not even your content. So if that’s the case, if you’re just looking for content to be indexed, I wouldn’t go about doing it that way.”
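
    If you want a rough sense of how duplicated a re-posted section really is, one textbook near-duplicate check is word shingling plus Jaccard similarity. The Python sketch below is only an illustration of that general technique, not Google’s actual detection method, and the example strings are fabricated:

        def shingles(text, k=3):
            """Return the set of k-word shingles in the text."""
            words = text.lower().split()
            return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

        def jaccard(a, b):
            """Jaccard similarity: |intersection| / |union|."""
            return len(a & b) / len(a | b) if a | b else 0.0

        # Fabricated example: a press release and a near-verbatim repost.
        press_release = "Acme launches red widgets with improved durability and lower cost"
        reposted_copy = "Acme launches red widgets with improved durability and lower cost today"

        score = jaccard(shingles(press_release), shingles(reposted_copy))
        print(f"similarity: {score:.2f}")  # a score near 1.0 flags likely duplication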

    For many in the SEO realm, there aren’t any new revelations here, but duplicate content continues to be something many worry about, even after so many years. It’s still covered by Google’s quality guidelines, and as you probably know, the Penguin update is designed to algorithmically enforce those guidelines, so that on its own is a good reason to exercise caution in this area.

  • Spinfographics: When Will Google Crack Down On Infographic Spam?

    Infographic spam may soon take its rightful place in the grand lineage of splogs, duplicate-content articles and mass directory submissions.

    Spinfographics. Get ready, because they’re about to flood the Internet.

    Never heard of “spinfographics” before? Well, if you know what a splog is, you will probably understand exactly what a spinfographic is. If not, we had best go back to the beginning.

    In the beginning there was Google.

    Google created the Internet, and saw that it was good.

    Then Google created websites, and saw that it was good.

    Then Google ranked websites, and saw that it was good.

    And Google told webmasters to make their websites for users, not for higher rankings. But webmasters were tempted, and they took of the fruit of the Tree of Knowledge, that they might be like Google and know the ranking algorithm.

    And nothing was the same again. Every time webmasters took another bite of the fruit, webmasterkind would spoil it. The pattern was always the same…

    1. Knowledge: A few people discover that they can rank better by adding more terms in the keywords meta tag.

    2. Temptation: Everybody decides to stuff their keyword meta tag so they can rank for everything.

    3. A big mess!: Suddenly rankings are scalable, everybody can do it and replicate with ease. Too much quantity, too little quality.

    4. Banishment: Google removes the keywords meta tag from its algorithm.

    Sometime later…

    1. Knowledge: People learn that directory links can be useful for ranking well.

    2. Temptation: Some people realize that if they can create tons of directories, they can get lots of webmaster traffic. Other people discover that if they can auto-submit sites, they can make money for building tons of links.

    3. A big mess!: Suddenly link-building is scalable, everybody can do it and replicate with ease. Too much quantity, too little quality.

    4. Banishment: Google devalues directory links in its algorithm.

    Then…

    1. Knowledge: People learn that article directory links can be useful for ranking well.

    2. Temptation: Some people realize that if they can create tons of article directories, they can get lots of webmaster traffic. Other people discover that if they can auto-submit articles, they can make money for building tons of links. Quickly. Cheaply.

    3. A big mess!: Suddenly article submissions are scalable, everybody can do it and replicate with ease. Too much quantity, too little quality.

    4. Banishment: Google devalues links from duplicate content in its algorithm.

    Then, of course…

    1. Knowledge: People figure out that if they spin each article into various versions, they can use the same basic content without creating duplicate content.

    2. Temptation: Some people realize if they can automate the spinning process, they can create lots of articles easily from the same content. Quickly. Cheaply.

    3. A big mess!: Suddenly article spinning is scalable, everybody can do it and replicate with ease. Too much quantity, too little quality. In fact, so little quality that it starts turning the Internet into a waste bin.

    4. Banishment: Google devalues spun content in its algorithm and penalizes heavy users.

    We are getting closer. And then…

    1. Knowledge: People figure out that keyword rich links in blog content are the best links for ranking well.

    2. Temptation: Some people realize how much money they can make by offering tons of in-content blog links for very little work by creating blogs just to sell links.

    3. A big mess!: Suddenly in-content blog link-building is scalable, and splogs (spam blogs) are popping up like weeds. Too much quantity, too little quality. Yes, the Internet really is looking more and more like a waste bin.

    4. Banishment: Google de-indexes whole networks of splogs and penalizes heavy users. Can you say “Penguin”?

    And next…

    1. Knowledge: Some people figure out that they can get lots of good links by sharing Infographics.

    2. Temptation: Infographics galleries start popping up and some people realize there is a market to be made selling “cheap, easy, DIY Infographics”.

    3. A big mess!: Suddenly Infographics creation and distribution becomes ______________ . Too much quantity, too little quality. (Fill in the blank. Hint, it rhymes with “shwalable”). Yes, we transition from Infographics to spinfographics.

    4. Banishment: Google _________________________ (Fill in the blanks). What do you think Google will do to spinfographers – to webmasters who mass produce and mass distribute Infographics?

    Listen carefully, and you can already hear the moaning and groaning on future webmaster forums, as people complain with surprise that their sites have been penalized or lost rankings because they were mass distributing Infographics to artificially boost their rankings.

    “But Google says, ‘The best way to get other sites to create relevant links to yours is to create unique, relevant content that can quickly gain popularity in the Internet community.’”

    OK, sure. But the pattern is always the same. Huge swarms of webmasters looking for shortcuts, trying to mass produce quality, totally oblivious to the oxymoron of their business model. And they spoil it for the rest of us. Already people are advertising services to create “easy” infographics “in minutes” for a very “cheap” price.

    Does this mean the days of Infographics are numbered? I don’t think so. There always has been a place for graphical displays of data. Newspapers have been doing it for decades, and it will continue on the Internet.

    However, I am certain that any popular link-bait strategy using Infographics today will be outdated a year or two from now. Smart webmasters will go back to the table and reconsider how to use Infographics to boost their promotions.

    Done right, I am confident that these will always be useful for search engine rankings. Just as blog links. And content spinning. And article links. And directory links. And…well, maybe not meta tags.

    Just as with all of these previous techniques, webmasters will have to make sure that it is perfectly clear to the search engines that they are not mass-producing, mass-linking or using a scalable or automated method to create or distribute content.

    And there is a single strategy that applies to all of these. Don’t do it for the search engines; do it for reaching out to new markets. Don’t ignore the search engines; keep one eye on them with everything you do. But if the main goal of any action is aimed at reaching new markets, you will end up creating and distributing the kind of content that Google wants you to. Or at least that Google is now saying that it wants you to – but that is another scary topic for another discussion.

    For now, the key thing is to avoid Spinfographics, because with the Penguin update, Google has shown that it is ready to do more than just devalue scalable links – they are willing to penalize sites involved.

  • Google: There Hasn’t Been Another Panda Update

    As reported, webmasters have been speculating that there may have been another Panda update based on some significant changes in referrals. We asked Google to confirm one way or the other. A Google spokesperson simply tells us, “There hasn’t been.”

    Still, Google also told us there had not been a new Penguin update just before Matt Cutts announced that there had been one.

    In a WebmasterWorld thread Barry Schwartz pointed to, there was some suspicion that there had been a Panda refresh on May 30, but taking Google at its word, that was apparently not the case.

    Today is June 1 though, and that means that it’s about time for Google to release its big list of algorithm changes for the month of May. There’s no guarantee that this list will come out today, but I wouldn’t be surprised. Sometimes they wait until a few days into the following month to release the lists. Once, they released it before the month was even over. Either way, I’m sure there will be plenty to dig into.

    More Panda coverage here.

  • Google Panda Update: More Speculation From Webmasters About Another One

    As usual, webmasters are speculating that there may have been a Panda update refresh. Once again, Barry Schwartz at Search Engine Roundtable is pointing to a WebmasterWorld thread where such discussion is taking place.

    Various people are reporting that they’re seeing significant changes, speculating that there was an update on May 30. We’ve reached out to Google for confirmation one way or the other. Sometimes they’ll confirm, others they won’t. Sometimes they’ll tweet about the refreshes. Other times they won’t.

    Right after Google told us last week there hadn’t been a Penguin update since the first one, Matt Cutts tweeted this:

    Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches. Context: http://t.co/ztJiMGMi
    1 day ago via web · powered by @socialditto

    So I guess you never can know for sure.

    One thing that I’m pretty sure of, however, is that Google will be releasing its monthly list of algorithm changes in the near future. It’s the last day of May, which means it’s just about time to see what all Google has done over the past month. These are always full of interesting nuggets of information.

    In other Panda news, Google is including some of its guidance for webmasters regarding the quality of content in its recently launched Webmaster Academy.

    Apparently Google’s Amit Singhal has been eavesdropping on people talking about it in coffee shops as well:

    Overheard next table at Starbucks, “With Google Panda and Penguin our tricks don’t work.” Glad they didn’t recognize me.
    14 hours ago via web · powered by @socialditto

    Image: Dancing Panda (YouTube)

  • Google Penguin Update: More On That Recovery Story From WPMU

    One site that got hit by Google’s Penguin update has managed to make a comeback, thanks to Google’s most recent data refresh of the algorithm update. WPMU’s story has been in the spotlight as an example of where a legitimate non-spammy site was hit by an algorithm update designed to attack webspam. We spoke with James Farmer who runs the site (which distributes WordPress themes) about the situation before the recovery, and now he’s shared some more thoughts about the whole thing with WebProNews.

    “I’m going to stick my neck out here and say I think we’re going to see a recovery *far better* than the traffic we experienced before,” Farmer tells us. “Which makes me wonder if we were being ‘kind’ [of] penalized before, but that the new penguin update has actually used something (like our social signals, for example) to really stamp that out.”

    “Needless to say I’m super happy,” he adds.

    Earlier this week, Ross Hudgens from Full Beaker, who provided some assistance and advice to Farmer, blogged about the recovery at SEOmoz. That post mentions that Farmer did some other things to improve quality, beyond the Penguin-specific advice he was given.

    “Well, because we didn’t want to appear at all self-promotional we didn’t mention that one of the things we did was to actually eat our own dog food and implement the Infinite SEO plugin we make at DEV (instead of all the other bits and bobs we were using). This gave us a nice clean and unbroken sitemap.”

    “Plus, and this one may be significant, given that we’re a WP site, we were getting a bunch of warnings in Webmaster Tools (and via our SEOMoz report) about dodgy URL parameters (? queries etc.) and so we implemented, finally, canonical URLs from all of the various comment permalinks etc. etc. to the actual posts themselves.”
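
    For illustration, here is a minimal Python sketch of the canonicalization idea Farmer describes, collapsing comment permalinks and other query-string variants down to one canonical post URL. The URLs and parameter names below are hypothetical; on a live site, the computed URL is what the page’s rel=canonical tag would point to:

        from urllib.parse import urlsplit, urlunsplit

        def canonical_url(url):
            """Drop the query string and fragment so parameter variants collapse."""
            parts = urlsplit(url)
            return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

        # Hypothetical comment-permalink and tracking variants of one post.
        variants = [
            "https://example.com/penguin-recovery/?replytocom=42",
            "https://example.com/penguin-recovery/?utm_source=feed",
            "https://example.com/penguin-recovery/#comment-42",
        ]
        for v in variants:
            print(canonical_url(v))  # all three collapse to the same URL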

    In the SEOmoz piece, Hudgens wrote:

    The most perilous piece of WPMU’s link profile came from one site – EDUblogs.org. EDU Blogs is a blogging service for people in the education space, allowing them to easily set up a subdomain blog on EDUblogs for their school-focused site – in a similar fashion to Blogspot, Typepad, or Tumblr, meaning that each subdomain is treated as a unique site in Google’s eyes. Coincidentally, this site is owned by WPMU and Farmer, and every blog on the service leverages WPMU theme packs. Each of these blogs had the “WordPress MU” anchor text in the footer, which meant a high volume of subdomains considered unique by Google all had sitewide “WordPress MU” anchor text. In what might have been a lucky moment for WPMU, this portion of their external link profile was still completely in their control because of WPMU ownership.

    In what I believe is the most critical reason why WPMU made a large recovery and also did it faster than almost everyone else, Farmer instantly shut off almost 15,000 ‘iffy’ sitewide, footer LRDs to their profile, dramatically improving their anchor text ratios, sitewide link volume, and more.

    We asked Farmer if the EDUblogs.org situation was the biggest factor in the recovery in his opinion, and whether he thought if that were the only thing that was addressed, the site would have still recovered.

    “A commentator on SEOMoz pointed out that WPMU DEV – our main business – is still linked to on Edublogs! And widely too!” he says. “However, as I responded to them, WPMU DEV has actually seen a significant growth (12%) in Google traffic in the last month, the second biggest monthly unseasonal bump there in the last couple of years.”

    “So I think that the removal of edublogs links may have been a factor, in fact I reckon it was a factor, but I think that Google identifying the strength of the site through our social presence and tweaking Penguin appropriately was the real deal,” he adds.

    “As we’ve been 100% above board with everything we’ve done and have to offer, I really hope we were a poster child for how Google could improve its results,” Farmer says.

    When WPMU first came into the spotlight, Google’s Matt Cutts had pointed out some specific problem link examples, which came in the footers of some WordPress themes that appeared on spam blogs. Part of Farmer’s post-Penguin strategy has been to request the removal of these links.

    “I don’t think anyone gets good referrals from footer links, apart from possibly great designers who insist on credits,” he says. “To me it was always just a ‘part of the same company’ thing. It just made sense, [but] clearly not!”

    “I think that if we do add them back it’ll be 100% branded and on relevant pages only (ie. the homepage, about page for the company). I think that given the relevance we’ll dofollow them,” he says.

    While he’s certainly a little biased in this department, Farmer says he thinks Google’s results in general have gotten better with the most recent Penguin refresh, saying “it’s like manna from heaven. Thank you for listening Google!”

    In a new post, Farmer says he’s getting record referrals since the latest Penguin update.

    Image: Batman Returns (Warner Bros.)

  • SEOmoz Analyst: Google Will Be Cracking Down On Directories More

    Earlier this month, there was some discussion about Google having de-indexed free web directories. Most of the ones we looked at had not actually been de-indexed, but were simply not ranking well; some, however, clearly had been de-indexed.

    Since then, SEOmoz has been doing some extensive data gathering, investigating the situation further. Kurtis Bohrnstedt, the company’s “Captain of Special Projects,” gathered a total of 2,678 directories and found only 200 of them to be banned, but an additional 340 to be penalized (as in not de-indexed, but not ranking for obvious terms where they would be the only result that makes sense).

    Still, that’s only 540 directories out of 2,678, or about 20 percent. It would seem that there are a lot more directories in the clear, but Bohrnstedt thinks this is only Google sending a warning, and that there is likely more to come.
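
    For illustration only, here is a toy Python sketch of the banned/penalized/clear bucketing described above; the rows are fabricated stand-ins for the 2,678 real directories in the study:

        from collections import Counter

        # Fabricated audit rows; the real study covered 2,678 directories.
        directories = [
            {"domain": "dir-a.example", "indexed": False, "ranks_for_own_name": False},
            {"domain": "dir-b.example", "indexed": True, "ranks_for_own_name": False},
            {"domain": "dir-c.example", "indexed": True, "ranks_for_own_name": True},
        ]

        def bucket(d):
            if not d["indexed"]:
                return "banned"      # de-indexed entirely
            if not d["ranks_for_own_name"]:
                return "penalized"   # indexed, but not ranking for obvious terms
            return "clear"

        print(Counter(bucket(d) for d in directories))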

    “That is not to say the ones left unharmed are safe from a future algorithmic update,” he writes. “In fact, I suspect this update was intended to serve as a warning; Google will be cracking down on directories. Why? In my own humble opinion, most of the classic, ‘built-for-SEO-and-links’ directories do not provide any benefit to users, falling under the category of non-content spam.”

    I wonder if that includes directories that are apparently built for SEO and links and charge webmasters for the chance to get links, but offer some form of editorial oversight.

    Interestingly, when this topic was being discussed a couple weeks ago, one webmaster said he had a paid directory he hadn’t touched in years, which was unexpectedly seeing an increase in PageRank.

  • iAcquire Gets Rid Of Paid Link Offerings Following Google De-Indexing

    Blogger Josh Davis recently put out an investigative report exposing the marketing firm iAcquire for engaging in paid links for clients. Once Google caught wind of it, iAcquire was de-indexed from Google’s search results.

    Now, the company has openly admitted to “financial compensation,” though it says it has been transparent about this with its clients. On Tuesday, iAcquire put out a blog post talking about the ordeal. Here’s a snippet:

    There are many methods to develop link relationships. Based on the client strategy we deploy a variety of approaches to link development, and in some cases we’ve allowed financial compensation as a tool. Removing financial compensation from the link development toolset has been a long term goal for us. We are using these recent events to be a catalyst to expedite those plans effective immediately.

    We do not mislead customers nor operate in any manner contrary to their wishes or directives. Every strategy we develop is done in conjunction with knowledgeable online marketing specialists from iAcquire and our clients. Our process is transparent – every aspect of a campaign is available to our customers. In the past, we have responded to the frequent needs for urgency and speed from our clients. We are going to take this opportunity to discuss with our clients the best approaches to ensure a long term strategy and horizon for their program.

    The company has been engaging in a lot of related conversation on Twitter:

     

    @jonahstein We’re not that concerned with being deindexed–we weren’t driving much traffic through search anyway. But thanks for the…
    17 hours ago via HootSuite · powered by @socialditto

     

    @righthatseo not exactly getting rocked, just nudging us in the right direction even quicker http://t.co/6RLQedRg
    17 hours ago via HootSuite · powered by @socialditto

     

    @jonahstein but of course we are working to comply with google in order to return to the search results
    17 hours ago via HootSuite · powered by @socialditto

     

    @craigaddyman Can’t speak for everyone else but we haven’t stopped working hard to be the best SEOs we can right now
    45 minutes ago via HootSuite · powered by @socialditto

    We may soon see other companies being exposed in similar fashion, as Davis recently told WebProNews, “I have come across some other smaller companies which seem to be doing it (maybe one other large one, but I am still researching that).”

    Google penalties from paid links, as we’ve seen in the past, can have big effects on big companies. Overstock.com even blamed Google’s penalty for an “ugly year”.

    Update: Davis now tells us, “I am not currently researching other companies large or small that may be buying undisclosed links. While my research for the initial piece did unearth what appeared to be other paid links, that was just a byproduct of my initial work. I have not further pursued examining any more links. It took so much time to do the ‘Search Secrets’ piece in thorough manner, I don’t intend to duplicate that amount of work again.”

  • Google Penguin Update Refresh & Recovery Provide Hope For Webmasters

    As previously reported, Google announced its first Penguin update since the original over Memorial Day weekend. Google’s head of webspam, Matt Cutts, tweeted about it, saying, “Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches.” Have you seen search referrals drop or rise since this update? Let us know in the comments.

    The good news, whether you were hit by Penguin the first time or this time, is that you can recover. We’ve now seen that this can happen, and since we know that Google will continue to push data refreshes for Penguin, there should be plenty of chances to do so. Just think about all the Panda refreshes we’ve seen since February 2011.

    We recently reported on WPMU, a seemingly quality site with plenty of fans on social media channels, which got hit by the first Penguin update. The site has now made a full recovery.

    Here’s what the analytics looked like after Penguin:

    WPMU analytics

    Here’s what the analytics look like now:

    WPMU Analytics

    It’s worth noting that Cutts was aware of this site, as James Farmer (the site’s owner) was able to get it brought to his attention following the initial Penguin update, via an interview with the Sydney Morning Herald. Cutts had provided some examples of the kinds of links that were likely hurting it. This was all discussed in our previous article, but to summarize, WPMU distributes WordPress themes, and a lot of blogs, including spam blogs, were using some of them, which included links back to WPMU in the footer.

    Ross Hudgens from Full Beaker provided some assistance and advice for Farmer, and blogged about the experience at SEOmoz. He notes that Farmer opted to ask blogs to remove the links, rather than applying nofollow to them, but it was actually an internal change that Farmer was able to make, which ultimately might have had the greatest impact on the recovery. Hudgens writes:

    The most perilous piece of WPMU’s link profile came from one site – EDUblogs.org. EDU Blogs is a blogging service for people in the education space, allowing them to easily set up a subdomain blog on EDUblogs for their school-focused site – in a similar fashion to Blogspot, Typepad, or Tumblr, meaning that each subdomain is treated as a unique site in Google’s eyes. Coincidentally, this site is owned by WPMU and Farmer, and every blog on the service leverages WPMU theme packs. Each of these blogs had the “WordPress MU” anchor text in the footer, which meant a high volume of subdomains considered unique by Google all had sitewide “WordPress MU” anchor text. In what might have been a lucky moment for WPMU, this portion of their external link profile was still completely in their control because of WPMU ownership.

    In what I believe is the most critical reason why WPMU made a large recovery and also did it faster than almost everyone else, Farmer instantly shut off almost 15,000 ‘iffy’ sitewide, footer LRDs to their profile, dramatically improving their anchor text ratios, sitewide link volume, and more. They were also able to do this early on in the month, quickly after the original update rolled out. A big difference between many people trying to “clean up their profile” and WPMU is time – getting everything down and adjusted properly meant that many people simply did not see recoveries at refresh 1.1 – but that doesn’t mean it won’t happen at all if the effort persists.
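
    To make “anchor text ratios” concrete, here is a toy Python sketch of the kind of skew those sitewide footer links create in a link profile; all of the domains and anchors below are fabricated:

        from collections import Counter

        # Fabricated (linking root domain, anchor text) pairs, one per domain.
        links = [
            ("school1.blogs.example", "WordPress MU"),
            ("school2.blogs.example", "WordPress MU"),
            ("school3.blogs.example", "WordPress MU"),
            ("designblog.example", "wpmu.org"),
            ("newssite.example", "great WordPress themes"),
        ]

        counts = Counter(anchor for _domain, anchor in links)
        total = len(links)
        for anchor, n in counts.most_common():
            print(f"{anchor!r}: {n}/{total} linking domains ({100 * n / total:.0f}%)")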

    Farmer was also able to get one of the blogs Cutts had initially pointed out the links from to remove them. According to Hudgens, he also did some other things which may have played a role in the recovery, such as implementing canonical URLs to clean up crawl errors and eliminate unnecessary links, fixing some broken sitemaps and submitting them to Webmaster Tools, and fixing some duplicate title tag issues (which Webmaster Tools reported). He also submitted the site to the form Google provides for those who think they’ve been wrongfully impacted by Penguin. Twice.

    It’s also possible that the exposure this site has received in the media, and in front of Matt Cutts could have helped. We’ve certainly seen penalties come from such exposure.

    Not everyone will be able to get such exposure to make their case to Google, but Google does look at the submissions to that form. If you’ve determined that you’re in compliance with Google’s quality guidelines and you still think you were wrongfully hit by Penguin, the form is a good place to start your recovery efforts, though you’ll probably want to keep digging as much as you can.

    Look at all of Google’s quality guidelines. Are there any areas where Google may think you’re in violation? Make the proper changes. Cutts recently pointed to the following videos as recovery advice:

    He also said the following tips from Marc Ensign “looked solid”:

    • Create a blog and consistently build up your site into a wealth of valuable content.
    • Work with a PR firm or read a book and start writing legitimate press releases on a regular basis and post them on your site.
    • Visit blogs within your industry and leave valuable feedback in their comments section.
    • Link out to other valuable resources within your industry that would benefit your visitors.
    • Share everything you are creating on 2 or 3 of your favorite social media sites of choice.
    • Position yourself as an expert.

    Virginia Nussey at Bruce Clay put together an interesting step-by-step guide to “link pruning” which might help you clean up your link profile and ease your way to a recovery. She recommends setting up a spreadsheet with the following headers: Target URL, Source URL, Source Rank, Source Crawl Date, Anchor Text, Image Link, ALT Text, Nofollow, Redirect and Frame. Then, she recommends adding the following to the spreadsheet for webmaster contact info: Owner Name, IP Address, Owner Address, Owner Email, Owner Phone Number, Registrar Name, Technical Contact, Name Servers, Net Name, Created, Updated, Expires, Data Source (what site/registry was the resource for the contact gathered?).
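
    If you’d rather generate that spreadsheet skeleton programmatically, here is a minimal Python sketch that writes the suggested headers to a CSV; the file name is just a placeholder:

        import csv

        # Column headers from Nussey's guide; the file name is a placeholder.
        link_columns = [
            "Target URL", "Source URL", "Source Rank", "Source Crawl Date",
            "Anchor Text", "Image Link", "ALT Text", "Nofollow", "Redirect", "Frame",
        ]
        contact_columns = [
            "Owner Name", "IP Address", "Owner Address", "Owner Email",
            "Owner Phone Number", "Registrar Name", "Technical Contact",
            "Name Servers", "Net Name", "Created", "Updated", "Expires", "Data Source",
        ]

        with open("link_pruning.csv", "w", newline="") as f:
            csv.writer(f).writerow(link_columns + contact_columns)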

    From there, it’s just about sending removal requests and seeing what happens. Hopefully lawsuits aren’t part of your strategy.

    We’ll have more discussion with Farmer to share soon, and perhaps he’ll be able to shed a bit more light on his own Penguin recovery. In the meantime, if you’ve been hit, perhaps you can view his story as one of hope and inspiration before you go starting over with a new site (which Google has also suggested as a possible option).

    Penguin will be back again. You can recover. Remember, there are always other non-Penguin signals that you can try to improve upon too. You certainly don’t want to forget about our old pal the Panda.

    Google called Penguin a success even before the latest refresh. What are your thoughts about it now that we’ve seen an update to the update? Let us know in the comments.

  • Google Panda Update Advice Appears In Webmaster Academy

    As you may recall, last year, Google put out a list of 23 questions that one should consider when assessing the quality of their content. This was largely considered to reflect the kinds of things Google looks at with the Panda update, which is supposed to be about surfacing quality content in search results.

    Last week, Google introduced Webmaster Academy, a new guide for helping you perform better in Google results. Earlier, we looked at Google’s advice on influencing your site’s listing in search.

    There’s another section specifically about content quality. The section is called “Create Great Content”.

    “One key element of creating a successful site is not to worry about Google’s ranking algorithms or signals, but to concentrate on delivering the best possible experience for your user by creating content that other sites will link to naturally—just because it’s great,” the guide says. It then provides a couple of lists. The first list is for what to think about when you’re writing a post or an article:

    • Would you trust the information in this article?
    • Is the article useful and informative, with content beyond the merely obvious? Does it provide original information, reporting, research, or analysis?
    • Does it provide more substantial value than other pages in search results?
    • Would you expect to see this in a printed magazine, encyclopedia or book?
    • Is your site a recognized authority on the subject?

    The second list is for problems to keep an eye out for:

    • Does this article have spelling, stylistic, or factual errors?
    • Does the site generate content by attempting to guess what might rank well in search engines?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites?
    • Does this article have an excessive number of ads that interfere with the main content?
    • Are the articles short or lacking in helpful specifics?

    This all comes from Google’s post-Panda list, but it’s not everything from that list. Some of the other entries in the initial list go hand in hand with items on these new lists, but the things not mentioned include:

    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • Does this article provide a complete or comprehensive description of the topic?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    I do find it interesting that in the new lists, Google talks about the site being a recognized authority, but not so much from the author perspective. I have no doubt that Google considers this greatly, but it’s a little odd that it wasn’t included here. Google, of course, has been pushing authorship in search results, even using it to promote Google+ profiles. It does provide the author with greater visibility in search results by default (with visual, clickable images).

    It’s also interesting that the “does this article describe both sides of a story” entry didn’t make an appearance. Perhaps Google’s increased personalization has made this less of a factor. If you follow a lot of conservative (or liberal) Google+ profiles, for example, it’s possible that you might see more content they’ve shared in your search results, which may or may not show both sides of a story.

  • How To Influence Your Site’s Listing In Search, According To Google’s Webmaster Academy

    Last week, Google introduced Webmaster Academy, billed as a tool to “help you create great sites that perform well in Google search results.”

    Google’s Matt Cutts tweeted about it again today:

    Webmaster Academy is a free set of helpful entry-level tutorials for site owners: http://t.co/LNi3BVro Please RT!
    2 hours ago via Tweet Button · powered by @socialditto
     Reply  · Retweet  · Favorite

    There’s a section in the guide called: Influence Your Site’s Listing In Search. Note, this is more about how your listing appears, rather than where it appears. It’s more about improving clickability than rankings, although this stuff shouldn’t hurt your rankings either.

    Granted, there’s not much in the way of new information, but it’s nice to see what exactly Google is highlighting. The first thing under this section is a video from Matt Cutts (from 2007) about snippets:

    It then lists the following as ways you can help create “compelling listings that users are more likely to click” (a small audit sketch follows the list):

    • Create useful page titles. Make sure that your title is useful, descriptive, and relevant to the actual page itself.
    • Use informative URLs. The URL (web address) of a page appears below the title, with words from the user’s query in bold. Your URLs should be simple and human readable. Which do you find more informative: http://example.com/products/shoes/high_heels/pumps.html or http://example.com/product_id=123458?
    • Provide relevant page descriptions. The descriptive text under the URL is usually taken from the description meta tag on the page. Descriptions should be different and unique to each area of your site.
    • Add your business to Google Places, to help Google display location information in results.
    • Manage your sitelinks. Sitelinks (sub-links to individual pages on your site) are meant to help users navigate your site. Sitelinks are automatically generated. This means that you can’t specify a sitelink, but you can use Webmaster Tools to ask Google to demote sitelinks you don’t like.
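
    As a quick way to act on the first three tips above, here’s a small audit sketch that flags missing or overlong titles and meta descriptions. The URL list is hypothetical, and the length thresholds are common rules of thumb rather than anything Google specifies (requires the third-party requests and beautifulsoup4 packages):

    import requests
    from bs4 import BeautifulSoup

    PAGES = ["https://example.com/", "https://example.com/products/"]  # placeholders

    for url in PAGES:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.get_text(strip=True) if soup.title else ""
        tag = soup.find("meta", attrs={"name": "description"})
        desc = tag.get("content", "").strip() if tag else ""
        if not title or len(title) > 70:  # ~70 chars is a common truncation point
            print(f"{url}: check title ({len(title)} chars)")
        if not desc or len(desc) > 160:   # ~160 chars for descriptions
            print(f"{url}: check description ({len(desc)} chars)")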

    With regards to sitelinks, Google actually made some adjustments to how they appear back in April. In Google’s most recent monthly list of algorithm changes, there were four changes that had to do specifically with sitelinks. These were:

    • “Sub-sitelinks” in expanded sitelinks. [launch codename “thanksgiving”] This improvement digs deeper into megasitelinks by showing sub-sitelinks instead of the normal snippet.
    • Better ranking of expanded sitelinks. [project codename “Megasitelinks”] This change improves the ranking of megasitelinks by providing a minimum score for the sitelink based on a score for the same URL used in general ranking.
    • Sitelinks data refresh. [launch codename “Saralee-76”] Sitelinks (the links that appear beneath some search results and link deeper into the site) are generated in part by an offline process that analyzes site structure and other data to determine the most relevant links to show users. We’ve recently updated the data through our offline process. These updates happen frequently (on the order of weeks).
    • Less snippet duplication in expanded sitelinks. [project codename “Megasitelinks”] We’ve adopted a new technique to reduce duplication in the snippets of expanded sitelinks.

    To demote a sitelink with Webmaster Tools, click “Sitelinks” under “Site Configuration,” add the URL in the “For this search result” box, and enter the URL of the sitelink you want to demote in the “Demote this sitelink URL” box.

  • Is Google Admitting That Negative SEO Is Possible?

    Google has a page in its Webmaster Tools help center addressing the question: Can competitors harm ranking? This has been a topic getting a lot of discussion since the Penguin update. It was getting a fair amount of discussion before then too, but it seems to have ramped up significantly in recent months.

    Some webmasters have noticed that Google has updated the wording it uses to address this question on the help center page. There are various forum discussions going on, as Barry Schwartz at Search Engine Roundtable has pointed out.

    Back in 2006, Schwartz posted what Google said on the page at the time, which was:

    There’s almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.

    Our search results change regularly as we update our index. While we can’t guarantee that any page will consistently appear in our index or appear with a particular rank, we do offer guidelines for maintaining a “crawler-friendly” site. Following these recommendations may increase the likelihood that your site will show up consistently in the Google search results.

    These days, it just says:

    Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.

    Apparently the wording itself was changed in March, though the page says there was an update made on 05/22. Either way, Google recently changed it, and instead of saying “there’s almost nothing a competitor can do to harm your ranking…” Google now says, “Google works hard to prevent other webmasters from being able to harm your ranking…”

    That’s not incredibly reassuring.

    Google has also added the following video from Matt Cutts to the page:

    This video isn’t exactly an answer to the question though. It’s more about telling on competitors’ black hat tactics, rather than your competitors directly hurting your ranking. Essentially, it equates to: file a spam report.

    Rand Fishkin at SEOmoz has been testing the negative SEO waters, challenging others to hurt his sites’ rankings. He told us a couple weeks ago that despite thousands of questionable links, his sites were still ranking well.

  • Matt Cutts: Here’s How To Expose Your Competitors’ Black Hat SEO Practices

    Google’s Matt Cutts put out a Webmaster Help video discussing how to alert Google when your competitors are engaging in webspam and black hat SEO techniques. The video was in response to the following user-submitted question:

    White hat search marketers read and follow Google Guidelines. What should they tell clients whose competitors use black hat techniques (such as using doorway pages) and whom continue to rank as a result of those techniques?

    Do you think Google does a good job catching webspam? Let us know in the comments.

    “So first and foremost, I would say do a spam report, because if you’re violating Google’s guidelines in terms of cloaking or sneaky JavaScript redirects, buying links, doorway pages, keyword stuffing, all those kinds of things, we do want to know about it,” he says. “So you can do a spam report. That’s private. You can also stop by Google’s Webmaster forum, and that’s more public, but you can do a spam report there. You can sort of say, hey, I saw this content. It seems like it’s ranking higher than it should be ranking. Here’s a real business, and it’s being outranked by this spammer…those kinds of things.”

    He notes that there are both Google employees and “super users” who keep an eye on the forum, and can alert Google about issues.

    “The other thing that I would say is if you look at the history of which businesses have done well over time, you’ll find the sorts of sites and the sorts of businesses that are built to stand the test of time,” says Cutts. “If someone is using a technique that is a gimmick or something that’s like the SEO fad of the day, that’s a little less likely to really work well a few years from now. So a lot of the times, you’ll see people just chasing after, ‘OK, I’m going to use guest books’, or ‘I’m going to use link wheels’ or whatever. And then they find, ‘Oh, that stopped working as well.’ And sometimes it’s because of broad algorithmic changes like Panda. Sometimes it’s because of specific web spam targeted algorithms.”

    I’m sure you’ve heard of Penguin.

    He references the JC Penney and Overstock.com incidents, in which Google took manual action. For some reason, he didn’t bring up the Google Chrome incident.

    This is actually a pretty timely video from Cutts, as another big paid linking controversy was uncovered by Josh Davis (which Cutts acknowledged on Twitter). Google ended up de-indexing the SEO firm involved in that.

    “So my short answer is go ahead and do a spam report,” Cutts continues. “You can also report it in the forums. But it’s definitely the case that if you’re taking those higher risks, that can come back and bite you. And that can have a material impact.”

    He’s not joking about that. Overstock blamed Google for “an ugly year” when its revenue plummeted. Even Google’s own Chrome penalty led to some questions about the browser’s market share.

    Cutts notes that Google is also happy to get feedback at conferences, on Twitter, online, blogs, forums, “if you’re seeing sites that are prospering and are using black hat techniques.”

    “Now, it’s possible that they have some low-quality links, and there are some links that people aren’t aware of that we see that are actually high quality,” Cutts notes. “But we’re happy to get spam reports. We’re happy to dig into them. And then we’ll try to find either new algorithms to try to rank the things more appropriately in the future. Or we’re certainly willing to take manual action on spam if it’s egregious or if it violates our guidelines. We have a manual web spam team that is willing to respond to those spam reports.”

    According to Cutts, you can even submit spam reports using Google Docs. Here’s a conversation he had on Twitter recently:

    @mattcutts Can we send a link to a Google Docs spreadsheet when reporting spam? #penguin 1 day ago via web ·  Reply ·  Retweet ·  Favorite · powered by @socialditto

    After Google launched the Penguin update, Cutts tweeted the following about post-Penguin spam reports:

    To report post-Penguin spam, fill out https://t.co/di4RpizN and add “penguin” in the details. We’re reading feedback. 5 days ago via web ·  Reply ·  Retweet ·  Favorite · powered by @socialditto

    Shortly thereafter, he tweeted:

    @Penguin_Spam yup yup, we’ve read/processed almost all of them. A few recent ones left. 10 minutes ago via web ·  Reply ·  Retweet ·  Favorite · powered by @socialditto

    I’m sure plenty more reports have rolled into Google since then, but it does seem like they process them fairly quickly.

    Do you think Google has done a good job at cleaning up webspam? Share your thoughts.

  • Google Penguin Updated To Version 1.1

    Google’s Penguin update has been somewhat divisive in the search industry. There have been Web sites that were negatively impacted by the update, even as Google says that Penguin was a success. One of the most persistent rumors has been that Google was pushing out an update to Penguin; until now, that never seemed to be the case.

    Saturday brought word from Matt Cutts that an update had been pushed through. Unfortunately, Cutts was light on details.

    Minor weather report: We pushed 1st Penguin algo data refresh an hour ago. Affects <0.1% of English searches. Context: http://t.co/ztJiMGMi
    1 day ago via web · powered by @socialditto
     Reply  · Retweet  · Favorite

    It’s not immediately clear what the update covers or fixes from the original launch of Penguin back in April. A few legitimate sites claimed to have been hit by Penguin and lost traffic as a result. One theory is that Penguin 1.1 is meant to address those legitimate sites that were targeted.

    If you recall, Penguin has been billed as Google’s method of cutting down on Web spam and other nefarious parts of the Web. One of the main concerns that Penguin addressed was the use of backlinks. It was found that sites being linked to by lesser quality sites were affected by Penguin as well. This led to Chris Crum reporting on a new trend of sites threatening to sue other sites over backlinks. Nobody wants poor quality sites linking to them in this post-Penguin world, but linking isn’t illegal, so some sites that rely on search traffic see the threat of a lawsuit as their only recourse.

    It remains to be seen if the newest Penguin update has had any kind of effect on search. It might have been smart on Google’s part to release the update on Memorial Day weekend to avoid any major outcry. It leaves them a few more days to work out any minor kinks that may be in the algorithm.

    Still, Matt Cutts refers to Penguin 1.1 as a “Minor weather report.” It’s highly possible that the update doesn’t contain anything Web-shattering. Google is remaining mum on the details for this update. We’ll probably have to wait until the next monthly list of algorithm changes to find out.

    Until then, check out our comprehensive guide to appeasing Penguin. It contains all the tips you need to know to keep a quality Web site and retain or increase your traffic in the post-Penguin world.

    [h/t: SearchEngineLand]

  • Google Penguin Update: Don’t Forget About Duplicate Content

    There has been a ton of speculation regarding Google’s Penguin update. Few know exactly what the update does, or how it works alongside Google’s other signals. Google always plays its cards close to its chest.

    “While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics,” Google’s Matt Cutts said in the announcement of the update.

    He also said, “The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.”

    “We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings,” he said. To me, that indicates that this is about all webspam techniques – not just keyword stuffing and link schemes, but also everything in between.

    So it’s about quality guidelines. Cutts was pretty clear about that, and that’s why we’ve been discussing some of the various things Google mentions specifically in those guidelines. So far, we’ve talked about:

    • Cloaking
    • Links
    • Hidden text and links
    • Keyword stuffing

    Another thing on the quality guidelines list is: “Don’t create multiple pages, subdomains, or domains with substantially duplicate content.”

    Of course, like the rest of the guidelines, this is nothing new, but in light of the Penguin update, it seems worth examining the guidelines again, if for no other reason than to provide reminders or educate those who are unfamiliar. Duplicate content seems like one of those areas that could get sites into trouble, even when they aren’t intentionally trying to spam Google. Even Google says in its help center article on the topic, “Mostly, this is not deceptive in origin.”

    “However, in some cases, content is deliberately duplicated across domains in an attempt to manipulate search engine rankings or win more traffic,” Google says. “Deceptive practices like this can result in a poor user experience, when a visitor sees substantially the same content repeated within a set of search results.”

    Google lists the following as steps you can take to address any duplicate content issues you may have (a verification sketch for the first two follows the list):

    • Use 301s: If you’ve restructured your site, use 301 redirects (“RedirectPermanent”) in your .htaccess file to smartly redirect users, Googlebot, and other spiders. (In Apache, you can do this with an .htaccess file; in IIS, you can do this through the administrative console.)
    • Be consistent: Try to keep your internal linking consistent. For example, don’t link to http://www.example.com/page/ and http://www.example.com/page and http://www.example.com/page/index.htm.
    • Use top-level domains: To help us serve the most appropriate version of a document, use top-level domains whenever possible to handle country-specific content. We’re more likely to know that http://www.example.de contains Germany-focused content, for instance, than http://www.example.com/de or http://de.example.com.
    • Syndicate carefully: If you syndicate your content on other sites, Google will always show the version we think is most appropriate for users in each given search, which may or may not be the version you’d prefer. However, it is helpful to ensure that each site on which your content is syndicated includes a link back to your original article. You can also ask those who use your syndicated material to use the noindex meta tag to prevent search engines from indexing their version of the content.
    • Use Webmaster Tools to tell us how you prefer your site to be indexed: You can tell Google your preferred domain (for example, http://www.example.com or http://example.com).
    • Minimize boilerplate repetition: For instance, instead of including lengthy copyright text on the bottom of every page, include a very brief summary and then link to a page with more details. In addition, you can use the Parameter Handling tool to specify how you would like Google to treat URL parameters.
    • Avoid publishing stubs: Users don’t like seeing “empty” pages, so avoid placeholders where possible. For example, don’t publish pages for which you don’t yet have real content. If you do create placeholder pages, use the noindex meta tag to block these pages from being indexed.
    • Understand your content management system: Make sure you’re familiar with how content is displayed on your web site. Blogs, forums, and related systems often show the same content in multiple formats. For example, a blog entry may appear on the home page of a blog, in an archive page, and in a page of other entries with the same label.
    • Minimize similar content: If you have many pages that are similar, consider expanding each page or consolidating the pages into one. For instance, if you have a travel site with separate pages for two cities, but the same information on both pages, you could either merge the pages into one page about both cities or you could expand each page to contain unique content about each city.
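
    The first two steps are easy to verify mechanically. This is a minimal sketch, assuming you want every variant of a URL to return a 301 pointing at one preferred version; it reuses Google’s example URLs (requires the third-party requests package):

    import requests

    PREFERRED = "http://www.example.com/page/"  # the one version you want indexed
    VARIANTS = [
        "http://example.com/page/",
        "http://www.example.com/page",
        "http://www.example.com/page/index.htm",
    ]

    for url in VARIANTS:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        target = resp.headers.get("Location", "")
        if resp.status_code != 301 or target != PREFERRED:
            print(f"{url}: status {resp.status_code}, Location {target or '(none)'}")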

    Don’t block Google from duplicate content. Google advises against this, because it won’t be able to detect when URLs point to the same content, and will have to treat them as separate pages. Use the canonical link element (rel=”canonical”).

    Note: there are reasons why Google might skip your canonical link elements.
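
    One reason Google might skip a canonical is that it points somewhere unexpected. Here’s a quick sketch to spot pages whose rel="canonical" is missing or points at a different URL; the page list is hypothetical (requires requests and beautifulsoup4):

    import requests
    from bs4 import BeautifulSoup

    PAGES = ["http://www.example.com/page/", "http://www.example.com/page/index.htm"]

    for url in PAGES:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        link = soup.find("link", rel="canonical")
        canonical = link.get("href") if link else None
        if canonical != url:  # missing, or pointing at a different (preferred) URL
            print(f"{url}: canonical is {canonical}")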

    It’s important to note that Google doesn’t consider duplicate content to be grounds for penalty, unless it appears that it was used in a deceptive way or to manipulate search results. However, that seems like one of those areas where an algorithm might leave room for error.

    Here are some videos with Matt Cutts (including a couple of WebProNews interviews) talking about duplicate content. You should watch them if you are concerned that this might be affecting you:

    This one comes from Google’s Greg Grothaus rather than Cutts. Also worth watching:

    If you think you’ve been wrongfully hit by the Penguin update, Google has a form you can fill out to let them know.

    More Penguin update coverage here.

    Tell us about duplicate content issues you’ve run into in the comments.

  • Google’s Amit Singhal: Penguin A Success

    Early this morning, Google Fellow Amit Singhal was interviewed by Danny Sullivan and Chris Sherman on stage at SMX London, the sister conference of Search Engine Land. Singhal discussed a variety of Google search-related topics.

    We were hoping to see some in-depth discussion about Google’s recent Penguin update, but apparently that wasn’t a major point of conversation. Daniel Waisberg liveblogged the discussion at Search Engine Land, and Penguin only came up briefly. Here’s the relevant snippet of the liveblog:

    Danny talks about Penguin and asks how it is going from Google standpoint, are search results better? Amit says that in the end of the day, users will stay with the search engine that provides the most relevant results. Google’s objective was to reward high quality sites and that was a success with Penguin. One of the beauties of running a search engine is that the search engines that can measure best what the users feel is the one that will succeed more.

    From Google’s perspective they use any signal that is available for them, more than 200 of them. They have to make sure they are accurate and good. They will use any signal, whether it is organic or not.

    “Google Penguin’s objective is to reward high quality sites and authors” Amit Singhal #smxlondon 4 hours ago via Twitter for iPhone ·  Reply ·  Retweet ·  Favorite · powered by @socialditto

    Panda and penguin update has gone really well… Can someone show amit the results for Viagra #smx 4 hours ago via Twitter for iPad ·  Reply ·  Retweet ·  Favorite · powered by @socialditto

    @dannysullivan please ask Amit if he has any Penguin recovery tips apart from removing links #smx 4 hours ago via web ·  Reply ·  Retweet ·  Favorite · powered by @socialditto

    Google’s Matt Cutts also recently said that Google has considered Penguin a success, though plenty out there disagree.

    If you want Google’s advice on Penguin recovery, check out these videos Matt Cutts says to watch, these tips he endorsed on Twitter, and of course Google’s quality guidelines.

  • Google Penguin Update: There Hasn’t Been One Since The First One

    As previously reported, there has been some chatter in the forums speculating that Google may have launched another Penguin update. That’s not the first time this has happened since the original one, and it will surely not be the last, but rest assured, there has only been one Penguin update so far.

    A Google spokesperson tells WebProNews: “There hasn’t been an update since the first one.”

    It doesn’t get any clearer than that.

    Of course, one Googler recently said that Google didn’t even have anything called Penguin, so I guess you can never be 100% sure.

    That said, I’m pretty confident that this spokesperson is right. Even the speculation about the possible update has been mixed. Some are attributing traffic dips to the holiday weekend.

    There’s also the fact that Google makes changes every day. We should soon be seeing the big list for the month of May.

    In the meantime, you’d probably do well to focus on making your site and content as good as they can be, and keep it all within Google’s quality guidelines. Also, if you hire an agency to do your SEO, try to make sure they’re not engaging in any paid linking on your behalf.

    You can still expect Penguin to be coming back around sooner or later.