WebProNews

Tag: algorithm updates

  • Impacted By Google’s Mobile-Friendly Update?

    On April 21, Google began rolling out its mobile-friendly update, which makes the mobile-friendliness of a site a ranking signal. Publications with a flair for the dramatic have largely taken to calling it “Mobilegeddon,” a name coined before the update even launched or its effects were felt.

    Were you prepared for the update? Was it “Mobilegeddon” for your site? Have you noticed any change (positive or negative) so far? Let us know in the comments.

    As expected, Searchmetrics has released a list of winners and losers from the update. Given that the update likely hasn’t finished rolling out yet, even Searchmetrics itself cautions that these are only preliminary results.

    Following are the lists, which were first posted on Saturday.

    The preliminary losers:

    Domain / Mobile SEO Visibility / Actual loss in percent / Ratio mobile vs. desktop
    reddit.com 874108 -27% -36%
    nbcsports.com 139213 -28% -40%
    songlyrics.com 111042 -26% -47%
    youngmoney.com 10602 -76% -77%
    fool.com 78599 -27% -49%
    isitdownrightnow.com 83067 -25% -49%
    tested.com 3243 -89% -18%
    sidereel.com 88851 -22% -44%
    census.gov 71234 -23% -53%
    onlinecreditcenter2.com 33026 -38% -39%
    odir.us 75586 -21% -15%
    boxofficemojo.com 39951 -33% -64%
    schoolloop.com 50046 -27% -50%
    interviewmagazine.com 42280 -31% -32%
    locatetv.com 65460 -21% -53%
    fnfismd.com 54730 -23% -30%
    etymonline.com 25169 -39% -77%
    reviewjournal.com 22769 -41% -40%
    thinkexist.com 15514 -49% -68%
    sciencedaily.com 45017 -23% -45%
    majorgeeks.com 40374 -24% -53%
    movie25.ag 16324 -44% -43%
    thefind.com 2448 -84% -49%
    megashare.sc 48082 -20% -64%
    walmartstores.com 27157 -31% -31%
    thefiscaltimes.com 4940 -71% -81%
    brassring.com 38315 -24% -46%
    google.es 5830 -67% -26%
    epguides.com 32037 -27% -53%
    krebsonsecurity.com 10451 -52% -42%
    sheppardsoftware.com 39140 -22% -55%
    upworthy.com 17146 -38% -26%
    jobs.net 34174 -23% -31%
    apples4theteacher.com 30268 -25% -34%
    mmo-champion.com 2948 -78% -77%
    webcrawler.com 36291 -21% -36%
    moreofit.com 35610 -21% -46%
    hid.im 14776 -40% -43%
    webs.com 9237 -51% -54%
    ft.com 27984 -25% -42%
    paroles-musique.com 15220 -37% -62%
    jcpportraits.com 14936 -37% -32%
    lottostrategies.co 2717 -76% -63%
    searchbug.com 8885 -49% -17%
    usps.gov 33186 -20% -25%
    ondvdreleases.com 4058 -68% -17%
    barchart.com 10435 -44% -55%
    genealogybank.com 26451 -24% -53%
    sketchup.com 18633 -30% -19%
    zeropaid.com 2550 -76% -74%
    edx.org 13439 -37% -30%

    The preliminary winners:

    Domain / Mobile SEO Visibility / Actual gain in percent / Ratio mobile vs. desktop
    tvtropes.org 290528 420% 23%
    foreignaffairs.com 153528 771% 37%
    gq.com 178364 67% 19%
    w3snoop.com 104573 91% 108%
    knowyourmeme.com 153154 32% 13%
    bandcamp.com 272302 13% 12%
    fbschedules.com 133754 31% 11%
    washingtontimes.com 173354 21% 12%
    ipaddress.com 89830 51% 71%
    imgur.com 118307 32% 24%
    free-tv-video-online.info 71972 65% 38%
    quora.com 251746 13% 28%
    lyricsmania.com 229221 14% 12%
    foreignpolicy.com 56583 83% 18%
    wtvr.com 43562 124% 67%
    sports-reference.com 58155 65% 10%
    refinery29.com 100977 29% 19%
    macmillandictionary.com 150033 17% 12%
    hitfix.com 75004 42% 28%
    zacks.com 87375 33% 14%
    motherjones.com 200106 12% 14%
    dslreports.com 82131 32% 28%
    allposters.com 67786 39% 11%
    rt.com 89352 26% 10%
    easycounter.com 38134 87% 15%
    change.org 89739 23% 11%
    newrepublic.com 129003 15% 15%
    boostmobile.com 54172 40% 11%
    stream-tv1.net 18126 548% 151%
    newsweek.com 110915 16% 17%
    iconosquare.com 93694 19% 31%
    watch-series-tv.to 65776 28% 13%
    websta.me 112232 14% 23%
    800-numbers.net 76397 22% 23%
    hypestat.com 30643 81% 172%
    pcgamer.com 77815 21% 24%
    nybooks.com 95426 16% 20%
    advanceautoparts.com 90342 17% 25%
    radio.com 53584 32% 11%
    newmexicocriminallaw.com 13363 4012% 30%
    mp3skull.to 35886 56% 143%
    religionfacts.com 24257 107% 24%
    thinkprogress.org 64598 22% 20%
    wikimedia.org 92805 15% 28%
    microcenter.com 41385 39% 13%
    kochdavis.com 14444 408% 181%
    mixcloud.com 32295 56% 89%
    topix.com 114888 11% 12%
    fox2now.com 35422 42% 23%
    kcci.com 24271 75% 64%
    grist.org 20662 100% 30%
    stemfireandems.com 10752 2072% 49%
    shazam.com 30850 45% 103%
    eurogamer.net 45327 26% 49%

    You can see Searchmetrics’ findings in a PDF here. The firm is also updating its data as the roll-out continues, so you can see the latest here.

    The effects of “Mobilegeddon” do have the potential to be felt by many sites that haven’t paid attention to Google’s warning. The update was officially announced in February and was hinted at for months before that. Google has given webmasters time to prepare.

    Small businesses that don’t have the time or resources to dedicate to making their sites mobile-friendly, or to even pay attention to the latest happenings in SEO, are likely to be hurt by the update the most. All of that said, this is still just one of over 200 signals Google takes into account when ranking search results on mobile devices. It’s not everything. It also works on a URL-by-URL basis and is supposed to update in near real time, so webmasters can fix pages over time and potentially improve their rankings without waiting months for Google to recognize the fixes, as they have had to with some other updates.

    Last week, Lawyer.com, a site which helps people find law firms, announced that it had analyzed the law sites in its database, and found that many of them will likely be affected negatively by Google’s update. 46% of solo firms failed Google’s requirements, it said, while larger firms did a little better with a 33% failure rate.

    The reality is that the update shouldn’t affect any particular vertical more than the next. It’s not like Panda, which specifically looks at the type of content. This update strictly looks at technical elements that make the content easy to consume on a mobile device. Even if your content completely sucks, it can pass Google’s mobile-friendly test. It may not help you with other signals, but that’s a different story.

    Last week, Google put out its latest round of guidance related to the update, including an FAQ. The important takeaways from that included:

    – The update does not affect searches on tablets or desktops, and it’s a page-level change. Only mobile-friendly pages will be able to get a boost as a direct result of the change.

    – Google determines whether or not a page is mobile-friendly every time it’s crawled, so webmasters won’t have to wait for another update after fixing a page to get the advantage of the signal. This also means that if you weren’t quite ready for the update today, it shouldn’t be that big a deal as long as you can still fix what needs fixing.

    – Google is now saying the roll-out should take “a week or so”. That means you can’t determine on April 22 whether or not you’ve been impacted.

    – If your pages are designed to work well on mobile devices, but aren’t passing Google’s mobile-friendly test, it’s probably because you’re blocking Googlebot for smartphones from crawling resources like JavaScript and CSS. This is the most common reason that happens.
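
    To illustrate (the directory names here are hypothetical), a robots.txt that walls off script and stylesheet directories keeps Googlebot from rendering the page the way a phone would, while explicitly allowing those paths restores the test’s view of the page:

    ```
    # Hypothetical rules that can make a genuinely mobile-friendly page fail
    # the test, because Googlebot can't fetch the CSS/JavaScript it needs
    # to render the page:
    #
    #   User-agent: *
    #   Disallow: /assets/css/
    #   Disallow: /assets/js/

    # Letting crawlers fetch those resources instead:
    User-agent: *
    Allow: /assets/css/
    Allow: /assets/js/
    ```

    Google’s mobile-friendly test will show which blocked resources are getting in the way when you run a page through it.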

    – You can still link to sites that Google doesn’t consider mobile-friendly without fear of repercussions.

    “It’s not the best experience for mobile visitors to go from a mobile-friendly page to a desktop-only page, but hopefully as more sites become mobile-friendly, this will become less of a problem,” says Google’s Maile Ohye.

    – Mobile-friendliness is assessed the same regardless of whether a site is using responsive design, separate mobile URLs, or dynamic serving.
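
    For illustration (the URLs are placeholders), the three configurations signal mobile-friendliness through different plumbing. A responsive page declares a viewport, while separate mobile URLs are tied together with the annotations Google documents:

    ```html
    <!-- Responsive design: one URL serves every device, with a viewport declaration -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <!-- Separate mobile URLs: the desktop page advertises its mobile counterpart... -->
    <link rel="alternate" media="only screen and (max-width: 640px)"
          href="http://m.example.com/page">
    <!-- ...and the mobile page points back to the desktop version -->
    <link rel="canonical" href="http://www.example.com/page">
    ```

    Dynamic serving uses a single URL and instead sends a `Vary: User-Agent` HTTP header, so caches and crawlers know the response differs by device.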

    – It’s naive to think you don’t need to worry about the signal because you believe your audience is desktop-only. More and more people are turning to mobile devices as time goes on. Even if your audience is mostly on desktop now, that doesn’t mean it will stay that way.

    – Pages with the old style of object YouTube embeds may register as not mobile-friendly. Make sure pages are using the newer iframe embeds.
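
    For example (with VIDEO_ID standing in for an actual video ID), the difference between the two embed styles looks like this:

    ```html
    <!-- Old object-style embed: relies on Flash, so it may register as not mobile-friendly -->
    <object width="480" height="270"
            data="http://www.youtube.com/v/VIDEO_ID"
            type="application/x-shockwave-flash"></object>

    <!-- Newer iframe embed, which works on mobile devices -->
    <iframe width="480" height="270"
            src="http://www.youtube.com/embed/VIDEO_ID"
            frameborder="0" allowfullscreen></iframe>
    ```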

    – For tap target size, Google suggests a minimum of 7mm width/height for primary targets and a minimum margin of 5mm between secondary ones.
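
    As a rough sketch (the selectors are invented; Google’s PageSpeed guidance translates these sizes to roughly 48 and 32 CSS pixels on a correctly configured viewport):

    ```css
    /* Primary tap targets: at least ~7mm (about 48 CSS pixels) tall and wide */
    nav a,
    button {
      min-width: 48px;
      min-height: 48px;
    }

    /* Secondary links: ~5mm (about 32 CSS pixels) of space between neighbors */
    .footer-links a {
      margin: 0 16px; /* 16px on each side yields 32px between adjacent links */
    }
    ```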

    You can also read this for additional guidance on how to improve your site’s mobile-friendliness.

    What do you think of Google’s update? Has it made search results better? Let us know in the comments.

    Image via Google

  • Google’s Mobile-Friendly Update Reportedly Hurting Law Sites

    This week, Google began rolling out its dreaded mobile-friendly update. It’s unclear exactly how long it will take for the roll-out to complete, but it sounds like it will be at least a full week, based on what Google has said.

    The company did indicate that it has rolled out completely in at least some of its data centers.

    “So that is something where I think you will probably see that change over the course of a week, maybe a week and a half – something like that,” said Google’s John Mueller in a webmaster hangout. “From the first day to the next day, I don’t think you’ll see a big change. But if you compare last week to next week, then you should see a big change.”

    While it’s still early to know what kinds of businesses are going to be impacted most, it does appear that law sites might get hit pretty hard.

    Of course it’s not like the update is targeting any particular vertical, but law firm search engine Lawyer.com says it has reviewed mobile readiness across its database of over 100,000 U.S. law firm sites using Google’s mobile-friendly test on April 21, and found that 46% of solo firms failed Google’s requirements. Larger firms did a bit better with a 33% failure rate. The site says:

    Websites for Texas-based law firms passed Google’s tests 68% of the time compared to only 65% for California-based law firms and 61% for both New York and Florida-based firms. Male owned solo firms passed slightly more often than female owned solo firms with rates of 60% and 58%, respectively. Websites of solo lawyers 50 years or older had a 54% pass rate while sites of younger solo lawyers reached 55%. Personal Injury law firms had the highest pass rate of all major practice areas (67%), while Real Estate firms had a low pass rate at only 57%. Patent law firms, which often have tech savvy partners, surprisingly had a low pass rate of just 44%.

    Home pages of three of the top-five grossing law firms in the U.S. failed Google’s mobile-friendly test; Latham & Watkins, Skadden Arps, and Clifford Chance all have websites that can expect organic search traffic declines until adding responsive design elements. Google has indicated that drops in traffic will not be reflected immediately and may take over a week for indexing to occur.

    “Making the necessary adjustments is extremely important for business websites in any industry,” said Lawyer.com CEO Gerald Gorman. “It is especially important in the legal services industry as most users now start their Lawyer search on Google using a mobile phone or tablet.”

    SEO Clarity is keeping score on what’s happening with the update here. On the third day of the roll-out, it reported that across the 50,000 keywords it analyzed, and the roughly 60,000 associated domains appearing in the top ten results of Google’s desktop and mobile SERPs, that day showed only a 0.5% change, in the opposite direction.

    Image via Google

  • Google Rolls Local Update Out To More Countries

    Back in July, Google launched an algorithm update that shook up local search results in the U.S. While Google never gave it an official name like Panda or Penguin that we’re aware of, Search Engine Land started calling it the “Pigeon” update, and that’s what people in the industry have, for the most part, adopted for it.

    That update has now reportedly started rolling out to other countries including the UK, Canada, and Australia.

    Back in July when the update launched, people noticed missing 7-packs in some types of local results, and Google confirmed the update, saying that it “ties deeper into their web search capabilities, including the hundreds of ranking signals they use in web search, along with search features such as Knowledge Graph, spelling correction, synonyms, and more.”

    The update was also said to improve distance and location parameters.

    Search Engine Land is now reporting that the update has significantly affected local businesses in the new regions, and points out that Google is, once again, making major algorithm changes around the holidays, which it had pretty much stopped doing until recently.

    As we’ve talked about in the past, Google updates around the holidays can deliver major blows to businesses at the most important sales time of the year. Now, not only are they rolling this update out, they’ve been slowly rolling out a new version of Penguin.

    Image via Google

  • MetaFilter Reportedly Recovering From Google Update

    A few months ago, MetaFilter founder Matt Haughey revealed in a blog post that a decline in Google traffic resulting from a 2012 algorithm update had led him to lay off some of the site’s staff. He wrote:

    Today I need to share some unfortunate news: because of serious financial downturn, MetaFilter will be losing three of its moderators to layoffs at the end of this month. What that means for the site and the site’s future are described below.

    While MetaFilter approaches 15 years of being alive and kicking, the overall website saw steady growth for the first 13 of those years. A year and a half ago, we woke up one day to see a 40% decrease in revenue and traffic to Ask MetaFilter, likely the result of ongoing Google index updates. We scoured the web and took advice of reducing ads in the hopes traffic would improve but it never really did, staying steady for several months and then periodically decreasing by smaller amounts over time.

    The long-story-short is that the site’s revenue peaked in 2012, back when we hired additional moderators and brought our total staff up to eight people. Revenue has dropped considerably over the past 18 months, down to levels we last saw in 2007, back when there were only three staffers.

    Google even confirmed that the site was hit by an algorithm change it had never disclosed, and whose existence it did not confirm back then. Google’s Matt Cutts indicated in May that a solution was on the way.

    While it may have been more like months than weeks, it appears that the solution may have finally come. The site’s traffic is now reportedly back on the rise. Barry Schwartz at Search Engine Roundtable points to data from Searchmetrics suggesting the site’s traffic has nearly recovered.

    So far, we haven’t seen any acknowledgement of this from Haughey, but apparently some other sites hit at the same time as MetaFilter are recovering as well.

    Image via MetaFilter

  • Google Update Hits Local Search Results

    Google has reportedly launched a significant algorithm update that’s shaken up local search results in the United States quite a bit.

    Following a tip from Brian May on Twitter, local search watcher Mike Blumenthal reported on the “turmoil” in local search results.

    Blumenthal and May were noticing missing 7-packs in real estate results.

    Search Engine Land also reported that Google confirmed an update, saying that it “ties deeper into their web search capabilities, including the hundreds of ranking signals they use in web search, along with search features such as Knowledge Graph, spelling correction, synonyms, and more”.

    According to them, the company also said it improves distance and location ranking parameters. They’re calling it the “Pigeon” update just to give it a name, as Google has apparently not given it one.

    While those who lose rankings as a result of this will no doubt be irate, this is probably a good thing for users, as the rankings of local search results have always been questionable. If they can tap into all these other signals, it stands to reason that relevance should improve at least to some extent.

    This update comes as Google sees increased competition in the local search space from companies like Foursquare and Yelp. Google also recently made an update to its Google Maps mobile apps to make it easier for users to find places of interest. It has also added new ads to local search results.

    Image via Google

  • Google Launches New Version Of Payday Loan Algorithm

    Last month, Google rolled out two major updates to its algorithm around the same time – new versions of the famous Panda update and the “Payday Loans” update, which is one of its ways of fighting spam.

    A newer version of the latter began rolling out on Thursday afternoon.

    Google’s head of webspam Matt Cutts announced the update at the Search Marketing Expo in front of a packed house.

    “Matt Cutts explained that this goes after different signals,” recounts Barry Schwartz at SMX sister site Search Engine Land, who was in attendance. “The 2.0 version targeted spammy sites, whereas version 3.0 targets spammy queries.”

    It will target queries like “payday loans,” “casinos,” “viagra,” etc., he says.

    According to this recap of Cutts’ announcements (as tweeted by Cutts himself), he referred to the new update as Payday Loan 2.0, with last month’s being 2.0A, if that helps you for any reason whatsoever.

    Also according to that recap, Google is working on improving reconsideration requests so web spam analysts can provide additional feedback. Also, Google is close to getting IE 8 referring data back. It will still show mostly as not provided, it says, but will correctly show the visitor as coming from Google search.

    Image via MYA (Twitter)

  • Facebook News Feed Update Earned This Site More Likes In 2 Months Than In Previous 5 Years

    BuzzFeed isn’t the only winner from Facebook’s recent “Panda” update. Mental Floss has made out incredibly well too, according to a Poynter report.

    Have Facebook’s recent changes to News Feed ranking affected your Facebook traffic? Let us know in the comments.

    In December, Facebook announced some changes to how it ranks content in the News Feed. The social network said it was placing an emphasis on “quality” content, aiming to show more of that and fewer memes and other things of little substance that have typically done very well.

    And just like that, all kinds of Pages started getting a lot less News Feed visibility. The problem (one of them, at least) is that the algorithm appears to be playing favorites rather than truly distinguishing high-quality content from low-quality content.

    Last week there was a lot of talk about how big-name viral content sites BuzzFeed and Upworthy have performed since the update. BuzzFeed is up, and Upworthy is down (though it’s still unclear if it was really Facebook hurting Upworthy, or if it’s just stabilizing after an unusually high traffic month). Either way, BuzzFeed is up, and is still raking in the Facebook engagement, and this is not just on the site’s “quality journalism” articles, but also on articles like, “The Definitive Ranking Of Poop,” which is up to nearly 10K likes and 3K Facebook shares after a week, compared to just 233 tweets.

    You can click the link, and judge for yourself just how high quality this piece of content is. To be clear, it’s not that I’m knocking BuzzFeed for producing this kind of content. This is the kind of thing that BuzzFeed is known for, and obviously some people do like it. But Facebook holding content like this up on a pedestal as a high mark of what counts for quality in the News Feed at the cost of visibility (and potentially business) for other content providers is a different story.

    In reality, it’s not that Facebook is intentionally trying to show people more poop-list-like articles. It’s just that it considers BuzzFeed itself a trusted source of high-quality content, so BuzzFeed can put out whatever it wants without having to worry about the kind of lost visibility suffered by those negatively impacted by Facebook’s update. BuzzFeed is basically whitelisted.

    I know this because it’s the reality laid out by Facebook itself in an interview just after the update was announced. News Feed manager Lars Backstrom told Peter Kafka (then at All Things D), “Right now, it’s mostly oriented around the source. As we refine our approaches, we’ll start distinguishing more and more between different types of content. But, for right now, when we think about how we identify ‘high quality,’ it’s mostly at the source level.”

    So things might get better, but for right now, it doesn’t matter if you break the biggest news in the world if you’re not one of the sources Facebook has deemed “quality”. You won’t get poop-list-like visibility. The timetable for further News Feed algorithm tweaks to address this is anybody’s guess. In the meantime, if your visibility has dropped off despite having quality content, you’ll just have to deal with it, and hope Facebook really does figure things out.

    But back to Mental Floss, another apparent beneficiary of the apparent whitelist. The site does a lot of lists too, but they tend to be of significantly greater substance than the aforementioned poop list. Recent ones include: “The Original Locations of 15 Famous Food Chains,” “The First Guests on 22 Late Night Talk Shows,” “25 Things You Might Not Know About Boston,” and “9 of Thomas Jefferson’s Head-Turning Hobbies.” You know, lists that you can actually learn from.

    According to Poynter, the site’s monthly Facebook referrals have nearly doubled (from 1.9 million in November to 3.7 million in both December and January). It has gained more likes over the past two months than it got in its first five years on Facebook. Could Mental Floss’ history as a print publication be influencing Facebook’s treatment of the site? Interestingly, the report says Mental Floss was only officially verified by Facebook around the time it started getting all the new traffic.

    Another interesting nugget to come out of the report (in which author Sam Kirkland spoke with the site’s editor-in-chief) is that Mental Floss is not posting more frequently to Facebook, but “rather thinking hard about what he chooses to post.”

    This is particularly interesting because before Facebook announced the News Feed update, it was actually encouraging publishers to increase frequency of posts to earn more referrals. In fact, BuzzFeed was specifically named as a partner that participated in a study that helped the social network to reach this conclusion.

    “As we’ve worked with our partners and shared best practices, we’ve found that on average referral traffic from Facebook to media sites has increased by over 170% throughout the past year,” Facebook VP of Media Partnerships and Global Operations, Justin Osofsky, said in an October blog post. “In fact, from September 2012 to September 2013, TIME’s referral traffic has increased 208%. BuzzFeed is up 855%. And Bleacher Report has increased 1081%.”

    “We worked with 29 media sites over a seven-day period to find out exactly how their referral traffic could be impacted if they increased the number of times they posted to their Facebook pages,” he said. “The net result: posting more frequently increases referral traffic by over 80%.”

    It’s unclear whether each of the 29 sites mentioned have benefited from the News Feed update.

    Update: After reaching out to Facebook for the whole list of sites, we’re told that they’re not disclosing it.

    How has your site been affected by Facebook’s changes? Let us know in the comments.

    Images via Facebook, BuzzFeed

  • Google Updated The Page Layout Algorithm Last Week

    Google’s Matt Cutts announced on Twitter that the search engine launched a data refresh for its “page layout” algorithm last week.

    If you’ll recall, this is the Google update that specifically looks at how much content a page has “above the fold”. The idea is that you don’t want your site’s content to be pushed down or dwarfed by ads and other non-content material.

    You want it to be simple for users to find your content without having to scroll.

    Cutts first announced the update in January 2012. He said this at the time:

    Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.

    We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.

    The initial update affected less than 1% of searches globally, Google said. It’s unclear how far-reaching this data refresh is. Either way, if you’ve suddenly lost Google traffic, you may want to check out your site’s design.

    Unlike some of its other updates, this one shouldn’t be too hard to recover from if you were hit.

    You should check out Google’s browser size tool, which lets you get an idea of how much of your page different users are seeing.

    Image via Google

  • Facebook Gets Its Own Panda Update

    As reported earlier this week, Facebook has made some changes to how it ranks content in the news feed, putting a greater emphasis on content quality. The update may have a significant impact on how visible your site’s content will be on the social network. Comparisons to Google’s Panda update have emerged.

    Do you think Facebook’s News Feed update will have a positive or negative impact on your site? On the Facebook experience in general? Share your thoughts in the comments.

    As a WebProNews reader, you probably know how big Google’s Panda update was. If you don’t, you can learn all about it here. Long story short, Google launched a major algorithm update a couple years ago aimed at returning higher quality content in its search results. The move gave so-called “content farms” less incentive to flood the web with mediocre to poor content to serve ads on. It’s been a controversial move to say the least, and the update has been refreshed numerous times, and continues to plague some webmasters to this day.

    Google is obviously one of the primary ways Internet users find content. Another is Facebook. Like Google, it’s one of the main gateways to information on the web. If you produce web content, you want people on Facebook to be able to find it, just as you want Google searchers to find it. While perhaps not to the extent of a Google update, a Facebook News Feed update can have a major impact on a website’s ability to attract pageviews and customers. Facebook updates have already been detrimental to companies in the past.

    Facebook says the new changes may not be on the scale of the Google Panda update, but are “a step in that direction.”

    In its announcement, the company said it is paying closer attention to what makes for high quality content, and how often articles are clicked on from the News Feed on mobile. There’s good news for publishers in that they’re going to start showing more links to articles, especially on mobile, where nearly half of Facebook users are accessing the social network exclusively.

    “Why are we doing this? Our surveys show that on average people prefer links to high quality articles about current events, their favorite sports team or shared interests, to the latest meme,” says Facebook software engineer Varun Kacholia in a blog post. “Starting soon, we’ll be doing a better job of distinguishing between a high quality article on a website versus a meme photo hosted somewhere other than Facebook when people click on those stories on mobile. This means that high quality articles you or others read may show up a bit more prominently in your News Feed, and meme photos may show up a bit less prominently.”

    “To complement people’s interest in articles, we recently began looking at ways to show people additional articles similar to ones they had just read,” Kacholia adds. “Soon, after you click on a link to an article, you may see up to three related articles directly below the News Feed post to help you discover more content you may find interesting.”

    Here’s what that looks like:

    [Image: related articles shown below a News Feed post]

    Earlier this year, Facebook introduced the concept of “story bumping” to the News Feed algorithm. This is when Facebook “bumps” up a story in the News Feed because it’s getting a lot of likes and comments.

    Facebook is now updating bumping to highlight stories with new comments. So now, you’re more likely to revisit a story that you saw before if your friends have commented on it.

    “Our testing has shown that doing this in moderation for just a small number of stories can lead to more conversations between people and their friends on all types of content,” says Kacholia.

    So there’s more to what Facebook is doing than the Pandaesque update, but that’s a major part of things, and Facebook News Feed manager Lars Backstrom opened up a bit more about it in an interview with All Things D’s Peter Kafka.

    He says they’re not really looking to promote/demote types of content, but rather do a better job of “identifying value”.

    “In the past, there were a lot of things that all fell into one bucket, and we would treat them all the same, even though they clearly weren’t,” Backstrom told Kafka. “If you see a funny meme photo in your feed — sure, you get some value from that. But if you compare that to reading 1,000 words on AllThingsD, you would presumably get more value from that experience than the first one. And, in the past, we were treating them as the same.”

    Umm, with all due respect to All Things D, isn’t that a matter of preference – something illustrated by social interaction? Kafka basically suggested as much back to him. According to Backstrom, the surveys indicate people want the quality articles more than the cat photos. But in the end, doesn’t it really depend on the article and on the cat photo? And what happened to Facebook being about what people are sharing? People like to share cat photos. People like when other people share cat photos. If there’s one thing the Internet has proven it’s that. Also, I wonder how many of Facebook’s over a billion users were actually surveyed. I don’t remember being asked about this. Do you?

    I’m not saying I personally don’t prefer a good article to a cat photo, but that’s beside the point.

    Backstrom says Facebook is not trying to “impose its will” on people. He also admits that surveys “are not necessarily the truth,” but that treating “every single click as having the same value,” as in cat photo clicks vs. in-depth article clicks, would be “as naive”.

    So the new way of doing things is naive too?

    And here’s something that a lot of smaller sites aren’t going to like very much. Right now, the changes are “mostly oriented around the source,” according to Backstrom. So apparently brand is going to make a big difference right off the bat, regardless of how in-depth your content is.

    Talk about the “filter bubble”.

    People have been calling for an unfiltered Facebook News Feed for years, and they kind of got one when Facebook launched the Ticker. Then, back in March, Facebook launched the “new” News Feed, and a lot of people still don’t have the design. Some variations of the design don’t include the Ticker, and others tuck it into a corner in a less noticeable part of the interface. The future of the feature is uncertain. A lot of content is going to be visible only via the Ticker, Graph Search or on actual Timelines. The News Feed is what everyone pays attention to.

    Backstrom does say that Facebook will start “distinguishing more and more” between different types of content as it refines its approaches, so it might not all be based upon source in the future, even if it starts off that way. But who knows how long that will take? When does Facebook ever roll out things quickly?
    Google, at least, gave people something to go on after the Panda update: a list of twenty-three questions to ask about the quality of their content. That certainly didn’t appease everyone, but it was something, and a hell of a lot more than what Facebook is giving people so far. Twenty-three bullet points is a lot more consideration than just the source of the content.

    It’s going to be harder to build a brand if Facebook – the biggest social service in the world – won’t acknowledge it to begin with.

    While meme photos are mentioned specifically by Facebook as things that will be less visible, Backstrom told Kafka that this was just an example, and that it’s not targeting one category or another.

    Apparently the kinds of posts that have a call to action designed simply to get likes (Backstrom gave Kafka the example of “one like = one respect”) will not be doing so well with the update.

    Asked if the update is targeting sites like Buzzfeed or Upworthy, he said that there are no specific targets, and that he doesn’t know how the changes will impact those sites. At the very least, it may affect those sites’ sharing tactics.

    According to Backstrom, the changes aren’t going to eliminate funny Imgur photos and the like from your News Feed entirely. You just may see less of that kind of thing. I know some of us are at least hoping for less Bitstrips.

    You have to wonder how all of this will affect Facebook’s teen problem (or non-problem).

    Oh, did we mention that Facebook is also spreading the message that marketers are going to have to pay them if they want more visibility? AdAge reported this week:

    If they haven’t already, many marketers will soon see the organic reach of their posts on the social network drop off, and this time Facebook is acknowledging it. In a sales deck obtained by Ad Age that was sent out to partners last month, the company states plainly: “We expect organic distribution of an individual page’s posts to gradually decline over time as we continually work to make sure people have a meaningful experience on the site.”

    Discuss.

    Image: Wikimedia Commons

  • Google Goes After ‘Payday Loans’ And Other Spam With New Algorithm Update

    Google’s Matt Cutts announced that Google has “started” a new ranking update to help clean up some spammy queries. It’s one of the changes that Cutts warned us about in that big video a while back.

    In the video, Cutts talked about working harder on types of queries that tend to draw a lot of spam.

    “We get a lot of great feedback from outside of Google, so, for example, there were some people complaining about searches like ‘payday loans’ on Google.co.uk,” he said. “So we have two different changes that try to tackle those kinds of queries in a couple different ways. We can’t get into too much detail about exactly how they work, but I’m kind of excited that we’re going from having just general queries be a little more clean to going to some of these areas that have traditionally been a little more spammy, including for example, some more pornographic queries, and some of these changes might have a little bit more of an impact on those kinds of areas that are a little more contested by various spammers and that sort of thing.”

    Cutts discussed the update a little at SMX Advanced. Barry Schwartz from the conference’s sister site, Search Engine Land, reports:

    Matt Cutts explained this goes after unique link schemes, many of which are illegal. He also added this is a world-wide update and is not just being rolled out in the U.S. but being rolled out globally.

    This update impacted roughly 0.3% of U.S. queries, but Matt said it went as high as 4% for Turkish queries, where web spam is typically higher.

    They’re calling it the “payday loan algorithm,” by the way (not sure if that’s official).

    In another tweet, Cutts said that the update will be rolling out over the course of the next one to two months.

    Image: Hersheys.com

  • These Are The Top Losers From Google’s New Penguin Update, According To Searchmetrics

    As it typically does with major Google algorithm updates, Searchmetrics has released a list of the top losers from Penguin 2.0, which the search engine rolled out this week. Based on this list, some big brands like Dish and Salvation Army were hit, as were some porn sites, travel sites and game sites. Even the Educational Testing Service was hit.

    “My first analysis shows that many thin sites, sites with thin links and especially untrusted links face the problem,” says Searchmetrics founder and CTO Marcus Tober. “In addition, some small business sites were hit because they haven’t taken SEO serious enough.”

    Here’s the list:

    Searchmetrics on Top Penguin 2.0 Losers

    Back in April of 2012, after Penguin 1.0, Searchmetrics put out one of these lists. Google’s Matt Cutts spoke out about it, saying it was inaccurate, because there had also been a Panda update, and the list was likely more indicative of that. The fact is that Google puts out algorithm changes every day, and any of these can potentially play into analysis like this.

    In fact, Google recently transitioned Panda into a rolling update, meaning it is being pushed out regularly, rather than coming in big waves like it used to. We’re not trying to discredit Searchmetrics’ list here. It’s just always best to take these things with a grain of salt.

    It’s worth noting that Searchmetrics put out an updated list after Cutts’ comments.

  • Google Launches New Page Layout Update (Yes, ANOTHER Update)

    Google is on a roll with these updates. I think webmasters are starting to understand what Google’s Matt Cutts meant when he said a while back that updates would start getting “jarring and jolting”. It seems that, rather than one major update, we’re getting a bunch of updates in a short amount of time. This past Friday, Google launched its latest Penguin refresh. A week before that, it was the EMD update and a new Panda update.

    On Tuesday, Cutts tweeted about a Page Layout update.

    The Page Layout update was first announced early this year, months before we ever saw the first Penguin update. It’s sometimes referred to as the “above the fold” update. It was designed to target pages that lack content above the fold. At the time, Cutts wrote in a blog post:

    As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.

    We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content. This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.
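    Google hasn’t published how the Page Layout algorithm actually measures ad density, but the idea in the quote above can be sketched with a toy calculation: given rough bounding boxes for a page’s elements, estimate what fraction of the above-the-fold area is devoted to ads. Everything here – the 600-pixel fold, the element model, the sample layout – is an assumption for illustration, not Google’s real logic.

```python
# Illustrative sketch (not Google's actual algorithm): estimate what
# fraction of the visible "above the fold" area a page devotes to ads,
# given simple bounding boxes for its layout elements.

from dataclasses import dataclass

@dataclass
class Element:
    top: int      # y-offset of the element's top edge, in pixels
    height: int   # element height, in pixels
    width: int    # element width, in pixels
    is_ad: bool   # whether the element is an ad slot

def above_fold_ad_fraction(elements, fold_px=600, viewport_width=1000):
    """Return the share of above-the-fold area occupied by ads (0.0-1.0)."""
    fold_area = fold_px * viewport_width
    ad_area = 0
    for el in elements:
        # Clip the element to the region above the fold.
        visible_height = max(0, min(el.top + el.height, fold_px) - el.top)
        if el.is_ad:
            ad_area += visible_height * el.width
    return ad_area / fold_area

layout = [
    Element(top=0, height=200, width=1000, is_ad=True),    # banner ad
    Element(top=200, height=300, width=300, is_ad=True),   # sidebar ad
    Element(top=200, height=700, width=700, is_ad=False),  # article body
]
print(round(above_fold_ad_fraction(layout), 3))  # 0.483
```

    A real layout engine would also have to handle overlapping elements, responsive breakpoints, and ads that only render after JavaScript runs; the point of the sketch is just that “excessive ads above the fold” is, in principle, a measurable quantity.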

    It looks like Christmas has come early for webmasters this year. Although, on that note, this could be a sign that Google is getting all of this stuff out of the way before the holiday season, so they don’t mess too much with your rankings during this crucial time of year for ecommerce. They’ve shown in the past that they’ve learned from the infamous Florida update.

  • Google Panda Update: New Data Refresh Rolls Out

    Google tweeted late last night that it has begun rolling out a new data refresh of the Panda update. The refresh affects about 1% of search results in a noticeable way, according to the company.

    The last Panda update data refresh we saw was in late June, and that was the second one during that month. It was a rarity for Google to roll out two refreshes to this update in a single month.

    Google did, however, announce last week that it was launching Panda for Japanese and Korean, affecting about 5% of queries in those languages (with no other languages affected at all).

    Remember, the latest announcement was just a data refresh. For more on what that means (vs. an update), see what Google’s Matt Cutts had to say about it here.

    For more information about the Panda update, including Google’s advice, recovery stories and other commentary, please peruse our Panda coverage section.

  • Google Panda Update: Google Rolls Out Data Refresh

    Google just announced via Twitter that it started rolling out a Panda refresh on Friday. According to the company, less than 1% of queries are noticeably affected in the U.S. Worldwide, 1% are apparently affected.

    Google told us earlier this month that there had not been another Panda update, after some webmasters suspected one, but that has obviously changed now.

    If you’ve been hit by the Panda update, remember, you can recover. Last year, Google put out this list of 23 questions to ask yourself about the quality of your content:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?
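    As a purely illustrative exercise (Google has never published a scoring formula for these questions), you could turn the checklist into a simple self-audit script. The question subset and the pass/fail scoring below are our own invention, not anything Google endorses.

```python
# Hypothetical self-audit helper. The questions are drawn from Google's
# published list; the True/False scoring is purely our own illustration.

PANDA_QUESTIONS = [
    "Would you trust the information presented in this article?",
    "Is this article written by an expert who knows the topic well?",
    "Does the article provide original content, reporting, or analysis?",
    "Is the site a recognized authority on its topic?",
    "Would you expect to see this article in a printed magazine or book?",
]

def quality_score(answers):
    """answers: dict mapping question -> bool (True = quality-positive).
    Returns the fraction of questions answered positively."""
    answered = [answers.get(q, False) for q in PANDA_QUESTIONS]
    return sum(answered) / len(PANDA_QUESTIONS)

# Honest self-assessment: strong on four counts, weak on one.
answers = {q: True for q in PANDA_QUESTIONS[:4]}
print(quality_score(answers))  # 0.8
```

    The value of the exercise isn’t the number itself; it’s forcing yourself to answer each question honestly, page by page, the way Google suggested.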

    Google didn’t exactly present these as official guidelines, though many of them did reappear in Google’s recently launched Webmaster Academy.

    Were you affected by this update? Let us know in the comments.

    More Panda coverage here.

  • Google Algorithm Changes For May: Big List Released

    We’ve all been waiting for it, and now it’s here: Google’s monthly list of algorithm changes for May. This time, it’s 39 changes (fewer than last month’s).

    Of particular note, Google says it made a couple of adjustments to Penguin:

    Improvements to Penguin. [launch codename “twref2”, project codename “Page Quality”] This month we rolled out a couple minor tweaks to improve signals and refresh the data used by the penguin algorithm.

    Also noteworthy:

    Better application of inorganic backlinks signals. [launch codename “improv-fix”, project codename “Page Quality”] We have algorithms in place designed to detect a variety of link schemes, a common spam technique. This change ensures we’re using those signals appropriately in the rest of our ranking. 

    Of course, Google also made more adjustments to freshness.

    We’ll be digging into these much more, but for now, here’s the list in its entirety:

    • Deeper detection of hacked pages. [launch codename “GPGB”, project codename “Page Quality”] For some time now Google has been detecting defaced content on hacked pages and presenting a notice on search results reading, “This site may be compromised.” In the past, this algorithm has focused exclusively on homepages, but now we’ve noticed hacking incidents are growing more common on deeper pages on particular sites, so we’re expanding to these deeper pages.
    • Autocomplete predictions used as refinements. [launch codename “Alaska”, project codename “Refinements”] When a user types a search she’ll see a number of predictions beneath the search box. After she hits “Enter”, the results page may also include related searches or “refinements”. With this change, we’re beginning to include some especially useful predictions as “Related searches” on the results page.
    • More predictions for Japanese users. [project codename “Autocomplete”] Our usability testing suggests that Japanese users prefer more autocomplete predictions than users in other locales. Because of this, we’ve expanded the number of predictions shown in Japan to as many as eight (when Instant is on).
    • Improvements to autocomplete on Mobile. [launch codename “Lookahead”, project codename “Mobile”] We made an improvement to make predictions work faster on mobile networks through more aggressive caching.
    • Fewer arbitrary predictions. [launch codename “Axis5”, project codename “Autocomplete”] This launch makes it less likely you’ll see low-quality predictions in autocomplete.
    • Improved IME in autocomplete. [launch codename “ime9”, project codename “Translation and Internationalization”] This change improves handling of input method editors (IMEs) in autocomplete, including support for caps lock and better handling of inputs based on user language.
    • New segmenters for Asian languages. [launch codename “BeautifulMind”] Speech segmentation is about finding the boundaries between words or parts of words. We updated the segmenters for three Asian languages: Chinese, Japanese, and Korean, to better understand the meaning of text in these languages. We’ll continue to update and improve our algorithm for segmentation.
    • Scoring and infrastructure improvements for Google Books pages in Universal Search. [launch codename “Utgo”, project codename “Indexing”] This launch transitions the billions of pages of scanned books to a unified serving and scoring infrastructure with web search. This is an efficiency, comprehensiveness and quality change that provides significant savings in CPU usage while improving the quality of search results.
    • Unified Soccer feature. [project codename “Answers”] This change unifies the soccer search feature experience across leagues in Spain, England, Germany and Italy, providing scores and scheduling information right on the search result page.
    • Improvements to NBA search feature. [project codename “Answers”] This launch makes it so we’ll more often return relevant NBA scores and information right at the top of your search results. Try searching for [nba playoffs] or [heat games].
    • New Golf search feature. [project codename “Answers”] This change introduces a new search feature for the Professional Golf Association (PGA) and PGA Tour, including information about tour matches and golfers. Try searching for [tiger woods] or [2012 pga schedule].
    • Improvements to ranking for news results. [project codename “News”] This change improves signals we use to rank news content in our main search results. In particular, this change helps you discover news content more quickly than before.
    • Better application of inorganic backlinks signals. [launch codename “improv-fix”, project codename “Page Quality”] We have algorithms in place designed to detect a variety of link schemes, a common spam technique. This change ensures we’re using those signals appropriately in the rest of our ranking.
    • Improvements to Penguin. [launch codename “twref2”, project codename “Page Quality”] This month we rolled out a couple minor tweaks to improve signals and refresh the data used by the penguin algorithm.
    • Trigger alt title when HTML title is truncated. [launch codename “tomwaits”, project codename “Snippets”] We have algorithms designed to present the best possible result titles. This change will show a more succinct title for results where the current title is so long that it gets truncated. We’ll only do this when the new, shorter title is just as accurate as the old one.
    • Efficiency improvements in alternative title generation. [launch codename “TopOfTheRock”, project codename “Snippets”] With this change we’ve improved the efficiency of title generation systems, leading to significant savings in CPU usage and a more focused set of titles actually shown in search results.
    • Better demotion of boilerplate anchors in alternate title generation. [launch codename “otisredding”, project codename “Snippets”] When presenting titles in search results, we want to avoid boilerplate copy that doesn’t describe the page accurately, such as “Go Back.” This change helps improve titles by avoiding these less useful bits of text.
    • Internationalizing music rich snippets. [launch codename “the kids are disco dancing”, project codename “Snippets”] Music rich snippets enable webmasters to mark up their pages so users can more easily discover pages in the search results where you can listen to or preview songs. The feature launched originally on google.com, but this month we enabled music rich snippets for the rest of the world.
    • Music rich snippets on mobile. [project codename “Snippets”] With this change we’ve turned on music rich snippets for mobile devices, making it easier for users to find songs and albums when they’re on the go.
    • Improvement to SafeSearch goes international. [launch codename “GentleWorld”, project codename “SafeSearch”] This change internationalizes an algorithm designed to handle results on the borderline between adult and general content.
    • Simplification of term-scoring algorithms. [launch codename “ROLL”, project codename “Query Understanding”] This change simplifies some of our code at a minimal cost in quality. This is part of a larger effort to improve code readability.
    • Fading results to white for Google Instant. [project codename “Google Instant”] We made a minor user experience improvement to Google Instant. With this change, we introduced a subtle fade animation when going from a page with results to a page without.
    • Better detection of major new events. [project codename “Freshness”] This change helps ensure that Google can return fresh web results in real time, seconds after a major event occurs.
    • Smoother ranking functions for freshness. [launch codename “flsp”, project codename “Freshness”] This change replaces a number of thresholds used for identifying fresh documents with more continuous functions.
    • Better detection of searches looking for fresh content. [launch codename “Pineapples”, project codename “Freshness”] This change introduces a brand new classifier to help detect searches that are likely looking for fresh content.
    • Freshness algorithm simplifications. [launch codename “febofu”, project codename “Freshness”] This month we rolled out a simplification to our freshness algorithms, which will make it easier to understand bugs and tune signals.
    • Updates to +Pages in right-hand panel. [project codename “Social Search”] We improved our signals for identifying relevant +Pages to show in the right-hand panel.
    • Performance optimizations in our ranking algorithm. [launch codename “DropSmallCFeature”] This launch significantly improves the efficiency of our scoring infrastructure with minimal impact on the quality of our results.
    • Simpler logic for serving results from diverse domains. [launch codename “hc1”, project codename “Other Ranking Components”] We have algorithms to help return a diverse set of domains when relevant to the user query. This change simplifies the logic behind those algorithms.
    • Precise location option on tablet. [project codename “Mobile”] For a while you’ve had the option to choose to get personalized search results relevant to your more precise location on mobile. This month we expanded that choice to tablet. You’ll see the link at the bottom of the homepage and a button above local search results.
    • Improvements to local search on tablet. [project codename “Mobile”] Similar to the changes we released on mobile this month, we also improved local search on tablet as well. Now you can more easily expand a local result to see more details about the place. After tapping the reviews link in local results, you’ll find details such as a map, reviews, menu links, reservation links, open hours and more.
    • Internationalization of “recent” search feature on mobile. [project codename “Mobile”] This month we expanded the “recent” search feature on mobile to new languages and regions.
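    The “tomwaits” snippet change in the list above – triggering an alternate title when the HTML title would be truncated – can be illustrated with a small sketch. The 60-character limit and the candidate-title sources here are assumptions for the sake of the example, not Google’s actual rules.

```python
# Rough sketch of the "alt title when HTML title is truncated" idea.
# The 60-character display limit and the fallback strategy are assumed,
# not Google's real logic.

def pick_display_title(html_title, alt_candidates, max_chars=60):
    """Prefer the page's own <title>; when it would be truncated, fall
    back to the most descriptive alternative that still fits."""
    if len(html_title) <= max_chars:
        return html_title
    fitting = [t for t in alt_candidates if len(t) <= max_chars]
    if fitting:
        return max(fitting, key=len)  # longest candidate that fits
    return html_title[:max_chars - 1] + "…"  # last resort: truncate

title = "Home | Widgets Inc | Best Widgets, Cheap Widgets, Widget Deals Online"
candidates = ["Widgets Inc – Widget Catalog", "Widgets Inc"]
print(pick_display_title(title, candidates))  # Widgets Inc – Widget Catalog
```

    Google’s note that the shorter title is only used when it is “just as accurate as the old one” is the hard part, of course; deciding accuracy is where the real algorithm earns its keep.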

  • Google Penguin Update: Webmasters Wondering If Another One Came Out

    Update: Google says there has not been another one.

    As usual, webmasters are speculating about the latest big name Google algorithm change as some have experienced sudden traffic issues. This has happened quite frequently since last year’s Panda update was launched. Sometimes it was Panda, and other times it wasn’t.

    Now, there is talk in the WebmasterWorld forum that there may have been another Penguin update. We’ve reached out to Google for confirmation one way or another, and will update accordingly.

    In one forum thread, member Tedster writes, “A number of members who were hit by Penguin are now reporting some movement in various threads. Can anyone else see evidence of a Penguin refresh?”

    Some are seeing lower traffic, but assuming it’s because of the coming holiday weekend. However, user Anteck writes, “Getting loads of zombie traffic, suddenly the majority of visitors arnt converting. Googles up to something. All Australian sites. No holidays here.”

    There is also some speculation that the Knowledge Graph came into play. Member Rasputin writes:

    I would say that our ‘penguin’ sites recovered a few % around the 19th May while non-penguin sites fell a similar amount at the same time (no changes made – they are mostly old, small sites), but the variation after a few days is small enough that it could possibly just be seasonal variation.

    It is also possible that the introduction of Knowledge Graph is distorting the figures for our (travel) sites that fell a small amount around the same time (or the sunny weather in the UK could have got people away from their computers…)

    Google, by the way, appears to consider Knowledge Graph one of its crowning achievements. CEO Larry Page was sure talking it up this past week.

    Either way, it’s put a little less emphasis on Google+ profiles for some search results, though it’s still thrusting them in the spotlight for others (like Mark Zuckerberg’s).

    It’s hard to believe, but here we are close to the end of May already. Before too long, we should be seeing a new giant list of Google algorithm updates for the past month.

    Image: The Batman Season 4 Episode 2 (Warner Bros.)

  • Google Tweaks Algorithm To Surface More Authoritative Results

    Who you are matters more in search than ever. This is reflected in search engines’ increased focus on social signals, and especially with authorship markup, which connects the content you produce with your Google profile, and ultimately your Google presence.

    Late on Friday, Google released its monthly list of search algorithm changes, and among them was:

    More authoritative results. We’ve tweaked a signal we use to surface more authoritative content.

    Google has tried to deliver the most authoritative content in search results for as long as I can remember, but clearly it’s been pretty hard to get right all the time. The Panda update, introduced in February 2011, was a huge step in the right direction – that is if you think Panda has done its job well. Perhaps to a lesser extent, the Penguin update is another step, as its aim is to eliminate the spam cluttering up the search results, taking away from the actual authority sites.

    About a year ago, Google released a list of questions that “one could use to assess the quality of a page or an article.” This was as close as we got to a guide on how to approach search in light of the Panda update. There were 23 questions in all. Some of them relate directly to authority.

    Would you trust the information presented in this article?

    Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?

    Does this article have spelling, stylistic, or factual errors?

    Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?

    Does the article provide original content or information, original reporting, original research, or original analysis?

    Does the page provide substantial value when compared to other pages in search results?

    How much quality control is done on content?

    Does the article describe both sides of a story?

    Is the site a recognized authority on its topic?

    For a health related query, would you trust information from this site?

    Would you recognize this site as an authoritative source when mentioned by name?

    Does this article provide a complete or comprehensive description of the topic?

    Does this article contain insightful analysis or interesting information that is beyond obvious?

    Would you expect to see this article in a printed magazine, encyclopedia or book?

    Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?

    Google’s Matt Cutts gave something of an endorsement to a list of tips to consider post-Penguin update, written by Marc Ensign. One of those was “Position yourself as an expert.”

    Of course, we don’t know what exactly Google did to the signal (one of many, I presume) it uses to surface more authoritative content. It’s worth noting that they made a change to it, however, and it will be interesting to see if there’s a noticeable impact in search results.

    It’s one thing for Google to preach about quality content, and saying that’s what it wants to deliver to users, but we continue to see Google cite specific actions it has taken to make good on that, even if we can’t know exactly what they are (Google is vague when it lists its changes). Panda and Penguin are obviously major steps, but Google seems to be doing a variety of other things that cater to that too.

    I mentioned authorship. That’s a big one, and one you should be taking advantage of if you want to be seen as an authority in Google’s eyes. It really means you should be engaging on Google+ too, because it’s tied directly to it. For some authors, Google will even show how many people have you in Circles in the search results. It’s hard to dispute you being an authority if you manage to rack up a substantial follower count.

  • Google Webspam Algorithm Update Draws Mixed Reviews From Users

    Google’s Matt Cutts has been talking about leveling the playing field for sites that don’t participate in “over-optimization”. Last month at SXSW, Cutts made something of a pre-announcement about such changes, and it looks like a major part of these efforts is now launching.

    According to Danny Sullivan, who spoke directly with Cutts, this is indeed the change Cutts was referring to at SXSW, though Cutts admits “over-optimization” wasn’t the best way of putting it, because it’s really about webspam, not white hat SEO techniques.

    Cutts himself announced a new algorithm change targeted at webspam, which he describes as black hat techniques. “We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings,” he says.

    Link schemes are actually something webmasters have been getting messages from Google about already. The company recently de-indexed paid blog/link networks, and notified webmasters about such links.

    “The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines,” says Cutts. “We’ve always targeted webspam in our rankings, and this algorithm represents another improvement in our efforts to reduce webspam and promote high quality content. While we can’t divulge specific signals because we don’t want to give people a way to game our search results and worsen the experience for users, our advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods instead of engaging in aggressive webspam tactics.”

    Google has kind of sent webmasters mixed signals about search engine optimization. They recently shared some SEO DOs and DON’Ts, specifically talking about some white hat things webmasters can do to help Google rank their content better. And Cutts’ point about not divulging specific signals so people can’t game search results is one the company has stood by for ages. But at the same time, Google does divulge algorithm changes it makes via monthly lists, which seem to dare webmasters to play to certain signals. That’s not to say they’re encouraging the kind of black hat stuff Cutts is talking about here, but doesn’t it kind of say, “Hey, these are some things we’re focusing on; perhaps you should be thinking about these things with your SEO strategy?” Isn’t that encouraging “gaming” to some extent, rather than just telling webmasters not to worry about it?

    Of course, Google always says not to focus on any one signal, and to just focus on making good, quality content. In fact, this new change (in line with Cutts’ comments at SXSW) indicates that sites shouldn’t have to worry about SEO at all.

    “We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites,” Cutts says. Emphasis added.

    As far as black hat SEO goes, it’s not as if this is some big change out of the blue. Algorithmically, it’s a change, but Google has always targeted this stuff. There’s a reason Cutts has been the head of webspam. Google has never been shy about penalizing sites violating its quality guidelines. Google even penalized its own Chrome site when some paid linking at the hands of a marketing agency was unearthed.

    If you’re engaging in SEO, and Google gets you on black hat tactics, you probably knew what you were doing. You probably knew it was in violation of Google’s guidelines. Of course, that’s assuming Google’s algorithm change does not make any errors. And what are the chances of that happening? Google will be the first to admit that “no algorithm is perfect.” As we saw with the Panda update, some sites were hit hard that possibly shouldn’t have been.

    So is that happening this time? It’s still early. As far as I can tell, the change hasn’t even finished rolling out. But there are plenty of people already commenting about it.

    Many are critical of Google’s search quality in general. From the comments on Cutts’ announcement:

    So far today’s search results are worse than they’ve been for the past month. On one search for a keyword phrase there’s a completely unrelated Wikipedia page, a random Twitter account for some company, and a page from an independent search engine from 1997 showing in the top 10 results. Yeah, that’s the kind of quality user experience we want to see. Way to knock it out of the park.

    well now more rubbish results appearing in search than before. more exact domain name match results and unrelated websites . Google failed once again.

    so many .info, .co unrelated domains ranked for respected queries. are you sure no mistake in this update?

    Surely, whatever these updates are doing, they are not right. Here’s just one example. A search for “ereader comparison chart” brings up “ereadercomparisonchart dot com” on 2nd page of results and it goes “Welcome! This domain was recently registered at namecheap.com. The domain owner may currently be creating a great site for..”
    While my site which provided true value to its readers is nowhere to be found.
    Please fix this.

    there is something wrong with this update . search “viagra” on Google.com 3 edu sites are showing in the first page . is it relevant? matt you failed .

    Search Google for a competitive term such as “new shoes” — look who’s #1: Interpretive Simulations – NewShoes – (Intro to Marketing, Marketing Principles). All competitive terms have some youtube videos on the top which aren’t of any good quality even. This is not what is expected of google. Please revert.

    These are results have to be a complete joke, so much unrelated content is now surfaced to the top it’s sickening.

    That’s just a sampling. There’s more in other forums, of course, such as WebmasterWorld. There is some more talk about exact match domains being hit. User Whitey says:

    News just in to me that a large network of destination related exact match domains [ probably 1000+], including many premium ones [ probably 50+], ultra optimized with unique content and only average quality backlinks with perhaps overkill on exact match anchor text, has been hit.

    A few of the premium one’s have escaped. Not sure if the deeper long tail network which were exact match have been effected, but they would have had little traffic.

    The sites were built for pure ranking purposes, and although largely white hat, didn’t do much beyond what other sites in the category do.

    User Haseebnajam says:

    Ranking Increase = squidoo, blogspot, forums, subdomains
    Ranking Decrease = exact match domains, sites with lots of backlink from spun content sources

    User driller41 says:

    I am seeing changes in the UK today, most of my affiliate sites are down which is annoying – all are exact match domains btw.

    Most of the backlinks are from web2.0 sites with spun content in the downed sites.

    One interesting point is that one of the sites which I had built most links to is unafected – the only differnce between this and my downed sites is that I never got around to adding the affiliate outlinks to this website – so google does not know that this site is an affiliate and thus no punishment has been dished out.

    We’ll keep digging for more on Google’s webspam update.

    Update: More on that viagra thing.

    The new algorithm change is launching over the next few days, Cutts says, and it will impact 3.1% of queries in English, “to a degree that a regular user might notice.” It affects about 3% of queries in German, Chinese, and Arabic, but the impact is higher in more heavily-spammed languages, he says. “For example, 5% of Polish queries change to a degree that a regular user might notice.”

  • Google Webspam Update: Where’s The Viagra? [Updated]

    Google Webspam Update: Where’s The Viagra? [Updated]

    Update: Viagra.com is back at number one.

    As you may know, Google launched a new algorithm update, dubbed the Webspam Update. According to Google, it’s designed to keep sites engaging in black hat SEO tactics from ranking. The update is still rolling out, but it’s already been the target of a great deal of criticism. You can just peruse the comments on Google’s Webmaster Central blog post announcing the change, and see what people have to say.

    I can’t confirm that Viagra.com was number one in Google for the query “viagra,” but I can’t imagine why it wouldn’t have been. Either way, viagra.com is not the lead result now. That is, unless you count the paid AdWords version.

    Google Viagra results

    As you can see, the top organic result comes from HowStuffWorks.com. Then comes….Evaluations: Northern Kentucky University? Interesting. Here’s what that page looks like:

    Northern Kentucky University

    You’ll notice that this has absolutely nothing to do with Viagra.

    Browsing through some more of the results, there is some other very suspicious activity going on. Look at this result, which points to larryfagin.com/poet.html. That URL doesn’t sound like it would have anything to do with Viagra, yet Google’s title for the result says: “Buy Viagra Online No Prescription. Purchase Generic Viagra…” and the snippet says: “You can buy Viagra online in our store. This product page includes complete information about Viagra. We supply Viagra in the United Kingdom, USA and …”

    If you actually click on the result, it has nothing to do with Viagra. It’s about a poet named Larry Fagin. Not once is Viagra mentioned on the page.

    Larry Fagin

    Also on the first results page: aiam.edu. That’s the American Institute of Alternative Medicine. At least it’s semi-drug-related. However, once again, no mention of Viagra on this page, though the title and snippet Google is providing, again, indicate otherwise. Google also informs us, “this site may be compromised”. I’m not sure what about this particular result is telling Google’s algorithm that it should be displayed on page one.

    The next result is for loislowery.com:

    Lois Lowery

    You guessed it. Yet again, nothing to do with Viagra. And once again, Google displays a Viagra-related title and snippet for the result, and tells us the site may be compromised.

    Note: Not all of these results indicate that they’ve been compromised.

    A few people have pointed out the oddities of Google’s viagra SERP in the comments on Google’s announcement of the webspam algorithm change:

    Sean Jones says, “There is something wrong with this update. Search ‘viagra’ on Google.com – 3 edu sites are showing in the first page. Is it relevant? Matt you failed.”

    Lisaz says, “These results have to be a complete joke, so much unrelated content is now surfaced to the top it’s sickening. As a funny example check this one out….Search VIAGRA and look at the results on first page for USA queries. Two completely unrelated .edu’s without viagra or ED in their content. Another site about poetry with not even a mention of viagra anywhere to be found. Then two more sites that in google that have this site may be compromised warnings. LOL what a joke this update is. Sell your Google stocks now while you can.”

    ECM says, “Google.com. buy viagra online. Position 2… UNIVERSITY OF MARYLAND lol. I have seen a big mess in results now. Doesn’t this algo change just allow spammers to bring down competitors a lot more easily, just send a heap of junk/spam links to their sites. Nice one google, you’re becoming well liked. Enter BING.”

    How’s Bing looking on Viagra these days?

    Bing Viagra Results

    Yeah, I have to give Bing the edge on this one.

    And Yahoo:

    Yahoo Viagra Results

    And Blekko:

    Blekko Viagra Results

    And DuckDuckGo:

    DuckDuckGo Viagra Results

    We’ve seen people suggesting that the new Google update had a direct effect on exact match domain names. That could explain why viagra.com is MIA. However, it doesn’t exactly explain why some of these other results are appearing.

  • Google Panda Update: Data Refresh Hit Last Week

    On the 17th, we wrote about some webmasters who were suspecting a major update from Google. Google’s Matt Cutts has now come out and said that there was a Panda refresh around the 19th. They just didn’t say anything about it until now, which is interesting in itself, considering they were tweeting about Panda updates before.

    This latest Panda refresh came to light as Searchmetrics put out its winner and loser lists (though the firm specified that they were not the final lists) for Google’s new Webspam update, which is presumably still rolling out. Cutts commented in response to Danny Sullivan’s article about the lists, saying, “Hey Danny, there’s a pretty big flaw with this “winner/loser” data. Searchmetrics says that they’re comparing by looking at rankings from a week ago. We rolled out a Panda data refresh several days ago. Because of the one week window, the Searchmetrics data include not only drops because of the webspam algorithm update but also Panda-related drops. In fact, when our engineers looked at Searchmetrics’ list of 50 sites that dropped, we only saw 2-3 sites that were affected in any way by the webspam algorithm update. I wouldn’t take the Searchmetrics list as indicative of the sites that were affected by the webspam algorithm update.”

    @dannysullivan Searchmetrics data is a weekly diff & includes a Panda data refresh, so sites going up/down mostly aren’t due to algo update.

    @dannysullivan yup, believe a Panda data refresh on 4/19. I don’t think @rustybrick asked us about it; we sometimes wait for him to ask. 🙂

    Webmasters have had over a year to get used to the Panda update, but it is clearly still wreaking havoc. For one, here’s the list of losers from Searchmetrics again:

    Searchmetrics loser list

    A couple weeks ago, we wrote about DaniWeb, which managed to get hit by Google yet again, after being hit by and recovering from the Panda update multiple times over the course of the past year. The latest incident may or may not have been Panda.

  • Google Webspam Update: Losers & Winners, According To Searchmetrics [Updated]

    Update: It turns out that Google launched a Panda refresh a few days ago, and Matt Cutts says this is more likely the culprit for Searchmetrics’ lists.

    Danny Sullivan has Cutts’ comment:

    There’s a pretty big flaw with this “winner/loser” data. Searchmetrics says that they’re comparing by looking at rankings from a week ago. We rolled out a Panda data refresh several days ago. Because of the one week window, the Searchmetrics data include not only drops because of the webspam algorithm update but also Panda-related drops. In fact, when our engineers looked at Searchmetrics’ list of 50 sites that dropped, we only saw 2-3 sites that were affected in any way by the webspam algorithm update. I wouldn’t take the Searchmetrics list as indicative of the sites that were affected by the webspam algorithm update.

    Google is in the process of rolling out its Webspam update, which the company says will impact about 3.1% of queries in English.

    Whenever Google announces major updates, Searchmetrics usually puts together some data about what it determines to be the top winners and losers from the update, in terms of search visibility. They’ve put out their first lists for this update.

    “There could be other iterations from Google that we’re not aware of at the moment, but Searchmetrics is tracking closely and will update the list accordingly,” a Searchmetrics spokesperson tells WebProNews. “These are the first numbers and Searchmetrics will have more in the future, but we want to stress that the loser list could change in the next few days.”

    “It’s unusual for Google to make major update on a Wednesday,” says Searchmetrics Founder Marcus Tober. “Normally Google makes this kind of updates on a Monday or Thursday. That’s why I assume that in the next days we’ll see more updates and this update is just the beginning. That’s why, all results in the winner and loser tables are marked as preview.”

    “In a first study I took over 50.000 keywords from short-head to medium and low search volume and looked at the current rankings from position 1 to 100,” Tober adds. “So I analyzed 5,000,000 URLs and compared the rankings to last week. In my second study which is not finished yet I take one million keywords to get a complete overview, but this will take more time.”
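    The comparison Tober describes — taking a snapshot of rankings for a large keyword set and diffing it against last week’s — can be sketched roughly as follows. This is a hypothetical illustration only: the domain data, keywords, and the simple position-weighted visibility score here are all made up for the example, not Searchmetrics’ actual methodology or metric.

    ```python
    # Hypothetical sketch of a week-over-week ranking comparison, loosely
    # modeled on the approach Tober describes. The visibility metric and
    # all data below are invented for illustration.

    def visibility(rankings):
        """Crude visibility score: rankings maps keyword -> position (1-100),
        and higher positions (closer to 1) contribute more."""
        return sum(101 - pos for pos in rankings.values())

    def weekly_diff(last_week, this_week):
        """Percent change in visibility between two weekly snapshots."""
        before = visibility(last_week)
        after = visibility(this_week)
        if before == 0:
            return None  # domain had no measured visibility last week
        return round(100.0 * (after - before) / before, 1)

    # Hypothetical snapshots for one domain across three keywords.
    last_week = {"buy shoes": 3, "new shoes": 7, "shoe store": 12}
    this_week = {"buy shoes": 9, "new shoes": 25, "shoe store": 40}

    print(weekly_diff(last_week, this_week))  # → -18.5
    ```

    Run across tens of thousands of keywords and a hundred positions each, a diff like this is how a “winners and losers” table gets built — which is also why, as Cutts notes below, a one-week window can lump two separate algorithm changes into the same numbers.
    
    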

    Google did say yesterday that the update was launching over the next few days.

    Here’s Searchmetrics’ list of biggest losers:

    Webspam losers

    Here’s their list of winners:

    Search Winners

    We’ll be taking a closer look at some of the sites on these lists.

    See also:

    Google Webspam Algorithm Update Draws Mixed Reviews From Users
    Google Webspam Update: Where’s The Viagra?
    Google Webspam Update: “Make Money Online” Query Yields Less Than Quality Result