WebProNews

Tag: SEO

  • Major UK Flowers Site Interflora Gets Slapped By Google

    Update: While Google hasn’t named Interflora, it seems to have confirmed the situation with an out-of-the-blue post about paid links on its Webmaster Central blog today. There’s nothing new. It’s just Matt Cutts reminding everybody that Google “takes this issue very seriously”. Here’s a sample:

    Please be wary if someone approaches you and wants to pay you for links or “advertorial” pages on your site that pass PageRank. Selling links (or entire advertorial pages with embedded links) that pass PageRank violates our quality guidelines, and Google does take action on such violations. The consequences for a linkselling site start with losing trust in Google’s search results, as well as reduction of the site’s visible PageRank in the Google Toolbar. The consequences can also include lower rankings for that site in Google’s search results.

    It appears that Interflora, a major flower-seller in the UK, has been hit with manual action from Google after buying links from newspaper sites.

    Anthony Shapley at Dave Naylor’s blog has the breakdown of what he believes to have happened, as the site no longer ranks for keywords it used to, including its own name. He shares a table of over 50 newspaper sites that had their PageRank reduced after Interflora made a big advertising push ahead of Valentine’s Day.

    Search Engine Land shares the following statement from Google:

    We typically don’t comment on whether we’ve taken corrective webspam action regarding specific companies.

    As Barry Schwartz at that blog notes, Google has commented on similar stories in the past, such as those involving JC Penney, Forbes, and Overstock.

    Shapley says he is confident in his explanation, but Interflora has not commented, and if Google won’t, we may never see official word on this. Rest assured, though: if Google catches you buying paid links, it will punish you.

    At least Interflora will get some new brand recognition out of the whole thing. It seems unlikely that they won’t make their way back into the rankings after a while. Google managed to get its Chrome landing page back in the rankings after penalizing it.

  • My SEO Confession

    I have a confession to make. My views on SEO have changed.

    Were I a politician, I would surely be accused of flip-flopping, waffling, and “being against something before I was for it” by pundits. But I am a business person, and I believe that businesses that fail to adjust course when presented with new facts will ultimately fail.

    I famously claimed back in 2006 that “SEO isn’t Rocket Science.” By that I meant that most firms could obtain most of the benefits of SEO by simply following the guidelines posted by Google without the need to resort to obscure and expensive SEO tactics. Many disagreed, the debate about the proper role of SEO produced a lot of commentary, and ultimately an SEO competition for SERP domination using the keywords “Dave Pasternack.” (The competition resulted in a SERP draw between myself and the famous seafood chef).

    2006 was eons ago in Internet time and I think most people would agree with me that the SEO landscape has changed radically. Google polices its SERP real estate much more methodically than it did in 2006 and its penalties for violation of certain of its rules — especially related to content and linking policies — are severe and unforgiving. The Wild West Days are over — civilization — for better or worse — has tamed the Frontier.

    Part of me wants to gloat because the Google Guidelines really do rule the Frontier now. At the same time, however, the claim that “SEO Isn’t Rocket Science” may no longer be true.

    Why? Because everything we do now — in this era of big data — is rocket science. The level of complexity that’s required to run multi-channel, multi-device, geo-targeted campaigns requires more human and computational power than a 1968 Moon Launch. Many firms (including my own) are required to hire Data Scientists to make sense of all of the volume, velocity and variety of data.

    So what’s ahead for SEO? Well, take a look at what’s happened in the past two years. Panda and Penguin have forced the SEO industry into a completely new, very healthy course heading — toward quality content creation/curation and general competitive webmastering. “Gaming the system” is still part of the DNA of SEO, but the focus is on sustainable results — not quick ranking bumps. Consequently, within Corporate America, SEO is increasingly being appreciated strategically — in terms of where it fits into the total paid/earned/owned media mix environment. For the first time, expectations – and budgets — for SEO are being set correctly — as something that every firm must concern itself with if it wants online visibility. SEO careers — because they are multi-disciplinary, multi-skill, and team-based, will continue to thrive.

    So call me a flip-flopper, but I’m as bullish on the future of SEO as anyone. SEO has a great future. (And by the way, if I didn’t believe in SEO I wouldn’t have agreed to acquire an SEO firm last year).

  • Google Penguin Update Hasn’t Been Refreshed Since October [Report]

    Search Engine Roundtable reports that it has confirmed with Google that no data refresh for the Penguin update has launched since the last reported refresh in October.

    Barry Schwartz writes, “Google has told us that Penguin is rarely refreshed, unlike Panda and we didn’t miss any Penguin refreshes since.”

    Apparently some people thought there may have been unreported refreshes, having misconstrued something Google’s John Mueller said in a recent Hangout.

    Panda, by contrast, is updated regularly and much more frequently. The last refresh came last month, as Google announced on Twitter.

    We’re still waiting for Google to release its (what used to be) monthly lists of algorithm changes (or “Search Quality Highlights”) for the past several months. They haven’t done that since October either.

    So far, Google hasn’t really pushed out any earth-shattering updates in 2013.

  • Google Talks About Phone Number Spam Again

    Nearly a year ago, Google’s Matt Cutts took to Google+ to discuss phone number spam.

    “I wanted to clarify a quick point: when people search for a phone number and land on a page like the one below, it’s not really useful and a bad user experience. Also, we do consider it to be keyword stuffing to put so many phone numbers on a page,” he said. “There are a few websites that provide value-add for some phone numbers, e.g. sites that let people discuss a specific phone number that keeps calling them over and over. But if a site stuffs a large number of numbers on its pages without substantial value-add, that can violate our guidelines, not to mention annoy users.”

    This is the image he was referring to:

    Phone Number Spam

    Today, Google released its latest Webmaster Help video, which features Cutts talking about the subject once again. It’s short and sweet, and basically serves as a reminder that Google will take action on this kind of thing:

  • SEO Factors To Consider When Choosing A Domain Name

    The old saying “a stitch in time saves nine” couldn’t be more applicable than when it comes to launching a new website. It pays to take the time to “make your list and check it twice.” Making the right choices before you launch a website can save a lot of time later.

    Obviously, one of the most important decisions you’ll need to make when launching a new site is your domain. Since about ⅔ of consumers use search engines to help make buying decisions, search engine traffic is critical to the success or failure of most websites. That makes SEO a common decision-making factor in choosing a domain.

    How will my domain name impact SEO?

    There are two primary ways your domain will impact your future SEO efforts and search rankings:

    • Keywords
    • Branding

    Let’s examine each of these in more detail.

    Keywords

    Historically, many SEOs chose domain names that included their target keyword phrases. For example, if you wanted to rank for the keyword green widgets, you might use a domain such as greenwidgets.net (exact match domain or EMD) or greenwidgetsshop.com (phrase match domain or PMD). The presence of the keyword phrase in the domain made it easier to gain a high ranking for that keyword phrase.
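    The EMD/PMD distinction can be made concrete with a short sketch. This is purely illustrative, using the article’s examples as the definition (greenwidgets.net as exact match, greenwidgetsshop.com as phrase match); the function name and matching rules are our own, not anything Google has published.

```python
# Illustrative classifier: exact-match (EMD), phrase-match (PMD),
# or non-keyword domain for a target keyword phrase. The categories
# mirror the article's examples; this is not an official definition.

def classify_domain(domain: str, keyword_phrase: str) -> str:
    # Strip the TLD and any hyphens, then compare against the
    # keyword phrase with spaces removed.
    name = domain.rsplit(".", 1)[0].replace("-", "").lower()
    phrase = keyword_phrase.replace(" ", "").lower()
    if name == phrase:
        return "EMD"  # domain is the phrase and nothing else
    if phrase in name:
        return "PMD"  # phrase plus extra words (e.g. "shop")
    return "non-keyword"

print(classify_domain("greenwidgets.net", "green widgets"))      # EMD
print(classify_domain("greenwidgetsshop.com", "green widgets"))  # PMD
print(classify_domain("acme.com", "green widgets"))              # non-keyword
```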

    With the introduction of recent algorithms such as the Penguin update and the Exact Match Domain (EMD) update, Google has changed how they view domains that include keywords. Is it still worthwhile to choose a domain that includes your target keyword phrase? Let’s look at the data.

    Should you choose a domain name with your keywords in it?

    Our recent Google’s EMD Update study found that after the Google EMD Update:

    • Average EMD site ranking decreased from #13.4 to #26.6
    • Average PMD site ranking decreased from #39.7 to #47.7

    Dr Pete also has some excellent data on EMDs in his article Are Exact-Match Domains (EMDs) in Decline?

    From this data, we can draw the conclusion that EMDs (and PMDs) no longer provide the same ranking boost that they used to. However, EMDs can, and in many cases do, still rank well. Our advice regarding keywords in your domain is:

    • If you already own an EMD or PMD, you don’t necessarily need to get rid of it
    • If you’re buying a new domain, an EMD or PMD isn’t necessarily bad, but branding factors are more important
    • If you can buy a domain that includes one or more of your keywords without sacrificing any branding considerations, that may be a good choice

    Branding

    It may interact with SEO in a less obvious way, but branding is actually the most important SEO consideration for purchasing a new domain. Your online brand (how people perceive and remember you) will directly impact your SEO efforts and results. Why? It’s simple:

    • “Brands are the solution, not the problem. Brands are how you sort out the cesspool.” ~Google CEO Eric Schmidt
    • Google likes brands, because users like brands. Which site would you rather read, link to, or share with your friends – NYtimes.com or your-ny-news-stuff.com ?

    See The Rise of Brands in Google’s Relevancy Algorithms.

    A strong online brand means users are more likely to click on, read, share, and link to a website…all of which will help the site gain higher Google rankings.

    Choosing a domain as the foundation of your online brand

    The first step in building a strong online brand is choosing a good domain. Choose a domain that is:

    • Memorable. You have no hope of building a brand if users can’t remember your name.
    • Unique. A generic sounding name, such as musicsite.com won’t have the same impact as a unique domain name.
    • Relevant. Some domains are industry-neutral, whereas others are clearly relevant to a specific industry (example: WebMD).
    • Not error-prone. For instance, a domain such as example.ws is a branding nightmare, because users will tend to type example.com instead. Delicious changed its domain name because so many users got confused by their non-standard domain.
    • Short. Most well-known online brands are 1-2 words or less. SEOmoz suggests sticking to a domain of 15 characters or less.

    Remember that your domain is just the start of building a brand – an essential step, but only the first step.

    Bonus Tip: Avoid Hyphens

    If mysite.com is taken, should you buy my-site.com? No. Here are 3 reasons to avoid hyphenated domains.
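    The mechanical criteria above (short, no hyphens, a standard TLD) can be rolled into a quick checklist script. A minimal sketch under stated assumptions: the 15-character limit comes from SEOmoz’s suggestion, while the function name and the exact rule set are our own illustration, not an official standard.

```python
# Checklist for a candidate domain, based on the criteria in the
# article: 15 characters or fewer, no hyphens, and a standard .com
# TLD to reduce the risk of typed-in traffic going astray.

def domain_warnings(domain: str) -> list[str]:
    warnings = []
    name, _, tld = domain.rpartition(".")
    if len(name) > 15:
        warnings.append("longer than 15 characters")
    if "-" in name:
        warnings.append("contains hyphens")
    if tld != "com":
        warnings.append("non-.com TLD risks sending typed-in traffic to the .com")
    return warnings

print(domain_warnings("my-site.ws"))  # flags hyphens and the TLD
print(domain_warnings("webmd.com"))   # passes all checks: []
```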

  • Google On How To Figure Out Which Links To Remove

    For the past year or so, webmasters have been receiving a great many messages from Google about unnatural links pointing to their sites. You may know exactly which links Google doesn’t like, but there’s also a good chance you may not.

    As we’ve seen, a lot of people have gone on link removal request rampages, greatly overreacting, and seeking the takedown of legitimate links out of fear that Google might not like them.

    In the latest Webmaster Help video, Google’s Matt Cutts discusses how to figure out which links to get removed. The video is a response to this user-submitted question:

    Google Webmaster Tools says I have “unnatural links,” but gives little help as to which specific links are bad. Since I have never purchased links, I don’t know which ones to have removed, and I’m scared of removing good ones, which will hurt my traffic. Suggestions?

    “We’ve tried to become more transparent, and when we were saying, ‘Links were affecting the reputation of an entire site,’ we would tell people about that,” says Cutts. “And more recently we’ve been telling people, and opening up and saying, ‘Hey, we still like your site. Your site, overall, might be good, but maybe there’s some individual links to your site that we don’t trust.’ Now, the problem is that we weren’t, at that time, giving specific examples. So one feature that we rolled out is the ability to sort by recent discovery of links, so you can actually get the date of when we discovered a link. So if you sort that way, you can look for the recent links. But a feature that we are working on – we are in the process of rolling out – is that we will actually – we will basically give you examples.”

    “So it’s a…you know, as we’re building the incident whenever a webmaster analyst or something like that is saying, ‘Okay, these are links not to trust,’ they’ll include an example link,” continues Cutts. “You might get one, you might get two, you might get three, depending, but basically it will give you an idea of the sorts of links that we are no longer trusting. Now, it’s not exhaustive. It’s not comprehensive, but it should give you a flavor, you know. Is it a bunch of widget links? Were you doing a bunch of keyword-rich anchor text in article bank or article marketing type stuff? Maybe you weren’t trying to do paid links, but maybe you hired an agency, and it turns out they were doing paid links, and you didn’t realize it.”

    “I would look in the text of the messages,” he concludes. “Over time, we’re working really hard on trying to include an example link or two, so that when you get that message, you have an idea of exactly where to look.”

  • Eric Schmidt Ties Search Rankings To Verified Profiles In Upcoming Book

    It’s become pretty obvious that Google is looking to put user identity at the forefront of a number of its products, and that includes search. Nothing new there.

    Eric Schmidt apparently talks about this in his upcoming book, and notes flat out that profile verification will be directly tied to search engine rankings. The Wall Street Journal has a few quotes from the book (hat tip to Search Engine Watch), including this one:

    “Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.”

    Seems pretty clear cut. Google, as you may know, has been pushing authorship for quite some time now, and Google’s Matt Cutts recently made comments indicating that this will be a much more significant signal going forward. Here are some quotes from a webmaster hangout he participated in:

    “In the short term, we’re still going to have to study and see how good the signal is, so right now, there’s not really a direct effect where if you have a lot of +1s, you’ll rank higher. But there are things like, we have an authorship proposal, where you can use nice standards to markup your webpage, and you’ll actually see a picture of the author right there, and it turns out that if you see a picture of the author, sometimes you’ll have higher click through, and people will say, ‘oh, that looks like a trusted resource.’ So there are ways that you can participate and sort of get ready for the longer term trend of getting to know not just that something was said, but who said it and how reputable they were.”

    “I think if you look further out in the future and look at something that we call social signals or authorship or whatever you want to call it, in ten years, I think knowing that a really reputable guy – if Dan has written an article, whether it’s a comment on a forum or on a blog – I would still want to see that. So that’s the long-term trend.”

    “It’s just the case that that picture is just more likely to attract attention. It’s just a little more likely to get the clicks, and you know, it’s almost like an indicator of trust.”

    “The idea is you want to have something that everybody can participate in and just make these sort of links, and then over time, as we start to learn more about who the high quality authors are, you could imagine that starting to affect rankings.”

    Schmidt’s book, which he co-authored with Jared Cohen, comes in April. It looks like he just gave SEOs and webmasters a reason to read it.

  • Google: New Image Search Increases Clicks To Sites

    Google announced the launch of a new format for its Image Search. If you’re not already seeing it, you should within the next few days as it continues to roll out.

    The new design displays the image’s metadata below the image in the search results, rather than taking users to a separate landing page. Frankly, this is probably something Google should have done a long time ago. Additionally, the title of the page, the domain name, and the image size are featured more prominently.

    “People looking for images on Google often want to browse through many images, looking both at the images and their metadata (detailed information about the images). Based on feedback from both users and webmasters, we redesigned Google Images to provide a better search experience,” says associate product manager Hongyi Li. “In the next few days, you’ll see image results displayed in an inline panel so it’s faster, more beautiful, and more reliable. You will be able to quickly flip through a set of images by using the keyboard. If you want to go back to browsing other search results, just scroll down and pick up right where you left off.”

    According to Google, the changes will benefit webmasters, as the company claims that in internal testing, it has seen a net increase in the average click-through rate to the hosting site. This is attributed to the domain name now being clickable and a new button for visiting the page the image is hosted on.

    “This means that there are now four clickable targets to the source page instead of just two,” says Li.

    Source pages will no longer load an iFrame in the background of the image detail view, which makes the image-browsing experience faster. Again, Google probably should have done this a long time ago, especially considering the emphasis they’ve been putting on speeding up the user experience in recent years. Google says the move reduces the load on the source site’s servers, and improves the accuracy of metrics like page views.

    Image search query data will continue to be available in Top Search Queries in Webmaster Tools.

  • SEOmoz’s Acquisition Spree Culminates With AudienceWise

    SEOmoz announced that it has acquired AudienceWise, as the latest in a spree of acquisitions kicked off by an $18 million round of funding the company received last year from The Foundry Group and Ignition Partners. Other acquisitions have included Followerwonk and GetListed.

    “AudienceWise marks the culmination of our acquisition spree for this year,” SEOmoz CEO Rand Fishkin tells WebProNews. “Moving forward in 2013, our focus will be on accelerating our product roadmap and leveraging all the resources we’ve acquired over the last five months. The remainder of 2013 will be dedicated to putting our money where our mouth is: investing in our foundations, and ramping up our product offerings to SEOmoz’s 20,000+ users and our 300,000+ online community members.”

    AudienceWise was a company specializing in audience development consulting for publishers and e-commerce sites, but its team will now be integrated into SEOmoz’s as a talent acquisition, though SEOmoz does say it will incorporate some of AudienceWise’s technical processes, strategy and products into the Moz toolkit.

    “The main motivators for acquiring AudienceWise are the brains behind the operation, Matthew Brown and Tim Resnik,” says Fishkin. “Matt and Tim are joining Moz to help us scale our in-house marketing and grow our product expertise. Both have built software products in the past (Matt worked with Marshall on SearchCLU, Tim on an online poker subscription service) and have tremendous depth of knowledge in the fields of both inbound and paid marketing.”

    “We have a lot of phenomenal talent at SEOmoz, but only a few of us are deep into the fields of SEO, social media, content marketing, email, CRO, etc.,” he adds. “Matt and Tim are here to help serve as mentors and as internal consultant experts to our entire team, a role that I’ve been far too busy to fill effectively over the last 18 months.”

    From the sound of it, SEOmoz users have as much to gain from the deal as SEOmoz does.

    “We believe that SEOmoz subscribers will benefit immensely from Matt and Tim’s expertise, which will improve how we build our products,” says Fishkin. “Tim is a Big Data junkie. He understands the enormous potential of both structured and unstructured data, as well as the challenges that come with harnessing and leveraging it across all segments of a business.”

    “We know that Big Data matters to our users, and Tim’s vision will take us to the next level,” he continues. “Matt has worked in some of the toughest search marketing gigs, including The New York Times and at various Fortune 500 companies. Arguably, there is no tougher search marketing gig than publishing, where you live or die by clicks, and the competition grows daily. With this background, Matt knows what it takes to drive successful search marketing. His brain and willpower are going to help evolve SEOmoz’s product roadmap to meet the needs of our ever-growing user base.”

    Exact terms of the deal were not disclosed, but the price was somewhere in the low seven-figure area.

  • Google: Panda Update Data Refresh Rolling Out Now

    Google tweeted today that a new data refresh for the Panda update is currently rolling out in English, and that it affects 1.2% of queries.

    Lately, Google seems to be launching these Panda refreshes at least once a month. There was one in December, a few days before Christmas, and two in November.

    Google launched Panda in early 2011, then Penguin in the spring of 2012. We’re still waiting to see what (if any) major algorithm update the search engine will implement in 2013.

    More Panda coverage here.

    Image: Panda Express

  • What Does Facebook Graph Search Mean For SEO?

    Facebook has dominated the conversation in the tech world this week (for several reasons), but especially because of its unveiling of Graph Search. We’ve been waiting for years for Facebook to “get into search” and “take on Google,” and we appear to have the company’s first real attempt at doing so.

    Do you think Facebook has a legitimate shot at cutting into Google’s share of the search market? Let us know in the comments.

    It is very clear that Graph Search is not going to instantly come out and reduce Google’s piece of the search pie very significantly. It’s in very early beta and limited preview. Facebook says it is rolling out slowly, and many who have already signed up to be part of the preview are still waiting for a chance to actually use it. The company knows it has a whole lot of work to do on this product. It’s starting off by focusing on four main areas of search: people, photos, places and interests. These are four major things, but there is so much more that Facebook could (and will) do. Facebook posts and open graph actions will be added in the coming months, according to Mark Zuckerberg. Mobile will eventually be added as well. So will Instagram, and probably plenty of other things in time.

    In other words, it’s not so much about what Facebook has unveiled, as what Graph Search could evolve into. Could it evolve into a Google killer? Probably not, but who can say for sure? The reality is that it doesn’t have to be a Google killer to be successful, and a useful tool for Facebook users. More time spent on Facebook (especially time spent using search on Facebook) has the potential to draw away some amount of ad spend from Google to Facebook, which really could hurt Google to some extent.

    Facebook has a legitimate shot at being a real player in search because, for one, it has over a billion users already, and for two, because it can provide answers that Google can’t. There is plenty of room for Facebook Graph Search to flourish with or without Google dominating traditional web search, because Graph Search is not traditional web search. In fact, one of the first things Zuckerberg said when he introduced the product on Tuesday, was that it is “not web search”.

    Facebook does utilize its partnership with Bing to add the web search element, and as Liz Gannes at All Things D writes, Graph Search should only help Google’s case for increased competition in search when it comes to antitrust scrutiny.

    Some have dismissed the offering as “not a big deal”. I’m not so sure I agree with that. Either way, we at least owe it to Facebook to let the product show us what it can do before rushing to snap judgments. Give users a chance to figure out what they can do with it. Give Facebook a chance to move it forward out of beta, and add the stuff it really wants to add.

    Privacy

    Privacy concerns generally come attached to any major Facebook product launch. The controversy the company has drawn in the past with regards to privacy doesn’t help perception. Still, privacy was a major point of discussion by Facebook as it unveiled Graph Search. In fact, they released a video with some privacy tips just as they announced the product.

    The fact of the matter (at least according to what Facebook is telling us) is that users will only be able to see things on Facebook that they already could. The only thing that changes is that users have a new way to discover these things. Still, that could be enough to make some users feel uneasy, which is why Facebook recommends checking out how you’re already sharing your data. Indeed, if you haven’t perused your privacy settings lately, you might want to take a look and make sure you’re comfortable with them.

    Facebook SEO

    Okay, now let’s get to the business side of things. Graph Search may just present businesses with some great new opportunities to get in front of users on Facebook, a feat that has become increasingly challenging as Facebook has tinkered with the way it displays updates from Pages in the News Feed. With Graph Search comes a whole new area of search engine optimization. Whereas optimizing for Bing might be pretty similar to optimizing for Google, optimizing for Facebook’s Graph Search is bound to be an entirely different beast.

    For one, optimizing for Graph Search is not about optimizing a web page (although it might make your Bing rankings of greater concern).

    Facebook has already shared some optimization tips for businesses. “The search bar first returns the top search suggestions, including people, Pages, apps, places, groups, and suggested searches,” the company explains. “People can search for things like restaurants near them, hotels in places they want to travel to, photos posted by Pages they like, or games that their friends like to play.”

    “These search suggestions take people to a unique results page,” it adds. “The results returned are based on factors that include information that has been shared by your business and the connections of the person searching.”

    Facebook will also make suggestions in the search bar, and will display Bing results (and ads) for web searches. Pages and apps will continue to be able to use sponsored results. These will continue to appear whether or not the user has Graph Search yet.

    Here are the specific tips Facebook offered for “making sure your Page is complete and up-to-date”.

    • The name, category, vanity URL, and information you share in the “About” section all help people find your business and should be shared on Facebook.
    • If you have a location or a local place Page, update your address to make sure you can appear as a result when someone is searching for a specific location.
    • Focus on attracting the right fans to your Page and on giving your fans a reason to interact with your content on an ongoing basis.
    • You can learn more about fan acquisition and Page publishing best practices here.

    You may also want to consider going through Facebook’s “Managing A Page” help section, which covers: getting started, accessing your page, settings/general administration (editing, notifications, managing admins, usernames/page addresses, claiming/merging duplicate pages), customizing how it looks, growing your audience (best practices/reaching more people), private messages, apps, using your page on mobile, policy questions, page insights (analytics), page admin privacy, bugs/known issues, and posting/moderating posts by others (page posts, offers for page admins, translating page posts, moderating what people post on your page, and photos/events/links).

    It’s hard to know this early in the game how businesses will best be able to use Graph Search for increased visibility, but you can rest assured, people will be trying to take advantage. It will be interesting to see just how gameable the system is. Facebook is likely going to have to take on this issue with its own set of “quality guidelines” the way Google does, which will enable it to manually (and algorithmically) penalize Pages that are in violation.

    Facebook does already have a business resource site here.

    Facebook also notes that app developers have a lot to gain from Graph Search. The company says, “Apps are now more discoverable on Facebook with Graph Search. In addition to showing up in search results based on your app’s name, they can show up in search results based on criteria like ‘strategy games my friends play’ or ‘apps my friends who live in San Francisco use.’ To optimize your app for Graph Search, please make sure your app details are up-to-date and that your app is properly categorized.”

    Potential Relevancy Problems

    You know how paid links are a problem for Google optimization? It will be interesting to see how “like buying” fits into the Facebook picture as it pertains to ranking in Graph Search.

    Steve Cheney has an interesting blog post on how businesses may be able to influence the results based on how much of their marketing budgets they put toward fan acquisition.

    “For the past several years big advertisers on FB have actually been directing massive amounts of paid media to acquire fans. They quite literally bought likes,” he writes. “Why? Early on FB made the case to brands that they must have fans… together with the ad agencies they convinced the Cokes of the world to spend money to be competitive (hey Pepsi is here too). Then, FB promised, something miraculous would happen. Your friends would see in their news feed you liked Coke!”

    “So… FB convinced big advertisers to spend huge sums on CPA-like ad units whose sole purpose was to acquire fans. Ad agencies dedicated creative, planning and strategy resources to get the Cokes and American Expresses of the world to pay to have users click—almost 100% of the time because the user was promised some sweepstake or contest,” he continues. “Recall back to all the past campaigns you’ve ignored where you could ‘like to enter’ or ‘like to qualify’. They are literally everywhere and are always tied to fan acquisition.”

    He goes on to note that over the past several years, American Express, for example, has spent about half of its ad spend on buying likes, which he says equates to tens of millions of dollars, and that “across the board, big advertisers were told to spend 50% of their ad buy solely on fan acquisition.”

    Of course, outside of these massive marketing budgets, pages give users deals for “liking” them on Facebook all the time. These “likes” are not necessarily genuine endorsements of a business, and that could dilute the relevancy of search results for users actually looking for useful information to help them make decisions.

    Another potential relevancy problem is that people change their minds. Just because you liked something two years ago (or longer) does not mean it represents your current opinion. People get older and grow up. They have bad experiences with businesses that they used to like. They’re not always going to go back and unlike things. It’s simply not going to occur to everyone. For that matter, Facebook has already buried so many updates from pages in the news feed, users are no doubt forgetting that they ever even liked some pages to begin with, since they never see updates from them.

    Then there’s the fact that a lot of Facebook users aren’t taking the time to “like” everything they actually like.

    “Consider me,” writes Danny Sullivan at Search Engine Land. “Not only have I not liked my electrician, my plumber, my dentist, my doctor or my tax person on Facebook, but I don’t even know if they have Facebook pages. I have nothing to offer to my Facebook friends in this regard. Similarly, despite the huge number of books I read through my Kindle, I never go to like those books on Facebook, so books I love are more or less invisible on Facebook.”

    Social Signals

    Despite all of this, there’s no question that Facebook has the strongest social signals of any service on the web. It has the volume, and it has the close, personal connections. It has your family and the people you have known all your life. It has your co-workers, your old friends from all levels of school, and it has the people in your town. It also has the people on the other side of the world. If social signals are ever to be important to the relevance of search results on a mass scale, wouldn’t they have to be Facebook’s?

    We had a conversation with blekko CEO Rich Skrenta about Graph Search and social signals. He tells us social signals are “critical” for search relevance.

    “PageRank originally measured the web’s primary social signal — links,” he says. “Facebook has even better social data which would be great for ranking recommendations. And they could be personalized to you, based on your friends.”

    “Facebook Graph Search addresses a completely new class of searches that you can’t do today on Google,” he says.

    “Another difference is the layers of searching or refinement that Facebook Search offers compared to Google,” writes Sullivan. “For example, a Google search can show you restaurants in San Francisco, a pretty much single dimensional view.

    “A Facebook search can show you restaurants in San Francisco liked by your friends,” he adds. “Or further, those liked by your friends who actually live in San Francisco, as opposed to those who live elsewhere. Or those liked by your single friends, your straight friends, your gay friends, your friends who work for a particular company.”
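
    Sullivan’s “layers of refinement” can be pictured as successive filters over structured “like” data. The sketch below is purely illustrative — the restaurants, friend names, and filter logic are all invented for the example, and Facebook’s actual ranking is not public:

```python
# Illustrative sketch only: Graph Search-style layered refinement modeled
# as successive filters over made-up "like" data (not Facebook's API).
restaurants = [
    {"name": "Taqueria A", "city": "San Francisco", "liked_by": {"ana", "bo"}},
    {"name": "Bistro B", "city": "San Francisco", "liked_by": {"cy"}},
    {"name": "Diner C", "city": "Oakland", "liked_by": {"ana"}},
]
friends = {"ana", "bo"}       # the searcher's friends
friends_in_sf = {"ana"}       # the subset of friends who live in San Francisco

# Layer 1: "restaurants in San Francisco"
in_sf = [r for r in restaurants if r["city"] == "San Francisco"]

# Layer 2: "... liked by my friends"
liked_by_friends = [r for r in in_sf if r["liked_by"] & friends]

# Layer 3: "... liked by my friends who live in San Francisco"
liked_by_sf_friends = [r for r in liked_by_friends if r["liked_by"] & friends_in_sf]

print([r["name"] for r in liked_by_sf_friends])  # ['Taqueria A']
```

    Each layer narrows the previous result set, which is the key structural difference from a single-shot keyword query.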

    Wrapping Up

    Clearly Facebook has a lot of challenges ahead of it if it wishes to be a serious player in search, and while Graph Search could pose some threat to Google, not just as a search destination but as a more complete web experience, it has a long way to go. Even still, given the amount of data Facebook has at its disposal, along with the engineering talent behind the offering, led by former Googlers, it’s hard to imagine Graph Search won’t be at least an important tool for Facebook itself, if not a bigger deal for the web as a whole.

    Do you think Graph Search is a big deal? A threat to Google? A useful Facebook tool? An important tool for businesses? Let us know in the comments.

  • With Facebook Graph Search Comes Graph Search Optimization

    Facebook unveiled its much anticipated search product on Tuesday with Graph Search. This could mean big things for businesses who have pages on Facebook. In fact, Facebook has already gone out of its way to offer business owners some tips to make sure their business pages are “complete and up-to-date,” which is essentially to say, optimized for Graph Search.

    We may have a whole new area of SEO to consider going forward. The easier it is for users to find your Page in relevant search situations, the better it is for Facebook and the success of its new product, so it’s easy to see why Facebook wants business owners to get on the ball.

    “The search bar first returns the top search suggestions, including people, Pages, apps, places, groups, and suggested searches,” Facebook explains. “People can search for things like restaurants near them, hotels in places they want to travel to, photos posted by Pages they like, or games that their friends like to play.”

    “These search suggestions take people to a unique results page,” the company adds. “The results returned are based on factors that include information that has been shared by your business and the connections of the person searching.”

    Facebook will also make suggestions in the search bar, and will display Bing results (and ads) for web searches. Pages and apps will continue to be able to use sponsored results. These will continue to appear whether or not the user has Graph Search yet.

    Here are the specific tips Facebook is recommending on its Studio blog:

  • The name, category, vanity URL, and information you share in the “About” section all help people find your business and should be shared on Facebook.
  • If you have a location or a local place Page, update your address to make sure you can appear as a result when someone is searching for a specific location.
  • Focus on attracting the right fans to your Page and on giving your fans a reason to interact with your content on an ongoing basis.
  • You can learn more about fan acquisition and Page publishing best practices here.
  • Graph Search is in limited beta, and will be rolling out pretty slowly from the sound of it. It’s also starting off in English only. It might be a good time to associate as much information with your Page as possible to get ready for an influx of searches on Facebook.

    We’ll see what Facebook does on the advertising front in time, no doubt.

  • Googler Talks About Not Reindexing Pages

    Ever had a problem with Google indexing your pages? If there’s no real content on them, then that’s probably why.

    Barry Schwartz at Search Engine Roundtable points to a Google Webmaster Central help thread, where one person says they had over 2,000 pages indexed on Webmaster Tools, but that the number went down to 60, and eventually back up to 116.

    “When I google: ‘site:www.gamez4you.com’ I see all the indexed pages correctly,” the webmaster says. “I suspect that is the reason my Adsense CPC went down by 60-70%.”

    Google’s Gary Illyes responded, indicating that it was a lack of content on the pages that was the problem. Here’s his full response:

    As we improve our algorithms, they may decide to not reindex pages that are likely to be not useful for the users. I took a look on the pages that were once indexed but currently aren’t and it appears there are quite a few that have no real content; for example http://www.gamez4you.com/car-games/play-crazy-mustang seems to be a soft error page, which means that even though it comes back with 200 HTTP status code, it is in fact an error page that shouldn’t be indexed. Another example would be http://www.gamez4you.com/all-games/page-71 which seems to be an empty page.

    Another reason you may see the number of pages dropping in the Webmaster Tools’ Sitemaps module is that your Sitemap is referencing URLs that are not canonical. For example in your Sitemap you reference
    however the currently canonical URL of that particular URL is
    To fix the indexed URLs count, I would recommend fixing canonicalization of the URLs on your site and to set your server to return proper status codes (e.g. 404) for inexistent URLs. You can read more about canonicalization at http://support.google.com/webmasters/bin/answer.py?answer=139394 and about soft error pages at http://support.google.com/webmasters/bin/answer.py?answer=181708
    Hope this helps

    Another webmaster in the thread indicates that he ran into the same issue, with his Sitemap referencing URLs that aren’t canonical.
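
    A “soft 404,” as Illyes describes it, is a page that answers with HTTP 200 but carries error-page or empty content; the fix is to make such URLs return a real 404. The heuristic below is a hypothetical sketch — the marker phrases are assumptions for illustration, not Google’s actual detection logic:

```python
def looks_like_soft_404(status_code, body):
    """Heuristic sketch: flag pages that answer 200 OK but have no real
    content, which search engines may treat as soft error pages."""
    if status_code != 200:
        return False  # a real error status is already correct
    text = body.strip().lower()
    if not text:
        return True   # an empty page served with 200
    error_markers = ("page not found", "no results found", "does not exist")
    return any(marker in text for marker in error_markers)

print(looks_like_soft_404(200, ""))                     # True: empty 200 page
print(looks_like_soft_404(200, "Page not found."))      # True: error text with 200
print(looks_like_soft_404(404, "Page not found."))      # False: proper 404
print(looks_like_soft_404(200, "Play Crazy Mustang!"))  # False: real content
```

    Running a check like this against the URLs in a Sitemap is one way to catch the pages Illyes describes before Google drops them from the index.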

  • Fear Of Google Has People Asking StumbleUpon To Remove Links. Really.

    As we’ve discussed in several articles over the months, a lot of webmasters have been going overboard with link removal requests in the wake of Google’s Penguin update and link warning emails. People are trying to get links removed that they shouldn’t be – links that very likely are not hurting them in Google. It’s essentially the SEO equivalent of killing a bug with a rocket launcher.

    Danny Sullivan posted an article today about how StumbleUpon is even getting these kinds of requests. Given the amount of traffic StumbleUpon can send to websites itself, completely independent of Google, it’s hard to express how ridiculous this is.

    StumbleUpon confirmed with us that they do indeed get these messages from webmasters. The company tells WebProNews, “We typically receive a few of these requests a week. We evaluate the links based on quality and if they don’t meet our user experience criteria we take them down. Since we drive a lot of traffic to sites all over the Web, we encourage all publishers to keep and add quality links to StumbleUpon. Our community votes on the content they like and don’t like so the best content is stumbled and shared more often while the less popular content is naturally seen less frequently.”

    As Sullivan says, asking StumbleUpon to remove your link is a waste of time. That’s the type of link you actually do want.

    If Google has a problem with links you’re getting from StumbleUpon (which it seems unlikely that they would), you’re probably better off with the StumbleUpon referrals anyway. That said, you shouldn’t be spamming StumbleUpon, and should be creating quality content that SU users would enjoy (and there are a ton of very niche categories). StumbleUpon won’t hesitate to ban you if you’re abusing its service.

  • Google Talks Site Verification In New Video

    Google has released a new Webmaster Help video. This one comes from Maile Ohye from Google’s webmaster support team, who talks about verifying ownership of your site in Webmaster Tools. The goal of the video is to help webmasters choose the verification method that is easiest for them.

    Possibilities include: domain name provider, HTML file upload, HTML meta tag and Google Analytics.

    “Verifying ownership of your site in Webmaster Tools provides you and Google a secure channel for giving and receiving information,” says Ohye. “For example, Google can show you more confidential information, such as the search queries that bring visitors to your site, and by verifying ownership, you’ll have privileges to do things like adjust targeting settings to associate your site with the audience of a particular country.”

    “Verification doesn’t affect your site’s performance in Google search results,” she notes.
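
    Of the methods Ohye lists, the HTML meta tag takes the form of a “google-site-verification” meta element in the page’s head. Here’s a small sketch, using Python’s standard-library parser, of how the presence of such a tag could be checked; the page markup and token value here are made up for illustration:

```python
from html.parser import HTMLParser

class VerificationTagFinder(HTMLParser):
    """Scans HTML for a <meta name="google-site-verification"> tag."""
    def __init__(self):
        super().__init__()
        self.token = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attr = dict(attrs)
            if attr.get("name") == "google-site-verification":
                self.token = attr.get("content")

# Hypothetical page source; the token value is invented.
page = ('<html><head>'
        '<meta name="google-site-verification" content="abc123XYZ"/>'
        '</head><body></body></html>')
finder = VerificationTagFinder()
finder.feed(page)
print(finder.token)  # abc123XYZ
```

    Webmaster Tools does essentially this kind of check when you click “Verify”: it fetches your homepage and looks for the token it assigned to you.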

  • Matt Cutts Gives Webmasters Extended Dinosaur Impersonation For Christmas

    In case you have been enjoying the holidays, and not worrying about Google and Matt Cutts for a week or two, you may have missed the latest “Webmaster Help” video, which the company posted to YouTube on Christmas.

    If you regularly watch these videos, this is one you don’t want to miss. You saw Cutts act like a dinosaur once before. This time, he really goes all out.

    More Matt Cutts videos here.

  • How To Rank #1 In Google: The Best Of Matt Cutts 2012

    Google’s Matt Cutts has been putting out Webmaster Help videos for years now, answering questions from webmasters who want to learn more about optimizing for Google search results. Sometimes these videos are really helpful. Sometimes, perhaps, not as much. Either way, you’re not going to get much better advice about what Google is looking for from sites than you’ll get from Google itself.

    Following are some of the best Matt Cutts videos of 2012. Most are from the Webmaster Help videos, but there are a few others sprinkled in. You may have seen some or all of them. Maybe you’ll come across some you missed.

    We’ll start with the best: this parody video of Cutts on how to rank #1 in Google:

    The rest are real.

    Did the last Google Dance come with update BART or Fritz? (A Google History Lesson)

    How much time should I spend on meta tags, and which ones matter?

    How will Google treat new TLDs?

    Don’t block Googlebot from crawling JavaScript and CSS

    Should I disallow Googlebot from crawling slower pages?

    Matt Cutts At 2012 Korea Webmaster Conference

    How does Google search work?

    How effective is Google now at handling content supplied via Ajax?

    How does Google view font replacement?

    What happens if I link to a good page that later becomes spammy?

    What should I do if my competitors are using webspam techniques?

    A Hangout With Matt Cutts

    What hardware and software powers Googlebot?

    Is freshness an important signal for all sites?

    What is Google’s view on guest blogging for links?

    Does the use of schema.org markup create a ranking benefit?

    Disavow Links

    What is Google’s thinking about links from article marketing, widgets, etc?

    If I haven’t been participating in link schemes, do I need to worry about my links?

    Why doesn’t Google release an SEO quality check up calculator?

    Are reconsideration requests read by real people?

    Do human “quality raters” influence which sites are impacted by Panda?

    Should I structure my site using subdomains or subdirectories?

    Why does Google shut down products?

    Should I keep a domain parked without content before I launch the website?

    How does Google consider site-wide backlinks?

    How will Google interpret links to URLs ending with a campaign tag?

    How long does a reconsideration request take to process?

    How many messages did Google send about unnatural links?

  • Matt Cutts Is Surprised People Are Still Being Duped By “Dog Fart Jr.”


    Google has put out a new Webmaster Help video. This time, Matt Cutts shares his biggest surprise of the year for the web spam team.

    “I would probably say the sheer number of people who continued to be snookered by snakeoil salesman products…that promise to instantly rocket you to number one,” says Cutts. “I would expect that in 2012, people would be a little bit more skeptical about that, especially after we do a pretty good job of finding various link networks, different ways of spamming Google – all that sort of stuff.”

    “A lot of the times, black hat guys will use a particular technique, and when it looks like it’s reaching the end of its lifespan, they’ll package it up, they’ll sell it to you in an ebook, or they’ll sell it to you as a script package, or a recurring service, or a link network that you can subscribe to,” he says. “And I’m like, I don’t get why people would believe this, and they’re like, ‘Well, there’s two other guys on the forum who say that it’s great,’ and it’s like, well, they’re named Dog Fart Jr. and Black Hat Assassin, and the package that you’re thinking about buying is spam forum software. You think those guys don’t know how to make sock puppet accounts that say, ‘Yeah, this is great, has anybody else used it?…I love it!’”

    Google sure does love the ol’ “Dog Fart” reference.

  • Former Googler Talks About Google’s Changes And Controversies

    Vanessa Fox is always good for an interesting conversation about Google. Considering she used to work there, created Webmaster Central, and wrote the book Marketing In The Age of Google, it makes sense.

    What has been the most controversial thing Google has done this year in your opinion? Let us know in the comments.

    We recently caught up with Fox, who shared some thoughts on a number of Google-related topics, such as how Google+ is impacting search, the quality of Google’s search results, the direction Google is going in, areas in which Google needs to improve, Bing, Google Shopping’s paid inclusion model, antitrust concerns, and the mistakes marketers are making.

    Has Google+ Made Search Results Better For Users?

    Google makes it more and more clear all the time that it’s in this Google+ thing for the long haul. Last week, the company pointed out that Google+ is the next phase of Google. It’s getting integrated into just about every aspect of what Google has to offer in one way or another. Ever since Google launched Search Plus Your World, its social search integration, which relies heavily on Google+, the results have brought about mixed reactions from users.

    “Google has experimented with how to bring Google+ into search results and some of the experiments have been pulled back, with good reason,” Fox tells WebProNews. “‘Better for users’ means getting the searcher exactly the information that they’re looking for. Does Google+ integrated into search results provide that? Maybe sometimes; often not. I think Google knows that the online world has gone social and they need to evolve to take that into account. I don’t think they’ve hit on exactly the right formula quite yet.”

    When asked whether this brand of social search has been a blessing or a curse for search marketing, Fox says, “I don’t look at search marketing in a silo; I think of how to better understand and engage with audiences overall. Online audiences are social and so it’s always a good idea to meet them where they are. I think that Google+ specifically being integrated into search results does give marketers a new opportunity for visibility on the search results page, so it shouldn’t be ignored.”

    The Impact Of Google’s Changes

    We asked Fox what she thinks has been the most significant event or feature release of the year, in terms of the impact it has on white hat SEO/search marketing.

    “I think the continued evolution of Google’s Panda algorithm and the other strides they are making to move beyond counting links to really getting to the heart of content quality and utility are fantastic,” she says. “The more loopholes are closed and the less the algorithms can be manipulated, the more the best quality sites win.”

    “I also see Google making significant advances in supporting structured data, which is a great foundation for the future,” she adds. “New ways of interacting with content, such as Google Now and Google Glass are significant as well, because they start to move us beyond the keyboard and beyond explicit search. HTML5 and Google’s support of responsive design techniques make it easier to build content once and supply it to audiences on any device.”

    Search Visibility And Google’s Evolution

    As Google continues to get smarter at giving users information they seek, whether that be through the Knowledge Graph, Quick Answers, Google Now, or anything else, marketers have a lot to adjust to.

    “It’s definitely getting harder to track,” says Fox. “One of the primary reasons I’m building a search analytics software (called Blueprint) is to provide a foundation that enables marketers to better understand their audiences and better measure ROI on search-related investments. I think in some ways, search visibility is becoming easier. We now have so many more ways to connect with audiences and so much more data to better know what they’re looking for so we can meet their needs. Smart marketers will take advantage of this data and these new opportunities.”

    When asked if there is any particular area of focus where she thinks Google definitely needs to do a better job, she says, “The hard problem of discerning quality and utility is definitely not solved yet. I think Google has been doing a great job of starting to pull in all kinds of data sources (images, mobile, structured data…) and is evolving as devices and searcher behavior change (Google Now, for instance), but their biggest obstacle is also their oldest one: how do you surface the more useful results to the top and cut through the clutter of an ever-increasing web?”

    Bing And Scroogling

    We asked how Bing is stacking up to Google these days in her opinion, in light of all of Bing’s campaigns against Google of late.

    “Well, funny you should say that because I finally took the Bing challenge yesterday and I picked Google (4 out of 5 times),” she tells us. “But honestly, Bing is doing a pretty good job.”

    Bing would probably be happier with her answer to the next question, which was: Has Google made any product changes (non algorithmic) that you think they should not have made?

    “I’m not super happy about the shift to paid placement in product search,” she says. “I can see the rationale of why they did it, but doesn’t reflect the stated mission all that well.”

    Bing has a “Don’t Get Scroogled” campaign based on this.

    Competition And Antitrust Concerns

    We’ve written a number of times about how Google is lacking real time search, and potentially sending information seekers to Twitter. A recent NYT report talked about more people starting their product searches at Amazon rather than Google. We asked Fox what kinds of information she is turning to non Google sites and services for.

    “I primarily use Urban Spoon for restaurant searches,” she says. “And Oyster or Tablet Hotels for hotel searches. Although I then will search Google for hotel reviews but will specifically click on Tripadvisor results, rather than search Tripadvisor directly. No, I can’t explain this behavior.”

    As you may know, many companies have voiced dismay with Google’s business practices with regards to competition. We asked Fox if she believes any of the antitrust concerns are legitimate.

    “I can only speak to those that relate to unpaid search,” she says. “But no, I don’t. Google’s search quality team is maniacally focused on including as much of the web in the index as they possibly can and then surfacing the most useful results to searchers. The complaints I’ve seen tend to be based on a misunderstanding of how search works (from those outside of the online world) or misplaced fears about Google’s goals (from those inside).”

    What’s Wrong With Your Strategy

    Finally, on common pitfalls she sees in organizations’ marketing strategies, Fox tells us, “I see a lot of algorithm chasing, which is such a short term game. I also see a lot of fragmented organizations — the web developers don’t think SEO is important and don’t know best practices of making sure the site is search-friendly; the user experience team thinks that SEO is spam and has no idea that insight can be gained from search data; and so on. And I see a lot of tactics with no strategy. Organizations need a plan, they need to prioritize changes based on impact. There’s ideal online marketing and then there’s practical online marketing. Practical should win every time.”

    Do you agree with Vanessa on these various points? Which don’t you agree with? Let us know.

  • Google On Reconsideration Requests: Tell Us About The Link Network Or The SEO

    Google put out a new Webmaster Help video today. This time Matt Cutts talks about what to include in a reconsideration request, which you may have to submit if Google has caught you violating its quality guidelines.

    “The goal of your reconsideration request is to, number one, to tell Google you’ve stopped whatever violations of the quality guidelines were happening – paid links, cloaking, scraping, you know, doorways – whatever it was, you need to make a clear and compelling case that that has actually stopped,” says Cutts. “That behavior is no longer going on, and that you’ve cured that as much as possible. So, if you were doing paid links, you’ve gotten as many of those paid links pulled down as you possibly can.”

    “The second aspect of a reconsideration request is to basically give us a good faith assurance that it won’t happen again,” he continues. “You don’t want to say, ‘Oh, well this site looks like it’s reformed, okay, we’re going to lift this manual action,’ and then they immediately go back to spamming or doing their old tricks. So, what you want to do is step into Google’s shoes, and say, ‘Okay, what would best convince Google that we’ve turned the corner, and this behavior has stopped, and that we’ve cured whatever was going on, and it’s not going to happen again.’”

    “Great things to include: things like details of the sorts of sites that you were contacting if you were removing links, for example, if you used an SEO, and they really just shot you in the foot because they were doing all sorts of unethical things…that’s the sort of thing where I would give us details about that,” he says. “Tell us about the link network or the SEO.”

    The more stuff you can include to make your case, the better chance you have of success.

    In another recent video, he talked about how quickly you should hear back about the requests. In another one, he noted that Google is experimenting with ways to make reconsideration requests better.

  • Matt Cutts Talks 301s vs. rel=”canonical”


    Google put out a new Webmaster Help video today. In this one, Matt Cutts responds to the following submitted question:

    In the 2009 rel=canonical video, you suggest using rel=canonical when we CAN’T use a 301 redirect. But 301s hurt performance; they require browsers to make an extra round-trip to my servers. Shouldn’t I use rel=canonical everywhere, instead of 301s?

    “You are the master of your domain, so you can choose whether you use a 301 or whether you use a rel=canonical,” he says. “That’s your call. Most of the time though, I would recommend using the 301. That’s because everybody knows how to deal with it. Browsers know how to deal with it. All search engines pretty much know how to deal with it. If there’s some new little startup, they might not know how to deal with rel=canonical – if they’re doing their own search engine, for example.”

    Was that “master of your domain” thing another Seinfeld reference?

    “Another thing to sort of keep in mind is, usually when you’re doing a 301, it’s because you’re going to some new place on your site, but you’re typically not doing a 301 on every single interaction that your browser has,” he says. “Normally, it will be your browser lands somewhere, you do a 301 to the new location, and then that functionality continues just fine. So it’s usually just a one time pop. It’s not like it’s a huge amount of extra work.

    “The other thing is, if you’re moving to a new location, your users will look at the address bar, and they’ll notice where they are, so they’ll want to have a good mental model of where they are on your site,” he says. “So there’s a lot of good cognitive reasons and reasons why it might make sense to use a 301.”

    Here’s what Cutts had to say about rel=”canonical” when we interviewed him at the time it was announced. Here, he talks about reasons Google might skip it.
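
    The two mechanisms Cutts compares live in different places: a 301 is an HTTP-level response, while rel=canonical is a hint inside the duplicate page itself. A minimal illustrative sketch of the difference (the URLs are placeholders, and real servers would of course emit these through their framework rather than as plain dictionaries and strings):

```python
def permanent_redirect(new_url):
    """Option 1: the 301 Cutts recommends -- an HTTP-level redirect that
    browsers and essentially all search engines know how to handle."""
    return {"status": 301, "headers": {"Location": new_url}}

def canonical_tag(preferred_url):
    """Option 2: a rel=canonical hint placed in the duplicate page's <head>;
    the page still returns 200, and support varies by consumer."""
    return f'<link rel="canonical" href="{preferred_url}"/>'

print(permanent_redirect("https://example.com/new-page"))
print(canonical_tag("https://example.com/new-page"))
```

    This is why Cutts’s advice leans toward the 301 when you have a choice: the redirect is enforced at the protocol level, while the canonical tag is only a suggestion that each search engine may or may not honor.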