WebProNews

Tag: SEO

  • Google Shows Twitter Results In “Search Plus Your World” Fashion

    Update: A Google spokesperson tells WebProNews: “Like you wrote up in your article, it’s not new. Search plus Your World builds upon existing search features such as Social Search, personalized search, and authorship. You will continue to see existing Social Search features including +1s and content shared by your connections on Google+ and other sites. We’ll continue to look at your Google+ profile to see other content you’ve published online and linked to your profile.”

    Original Article: Ian Lurie at Conversation Marketing claims to have seen Twitter results appearing in Google search results in “Search Plus Your World” fashion.

    He shows a screen cap to back up his claim. It’s not the People and Pages box or anything, but it does appear to show a social search result from Twitter, very similar to the recent injection of personalized Google+ connection results.

    I’m not sure this is the result of any new offering from Google. They’ve had such social search features since long before SPYW. See the “Social Connections and Content” section of your Google dashboard. This is basically the same set of connections you have listed on your Google Profile. So if you have your Twitter account connected, Google has that information, and can deliver such results.

    It’s not exactly the same as having access to the Firehose, which would blast all tweets into Google’s index in real time.

    That said, SPYW has pretty much dominated those personalized search results with Google+ connections since it was announced, though Google made it clear that it does in fact draw from other open web sources.

    Still, Twitter raised a big stink about the whole thing, claiming Google was making Twitter results less visible. Many criticized the lack of non-Google sources in the People and Pages box in particular.

    There have been reports of the relationship between Google and Twitter souring. Apparently the companies were supposed to have an Android-related conversation at the Consumer Electronics Show last month, but that didn’t happen, as Google’s SPYW raised the aforementioned stink.

    It’s really not clear if this finding from Lurie is the result of any new developments. My guess is not. We’ve reached out to Google for comment, and will update accordingly.

    Either way, it does show that Google will still show personalized Twitter results in some cases. That said, given Google’s increased emphasis on freshness, that firehose would be a lot more helpful.

  • Google Panda Update Addressed In New Google Announcement

    Google today listed changes it made to its algorithm in January. As previously discussed, the biggest takeaway from that (at least in my opinion) was an increased focus on freshness through not only updates to the “Freshness Update,” but also through changes to universal search, which focus on the queries that deliver news results.

    The company also addressed a recent Panda tweak:

    High-quality sites algorithm improvements. [launch codenames “PPtl” and “Stitch”, project codename “Panda”] In 2011, we launched the Panda algorithm change, targeted at finding more high-quality sites. We improved how Panda interacts with our indexing and ranking systems, making it more integrated into our pipelines. We also released a minor update to refresh the data for Panda.

    Google actually confirmed that this happened last week. Google reportedly said that there were no additional signals or actual changes to the algorithm, which would explain why there wasn’t a whole lot of fuss made about it, compared to Panda updates of the past.

    Of course, there has been much more fuss about Google’s introduction of Search Plus Your World last month, which many have complained about with regards to its impact on Google search results relevancy.

    Followers of the Panda saga may find this somewhat ironic, given that Google has spent nearly an entire year releasing Panda updates with the goal of improving search quality. Obviously this is not a goal Google is publicly backing away from, but some are questioning whether they’re placing a little more priority on making Google+ successful.

  • Local Paid Inclusion And What Bruce Clay Said About It Ahead Of The New Year

    This week, Bruce Clay, Inc., a respected search engine marketing agency (whose founder WebProNews has interviewed many times), launched something called Local Paid Inclusion. The official description said:

    Local Paid Inclusion is a Google, Yahoo and Bing contracted service and is offered as an approved official program in cooperation with those search engines.

    Local Paid Inclusion promotes a local business’ profile page, like those found in Google Places, Yahoo Local and Bing Local, into a top position on the search result page for up to 30 keywords per profile page.

    This is a NEW program offered by Google, Yahoo!, Bing and 18 other major directories and indexes that places a business profile into a premium area above all other local profiles. Combine this with all of your other optimization programs to maximize your traffic.

    What this means is local businesses that participate can essentially pay for the top local ranking position!

    This caused some uproar among the SEO community, and has turned into a big, jumbled, confusing PR disaster.

    There was indication from Bruce Clay that a company called Universal Business Listings was involved, but UBL denied this in communications with Search Engine Land’s Danny Sullivan, who also shares a pair of statements from Google and Bing (respectively):

    “We are not working on any program that enables a site to pay to increase ranking in organic search results.”

    “Bing has no interest in paid inclusion into the local algo that artificially impacts ranking of algo results…. Microsoft does not have an agreement with UBL today.”

    The denial from UBL seems fairly fishy, considering that Search Engine Watch, which first reported on Local Paid Inclusion this week, spoke to them on the phone, and they reportedly said the service was on hold.

    LocalPaidInclusion.com, the landing page for the service in question, now redirects to a statement from Bruce Clay Inc on the matter. It says:

    Late Monday, we announced the service “Local Paid Inclusion,” which we said gives local merchants higher rankings in the Places and local search results in Google, Yahoo! and Bing. We believed that the service offering was finalized between our backend partner and the aforementioned search engines.

    So far, we have determined that it is not a released program, made even more complicated by statements of confidentiality agreements that put the kibosh on further discussion. Bruce Clay, Inc. has ceased to engage in Local Paid Inclusion while we dig into confusing and contradicting statements.

    We announced what we believed to be a legitimate program where Bruce Clay, Inc. was going to be one of several distributors of this service. Our understanding of this service was that it impacted the sequence of entries within the Places or local results in search engines. And within that separate area of the results, this service would validate local profiles, assuring those entries would naturally result in appearing higher in the local results.

    There was misinterpretation of the information surrounding this service; mainly that it would impact the organic search results, instead of only the local results. We take responsibility for an unclear message being announced in an untimely manner, where specifics of the program were not disclosed and the messaging was jumbled.

    Bruce Clay, Inc. also takes responsibility for the early promotion of the service Local Paid Inclusion without taking the extra steps to verify these contracts existed as we understood them. For that, we apologize.

    We believed at the time that the offering was valid and acted accordingly. We did not collect money at this time, choosing to only set up a notification contact list dubbed “pre-registration” for when the program formally released.

    Bruce Clay, Inc. has always been committed to ethical search engine marketing practices that work alongside the values of the search engines: to serve the end user and provide exposure to businesses. This program seemed to be a solid way for local merchants to validate themselves online and to have their companies be found.

    At this time, it’s our highest priority to be as clear as possible on this issue with the business and search communities. Bruce Clay, Inc. is prepared to openly discuss this matter as best we can with media and community to be as transparent as possible.

    We will make every effort to answer looming questions as soon as we know more, but please understand that we are forced to work within confidentiality agreements, and may be unable to talk specifics.

    We are currently working to better understand all of the contractual agreements in place, if any, with those search engines regarding this service.

    We also need to thank the various social communities and search marketers for their passion regarding this matter; the voices were heard loud and clear, showing there’s no lack of diligent, inquisitive and knowledgeable marketers and business people in our community.

    In the meantime, Bruce Clay, Inc. has withdrawn Local Paid Inclusion pending our further research into this matter. And the site LocalPaidInclusion.com has been taken down while this issue is resolved.

    Clay himself talked about Local Paid Inclusion in an interview with WebProNews in December.

    “There’s a group of people that remember the Yahoo Search Submit Pro, which is a process where you could pay them, and it would get you into the index,” said Clay in the interview. “What seems to be forming is the ability to create a premium, local entity, much like a Places type page, except across the various engines – their own respective ‘Places’ if you will. And that you create a premium account, and that the premium account would allow you to appear at the top of the local results.”

    “Now, those premium accounts have additional features,” he continued. “One would be the ability to put in a call tracking type system, where you could actually appear at the top of the local results and have a phone number appear there – and much like pay-per-click, if they click on that phone number or they call that phone number, there would be a fee paid to the search engine. So it’s a fairly similar concept to Search Submit Pro of years gone by. I’m pretty sure we’re going to see that emerge as a significant local resource in 2012. It is in the process, and we’re actually building a product around it, assuming all those pieces come to be in 2012. And I think that it has to be.”

    “I think there’s going to be a natural tendency for people to click more in the organic space, and the organic space includes…the Places type results,” he continued. “The local results. And those local results will get a lot of clicks. Or they will get a lot of interest, because a lot more local people are going to be doing searches. There has to be a way to monetize that. And I think that paid inclusion is actually the least intrusive, the most easily embraced, keyword-centric way to be able to do that.”

    “The way I envision it working is: there will be a base fee, there’ll probably be a fee added for call tracking…and the search engines are going to share that with a channel,” he said. “In which case you’re going to see a great many people encouraging (as SEOs) their clients to embrace a local paid inclusion program.”

    “The earliest adopter, if you can base it on history, will be the Yahoo and Bing environment,” he said at the time. “They’re likely to embrace it, and Google will watch it, and of course invent their own version of it that’s a little bit better in the eyes of Google. Now I think that the program will be somewhat similar across all of them to facilitate the ease of selling it. I don’t think anybody wants to be particularly different from everybody else…”

    Based on the statements Sullivan received, it doesn’t sound like the search engines are much interested in this at all.

    In the video, Clay then goes on to talk about his company’s other local business service “LocalWare,” which promises: local SEO, custom keyword research, content development, an SEO-friendly CMS with “pristine code” to support local organic SEO, optimization of other online avenues such as Google Places, Facebook, Foursquare, LinkedIn, Yelp, etc., and “Bruce Clay’s world-class online SEO training for you and your team.”

    WebProNews is communicating with Bruce Clay, and will have more details as they become available.

  • Beware The Fake Matt Cutts

    One or more people are going around impersonating Google’s head of web spam (and now “Distinguished Engineer”) Matt Cutts, leaving comments on various articles on the web. It’s been going on for a while.

    Usually, if you pay close enough attention, you can spot the fakes, but sometimes they can be deceiving.

    We’ve had a fake Matt Cutts leaving a bunch of comments here on WebProNews lately. We delete them as we spot them, but sometimes they still surface, so beware of what you’re reading. Of course, the impostor(s) hasn’t been sticking to WebProNews only. Here are a couple of tweets from Cutts the other day:

    @sarahcuda another fake comment to report. I didn’t comment on http://t.co/zqzaxxf6 . http://t.co/q5CYMOLp is fake. Any way to delete it?

    @sarahcuda no worries–thanks for taking care of it. “Pretending to be Matt Cutts” would be a boring/lame game; surprised anyone plays it 🙂

    Lame indeed. With all due respect to Cutts, pretending to be others and spreading misinformation is lame. Period. So, fake Matt Cutts, please leave us alone. Real Matt Cutts, feel free to chime in any time. Readers, just beware that the comments you read may or may not be real. We’ll continue to police them the best we can.

    If you regularly follow what the real Cutts says, and a comment sounds like something he wouldn’t say, there’s a good chance it isn’t him. If you’re ever unsure, and you care enough to make a decision based on what was or wasn’t said, Cutts has said in the past that he will verify via Twitter whether a comment was really his. So keep that in mind.

  • Is Google Getting Worse At Delivering Relevant Results?

    It was around this time last year when Google’s search results really started attracting a whole lot of criticism (more than usual). The content farm discussion was going full-throttle, and finally in February, Google launched the Panda update, its attempt at increasing the quality of search results.

    Whether or not this actually worked has been widely debated.

    It’s almost like the whole thing has started over this year, though not so much because of content farms. Now, everyone’s complaining about Search Plus Your World, for one. They’re saying the heavy Google+ integration is making their results less relevant. OK, maybe not everyone, but a lot of people are complaining.

    I pointed out an example yesterday, where it seemed like Google’s recent freshness update was actually hurting the relevancy of search results.

    (Image: Google search results for “Google privacy”)

    Danny Sullivan is pointing to some questionable video results in Google, pointing out better results on Bing.

    The reality is that often when we find poor quality results in Google, they’re not much better in Bing or Yahoo. However, recent Google updates may be changing that a bit – particularly Search Plus Your World. You’re not going to see Bing infused with tons of Google+ content.

    For the record, I have found SPYW to both increase and decrease the relevancy of search results, depending on the query. Sometimes it makes more sense than others.

    Regarding the video that Sullivan references, he says, “It’s embarrassing for Google to be doing this. And it’s worse when you look at the views the video has received: only about 2,000, at this point.”

    He essentially goes on to make the case that even while the video in question isn’t coming from SPYW (my example wasn’t caused by SPYW either), it “feels like another bit of evidence that Google’s original core mission, delivering awesome search results, is being forgotten.”

    I don’t know if it’s being forgotten. Google has hardly slowed down on the algorithm updates. But that doesn’t mean Google’s strategy isn’t having some negative side effects.

  • Google’s Algorithm Change: When Too Many Ads Attack

    If your site looks like one of those “Post No Bills” walls covered in assorted flyers, ads, and various forms of clutter, there’s a good chance Google’s new algorithm change will see it punished, especially if the ad assault appears above the fold. News of Google’s latest algorithm adjustment, something that’s been hinted at before, was recently announced on Google’s Inside Search blog, and the details are pretty straightforward, that is, until you ask “how much is too much?”

    Are you one of the site owners Google is referring to? If so, do you plan on moving your ads around or are you going to wait and see what happens? Let us know if Google’s new algorithm adjustment affects you.

    While there is no “x amount of ads exceeds our standard,” the blog post does offer some general details about how the change targets the “too many ads” sites, while emphasizing the “above the fold” standard:

    This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.

    This algorithmic change noticeably affects less than 1% of searches globally.

    For more on Google’s change:

    How Is Google’s Algorithm Update Determining “Ads Above The Fold”?
    If you’re a user then you’re probably clapping and cheering Google for this update. If you’re an SEO expert, or an owner who’s sole income comes from a website or group of websites then depending on how neurotic you are, you’re either slightly concerned or freaking out… Read more here

    Google’s Ad Related Algorithm Update Analyzed By Experts
    There’s one specific aspect of this topic that many experts have been questioning, and it’s whether or not Google is being hypocritical in regards to their latest algorithm changes penalizing sites with too many ads above the fold… Read more here

    The question everyone is asking is: how much is too much? Does Google have a number of ads in mind before it starts doling out algorithmic punishments? Based on what Google’s saying, if the ads make accessing the actual on-page content a chore, and the proliferation of ads appears above the fold, there’s a good chance your site will be stung.

    As indicated, Google made the point of saying the new adjustment will only affect under one percent of searches, meaning, from Google’s perspective at least, the act of stuffing ads on a site, ads that obfuscate the content, is not an epidemic. From Google’s perspective, the updated algorithm is presented as a housecleaning tool, one that improves a user’s search engine experience by pushing ad-loaded sites down, while bringing the quality content up, at least theoretically.

    For those sites that may have been punished under the new algorithm adjustment, unfortunately, you may be stuck with your site’s adjusted position in Google’s index, even if you make the necessary changes immediately after you’ve been punished. As Danny Sullivan pointed out, around the time the above-the-fold ads algorithm change went live, he received an email from Google’s AdSense team suggesting his personal blog carry more ads, complete with a diagram suggesting how these ads should be positioned around the content.

    Essentially, Google wants the content surrounded by ads, they just don’t want you to over do it, especially for content appearing above the fold. Google’s email also refers its recipients to a video about ad placement, which also suggests avoiding a proliferation of ads, especially at the top of the page:

    Essentially, the lesson is this: don’t let ads push your site’s on-page content down, or you risk being punished. Furthermore, the algorithm update punishes the entire site, not just the pages that have lots of ads above the fold. This means if 99 percent of your site is compliant, but you forgot to clean up one of your ad-heavy pages that obscures the content, there’s a chance the site as a whole will be dinged by the new algorithm.

    The question is: will Google punish itself?

    (Image: an ad-heavy Google search results page for the query “Samsung TV”)

    As you can see, the search engine results page for the query “Samsung TV” pushes the organic content almost off the page. What happens when the gatekeeper quasi-violates their own rule, especially when they are making a point to inform site owners about the potential damage they can cause search engine rankings by making their pages top heavy with advertisements?

    Does Google’s new algorithm change give you pause when it comes to placing ads on your sites or is this much ado about nothing? Let us know what you think.

  • SMX Israel: What You Missed

    SMX Israel took place in Jerusalem on Sunday as a one-day event full of keynotes and sessions, led by Barry Schwartz from Search Engine Land/Search Engine Roundtable.

    That’s a long trip for those of us in the states (or for those in many other parts of the world for that matter), so if you were unable to attend, you could hardly be blamed. But that’s what makes the Internet great. Attendees and presenters have shared info and commentary about the event for everybody else to see. I’m sure it doesn’t quite match actually being there, but it’s better than nothing. And it’s free.

    Some presenters have shared their presentations online. Here’s one from Aviv Manoach:

    Here’s one from Mark Ginsberg:

    Dixon Jones shares one here.

    Ben Druce offers a live-blogged account of SMX Israel here.

    “The SERP (search engine result page) scene from Google has always been changing – so their updates such as Search Plus Your World and Panda are not necessarily spoken of with resentment – Panda specifically is a good wakeup call to many for remembering that real people like fresh and real content,” he says, in a separate “highlights” piece. “However, the notion that Google is being unfair in their current practices is now coming to the forefront. The most common claim is that Google SERPS are showing Google Plus results on top of the far more relevant traditional organic sites, or even Facebook or Twitter results. The general feel was that Google is here to stay, and if we don’t like it, we still have to deal with it.”

    Nichola Stott has her own summary of the event, concluding that the speakers “seemed to be very much in agreement on the following points:”

    • There is a very clear chronology of events, which gives a clear directional guide as to your future search strategy. Know your search history and you know your search future
    • It is getting harder and harder to fake it – plus why bother? If you can’t deliver then what’s the point of trying to rank a mediocre page anyway? (Mediocrity doesn’t convert so well)
    • “Treat Google well” to continue to succeed and you can’t go far wrong. [Quote is from Eli Feldblum, though I have paraphrased.]

    Gil Reich compiled a list of the “best lines” from the event, which includes:

    Roman Zelvenschi: Nobody knows how to pronounce my last name, but that’s OK, I rank number 1 for it.

    Eli Feldblum: Use schema. Do it now. Seriously. You have an internet-connected device with you.

    Eli Feldblum: We’ve reached the point where “normal” blue text links get lost in the noise on a Google SERP.

    Barry Schwartz: Google is recommending … Doesn’t mean you should do it … Just saying.

    Shira Abel: Google owns you. Get used to it.

    Marty Weintraub: Facebook owns you too.

    Marty Weintraub: Use Facebook to target businesses. Raise your hand if you have a FB account. Raise your hand if you have a job. See …

    Tomer Honen (from Google): We got better at Flash. Right about the time people stopped using it.

    Olivier Amar: When you’re in-house you pay a lot more attention to long term. Because you still want to be here.

    Ofer Dascalu: Some people say “publishers and Google are partners.” My partners reply to my e-mails. They pick up the phone when I call.

    Michael King: When you interact with people on Twitter don’t use the same account that you use to Tweet SEO articles. That’s like trying to pick up a girl while holding a book called How to Be a Pickup Artist.

    Reich has a more complete round-up of the event here.

    It’s also interesting to see conversations that transpire in the aftermath of these conferences. For example, this one on Google+ including one of the presenters, Miriam Schwab, about the necessity of using Google+ for search marketing. Schwab said in a post, “Welcome to all my new followers since SMX Israel yesterday. Oh the hilarious irony that after bashing Google+, my community here grows. Love it :)”

    Aaron Zakowski responded, “Hi Miriam, I enjoyed your presentation yesterday. But despite many people’s feeling about G+, all of us marketers need to be here b/c Google is making G+ a necessary component for online success. I predict that within a few months, G+ will be more important to most us than Twitter.”

    Schwab replied, “Aaron, I don’t disagree. Google has made Google+ necessary. My problem is that they are forcing us to join, and promoting Google+ results over other networks, even when the relevance is questionable. They are acting like a big bully, and that is not the right way, and maybe even not the sustainable way, to build community. We will be here because we have to, but will there be activity here? Will non-marketers join too? Possibly not, since they’re all comfy on facebook.”

    A pretty timely discussion, given the regulatory scrutiny Google is getting.

  • Going Dark For SOPA Blackout? Here Are Some Tips

    As major sites like Wikipedia and reddit prepare for a Wednesday blackout to protest the Stop Online Piracy Act and its Senate cousin the Protect IP Act, many smaller sites are also debating whether or not they want to participate in what is being called “SOPA Blackout Day.”

    While there is still a great deal of debate surrounding the possible efficacy vs. consequences of going dark for an entire day to protest domestic legislation, plenty of sites will turn off on January 18th.

    Googler Pierre Far made a timely Google+ post in which he outlined some tips for webmasters who want to go dark in protest, but have SEO considerations.

    Check it out below, and remember, the internet must remain free.

    Website outages and blackouts the right way

    tl;dr: Use a 503 HTTP status code but read on for important details.

    Sometimes webmasters want to take their site offline for a day or so, perhaps for server maintenance or as political protest. We’re currently seeing some recommendations being made about how to do this that have a high chance of hurting how Google sees these websites and so we wanted to give you a quick how-to guide based on our current recommendations.

    The most common scenario we’re seeing webmasters talk about implementing is to replace the contents on all or some of their pages with an error message (“site offline”) or a protest message. The following applies to this scenario (replacing the contents of your pages) and so please ask (details below) if you’re thinking of doing something else.

    1. The most important point: Webmasters should return a 503 HTTP header for all the URLs participating in the blackout (parts of a site or the whole site). This helps in two ways:

    a. It tells us it’s not the “real” content on the site and won’t be indexed.

    b. Because of (a), even if we see the same content (e.g. the “site offline” message) on all the URLs, it won’t cause duplicate content issues.

    2. Googlebot’s crawling rate will drop when it sees a spike in 503 headers. This is unavoidable but as long as the blackout is only a transient event, it shouldn’t cause any long-term problems and the crawl rate will recover fairly quickly to the pre-blackout rate. How fast depends on the site and it should be on the order of a few days.

    3. Two important notes about robots.txt:

    a. As Googlebot is currently configured, it will halt all crawling of the site if the site’s robots.txt file returns a 503 status code for robots.txt. This crawling block will continue until Googlebot sees an acceptable status code for robots.txt fetches (currently 200 or 404). This is a built-in safety mechanism so that Googlebot doesn’t end up crawling content it’s usually blocked from reaching. So if you’re blacking out only a portion of the site, be sure the robots.txt file’s status code is not changed to a 503.

    b. Some webmasters may be tempted to change the robots.txt file to have a “Disallow: /” in an attempt to block crawling during the blackout. Don’t block Googlebot’s crawling like this as this has a high chance of causing crawling issues for much longer than the few days expected for the crawl rate recovery.

    4. Webmasters will see these errors in Webmaster Tools: it will report that we saw the blackout. Be sure to monitor the Crawl Errors section particularly closely for a couple of weeks after the blackout to ensure there aren’t any unexpected lingering issues.

    5. General advice: Keep it simple and don’t change too many things, especially changes that take different times to take effect. Don’t change the DNS settings. As mentioned above, don’t change the robots.txt file contents. Also, don’t alter the crawl rate setting in WMT. Keeping as many settings constant as possible before, during, and after the blackout will minimize the chances of something odd happening.
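
    To make the advice concrete, here’s a minimal sketch (ours, not part of Far’s post) of what a blackout could look like as a small Python WSGI app. The protest message, port, and 24-hour Retry-After value are our assumptions; what it takes from the post is the 503 status for blacked-out pages and a robots.txt that keeps answering with a normal 200:

    ```python
    # Minimal blackout sketch (our illustration, not Google's code).
    # Every page returns a 503 with a protest message; robots.txt keeps
    # returning 200 so Googlebot doesn't halt crawling entirely (point 3a).
    from wsgiref.simple_server import make_server

    def blackout_app(environ, start_response):
        if environ.get("PATH_INFO") == "/robots.txt":
            # Don't serve robots.txt with a 503, and don't add "Disallow: /" (point 3b)
            start_response("200 OK", [("Content-Type", "text/plain")])
            return [b"User-agent: *\nAllow: /\n"]
        start_response("503 Service Unavailable", [
            ("Content-Type", "text/html"),
            ("Retry-After", "86400"),  # hint that normal service resumes in ~24 hours
        ])
        return [b"<h1>This site is dark today in protest of SOPA/PIPA.</h1>"]

    if __name__ == "__main__":
        make_server("", 8000, blackout_app).serve_forever()
    ```

    In practice you’d do the equivalent in your own server or CMS rather than swap in a new app, but the shape is the same: 503 everywhere except robots.txt, nothing else changed.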

    You can then head on over to webmaster central to continue the discussion.

  • Google Search Plus Your World: Marketing Implications

    Google launched Search Plus Your World (SPYW) this week (as if you didn’t know), and has sent waves of controversy throughout the media – mostly with regards to competition and relevancy. But what do the changes mean for marketers?

    Online marketing firm iProspect reached out to WebProNews to offer some commentary, after distributing a POV to its clients with insights into the changes.

    Here’s a sample from their POV:

    These moves mark a continuation of the trends to include more social content and signals as part of both search results and the algorithms that determine them. By integrating both related Google+ profiles and the ability to follow them directly from SERPs for musicians, this may also mean the integration of Google+ business pages as well – for example, suggesting users follow the adidas brand page as a result of searching for adidas, or Motel 6 as a result of searching for Motels, making optimization, linking, following and keywords usage surrounding these profiles even more important.

    Furthermore, the wider use of content from a user’s social sphere theoretically opens the door to other Google-related services and activities becoming part of search results. For example, highlighting YouTube channels that a user (or a user’s contacts) are subscribed to, have liked, rated highly, stores and restaurants reviewed by people in a user’s circles, or content from sites that are part of their friends’ reader list, makes participation and gaining a following in these spheres even more important.

    Herndon Hasty, Associate Director, SEO at iProspect tells us, “It’s a firm step towards integrating social content and signals into search results, which is itself an effort to deliver more relevant results to its users and deliver results that’ll keep people coming back.”

    “It’s also our most solid look yet at how Google views the value of a social share, and how much it’s banking on the success of Google+,” he adds.

    Indeed. If people don’t show interest in Google+, and their search results continue to be bogged down by Google+ content, they may just find themselves going elsewhere for their search needs.

    “Brands definitely need to at least be claiming their names in Google+, if not contributing at the same level that they might in other social networks to take advantage of the special preferences that Google+ is getting in results,” says Hasty. “Images shared on Google+ are getting a lot more real estate on the SERPs than they did before, and shared videos are called out in the new SERPs as well, so making sure to share these kinds of assets from Google+ can help put you at an advantage when it comes to continually attracting your followers’ attention.”

    “It also means that brands need to be encouraging followers and shares like they do for Facebook and Twitter,” he adds. “Link to your profile from your site and customer communications, include +1 buttons at your site, and in general make an effort to capture fans in the Google+ network.”

    “Ultimately, it’ll come down to user adoption of Google+ as to whether or not this really tips the scales on social influence, and it’s going to be very interesting to see what Bing and Facebook do in response,” says Hasty. “We expect to see a lot of tweaking in the coming months as Google weighs how much influence to put on shares, and how often to show shared content balanced against traditional results. It’s certainly another fun start to the year!”

  • Google Talks About Why It Changes Your Titles In Search Results

    Google changes the titles of search results sometimes. This is nothing new, but the company is shedding a bit of light on the process, saying their alternative titles usually improve clickthrough rate.

    Google Webmaster Trends Analyst Pierre Far writes on the Google Webmaster Central Blog, “Page titles are an important part of our search results: they’re the first line of each result and they’re the actual links our searchers click to reach websites. Our advice to webmasters has always been to write unique, descriptive page titles (and meta descriptions for the snippets) to describe to searchers what the page is about.”

    “We use many signals to decide which title to show to users, primarily the <title> tag if the webmaster specified one,” he continues. “But for some pages, a single title might not be the best one to show for all queries, and so we have algorithms that generate alternative titles to make it easier for our users to recognize relevant pages. Our testing has shown that these alternative titles are generally more relevant to the query and can substantially improve the clickthrough rate to the result, helping both our searchers and webmasters. About half of the time, this is the reason we show an alternative title.”

    “Other times, alternative titles are displayed for pages that have no title or a non-descriptive title specified by the webmaster in the HTML,” he adds. “For example, a title using simply the word “Home” is not really indicative of what the page is about. Another common issue we see is when a webmaster uses the same title on almost all of a website’s pages, sometimes exactly duplicating it and sometimes using only minor variations. Lastly, we also try to replace unnecessarily long or hard-to-read titles with more concise and descriptive alternatives.”

    Far refers readers to a Google Help Center article about site titles and descriptions, which includes this video from Matt Cutts talking about titles and snippets:

    In the help center article, Google says to make sure every page on your site has a title tag, that titles are descriptive and concise, to avoid keyword stuffing and repeated or boilerplate titles, and to brand your titles (concisely).

    “If we’ve detected that a particular result has one of the above issues with its title, we may try to generate an improved title from anchors, on-page text, or other sources,” Google says. “However, sometimes even pages with well-formulated, concise, descriptive titles will end up with different titles in our search results to better indicate their relevance to the query. There’s a simple reason for this: the title tag as specified by a webmaster is limited to being static, fixed regardless of the query. Once we know the user’s query, we can often find alternative text from a page that better explains why that result is relevant. Using this alternative text as a title helps the user, and it also can help your site. Users are scanning for their query terms or other signs of relevance in the results, and a title that is tailored for the query can increase the chances that they will click through.”
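
    If you want to catch those title issues on your own site before Google starts rewriting them, a rough audit is easy to script. Here’s a hypothetical sketch (ours, not a Google tool; the site/ directory and the 65-character threshold are assumptions):

    ```python
    # Hypothetical title audit (our sketch): flag missing, duplicate, and
    # overly long <title> tags across a folder of local HTML files.
    import glob
    from collections import Counter
    from html.parser import HTMLParser

    class TitleExtractor(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_title = False
            self.title = ""

        def handle_starttag(self, tag, attrs):
            if tag == "title":
                self.in_title = True

        def handle_endtag(self, tag):
            if tag == "title":
                self.in_title = False

        def handle_data(self, data):
            if self.in_title:
                self.title += data

    titles = {}
    for path in glob.glob("site/**/*.html", recursive=True):
        parser = TitleExtractor()
        with open(path, encoding="utf-8") as f:
            parser.feed(f.read())
        titles[path] = parser.title.strip()

    counts = Counter(titles.values())
    for path, title in titles.items():
        if not title:
            print(f"{path}: missing or empty title")
        elif counts[title] > 1:
            print(f"{path}: duplicate title {title!r}")
        elif len(title) > 65:  # rough display cutoff; our assumption, not Google's
            print(f"{path}: long title ({len(title)} chars)")
    ```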

    If you don’t like the way Google has re-titled your pages, you can let them know in the Webmaster Help Forum.

    (Image: Pierre Far on Google+)

    On Google+, Far highlighted two main takeaways for webmasters from all of this:

    1. Our algorithms generate these alternative titles so that your page is no longer constrained with having just the one title for all the different queries your page ranks for. This has the nice side effect of making the result look more relevant to our searchers and…

    2. … On average, the alternative titles increase the clickthrough rate on the results, i.e. more traffic for you.

    “The <title> tag is still a primary source for titles we show, so all our advice about making them concise and useful and enticing still very much applies,” he says. “Keep an eye on the HTML Suggestions page in the Diagnostics section in Webmaster Tools for title suggestions.”

    Have you noticed Google changing your titles? Are they being improved?

  • Aaron Wall Interview: Google Paid Link Story Wrap-Up

    The topic of paid links is in the headlines once again, and ironically, Google is the accused. As WebProNews previously reported, Google was recently caught up in a controversy after it violated its own Webmaster Guidelines as part of a marketing campaign for Google Chrome.

    Aaron Wall, the author of SEO Book, first reported on the news after someone posted about it in one of his forums. As he explained in the above interview with WebProNews, the campaign was designed to relate Google Chrome to the Internet and tell why small businesses should use it. However, the posts were not of very high quality. Danny Sullivan, in fact, called the content “garbage.”

    “Basically, all these posts exist for no reason other than they are paid, they’re very low quality, and they’re flowing link juice,” Wall pointed out.

    While Google admits the campaign is theirs, it says that it did not intend to do any paid sponsorships. Apparently, Google hired Essence Digital, a digital media agency, for a video ad campaign to promote Chrome. Unruly Media, another media agency, was involved in the ordeal as well, and, from all indications, appears to be the company that actually executed the campaign.

    In the end, Google did come out and take action against itself. For “at least 60 days,” the PageRank for Google Chrome’s homepage will be demoted. On Google+, Matt Cutts said:

    I’ll give the short summary, then I’ll describe the webspam team’s response. Google was trying to buy video ads about Chrome, and these sponsored posts were an inadvertent result of that. If you investigated the two dozen or so sponsored posts (as the webspam team immediately did), the posts typically showed a Google Chrome video but didn’t actually link to Google Chrome. We double-checked, and the video players weren’t flowing PageRank to Google either.

    However, we did find one sponsored post that linked to www.google.com/chrome in a way that flowed PageRank. Even though the intent of the campaign was to get people to watch videos-not link to Google-and even though we only found a single sponsored post that actually linked to Google’s Chrome page and passed PageRank, that’s still a violation of our quality guidelines, which you can find at http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769#3.

    In response, the webspam team has taken manual action to demote www.google.com/chrome for at least 60 days. After that, someone on the Chrome side can submit a reconsideration request documenting their clean-up just like any other company would. During the 60 days, the PageRank of www.google.com/chrome will also be lowered to reflect the fact that we also won’t trust outgoing links from that page.

    Did Google fairly punish itself? Let us know what you think.

    A Google spokesperson also sent us the following statements:

    “Google never agreed to anything more than online ads. We have consistently avoided paid sponsorships, including paying bloggers to promote our products, because these kind of promotions are not transparent or in the best interests of users. We’re now looking at what changes we need to make to ensure that this never happens again.”

    Regarding the action:
    “We’ve investigated and are taking manual action to demote www.google.com/chrome and lower the site’s PageRank for a period of at least 60 days. We strive to enforce Google’s webmaster guidelines consistently in order to provide better search results for users. While Google did not authorize this campaign, and we can find no remaining violations of our webmaster guidelines, we believe Google should be held to a higher standard, so we have taken stricter action than we would against a typical site.”

    According to Wall, because Google is such a big company, it is possible that all departments don’t know what other parts are doing. For this reason, he believes that Google should be more “lenient” when dealing with other individuals and companies regarding similar issues.

    “The big thing is, if all this stuff can happen to Google and they’re the one that makes those guidelines, then, of course, it can happen to tons of other people,” he said.

    Should Google be more lenient on the issue of paid links? What do you think?

  • Google On Targeting Parts Of Your Site To Different Locations

    Google recently put out a new webmaster help video with Matt Cutts discussing how to target parts of a site to different locations.

    The exact question Matt addresses in the video is: “Webmaster tools allows site owners to specify a geo location for targeting. How can this be done for multiple locations?”

    “It turns out there’s a very easy way to do it,” says Cutts. “If you have a domain, you can add, for example, sub-domains or sub-directories as separate sites in Webmaster Tools.”

    “And then once you have those added as separate sites, you can geo-locate or geo-target those individually,” he adds.

    Watch the video for the rest of his explanation.

  • 2011: Year Of The (Google) Panda

    Perhaps the biggest story line in Internet search this year has been the ongoing saga of the Google Panda Update. Let’s recap, and look ahead to next year.

    Has Panda been the most significant thing to happen in search this year to you? If not, what was? Let us know in the comments.

    At the beginning of the year, there was a lot of attention being paid to the quality of Google’s search results, as the content farm movement was reaching a high search result saturation point. There was also a lot of criticism. Eventually, Google took action. The update launched in February (globally in April), and initially earned the nickname “Farmer” update. I believe this was coined by Danny Sullivan. Then Google came out and let the world know what its real name was: Panda, named after a Google engineer who goes by Panda.

    “He was one of the key guys,” explained Google’s Amit Singhal in an interview with Wired in early March. “He basically came up with the breakthrough a few months back that made it possible.”

    So, whether you think Panda has been a great thing for search, or it has ruined your life and/or business, I guess you have this guy to thank. Though, I’m sure if he didn’t come up with it, someone else at Google would have come up with something similar. The criticism was getting pretty strong, and Google can’t afford to lose users due to poor search quality. Though Google does many, many other things and offers many products that people use on a daily basis, search and advertising are still Google’s bread and butter, and Google’s quality has still kept it high above competitors in search market share.

    We’ve probably posted close to a hundred Panda-related articles at WebProNews this year, if you count the ones leading up to it, about content farms and their effects on search, and the ones about the update before it actually came to be known as Panda. I could probably turn them into a book if I wanted, so I’m not going to rehash it all here, but let’s go through some highlights.

    Google “Panda” Algorithm Update – What’s Known & What’s Possible was an early look at some things that were evident, and what people were speculating about what might be hurting them with the Panda update. There were a lot of good comments on this one too, for further discussion.

    Suite101 was one of the sites hit hard by Panda. In that Wired interview, Matt Cutts actually mentioned them by name, saying, “I feel pretty confident about the algorithm on Suite 101.”

    Suite101 CEO Peter Berger responded with an open letter to Cutts. You can read it in its entirety here, but it concluded with:

    Another level of depth may be added to this discussion if the word “quality” were more fully defined. “Quality” without much more precisely defining it, especially when the quality mentioned does only seem to be a quality signal relating to a given search query, leaves a lot still misunderstood…

    HubPages, which eventually had some recovery success attributed to the use of sub-domains, noted a lack of consistency on how Google viewed quality. According to CEO Paul Edmondson, some of the site’s best content had dropped in rankings, while others went up.

    Dani Horowitz of DaniWeb, which recovered, dropped, and recovered again, shared some interesting stories with us about how some of her most relevant stuff stopped ranking where it should have, while other less relevant pieces of content (to their respective queries) were ranking higher.

    Google, however, has always acknowledged that “no algorithm is perfect.”

    Panda hit a lot more than content farms and sites in that vein. E-commerce sites were hit. Coupon sites were hit. Affiliate sites were hit. Video, news, blogs and porn sites did well (at least initially).

    Oh yeah, Google’s own properties didn’t do too badly either, though some of its competitors did well also.

    There was a lot of surprise when Demand Media’s eHow wasn’t hit by the Panda update, as it was essentially known as the poster child for content farms, but that didn’t last. In a later iteration, eHow eventually got hit, which led to the company deleting 300,000 eHow articles and launching a content clean-up initiative. Yahoo just did something similar with its Associated Content this month.

    Eventually Google simply put out a list of questions that all sites should consider when thinking about creating “quality” content. The moral of the story is that, no matter what kind of site you have, if you heavily consider these things, you should have a better chance of beating the Panda update, because you’ll be creating good, trustworthy content. Those questions were:

    • Would you trust the information presented in this article?
    • Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?
    • Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?
    • Would you be comfortable giving your credit card information to this site?
    • Does this article have spelling, stylistic, or factual errors?
    • Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?
    • Does the article provide original content or information, original reporting, original research, or original analysis?
    • Does the page provide substantial value when compared to other pages in search results?
    • How much quality control is done on content?
    • Does the article describe both sides of a story?
    • Is the site a recognized authority on its topic?
    • Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?
    • Was the article edited well, or does it appear sloppy or hastily produced?
    • For a health related query, would you trust information from this site?
    • Would you recognize this site as an authoritative source when mentioned by name?
    • Does this article provide a complete or comprehensive description of the topic?
    • Does this article contain insightful analysis or interesting information that is beyond obvious?
    • Is this the sort of page you’d want to bookmark, share with a friend, or recommend?
    • Does this article have an excessive amount of ads that distract from or interfere with the main content?
    • Would you expect to see this article in a printed magazine, encyclopedia or book?
    • Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?
    • Are the pages produced with great care and attention to detail vs. less attention to detail?
    • Would users complain when they see pages from this site?

    Other fun Panda nuggets:

    Panda reference in Google Earth Day Doodle

    Google Panda Update Gets Animated (And Kind of Weird)

    Google Panda Update: A Delicious Set of Resources

    Google tells you exactly where to let them know when you’ve been hit by Panda

    Hitler Not a Fan of the Google Panda Update

    Panda Bread: The Ultimate Treat For The Panda Enthusiast

    So here we are, almost through with 2011, and we’ve seen numerous iterations of the Panda update. We’ll most likely continue to see more next year. Google has said flat out that it is done with them for the rest of 2011, though.

    In 2012, we can look forward to not only more Panda updates, but more focus on “above the fold” content from the sound of it, and who knows what else Google will have up its sleeve. The most important things to remember are that Google makes algorithm changes every day (over 500 a year), and there are over 200 signals the algorithm uses to determine rankings. Any of these signals or tweaks can help or hurt you. Stay on top of what Google is doing, and keep a focus on quality, and you should be fine. Remember, if you want Google’s RESPECT, you better RESPECT Google.

    Panda has affected a lot of websites. It’s cost people jobs, forced companies to rethink their content strategies, and even inspired people to offer rewards for help recovering.

    You can view all of our Panda coverage from throughout the year for more details, advice, case studies, parodies, and just about anything Panda-related that came up.

    Has Panda improved Google or hurt it? Let us know what you think.

  • Matt Cutts Talks Keyword Density

    Google has put out a new Webmaster Help video, featuring (as usual) head of web spam Matt Cutts. This time, Cutts is answering his own question, rather than a user-submitted question.

    The question is: What is the ideal keyword density: 0.7%, 7%, or 77%? Or is it some other number?

    “A lot of people think there’s some one recipe, and you can just follow that like baking cookies, and if you follow it to the letter, you’ll rank number one,” he says.

    Shockingly, that’s not the case.

    There’s no set percentage of keyword density that will help you rank, according to Cutts. “That’s not the way that search engine rankings work,” he says.

    “The way that modern search engines, or at least Google, are built,” says Cutts with a slight chuckle, “is that the first time you mention a word, you know, ‘Hey, that’s pretty interesting. It’s about that word.’ The next time you mention that word, ‘oh, OK. It’s still about that word.’ And once you start to mention it a whole lot, it really doesn’t help that much more. There’s diminishing returns. It’s just an incremental benefit, but it’s really not that large.”

    “And what you’ll find is that if you continue to repeat stuff over and over again, then you’re getting in danger of keyword stuffing, or gibberish and those kinds of things.”

    “So, the first one or two times you mention a word – that might help with your ranking. Absolutely. But just because you can say it seven or eight times, that doesn’t mean that it will necessarily help your rankings.”

    “The way to think about it is this,” Cutts wraps up. “Think about the keywords that you’d like to have in your copy. Make sure your copy is long enough that you can work those keywords into your copy in a natural way and not an artificial way. And my recommendation is to either read it aloud or read it to someone else or have someone else read it, and sort of say, ‘Do you spot anything that’s artificial or stilted or that doesn’t quite read right?’ And if you can read through the copy, and have it read naturally where a person isn’t going to be annoyed by it, then you’re doing relatively well.”
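
    For what it’s worth, the metric itself is trivial to compute, which is part of Cutts’ point: the number is easy to chase and nearly meaningless. A toy sketch (ours; Google does not rank this way):

    ```python
    # Toy keyword density calculator (our illustration, not anything Google uses).
    import re

    def keyword_density(text: str, keyword: str) -> float:
        """Return the percentage of words in `text` equal to `keyword`."""
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for word in words if word == keyword.lower())
        return 100.0 * hits / len(words)

    copy = "Fresh coffee beans, roasted daily. Our coffee ships worldwide."
    print(f"coffee: {keyword_density(copy, 'coffee'):.1f}%")  # ~22.2%, and it already reads stuffed
    ```

    Per Cutts, driving that number up past the first mention or two buys you little and edges toward keyword stuffing.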

    Another tip for surviving Panda? Don’t annoy readers.

  • Google Authorship Clicks And Impressions Added To Webmaster Tools

    We know that who you are is more important in Google now. Google has been pushing authorship markup for months. This ties you, as an author of content, to your Google Profile, which is linked to from a picture of you that appears next to your content in Google search results.

    Google has been clear about aiming to turn this into a ranking factor, if it isn’t already. It gives Google more information about the credibility of a piece of content. If it knows more about who wrote it, it can keep that in perspective. We don’t know how big a role this plays, exactly, but given the emphasis Google has been placing on the concept in recent months, you’d probably do well to put some emphasis of your own on it.

    Google announced the launch of author stats in Webmaster Tools. These show how often content is showing up – by author – on Google search results pages, allowing you to track clicks and impressions.

    “If you associate your content with your Google Profile either via e-mail verification or a simple link, you can visit Webmaster Tools to see how many impressions and clicks your content got on the Google search results page,” writes software engineer Javier Tordable on the Google Inside Search blog.
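
    A quick way to sanity-check the “simple link” option is to scan a page for a rel="author" link. Here’s a hypothetical sketch (our own, not a Google tool; the regex is deliberately naive, and a real audit would use an HTML parser):

    ```python
    # Hypothetical rel="author" spot-check (our sketch, not a Google tool).
    import re
    import sys
    import urllib.request

    # Matches <a ... rel="author" ...> or <link ... rel="author" ...>
    AUTHOR_LINK = re.compile(r'<(?:a|link)\b[^>]*\brel=["\']author["\']', re.IGNORECASE)

    def has_author_link(url: str) -> bool:
        html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
        return AUTHOR_LINK.search(html) is not None

    if __name__ == "__main__":
        for url in sys.argv[1:]:
            print(url, "OK" if has_author_link(url) else "no rel=author link found")
    ```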

    The image at the top is what Matt Cutts would see here.

    I love Google’s new “author stats” feature: http://t.co/E5DO9MK8 Shows you helpful info without extra noise.

    To see your own, you can log into Webmaster Tools with the same account you use for your Google Profile, and go to “author stats” under “labs” on the left-hand side.

    Keep in mind that being under the “labs” label means it is still in its experimental stage, so there is the possibility that there are bugs.

    For more on setting up your authorship, read these articles.

  • Google & “Young Babysitters”: An Exercise In Pornified Phenomenology

    Do you ever play that game where you Google random words or phrases like “ice cream” or “Kevin” or “filipino” just to see how far up in the results the first pornography page shows up? It’s a good holiday game. Something the whole family can enjoy.

    Apparently this isn’t so much fun for at least one person, who posted a complaint on Google’s Web Search Help forum:

    Wow. Quite a fascinating conundrum we have here. Anyone have any guess as to why?

    Anyone?

    …..Any-one?

    …..Bueller….?

    Wait! Maybe those results are appearing because the Internet is obviously overrun with lecherous hordes of young-babysitter-porn hounds! Actually, that’s undoubtedly a scientific fact of the universe. Anyone familiar with Rule 34 will shrug off this event without any further concern, but this problem demands our continued attention, as nothing so far really explains why similar searches for “babysitters,” “teen babysitters,” or “good babysitters” don’t return the same lurid results as “young babysitters.” A search for “young babysitters” returns, save for the second and third results, an entire page of babysitter-themed hardcore pornography. Search the other terms listed above, however, and all you receive are sites legitimately associated with the honest profession of babysitting children.

    This perplexing discovery raises an important question: why would Google’s search results for “young babysitters,” specifically, defile such an otherwise innocent inquiry into how to become a babysitter? What libidinous fascination with young babysitters has seeped so deeply into our cultural subconscious that, upon searching the Internet, this particular word pairing should automatically yield pants-shrinking depictions of prurient affairs?

    The most likely answer? We’re all perverts and these results are probably completely appropriate (NSFW, by the way) to the motives of “young babysitter” Googlers. The information that Google provides us is based solely on humanity’s prevailing interest and, in this case, that mechanism happens to perform like a dark mirror in which we glimpse our twisted reflection snarling back at us. Occasionally someone doesn’t recognize this reflection, like the forum poster above, and confusion begins. According to Google’s results, when most of us hear “young babysitters” we cognitively index that construct as “sex with young babysitters.”

    So let this be a lesson to all of you Internet searchers and would-be babysitters: be very careful with how you choose your search terms because something as seemingly benign as “young” could easily be an Internet synonym for yeahright.

  • Google: No More Panda Updates This Year

    Google formally announced via Twitter that there will be no more Panda updates for the remainder of the year. Granted, there’s not that much left of the year.

    This might have been nice to know a little bit earlier, but it’s still good to know. The last known Panda update happened nearly a month ago.

    Here’s Google’s “weather report” tweet (I’d embed it, but now Twitter’s giving me fail whales):

    Search weather report: no major Panda updates until the new year. Context: http://t.co/nDkj74ou

    There has been some discussion in WebmasterWorld about the possibility of Google taking a break from Panda for the holidays. Barry Schwartz over at SearchEngineRoundtable picked up on this discussion as well, noting that Google probably doesn’t want a repeat of the whole Florida update fiasco, where online retailers kind of got the shaft due to an unexpected algorithm change years ago.

    Of course, in more recent years, Google has been pretty open about not wanting to do that to sites, and has recently expressed its goal to be more transparent about algorithm updates in general.

    Even if Google had thrown another Panda update at us, I’m not sure it would be comparable to the Florida situation, as that was much more unexpected. Google has been updating Panda semi-regularly all year, and savvy webmasters know that this will continue.

  • Google Panda, Google+, and Other Search Events of 2011

    It’s hard to believe that 2011 is drawing to a close, but it is. That said, if you could sum up the search industry over the course of the year in one word, what would it be? According to search veteran Bruce Clay, that word is “turmoil.”

    What do you remember most about the search industry in 2011? Let us know.

    Looking back at 2011

    The turmoil Clay was referring to was largely a product of Google’s Panda update. As WebProNews previously reported, Google rolled out Panda in an effort to target low-quality sites across the Web. Its impact, however, was extremely significant. Many people, such as Dani Horowitz of DaniWeb, saw their sites drop dramatically and had no idea why.

    “We’ve determined, or at least convinced ourselves, that linking, the quality of your inbound link networking, is also part of the quality of your site certainly at a trust level,” said Clay. “Trust scores and components associated with the quality of how your site connects to everybody is part of the factor to determine whether or not you are a site worthy of ranking.”

    He went on to say that Panda was “disruptive” but that he thought, in the end, that it had helped Google’s search results.

    “Overall, I think that the results have improved,” he said.

    Google also released a “Freshness” update not long ago, intended to surface fresher content more quickly. From Clay’s perspective, this update really only impacts news content. Fortunately, most people seem to be hopeful about it.

    Another Google move that did, and will continue to, have an impact on the search industry was the decision to encrypt search. If you remember, Google said it would begin encrypting, by default, the searches users perform while logged into Google.com. For SEOs, this means they no longer receive keyword referral data for the visitors who click through to their websites from Google search results.

    Although Google claimed the move was done to protect user privacy, most SEOs – Clay included – aren’t buying into this theory, mostly because the move did not impact advertisers.

    “I really think that the intent there was more to allow Google to see what we are searching for themselves because they are now in the stream,” said Clay. “It’s sort of not a universal privacy issue [because] people don’t know, many times, that it’s an ad.”
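
    Whatever the motive, the mechanics are simple enough to sketch. Analytics software typically pulls the keyword out of the q parameter of the Google referrer URL; encrypted search strips that parameter, which is where the now-familiar “(not provided)” bucket comes from. A rough Python illustration, with made-up URLs:

    ```python
    # Rough sketch of how analytics code reads the keyword from a Google
    # referrer. Encrypted (logged-in) search drops the q parameter, which
    # is where "(not provided)" comes from. The URLs are made up.
    from urllib.parse import urlparse, parse_qs

    def keyword_from_referrer(referrer: str) -> str:
        params = parse_qs(urlparse(referrer).query)
        terms = params.get("q", [""])
        return terms[0] or "(not provided)"

    print(keyword_from_referrer("http://www.google.com/search?q=seo+tips"))  # seo tips
    print(keyword_from_referrer("https://www.google.com/"))                  # (not provided)
    ```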

    While Google announcements have dominated the 2011 recap thus far, the year’s events do go beyond the search giant. For starters, social media is bleeding over much more into search. Clay told us that social media, and especially Twitter, has changed how people find sites.

    In other words, social media is becoming a replacement for the browser. Searchers look to their social networks for recommendations and reviews before they visit the brand sites. Clay said that this shift in behavior is still resulting in conversions even though the traffic is down.

    Speaking of social and search, Google’s release of its own social network Google+ was another significant move during the year. Clay told us that it doesn’t have a big impact on search at this point, but he suspects it will.

    In terms of the other search engines, Clay said that Bing has held its own during the year. Microsoft and Yahoo collectively appear to be growing in search share, but Clay thinks that’s because Ask and AOL have lost some.

    Looking ahead to 2012

    Going forward into 2012, Clay has several predictions. For starters, he believes that Google Panda will continue. In fact, he said the mascot should be changed from a panda to a polar bear, because the update would only get meaner and more aggressive.

    “Google is in the business of making money,” he said. “Everybody needs to recognize that Google is a money generator.”

    For this reason, he believes that Google will also integrate Google+ into search in 2012. A few years ago, Google moved from a “one size fits all” approach to personalized search results. Clay explained that Google+ gives the search giant the ability to gear those results toward individuals instead of groups of people.

    “The best way to get your history is to just watch you and, I think, Google+ is that tool,” he said.

    “It is entirely within reason for Google, every time you login to Google+, for them to know where you are,” he added.

    As far as the other search engines go, Clay told us that Bing has good technology and that it would grow, especially in light of its partnerships with both Facebook and Mozilla.

    While some have already written Yahoo out of the search market, Clay said that Yahoo would remain a leader in the space. According to him, it’s out of the spidering business but not the search or algorithm business.

    “It’s kind of hard to criticize a company that only did a billion dollars in business,” he pointed out.

    In addition, Clay said that local search would continue to grow in 2012. Due to this growth, he thinks the search engines will begin to monetize it through a concept called local paid inclusion. He said it would be similar to Yahoo’s Search Submit Pro, with businesses paying to be included at the top of the search results.

    Clay thinks the premium listing will have a call tracking system associated with it that would work the way PPC ads do: if the number is clicked, the business pays the search engine. Based on past trends, he believes that Bing and Yahoo will offer this service before Google. He said that Google typically watches services from other companies and then develops its own version.

    Clay said we could expect this element as soon as January and believes so strongly in the concept that Bruce Clay Inc. is already preparing to offer services in this area.
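
    To make the pay-per-call idea concrete, here’s a purely hypothetical sketch of the billing logic Clay describes. Nothing below reflects any search engine’s actual system; the rate and the business name are invented:

    ```python
    # Purely hypothetical pay-per-call billing, per Clay's PPC analogy:
    # each click on a tracked phone number charges the advertiser.
    from collections import defaultdict

    COST_PER_CALL = 2.50  # invented rate, for illustration

    charges = defaultdict(float)

    def record_call_click(business_id: str) -> None:
        """Bill the business each time a searcher clicks its tracked number."""
        charges[business_id] += COST_PER_CALL

    record_call_click("joes-plumbing")  # invented business
    record_call_click("joes-plumbing")
    print(charges["joes-plumbing"])     # 5.0
    ```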

    According to Clay’s predictions for 2012, the year looks to be just as interesting as 2011. Do you agree?

    What do you think the search industry will hold for 2012? Will it be as “disruptive” as 2011? Please comment.

  • Google Hides +1 Button In Search Results

    Google used to show the +1 button on every one of its search results at the same time, meaning you could look at a page and see the button on each result. Now, you have to mouse over a result to see it.

    You can still see +1 counts, when applicable, without having to mouse over. You can also see the button on any result you’ve already +1’d, as well as which of your friends have +1’d a result, without mousing over. If you haven’t clicked the button on a result in the past, however, you won’t see it without the mouseover.

    Is this an earth-shattering change by Google? No. Probably not. Still, I have to wonder whether it will result in more or fewer clicks on the +1 button from search results. It’s a subtle but interesting move by Google, as the company is obviously trying to grow the user base of Google+ and the +1 button. Is making it less visible the right choice? Does it matter at all?

    +1’s do send Google a signal that it uses in ranking. I wonder if, simply by default, this will weaken that signal, even if just slightly. On the other hand, maybe people will notice the button more if it pops up on mouseover. What do you think?

    Hat tip to Alex Chitu at Google Operating System for pointing this out.

  • Twitter, Organic Links & The Elderly

    Today’s infographic round-up features an ode to the organic link from SEOBook, a look at whether or not you should be using Twitter, and how we care for our disabled and elderly.

    View more daily infographic round-ups here.

    Should you use Twitter? (Flowtown via AllTwitter):

    An ode to the organic link:

    How Google Hit Organic Links (SEO infographic by SEO Book)

    Our GoFigure infographic looks at the results of a Gallup poll of caregivers who also have a full- or part-time job.
    Source: LiveScience

  • Matt Cutts Sheds More Light On Google’s Quality Raters Process

    Internet marketer Jennifer Ledbetter (otherwise known as PotPieGirl) wrote a post last month about Google’s quality raters (you know, the people Google’s Matt Cutts and Amit Singhal talked about in that famous Wired interview about the Panda update; they look at search results, rate their quality, and give their feedback to the company).

    In that post, she wrote, “Now this makes sense to me – ONE rater can not cause a rankings change. However, I do believe that if a certain percentage of raters mark one url as spam or non-relevant, that it does throw up some type of flag in the system that can cause something to happen to that url. Now I naturally do not KNOW this, but I get that sneaky feeling.”

    Cutts responded today in the comments of the post to “dispel a misconception”. That “sneaky feeling,” he says, is “unfounded.”

    “Even if multiple search quality raters mark something as spam or non-relevant, that doesn’t affect a site’s rankings or throw up a flag in the url that would affect that url,” he says.

    In response to Jon Cooper, commenting on that post, Cutts goes on to say:

    The search quality raters sit in the “evaluation” part of search quality and they assess whether a new potential search ranking algorithm is a good idea or not. When they rate something as spam or not, we use that data to answer questions like “If we launch algorithm A, will spam go up or not?” or “Has our quality/spam gone up recently?”

    But the search quality raters are strictly “read-only”–they don’t directly affect our rankings in any way. If you think about it, you definitely wouldn’t want to spamfight on the same queries that you’re using to evaluate your quality: you’d get skewed quality metrics as a result.

    To be clear, Google does reserve the right to take manual action on spam. But that action happens in the webspam team, which is completely separate from the evaluation team and the search quality raters.
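
    The “read-only” distinction is easier to see with a toy example. In the sketch below (entirely invented, not Google’s system), rater labels are used to score competing algorithm variants, and nothing in the code ever demotes a URL:

    ```python
    # Toy sketch of "read-only" rater data (entirely invented, not
    # Google's system): labels score algorithm variants; they never
    # demote or flag a URL.
    rater_labels = {  # made-up evaluation judgments
        "site-a.example": "spam",
        "site-b.example": "ok",
        "site-c.example": "spam",
    }

    def spam_rate(top_results):
        """Fraction of a variant's top results that raters called spam."""
        flagged = sum(1 for url in top_results if rater_labels.get(url) == "spam")
        return flagged / len(top_results)

    current   = ["site-a.example", "site-b.example", "site-c.example"]
    candidate = ["site-b.example", "site-d.example", "site-a.example"]

    # The labels only measure; no site is penalized or reported.
    print(f"current variant:   {spam_rate(current):.2f}")    # 0.67
    print(f"candidate variant: {spam_rate(candidate):.2f}")  # 0.33
    ```

    If the same labels also triggered demotions, the candidate’s measured spam rate would drop for the wrong reason, which is exactly the skew Cutts warns about.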

    PotPieGirl wrote a follow-up post discussing Cutts’ “debunking” of her other post. Someone named Steve commented on that post, saying, “It’s a safe bet that if raters flag a site for spam they turn it over to the spam team.”

    Cutts stepped up again to debunk just a little bit more, responding, “No, that’s a very bad bet, because it’s not true. If search quality raters rate a site as spam, it’s not sent over to the webspam team. Please see what I replied to Jon Cooper: you don’t want to spamfight on the data you use for metrics, or else you’ll get skewed metrics.”

    Google gave us a brief glimpse of the quality raters in this video earlier this year:

    I do mean glimpse.

    We gotta hand it to PotPieGirl for getting Cutts talking about this.

    @potpiegirl No worries–happy to clarify that point.

    Update: After this article was initially published, Cutts pointed out in a tweet that the topic also came up in a recent panel he participated in:

    @CCrum237 your article at http://t.co/17FcltBR reminded me that we touched on this during our Churchill Club panel & made me find it. 🙂

    Here’s the video from that: