WebProNews

Tag: SEO

  • Challenges with Raising Venture Capital & Being Transparent about It

    It hurts to get close to something that you want and then not get it, doesn’t it? When we’re talking about money and business, the sting is even sharper. Furthermore, talking about it often does little but add grief to an already complicated situation.

    Unfortunately, this is exactly the scenario that our friend Rand Fishkin, the CEO and co-founder of SEOmoz, found himself in not long ago. In 2007, the company received venture capital funding from investment firm Ignition, and earlier this year, was approached by a number of firms interested in investing further.

    Fishkin told us that the company had not planned on raising funding but that it began to get excited about the potential opportunity. During the bidding process, there was clearly one firm that stood out. Fishkin said it made them a good offer and the companies signed a term sheet.

    As he explained, this is “usually a done deal unless the investment firm finds fraud of some kind.” However, three weeks after the signing, the investment firm pulled out. Aside from the fact that SEOmoz did not receive the funding, he said it was also hard to understand why it happened since the firm did not give a clear reason for its action.

    “That experience was new for us,” said Fishkin. “I think folks tend not to write about the fact that even after a term sheet is signed, the investor can still pull out.”

    Because he has always been very open about all things SEOmoz, Fishkin wrote a very detailed post, within legal bounds of course, about the entire experience. WebProNews asked Fishkin about why he felt so compelled to be open since most companies would not go to the extreme to find out what they could actually disclose.

    He told us that transparency has always been a core value of SEOmoz and always would be. He believes that this includes both the good times and the bad times.

    “There’s nothing up my sleeve,” said Fishkin. “It’s all out there.”

    Is it possible for a business to be too transparent? What do you think?

    Fishkin and SEOmoz take transparency very seriously and believe in being upfront about all matters, even when they involve finances and legalities that aren’t flattering.

    “It’s one of the qualities that consumers and business customers appreciate so tremendously much these days,” pointed out Fishkin. “We’re getting a culture, it’s particularly in the technology world, that anticipates, loves, and rewards transparency.”

    With this transparency, there is also a risk since investors may avoid SEOmoz in the future out of fear of being the subject of a blog post. Fishkin admits that this is a very real concern but said it was one that he was willing to take.

    “It’s a risk that we feel comfortable with,” he said. “I would rather say I’m going to commit to our core values, we’re going to do it 100%, we will be transparent no matter the costs, rather than say… we’re transparent but only when it’s convenient for us.”

    Even though SEOmoz didn’t receive the funding, no one can say that the company doesn’t stick by its values. The experience, however, has made the company hesitant about raising capital in the future.

    “We’re going to go back to our original mission of not raising capital,” said Fishkin. “Maybe we’ll think about it again next year, but I sort of hope we don’t.”

    “I’d prefer not to go through that process,” he added. “It takes a lot of time and energy away from running the business.”

    If the opportunity were to come up again, Fishkin told us that he would like his company to be in a position in which it doesn’t need the funding, so that it could walk away if it wanted. Since most startups that are covered by the Silicon Valley media receive funding, he also said that he would try to create buzz around his company before he attempted another VC round.

    Although the experience was difficult, Fishkin and SEOmoz have received a lot of praise and support for being transparent. Fishkin told us the praise is a “good consolation prize” but that it was a little “bittersweet.”

    Going forward, he hopes that startups will be more aware of potential issues and that investors will be more cautious.

  • Google Panda Update: New Winners and Losers

    This past week, Google rolled out its latest iteration of the Panda update, which the company (as usual) downplays as only one of roughly 500 yearly algorithm changes.

    It doesn’t sound like such a big deal when they put it that way, but for those who have lost major traffic because of it, it was a bigger deal than most of those other roughly 499 changes. Ask Dani Horowitz from Daniweb, who noticed the big traffic drop and tipped us about it before we confirmed the update with Google.

    Daniweb was hit by Panda earlier this year, and was able to get all the way back to a 110% recovery – something few have been able to achieve. Then along came Panda “2.5” (as the industry is calling it) early in the week and took away more than half of Daniweb’s traffic overnight. All of the hard work that Daniweb put into that recovery might as well have been erased.

    But Daniweb is far from being the only victim here. SearchMetrics, which has regularly released data about Panda winners and losers throughout the year, has compiled another list of the top winners and losers as a result of 2.5.

    Here are the biggest losers:

    SearchMetrics Panda Update

    A few things worth noticing:

    A. Press release distribution sites were hit again. We talked about PRNewswire getting victimized by Panda in the past. Now it has been hit again, along with BusinessWire – arguably the two top services of this kind on the web.

    B. EzineArticles and Demand Media’s eHow – two big past Panda victims – are not present on the list.

    C. Some pretty high-profile sites are on the list, like Today.com and TheNextWeb (which, if anything, has increased in quality if you ask me).

    It’s a pretty interesting list, as is the winner list:

    SearchMetrics Panda Update

    A few things of note with regards to this list:

    1. Google sites won again (YouTube and Android.com). I’m not saying they shouldn’t be on the winners list, but given the regulatory scrutiny Google has found itself in over how it treats its own content in search results, one has to wonder if this will draw the attention of regulators.

    2. HubPages is on the winners list. The site, which we have written about several times, used to make the loser list. They must be doing something right. But who knows? They could get hit on the next one. One would have thought that Daniweb was doing something right too.

    3. The list is dominated by pretty big brands.

    I’m sure we’ll be digging into all of this more soon, but this is a quick look at what Google’s algorithm is considering to be of quality, for better or worse. It will be interesting to watch how these sites perform moving forward.

    I can tell you one thing, Google is all about some identity these days. I’d encourage you to take advantage of the authorship markup Google uses to highlight who is responsible for various content. They’re even starting to include Google+ Circles numbers with it. It’s looking more and more like you ought to be taking full advantage of Google+ if you want to do better in search.

  • Google Shows Circle Counts for People In Search Results

    Lately, Google has been placing a lot of emphasis on the importance of who you are on the web. That’s why they want you to use your real name on Google+ (or more broadly, your Google Profile).

    This thinking certainly applies to search. This year, Google introduced authorship markup, which helps Google associate various content from a person with that person in search results, and ultimately gets that person’s profile prominence in Google search results. If you ever see a little image of a person off to the side of a search result, which is clickable (leading to that person’s profile), this is likely what you’re seeing.

    It’s good for authors to gain exposure, and it helps readers establish some level of trust by simply knowing where a result is coming from (regardless of whether or not they actually trust any specific author). In fact, Google is so concerned about this, it doesn’t even want authors to have profile pictures that are the least bit unprofessional. For example, I know a guy who was using a picture of himself in his Halloween costume for his profile picture, and a Googler actually contacted him and asked him to change it. There was nothing bad about the picture; they just wanted a regular picture of him for his profile pic, presumably so people wouldn’t see anything goofy in the search results that might hurt the perception of Google’s rankings, even if the content it showed up next to was perfectly legitimate.

    Google recently posted a pair of videos explaining how to implement authorship markup, if you need a bit of guidance:

    It would appear that Google now considers how many people have you in Circles on Google+ to be some indication of who you are.

    The Next Web says an unnamed source confirmed that the next step of Authorship Markup is to show the number of Circles you’re in on the search results pages. You can already see it in action for some people.

    Circle counts in search results

    This actually makes the whole Circle limit thing a little more interesting. If you can only have so many people in your Circles on Google+, you’re not going to want to add just anybody, right? In an article this week, we called for Google to get rid of Circle limits because they limit our access to information through Google+, but is this the mindset Google has here?

    The bigger names on the web are going to have more connections, so if they can’t put every one of them into a Circle, they’re only going to want to put their top connections in there. I don’t know if this is the way Google is looking at things, but it raises an interesting point, especially with the attention that Klout has been getting (and now its new competitor Kred).

    While we don’t know that the number of Circles you are in is a search ranking signal, it seems very likely. Remember, when Google was talking about authorship markup, they said they want to “get information on credibility of authors from all kinds of sources, and eventually use it in ranking.” It seems pretty logical that circle count could play a role.

    If I’m wanting to get more Google search respect, I’m trying to get in more Circles.

  • Business Wire Patents SEO Strategy


    Press release distribution service Business Wire announced that it has been awarded a U.S. patent for the technological process of optimizing and distributing press releases to maximize their ability to be found and tracked in the search engines.

    So, basically they’ve patented an SEO strategy. Strange, but interesting. Will this lead to other SEO strategies being patented?

    Business Wire says its strategy is the result of “years of research and development and considerable investment.”

    I wonder how many SEO firms would make similar claims.

    “Our new SEO patent provides complimentary enhancements to Business Wire’s already powerful press release distribution and measurement services,” said Laura Sturaitis, Executive Vice President of Media Services and Product Strategy.

    “Through Business Wire, customers have the power to effectively analyze and optimize their press release content for search, then simultaneously deliver their news to media and market participants via our patented NX delivery network, then measure audience engagement via our NewsTrak reports,” she added.

    “With the awarding of this patent, in addition to Business Wire’s NX distribution technology patents, our company continues to develop unique, proprietary and more effective communications innovations that have been a hallmark at Business Wire for 50 years,” said Cathy Baron-Tamraz, Business Wire CEO.

    It’s going to be interesting to see if this patent leads to any legal battles in the SEO world.

    Business Wire is owned by Warren Buffett’s Berkshire Hathaway.

  • Rick Santorum Asks Google to Help with his Google Problem

    Former Senator and current Presidential hopeful Rick Santorum still has a Google problem. When I say “still,” I am referring to the fact that Santorum has had this little problem for going on four years.

    And apparently, he has contacted Google in an attempt to rid himself of the annoyance. That request has been denied.

    First, a little background for those of you who are unfamiliar with Rick Santorum’s Google problem:

    Years ago, Santorum drew the ire of popular blogger Dan Savage by making some unsavory comments regarding the gay community. During an interview in which he stated the position that consenting adults have no expectation of privacy, Santorum equated homosexuality to bigamy and incest. He also made some comments relating homosexuality to bestiality, although he has maintained that they were taken out of context.

    Either way, Savage and some other activists were less than pleased. They launched a Google bombing campaign to redefine the word “Santorum.” Through SEO tactics and link-trading, they were able to push a website called spreadingsantorum.com to the very top of the Google search results for “Rick Santorum.”

    To this day, if you search “Rick Santorum” or just “Santorum,” the first thing that you will see is this definition:

    1. The frothy mix of lube and fecal matter that is sometimes the byproduct of anal sex. 2. Senator Rick Santorum.

    For reasons that need be explained to nobody, Rick Santorum isn’t pleased with this. And Politico is reporting that the Senator has contacted Google in an attempt to have the site removed. According to the report, he referred to Google as a purveyor of filth and an irresponsible business –

    “I suspect if something was up there like that about Joe Biden, they’d get rid of it,” Santorum said. “If you’re a responsible business, you don’t let things like that happen in your business that have an impact on the country.”

    He continued: “To have a business allow that type of filth to be purveyed through their website or through their system is something that they say they can’t handle but I suspect that’s not true.”

    A Google spokesperson has responded, saying that Google “does not remove content from our search results, except in very limited cases such as illegal content and violations of our webmaster guidelines.” Basically, no dice, Senator.

    In a recent Funny or Die video, Dan Savage threatens to redefine the word “Rick” if the Senator attacks gay people during his presidential campaign. The clip taps celebrities with a particular interest in keeping the name unscathed. It’s funny, and NSFW –

    Santorum’s Google problem was front and center back in May when The Daily Show decided to bring it up on his birthday.

    Unfortunately for Rick Santorum, his apparent request to Google has really only done one thing: Upped the searches for “Rick Santorum” on Google.

  • Fear, Email, SEO, Video, Mobile and Working From Home

    There are a lot of videos hitting the web on any given day, so we’ve been curating a daily round-up of some of the most interesting ones we’ve come across. In a similar spirit, there have also been a lot of interesting infographics coming out lately, so we thought, why not take a similar approach to these?

    Here are some that are currently making the rounds.

    This one about email viewing habits comes from Litmus:

    Email infographic

    AOL Jobs points to this one based on Harris Interactive Poll data about watching video at work:

    video at work  

    This one from WorkSimple analyzes the “work from home” phenomenon:

    work from home

    This one looks at how people share things using Bump:

    Bump info

    This one from Marketo looks at how marketers are embracing the mobile web:

    mobile web  

    This is more a chart than an infographic, but still interesting (from Silicon Alley Insider):

    Photos  

    Search Engine Journal posted this SEO Audit Checklist:

    SEO Audit

    And then there’s the fear:

    What do you fear?
     

  • Google and Bing Changes You Need to Know About


    There have been a whole lot of announcements from the major search engines this week that all webmasters should be aware of – especially from Google, because while its market share may have slipped slightly (while Bing-powered search has grown a bit), it’s still by far the most used search engine.

    Are the search engines headed in the right direction? Tell us what you think in the comments.

    Cutts on Why Your PageRank Would Drop

    While not exactly an announcement, Google’s head of web spam Matt Cutts did post a video discussing reasons why Google Toolbar PageRank would drop. We talked about this a little bit more here, but you can hear exactly what he had to say in this video:

    There is a part in there where he mentions that if you were caught selling links, but have stopped and want to earn Google’s trust back, you should submit a reconsideration request. On that note, Google announced that it is getting “more transparent” with its reconsideration requests.

    Better Communication

    “Now, if your site is affected by a manual spam action, we may let you know if we were able to revoke that manual action based on your reconsideration request,” explain Tiffany Oberoi and Michael Wyszomierski of Google’s Search Quality team in a joint blog post. “Or, we could tell you if your site is still in violation of our guidelines. This might be a discouraging thing to hear, but once you know that there is still a problem, it will help you diagnose the issue.”

    “If your site is not actually affected by any manual action (this is the most common scenario), we may let you know that as well,” they add. “Perhaps your site isn’t being ranked highly by our algorithms, in which case our systems will respond to improvements on the site as changes are made, without your needing to submit a reconsideration request. Or maybe your site has access issues that are preventing Googlebot from crawling and indexing it.”

    Google says it’s not able to reply to individual requests with specific feedback, but that now webmasters will be able to find out if their site has been affected by a manual action and will know the outcome of the reconsideration review.

    Google Using Blocked Site Data in Algorithm

    Earlier this year, Google announced some new domain blocking features, which included a browser extension and a link next to search results, which allow users to block sites that they don’t like. This was part of Google’s big quality clean-up initiative, which also includes the Panda update and the +1 button. Initially, blocked sites only disappeared from the blocking user’s own results, but that is no longer completely the case. Google search quality engineer Johannes Henkel is quoted as saying, “We’ve also started incorporating data about sites people have blocked into our general search ranking algorithms to help users find more high quality sites.”

    Pagination and View-All in Search Results

    Google is “making a larger effort” to return single-page versions of content in search results when the content is broken up among multiple pages. Think multi-page articles and content slideshows. Google says users tend to prefer single-page versions of content, but sometimes these can load slowly, so there are also times when the multiple pages work better.

    “So while a view-all page is commonly desired, as a webmaster it’s important to balance this preference with the page’s load time and overall user experience,” Google indexing team software engineers Benjia Li & Joachim Kupke say in a joint blog post on the Webmaster Central blog.

    You can read more about the technical specs here. They summarize it all nicely: “Because users generally prefer the view-all option in search results, we’re making more of an effort to properly detect and serve this version to searchers. If you have a series of content, there’s nothing more you need to do.”

    To better optimize your view-all page, you can use rel=”canonical” from component pages to the single-page version; otherwise, if a view-all page doesn’t provide a good user experience for your site, you can use the rel=”next” and rel=”prev” attributes as a strong hint for Google to identify the series of pages and still surface a component page in results.

    They talk even more about the specs of using rel=”next” and rel=”prev” in this post.

    Rich Snippets for Apps

    Google is also showing rich snippets for apps in search results now. They’re getting info for these from various places, including Android Market, Apple iTunes, and CNET.

    Application rich snippets

    “Before you install a software application, you may want to check out what others think about it and how much it costs,” says product manager Alejandro Goyen. “Starting today, you’ll be able to get information about the applications, including review and price information, right in your search results.”

    That’s something to consider if your business has an app. It’s a reputation factor.

    Editing in YouTube

    This isn’t exactly a search feature, but when you consider how big a role video can play in search marketing and that YouTube is the second largest search engine, it’s certainly worth your attention. YouTube has launched new editing tools that allow you to easily edit videos right from YouTube itself.

    This should help you improve your videos, which are not only searchable on the second largest search engine and embeddable across the web, but often appear right in the results of regular Google searches. This new editing functionality will make it easier to try new things with less successful videos and potentially make them more viral.

    Bing Adaptive Search

    Ok, getting away from Google, Bing has launched adaptive search, which is essentially its version of personalized search. The company says it “helps decipher the intent and context of each search you conduct based on your search history.”

    “The concept of personalized search is not a new idea, but Bing continues to focus on it and drive progress as the search space evolves,” a representative for Bing tells WebProNews. “In fact, Bing views personalized search as less of a ‘feature’ and more of what to expect from search.”
 
“Ultimately, the goal is to reduce ambiguity and help people find what they’re looking for more quickly,” he adds. “The personalization can be pretty subtle to the naked eye, but the more Bing learns about your intent the more personal it will become. And Bing also wants to be sure a diverse set of results still show up so people aren’t locked in a ‘filter bubble’. We think this provides a good balance.”

    You’ve been dealing with this kind of thing with Google for quite some time, but it does throw in another SEO factor to consider for Bing, which, as previously mentioned, continues to gain market share.

    WebProNews is interviewing Bing’s Stefan Weitz as I write this, so check back at WebProNews for more on this soon.

    New Analytics Tool from Blekko

    Finally, alternative search engine Blekko has released an interesting search analytics tool, which some of you might find useful. It’s called “Web Grepper”.

    “The Web Grepper searches for unique data information and trends that are embedded in code and cannot be found on any other search engine,” a spokesperson for the company tells WebProNews. “For example, you could search to see how many pages request your user information when you visit, the types of targeting information the site is collecting, or how many sites have ‘Like’ vs ‘+1′ buttons, etc.”

    Users can submit questions to the tool and the Blekko community votes on a daily basis on which questions will be analyzed.

    These aren’t the only things going on in search this week, but they are some of the more noteworthy items that are likely to have a bigger impact on most site owners than, say, Flight Search and baseball scores.

    Do any of these items concern you? Make your life easier? Let us know in the comments.

  • A Customer-centric Content Marketing Approach


    The pressure of competition and desire for business growth pushes marketers towards tactics that promise quick wins. Pundits advocate strategy first (been there) but doing so in a comprehensive way isn’t always practical, especially when it comes to areas like social media and content marketing.

    For marketers in need of practical advice on customer-centric content marketing, a solid framework can be invaluable for an adaptive approach that is thoughtful about overall direction and measurable short-term impact at the same time.

    An increasing number of Search Engine Marketers are advocating both Content Marketing and Social Media in concert with SEO objectives, which is a great sign, but the approach often lacks a customer-centric focus.

    Here’s a Content Marketing framework that is customer-centric as well as SEO and Social Media savvy, and one that I think any smart online marketer can follow. Keep in mind, with a holistic approach, this four-part framework can be applied to any type of online content that a company produces: HR, Customer Service, Public Relations, etc.

    Optimize - Content Marketing Optimization

    I talked about this approach at Content Marketing World recently and will be elaborating on it at several future events as well. Of course I drill down even deeper in “Optimize”. But since that book won’t be out until the first part of next year, here is a bit of an elaboration.

    Customers – Optimize for keywords or optimize for customers? It may be semantics, and it’s certainly not a mutually exclusive situation with customer segments and individual search keywords. Many online marketers focus on keywords that are popular and relevant to products and services without ever considering things like customer pain points, behaviors, and position within the buying cycle, and how those manifest as a search query.

    Content Marketers organize their campaigns according to customer needs and how to influence those customers to buy. Add keyword optimization (SEO) to that mix and you have a very powerful combination.

    • Identify customer segments – What do they care about? What is their context?
    • Document pain points & information needs during buying cycle.
    • Build a path of content including triggers that inspire purchase and social sharing.

    Keywords – As you understand the language of your customer, the opportunity to optimize content for search “findability” becomes very important. What better place to connect with customers than at the moment they proactively seek a solution? Build relevant keywords according to customer interests into a content creation plan with key messages and you’ll be one step closer to “relevant ubiquity”.

    Besides search keywords, it’s worth considering social topics. The interplay between searching and social referrals is becoming more standard as buyers navigate information resources online.

    • Brainstorm and research keywords with tools like Google AdWords Keyword Tool, Wordtracker and Ubersuggest.
    • Tap into social media monitoring tools to gauge what topics cluster together on social networks, blogs and Twitter, relevant to your search keywords.
    • Organize search keywords and social topics into a keyword glossary shared with anyone in your company that creates online content.

    “Content is King and Creativity is Queen,” according to Pam Didner of Intel. I happen to agree. Content Marketing is growing and soon “everybody will be doing it” but certainly not doing it well. Through a combination of keen customer insight, analytics and smart creativity, online marketers can stand out amongst the 27 million pieces of content shared in the U.S. each day or the 5 exabytes of information created every 2 days around the world.

    Keywords and topics can fuel a Content Plan that provides a calendar of planned content publishing, topics, optimization focus, promotion channels and planned repurposing. Allow for wildcards and spontaneous content creation according to real-time opportunities and current events.

    • Plan content according to customer segments, keyword topics and business services/product offering.
    • Leverage search keywords for content optimization on the website, blog and on social media sites.
    • Create modular content that can serve its purpose individually, as part of a matrix of topics and as repurposed content in the future.

    Optimize & Socialize – Armed with customer insight, a keyword glossary and a content plan, it’s time for those Social SEO smarts to see some action.  With content staff and social media teams trained on SEO best practices, new content will be easier for prospects and customers to find – when it matters. They’re looking for it!   Monitoring search analytics for refinement of on-page optimization helps keep your investment in optimized search and social content high impact and current.

    In today’s online marketing world, there is no “Optimize” without a smart dose of “Socialize”.  Social network development and content promotion is essential to inspire sharing, traffic and links. Social links and web page links to your content provide a powerful combination for search engines to use when finding and ranking helpful information that leads your customers to buy and share.

    • Train copywriting and social media staff on keyword glossaries and SEO best practices. Keep social topics up to date!
    • Optimize web and social content on and off the corporate websites while engaging and growing social networks.
    • Create, optimize and share useful content that will inspire customers to buy and share with their social friends.

    The particular strategy, goals and methods of measurement will vary according to your situation of course, but as I mentioned above, this framework is applicable to any area of online content that a company might be publishing: Marketing, Sales, Customer Service, Human Resources, Public and Media Relations.

    Have you seen examples of companies doing a great job of going from basic SEO to more robust content marketing optimization? Have you implemented or observed some great examples of “optimize and socialize”?

    Check out Top Rank Blog for more articles by Lee Odden

  • Reasons Why Google Toolbar PageRank Would Drop, According to Google

    Google’s Matt Cutts posted one of his webmaster help videos discussing Google Toolbar PageRank, why it’s only updated a few times a year, and why webmasters might see their PageRank drop. He also talks about how to get back in Google’s good graces if this happened because you were selling links.

    Typically in these videos, Cutts is responding to a user-submitted question, and that is the case here. The question was: “I use the Google Toolbar to monitor PageRank. I read on the Internet that it gives old and quite unreliable data. Can I have reliable realtime PageRank information about the sites I administer? And how can I identify causes of a PageRank drop?”

    “The information that you get from the Google Toolbar is updated about 3 or 4 times a year, and the reason we don’t provide it every single day is because we don’t want webmasters to get obsessed with the green in the Google Toolbar, and not pay the attention that should be spent on titles and accessibility, and good content, and all those kinds of things,” says Cutts. “A lot of people, if you show them just the PageRank and update it every day, they’re just going to focus on that. So we didn’t want that kind of obsession or backlink obsession to take hold where people would only pay attention to the PageRank in the toolbar.”

    This is not the first time we’ve seen Google de-emphasize the need for webmasters to focus on PageRank. Ultimately, while it may be a strong signal used by Google in determining search result ranking, there are over 200 other ones, and the formula changes every single day. Social and location factors have certainly played bigger roles in recent memory. You can bet that Google’s +1s are going to continue to play a strong role.

    “The question that it’s ‘quite unreliable’ – it’s not unreliable, it’s just rounded to a zero to ten sort of scale, so there’s nothing unreliable about that necessarily,” says Cutts.

    “Then, the question of ‘how can I identify the causes of a PageRank drop’ – well, if the only PageRank that you had, for example, was from one very reputable link, and that site stopped linking to you, that could lead to a drop in PageRank,” he continues. “If you’ve done something really weird with your internal linking, and your canonicalization is very strange, so we don’t know – maybe there’s a completely different site on www vs non-www – so you know, those kinds of canonicalization issues, that can also lead to a PageRank drop.”

    “But one of the most common reasons we see for a PageRank drop, at least in the Google Toolbar, is if a site is selling links, and so if your PageRank dropped 30% all of a sudden, and you were selling links that passed PageRank, the reason for that is selling links that pass PageRank violates our quality guidelines,” he says. “And if you think about it, it’s a pretty understandable thing. It’s a lot like payola, in the sense that you pay somebody money and that you get a mention, and it’s not adequately disclosed to the search engine. If some site is doing that, that can account for a drop in the Toolbar PageRank.”

    “So if that’s what might have happened, all you have to do is remove the links that you were selling, and then do a reconsideration request, and say, ‘Hey, I was selling links, they passed PageRank, I saw my PageRank dropped, and so I’ve removed those links, you can verify it, and please let me regain my trust with Google.’ And so if we see that things look good, and it looks like there’s a good faith effort there, and we’re reasonably convinced the selling of PageRank won’t happen again for example, then often times your PageRank will return.”

    You can find the reconsideration request form here.

  • Google Webmaster Tools – Changes To Link Categorization

    Google announced that it is changing the way it categorizes link data in Webmaster Tools.

    “As you know, Webmaster Tools lists links pointing to your site in two separate categories: links coming from other sites, and links from within your site,” says Google Webmaster Trends analyst Susan Moskwa. “Today’s update won’t change your total number of links, but will hopefully present your backlinks in a way that more closely aligns with your idea of which links are actually from your site vs. from other sites.”

    For one, subdomains are now counted as internal links, which makes a great deal of sense. Here’s a chart showing how links have changed:

    Link categorization

    “If you own a site that’s on a subdomain (such as googlewebmastercentral.blogspot.com) or in a subfolder (www.google.com/support/webmasters/) and don’t own the root domain, you’ll still only see links from URLs starting with that subdomain or subfolder in your internal links, and all others will be categorized as external links,” says Moskwa. “We’ve made a few backend changes so that these numbers should be even more accurate for you.”

    She does note that if you own a root domain, your number of external links may appear to go down.

  • Google Authorship Markup – An Easier Way

    Google really wants people writing web content to start using authorship markup. Not only are they looking to use it as a ranking signal, but it also pushes the Google Profile, which is essentially the backbone of the Google+ user experience.

    Granted, you don’t need to be a Google+ user (at least at this point) to have a Google Profile, and Profiles existed before Google+, but in Google+, the Profile is essentially the equivalent of the Facebook Wall, and authorship markup places them right in search results with nice little clickable graphics.

    In a recent article, we looked at a video Google released discussing how to implement authorship markup on your site. They’ve now released another one offering a few quick steps to get it to work when you don’t necessarily control the CMS of the site you’re writing content for. This way, even guest authors can add it.

    Google calls it, “a way to make it even easier to annotate your pages and show that there is authorship.”

    Here are the basic steps:

      1. Find your Google Profile

      2. Add “?rel=author” on the end of your Google Profile URL

      3. Wrap that in an anchor tag – <a href=”that URL here”>

      4. Google wants you to use something like “+Matt Cutts” as the anchor text.

      5. Insert that on your article, and point your Google Profile back to the site

    “If I can’t control the attributes, I can still add a link to this special URL,” says Cutts, and it’s really as simple as that.

  • Create and Maximize Videos for Improved SEO

    If you’re like many website managers, you’ve been dealt a hefty blow by Google recently, with its various algorithm updates that have taken place over the past few months. The purpose of Google’s updates, known as “Panda” or “Farmer,” was to separate quality content from content farms, scraper sites and other types of low-quality websites. Google estimates that the changes impact nearly 12% of all searches.

    While these updates were made to help users find high-quality information, they had unfortunate side effects for many credible websites. A poll by Search Engine Roundtable found that after the first update, 40% of respondents were seeing less Google traffic. Visits continued to fall off as subsequent updates continued.

    If you’ve been investing in SEO initiatives, these modifications to Google could be a tremendous setback. Fortunately, there is a fast, powerful way to boost organic search traffic—as quickly as this week. The answer is in video.

    Video is the most powerful force on the web today, and continues to occupy more of users’ time and attention. Interestingly, video sites such as Metacafe and YouTube were virtually immune to the Google updates, which tells you something. Also, videos continue to rank high in search results. So for anyone needing to improve their SEO rankings, video represents a tremendous opportunity. When video is on your site, it can help raise your visibility significantly. But how can you build a video presence online?

    If you don’t already have video on your site, below is a short guide for how you can add video to quickly see an SEO boost. If you do already have video but don’t know if it’s being indexed by Google, skip ahead to tip #4 for guidelines on how to get your video indexed – a critical step to seeing SEO benefits.

    1. Cover your Entire Product Catalog with Video

    First off, as all of us in the online world know, video is engaging. And for e-commerce sites in particular, product videos are an effective way to create trust, demonstrate pretty much any product, and connect with your shoppers. More importantly, they help increase conversion rates. So, if you haven’t already begun to convert your product catalog to video, it’s time to get started. This may sound daunting, but automated video platforms make this so easy, virtually anyone can create and publish compelling product videos fairly quickly.

    You can also hire a professional video production firm, but that gets very costly – and if you have hundreds of products to cover, it may also take months to go through your entire catalog. And what happens if pricing or availability changes?

    If covering your entire catalog sounds like biting off too much, then simply start with your top sellers. Continue to add video over time, rather than waiting until everything is available on video to launch.

    2. Ensure Videos are Relevant

    Like your product choices and all the content on your site, your videos need to be relevant to your audience and what they’re seeking. One way to ensure relevance is by including appropriate videos on each product page. If you’ve done your homework and labeled your videos correctly, you can easily serve up video that answers a consumer’s needs.

    What’s more, video is undoubtedly the best way to engage shoppers and keep them on your site longer. As an added bonus, time spent viewing is another measurement weighed by Google as it determines how to serve up search results.

    3. Test, Test and Test Again

    What works better for your product set and your audience? Text or voiceover? Classical music, popular music or none at all? Illustrations or photography? Testing doesn’t have to be expensive, but with the wealth of user information available, it can point you in the right direction for cost-efficient, effective efforts in the future.

    4. Ensure Your Video Content Is Indexed

    Even if you already have video on your site, you can’t assume it’s being indexed by Google. You need to help them index your videos in order to get the SEO benefits you’re working so hard to achieve.

    To find out if your videos are indexed:

    1. Go to Google
    2. Type “site:domainname” (using your own domain name) in the search box
    3. Click “Videos” in the left navigation

    The number you see above the search results is the number of videos indexed. If that number is lower than the number of videos on your site, you know you need to take steps to make sure Google finds your content. This example shows you that Heavenly Treasures has roughly 5,000 videos on its online retail site.

    5. Submit a Video Sitemap

    To get your videos indexed, you need to submit a video sitemap to Google—manually via their webmaster tools page or through a third-party service. Once the sitemap is properly submitted, video content gets indexed almost immediately, unlike other types of content which can take weeks. In order to get the maximum benefit, make sure that you follow Google’s steps and include all the information required, and resubmit your sitemap any time you make a change to your video catalog.

    Once indexed, you’ll be pleased to see that your product videos quickly begin to rise in search rankings. Even better, because thumbnail images appear with video listings, your results are likely to enjoy significantly higher click-through rates.

    Many online retailers have found that the use of video has dramatically improved their SEO efforts. Online wholesaler DollarDays, for example, created videos and submitted a sitemap shortly after the new algorithm went live. Within 24 hours, all of their product videos were fully indexed and many were appearing on page one of Google search results. Naturally, this also had an instant impact on views and conversions.

    Video may seem a bit daunting, but there was a time when doing business online seemed out of reach, too. There’s no better time for you to make the move to video—especially with the holidays on the horizon. From visitor engagement to SEO advantages, there are many good reasons to become a part of the video world.

     

  • The Next Google Ranking Signal: Your Google Profile?


    Long story short: if you’re looking to help your search engine rankings, you might need a Google Profile (the backbone of Google+).

    Who you are as an individual is becoming more important in search ranking. Is this the right way to go for search? Share your thoughts.

    As previously reported, Google has a new series of tutorial videos, and in a new one, Google’s Matt Cutts and Othar Hansson discuss “authorship markup”.

    Google announced this back in June, saying it is “experimenting” with using the data to help people find content from authors in search results.

    In the new video, Cutts asks, “Will people get higher rankings? Is there a rankings boost for rel=’author’?”

    Hansson then replies, “It’s obviously early days, so we hope to use this information and any information as a ranking signal at Google. In this case, we want to get information on credibility of authors from all kinds of sources, and eventually use it in ranking. We’re only experimenting with that now. Who knows where it will go?”

    For the time being, what you get, he explains, is your photo showing up next to your results. That, he says, is the goal with this project.

    “If people believe it’s a good idea, you know, using HTML5 hopefully might help Google and any other search engine figure out more about content on the web, and what’s trustworthy and what’s less trustworthy over time,” says Cutts.

    Given the emphasis Google has been putting on trustworthy content (see Panda update), it’s easy to imagine this not only becoming a ranking signal, but a significant one.

    Here’s where it gets even more interesting. You have to have a Google Profile to use it. Kind of like Google+. Another interesting strategy to get people using Google’s new social network, no? And while still managing to keep things search-related. Well played, Google.

    When the authorship markup is used, it leads to an author photo being displayed in Google search results when applicable. For example, if I write an article and that article appears in search results, it would come with a picture of me (from my Google Profile) next to it, and that would link to my Google profile. So, as an added bonus for Google, this will greatly increase the visibility of the Google Profile, and no doubt contribute to further growth of not only Google profiles, but Google+.

    The Google Profile does actually keep the feature from being abused though. “To make sure that I can’t start writing nonsense and attributing it to Matt…you have to link back from your Google Profile to the site,” explains Hansson. “You need to control both endpoints basically.”

    To use the markup on a single author site, you basically just need to:

    1. On every post, add a link somewhere on the page pointing to your Google Profile (more visibility for Google Profiles)

    2. On that link, add an attribute rel=”author”

    3. The link can go in the footer or the header or wherever you can make it work.

    4. You can wrap it around an image if you want.

    If you have multiple authors on the site, link each author’s post to that author’s Google Profile. “That could be as simple as just at the bottom of each post, have the author actually insert a link themselves, with this attribute on it,” says Hansson. “Another thing that a lot of sites have, is…author bios.”

    Link each post’s byline to the author’s bio page with rel=”author” on those links, then from the bio page add rel=”me” links to the Google Profile, and link the Google Profile back to that page.

    “This obviously requires authors to make Profiles, and it requires webmasters to do the markup,” says Hansson.

    Cutts says they’re trying to work with CMS manufacturers so that individual people don’t have to do all the work if they don’t want to.

    For more detailed instructions on how to implement authorship markup, see Webmaster Tools help.

    By the way, Google has been pretty weird with authors lately, though this is unrelated to the authorship markup discussed here, as far as I can tell.

    Should authorship markup in this form be a significant ranking factor? Tell us what you think.

  • Google Launches New Series of Matt Cutts Webmaster Tutorials


    Google’s head of web spam, Matt Cutts, has been answering webmaster questions in short videos for quite some time now. The videos have often been quite informative, and have tackled numerous issues that common webmasters face on a day-to-day basis.

    Now, Cutts is appearing in some longer more tutorial-driven videos.

    “Over the past couple of years, we’ve released over 375 videos on our YouTube channel, with the majority of them answering direct questions from webmasters,” says Michael Wyszomierski on Google’s Search Quality Team. “Today, we’re starting to release a freshly baked batch of videos, and you might notice that some of these are a little different.”

    “Don’t worry, they still have Matt Cutts in a variety of colored shirts,” he adds. “Instead of only focusing on quick answers to specific questions, we’ve created some longer videos which cover important webmaster-related topics.”

    Woohoo! The next new batch of webmaster videos is ready! The first one is about 301 redirects: http://t.co/WWPUfBr

    Here’s the first one:

    In the video, Cutts answers his own question about 301 redirect limits: “Is there a limit to how many 301 (Permanent) redirects I can do on a site? How about how many redirects I can chain together?”

    The short answer is “no.” There is no cap on the total number of 301s, though it’s best not to chain too many redirects together. Four or five “hops” are dangerous, he says.

    Good question though. Hopefully the majority of questions will be as thoughtful as Matt’s. Or maybe we’ll see him answer more of his own questions. I guess we’ll just have to stay tuned.

  • Bing: Here Are the 4 Reasons You Want Links

    Bing’s Duane Forrester has followed up his recent post about how Bing evaluates content quality with one about how Bing looks at links. He says you want links for a few reasons, and lists 4 of them:

    1 – because they alert us to your website when it’s new, or to new content
    2 – because they are a vote of confidence in your site – quality websites tend to link to other quality websites
    3 – because those links can send you direct traffic
    4 – because over time, they can help establish a footprint that points to your authority on a topic (think guest blogging)

    The main point from the post is pretty much: links are not everything when it comes to ranking in search engines. Nothing new there. Still, it never hurts to listen to the policies as they’re explained by the search engines themselves.

    You love links. We love links. Build for the right reasons. – From an early stage people are taught that links are i… http://ow.ly/1e7uc9

    On how many links you need, Forrester says, “Not as many as you may think.  Again, as with so many other areas of search optimization, there’s no exact number here.  On popular phrases with lots of query volume, to rank well will require more links from trusted, quality websites to boost your rankings.  Less popular phrases can often require many less links pointed at your site to see the same lift in rankings.  This is where a targeted link building approach can pay off for you.”

    The take-aways of the post, Forrester says, are: don’t buy links, great content builds links, prove to users you’re a trusted authority (and links will follow), and social media can help grow links.

    Here’s where Bing gives its advice for link building.

  • Bing’s Take on Content Quality

    Since the Google Panda Update first launched back in February (and really for some time before that), there has been a lot of discussion about search quality throughout the industry – the quality of the content that search engines are returning in their results.

    This is the whole reason the Panda update exists. It’s all about improving the quality of results. Some will dispute the success of that, but that is the reason, for better or worse.

    But what about Bing? It doesn’t command nearly the search market share that Google does, but as it powers Yahoo search, it’s really the only major competitor in town.

    Bing talked a bit about its own views on content quality this week, and content producers might do well to take notice of that as well – especially those who may have been hit by the Panda update, but are still doing OK in Bing.

    Whereas Google had a list of questions site owners could ask themselves to assess the quality of their sites, Bing has published a list of things to avoid, which reads as follows:

    • Duplicate content – don’t use articles or content that appears in other places.  Produce your own unique content.
    • Thin content – don’t produce pages with little relevant content on them – go deep when producing content – think “authority” when building your pages.  Ask yourself if this page of content would be considered an authority on the topic.
    • All text/All images – work to find a balance here, including images to help explain the content, or using text to fill in details about images on the page.  Remember that text held inside an image isn’t readable by the crawlers.
    • Being lonely – enable ways for visitors to share your content through social media.
    • Translation tools – rarely does a machine translation tool leave you with content that reads properly and that actually captures the original sentiment.  Avoid simply using a tool to translate content from one language to the next and posting that content online.
    • Skipping proofreading – when you are finished producing content, take the time to check for spelling errors, grammatical mistakes and for the overall flow when reading.  Does it sound like you’re repeating words too frequently?  Remove them.  Don’t be afraid to rewrite the content, either.
    • Long videos – If you produce video content, keep it easily consumable.  Even a short 3 – 4 minute video can be packed with useful content, so running a video out to 20 minutes is poor form in most instances.  It increases download times and leads to visitor dissatisfaction at having to wait for the video to load.  Plus, if you are adding a transcription of your video, even a short video can produce a lengthy transcription.
    • Excessively long pages – if your content runs long, move it to a second page.  Readers need a break, so be careful here to balance the length of your pages.  Make sure your pagination solution doesn’t cause other issues for your search optimization efforts, though.
    • Content for content’s sake – if you are producing content, be sure it’s valuable.  Don’t just add text to every page to create a deeper page.  Be sure the text, images or videos are all relevant to the content of the page.

    The rest of Bing’s advice basically boils down to focusing on creating a good user experience and letting Bing know about your content. “Whether you call them rich snippets or by their proper names, the act of marking up your content to tell the engines more details about the content is a wise investment,” says Bing’s Duane Forrester. “By following the plan outlined at Schema.org, you can embed meta tags around your content. Visitors won’t see them, but the search engines will, enabling us to understand your content and use it in unique ways to create more engaging search experiences.  Take some time and review this idea to see if you can leverage the great content you’re creating in new ways.”

    If you’re living up to Google’s definition of quality, you probably won’t be doing too badly in Bing either. And if you’re doing well in Google, you’re probably getting far more search referrals from it than you could ever get from Bing anyway. Still, it’s helpful to get a look into Bing’s own thinking on this issue.

  • Google Refreshes Spam Reporting

    Google’s head of web spam Matt Cutts tweeted that the company has refreshed its spam report form. He calls it the biggest refresh in 10 years.

    Side note: It’s worth pointing out that he used Twitter to announce this. I see no updates about it in his posts on Google+. This is the kind of thing that makes Twitter essential to Google’s realtime search feature, and why Google+ has a long way to go before it can serve as a useful replacement for it. Even Googlers are still relaying important information via Twitter. It looks like he hasn’t posted to Google Buzz since May 28, either, btw. But that’s another story.

    We just released the biggest refresh of our spam report form in, oh, say 10 years: http://t.co/ty2MxmN

    Here’s what the new spam report form looks like:

    Spam report form

    The page says, “‘Webspam’ refers to pages that try to trick Google into ranking them highly. Before you file a webspam report, see if the page might have a different problem.” Users are then presented with options for:

    • Paid links (the page is selling or buying links)
    • Objectionable content (the page is inappropriate)
    • Malware (the page is infected)
    • Other Google products (the page abuses Google products other than Search, e.g. AdSense, Google Maps, etc.)
    • Copyright and other legal issues (the page should be removed under applicable law)
    • Personal/private (the page discloses private information)
    • Phishing (the page is trying to get sensitive information)
    • Something else is wrong (the page has other, non-webspam-related issues)
    • And finally, an option that says, “This page is really webspam. Report webspam.”

    Each option will take you to a different form or information source about how to proceed from there.

    Google’s approach seems to have ruffled at least one feather. “Marketing Guy” Scott Boyd talks about the new form, saying:

    Let’s see. Google crushes legitimate business websites in an attempt to remove spam from the index. Google crushes competition by undercutting them left, right and centre (the analytics market is pretty much stagnant, and frankly AdSense just promotes lazy webmasters who’d rather take some easy bucks than work at their business). Oh, and it’s quite happy to take vast amounts of our information without mentioning how valuable it actually is too loudly.

    And now they want us – that’s the webmaster community (because frankly, no one else cares about paid links – in fact most normal people probably find the idea ridiculous) – to hunt down some evil paid linkers!!

    I already give you my search data, browsing history and patterns via Google Toolbar, metrics on the quality of my websites via Google AdSense (for a minute fee), traffic metrics via Google Analytics, and an idea of my financials, budgets and target market via Google AdWords. And now you want ME to improve YOUR product. For FREE?

    I think not.

    Eric Enge at Stone Temple Consulting recently posted an interview with Tiffany Oberoi, an engineer on Google’s Search Quality team. Cutts said, “Every SEO/search person should read” it. She talks about how reconsideration requests work.

    Now that Google has refreshed its spam reporting, I’m guessing we’re going to see a whole lot more reporting, and of course a whole lot more of such requests. Here are some key quotes from Oberoi from that interview:

    “We do have a few different manual actions that we can take, depending on the type of spam violation. We would tend to handle a good site with one bad element differently from egregious webspam. For example, a site with obvious blackhat techniques might be removed completely from our index, while a site with less severe violations of our quality guidelines might just be demoted. Instead of doing a brand name search, I’d suggest a site: query on the domain as a sure way to tell if the site is in our index. But remember that there can be many other reasons for a site not being indexed, so not showing up isn’t an indication of a webspam issue.”

    “We try to take an algorithmic approach to tackling spam whenever possible because it’s more scalable to let our computers scour the Internet, fighting spam for us! Our rankings can automatically adjust based on what the algorithms find, so we can also react to new spam faster.”

    “And just to be clear, we don’t really think of spam algorithms as “penalties” — Google’s rankings are the result of many algorithms working together to deliver the most relevant results for a particular query and spam algorithms are just a part of that system. In general, when we talk about “penalties” or, more precisely, “manual spam actions”, we are referring to cases where our manual spam team stepped in and took action on a site.”

    “If a site is affected by an algorithmic change, submitting a reconsideration request will not have an impact. However, webmasters don’t generally know if it’s an algorithmic or manual action, so the most important thing is to clean up the spam violation and submit a reconsideration request to be sure. As we crawl and reindex the web, our spam classifiers reevaluate sites that have changed. Typically, some time after a spam site has been cleaned up, an algorithm will reprocess the site (even without a reconsideration request) and it would no longer be flagged as spam.”

    She goes on to point out that reconsideration requests will not help you if you’ve been impacted by the Google Panda update.

  • Video and Images Dominate Google Universal Search Results


    Searchmetrics has released a new study showing how universal search can help marketers with search visibility. This certainly isn’t a groundbreaking concept, and we’ve discussed it plenty in the past, but the firm shares universal search data, which it has used to try to identify which sites dominate the top 10 positions of video, news, shopping, image, and map results.

    The study is based on an analysis of the top 100 search engine results displayed by Google for a database of about 28 million search terms over a four-month period from February to May, Searchmetrics says.

    Video results appeared in over 60% of all searches where universal search results are included in the top 100 listings. Images (coming in second behind videos) appeared in 30%, followed by shopping results at about 20% and news at around 10%. Judging from the following graph, it looks like Books were ahead of news before dropping off in April.

    Searchmetrics data on universal search

    “For a few years now Google has been bringing specific shopping, news, image, video, blog and map-based results into the general search listings it presents to searchers as part of what has been termed its ‘universal search’ strategy – it’s intended to help searchers find what they’re looking for more easily,” said Searchmetrics CEO Dr Horst Joepen. “We found that video and images are highly visible in Google searches when compared with other types of universal search content. So it makes sense for marketers to increase the volume of video and image content they’re creating and to optimize it both on their own sites and on third party sites such as YouTube and Flickr.”

    “Interesting videos and images aren’t just good for your SEO, they’ll obviously also help make your site more engaging for visitors,” added Dr Joepen.

    He says marketers should be thinking about creating things like client testimonials, interviews and product demos for video content.

    Google has not made things easy on SEOs over the years. They’re always changing so many things that it’s hard to keep up. Add Google’s personalization into the mix, and you never know who’s going to see what in their results for any given query.

    Universal search, though surely not its intended reason for existence, has proven to be something of a bone Google has thrown to websites. It’s a shortcut to front-page search results. If you can rank well for videos or images, for example, there’s a good chance you’ll find your way onto the front page of Google’s web results for some searches.

  • Yahoo Microsoft Search Alliance To Launch in Europe This Week


    Yahoo issued an update today indicating that its “Search Alliance” with Microsoft will get underway in Europe as soon as August 3rd. That goes for Yahoo UK, France, Germany, Spain and Italy.

    It won’t be a full-on transition to the way things are here in the States, at least at first: Yahoo will switch to Bing for organic search results only. Yahoo says advertisers should continue to manage their Yahoo Search Marketing accounts as usual, and that it will provide ample notice before the paid search transition in each respective market.

    “Search ad inventory from Yahoo!, Microsoft, and their respective partners will be combined into a new, unified search marketplace, giving advertisers of all sizes access to a combined audience of 607 million unique searchers worldwide,” the Search Alliance explains on its UK site. “In the UK and France, Yahoo! and Microsoft will combine their existing marketplaces. In other markets throughout Europe, Asia and Latin America, this transition will be seamless as Microsoft is already an existing Yahoo! partner, drawing from the Yahoo! marketplace.”

    “We have adjusted the planned timing of the paid search transition for the UK, France, and Ireland,” it says. “Yahoo! advertisers with accounts in the UK and France will not transition to adCenter in 2011. Advertisers should continue to manage and optimise their campaigns on Yahoo! Search Marketing and Microsoft Advertising adCenter separately. We will provide advertisers with updated timing information well in advance of when transition activities begin, with further details to help them plan and prepare.”

    For the time being, Yahoo is advising businesses to compare organic search rankings on Yahoo Search and Bing for their keywords to help determine the potential impact on traffic and sales, and then to decide if they’d like to modify their paid campaigns. Yahoo is also telling businesses to review the Bing webmaster tools and optimize for the Bing crawler.

  • Google is a Speed Freak and Wants to Drag You Along with Page Speed Service


    I originally had the title of this post as, “Google has a need, a need for speed!” and then suddenly realized that was probably the least original post title ever. After a quick Google search, I confirmed as much, and instead alluded to a search engine with drug issues… much better.

    Anyway, starting in 2010 with the Caffeine release, Google made it known that it was really into websites that load quickly. So much so that page speed is now part of the algorithm that determines your site’s rankings in organic results. How big a factor it is isn’t known, but we’ve seen improving page load time make a big difference in our clients’ rankings, so it’s definitely worth your time to look into the matter.

    As you may know from some of my other posts, Fang Digital performs a lot of SEO Audits for our clients. During these audits, we look at three major areas of interest: Content, Site Architecture, and Inbound Links. Within these three major areas, there lives a myriad of details that can make or break a site’s organic rankings, and page load speed is an important part of the Site Architecture area of interest.

    We usually charge for SEO Audits, but here’s a freebie… seriously, check out your page load speed. I say this because, during the many audits we provide for our clients, page load speed comes up a lot… I mean, a lot, a lot. Plus, it’s one of those factors of the audit that is always a surprise when we present our findings, so it’s easy to say that it’s one of the pieces of the algorithm that is often overlooked.

    Here’s another reason to look at page load speed: your customers. You know, those people that the site is actually for in the first place?  Yeah, they can’t stand it when a site loads slowly, which is really the reason why Google includes it as a ranking factor (you may hear others say it was for the previews you can get in organic listings now or other reasons, but the heart of the matter is, nobody likes a slow site).

    The good news is that Google is here to help and will drag your slow butt into the fast lane if it has to in order to make sure that everybody is on board with this concept. How are they doing this? For starters, “Site Performance” is one of the items you can review within Google Webmaster Tools under “Labs”; that’s usually your first indication that you have an issue (and where the graph in this post comes from). Then, about two years ago, Google released the Page Speed browser extension for Firefox, and earlier this year a Google Chrome extension that allows website owners to find out exactly what is slowing their pages down. They also released a Page Speed Online API to provide developers with specific suggestions for making their web pages faster. Last year, they released an Apache module called mod_pagespeed that automatically rewrites web pages before they are delivered to the public. And just the other day, Google announced the latest tool in the speed tool chest: Page Speed Service.
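
    If you run Apache and want to experiment with mod_pagespeed yourself, the basic setup is just a couple of directives. Here’s a minimal sketch; the module path varies by install and the filter list is just a sample, so treat this as a starting point rather than a recipe:

        # Load the mod_pagespeed module (path varies by distribution/install)
        LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so

        # Turn rewriting on and enable a few common optimization filters
        ModPagespeed on
        ModPagespeedEnableFilters combine_css,rewrite_images,extend_cache

    With that in place, the module rewrites your pages on the way out the door, which is exactly the chore Page Speed Service now offers to take off your plate entirely.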

    A lot of SEOs hate telling you about the tools they use to do their audits, but that’s mostly because if they keep SEO sounding like it’s a form of dark magic, it will scare the regular folk into paying them big bucks to fix their organic listings. We at Fang Digital think that’s kind of silly and respect the fact that the reason companies hire SEOs is that they usually just don’t have the resources to handle it internally or with the same level of efficiency. I mean, I could probably do my own plumbing work too, but it would take me twice as long and probably cost me some cuts and bruises, so I just hire a pro when the time comes.

    Anyway, one of the tools we use during our SEO Audits is the PageSpeed extension for Google Chrome, and, like I said, we almost always find something that is slowing our clients’ sites down enough for Google to notice. Some of our larger clients have internal resources that can attack the recommendations we provide from the PageSpeed tool, but many of our other clients wouldn’t even know where to start, and that’s where the Page Speed Service comes in handy.

    Google’s Page Speed Service is a service that automatically speeds up the load time of your web pages by grabbing your existing pages, running them through some optimization filters to fix a bunch of common issues, and then serving those optimized pages for you. As Google says, “Now you don’t have to worry about concatenating CSS, compressing images, caching, gzipping resources or other web performance best practices.”

    This is majorly cool in my book. Anytime I can outsource something like this to another pro who will just handle it for me, I’m on board. Of course, there are some concerns about the fact that you’re relying on Google’s servers to get your pages out to your customers, but if there were anybody you could trust with this, I would think Google would be at the top of the list.

    After you sign up, there are some changes to be made to your DNS settings, but that sort of thing has gotten easier and easier over the years, so I wouldn’t sweat it too much. Right now, Page Speed Service is on a free, limited run, so not everybody can use it, but if you can get in, I’d give it a shot. I’m also hearing some grumbling that some sites are slower after they use Page Speed Service, but I have been assured that this was caused by the initial rush of new users and that Google is compensating and adjusting constantly to get past this issue (if they haven’t already).
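
    For the curious, the DNS change boils down to repointing your site’s hostname at Google with a CNAME record. A BIND-style zone file entry would look something like this; the target hostname shown is purely illustrative, since Google gives you the exact value during setup:

        ; Serve www through Page Speed Service (target hostname illustrative)
        www   IN   CNAME   ghs.google.com.

    Once the record propagates, requests for your pages flow through Google’s servers, which hand out the optimized copies.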

    Like I said, this is a free service right now and Google has already made it known that they’ll charge for it eventually, so I’d get in on it now while it’s free and see how much of a difference it can make for your organic listings and traffic.  As I’ve always said, there really is no “secret sauce” for SEO, there’s just knowing and playing by the rules and the rest will fall into place.

    Enjoy!

    Check out Fang Digital Marketing for more articles by Jeff Ferguson

  • Webmasters: Googlebot Caught in Spider Trap, Ignoring Robots.txt

    Sometimes webmasters set up a spider trap or crawler trap to catch spambots or other crawlers that waste their bandwidth. If some webmasters are right, Googlebot (Google’s crawler) seems to be having some issues here.

    In the WebmasterWorld forum, member Starchild started a thread by saying, “I saw today that Googlebot got caught in a spider trap that it shouldn’t have as that dir is blocked via robots.txt. I know of at least one other person recently who this has also happened to. Why is GB ignoring robots?”

    Another member suggested that Starchild was mistaken, as such claims have been made in the past, only to find that there were other issues at play.

    Starchild responded, however, that it had been in place for “many months” with no changes. “Then I got a notification it was blocked (via the spidertrap notifier). Sure enough, it was. Upon double checking, Google webmaster tools reported a 403 forbidden error. IP was google. I whitelisted it, and Google webmaster tools then gave a success.”

    Another member, nippi, said they also got hit by it four months after setting up a spider trap, which was “working fine” until now.

    “The link to the spider trap is rel=nofollowed, and the folder is banned in robots.txt. The spider trap works by banning by IP address, not user agent, so it’s not caused by a faker – and of course robots.txt was set up correctly and prior; it was in place days before the spider trap was turned on, and it’s run with no problems for months,” nippi added. “My logs show it was the real Google, from a real Google IP address, that ignored my robots.txt, ignored rel=nofollow and basically killed my site.”
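
    For context, the setup nippi describes looks something like this (the directory name here is hypothetical): the trap directory is disallowed in robots.txt, so a compliant crawler should never request it, and any client that fetches it anyway gets its IP banned.

        # robots.txt – compliant crawlers should never enter the trap directory
        User-agent: *
        Disallow: /spider-trap/

    The bait link in the page HTML would then be marked up along the lines of <a href="/spider-trap/" rel="nofollow">, which is why a crawl of the disallowed path from a genuine Googlebot IP address is so surprising.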

    We’ve reached out to Google for comment, and we’ll update if and when we receive a response.

    Meanwhile, Barry Schwartz is reporting that one site lost 60% of its traffic instantly, due to a bug in Google’s algorithm. He points to a Google Webmaster Help forum thread where Google’s Pierre Far said:

    I reached out to a team internally and they identified an algorithm that is inadvertently negatively impacting your site and causing the traffic drop. They’re working on a fix which hopefully will be deployed soon.

    Google’s Kaspar Szymanski commented on Schwartz’s post, “While we can not guarantee crawling, indexing or ranking of sites, I believe this case shows once again that our Google Help Forum is a great communication channel for webmasters.”