WebProNews

Tag: rand fishkin

  • SEO 2018 is All About Branded Search Queries

    SEO 2018 is All About Branded Search Queries

    SEO pioneer and Moz co-founder Rand Fishkin was recently asked about the future of SEO and how SEO agencies and marketers should adjust to these changes. Rand says that SEO has changed to the point that the traditional SEO focus on generic keywords related to your products and services is becoming obsolete because of how Google now displays search results.

    See below for highlights from Rand’s thoughts on the changing SEO landscape:

    There is definitely this problem in the SEO space where the amount of opportunity in SEO (is lessened), because of Google entering all these business sectors, taking away a lot of the clicks, and trying to solve the searchers’ query before they ever click.

    This means that you just don’t have as much opportunity to earn search clicks from this search engine anymore.

    Branded Search Queries

    How can we overcome that? I think the answer in SEO is pretty clear, which is the one thing that Google is not taking away from us: branded search queries. If somebody searches for StatBid or they search for Moz or they search for The North Face, that is a searcher telling Google to take me to this company’s website. That’s a very, very powerful, undeniable signal that they want to reach your site. You’re getting 90% plus click-through rates on those.

    If you can increase the number of people searching for your brand, instead of drawing a smaller and smaller percentage of the people who search for outdoor jackets, because Google’s placing all these ads above the fold and trying to say these are the best outdoor jackets and having an instant answer and a featured snippet and all this type of stuff, yeah, that’s a big win.

    Combining Classic SEO with New Stuff

    I think for CMOs and for marketing departments it’s going to be a combination of classic skill sets in SEO. First, how do we rank? How do we make sure that our site is technically optimized? How do we make sure that we’re doing good keyword research and keyword targeting? How do we create good content around all these things and solve searchers’ problems?

    Then I think it’s also going to be some new forms of marketing that SEOs are not generally familiar with, at least not historically. Those are things like creative that draws attention and eyeballs and interest. I think it’s a lot of storytelling. Great brands are built on the back of great storytelling, and that is traditionally a big weak spot for SEO.

    SEO Teams Need to Get Retooled

    So yeah, I think there’s going to be a combination of new and old. That could mean that some (SEO & marketing) teams need to get retooled or new people need to be added. It could mean that agencies will have to upgrade those practices, and I’ve already seen some agencies in the SEO world start to do that. I think serving existing demand is, long term, not going to be as exciting as creating more demand.

  • Moz Founder Says Google Knows Everything and is Now Relying on Behavioral Data for Search Rankings

    Moz Founder Says Google Knows Everything and is Now Relying on Behavioral Data for Search Rankings

    Google is now relying on behavioral data that it gets from searchers, Chrome users, Android, etc. as the primary way that it ranks pages, according to Moz founder Rand Fishkin, who spoke at the recent MozCon event.

    Google used to have to predict what searchers were going to do and used a reasonable surfer model as the premise for its search algorithm. No more, says Rand Fishkin. Google knows everything, so it doesn’t have to predict, because it already knows. Interesting but very scary stuff.

    Rand Fishkin, founder and former CEO of Moz and current founder of Sparktoro, discussed Google’s current approach to ranking websites at MozCon:

    What is Google Going To Do About Judging Links?

    I want to clear from my mind all the things that I knew about link building up until this point and instead take a look at companies and brands and websites and just ask: what did they do right, what did we do wrong in the past, and what is Google going to do about judging links?

    You might remember that last year, when Google announced RankBrain, they said it was the third most important ranking factor. You might also recall Danny Sullivan asking them what the first two were. A few of us were on a phone call with one of Google’s engineers and brought this up, and he was like, what are you talking about, everyone knows the first two are content and links. Still true.

    In the past, link evaluation algorithms have focused on factors we’re all familiar with, such as PageRank, source diversity, anchor text, trust distance, domain authority, location on the page, spam outlink analysis, yadda yadda yadda. All these little individual factors around how Google judges a particular link and all the links that point to a website.

    Google is Going Away From the Reasonable Surfer Model

    But this is not where they’re going. Google is moving away from the reasonable surfer model. Remember what PageRank was supposed to do, even in 1998: it was supposed to predict which links on a page were important, assign values to them, and assign those values based on the probability, the chance, of someone clicking on those links.

    Of course, Google was very naive in 1998 and so all they could do was assign the same weight to all the links on a page and they assigned the weight of a page based on all the links that pointed to it.
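As an illustration of the naive 1998-style model Fishkin describes (a hedged sketch of the classic published algorithm, not Google's actual code), uniform-weight PageRank splits each page's rank equally among its outgoing links:

```python
# Illustrative sketch of naive, uniform-weight PageRank: every outgoing
# link on a page gets the same share of that page's rank. The graph and
# parameters below are made up for the example.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                # Naive model: split this page's rank equally across its links,
                # rather than weighting by predicted (or observed) click probability.
                share = rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += damping * share
        rank = new_rank
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# Page "c" accumulates the most rank: it is linked from both "a" and "b".
```

A reasonable surfer model (and, per Fishkin, behavioral data) replaces that equal split with per-link weights, so a prominent, frequently clicked link passes more value than a footer link.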

    Google Search Relying on Behavioral Data Because Google Knows Everything

    But that is not today. Today, thanks to Chrome and Android and Google WiFi and Google Fiber, Google knows everything. Google’s sample of everything that happens on the web is probably in the 80 to 90 percent range. It’s insane and it’s crazy. Because of that, they can see. Google knows where people were, where they go, and where they go next. You don’t need a reasonable surfer model anymore. You don’t need to predict, because you know.

    Google’s goal is pretty clear, it’s searcher satisfaction. Google knows that if they satisfy searchers well, those searchers will return again and again. The number of searches will go up and the number of searches per searcher will keep going up and that’s what we’ve been seeing. Even as desktop has leveled off in its growth, mobile keeps growing and searches and searches per searcher keep growing.

    Google’s core search team asks the same question every time: are searchers satisfied with the results? The way they know that is by finding out if searchers are getting the answers they’re seeking. How does Google get to that? Behavioral data.

  • Zuckerberg Opens Up About Facebook Search

    Zuckerberg Opens Up About Facebook Search

    As you may know, Facebook greatly expanded its search offerings last month when it launched the ability to search posts. You can now search keywords and get back results from people and pages you’re connected to. It instantly made Facebook a lot more useful as a search tool because it gives you access to content that’s not getting indexed by Google. This is often content that’s particularly relevant based on your personal connections to its creators.

    Have you been using Facebook’s new search functionality? Let us know in the comments.

    For example, you can easily find your friends’ posts about soup to get some ideas for your next soup batch. If you have one friend in particular that you consider a soup whiz, you can easily find his or her soup posts.

    There’s a lot of speculation about where Facebook might be headed with search. Facebook released its Q4 and full year 2014 earnings on Wednesday. During the conference call that followed, CEO Mark Zuckerberg talked a little bit about the company’s search efforts.

    While Zuckerberg didn’t exactly drop any bombshells, he did offer his thoughts on the company’s search direction. From his prepared remarks (via SeekingAlpha’s transcript):

    Search at Facebook is another important effort that we expect to create a lot of value over the next few years. In this quarter, we launched updates to Facebook search to make it easier to find content and posts on mobile and desktop. We’re going to continue listening to the feedback from our community and committing time to build really valuable products here. We’re optimistic about our ability to deliver value that only Facebook is able to provide.

    During the Q&A session, Zuckerberg was asked to talk more about Facebook’s approach to search. He said:

    So, our view on this is that there is a lot of unique content that people have shared in Facebook, a lot of personal content, recommendations from friends that you can get that you just wouldn’t be able to get through a traditional web search service or other app. And we’re on this multiyear voyage to basically index all the content and make it available to people and rank it well. We started off by launching graph search which I think included more than a trillion different connections in the first system.

    And the second round of the search progress that we just started rolling out at the end of last year was post search, which now has indexed more than I think a trillion posts, which I mean the sizes of these corpuses are bigger than anything in a traditional web search corpus that you would find. So it’s an interesting and fun challenge to make this work. We’re seeing that people immediately understand how they can use this and find content that they’ve seen in News Feed before or that they’ve posted with just a few keywords.

    And we’re excited about that, but there is a lot more to do. So we’re not really thinking about advertising in it yet on the scale that our community operates, a billion searches per day is actually not that big compared to what we think the opportunity here should be. And we’re just continuing to keep on working on it because there is just a lot of unique value that people should be able to get [from] their friends on Facebook search.

    Earlier this week, search marketing veteran Rand Fishkin shared his thoughts on the direction of Facebook search after predicting that the company will start to include web content in its search results this year (in a different way than it has done in the past with Bing).

    “With Bing, Facebook was simply showing external results (like a metasearch engine),” he said. “I think if they use their own crawlers to gather data and a system to serve it, there will be a more holistic, cohesive experience, likely biased/filtered by some of the things Facebook knows about the user(s) doing the searching.”

    On whether or not Facebook’s recent search improvements are having a significant impact on how people find information so far, and whether or not they will in the future, Fishkin told us, “No, and I think in the next few years, the answer will continue to be mostly no (at least if we’re talking about web search kinds of information vs. ‘where’s my friend’s party Friday night?’ or ‘What does so-and-so’s new boyfriend look like?’). But, long term, I think there’s a possibility. If their early efforts show promise and a direction, I think we can extrapolate from there. For now, I’m not sold.”

    Facebook has been releasing a lot of standalone apps. Among these are dedicated apps for messaging, for managing Pages, and for Groups. Would they launch a dedicated search app? Should they?

    “No and probably no,” Fishkin said. “I think Facebook’s castle is their social graph and the private postings of people to whom other people are connected. They should continue to release products and apps that help build that moat, but for right now, broad search doesn’t fit that world, IMO.”

    Regardless of whether or not people are actively using it as such, Facebook search gives users new ways of obtaining information. This must mean that businesses, who have suffered drastic declines in organic reach in the News Feed, have some new opportunities to get in front of those actually searching. Fishkin’s advice is as follows.

    “Do remarkable things that people on Facebook want to talk about and share,” Fishkin said. “And if that’s too much, at least make sure all your business details are as up-to-date and accurate as possible on Facebook, and that you’re sharing things your followers/fans on that network will actually care about (even if that’s only a few times a year). Just make sure you don’t make Facebook the center of your online promotional efforts – save that for your website and use Facebook to drive traffic to it. You should never build your castle in someone else’s walled garden.”

    Video was a focal point of much of Facebook’s earnings call. We shared some highlights on that subject over here.

    If Facebook search becomes as big as the company would like it to, that could be a very damaging thing for YouTube, as it currently rules video search. With more people opting to post videos directly to Facebook and watch them there, searching for videos could also become much more common on the social network.

    Do you expect Facebook to make a significant impact on how people find information? Let us know in the comments.

    Image via Facebook

  • Did Google Penalize A Site For A Natural Link From Moz?

    Update: We’ve updated the post with some additional comments from Fishkin he gave us via email. See end of article.

    Google has been on a warpath against what it thinks are unnatural links, but many think it’s off the mark with some of them. Meanwhile, the search giant scares people away from using even natural links in some cases, whether it intends to or not.

    Have Google’s warnings to webmasters had an impact on your linking practices? Let us know in the comments.

    When one thinks about reputable companies and websites in the SEO industry, Moz (formerly SEOmoz) is likely to be somewhere near the top of the list. YouMoz is a section of the site that gives voices to other people in the industry who don’t work for the company. It’s essentially a place for guest blog posts.

    YouMoz, while described as a “user generated search industry blog” isn’t exactly user-generated content the same way something like Google’s YouTube is. YouMoz content must be accepted by the Moz staff, which aims only to post the highest quality submissions it receives. This is the way a site is supposed to publish guest blog posts. In fact, Google’s Matt Cutts seems to agree.

    If you’ll recall, Google started cracking down on guest blogging earlier this year. Google made big waves in the SEO industry when it penalized network MyBlogGuest.

    A lot of people thought Google went too far with that one, and many who either hosted guest blog posts or contributed them to other sites were put on edge. Reputable sites became afraid to link naturally, when the whole point is for links to be natural (isn’t it?).

    Understandably concerned about Google’s view of guest blogging, Moz reached out to Cutts to get a feel for whether its own content was in any danger, despite its clear quality standards. In a nutshell, the verdict was no. It was not in danger. Moz co-founder Rand Fishkin shares what Cutts told them back then:

    Hey, the short answer is that if a site A links to spammy sites, that can affect site A’s reputation. That shouldn’t be a shock–I think we’ve talked about the hazards of linking to bad neighborhoods for a decade or so.

    That said, with the specific instance of Moz.com, for the most part it’s an example of a site that does good due diligence, so on average Moz.com is linking to non-problematic sites. If Moz were to lower its quality standards then that could eventually affect Moz’s reputation.

    The factors that make things safer are the commonsense things you’d expect, e.g. adding a nofollow will eliminate the linking issue completely. Short of that, keyword-rich anchor text is higher risk than navigational anchor text like a person or site’s name, and so on.

    It sounded like YouMoz was pretty safe. Until now. Contributor Scott Wyden got a warning from Google about links violating its guidelines, which included his YouMoz article as well as a scraper post (that’s a whole other issue Google should work out).

    “Please correct or remove all inorganic links, not limited to the samples provided above,” Google’s message said. “This may involve contacting webmasters of the sites with the inorganic links on them. If there are links to your site that cannot be removed, you can use the disavow links tool…”

    The problem is that, at least according to Moz, the links were not inorganic.

    “As founder, board member, and majority shareholder of Moz, which owns Moz.com (of which YouMoz is a part), I’m here to tell Google that Scott’s link from the YouMoz post was absolutely editorial,” says Fishkin in a blog post. “Our content team reviews every YouMoz submission. We reject the vast majority of them. We publish only those that are of value and interest to our community. And we check every frickin’ link.”

    “Scott’s link, ironically, came from this post about Building Relationships, Not Links,” he continues. “It’s a good post with helpful information, good examples, and a message which I strongly support. I also, absolutely, support Scott’s earning of a link back to his Photography SEO community and to his page listing business books for photographers (this link was recently removed from the post at Scott’s request). Note that ‘Photography SEO community’ isn’t just a descriptive name, it’s also the official brand name of the site Scott built. Scott linked the way I believe content creators should on the web: with descriptive anchor text that helps inform a reader what they’re going to find on that page. In this case, it may overlap with keywords Scott’s targeting for SEO, but I find it ridiculous to hurt usability in the name of tiptoeing around Google’s potential overenforcement. That’s a one-way ticket to a truly inorganic, Google-shaped web.”

    “If Google doesn’t want to count those links, that’s their business (though I’d argue they’re losing out on a helpful link that improves the link graph and the web overall). What’s not OK is Google’s misrepresentation of Moz’s link as ‘inorganic’ and ‘in violation of our quality guidelines’ in their Webmaster Tools. I really wish YouMoz was an outlier. Sadly, I’ve been seeing more and more of these frustratingly misleading warnings from Google Webmaster Tools.”

    Has Moz lowered its standards in the time that has passed since Cutts’ email? Fishkin certainly doesn’t think so.

    “I can promise that our quality standards are only going up,” he writes, also pointing to an article and a conference talk from the site’s director of community Jen Lopez on this very subject.

    “We’d love if Google’s webmaster review team used the same care when reviewing and calling out links in Webmaster Tools,” Fishkin writes.

    Burn.

    Cutts would most likely have something to say about all of this, but he happens to be on leave, and isn’t getting involved with work until he comes back. He has been on Twitter talking about other things though. It will be interesting to see if he gets sucked back in.

    The whole ordeal should only serve to scare more people away from natural linking as Google has already been doing. If Google is penalizing a site for links from a site like Moz, what’s safe?

    We’ve reached out to Fishkin for further comment, and will update accordingly.

    Update: Fishkin tells us via email that he doesn’t think Google’s targeting of guest blogging in general is off base, but that their reviewers “need to be more discerning in marking problematic links.”

    He goes on to say: “When they select editorial links to highlight as problematic ones, they’re creating a serious problem for site owners on both sides. Correctly identifying non-editorial links really does help site owners improve their behavior, and I know there’s plenty of folks still being manipulative out there.”

    “In terms of Google ruining natural linking, I suspect that’s an unintended side effect of their efforts here. They’re trying to do a good thing – to show which links are causing them not to trust websites. But when they mark editorial links as inorganic, they inadvertently scare site owners away from making positive contributions to the web with the accordingly correct citation of their work. That’s how you get a Google-shaped web, rather than a web-shaped Google.”

    Image via Moz

    Do you think Google is going overboard here? Share your thoughts in the comments.

  • Moz Partners With Bitly For Click Tracking And Link Data

    Bitly announced today that it will provide click tracking technology and inbound link data to Moz to help users better understand who is linking to websites and how relevant those links are.

    The data will use the number and frequency of clicks to determine relevancy.

    Moz had been using Twitter link data to rank relevance, but Bitly’s data will draw on Twitter as well as Facebook, Google+, blogs, and other sources.
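As a rough illustration of click-based relevancy scoring (our own sketch; neither Bitly nor Moz has published its actual method), one could weight each link by its click count, with recent clicks counting more than stale ones:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of click-based link relevancy -- not Bitly's or Moz's
# real scoring. Each click record is (link, timestamp); recent clicks count
# for more via exponential decay with a configurable half-life.

def relevancy_scores(clicks, half_life_days=30.0, now=None):
    now = now or datetime.now(timezone.utc)
    scores = {}
    for link, ts in clicks:
        age_days = (now - ts).total_seconds() / 86400
        # A click half_life_days old contributes half as much as a fresh one.
        scores[link] = scores.get(link, 0.0) + 0.5 ** (age_days / half_life_days)
    return scores

now = datetime.now(timezone.utc)
clicks = [
    ("example.com/a", now - timedelta(days=1)),
    ("example.com/a", now - timedelta(days=2)),
    ("example.com/b", now - timedelta(days=120)),
]
scores = relevancy_scores(clicks, now=now)
# Two recent clicks on /a outweigh one four-month-old click on /b.
```

The half-life parameter is the knob here: a short half-life captures "frequency" of recent sharing, a long one approximates raw click counts.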

    “The Bitly click dataset is hands down the broadest and most authoritative available to anyone looking for information on how their content and brand is performing across the web,” said Moz co-founder and former CEO Rand Fishkin. “Marketers armed with these insights are able to build campaigns that are designed to optimize attention through content.”

    “Previously we were using just Twitter data to understand the relevance of shared content,” he added. “While that’s a great start, our clients are looking for a holistic view. Bitly’s click data gives us a much more comprehensive and accurate picture by looking at the entire web and drilling into actual clicks, which is more valuable than simply looking at how frequently content is shared.”

    Bitly CEO Mark Josephson said, “Bitly owns a unique view of how links are shared across the internet. Insights gleaned from our differentiated data set can help all marketers make better decisions. We’re excited to put this into action with Moz so their clients can better understand how content and links are shared across the Web.”

    According to the company, marketers can identify recently created URLs and links within seconds, and highlight the most clicked content for effective campaign management.

    Image via Moz

  • Sarah Bird Is Now Officially The CEO Of Moz

    A little over a month ago, Rand Fishkin announced that he would step down as CEO of Moz (formerly SEOmoz). He would remain with the company, and focus on product and marketing, while handing over the reins to President and COO Sarah Bird.

    Read our recent interview with Fishkin about the transition here.

    Fishkin told us the transition would come in mid-January, which has now arrived. Bird is now officially CEO.

    The linked post includes a half-hour video about Fishkin’s and Bird’s past work together and plans for the future.

    Image via Moz

  • Rand Fishkin On The Best And Worst Parts Of Being Moz CEO

    Last month, we learned that Moz (formerly SEOmoz) CEO Rand Fishkin is stepping down from the role. He revealed that he would be handing the reins over to President and COO Sarah Bird, while taking on less of a people management role, and instead focusing more on his product and marketing passions with the company.

    We reached out to Fishkin for more about his decision and the pending transition. He told us about what he liked and disliked about being a CEO, as well as his regrets about holding the position.

    On what he enjoyed most, Fishkin told WebProNews, “The ability to create and influence the company culture, product, team, and mission have certainly been the best parts. I’m hopeful that the ‘influence’ parts will continue for a long time to come in this new role.”

    On what he enjoyed the least, he said, “Over time, it’s been a lot of the organizational development, conflict resolution, and people management issues. Those seem, to me, to be less about how to make a great product, market it, improve it, and deliver value to customers, and more about politics, which I wish didn’t exist. The bigger a company gets, the harder all that stuff is, and the better you have to be at it in order to have success doing all the customer-value-add stuff.”

    “I also don’t really enjoy interacting with financial folks outside of Moz,” he added.

    Fishkin had plenty of nice things to say about Bird in his announcement and in an email he sent to Moz staff. He told WebProNews, “Sarah is far more capable of possessing and projecting optimism to the team, more emotionally and culturally well-suited to the people challenges at scale, and she’s not as easily overwhelmed by non-productive emotions as I am (which is something we definitely need).”

    When asked if he has any regrets about being CEO, Fishkin told us, “Absolutely. I think I’ve made numerous terrible decisions as CEO.”

    “That said,” he added. “It’s also been a remarkable run for the company – we’ve built something really amazing culturally, product-wise, and with the Moz brand, and I’m hopeful that long term, we’ll achieve the mission we’ve set for ourselves and help hundreds of thousands of SEO-focused marketers to do their job better.”

    After sharing his plans, Fishkin wrote a blog post titled “Can’t Sleep; Caught in the Loop,” in which he talked about his worst weeks of 2013, when what he described as a “weird mental cycle” kept him awake. He calls it “The Loop.”

    “Moz’s performance this year (which wasn’t great, but was still fairly good, ~25% growth) isn’t directly connected to Sarah taking the leadership role, but it does have an indirect impact,” Fishkin told us. “I think the people challenges at our scale, combined with some of the tough decisions that didn’t pan out created a lot of cycling negativity in my head that I’ve referred to as ‘The Loop.’ That negativity and the emotional impact it’s had on me, and by extension, Moz, are certainly part of the reason I wanted to make this move.”

    “That said, there are others, too,” he added. “I think Sarah will make an excellent CEO long term, and I want to focus more on individual contributor types of work. I also want to put my energy into things I love (like product & marketing) rather than those I don’t, but felt obligated to do (like people issues).”

    Fishkin and Bird recently spoke with the Moz board, and determined that the move will be made in mid-January, when they’ll be moving to a new office.

    Image: Rand Fishkin

  • Rand Fishkin Is Stepping Down As CEO Of Moz

    Moz CEO Rand Fishkin announced in a blog post that he is stepping down as the CEO of Moz (formerly SEOmoz). But don’t worry, you won’t be seeing less of him. In fact, from the sound of it, you’ll be seeing more.

    Fishkin is not leaving his company or anything like that. He’s just decided that he’d like to focus less on things like people managing and more on product and marketing – the areas where he is more comfortable. He’s handing the reins over to President and COO Sarah Bird.

    Fishkin shared an email sent to staff, in which he says:

    My role will actually be very similar. I’ll likely be spending more time in the weeds with product design, marketing initiatives, and evangelism (blogging/speaking). I’ll continue to represent Moz externally quite a bit. But I won’t be doing much people managing (only Nicci will continue to report to me), work on our finances, organizational development stuff, or recruiting/hiring of senior staff. I’ve also promised to write a book next year on startup marketing!

    I want to change my title to “individual contributor.” Mostly because it reflects my belief that you don’t need to manage people in order to have influence, I love and want to promote the IC track/concept, and that titles are kinda BS 🙂

    I will continue to be on the eteam and on the board of directors, representing internal shareholders (like y’all).

    Later in the email, addressing the question of if this means he will leave in the near future, Fishkin says, “Hell no…You’d have to push me kicking and screaming. I plan to be here for a long time to come.”

    The timeframe for the transition isn’t clear yet, but Fishkin says it will be more so after a board meeting next week.

    Image: Rand Fishkin

  • SEOmoz Will Now Be Known As ‘Moz’

    SEOmoz has changed its name to simply “Moz” to better reflect the growing industry of which it is a part. The company, however, isn’t only rebranding itself. It’s also launching a completely new product.

    SEOmoz now redirects to Moz.com.

    “The Problem: There isn’t a product that measures the effectiveness and impact of inbound marketing efforts,” a spokesperson for the company tells WebProNews. “Sure, Google Analytics can give you traffic data—but it doesn’t give you data for all your efforts across other channels. Moz Analytics is built to expose that data and give marketers insight into how to improve their efforts. It helps users answer a difficult question: What is the ROI of inbound marketing?”

    “The Pitch: Moz is the result of two years of product development, based on a trend that was clear to our Founder and CEO, Rand Fishkin,” the spokesperson adds. “The world of search marketing has evolved. Social media marketing matters, content matters, and SEO matters—they all contribute to the greater picture of inbound marketing, or earned marketing, as we like to call it. Moz manages and analyzes those marketing efforts on a single subscription platform.”

    “For the past decade, we’ve fought to make SEO a legitimate, respected part of a web marketer’s arsenal,” says Fishkin. “Today that battle is expanding to include content marketing, social media, community building, brand tracking, and other inbound channels. While SEO remains a key part of our product, it’s no longer transparent or authentic to say we’re purely an SEO software company.”

    The Moz Analytics platform includes SEO and link analysis features, social analytics and brand/web mention data. It will show where a brand, competitor, or industry topic is being talked about on the web, and companies can see where they’re being mentioned, but not linked to.

    “The transformation of Moz over the past year is a direct result of the feedback we’ve received from our customers and community,” says Fishkin. “We’ve taken input from thousands of marketers, and built the tools they need to understand the impact of their efforts.”

    Moz has over 25,000 customers, and boasts a community of over 300,000 online marketers. Moz Analytics is in private invite-only beta mode for the time being. They will start transitioning customers to the new software over the coming weeks.

    Fishkin discusses the change more over on the Moz Blog.

  • SEOmoz’s Acquisition Spree Culminates With AudienceWise

    SEOmoz announced that it has acquired AudienceWise, as the latest in a spree of acquisitions kicked off by an $18 million round of funding the company received last year from The Foundry Group and Ignition Partners. Other acquisitions have included Followerwonk and GetListed.

    “AudienceWise marks the culmination of our acquisition spree for this year,” SEOmoz CEO Rand Fishkin tells WebProNews. “Moving forward in 2013, our focus will be on accelerating our product roadmap and leveraging all the resources we’ve acquired over the last five months. The remainder of 2013 will be dedicated to putting our money where our mouth is: investing in our foundations, and ramping up our product offerings to SEOmoz’s 20,000+ users and our 300,000+ online community members.”

    AudienceWise was a company specializing in audience development consulting for publishers and e-commerce sites, but its team will now be integrated into SEOmoz’s as a talent acquisition, though SEOmoz does say it will incorporate some of AudienceWise’s technical processes, strategy and products into the Moz toolkit.

    “The main motivators for acquiring AudienceWise are the brains behind the operation, Matthew Brown and Tim Resnik,” says Fishkin. “Matt and Tim are joining Moz to help us scale our in-house marketing and grow our product expertise. Both have built software products in the past (Matt worked with Marshall on SearchCLU, Tim on an online poker subscription service) and have tremendous depth of knowledge in the fields of both inbound and paid marketing.”

    “We have a lot of phenomenal talent at SEOmoz, but only a few of us are deep into the fields of SEO, social media, content marketing, email, CRO, etc.,” he adds. “Matt and Tim are here to help serve as mentors and as internal consultant experts to our entire team, a role that I’ve been far too busy to fill effectively over the last 18 months.”

    From the sound of it, SEOmoz users have as much to gain from the deal as SEOmoz does.

    “We believe that SEOmoz subscribers will benefit immensely from Matt and Tim’s expertise, which will improve how we build our products,” says Fishkin. “Tim is a Big Data junkie. He understands the enormous potential of both structured and unstructured data, as well as the challenges that come with harnessing and leveraging it across all segments of a business.”

    “We know that Big Data matters to our users, and Tim’s vision will take us to the next level,” he continues. “Matt has worked in some of the toughest search marketing gigs, including The New York Times and at various Fortune 500 companies. Arguably, there is no tougher search marketing gig than publishing, where you live or die by clicks, and the competition grows daily. With this background, Matt knows what it takes to drive successful search marketing. His brain and willpower are going to help evolve SEOmoz’s product roadmap to meet the needs of our ever-growing user base.”

    Exact terms of the deal were not disclosed, but the price was somewhere in the low seven figures.

  • SEO Shows Google Results Can Be Hijacked

    People have been claiming to see scrapers of their content showing up in Google search results over their own original content for ages. One SEO has pretty much proven that if you don’t take precautions, it might not be so hard for someone to hijack your search result by copying your content.

    Have you ever had your search results hijacked? Scrapers ranking over your own original content? Let us know in the comments.

    Dan Petrovic from Dejan SEO recently ran some interesting experiments, “hijacking” search results in Google with pages he copied from original sources (with the consent of the original sources). Last week, he posted an article about his findings, and shared four case studies, which included examples from MarketBizz, Dumb SEO Questions, ShopSafe and SEOmoz CEO Rand Fishkin’s blog. He shared some more thoughts about the whole thing with WebProNews.

    First, a little more background on his experiments. “Google’s algorithm prevents duplicate content displaying in search results and everything is fine until you find yourself on the wrong end of the duplication scale,” Petrovic wrote in the intro to his article. “From time to time a larger, more authoritative site will overtake smaller websites’ position in the rankings for their own content.”

    “When there are two identical documents on the web, Google will pick the one with higher PageRank and use it in results,” he added. “It will also forward any links from any perceived ’duplicate’ towards the selected ‘main’ document.”
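    The selection process Petrovic describes can be sketched with a toy PageRank computation. This is purely an illustration, not Google’s actual implementation: a minimal power iteration over a tiny hypothetical link graph, showing why the copy with stronger inbound links can win the canonical slot.

    ```python
    # Toy sketch (not Google's algorithm): minimal PageRank power iteration.
    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping page -> list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new = {p: (1 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if not outlinks:
                    # Dangling page: spread its rank across all pages.
                    for p in pages:
                        new[p] += damping * rank[page] / n
                else:
                    share = damping * rank[page] / len(outlinks)
                    for target in outlinks:
                        new[target] += share
            rank = new
        return rank

    # "original" and "copy" host identical content; the copy picks up
    # inbound links from a well-linked blog, mirroring Petrovic's setup.
    graph = {
        "blog": ["copy", "other"],
        "other": ["blog", "copy"],
        "original": ["blog"],
        "copy": ["blog"],
    }
    ranks = pagerank(graph)
    chosen = max(["original", "copy"], key=lambda p: ranks[p])  # the better-linked duplicate
    ```

    Under these assumptions the better-linked copy accumulates more rank than the unlinked original, which is the behavior Petrovic exploited.
    
    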

    In the MarketBizz case, he set up a subdomain on his own site and created a single page by copying the original HTML and images of the content he intended to hijack. The new page was +1’d and linked to from his blog. The page replaced the original one in the search results, thanks to a higher PageRank and a few days for Google to index the new page.

    In the Dumb SEO Questions case, he tested whether authorship helped against a result being hijacked. Again, he copied the content and replicated it on a subdomain, but without copying any media. The next day, the original page was replaced with the new page in Google, with the original being deemed a duplicate. “This suggests that authorship did very little or nothing to stop this from happening,” wrote Petrovic.

    In the ShopSafe case he created a subdomain and replicated a page, but this time the original page contained rel=”canonical”; the tag was stripped from the copied page. The new page overtook the original in search, but it didn’t replace it when he used the info: command. The +1’s were removed after the hijack to see if the page would be restored, and several days later, the original page overtook the copy, Petrovic explained.

    Finally, in the Rand Fishkin case, he set up a page in similar fashion, but this time “with a few minor edits (rel/prev, authorship, canonical)”. Petrovic managed to hijack a search result for Rand’s name and for one of his articles, but only in Australian searches. This experiment did not completely replace the original URL in Google’s index.


    If you haven’t read Petrovic’s article, it would make sense to do so before reading this. The subject came up again this week at Search Engine Land.

    “Google is giving exactly the right amount of weight to PageRank,” Petrovic tells WebProNews. “I feel they have a well-balanced algorithm with plenty of signals to utilise where appropriate. Naturally, like with anything, Google tries to be sparing of computing time and resources as well as storage, so we sometimes see limitations. I assure you, they are not due to lack of ingenuity within Google’s research and engineering team. It’s more to do with resource management and implementation – practical issues.”

    The Dumb SEO Questions example was interesting, particularly in light of recent domain-related algorithm changes Google has made public. In his findings, Petrovic had noted that a search for the exact match brand “Dumb SEO Questions” brought the correct results and not the newly created subdomain. He noted that this “potentially reveals domain/query match layer of Google’s algorithm in action.”

    Petrovic believes there is still significant value to having an exact match domain. “Exact match domains were always a good idea when it comes to brands; it’s still a strong signal when it’s a natural situation, and is now more valuable than ever since Google has swept up much of the EMD spam,” he says.

    Here’s what industry analyst Todd Malicoat had to say on the subject in a recent interview.

    Regarding the Fishkin experiment, Petrovic tells us, “Google’s perception of celebrity status or authority is just a layer in the algorithm cake. This means that if there is a strong enough reason Google will present an alternative version of a page to its users. There goes the idea that Wikipedia is hardcoded and shows for everything.”

    When asked if freshness played a role in his experiments, he says, “Yes. Freshness was a useful element in my experiments, but not the key factor in the ‘overtake’ – it’s still the links or should I say ‘PageRank’. I know this surprised a lot of people who were downplaying PageRank for years and making it lame to talk about it in public.”

    “This article was me saying ‘stop being ignorant,’” he says. “PageRank was and is a signal, why would you as an SEO professional ignore anything Google gives you for free? The funniest thing is that people abandon PageRank as a ridiculous metric and then go use MozRank or ACRank as an alternative, not realising that the two do pretty much the same thing, yet [are] inferior in comparison.”

    “To be fair, both are catching up with real PageRank, especially with Majestic’s ‘Flow Metrics’ and the growing size of SEOMoz’s index,” he adds.

    Petrovic had some advice for defending against potential hijackers: use rel=”canonical” on your pages, use authorship, use full URLs for internal links, and engage in content monitoring with services like CopyScape or Google Alerts, then act quickly and request removals.
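    The content-monitoring step of Petrovic’s advice can be sketched in a few lines. This is only an illustration with hypothetical helper names (`page_text`, `is_probable_scrape`) and an arbitrary threshold; services like CopyScape and Google Alerts do this detection at web scale.

    ```python
    # Toy sketch of content monitoring: strip a page down to its visible
    # text and flag suspected scrapes whose text nearly matches yours.
    from difflib import SequenceMatcher
    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        """Collects visible text, ignoring script/style contents."""
        def __init__(self):
            super().__init__()
            self.parts = []
            self._skip = 0
        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self._skip += 1
        def handle_endtag(self, tag):
            if tag in ("script", "style") and self._skip:
                self._skip -= 1
        def handle_data(self, data):
            if not self._skip:
                self.parts.append(data)

    def page_text(html):
        parser = TextExtractor()
        parser.feed(html)
        parser.close()
        # Collapse whitespace so markup differences don't matter.
        return " ".join(" ".join(parser.parts).split())

    def is_probable_scrape(original_html, candidate_html, threshold=0.9):
        ratio = SequenceMatcher(None, page_text(original_html),
                                page_text(candidate_html)).ratio()
        return ratio >= threshold, ratio

    original = "<html><body><h1>My post</h1><p>Unique article text here.</p></body></html>"
    scrape = "<html><body><h1>My post</h1><p>Unique article text here.</p><p>Ads</p></body></html>"
    flag, score = is_probable_scrape(original, scrape)
    ```

    A page flagged this way would then go through the “act quickly and request removals” step Petrovic describes.
    
    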

    He also wrote a follow up to the article where he talks more about “the peculiar way” Google Webmaster Tools handles document canonicalization.

    So far, Google hasn’t weighed in on Petrovic’s findings.

    What are your thoughts about Petrovic’s findings? Share them in the comments.

  • Rand Fishkin Talks Twitter’s Impact On SEO


    As previously reported SEOmoz has acquired Twitter analytics company Followerwonk. CEO Rand Fishkin said in a blog post announcing the deal that the companies have actually been working together since June.

    Followerwonk is a tool designed to help users find, analyze and optimize for “social growth,” and that means digging into Twitter analytics (who your followers are, where they’re located, when they tweet, etc.), and finding and connecting with influencers. Fishkin sees an opportunity to bring his SEO-savvy customers this kind of data, which can help them in their SEO endeavors, which are obviously not getting any easier these days.

    “I see Twitter impacting a lot of relationship building, which often leads to partnerships, links, referrals, and business development of all kinds,” Fishkin tells WebProNews. “We’re also seeing a very observable correlation directly between URLs/sites that are heavily mentioned on Twitter and enhanced performance in the search results.”

    “Whether that’s a direct or indirect result is harder to know, but plenty of examples and evidence certainly exist,” he adds.

    Google’s Matt Cutts actually talked a bit about social signals at the Search Engine Strategies conference in San Francisco this week. He briefly touched on Google’s relationship with Twitter data, since the deal the two companies once had fell apart last year.

    According to a paraphrased account of the conversation from Brafton, Cutts noted that Google can’t crawl Facebook pages or Twitter accounts to see who is reputable or has real world impact as a brand. Brafton’s account of Cutts’ words continues:

    People were upset when Realtime results went away! But that platform is a private service. If Twitter wants to suspend someone’s service they can. Google was able to crawl Twitter until its deal ended, and Google was no longer able to crawl those pages. As such, Google is cautious about using that as a signal – Twitter can shut it off at any time.

    We’re always going to be looking for ways to identify who is valuable in the real world. We want to return quality results that have real world reputability and quality factors are key – Google indexes 20 billion pages per day.

    SEOmoz may just be able to help users identify who is valuable in the real world, using Twitter data, thanks to its new acquisition. Fishkin noted in his announcement, by the way, that they may add Google+ and/or Pinterest data into the mix at some point.

  • The Everywhereist Has A Brain Tumor, Names It Steve, And Blogs About It

    Geraldine, the travel blogger behind The Everywhereist, has a brain tumor.

    As we’ve been covering the SEO industry for many years, many of our readers may know her best as Rand Fishkin’s wife. Rand, as you probably know, is the CEO of SEOmoz, and as Geraldine explains, The Everywhereist is as much a love letter to her husband as it is a travel blog.

    That love is on display, probably as much as it has ever been, as Geraldine has blogged about her tumor, which she has named Steve.

    In her post, Geraldine writes:

    As for why I named it Steve, … well, duh. What else was I going to name it? There is no one to whom I am particularly close who is named Steve. I’ve never kissed a boy named Steve. I’ve never uttered the phrase, “Steve, I love you.” And Steve is nice and short and easy to add to a long list of unrepeatable words. Behold:

    “Fucking goddamn miserable piece-of-shit Steve.”

    See how well that works? It kind of rolls off the tongue, really. And considering how many big words we’ve had to deal with over the last couple of weeks, I’m inclined to stick to something short and sweet and monosyllabic (this must be how the Kardashians feel).

    Rand comments on the post:

    Fucking goddamn miserable piece-of-shit Steve.

    Hey, look at that. It DOES roll off the… er… keyboard.

    I’m really proud of you KTL. You’ve been a trooper, and you’ve been so awesome to me these last few weeks. I love that you wrote this, too. You know it’s been tough for me to keep it secret, and I almost feel like part of the reason you’re publishing is to make me feel better. Thank you. I love you. I promise to be (mostly) nice and patient with your Mom while we wait at the hospital.

    p.s. We’re not religious, so mentions of various deities may confuse us.

    Rand has also been tweeting about the situation:

    Geraldine says the doctors are confident the tumor is not a glioma, but rather a pilocytic astrocytoma. Characteristics of this, according to the National Brain Tumor Society, which Geraldine linked to, include:

    • Slow growing, with relatively well-defined borders
    • Grows in the cerebrum, optic nerve pathways, brain stem and cerebellum
    • Occurs most often in children and teens
    • Accounts for two percent of all brain tumors

    The good news, Geraldine says, is that the doctors say there’s an 80% chance that Steve is benign, and that even if the tumor is not benign, “odds are he’s still very easily treatable.”

    Finding out a loved one has a brain tumor is tough news to hear. I know. I’ve lived it. I can’t imagine what it must be like to find out you have one yourself. These two have clearly kept in good spirits about the whole thing, or at least as good as anyone could keep. Considering the circumstances, the outlook seems pretty positive.

    Best of luck to Geraldine and Rand.

  • Rand Fishkin’s Negative SEO Challenge: 40K Questionable Links And Ranking Well

    Last month, we reported that SEOmoz CEO Rand Fishkin issued a negative SEO challenge. He challenged people to take down SEOmoz or RandFishkin.com using negative SEO tactics.

    “I’ve never seen it work on a truly clean, established site,” Fishkin told us at the time. He is confident enough in his sites’ link profiles and reputation. He also said, “I’d rather they target me/us than someone else. We can take the hit and we can help publicize/reach the right folks if something does go wrong. Other targets probably wouldn’t be so lucky.”

    We had a conversation with Fishkin today about the Penguin update, and about a new SEOmoz project related to webspam. We also asked for an update on how the challenge is going, and he said, “On the negative SEO front – I did notice that my personal blog had ~40,000 more links (from some very questionable new sources) as of last week. It’s still ranking well, though!”

    It sounds like the challenge is working out so far, which certainly looks good on Google’s part, especially in light of the Penguin update, and the opinions flying around about negative SEO. Just peruse any comment thread or discussion forum on the topic and there’s a good chance you’ll run into some of this discussion.

    I’m guessing the challenge is still on the table, but so far, Fishkin doesn’t seem to be having any problems.

    Of course, most people don’t have the link profile or reputation that Fishkin has established, but that also speaks to the need for content producers to work on building both.

  • Google Penguin Update: SEO And Marketing Services Feel The Effects

    There’s been a great deal of talk about the Google Penguin update since it launched last month, and a lot of webmasters are still trying to sift through the rubble and determine if their sites were even impacted by Penguin or some other Google algorithm change. In addition to Penguin, there were two Panda refreshes last month, and over 50 other changes, which Google finally listed on Friday.

    SEOmoz CEO Rand Fishkin tells WebProNews, “It’s done a nice job of waking up a lot of folks who never thought Google would take this type of aggressive, anti-manipulative action, but I think the execution’s actually somewhat less high quality than what Google usually rolls out (lots of search results that look very strange or clearly got worse, and plenty of sites that probably shouldn’t have been hit).”

    SEOmoz, by the way, has launched an interesting project aimed at tackling webspam on its own.

    Fishkin actually posted a new video discussing the Penguin update today, which is worth the watch, particularly if you’ve been affected. There are six main points he discusses, but one in particular that I found interesting is that there are a lot of sites in the marketing industry that appear to have been hit.

    Fishkin says, “There appears to be a very disproportionate level of sites in the marketing/services field affected by this. What I mean is, we have seen more people write in about keywords like, ‘seo services,’ ‘seo company, you know, some particular city name’, or ‘web design services, some particular city name’. Those types of results seem to be hit heavily.”

    “Now, I’m gonna throw out two things I think may be to blame here,” he continues. “One is: a lot of people who operate in these marketing services fields are also likely to have a lot of correlation with the people who are potentially getting the kinds of link spam to their web pages that Google hit in this update. So, it’s not necessarily [that] Google focused on these. It could be the types of spam they focused on and the types of links that these people had just happened to be correlated and connected. The other thing is, this could merely be a leading indicator…we’re obviously in the marketing and SEO field, and so it could be that we’re just getting a disproportionate number of those types of folks talking about it in Q&A, emailing, tweeting at us…all those kinds of things.”

    “That’s also possible, though usually we see more balance across the board, typically,” he notes.

    Beyond the obviously spam-heavy topics, like making money online and pharmaceuticals, we’d be interested to hear more about what kinds of sites have been impacted most by Penguin. Do you believe you were hit by Penguin? What industry is your site part of?

  • SEOmoz Takes On Webspam With Ambitious Project, Talks Penguin Update

    SEOmoz is working on a new spam research project aimed at classifying, identifying and removing (or at least limiting) the link juice that spam pages and sites can pass – a pretty ambitious goal, to say the least. Can SEOmoz do this better than Google itself?

    CEO Rand Fishkin announced the project on Google+ Monday evening, acknowledging that his company is “certainly not going to be as good at it or as scaled as Google,” but that it’s making for interesting research.

    Fishkin tells WebProNews that Google’s Penguin update was not the motivator behind the project, though he did have this to say about the update:

    “In terms of Penguin – it’s done a nice job of waking up a lot of folks who never thought Google would take this type of aggressive, anti-manipulative action, but I think the execution’s actually somewhat less high quality than what Google usually rolls out (lots of search results that look very strange or clearly got worse, and plenty of sites that probably shouldn’t have been hit).”

    You can read more about Penguin via our various articles on the topic here.

    “We’ve been wanting to work on this for a long time, but our data scientist was previously tied up on other items (and we’ve just hired a research assistant for the project),” Fishkin tells us. “The original catalyst was the vast quantity of emails and questions we get about whether a page/site is ‘safe’ to acquire links from, or whether certain offers (you know the kind – ‘$100 for 50 permanent text links guaranteed to boost your Google rankings!’) were worthwhile.”

    “Tragically, there’s a lot of money flowing from people who can barely afford it, but don’t know better, to spammers who know that what they’re building could hurt their customers, and Google refuses to take action to show which spam they know about,” he continues. “Our eventual goal is to build a metric marketers and site owners can use to get a rough sense of a site’s potential spamminess in comparison to others.”

    “A score (or scores) of some kind would (eventually, assuming the project goes well) be included in Mozscape/OSE showing the spamminess of inlinks/outlinks,” he explained in the Google+ announcement.

    According to Fishkin, the SEOmoz algorithms will be conservative and focus on the most obvious and manipulative forms of spam. “For example, we’d probably catch a lot of very obvious/bad link farms, but not necessarily many private blog networks or paid links from reputable sites,” he said in response to a comment on his Google+ post.

    Also in the comments, Fishkin indicated that data would be presented in a “matches patterns of sites we’ve seen Google penalize/ban” kind of way rather than a “you are definitely webspam” type of thing.
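    The shape of such conservative, pattern-based scoring can be sketched as follows. To be clear, every feature name, threshold, and weight here is invented for illustration; SEOmoz did not publish its actual signals.

    ```python
    # Purely hypothetical illustration of "matches spam patterns" scoring.
    # None of these features or weights come from SEOmoz.
    def spam_score(site):
        """site: dict of simple link-profile features for one domain."""
        score = 0.0
        # A huge outlink count per page is typical of obvious link farms.
        if site["outlinks_per_page"] > 100:
            score += 0.4
        # Nearly all anchor text being exact-match commercial keywords.
        if site["exact_match_anchor_share"] > 0.8:
            score += 0.3
        # Most pages openly selling "permanent text links".
        if site["pages_selling_links_share"] > 0.5:
            score += 0.3
        return min(score, 1.0)

    link_farm = {"outlinks_per_page": 400,
                 "exact_match_anchor_share": 0.95,
                 "pages_selling_links_share": 0.9}
    clean_site = {"outlinks_per_page": 12,
                  "exact_match_anchor_share": 0.05,
                  "pages_selling_links_share": 0.0}
    ```

    A rule set this blunt would, as Fishkin notes, catch blatant link farms while missing private blog networks or paid links from reputable sites.
    
    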

    The data scientist Fishkin spoke of will present the findings at the company’s Mozcon event in July. Fishkin expects an actual product launch late this year or early next year.

    Earlier this month, the company announced that it has raised $18 million in VC funding.

  • Can Your Site Lose Its Rankings Because Of Competitors’ Negative SEO?

    Rand Fishkin, the well known SEO expert and Founder/CEO of SEOmoz, has challenged the web to see if anyone can take down his sites’ rankings in Google by way of negative SEO – the practice of implementing tactics specifically aimed at hurting competitors in search, as opposed to improving the rankings of one’s own site. Fishkin tells WebProNews about why he’s made such a challenge.

    Do you think negative SEO practices can be effective in hurting a competitor’s rankings, even if that competitor is playing by all of Google’s rules and has a squeaky clean reputation? Let us know what you think.

    First, you’ll need a little background. There’s a thread in the forum Traffic Planet started by member Jammy (hat tip to Barry Schwartz), who talks about an experiment run with the cooperation of another member in which they were successfully able to have a hugely negative impact on two sites.

    “We carried out a massive scrapebox blast on two sites to ensure an accurate result,” Jammy writes. I’m not going to get into all of the details about why they targeted specific sites or even the sites themselves here. You can read the lengthy forum thread if you want to go through all of that.

    The important thing to note, however, is that the experiment apparently worked. BUT, Fishkin maintains that the sites in question weren’t necessarily in the best situations to begin with.

    “In terms of negative SEO on the whole – I think it’s terrible that it could hurt a site’s rankings,” Fishkin said in the forum thread. “That creates an entire industry and practice that no one (not engines, not marketers, not brands) benefits from. Only the spammers and link network owners win, and that’s exactly the opposite of what every legitimate player in the field wants. Thus, I’m wholeheartedly behind identifying and exposing whether Google or Bing are wrongly penalizing sites rather than merely removing the value passed by spam links. If we can remove that fear and that process, we’ve done the entire marketing and web world a huge favor.”

    “I’ve never seen it work on a truly clean, established site,” Fishkin tells WebProNews, regarding negative SEO. He says the examples from the forum “all had some slightly-seriously suspicious characteristics and not wholly clean link profiles already, and it’s hard to know whether the bad links hurt them or whether they merely triggered a review or algorithm that said ‘this site doesn’t deserve to rank.’”

    “If negative SEO can take down 100% clean sites that have never done anything untoward and that have built up a good reputation on the web, it’s more concerning and something Google’s search quality engineers would need to address immediately (or risk a shadow industry of spammers popping up to do website takedowns),” he adds.

    When asked why he would antagonize those who disagree with his view by offering his own sites as targets, Fishkin says, “Two things – one, I’d rather they target me/us than someone else. We can take the hit and we can help publicize/reach the right folks if something does go wrong. Other targets probably wouldn’t be so lucky.”

    Perhaps there should be a Good Guy Rand meme.


    “Two – if this is indeed possible, it’s important for someone who can warn the search/marketing industry to have evidence and be aware of it,” says Fishkin. “Since we carefully monitor our metrics/analytics, haven’t ever engaged in any spam and have lines over to some folks who could help, we’re a good early warning system.”

    So what happens if challengers are successful at taking down either SEOmoz or RandFishkin.com?

    “SEOmoz gets ~20% of its traffic from non-branded Google searches, so worst case, we’d see a 20-25% hit for a few days or a few weeks,” Fishkin tells WebProNews. “That’s survivable and it’s worth the price to uncover whether the practice is a problem. Our core values (TAGFEE) dictate that this is precisely the kind of area where we’d be willing to take some pain in order to prevent harm to others.”

    When asked if he’s confident that Google will correct the problem in a timely fashion if he’s proven wrong, Fishkin says, “Fairly confident, though not 100%. I have my fingers crossed it won’t get too messy for too long, but my COO and community manager are a little nervous.”

    Fishkin concludes our conversation with: “I’d say that the evidence on the Traffic Power thread is strong that if a site already has some questionable elements, a takedown is possible. But, it’s not yet proven whether wholly clean sites can be brought down with negative SEO. I hope that’s not the case, but I suspect the hornet’s nest I kicked up will probably answer that for us in the next month or two.”

    Word around the industry is that Google is making SEO matter less, in terms of over-optimization. Google’s Matt Cutts talked about this last month at SXSW, and his remarks led to a great deal of discussion and speculation as to just what this would entail.

    “The idea,” he said, “is basically to try and level the playing ground a little bit, so all those people who have sort of been doing, for lack of a better word, ‘over-optimization’ or overly doing their SEO, compared to the people who are just making great content and trying to make a fantastic site, we want to sort of make that playing field a little more level.”

    One thing’s for sure though: If negative SEO can truly impact clean sites, that’s not quite the level playing field Google is aspiring to create.

    Fishkin’s experiment is going to be an interesting one to keep an eye on. If SEOmoz can be severely impacted from this, who’s to say your site can’t? Do you think it’s possible? Tell us in the comments.

  • Challenges with Raising Venture Capital & Being Transparent about It

    It hurts to get close to something that you want and then not get it, doesn’t it? When we’re talking about money and business, this situation is even more painful. Furthermore, talking about it publicly does nothing but add more grief to an already complicated situation.

    Unfortunately, this is exactly the scenario that our friend Rand Fishkin, the CEO and co-founder of SEOmoz, found himself in not long ago. In 2007, the company received venture capital funding from investment firm Ignition, and earlier this year, was approached by a number of firms interested in investing further.

    Fishkin told us that the company had not planned on raising funding but that it began to get excited about the potential opportunity. During the bidding process, there was clearly one firm that stood out. Fishkin said it made them a good offer and the companies signed a term sheet.

    As he explained, this is “usually a done deal unless the investment firm finds fraud of some kind.” However, three weeks after the signing, the investment firm pulled out. Aside from the fact that SEOmoz did not receive the funding, he said it was also hard to understand why it happened since the firm did not give a clear reason for its action.

    “That experience was new for us,” said Fishkin. “I think folks tend not to write about the fact that even after a term sheet is signed, the investor can still pull out.”

    Because he has always been very open about all things SEOmoz, Fishkin wrote a very detailed post, within legal bounds of course, about the entire experience. WebProNews asked Fishkin why he felt so compelled to be open, since most companies would not go to such lengths to find out what they could actually disclose.

    He told us that transparency has always been a core value of SEOmoz and always would be. He believes that this includes both the good times and the bad times.

    “There’s nothing up my sleeve,” said Fishkin. “It’s all out there.”

    Is it possible for a business to be too transparent? What do you think?

    Fishkin and SEOmoz take transparency very seriously and believe in being upfront about all matters, even when they involve finances and legalities that aren’t flattering.

    “It’s one of the qualities that consumers and business customers appreciate so tremendously these days,” pointed out Fishkin. “We’re getting a culture, particularly in the technology world, that anticipates, loves, and rewards transparency.”

    With this transparency, there is also a risk since investors may avoid SEOmoz in the future out of fear of being the subject of a blog post. Fishkin admits that this is a very real concern but said it was one that he was willing to take.

    “It’s a risk that we feel comfortable with,” he said. “I would rather say I’m going to commit to our core values, we’re going to do it 100%, we will be transparent no matter the costs, rather than say… we’re transparent but only when it’s convenient for us.”

    Even though SEOmoz didn’t receive the funding, no one can say that the company doesn’t stick by its values. The experience, however, has made the company hesitant about raising capital in the future.

    “We’re going to go back to our original mission of not raising capital,” said Fishkin. “Maybe we’ll think about it again next year, but I sort of hope we don’t.”

    “I’d prefer not to go through that process,” he added. “It takes a lot of time and energy away from running the business.”

    If the opportunity were to come up again, Fishkin told us that he would like his company to be in a position in which it doesn’t need the funding, so that it could walk away if it wanted. Since most startups that are covered by the Silicon Valley media receive funding, he also said that he would try to create buzz around his company before he attempted another VC round.

    Although the experience was difficult, Fishkin and SEOmoz have received a lot of praise and support for being transparent. Fishkin told us the praise is a “good consolation prize” but that it was a little “bittersweet.”

    Going forward, he hopes that startups will be more aware of potential issues and that investors will be more cautious.

  • Are Likes and Retweets the New Links?


    Search has been evolving for years, and it looks as though it’s really starting to enter a new era entirely. While search and social media may be two different animals, it is becoming clearer that they’re directly related, and will continue to blend into one another.

    We're already seeing search engines attempt to place some kind of ranking on social updates. For example, we already know that search engines take things like follower quality into account in how they rank tweets (see more on that from Google and Bing).

    There has been a lot of talk of Facebook "likes" and Twitter retweets taking the place of links. Nobody's saying that links are dying exactly. There is obviously plenty of room for link sharing on either of these services, but in many cases these kinds of sharing are replacing links. Before Facebook even announced its plans to take over the web, WebProNews talked with Rand Fishkin of SEOmoz about how Twitter is "cannibalizing the Web's link graph":

    Now that Facebook's Open Graph and social plugins are devouring the web, suddenly liking is taking the place of linking in some speculative scenarios. We talked about some implications Facebook's initiative has for search in a recent article.

    While I don't think anyone specifically saw the Open Graph stuff coming too long before it was announced (maybe somewhat in the days leading up to it), it's really still reflective of what we've known for some time. The way people are obtaining information online is diversifying. I feel like I'm beating a dead horse (as I've written about this repeatedly), but it's just what the big picture is about. Google's real competition isn't coming from other search engines. It's coming from different avenues of information access.

    The biggest threat to Google the search engine (as opposed to the company, which offers a lot more) is people not having to rely on the traditional search engine. While I don't think Google has anything to truly worry about in terms of losing users, it has to worry more about users just not using it as often, because they're getting their information from apps…from friends via social networks…even when they're not necessarily at Facebook.com itself, but on any given site or app, via things like social plugins (Twitter has its own @anywhere platform, and we'll probably see more ways networks are penetrating sites. Hell, Google already has its Friend Connect and Buzz…I would not count the company out in expanding into more of this kind of stuff).

    Style Coalition CEO Yuli Ziv has an interesting article at Mashable about "5 Reasons Google and Search Won't Dominate the Next Decade". Her reasons include:

    1. The search process is inefficient
    2. Mobile GPS Eliminates the need for location-based search
    3. Social Matching Could Create Valuable Connections
    4. Content Recommendations to Replace Search
    5. Suggestions Will Be the Core of Our Shopping Experience

    She elaborates on each of these, of course, and some of them are debatable, but really, the diversification of how people obtain information has already begun.

    Facebook likes may not translate to better Google rankings, but so what? They may translate to a better Facebook ranking. After all, the more people that "like" your brand, the greater the visibility within Facebook. With over 400 million users and counting, and Facebook expanding its presence, that means more visibility period, and at a more meaningful level of personalization. It's not about choosing between likes and links. Both are ideal.

    WebProNews recently stopped by comScore’s New York offices, and had a chat with search evangelist Eli Goodman who made some good points about where search is headed, and how not only the technology of search engines changes over time, but the habits of users, and the relationship between the two.

    As far as optimizing for search, it seems pretty clear that social and mobile will continue to play larger roles. It also seems clear that if you want social success, you need to work at your relationships with others within your networks. Look at Twitter’s Promoted Tweets strategy around "resonance." Look at tools like Trst.me, which uses a PageRank-like strategy to score Twitter users.
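    For readers curious what a "PageRank-like strategy" means in practice, here is a minimal illustrative sketch in Python. Trst.me's actual algorithm is not public, so the graph, damping factor, and scoring below are assumptions for illustration only: each user's score flows to the users they retweet, so being retweeted by well-scored accounts raises your own score.

    ```python
    # Illustrative sketch only: a minimal PageRank-style score over a
    # hypothetical "who retweets whom" graph. This is NOT Trst.me's
    # actual algorithm, just the general link-graph scoring idea.

    def pagerank(graph, damping=0.85, iterations=50):
        """graph: dict mapping user -> list of users they retweet."""
        nodes = list(graph)
        n = len(nodes)
        rank = {node: 1.0 / n for node in nodes}
        for _ in range(iterations):
            new_rank = {node: (1.0 - damping) / n for node in nodes}
            for node, targets in graph.items():
                if targets:
                    # Split this node's rank evenly among who it retweets.
                    share = damping * rank[node] / len(targets)
                    for target in targets:
                        new_rank[target] += share
                else:
                    # Dangling node: spread its rank evenly over everyone.
                    for target in nodes:
                        new_rank[target] += damping * rank[node] / n
            rank = new_rank
        return rank

    # Hypothetical graph: alice and bob both retweet carol.
    scores = pagerank({"alice": ["carol"], "bob": ["carol"], "carol": []})
    print(max(scores, key=scores.get))  # carol scores highest
    ```

    The point of the analogy: just as Google treats a link as a vote for a page, a social scoring tool can treat a retweet or a like as a vote for a person or a piece of content, weighted by the standing of whoever cast it.
    
    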

    Look at the implications of Facebook likes. Regardless of what Facebook chooses to do with this data itself, they’re already being utilized in other places, like in search via OneRiot. The whole point of Facebook’s Open Graph is to connect the web. It stands to reason that Facebook likes will be of influence in plenty more places.

    The point of all of this is, it’s not just about getting links anymore. Links will always be of use, but social interactions may equal them in importance, and in some cases may be of greater use to your visibility, and ultimately getting people to your site, your content, your store, or your shopping cart.