WebProNews

Category: SEMProNews

  • Microsoft Fires Two Bing Marketing Execs For Misuse Of Funds

    Would you believe that the company you work for might not approve of misguided spending of said company’s funds? That’s like First Job 101, right? And if anything, for god’s sakes save those receipts so you can at least account for the expenses on that company card.

    Two now-former Bing marketing executives, Eric Hadley and Sean Carver, were reminded of this policy in the most painful of ways today as they’ve been fired from the company due to the misuse of funds. According to a spokesperson with Microsoft, “We can confirm that as a result of an investigation, Eric Hadley and Sean Carver’s employment with Microsoft has been terminated for violation of company policies related to mismanagement of company assets and vendor procurement.”

    Although he was Bing’s Marketing Chief in title, Hadley was perhaps better recognized as Bing’s behind-the-scenes marketing mastermind, known for rubbing elbows with Jay-Z when Bing launched a marketing campaign to promote Hova’s autobiography, Decoded. The Daily Beast’s 2010 profile of Hadley paints him as a mold-breaker who moved effortlessly between the tech world and the limelight of pop culture, where, in addition to Jay-Z, Hadley mingled with the likes of Beyoncé, LeBron James, and Venus Williams; dabbled in network television programming through an NBC collaboration; and even produced a documentary about noted cardiologist Dr. Richard Bing (not the namesake of the search engine, by the way).

    Carver was no stranger to the bright lights of the big time, either. Having served as the Director of Brand Entertainment at Bing since 2009, Carver involved himself in several projects, including the aforementioned documentary about Dr. Bing and other projects attached to recognizable Hollywood names like Jason Sudeikis and Olivia Munn.

    Bing famously allotted Hadley an enormous budget to re-brand the company as the cooler alternative to Google, though now that the two executives have been axed, one has to wonder whether Bing (and Microsoft, really) was satisfied with the returns on its bet on Hadley and Carver. In the interim, Bing General Manager Mike Nichols will assume the pair’s responsibilities.

    According to Ad Age, neither Hadley nor Carver responded to requests for comment. Can you blame them, though? I wouldn’t want to talk to anybody either, especially if, for whatever reason, they decide to appeal Microsoft’s decision.

  • Fact: UK Internet Users Actually Pronounce “Search” As “Google”

    Experian Hitwise pointed out yesterday that UK internet users’ visits to search engines are sustaining a massive growth rate. Year-on-year, visits to search engines increased 8.7% between February 2011 and February 2012, and the trend appears set to continue next month.

    One thing I immediately noticed about the visits to search engines in the United Kingdom is that “search engine” might as well be spelled G-O-O-G-L-E. Last month, Google netted 91.57% of all searches, up from 90.64% in January 2012 and up 0.89% from the same time last year. The nearest competitor to Google is Microsoft, whose sites (like Bing) amassed a blink-and-you’ll-miss-it 3.69% of the search market. Yahoo claimed 2.58%, and after that, anyone who didn’t use those three search sites apparently went back to using two aluminum cans connected with a piece of twine.

    Comparatively, Google’s market share of online search in the United States isn’t quite that dominant for February 2012 but it’s still unquestionably commanding at 66.4%. Laying claim to two-thirds of the search market is still an amazing take but it’s nowhere near the absolute monopoly Google has on search in the UK.

    Google isn’t eating most of the search pie in the UK so much as it’s gobbling up all of it, leaving Microsoft and Yahoo to bicker over the crumbs left cooling on the placemat. Microsoft is slowly carving out a piece of the search market for itself in the United States, but it’s virtually non-existent in the UK. So why is Google so much more dominant in the UK, I wonder?

    James Murray, a marketing research analyst with Experian Hitwise, spoke with me about why Google has achieved such an irresistible dominance in the search market. Turns out it really might be as simple as pointing to Google’s unrivaled quality in producing reliable search results.

    “In the UK at least Google has held over a 90% share of the search market for at least the last five years,” Murray said. “What’s amazing is that Google managed to achieve such a dominant place in the UK search market with no discernible advertising or marketing. It achieved its status by being the best in its field, offering the most relevant information and by word of mouth.”

    Murray also notes that Google’s achievement has been so convincing that it’s changed the vernacular related to internet search.

    “Google has just become synonymous with search, so much so that we’ve created a verb around its brand name,” he added. “When you want to find information online, you Google it, that’s how powerful the brand is.”

    That much is true no matter what side of the Atlantic you find yourself on. As for why Google has been verbified to denote the act of conducting an internet search in the UK? Maybe it’s just more fun to say with a British tongue. Regardless, you’d be hard-pressed to find anyone in the UK who uses Bing or Yahoo to search the internet.

  • SEO Solution, InferClick, New From SearchDex

    In the realm of search engine optimization offerings, SearchDex, a digital marketing service and solutions provider, today announced at eTail Palm Springs the release of InferClick, a new solution to help empower online retailers with actionable insights into consumer behavioral patterns.

    InferClick, the behavioral data analytics technology from SearchDex, provides online retailers with revenue tracking by keyword, increased keyword valuation, enhanced merchandising, intelligent product recommendations, and keyword expansion based on user behavior, as well as the ability to create new pages around newly discovered keywords.

    “As online retailers strive to recognize how buyers find and interact with their ecommerce sites, e-tailers need the necessary tools to discover the relevant actions and behaviors of consumers in order to determine critical revenue drivers,” said David Chaplin, CEO at SearchDex. “InferClick is our answer to the demands of online marketers who would benefit from these capabilities, offering improved keyword targeting and additional enhancements focused on SEO program expansion through the analysis of view-stream and shopping cart data.”

  • YouTube Hires Bing General Manager To Be New VP Of Marketing

    Google has hit Bing where it hurts by hiring away its general manager of marketing.

    AdAge is reporting that Google has hired Danielle Tiedt, former general manager of marketing for Bing. She led the recent “Bing is for Doing” ad campaign and was around when the search engine first launched. What will she be doing under Google? She’s going to become the VP of marketing for YouTube.

    What will Tiedt be doing for YouTube? Google says that she will “play a pivotal role as we move forward into our next phase of growth.”

    Bing has definitely come into its own and is now the second most popular search engine on the Web. If Tiedt can bring that kind of growth to the YouTube brand, we can expect some big things from what is already the biggest video service on the Web.

    As YouTube becomes more of a content delivery service instead of just a place where I can upload videos of myself complaining about the world, the video site is going to need some heavy marketing to convince more people that it can rival television and other professional video content.

    YouTube is already heavily invested in creating new content and channels with the recent announcement of a sports channel. With this new hire, expect YouTube to grow more into a service that can challenge the big boys.

  • Former Zynga Engineer: They Don’t Care About Fun, Focus Everything On Big “Whale” Spenders

    I’ll start this out the way that it really should be started – with a warning to have your grain of salt ready. From the some-guy-is-bashing-his-former-company-on-reddit files, it appears that Zynga focuses the large majority of their attention on folks that you could say are borderline addicted to games like Farmville.

    On Sunday, reddit user mercenary-games began an AmA session (Ask Me Anything) titled “IAmA Former FullTime Zynga Engineer – quit 6 months ago.” The alleged former employee says that they worked at Zynga for 8 months and eventually quit. They also make a point to say that this isn’t about a grudge –

    No, this isn’t payback, or about a grudge. This is just word from someone who’s seen what this industry is capable of doing. Good, and Bad. No, I was not under contract, I was full time, offered stock (common shares, not that options bullcrap). I sympathized with contractors on how they were treated, most of the time. No, I was not some IT mangler. I worked for one of their “game studios”, basically the front lines where content made it to the masses.

    The user offered proof in the form of a Zynga termination certification letter, available for your perusal here.

    So there’s the evidence. Either this person is genuine or full of it – take your pick. If you choose genuine, then what he has to say about Zynga is pretty interesting.

    When asked about some “creepy” stuff that the company does, our engineer talks about information gathering and how that is used to target a specific type of player:

    Spying on players. Getting intimate gaming data, their habits, their networks, and how to effectively monetize given X.

    Here’s another example;
    Internal metrics researchers often give studio wide talks on what trends are going on. They’ve basically tracked down very popular players and also players who’ve spent an excess of 10k into the game. We often tweak our features to match and maximize for a particular gaming habit. We do this for massive populations of players. Players are not aware of this. To me, that’s a big brother like issue, someone is measuring and monitoring your behavior intimately, and you don’t know how that data is going to be used.

    They go on to call the super users ($10,000+) Zynga Black or “Whales.” They apparently make up about 5% of Zynga players.

    They are the “hardcore” crowd Zynga caters too. Every other player is treated as a spammer.

    [Zynga] designs the game purely to fit the needs of the “whales”. Everyone else is treated as an “enabler”; spam messages, beg for tools, etc

    Our former engineer elaborates on “creepy”:

    The creepy factor at Z comes when they start designing for “behavior” instead of game design and fun. Behavior is what they are looking for. Behavior is what they measure, on a massive scale. It’s not about having fun to them, its about monetizing the fun, cloning games, buying indie studios, and suing the shit out of other companies. THAT is creepy.

    On the issue of gameplay, they say that the actual gaming part of Zynga’s plan is secondary. Marketing is key:

    Another issue was skewing gameplay for the sake of profit, example; I actually resorted to BAD MATH, to make the case for making a feature more fun. At the end of one sprint, a QA dude was complaining about the drop rate of a specific item being absurdly insane, and therefore UnFun. I looked at the code, and tweaked some values, gave it back to QA guy, and fun was restored. Product Manager overrides this, goes for unfun, yet more profitable version.

    And later, “Zynga is a marketing company, not a game company.”

    The treatment of contractors was also a topic of discussion:

    Z treats them like second citizen employees, they almost have no feedback or say on their work schedule. I’ve seen people waiting to turn full time, but only spend more time as a contractor because of office politics. Worst of all; they are NOT welcome to company events, they are openly excluded from them. Yet, they want them to work twice as hard as regular full timers.

    Have you swallowed your salt? Yes? Good. Still, it sure is interesting, right?

  • Google Search Plus Your World May Draw FTC Complaint

    EPIC, the Electronic Privacy Information Center, is reportedly considering filing a complaint with the United States Federal Trade Commission over Google’s new “Search Plus Your World” features.

    In an interview with the LA Times, EPIC’s executive director Marc Rotenberg said the group may file a complaint. The group has done so in the past, with regards to Google’s inclusion of YouTube videos in search results.

    EPIC has put up the following statement on its site, citing concerns over Search Plus Your World:

    Google is changing the results displayed by its search engine to include data from its social network, such as photos or blog posts made by Google+ users, as well as the public Internet. Although data from a user’s Google+ contacts is not displayed publicly, Google’s changes make the personal data of users more accessible. Users can opt out of seeing personalized search results, but cannot opt out of having their information found through Google search. Also, Google’s changes come at a time when the company is facing increased scrutiny over whether it distorts search results by giving preference to its own content. Recently, the Senate held a hearing on Google’s use of its dominance in the search market to suppress competition, and EPIC urged the Federal Trade Commission to investigate Google’s use of Youtube search rankings to give preferential treatment to its own video content over non-Google content. Google has also acknowledged that the FTC is investigating whether Google uses its dominance in the search field to inhibit competition in other areas.

    There has been a lot of debate around the new features, which make Google+ much more a part of Google Search. You can read more about Twitter’s public opposition to the features here. In a nutshell, Twitter thinks the changes make Twitter content less accessible to users. I don’t really see how this changes things in that regard. Twitter content has been less accessible since Twitter and Google failed to renew their realtime search/Twitter firehose deal last year (which I do also see as a negative thing).

    Twitter and Facebook are both keeping Google from certain data, which Google would be able to use to improve as a search engine. Some argue, however, that Google can get enough public data from Twitter and Facebook to work into the new offerings, at least to some extent. All of this is true.

    On the one hand, Google could, for example, recommend Twitter accounts and Facebook pages for celebrities, the way it is doing with Google+ profiles. On the other hand, Google doesn’t have the data from Twitter and Facebook to deliver the kind of personalized results it can offer via Google+. It’s easier for Google to improve the user experience, at least in theory, when it can give you any data that is available (personalized data). Google+, which is really just an extension of the Google account itself, is Google’s way of trying to deliver this stuff, supplemented with other public data from places like Flickr, Quora, WordPress, etc.

    Danny Sullivan posted this video of Google Executive Chairman Eric Schmidt talking about the lack of Facebook and Twitter data:

    Part of Search Plus Your World is the addition of a special section for “People and Pages on Google+”. When I search for “music” I see profiles for Britney Spears, Mariah Carey and Busta Rhymes – Google profiles. Nothing but Google profiles in that section. However, the top organic result I get is for Yahoo Music. Not even Google Music.

    I see the new features as more of a relevancy problem than an antitrust problem. If Google is taking what it knows about me to personalize my search results, it should recognize that I use Google Music (I don’t use Yahoo Music), and that I don’t give a crap about Britney Spears, Mariah Carey or Busta Rhymes – at least not as much of a crap as it would take to deem them worthy of that kind of placement for such a broad term. In fact, I would argue that my results would be much better for the user (me) if Google actually tapped its own Google Music property to understand the music I like. I don’t need Facebook pages or Twitter accounts for Britney, Mariah or Busta either.

    Part of the reason I use multiple products from Google is because I expect there to be integration. It’s often disappointing when that integration is lacking. It makes things less usable. If I’m signed in to my Google account, I want easy access to content that’s related to my Google account. If I want things from Facebook or Twitter, I know where to look.

    If you are signed into Google, you are signed into your Google account. You are signed into Google+. When you’re not signed in, well, that’s a different ballgame. One thing that is a bit iffy here is that Google said in its announcement that Search Plus Your World would be for users who are signed in. The personal stuff is, but the People and Pages section that highlights Google+ accounts still appears when the user is signed out.

    That could be an issue.

  • Unruly (Of Google’s Chrome Marketing Debacle Fame) Raises $25 Million

    Unruly is the marketing firm behind Google’s Chrome paid link fiasco – the campaign Googlers are saying they’re embarrassed by.

    Still, Unruly announced today that it has just raised $25 million in funding in a Series A investment from Amadeus Capital Partners, Van den Ende & Deitmers and The British Growth Fund.

    Unruly says the round is the largest ever for a private company in the social video sector.

    The company was founded in 2006. It has offices in San Francisco, London, Berlin, Paris, Stockholm, Amsterdam and Sydney. It claims to have delivered, tracked and audited 1.34 billion user-initiated video views and executed 1,400+ successful social video campaigns for brands like Evian (Roller Babies), T-Mobile (Life’s for sharing) and Old Spice (The Man Your Man Could Smell Like).

    They’ve also done campaigns for EA, Adidas, Unilever, and of course Google.

    “Today represents an important milestone for the company and social video as a whole,” said Unruly founder and CEO Scott Button. “Five years ago, we set out to help brands capture the massive opportunity in social video and we’re delighted that such a distinguished group of investors share our conviction.”

    Button’s response to the whole Google ordeal was (via Peter Kafka):

    …we don’t ask bloggers to link to the advertiser’s site. It’s just not part of our business model. We help advertisers distribute video content and that’s what we get paid for. All links from the video player itself are wrapped in Javascript, so although Google can follow them, they don’t influence search engine rankings. Even though we don’t ask bloggers to link, we do advise them to use nofollow if they do link to the advertiser’s site. This is really important and they should do it to protect themselves as much as the advertiser.

    As far as I’m aware, there was one link in one post that was not marked nofollow. This was corrected as soon as we became aware of it.

    We’re always completely upfront and transparent with bloggers that we are running commercial campaigns and who we’re working for. We always require that bloggers disclose any commercial incentive to post video content. We always require that bloggers disclose even on related tweets that they might do off their own bats.

    It’s also a key part of how we operate that we don’t tell bloggers what or how to write. It’s really important that opinions expressed and the tone of voice belong to the author not the advertiser. Occasionally that leads to human error, as here, so we’re always really happy to have these kinds of example flagged and will sort them out as quickly as we possibly can.

    Shortly thereafter, several Google employees started discussing the whole thing. Jason Morrison, for example, responded to a post from Google’s Matt Cutts about the situation, by saying:

    This is embarrassing, but a good illustration of two things:

    1. Why I like working at Google. The Search Quality Team tries to apply the Webmaster Guidelines fairly – even on other Google products.

    2. Why you should pay attention to what any marketing, advertising, or SEO companies might be doing on your behalf.

    Either way, I’m sure some of Unruly’s campaigns, such as Old Spice and Evian, will be remembered long after the Google ordeal is forgotten.

    “Unruly’s proprietary technology platform and aggressive global growth strategy in a fast-growing market is really impressive,” said Richard Anton, Partner at Amadeus Capital Partners. “We are delighted to be supporting the company build on its success, bringing our experience of building a number of international marketing and advertising technology companies, including Celltick, ComQi and EPiServer.”

    “With global online ad spend set to reach $110 billion by 2014 and online video ad spend predicted to be the fastest growing category, we believe Unruly is strongly positioned to be the winner in the global social video market,” said Martijn Hamann, Partner of Van den Ende & Deitmers.

    “In a short space of time, Unruly has played a major role in the explosive growth of social video and this investment gives it additional firepower,” said Marion Bernard, Regional Director of BGF. “We look forward to working with the company and our co-investors to take advantage of the very significant global expansion opportunities. BGF is working in partnership with other investors to expand the pool of investment capital to growing and ambitious UK companies as a key part of developing the entrepreneurial economy”

    Torch Partners and Orrick, Herrington & Sutcliffe LLP advised on the funding.

  • Blekko Gets Search Index Update, Redesign

    Blekko announced an update to its search index, a site redesign, and the addition of automated slashtags for over 500 categories.

    “That means users will get the benefit of curated search results (i.e. no spam!) for over 500 search categories, regardless of whether or not they manually append a slashtag to their query,” Blekko says in a blog post.

    “As we head into 2012, our mission at blekko is more important than ever,” Blekko says. “If you think about it, prior to the internet the newspaper was the offline equivalent of the search engine. It was the starting spot where people went to get their news, job listings, personals, movies listings, classifieds, etc. Competing editorial voices were a hallmark of competition in the newspaper business and readers benefited because they had choice.”

    They point to searches for “how to clean gutters” and “good table manners” to show off their new indexing.

    Then, they suggest trying out their 3 Engine Monte tool, which lets you compare results with Bing and Google.

    Blekko says its index is source-based rather than link-based, and that it is purposefully biased away from sites with low quality content. “We purposefully bias our index away from sites with low quality content. Regardless of how many people link to healthspamsite.com, we believe sites like the Mayo Clinic, NIH.gov, etc. are better sites. On blekko, brands trump links,” Blekko says.

    Blekko secured a $30 million round of funding in September, which CEO Rich Skrenta told WebProNews would be used to expand and grow the service through hiring, infrastructure, marketing, etc.

  • SEMPO: FTC Shouldn’t Regulate Google and Other Search Engines

    As companies continue to call for regulation of Google, search industry organization SEMPO has come to the search engine’s defense. The Search Engine Marketing Professional Organization recently sent a letter to FTC Chairman Jon Leibowitz in an effort to explain why regulation is not a good idea.

    According to SEMPO President Chris Boggs, the organization, which is made up of thousands of marketing professionals across 50 countries, wrote the letter in response to its members’ concerns. SEMPO felt it needed to voice these concerns and explain why its members want an “open, free channel of communication.”

    Are you concerned that the government will change the Internet as we know it? Let us know your thoughts.

    “Overall, the reason that we felt we needed to send this letter was because we were concerned specifically that the U.S. government is investigating the search operations of Google,” said Boggs.

    “It’s not just because we felt we needed to go thank our sponsor, Google,” he added.

    The letter calls for a free market approach to the Internet with little or no regulation. It focuses on Google, however, in light of the anticompetitive claims that ShopCity and others have made lately.

    Although Boggs does not discredit ShopCity’s claims, he pointed out that all businesses are subject to the same rules for both organic search and paid search. In addition, he believes the accusations would have a greater impact if the Google naysayers outnumbered the Google advocates. However, judging from SEMPO’s members, they do not.

    “I would want to see a much larger sample of companies saying that it’s unfair, versus the large sample that I’ve seen that have spoken to us and messaged to the board of SEMPO their pro-support of Google and the way they do handle business,” said Boggs.

    The letter makes clear that search engines were not intended to be regulated or subject to control from governments or commercial entities, saying:

    Search is not a government-run utility, established by law and thus subject to bureaucratic oversight. It is a service provided to consumers and businesses by companies, which have set up their operations using their own principles, proprietary technologies and algorithms. Each company is free to develop its own approach, fulfilling the needs of its customers as it perceives them.

    It makes the case that a free market methodology is what made the Internet what it is today. As Boggs explained, this freedom has allowed the Internet to grow and produce platforms such as social networks.

    “If we hadn’t allowed the growth of Facebook and Twitter, and even some of its forebears like MySpace, we would be nowhere close to where we are now in terms of the ability to communicate and reconnect with old friends on the Internet, for example, and also to perform business,” he said.

    In order to further this free market innovation, the SEMPO letter stated that the following 4 requirements were needed:

    1. Willingness of legislative and regulatory government entities worldwide to allow the evolution of the Internet in as unfettered a regulatory environment as possible

    2. Willingness of publishers and information owners to explore ways of sharing their valuable information with the search engines while not jeopardizing their revenue models

    3. Consumers feeling a level of trust with search engines sufficient to allow the search engines to personalize results for them, maintaining privacy settings at a level comfortable to them

    4. Understanding by marketers and advertisers that the search engines’ most valuable asset is the user, and therefore the search engines will often place the consumer experience above short-term financial gain

    Although SEMPO has not received a response from the FTC, Boggs said that the organization was willing to work with it to help it further understand how the search industry works.

    In terms of regulation, Boggs told us that he didn’t see any coming in 2012 but that he could see it happening in 2015 or 2016. If it does happen, Boggs said he hopes that it protects the innocent from potentially harmful content online instead of preventing a free Internet.

  • Bing’s New Holiday Marketing Campaign, Now With More Rudolph

    Microsoft recently announced a new holiday marketing campaign for its Bing search engine.

    The new campaign finds Rudolph the Red-Nosed Reindeer, and his friends, taking center stage in four separate advertisements for Bing. Each advertisement was done in the same stop-motion puppet animation that was used in the original 1964 TV special, so it will have a nostalgic feel for most.

    Check out the advertisements below. Each video is accompanied by its official video description:

    “Bumble”, the more beloved name for the Abominable Snow Monster, features the lovable Bumble, who has lost his scary roar. He uses Bing to search for “scary monster” and once inspired by a few Bing Videos, perfects his roar to be appropriately scary again.

    Santa and Mrs. Claus get a little crazy in the workshop with bubble wrap, search on Bing Video for bubble wrap how-to’s, and hilarious antics ensue.

    Hermey, Yukon and Bumble are tired of the winter weather and are in dire need of a vacation. Bing Travel saves the day via a “fabulous island resort” search, highlighting Hermey’s friends who have “liked” specific resorts, leading to Bing Travel and the perfect destination.

    Yukon Cornelius becomes exhausted from pulling Hermey and Rudolph on his sled and collapses in the snow. Luckily, Bing helps him find the closest hot yoga studio using Bing Local.

    The holiday season is an opportune time for search engines, as this is when people do more searches – to find gifts, sales ads, party supplies, airfare, and literally anything else. With more people online, this means more ads that can be targeted and displayed by the search engine.

    What did you think of the Bing holiday centric advertisements? Tell us your thoughts in the comments.

  • Keep Search In Mind While Being Social

    As the good content theme continues unabated at BlogWorld, in one of today’s sessions, Duane Forrester, Senior Product Manager at Bing, urges attendees to keep the search industry in mind when they’re out there promoting via Facebook and Twitter.

    Your audience may forget what you posted yesterday, but search engines don’t.

    While Forrester also discusses ways to be an authority with social media and what to consider when optimizing for search, the relationship between search and social media was his main thrust. Because of the immediacy of today’s communication methods, as well as the speed at which these social media posts are indexed into search engines, timeliness in your work matters.

    Trying to catch the tail of something that was popular/trending two days ago means you’re late, and the impact of your quality content could be missed. Beyond that, however, Forrester provides some information on how social media influences the search industry. He mentions the fact that Facebook content is showing up in search engines, well, Bing, thanks to the Facebook/Bing partnership. To further his point, Forrester mentions that people delay making purchasing decisions until someone from their social media circle — friends, family — offers their opinion.

    Thanks to the search industry’s embrace of social media, a potential customer no longer has to wait on a Twitter or Facebook update; if they find a social media review while searching for the product they’re interested in, you’ve gained leverage. Granted, if the person searching doesn’t trust the review they find, then your product will probably miss out. That being said, it’s important to get that social content out, and to keep the search engines in mind when doing so.

    From Forrester’s perspective, social media in search engine results helps the trust quotient because of the personal nature of social media. Essentially, if you have people bragging about your product or the content surrounding it on Facebook and these posts show up in Bing’s results, there’s a good chance people will be more apt to trust your brand/product/content.

    Another point Forrester mentions is also important, although you might think it’s common knowledge. But then again, if he’s talking about it to conference attendees, maybe it’s not: Make sure you add a link when you tweet and/or post something useful. If people can’t navigate to the item/content they are enjoying, you have failed to convert. Miserably.

    From here, Forrester discussed the sheer amount of content available from the social industry, and seeing it on “paper” is truly staggering. According to his research:

    • 25 billion tweets sent on Twitter in 2010
    • 35 hours of video uploaded to YouTube every minute
    • 3,000 photos uploaded to Flickr every hour
    • 30 billion posts shared on Facebook every month

    Yeah, that’s a lot of content, which plays into the previous session’s discussion of how hard it is for cream to rise. The problem is, there’s a lot of cream out there. With such an obscene amount of content from the social industry, leveraging it through search is important; otherwise, many of these gems would be lost forever.

    As you can see, search is still a very important aspect of the overall marketing approach, even when social media is your target. Don’t forget about the people actively looking for products related to your focus.

  • SEOs Not Buying Google’s Privacy Motive for Encrypting Search

    Google caused quite a ruckus in the search marketing community after it announced some changes to search. Last week, the search giant said that it would begin encrypting searches by default for users who are signed in to Google.com. This further integration of Secure Sockets Layer (SSL) will prevent search marketers from receiving keyword referral data when consumers click through to their websites from Google search results.
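
    For readers wondering what is actually lost: before the change, the search keyword typically travelled to the destination site in the HTTP referrer, and analytics tools simply parsed it out of the URL. Below is a minimal, hypothetical Python sketch of that lookup (the function name and example URLs are illustrative, not Google’s or any vendor’s actual code); once the search is served over SSL to a signed-in user, the query string never arrives and the lookup comes back empty.

    ```python
    # Hypothetical illustration, not Google's implementation: how analytics code
    # has commonly pulled the search keyword out of a Google referrer URL.
    # With SSL search, the query string is stripped from the referrer, so the
    # same lookup returns nothing.
    from urllib.parse import urlparse, parse_qs

    def keyword_from_referrer(referrer):
        """Return the 'q' search term from a Google referrer, if present."""
        parsed = urlparse(referrer)
        if "google." not in parsed.netloc:
            return None
        return parse_qs(parsed.query).get("q", [None])[0]

    # Pre-change referrer: the keyword is visible to the destination site.
    print(keyword_from_referrer("http://www.google.com/search?q=blue+widgets"))  # 'blue widgets'

    # Post-change (signed-in, SSL) referrer: no query string, no keyword.
    print(keyword_from_referrer("https://www.google.com/"))  # None
    ```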

    What do you think of Google’s move to encrypt searches? We’d love to know.

    While this change is only supposed to affect a single digit percentage of referral data, many SEOs are not happy with the move and believe that Google has gone too far. Eric Enge, the Founder and President of Stone Temple Consulting, told us that he was completely “baffled” when he saw the news. Rebecca Lieb, the Digital Advertising and Media Analyst at the Altimeter Group, was also surprised by the move and called it “evil.”

    “I hate to say this about Google because they’re a company that I admire and like and respect, but I think this is evil,” she said.

    “Google is taking something away that is a very, very valuable tool for anybody practicing SEO,” Lieb added.

    Amanda Watlington, the Owner of Searching for Profit, also shared with us that she would not be able to give her clients as much value as she has in the past.

    “I have learned more from the referral data that comes into the [site] that lets me benefit the user – I won’t have that data to mine,” she said. “Personally, it will make it harder for me to (a) understand what the performance of my pages are and (b) to learn from my pages.”

    Google has said that it did this in order to make search more secure, but the SEO community doesn’t agree. Enge told us that he didn’t recall any outcry from privacy organizations in regards to search term data and, therefore, is not convinced that security was Google’s real motive. If this were the case, he thinks that Bing and Yahoo would have had to make changes as well.

    Others, including Amanda Watlington, think that Google did this for financial purposes. She told us that it was “all about the Benjamins.” Matt Van Wagner of Find Me Faster also said that he could see the search giant thinking this move would make its search engine look more attractive to shareholders since it could potentially push more people to use paid search – its primary revenue model.

    Lieb takes a slightly different approach and said that Google could have done this to appease regulators. What’s bad though, as she points out, is that most regulators don’t understand referral data and other aspects of Internet marketing.

    “I think Google may (It’s a theory – I can’t prove it) be throwing a bone to somebody on Capitol Hill with this move,” she said.

    Is Google making moves to try to improve its reputation with regulators? What do you think?

    Todd Friesen, the Director of SEO at Performics, agrees that Google made this move as part of a greater effort. He told us that Google frequently makes small moves and waits to see how everyone reacts before it pushes out its bigger plan.

    “Google doesn’t do anything on a whim,” he said. “They’re definitely thinking 5 and 10 years out.”

    “There’s definitely a bigger plan behind it, and it’s probably big and scary with teeth and claws,” he added.

    A big part of the reason why SEOs aren’t buying into the privacy theory is that the changes do not impact advertisers. This is ironic since consumers don’t typically complain about organic search data, but they are usually concerned about targeted advertising. It seems as though Google is saying that consumer information is important for advertisers to make money, but it turns into a consumer privacy issue when it relates to organic search results.

    “The fact that they’re keeping all this referrer data alive for advertisers is strongly, if not irrefutably, indicative that the money is not where the mouth is,” said Lieb.

    Friesen also said that it’s a “hypocritical standpoint” on Google’s part. If the motive is really about privacy, he doesn’t think that Google should be passing referrer data to advertisers, or anyone else for that matter.

    Another point that Lieb raised was that paid search could eventually take a hit from this move. If small businesses that are investing in organic search through Google are not able to get the data they need, she doesn’t think that they would want to pursue a paid search campaign with it either.

    “It’s certainly something that would make me, as an advertiser, almost inclined to go to Bing or Yahoo just because… just because this isn’t right,” added Lieb.

    Google maintains that this change is very small and that it will only impact a small percentage of searches. Matt Cutts also pushed this message on Twitter:

    @Sam_Robson I believe it will affect things based on the referrer, but only for a small percentage of searches (signed in on .com).

    @Rhea And we’ll be rolling out slowly(weeks). We ran some tests before launch, and I don’t think anyone even noticed the change. @blafrance

    The SEOs, however, are not convinced. There are so many unanswered questions that this move raises that one can’t help but wonder about the future of SEO. Watlington, for instance, told us that she could see Google monetizing the data going forward and that this move is the first step.

    “To me, the move to give it to an advertiser is a monetization of the data,” she said. “What additional monetization will be, I’m waiting to see.”

    Van Wagner told us that, since he primarily does paid search, he is glad that Google didn’t include advertisers at this point. But, this move could result in more competition in paid search, which is not something he is in favor of either.

    The biggest concern is the fact that no one knows what is next. Lieb told us that if Google does decide to roll this out further, SEO could really be in danger.

    “People have a right to be upset about this because, even if it’s only 10 percent now, or only 15 percent now, it could get more dire,” she said.

    Watlington believes that search marketers may have to rethink what they do moving forward. She even said that they might have to “look away from search” and focus more on traditional marketing. At this point, Google is the primary search player and everything it does directly impacts search marketers, which, according to Watlington, does not indicate a promising future for search marketing.

    “We have one very large player, a monopolistically-sized player… holding enough of the cards,” she said. “That’s not exactly what I call a real long-term strategy because whatever that player does, it impacts us.”

    Friesen, on the other hand, doesn’t really think that this impacts what SEOs do. He thinks that the process of how they track and report on it changes but said that the job of an SEO doesn’t actually change.

    “What, unfortunately, it does is drives us back to rank checking as a more important metric,” he explained.

    He does admit that the SEO industry could be more heavily impacted if Google makes a further move in this area.

    “At this point, it’s less than 5 percent… but if it starts to climb, then we get into a reporting issue,” said Friesen. “We get back to the ‘SEO is black magic voodoo stuff.’”

    Incidentally, a petition called Keyword Transparency has been created in hopes of getting Google to reverse this action. The “About” section on the site says:

    This petition has been created to show Google the level of dissatisfaction over their recent changes to keyword referral information, and will be presented to the search quality and analytics teams at Google.

    The argument that this has been done for privacy reasons sadly holds little weight, and the move essentially turns the clock back in terms of data transparency.

    The argument that this only affects <10% of users is also concerning as this is likely to increase over time, even up to a point where it affects the majority of users being referred from search.

    At this point, there are over 1,000 signatures on the petition.

    Is Google’s move to encrypt searches just the first of many? And if so, is the future of SEO in question? Let us know your thoughts in the comments.

  • Meineke Gets Effective Twitter Marketing

    Currently, there’s a silly trend populating Twitter feeds everywhere, one that uses the #ItsOkayToCheatif hashtag. The tweets responding to the trend are primarily from bored kids who are trying to sound deep and compelling, or maybe even humorous. Ultimately, it’s a throwaway trend that will fall by the wayside in an hour or so.

    But one car care company in particular provided us with a valuable lesson on how to capitalize on even the most innocuous Twitter trend in order to promote its business. The company in question is Meineke, and instead of adding yet another throwaway tweet about when it’s acceptable to cheat, its social media manager demonstrated a nifty way to use these silly trends to your business’ advantage.

    Take a look:

    #itsokaytocheatif on changing your oil if you want your engine to get destroyed. Bring this coupon and keep it honest http://t.co/kjRZXesU

    When the link is clicked, visitors are taken to a Meineke coupon page that features a Twitter special coupon.

    In fact, the URL of the linked page is revealing in and of itself:

    http://www.meineke.com/twitter/

    And that, folks, is the correct way to capitalize on a Twitter trend, regardless of how silly it may or may not be. Granted, something like this would obviously not be appropriate if it was done in an insensitive manner, like, say, during the outpouring of tweets concerning the Japanese earthquakes or the tornadoes that trashed various cities around the United States earlier this year.

    However, if it’s a throwaway trend we’re talking about, something the #ItsOkayToCheatif trend clearly is, then tailoring a promotional tweet to fit such a trend is a great way to take advantage of all that Twitter noise. Now, can a coupon page actually be considered a signal to be acknowledged over the normal Twitter noise? Perhaps not, but it’s still an effective way to leverage something all the Twitter users are discussing.

    Hey, even bored kids with not much to do need oil changes too, or at least, their friends who have cars do.

  • A Customer-centric Content Marketing Approach

    The pressure of competition and desire for business growth pushes marketers towards tactics that promise quick wins. Pundits advocate strategy first (been there) but doing so in a comprehensive way isn’t always practical, especially when it comes to areas like social media and content marketing.

    For marketers in need of practical advice on customer-centric content marketing, a solid framework can be invaluable for an adaptive approach that is thoughtful about overall direction and measurable short-term impact at the same time.

    An increasing number of Search Engine Marketers are advocating both Content Marketing and Social Media in concert with SEO objectives, which is a great sign, but the approach often lacks a customer-centric focus.

    Here’s a Content Marketing framework that is customer-centric as well as SEO and Social Media savvy, one that I think any smart online marketer can follow. Keep in mind that, with a holistic approach, this four-part framework can be applied to any type of online content that a company produces: HR, Customer Service, Public Relations, etc.

    I talked about this approach at Content Marketing World recently and will be elaborating on it at several future events as well. Of course I drill down even deeper in “Optimize”. But since that book won’t be out until the first part of next year, here is a bit of an elaboration.

    Customers – Optimize for keywords or optimize for customers? It may be semantics and it’s certainly not a mutually exclusive situation with customer segments and individual search keywords. Many online marketers focus on keywords that are popular and relevant to products and services without ever considering things like customer pain points, behaviors and position within the buying cycle and how that manifests as a search query.

    Content Marketers organize their campaigns according to customer needs and how to influence those customers to buy. Add keyword optimization (SEO) to that mix and you have a very powerful combination.

    • Identify customer segments – What do they care about? What is their context?
    • Document pain points & information needs during buying cycle.
    • Build a path of content including triggers that inspire purchase and social sharing.

    Keywords – As you understand the language of your customer, the opportunity to optimize content for search “findability” becomes very important. What better place to connect with customers than at the moment they proactively seek a solution? Build relevant keywords according to customer interests into a content creation plan with key messages, and you’ll be one step closer to “relevant ubiquity”.

    Besides search keywords, it’s worth considering social topics. The interplay between searching and social referrals is becoming more standard as buyers navigate information resources online.

    • Brainstorm and research keywords with tools like Google AdWords Keyword Tool, Wordtracker and Ubersuggest.
    • Tap into social media monitoring tools to gauge what topics cluster together on social networks, blogs and Twitter, relevant to your search keywords.
    • Organize search keywords and social topics into a keyword glossary shared with anyone in your company who creates online content (one possible structure is sketched below).
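
    Here is one hypothetical way to structure that glossary so writers, bloggers and the social team all pull from the same list; the segments and terms below are invented for illustration, not recommendations for any particular business.

    ```python
    # Invented example of a shared keyword glossary, organized by customer
    # segment and pairing search keywords with related social topics.
    import csv

    keyword_glossary = {
        "first-time buyers": {
            "search_keywords": ["how to choose running shoes", "running shoes for beginners"],
            "social_topics": ["couch to 5k", "beginner running tips"],
        },
        "marathon trainers": {
            "search_keywords": ["best marathon racing shoes", "lightweight trainers review"],
            "social_topics": ["race day gear", "long run recovery"],
        },
    }

    # Export a flat CSV so anyone creating content works from the same list.
    with open("keyword_glossary.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["segment", "type", "term"])
        for segment, terms in keyword_glossary.items():
            for kw in terms["search_keywords"]:
                writer.writerow([segment, "search keyword", kw])
            for topic in terms["social_topics"]:
                writer.writerow([segment, "social topic", topic])
    ```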

    Content – “Content is King and Creativity is Queen,” according to Pam Didner of Intel. I happen to agree. Content Marketing is growing and soon “everybody will be doing it” but certainly not doing it well. Through a combination of keen customer insight, analytics and smart creativity, online marketers can stand out amongst the 27 million pieces of content shared in the U.S. each day or the 5 Exabytes of information created every 2 days around the world.

    Keywords and topics can fuel a Content Plan that provides a calendar of planned content publishing, topics, optimization focus, promotion channels and planned repurposing. Allow for wildcards and spontaneous content creation according to real-time opportunities and current events.

    • Plan content according to customer segments, keyword topics and business services/product offering.
    • Leverage search keywords for content optimization on the website, blog and on social media sites.
    • Create modular content that can serve its purpose individually, as part of a matrix of topics and as repurposed content in the future.

    Optimize & Socialize – Armed with customer insight, a keyword glossary and a content plan, it’s time for those Social SEO smarts to see some action.  With content staff and social media teams trained on SEO best practices, new content will be easier for prospects and customers to find – when it matters. They’re looking for it!   Monitoring search analytics for refinement of on-page optimization helps keep your investment in optimized search and social content high impact and current.

    In today’s online marketing world, there is no “Optimize” without a smart dose of “Socialize”.  Social network development and content promotion is essential to inspire sharing, traffic and links. Social links and web page links to your content provide a powerful combination for search engines to use when finding and ranking helpful information that leads your customers to buy and share.

    • Train copywriting and social media staff on keyword glossaries and SEO best practices. Keep social topics up to date!
    • Optimize web and social content on and off the corporate websites while engaging and growing social networks.
    • Create, optimize and share useful content that will inspire customers to buy and share with their social friends.

    The particular strategy, goals and methods of measurement will vary according to your situation of course, but as I mentioned above, this framework is applicable to any area of online content that a company might be publishing: Marketing, Sales, Customer Service, Human Resources, Public and Media Relations.

    Have you seen examples of companies doing a great job of going from basic SEO to more robust content marketing optimization? Have you implemented or observed some great examples of “optimize and socialize”?

    Check out Top Rank Blog for more articles by Lee Odden

  • Adobe Adds Keyword Performance Predictive Analytics to SearchCenter+

    Adobe has partnered with OptiMine Software to add keyword performance predictive analytics to its search marketing tool, SearchCenter+. The new feature is designed to help search marketers analyze unique characteristics of individual keywords and increase ROI.

    Tim Waddell, director of product marketing for Adobe’s Omniture Business Unit, tells WebProNews, “The relationship is built on Adobe SearchCenter+’s alignment of engine, traffic and conversion data with OptiMine’s powerful keyword level algorithm which provides customers with new options for automated bid management.”

    “Adobe will pass 12 months of keyword level data along with an ongoing daily feed to provide the algorithm with historical performance data to determine seasonality or other trends combined with the recent performance,” he adds. “We then define at a keyword level which variables are most important and match one of twenty four (24) different statistical models to define the optimal bid for the next day… Thus predicting optimal paid search performance.”
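
    To make the general shape of that concrete, here is a deliberately naive, hypothetical Python sketch of “use daily keyword-level history to suggest tomorrow’s bid.” It is not Adobe’s or OptiMine’s algorithm, and every name and number in it is invented for illustration; the real product matches each keyword to one of 24 statistical models rather than the single rule of thumb shown here.

    ```python
    # Toy sketch only (not Adobe/OptiMine code): suggest a next-day bid for one
    # keyword from its recent daily performance feed.
    def suggest_next_bid(daily_history, target_roas=4.0, max_bid=5.00):
        """daily_history: list of dicts with 'clicks' and 'revenue' for each day."""
        recent = daily_history[-30:]  # crude stand-in for trend/seasonality handling
        total_clicks = sum(day["clicks"] for day in recent)
        if total_clicks == 0:
            return 0.10  # minimal exploratory bid for a keyword with no data yet
        revenue_per_click = sum(day["revenue"] for day in recent) / total_clicks
        # Bid up to the point where spend stays within the target return on ad spend.
        return round(min(revenue_per_click / target_roas, max_bid), 2)

    history = [{"clicks": 120, "revenue": 540.0}] * 30  # invented 30-day feed
    print(suggest_next_bid(history))  # about 1.12
    ```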

    When asked what sets this apart from other similar tools, Waddell says, “The native data captured by Adobe SearchCenter+ aligned with OptiMines keyword level algorithm is a powerful combination. This option also gives our customers choices in how they want to manage their campaigns – we think the ability to manage high-touch head terms with our existing SearchCenter+ bid solution, which gives marketers total control and flexibility to bring in the human element along with the powerful automation OptiMine delivers is a great combination.”

    “Customers can choose to manage their keywords any way that suits their needs,” he adds.

    As far as cost to the user, Adobe SearchCenter+ is sold on a percentage of ad spend, and Waddell says the new functionality will follow the same model.

  • ISPs Hijack Users’ Searches, Apparently to Monetize Them

    Internet Service Providers are hijacking their users’ search queries on major search engines like Google, Bing, and Yahoo, and directing them to third-party proxies.

    This news was revealed in an article by Jim Giles at New Scientist, who explains: “The hijacking seems to target searches for certain well-known brand names only. Users entering the term “apple” into their browser’s search bar, for example, would normally get a page of results from their search engine of choice. The ISPs involved in the scheme intercept such requests before they reach a search engine, however. They pass the search to an online marketing company, which directs the user straight to Apple’s online retail website.”

    He says patents filed by Paxfire, a company involved in the hijacking, indicate the whole thing might be part of “a larger plan to allow ISPs to generate revenue by tracking the sites their customers visit,” and that “it may also be illegal.”

    A class action suit has already been filed by New York law firms Reese Richman and Milberg.

    ICSI researchers Christian Kreibich, Nicholas Weaver and Vern Paxson, along with Peter Eckersley, posted on the Electronic Frontier Foundation’s site:

    In short, the purpose appears to be monetization of users’ searches. ICSI Networking’s investigation has revealed that Paxfire’s HTTP proxies selectively siphon search requests out of the proxied traffic flows and redirect them through one or more affiliate marketing programs, presumably resulting in commission payments to Paxfire and the ISPs involved. The affiliate programs involved include Commission Junction, the Google Affiliate Network, LinkShare, and Ask.com. When looking up brand names such as “apple”, “dell”, “groupon”, and “wsj”, the affiliate programs direct the queries to the corresponding brands’ websites or to search assistance pages instead of providing the intended search engine results page.

    The ISPs that are redirecting search queries, according to New Scientist, are: Cavalier, Cincinnati Bell, Cogent, Frontier, Hughes, IBBS, Insight Broadband, Megapath, Paetec, RCN, Wide Open West, and XO Communications. Charter and Iowa Telecom, the publication says, were also doing it, but have stopped.

    On Google+, Google’s Matt Cutts wrote, “More than ten U.S. Internet Service Providers (ISPs) have apparently been caught hijacking search sessions. Crazy….To protect yourself against this, you can search Google via SSL search at https://encrypted.google.com . It might also help to change your DNS provider. Google has a Public DNS service:http://code.google.com/speed/public-dns/ and OpenDNS has one too.”
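
    If you are curious whether your own connection is affected, a rough, unofficial diagnostic along the lines of what the researchers describe is to issue a plain-HTTP brand-name search and see whether the redirect chain ever leaves the search engine’s own domain. The sketch below is an assumption-laden example (the choice of Bing and the brand terms are arbitrary), not a tool endorsed by any of the parties involved.

    ```python
    # Hypothetical diagnostic sketch: does a plain-HTTP brand search ever get
    # bounced off the search engine's own domain on its way back to us?
    from urllib.parse import urlparse
    import requests

    def search_is_redirected(term):
        resp = requests.get(
            "http://www.bing.com/search",
            params={"q": term},
            allow_redirects=True,
            timeout=10,
        )
        hops = [r.url for r in resp.history] + [resp.url]
        # Any hop that lands off bing.com suggests something between you and the
        # search engine rewrote the request.
        return any("bing.com" not in urlparse(url).netloc for url in hops)

    for brand in ("apple", "dell", "wsj"):
        print(brand, "redirected" if search_is_redirected(brand) else "looks normal")
    ```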

    Wow. 10+ ISPs have been proxying search sessions, and sometimes hijacking them, possibly for profit: http://t.co/cYh6gc4

    SEOs sure have their work cut out for them these days.

  • FairSearch Accuses Google of “Grossly” Exaggerating Contribution to Economy

    The FairSearch Coalition released a report today, which it calls “an independent critical analysis of reports released by Google about its purported 2009 and 2010 economic impact in the U.S.”

    Economist Allen Rosenfeld was commissioned by FairSearch to review Google’s reports, and compare them against economic literature and standard practices, the organization tells WebProNews, adding that he retained full editorial control over his conclusions, findings, and opinions expressed in the report.

    “Rosenfeld concluded that Google overestimated its potential economic impact by more than 100x, and did not even take into account the negative economic effects (higher advertising costs that are passed on to consumers) of one company dominating a market to the extent that Google does (more than 70% of search in U.S., a higher % of search ad revenue; and more than 90% of search in most of Europe),” a representative for FairSearch said. “In Rosenfeld’s opinion, if that impact were to be taken into account, Google’s net economic impact could be negative.”



    “As the FTC and other antitrust enforcers scrutinize Google’s business, and whether it abuses its dominance and how that harms consumers, innovation and competition, it’s important to look closely at any claims by Google that it is after anything other than inflating its own profits at everyone else’s expense,” they added.



    Robert Birge, Chief Marketing Officer of KAYAK (a founding member of FairSearch.org), offered the following statement: “The Google tax isn’t obvious, but it’s very real. As Google has become the starting place for most people on the Internet, companies large and small have to pay a toll in order to just show up on the consideration list. These costs are real, they are substantial, and they get passed along to the end user even if they are hidden.”

    Under the “main findings and conclusions” section of the report, Rosenfeld writes:

  • Google’s claims about its contribution to the U.S. economy are grossly exaggerated; can deceive policy makers, news media, and the public; and should not be trusted.
  • Google’s overestimate was at least 100 times the value of the actual contribution of its search engine. Google takes credit for economic activity that is mostly generated by other economic agents. In reality, the contribution of Google’s search engine to the economy is very small, amounting to at most only 1% of the overestimated economic contribution claimed by Google in its reports.
  • Google's net impact on the economy could well be negative after accounting for the impacts of its dominance and market power. Google has consistently generated net (profit) margins that are between 4 and 8 times the U.S. corporate average, indicating that advertisers' costs are likely higher than they would be in a competitive market environment.
  • Google’s misleading claims were largely the result of fatally flawed, inaccurate assumptions. Google’s analysis contradicted economic logic, did not take into account obvious costs of doing business, ignored the results of previous empirical economic studies, and failed to consider negative economic impacts of the company’s market dominance.

    The full report by Rosenfeld can be found here (pdf). We've reached out to Google for comment on this, and we'll update accordingly.

    Earlier this month, FairSearch launched a site called “Searchville,” dedicated to blasting Google’s business practices.

  • New .XXX Porn Sites To Get Their Own Search Engine

    A few months ago, before the Internet Corporation for Assigned Names and Numbers (ICANN) voted to allow people to register any domain they like, they made news by approving the .XXX top-level domain.

    In a split vote, ICANN gave porn sites a new home on the web. .XXX fought a long, hard battle to become a reality: it was first proposed in 2000, resubmitted in 2004, approved and then rejected in 2006, rejected again in 2007, and finally approved in 2010. With each proposal, .XXX was met with opposition from conservative groups like the Family Research Council. Initially, the porn industry opposed the new domain as well, saying that it would put the industry at risk of regulation and censorship.

    The initial proposition for the .XXX domain was made by ICM Registry, a registry operator sponsored by the International Foundation for Online Responsibility. When the .XXX domain was finally approved this year, ICM won the opportunity to manage it.

    The new domains will be available starting on September 7th and will cost roughly $75 a year, a rate much higher than most .com domains. It has been reported, however, that during the initial registration period .XXX domains will cost much more, possibly as much as $650.

    Now, according to the Register, .XXX sites will be getting their own search engine.

    The Register talked to ICM Registry President Stuart Lawley, who confirmed that the new destination, search.XXX, will index all of the porn found at .XXX addresses. Initially, the search engine will draw from about a dozen "premium" .XXX sites, and it will be supported by ads and sponsorship, according to Lawley.

    Of course, major search engines like Google and Bing will also index the new .xxx sites, but history has shown us that porn and Google aren't always a match made in heaven.

    Here’s what ICM has to say about the benefits of the .XXX domains –

    As a trusted brand, customer confidence will be very high resulting in more traffic, greater repeat traffic, and, perhaps most importantly, greater conversion into paying customers. Holders of .XXX domains will also benefit from global marketing campaigns and greater awareness in the mainstream world. Additionally ICM is developing a traffic generation search portal for .XXX sites that will be promoted internationally leading to immediate new traffic to .XXX domains for new registrants.

    The adult industry isn’t convinced. According to the Register –

    Existing porn companies with large portfolios of domains in other extensions are concerned that they will be forced to spend thousands on defensive registrations or risk being cybersquatted.

    Some porn publishers are also worried that .xxx domains carry the risk that ICM’s policy-setting body, IFFOR (International Foundation for Online Responsibility) may create draconian new rules that will damage their businesses in future.

    ICM Registry is now in the process of finding “technology partners” for its upcoming search engine.

  • Keywords and Content Marketing


    I recently had an interesting discussion with Ron Jones, who is writing a book specifically on using keywords for online marketing called "Keyword Intelligence". He was researching the content marketing portion of the book, and we talked about where keywords fit. These kinds of discussions are great for blog posts, so here are a few ideas for you on keywords, SEO, social media and content.

    Content marketing is customer centric and therefore often focused not only on creating information to educate prospects and customers about product/service features and benefits, but also on topics of interest relevant to the situations that cause people to need or want those products and services.

    Effective content marketing informs prospective buyers of what they need to know in order to help them arrive at a logical conclusion to buy and recommend. Relevant and engaging content facilitates that outcome.

    “Great content isn’t great until it’s discovered and shared.”

    Understanding the information needs of the customers you're trying to reach is the first step in creating a great editorial plan. The role of keywords in a content marketing program comes into play as a manifestation of knowing what customers are interested in and what their pain points are. What are they searching for? What are they talking about on the social web?

    Great content is best optimized, so to speak, for the intended reader first and foremost. At the same time, that content is thoughtful about keywords that can attract new readers through search and social recommendations. Great content is amazing. Great content that is findable and shareable is even better.

    Here’s an Example Scenario:  Company 1 2 3 wants to focus on “Round Widgets”

    • Target Customers Care About Round Widgets That Cost Less and are Environmentally Safe
    • Target Customers Search for “round widgets”, “low cost widgets”, “green widgets”, “environmentally safe widgets”
    • Target Customers Socially Discuss “save money on widgets”, “widget impact on the environment”
    • The Content Plan Outlines An Array of Content Objects Supporting Search Keywords & Social Topics
    • Content Plan Tactical Execution: Blog Hub, Video Tips, Shared Customer Widget Photos, Facebook Page for Widget Environmental Tips, Email Tips & Issues Newsletter, Widget Deals Twitter Account, Guest Blog Posts Using Target Keywords on Widget Blogs, Contributed Articles to Consumer & Environmental Publications on Widget Cost Saving Tips and Being “Green”

    By coordinating customer needs with content creation, optimization and social publishing, there’s a much greater and more relevant reach for the investment.
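    To make that coordination concrete, here is a minimal sketch (all names are hypothetical) of a content plan kept as a simple data structure, with a quick check for keywords and social topics that no planned content object targets yet:

```python
# Toy content plan for the "Company 1 2 3" scenario above: tie search keywords and
# social topics to the content objects meant to serve them. Names are invented.
content_plan = {
    "search_keywords": ["round widgets", "low cost widgets",
                        "green widgets", "environmentally safe widgets"],
    "social_topics": ["save money on widgets", "widget impact on the environment"],
    "content_objects": [
        {"type": "blog hub",         "targets": ["round widgets"]},
        {"type": "video tips",       "targets": ["low cost widgets", "save money on widgets"]},
        {"type": "facebook page",    "targets": ["widget impact on the environment"]},
        {"type": "email newsletter", "targets": ["green widgets"]},
    ],
}

# Which keywords or topics have no content object pointed at them yet?
covered = {target for obj in content_plan["content_objects"] for target in obj["targets"]}
gaps = [k for k in content_plan["search_keywords"] + content_plan["social_topics"]
        if k not in covered]
print("uncovered keywords/topics:", gaps)   # -> ['environmentally safe widgets']
```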

    Keywords guide content optimization for findability through search engines as well as a focus on topics that customers care about and are discussing on the social web. Keywords are also useful guides for blogger and publication outreach.

    Keywords drive the “optimize and socialize” efforts of content marketers to share, promote and increase the reach of information that is relevant for customers who may buy or refer brand products and services.

    The mistake online marketers often make is to lead solely with keywords (vs. customer needs), thinking that optimizing for the most popular phrases is all that is needed to maximize customer reach. High ranking content that doesn't resonate with readers to share or with customers to buy and refer isn't an effective approach. Also, customer information needs will vary according to where they are in the research and buying process.

    Keywords and topics change over time so even after a customer is acquired, it’s important to monitor, measure and refine as needed.

    My question for you: Are your content marketing and optimization efforts focused solely on high popularity count keywords? Are you digging into both search keywords and social topics as you formulate your content marketing strategy?

    Originally published at TopRank Online Marketing Blog

  • Search Engine Patents and Panda


    Bill Slawski is the president and founder of SEO by the Sea, and has been engaging in professional SEO and internet marketing consulting since 1996. With a Bachelor of Arts Degree in English from the University of Delaware, and a Juris Doctor Degree from Widener University School of Law, Bill worked for the highest level trial Court in Delaware for 14 years as a court manager and administrator, and as a technologist/management analyst. While working for the Court, Bill also began to build and promote web pages, and became a full time SEO in 2005. Working on a wide range of sites, from Fortune 500 to small business pages, Bill also blogs about search engine patents and white papers on his seobythesea.com blog.

    What are the Most Likely Signals Used by Panda?

    Eric Enge: Let’s chat about some of the patents that might be playing a role in Panda 1, 2, 3, 4, 5, 6, 7 and beyond. I would like to get your thoughts on what signals are used for measuring either content quality or user engagement.

    Bill Slawski: I’ve been looking at sites impacted by Panda. I started from the beginning with remedial SEO. I went through the sites, crawled through them, looked for duplicate content issues within the same domain, looked for things that shouldn’t be indexed that were, and went through the basic list that Google provides in their Webmaster Tools area.

    In the Wired interview with Amit Singhal and Matt Cutts regarding this update, they mentioned an engineer named Panda. I found his name on the list of papers written by Googlers and read through his material. I also found three other tool and systems engineers named Panda, and another engineer who writes about information retrieval and architecture. I concluded that the Panda in question was the person who worked on the PLANET paper (more on this below).

    For signals regarding quality, we can look to the lists of questions from Google. For example, Does your web site read like a magazine? Would people trust you with their credit card? There are many things on a web site that might indicate quality and make the page seem more credible and trustworthy and lead the search engine to believe it was written by someone who has more expertise.

    The way things tend to be presented on pages, for instance where ad blocks are shown, may or may not be signals. If we look at the PLANET whitepaper, "Massively Parallel Learning of Tree Ensembles with MapReduce", its focus isn't so much on quality signals or user feedback but, rather, on how Google is able to take a machine learning process dealing with decision trees and scale it up to use multiple computers at the same time. They could put many things in memory and compare one page against another to see whether certain features and signals appear on those pages.

    Eric Enge: So, the PLANET whitepaper described how to take a process, which before was constrained to a single machine, and put it into a distributed environment to gain substantially more power. Is that a fair assessment?

    Bill Slawski: That would be a fair assessment. It would use the Google file system and Google's MapReduce. It would enable them to draw many things into memory, compare them to each other, and change multiple variables at the same time, for example with a regression-model type of approach.

    Something that may have been extremely hard to use on a very large dataset becomes much easier when it can scale. It’s important to think about what shows up on your web page as a signal of quality.
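    To picture what that scaling looks like, here is a toy, single-process sketch of the map-and-reduce pattern the PLANET paper describes. It is not Google's code, and the page features, labels, and Gini scoring are invented for illustration: each "mapper" condenses its shard of examples into per-split counts, and the "reducer" merges those counts so the best split can be chosen without any one machine holding the whole dataset.

```python
# Toy PLANET-style split selection: summarize shards independently ("map"),
# merge the summaries ("reduce"), then pick the best decision-tree split.
from collections import defaultdict

def map_shard(shard, feature_names):
    """Summarize one shard: per candidate split, count positives/negatives on each side."""
    stats = defaultdict(lambda: [0, 0, 0, 0])   # [left_pos, left_neg, right_pos, right_neg]
    for features, label in shard:
        for name in feature_names:
            key = (name, 0.5)                   # one toy threshold per boolean feature
            side = 0 if features[name] <= 0.5 else 2
            stats[key][side + (0 if label else 1)] += 1
    return stats

def reduce_stats(partials):
    """Merge per-shard summaries by element-wise addition."""
    total = defaultdict(lambda: [0, 0, 0, 0])
    for partial in partials:
        for key, counts in partial.items():
            total[key] = [a + b for a, b in zip(total[key], counts)]
    return total

def gini(pos, neg):
    n = pos + neg
    if n == 0:
        return 0.0
    p = pos / n
    return 2 * p * (1 - p)

def best_split(total):
    """Choose the candidate split with the lowest weighted Gini impurity."""
    def impurity(counts):
        lp, ln, rp, rn = counts
        n = lp + ln + rp + rn
        return ((lp + ln) * gini(lp, ln) + (rp + rn) * gini(rp, rn)) / n
    return min(total.items(), key=lambda item: impurity(item[1]))

# Hypothetical page-quality examples, split across two "machines".
shard_a = [({"has_full_sentences": 1, "ad_above_fold": 0}, 1),
           ({"has_full_sentences": 0, "ad_above_fold": 1}, 0)]
shard_b = [({"has_full_sentences": 1, "ad_above_fold": 1}, 0),
           ({"has_full_sentences": 1, "ad_above_fold": 0}, 1)]

partials = [map_shard(s, ["has_full_sentences", "ad_above_fold"]) for s in (shard_a, shard_b)]
print(best_split(reduce_stats(partials)))   # the ad_above_fold split separates the toy labels
```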

    It's possible that their approach is to manually identify pages that have quality (content quality, presentation, and so on) and use those as a seed set for the machine learning process, which then identifies other pages and how well they may rank in terms of these different features. That makes it harder for us to determine expressly which signals the search engines are looking for.

    If they are following this PLANET-type approach in Panda with the machine learning, there may be other things mixed in. It is hard to tell. Google may not have solely used this approach. They may have tightened up phrase-based indexing and made that stronger in a way that helps rank and re-rank search results.

    Panda may be a filter on top of those where some web sites are promoted and other web sites are demoted based upon some type of quality signal score.

    It appears that Panda is a re-ranking approach. It’s not a replacement for relevance and Page Rank and the two hundred plus signals we are used to hearing about from Google. It may be a filter on top of those where some web sites are promoted and other web sites are demoted based upon some type of quality signal score.

    Eric Enge: That’s my sense of it also. Google uses the term classifier so you could imagine, either before running the basic algorithm or after, it is similar to a scale or a factor up or down.

    Bill Slawski: Right. That’s what it seems like.

    Page Features as an Indicator of Quality

    Eric Enge: You shared another whitepaper with me which dealt with sponsored search. Does that whitepaper add any insight into Panda? The PLANET paper followed up on an earlier paper on sponsored search which covered predicting bounce rates on ads. It looked at the landing pages those ads brought you to based upon features found on the landing pages.

    They used this approach to identify those features and then determined which ones were higher quality based upon their feature collection. Then they could look at user feedback, such as bounce rates, to see how well they succeeded or failed. This may lead to metrics such as the percentage of the page above the fold which has advertising on it.

    Bill Slawski: Now you are talking about landing pages, so many advertisers may direct someone to an actual page where they can conduct a transaction. They may bring them to an informational page, or an information-light page, that may not be as concerned with SEO as it is with calls to action, signals of reassurance using different logos, and symbols that you would get from security certification agencies.

    That set of signals is most likely different from what you would find on a page that was built for the general public or for search engines. However, if you go back to the original PLANET paper, they said, "this is sort of our proof of concept, this sponsored search thing. If it works with that it can work well with other very large datasets in places like organic search."

    Eric Enge: So, you may use bounce rate directly as a ranking signal but when you have newer information to deal with why not predict it instead?

    Bill Slawski: Right. If you can take a number of features out of a page and use them in a way that gives them a score, and if the score can match up with bounce rate and other user engagement signals, chances are a feature-based approach isn’t a bad one to take. Also, you can use the user behavior data as a feedback mechanism to make sure you are doing well.
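    A crude way to picture that feedback loop is to score pages from on-page features and then check how strongly the score tracks observed bounce rates. Everything below (features, weights, and bounce rates) is made up purely for illustration:

```python
# Toy sketch: a feature-based "badness" score, validated against observed bounce rates
# rather than using the noisy user data as the ranking signal itself.
import numpy as np

# Hypothetical per-page features: [ads_above_fold, words_in_main_content, broken_links]
pages = np.array([
    [3,  250, 5],
    [0, 1200, 0],
    [1,  800, 1],
    [4,  150, 7],
])
observed_bounce_rate = np.array([0.82, 0.31, 0.45, 0.90])   # made-up measurements

# Made-up weights: more ads and broken links hurt, more real content helps.
weights = np.array([0.15, -0.0004, 0.05])
feature_score = pages @ weights

# If the feature-based score tracks real behavior, the correlation should be strongly positive.
r = np.corrcoef(feature_score, observed_bounce_rate)[0, 1]
print(f"correlation between feature score and bounce rate: {r:.2f}")
```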

    Eric Enge: So, you are using the actual user data as a validator rather than a signal. That’s interesting.

    Bill Slawski: Right. You could do the same thing with organic search, which, to a degree, they did with the blocked pages signal. That's where 85% of pages that were blocked were also pages that had lower quality scores. You can also look at other signals, for example, long clicks.

    Eric Enge: Long clicks, what’s that?

    Bill Slawski: I dislike the term bounce rate because, by itself, it doesn't conclusively mean that someone visits the page and then leaves in under a few seconds. It only tells you that someone goes to a page, looks at it, maybe spends time on it, and then leaves without going somewhere else on the site. A long click is when you go to a page and you actually spend time there.

    Eric Enge: Although, you don’t know whether or not they spent time there because they had to deal with a phone call.

    Bill Slawski: Or, they opened something else up in a new tab and didn’t look at it for a while. There are other things that could measure this and ways to confirm agreement with it, such as how far someone scrolls that page.

    Eric Enge: Or, if they print the page.

    Bill Slawski: And clicks at the bottom of the page.

    Eric Enge: Or clicks on some other element. Could you track cursor movements?

    Bill Slawski: There have been a couple patents, even some from Google, on tracking cursor movements that they may possibly use someday. These could give them an indication of how relevant something may, or may not, be to a particular query.

    One patent is described as being used on a search results page, and it shows where someone hovers for a certain amount of time. If it’s a search result, you see if they hover over a one-box result which may give them an incentive to continue showing particular types of one-box results. That’s a possibility, mouse pointer tracking.

    Bounce Rates and Other User Behavior Signals

    Eric Enge: Getting back to the second whitepaper, what about using the actual ad bounce rate directly as a signal because that’s also potentially validating a signal either way?

    Bill Slawski: It’s not necessarily a bad idea.

    Eric Enge: Or low click through rates, right?

    Bill Slawski: As we said, user signals sometimes tend to be noisy. We don’t know why someone might stay on one page longer than others. We don’t know if they received a phone call, if they opened it up in a new tab, if they are showing someone else and have to wait for the person, or there are plenty of other reasons.

    You could possibly collect different user behavior signals even though they may be noisy and may not be an accurate reflection of someone's interest. You could also take another approach and use the user behavior signals as feedback. To see how your methods are working, you then have a wider range of different types of data to check against each other.

    Rather than having noisy user data be the main driver for your ranking… you look at the way content is presented on the page.

    Bill Slawski: That's not a bad approach. Rather than have noisy user data be the main driver for your rankings, you find another method that looks at the way content is presented on a page. One area is segmentation of a page, which identifies different sections of a page by looking at features that appear within those sections or blocks, and which area is the main content part of a page. It's the part that uses full sentences, or sometimes sentence fragments, uses periods and commas, and capital letters at the beginning of lines of text. You use a Visual Gap Segmentation (white space) type process to identify what might be an ad, what might be navigation, and where things might be, such as main content areas or a footer section. You look for features in sections.

    For instance, a footer section is going to contain a copyright notice and being able to segment a page like that will help you look for other signals of quality. For example, if an advertisement appears immediately after the first paragraph of the main content area you may say, “well, that’s sort of intrusive.” If one or two ads take up much of the main space, that aspect of the page may lead to a lower quality score.
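    As a rough illustration of that kind of block labeling, here is a minimal sketch built on my own heuristics, not Google's, and assuming the third-party BeautifulSoup library. It labels top-level blocks and then flags an ad that sits immediately after the first main-content block:

```python
# Toy page segmentation: label blocks with simple heuristics, then flag an ad
# placed right after the main content block. Requires: pip install beautifulsoup4
from bs4 import BeautifulSoup

def label_block(tag):
    text = tag.get_text(" ", strip=True)
    hints = " ".join(tag.get("class", []) + [tag.get("id") or ""]).lower()
    if "ad" in hints.split() or "advert" in hints:
        return "ad"
    if "©" in text or "copyright" in text.lower():
        return "footer"
    links = tag.find_all("a")
    if links and len(text.split()) <= 3 * len(links):       # mostly short linked labels
        return "navigation"
    if text.count(".") >= 2 and len(text.split()) > 40:     # full sentences, real length
        return "main content"
    return "other"

html = """<div id="nav"><a href="/">Home</a> <a href="/shop">Shop</a></div>
<div id="story"><p>%s</p></div>
<div class="ad">Buy widgets now!</div>
<div id="foot">© 2011 Example Inc.</div>""" % ("A real sentence about widgets. " * 25)

blocks = BeautifulSoup(html, "html.parser").find_all("div", recursive=False)
labels = [(label_block(block), block.get("id") or block.get("class")) for block in blocks]
print(labels)

# Heuristic from the discussion: an ad immediately after the main content block
# might read as intrusive and count against a quality score.
for current, following in zip(labels, labels[1:]):
    if current[0] == "main content" and following[0] == "ad":
        print("possible intrusive ad placement")
```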

    How the Search Engines Look at a Page

    Eric Enge: I understand how features may impact the search engine’s perception of a page’s quality, but that presumes they can unravel the CSS to figure out where things are really appearing.

    Bill Slawski: Microsoft has been writing white papers and patents on the topic of Visual Gap Segmentation since 2003. Google had a patent called "Determining semantically distinct regions of a document" involving local search, where they could identify blocks of text reviews for restaurants or other places that may be separated.

    For example, you have a Village Voice article about restaurants in Greenwich Village in New York. It has ten paragraphs about ten different restaurants; each paragraph starts with the name of the restaurant, ends with the address, and in between is the review.

    This patent said, "we can take that page, segment those reviews, and identify them with each of the individual restaurants," and then, a couple of paragraphs later, they say, "we can also use the segmentation process in other ways, like identifying different sections of a page, main content, a header, a footer, and so on." Google was granted a patent on a more detailed page segmentation process about a month ago.

    Bill Slawski: Segmentation is probably part of this quality review, being able to identify and understand different parts of pages. They don’t just look at CSS. In the days where tables were used a lot you had the old table trick.

    You moved the content up and, depending on how you arranged a table, you could use absolute positioning. With CSS you can do the same type of thing, but the search engine is going to use some type of simulated browser. It doesn't render a page completely, but it gives them an idea if they look at the DOM (Document Object Model) of a page.

    They look at some simulation of how the page will render, like an idea of where white space is, where HR tags might be drawing lines on the page, and so on. They can get a sense of what appears where, how things are separated, and then try to understand what each of those blocks does based upon linguistic-based features involving those blocks.

    Is it a set of multiple single-word items that have links attached to them? If, for instance, each one is capitalized, that might be main navigation. So, you can break up a page like that and look at where things appear. That could be a signal, a quality signal. You can see how they are arranged.

    The Search Engines Understand That There Are Different Types of Sites

    Eric Enge: Does the type of site matter?

    Bill Slawski: Most likely there is some categorization of types of sites so you are not looking at the same type of quality signals on the front page of a newspaper as you are on the front page of a blog or an ecommerce site.

    You can have different types of things printed on those different places. You are not going to get a TRUSTe badge on a blog, but you might on an ecommerce site. You look at the different features and realize that different genres, different types of sites, may have different ones associated with them.

    Eric Enge: Yes.

    Bill Slawski: That may have been derived when these seed quality sites were selected. There may have been some preprocessing to identify different aspects, such as ecommerce site labels, blog labels, and other things, so whatever machine learning system they used could make distinctions between types of pages and see different types of features with them.

    It's called a Decision Tree Process, and this process would look at a page and ask, "is this a blog, yes or no? Is this a news site, yes or no?" It works along different pathways, asking questions, to arrive at that final score.
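    In code, that kind of pathway is just a chain of yes/no questions. A toy version, with invented features that are nothing like Google's actual ones, might read:

```python
# Toy "decision tree process": walk yes/no questions about a page to reach a label.
# Every feature name and branch here is invented purely for illustration.

def classify_page(page):
    if page.get("has_shopping_cart"):
        # ecommerce pages get judged on ecommerce-style trust features
        if page.get("has_truste_badge") or page.get("has_https_checkout"):
            return "ecommerce, higher trust"
        return "ecommerce, lower trust"
    if page.get("has_dated_posts_in_reverse_order"):
        return "blog"
    if page.get("has_bylines_and_datelines"):
        return "news"
    return "other"

example = {"has_shopping_cart": True, "has_https_checkout": True}
print(classify_page(example))   # -> "ecommerce, higher trust"
```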

    Eric Enge: Other things you can look at are markers of quality, such as spelling errors on the page. I think Zappos, if I remember correctly, is currently editing all their reviews because they’ve learned that spelling errors and grammar affect conversion. So, that’s a clear signal they could potentially use, and the number of broken links is another.

    Another area that's interesting is when you come to a page and it is one long block of text. There may be a picture on top, but that's probably a good predictor of a high bounce rate. If it is a research paper, that's one thing, but if it is a news article that is something else.

    Bill Slawski: Or, if it’s the Declaration of Independence.

    Eric Enge: Right, but they can handle that segmentation. If someone is looking for a new pair of shoes, and they come to a page with ten paragraphs of text and a couple of buttons to buy shoes, that’s a good predictor of a high bounce rate.

    Bill Slawski: On the other hand, if you have a page where there is an H1 header and a main heading at the top of the page, a couple of subheadings, a list, and some pictures that all appear to be meaningful to the content of the page, that would be a well-constructed article. It's readable for the web, it's easy to scan and it's easy to locate different sections of the page that identify different concepts. This may make the page more interesting, more engaging, and keep people on a page longer.

    So, do these features translate to the type of user behavior where someone will be more engaged with the page and spend more time on it? Chances are, in many cases, they will.

    User Engagement Signals as a Validator

    Eric Enge: Another concept is that user engagement signals standing by themselves may be noisy, but ten of them collectively probably won't be. You could take ten noisy signals and, if eight of them point in the same direction, then you've got a signal.

    Bill Slawski: They reinforce each other in a positive manner.

    Eric Enge: Then you are beginning to get something which is no longer a noisy signal.
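    Stated as a toy calculation (the signal names and the 80% agreement threshold are invented for illustration), you treat each noisy signal as a vote and only act when a clear majority agrees:

```python
# Toy aggregation of noisy engagement signals: each one "votes" engaged (+1) or
# disengaged (-1); the combined reading is trusted only when most signals agree.
signals = {
    "long_click": +1, "scrolled_past_fold": +1, "printed_page": +1,
    "clicked_in_footer": +1, "returned_to_results_quickly": -1,
    "copied_text": +1, "bookmarked_page": +1, "blocked_site": -1,
    "shared_link": +1, "zoomed_text": +1,
}

positive = sum(1 for vote in signals.values() if vote > 0)
if positive >= 0.8 * len(signals) or positive <= 0.2 * len(signals):
    verdict = "engaged" if positive > len(signals) / 2 else "not engaged"
else:
    verdict = "too noisy to call"

print(f"{positive} of {len(signals)} signals point the same way -> {verdict}")
```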

    Bill Slawski: Right. For example, if you have a warehouse full of people, in an isolated area, printing out multiple copies of the same document over and over and over, because they think printing a document is a user behavior signal that the search engine might notice, you are wasting a lot of paper and a lot of time.

    In isolation that is going to look odd, it’s going to be an unusual pattern. The search engine is going to say, “someone is trying to do something they shouldn’t be doing.”

    Eric Enge: Yes. That can become a direct negative flag, and you must be careful because your competitor could do it to you. So, the ballgame seems to go on. What about misleading information which was covered by a Microsoft white paper?

    Bill Slawski: That was about concepts involving web credibility that Microsoft attempted to identify. It involved both on-site factors and off-site factors, and a third category, called aggregated information, which was the user behavior data they collected about pages. If you had on-site factors such as security certificates, logos, and certain other features, that would tend to make you look more credible. The emphasis is more on credibility than quality. It seems that the search engines are equating credibility with quality to a degree.

    Bill Slawski: The AIRWeb Conference, which was held five years in a row but not held last year, was held again this year. It covered adversarial information retrieval on the web in conjunction with another workshop on credibility. They called it the 2010 Web Quality Conference, and it was attended by people from Google, Microsoft, Yahoo and a number of academic participants.

    Design actually plays a very important part, maybe bigger than most people would assume, when it comes to people assessing whether or not a site is credible.

    You can go back a number of years to the Stanford Persuasive Technology Lab's research and work on credibility. One of the findings, from a study of five thousand web sites or so, was that design plays an important part, maybe bigger than most people would assume, when it comes to people assessing whether or not a site is credible.

    They also came out with a series of guidelines on things that will make your web site appear more credible to people. These included photographs of the people behind the site, explicitly showing an address, and having a privacy policy, an 'about us' page, or terms of service. These are on-page signals you could look at.

    There are many off-page signals you could look at, such as winning a Webby Award, being recognized in other places, being cited on authoritative-type sites, or even PageRank, which they said they would consider as a signal to determine whether or not a page was a quality page. In the Microsoft paper they said they would look at PageRank, which was interesting.

    Populating Useful Information Among Related Web Pages

    Eric Enge: Then you have the notion of brand searches. If people are searching for your brand, that's a clear signal; if you have a no-name web site, there are no searches for the web site name or the owner's company name.

    Bill Slawski: That stirs up a whole different kettle of fish, and it leads to how you determine whether or not a page is an authority page. For instance, Google decides that, when somebody types ESPN into the search box on the toolbar, the ESPN web site should be the first one to come up. It doesn't matter much what follows it. The same goes if they type Hilton. This gets into the topic of data the search engines identify as named entities, specific people and places: how do they then associate those with particular query terms, and if those query terms are searched for, how do they treat them?

    Do they look at it as a navigational query and ensure the site they associated with it comes up? Do they imply a site search and show four, five, six, or seven different results from that web site in the top ten, which Google had been doing for a good amount of time?

    Eric Enge: Even for a non-brand search, for instance, Google surely associates Zappos with shoes. Right? So, in assessing authority compared to some other new shoe site, you could reference the fact that the brand name Zappos is searched for a lot, and that could be a direct authority signal for any search on the topic of shoes.

    Bill Slawski: Right. Let us discuss a different patent from Google that explores that and goes into it in more detail. There was one published in 2007 that I wrote about called “Populating useful information among related web pages.” It talks about how Google determines which web site might be associated with a particular query and might be identified as authoritative of it.

    In some ways, it echoes some of the things in the Microsoft paper about misinformation about authority. It not only looks at things it may see on the web, such as links to the pages using anchor text with those terms, but it may also look to see whether or not the term is a registered trademark that belongs to the company that owns a particular web site. It may also look at the domain name or yellow page entries.

    One of the authors of this patent also wrote a number of the local search patents which, in some parts, say that citations are just as good as links. The mention of a particular business at a particular location will more likely rank higher if somebody does a search for businesses of that type in that location. So, this patent from Google expands beyond local search to find authoritative web pages for particular queries.

    Rejecting Annoying Documents

    Eric Enge: Excellent. Since we are getting towards the end I’d like your thoughts on annoying advertisements.

    Bill Slawski: Google came up with a patent a few years ago which, in some ways, seems a bit similar to Panda. It focused upon features on landing pages and the aspects of advertisements. It was called “Detecting and rejecting annoying documents”.

    It provided a list of the types of things they may look at in ads and on landing pages: the subject matter, characteristics and ratings, what type of language it uses, geographically where it is from, and who the owner of the content is.

    Eric Enge: It may even detect content in images using OCR or other kinds of analysis to understand what is in an image.

    Bill Slawski: Right, and also locate Flash associated with an ad, locate the audio that might be played, look at the quality of images, and whether or not they are animated. It was a big list. I do not know if we will see a patent anytime soon from Google that gives us the same type of list involving organic search and the Panda approach. Something might be published two, three or four years from now.

    Eric Enge: It's interesting. Obviously, which patents they are using and not using is something you don't get visibility into unless you are in the right building at the right time at the Googleplex.

    It seems to me the underlying lesson is that you need to be aware of search engines and, obviously, make search engine savvy web sites. The point is you need to focus on what people should have focused on all along which is: What do my users want? How do I give it to them? How do I engage them? How do I keep them interested? Then create a great user experience because that’s what they are trying to model.

    My perspective is search engines are another visitor to your web site like anybody else.

    Bill Slawski: Right. My perspective is that search engines are another visitor to your web site like anybody else. They may have different requirements. There may be some additional technical steps you have to take for your site to cater to them, but they are a visitor and they want what other visitors to your site want. They want to fulfill some type of informational or situational need. They want to find information they are looking for. They want to buy what you offer if, in the snippets that show up in search results, that’s what you do offer.

    If you are a web site that’s copying everybody else and not adding anything new or meaningful, not presenting it in a way that makes it easier to read and easier to find, and there is nothing that differentiates you or sets you apart, then you are not treating potential visitors the best way you can.

    When you do SEO, even in the age of Panda, you should be doing all the basics. It’s a re-ranking approach. You need to get rid of the same content with multiple different URLs, get rid of pages that are primarily keyword insertion pages where a phrase or two or three changes but the rest of everything stays the same.

    When you write about something, if you are paying attention to phrase-based indexing, make sure you include related information that most people would include on that page, related terms and so on. Those basics don’t go away and they may be more important now than they were in the past.

    Yes. As a searcher, as someone who helps people with web sites, and as someone who may present my own stuff on web sites, I want to know how it works. When I do a search, I want to make sure I am finding the things that are out on the web.

    Get some sweat equity going and make sure your stuff is stuff people want to see, learn about the search space as much as you can.

    Bill Slawski: If Google can do anything to make it better at surfacing the things I need, or want, or hope to see, I think everybody wins. That may be more work for people putting content on the web, but the cost of sweat is fairly cheap. Get some sweat equity going and make sure your stuff is stuff people want to see, and learn about the search space as much as you can.

    As a ranking signal we have relevance, we have importance and, increasingly, we have content quality.

    Eric Enge: How is life for you otherwise?

    Bill Slawski: I have been trying to keep things local, get more involved in my local community, and do things with the local Chamber of Commerce. I now live in an area that’s much more rural in Northwestern Virginia and some of these local business people need the help.

    I am really close to DC and have been trying to work more with nonprofits. Instead of traveling, I am meeting many people locally, helping people learn more about what they can do with their web sites and that’s pretty fulfilling.

    Bill Slawski: I live in horse country now; there might actually be more horses in my county than there are people.

    Eric Enge: Thanks Bill!

    Originally published at Ramblings About SEO

  • Linkbait and Content Marketing – What Are Your Goals?

    One of the most difficult conversations I have with new or prospective clients is about Linkbait and content marketing, and explaining how its real goal isn't to drive sales but to build links, build awareness, and send social signals to the search engines. To make this post useful and actionable, I'm going to take you through the process/planning stage for a former client of mine who has since sold his business.

    An important point to understand: we are targeting the “general online population,” not just potential customers …

    When I was working for the man and building my own business by moonlighting at night (ok–and a little during the day), one of my first clients was a salt water fish store. He sold fish and aquarium supplies online. Now, ultimately, his goal was to get his content in front of people who own salt water fish tanks and are interested in his products. However, unless you are a well known brand, competing on price (aka running a sale or promotional offer), or offering an impulse purchase (no long term commitment and low price), you won't make sales from social media (stay tuned to the end, when I will talk more about this).

    IMHO the biggest benefits from social media are link building potential, brand awareness, and social media signals (see what social signals might Google use). Let's take a look at our niche:

    • There is a small subset of the population that has a salt water fish tank and has a potential interest in our merchandise.
    • There is a slightly larger subset of people who know someone who has a salt water fish tank and might forward/share with them content they come across.
    • There is a larger subset of people who are interested in the science/nature/environmental aspects of marine life, marine mammals, and ocean life.
    • There is a larger set of people who are interested in travel aspect of marine life, snorkeling, scuba, diving with sharks, swimming with dolphins, and visiting aquatic-related travel destinations.
    • There is a larger set of people who would enjoy/share photos of marine/ocean-related content, especially if the photos are beautiful, interesting, engaging, or unusual.
    • There is a much larger set of people who will read/share interesting content that is about marine related subject matter, if it is exceptional.
    • There is a small group of people who will publish marine related content and will link to it
    • There is a medium sized group who will write/link/tweet about marine based content if it is exceptional enough (aka the linkerati)

    We are going to target two groups of people because they include most of the other groups. They are “people who will read/share marine based content if it is interesting enough” and “people who will write/link/tweet about marine based content if it is interesting enough.” This is an important point to understand: we are targeting the “general online population,” not just potential customers, because our goals are links, sharing, and social signals.

    So how do we get started? Let’s come up with some potential ideas for our Linkbait (see creating exceptional content for boring subjects):

    Top 10/15/20 Most Beautiful/Ugly/Bizarre Creatures in the Ocean – This isn’t a typical piece of image based Linkbait. I would do all three. Just choose a different number for each one and space them out at least a month apart.

    Best Places to Scuba/Snorkel in the Country/Continent/World – This is a bit of travel Linkbait but, again, it has multiple versions. In fact, you can do them as head & tail content and refresh the posts every year like seasonal living URL's.

    Most Expensive/Dangerous Seafood Meals – This has a lot of options. You can do an infographic comparing seafood prices to other foods like beef and chicken. You could do map graphics of seafood consumption. You could create cooking linkbait about expensive seafood, or dangerous seafood to eat (like the fugu blowfish). You can do Eco/green based content on the sustainability of seafood. You can do "mom" based content like how to eat healthy seafood on a budget. You can do health focused content on seafood. There are lots and lots of variations here.

    Largest Marine Mammals/Fish/Invertebrates – People like stories about giant sharks, whales, squid or octopi, and you can revisit this kind of post every 2-3 years as news/science updates (see how often should I update my content and updating evergreen content).

    Most Dangerous/Poisonous/Deadly Fish/Sea Snakes/Marine Life – Again, people tie into group-think and share common fears of (and fascination with) sharks, snakes, piranhas, and general ocean life. Just be careful and don’t run a scuba piece right before or after a piece about dangerous sharks. It looks … contradictory.

    Most Beautiful Ocean/Beach/Underwater photography/paintings – Again people like looking at “nice pictures.”

    Now, this list is by no means all encompassing and the titles are just working concepts at this point. Hopefully they give you some idea about how you can take a niche shopping site and widen the focus to include a larger group of people who would be interested in liking/sharing/linking to your website/blog.

    The next step is to start to flesh out the articles. Do a little research and figure out which one will have the best content. For example, use a service like oDesk and hire someone to research the most expensive seafood dishes, both currently and historically. Have them be on the lookout for unusual anecdotes, like seafood that was expensive and hunted to extinction or seafood that's illegal to eat. Have them give you source links so you can verify the data before sending it off to your premium content writer or infographic artist.

    Once you know about your pieces, start scheduling them and sending them out to be produced. You could push out a minor piece every 2-3 weeks and a major piece every 4-5 weeks. You want to spread out similar pieces unless you are doing a content series. Make sure you have the tail pieces in place before you push out the head (see head and tail content). As I mentioned above, don't push out a "Top 5 most dangerous sharks of Australia" back to back with "Best places to scuba dive in Australia." It looks … odd.

    So what are the takeaways from this post:

    • Think about who your customers are then widen the focus to include as large an audience as possible while still staying “on topic.”
    • Brainstorm for ideas on possible topics for articles.
    • Do research then prioritize/schedule content creation.
    • Create any backup content you may need.
    • Create content and schedule for publication.
    • Spread campaigns out over time to send new links and social signals to search engines over a prolonged period of time.
    • Pay attention to seasonal news/events and tie into them.
    • Look to update science/news/informational content on a regular basis as needed. Use living URL’s.

    Ok, you made it to the end. This post has some bonus content! What if you do want to actually sell things using social media? Well IMHO you will need to do one or more of these things:

     

    • Be a well known, established, trusted brand – If Amazon puts out a top Father’s Day gift ideas list, people will buy from them because they know/trust Amazon. If you aren’t Amazon, you will have a hard time with this strategy.
    • Compete on price – If you offer a sale, discount, or promotional price below your competition, you may make some sales. Keep the item(s) as general interest as possible (aka you can’t sell catfood–no matter how low the price–to someone who doesn’t have cats)
    • Be General Interest, Low Commitment – A lot of people like clown fish thanks to “Finding Nemo,” but not everyone wants to commit to having a fish tank, not even at a cheap price for a startup tank with a free clown fish. However, almost everyone can buy a T-shirt with sharks saying funny things on it.
    • Be impulse priced – Lots of people want to go to France for vacation, but not a lot of people will drop a thousand dollars or more on a discount vacation at the drop of a hat. However, a lot of people will spend $10/$20/$50 on an impulse item if they like it.

    Originally published on Graywolf’s SEO Blog