WebProNews

Tag: linking

  • Link Expert Eric Ward Talks Fear And The Evolution Of Links

    The nature of links on the web, particularly in relation to search, has changed a lot over the years, and there’s probably nobody out there with more insight into this than Eric Ward, who’s focused on this specific topic for many years.

    As Search Engine People says in an article about 58 SEO authorities, “Eric’s been building links since…1994. There’s literally no one in SEO that can claim that much experience.”

    If you search the phrase “link building expert” in any of the main search engines, you’ll likely find Ward right at the top.

    Ward is a longtime friend of WebProNews, and has offered his expertise throughout various industry publications, and we’re happy to have caught up with him after quite some time to gain some perspective on the state of links and linking in 2015.

    Has your link building strategy changed significantly over the years? Let us know in the comments.

    You’ve been in link building for 20 years or so. What are the biggest changes in strategy in 2015 compared to when you started?

    Eric Ward: “Thank you Chris for the opportunity to share my thoughts. Well first and foremost I have to say the biggest change is that today people who work in content promotion seem to fear links. Many people feel the very act of pursuing links has become evil, which is sad because it’s not even close to true. In 1994 nobody gave any thought to the idea that a link to a website could be a bad thing. The entire concept of a poisoned link profile is simultaneously comic and tragic. Links are not ‘things’. Links are not imbued with the quality of Good or Evil. Links are the visible manifestation of a human’s action and opinion, and in some cases, intent.”

    “For me the second biggest change over the years is the ease with which URLs can now be shared and migrate throughout the web and between people. There really was a time when the easiest way to tell somebody about a really cool site was just to call them on the phone and tell them the URL. This was back in the 14.4 modem days. If you were lucky a web page might give you the option to ‘Email this page to a friend’, but then you’d click that option and have to fill out a form requesting 21 fields of data from you first. I have fond memories of hundreds of sticky notes with URLs written on them stuck all over my monitors, keyboards, kitchen table, mirrors.”

    “But back to my previous comment about fearing links. The reason nobody had to fear links back then was because none of the search engines at that time used link analysis or any type of linking related metric as a part of their algorithms. These were the pre-Google days. That’s the key point. The linking strategies I would use then had nothing to do with any sort of manipulation of the search engines because there were no search engines to manipulate links for. SEO was on-page only.”

    “This ended up accidentally being the best thing that could’ve happened to me because I developed linking strategies for clients for several years before Google launched. This led me to pursue the very type of relevance related linking strategies Google wanted to see. I didn’t and still don’t believe in manipulation of organic rankings as a viable business strategy. I know what Google wants to see because of a twist of fate, that had me starting my linking and outreach service business before Google existed, again, by accident.”

    “What’s most rewarding is in many ways linking strategies have come full circle back to the way things were and should have remained. When Google launched, the SEO industry went through a period of time (probably more than a decade long) where many agencies and practitioners viewed links as commodities and people would use any kind of tactic, good, bad, ugly and everything in-between in their attempt to try to manipulate Google’s algorithm in their client’s favor.”

    “In some ways I find it kind of funny when companies who were proponents of extremely manipulative practices in the past suddenly talk about how we must comply with Google guidelines or else we could be penalized. This advice coming from the very people who gave you advice that got sites penalized in the first place. But that’s another story for another time.”

    “I guess if I had to boil down the biggest change of all from a strategy standpoint it would be in trying to help people realize that it is incredibly easy compared to the old days to get URLs to migrate or propagate across the web. What I mean by that is today everyone is a Link builder, they just don’t see themselves that way, and many linking strategists overlook this.”

    How about compared to the pre-Penguin era?

    EW: “Once Google aimed its scope at backlink profiles, and more specifically what it considered to be unnatural backlink profiles, it was truly a game changer. For the first time the links pointing to your site could end up hurting you rather than helping you or simply being ignored. The impact of that change cannot be overstated.”

    As Google continues to put more of its own content and direct answers in search results, has the value of links declined at all from an SEO standpoint?

    EW: “Yes and no. If your entire business model was centered around a high Google ranking, and your content provided people with answers to questions that Google now answers directly, well the reality is you’re screwed. Let’s not sugarcoat it. Once upon a time Google was a shuttle taking you to whatever site it thought had the answer you needed. Now if Google can give you the answer directly, it makes perfect sense for them to do so. Sure your site may still be there among the top ranked sites, but I don’t need to click that link and visit your site because Google just answered my question.”

    “However, there are still hundreds of different types of businesses and verticals for which it does not make sense for Google to provide a direct answer because a direct answer is not what the searcher is looking for. It is in those instances and for those businesses that a linking strategy should still incorporate tactics that are intended to improve organic ranking. However I must always include this caveat: you never want to rely solely on any search engine as the primary means for your businesses success, and your content strategies should not involve anything designed to try and fool the Google brain trust.”

    How has the Disavow Links tool impacted linking?

    EW: “More than anything it seems to me like an admission from Google that there are links on the web for which it cannot truly determine the intent or rationale behind why those links exist. Otherwise they would not need us to disavow them. They would be able to recognize and discount them themselves, without our help.”

    “What’s interesting to think about though is we now have a scenario where millions of people are uploading disavow files that collectively represent billions of URLs. In some ways you could argue this is a crowd-sourced spam detection signal that Google could use to improve their algorithm. For example imagine if you were to do a co-citation analysis across all those disavow files. What does it mean if the same URL or domain is disavowed by 15,000 different people?”
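    Ward’s thought experiment is easy to sketch in code. The snippet below is a toy illustration (the domain names are invented); only the file syntax follows Google’s published disavow format of `#` comment lines, `domain:` entries, and bare URLs. It counts how many distinct disavow files list each domain:

```python
from collections import Counter
from urllib.parse import urlparse

def parse_disavow(text):
    """Extract disavowed domains from one disavow file.

    Google's disavow format: one entry per line, '#' starts a comment,
    'domain:example.com' disavows a whole domain, otherwise a full URL.
    """
    domains = set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        if line.lower().startswith("domain:"):
            domains.add(line[len("domain:"):].strip().lower())
        else:
            host = urlparse(line).netloc.lower()
            if host:
                domains.add(host)
    return domains

def co_disavow_counts(files):
    """Count, per domain, how many distinct disavow files list it."""
    counts = Counter()
    for text in files:
        counts.update(parse_disavow(text))  # sets, so each file counts once
    return counts

files = [
    "# spammy directories\ndomain:spamdir.example\nhttp://blognet.example/page1",
    "domain:spamdir.example\ndomain:linkfarm.example",
    "# cleanup after penalty\nDOMAIN:spamdir.example",
]
print(co_disavow_counts(files).most_common(1))  # [('spamdir.example', 3)]
```

    A domain disavowed independently by thousands of webmasters, as in Ward’s 15,000-person example, would stand out immediately in such a tally — which is exactly the crowd-sourced signal he describes.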

    Do you believe Google should just ignore links it doesn’t find valuable rather than making webmasters jump through hoops to have them discounted?

    EW: “Yes. And the reason I feel this way is there are many people who find themselves spending significant amounts of time trying to undo links that have been placed there by people who were working on the site long before they were. Or even if they placed them there themselves, they now have to spend time removing them, and that’s time that might be better spent creating a more useful content experience for the site’s users. Ask yourself this: would you rather spend six months researching and sending link takedowns, or six months creating awesome new content?”

    You recently tweeted that “link strategists are affected as well,” in reference to an article about the B2B SEO opportunity in organizational mergers and acquisitions. Can you elaborate on that?

    EW: “I’ve seen many cases where companies merge or acquire the assets of another company and among those assets are websites, often more than one, sometimes several, maybe even 10 or 20. Each of those websites was likely launched at a different time, and over the years, each of those sites developed its own individual and distinct backlink profile. Now when we have a merger one of the things that has to be taken into consideration is what do we do with all this link equity that is spread across all of these various web properties that are now owned by the same entity. Sometimes the answer is to leave them just as they are, or it may be better to merge some of the sites that have a similar target audience or client base, or, the best strategic move may be something else entirely. I think the best linking strategists are those of us who can take a look at the big picture of all of those brands and sites, look at the mergers and acquisitions and help navigate the client through the best strategy that will maximize the link equity. There is no one-size-fits-all solution to the link equity challenge.”

    I read your piece on the “link apocalypse,” which made some great points about how different sites should be taking different approaches to link building. Can you talk a little bit about that?

    EW: “I wrote that column almost 8 years ago, and I have to be careful here to fight the urge to say ‘I told you so’. Still, if you read that piece I don’t think there’s anything in there that did not end up coming to pass. But let’s be honest. I can’t be that smart or I wouldn’t still be working.”

    “But…aside from the specific predictions, the main thesis of that piece is that marketers must respect that which makes one site different from another site. The example I used in the article was intended to be funny but also illustrate the point. A site about whale watching in Iceland does not need the same link building strategy as a site about spelunking in Arkansas or accordion repair in Biloxi. Unfortunately though, for over a decade companies have been trying to sell link building packages in a cookie-cutter approach without regard to what differentiates one site’s mission and passion and content from another.”

    What’s the one piece of link building advice you’d give above all else in 2015?

    EW: “Don’t use your website to just write about yourself and how awesome you are. Don’t use your blog to simply write a little bit longer summary of one of your products or services or an in-depth profile of your CEO or how awesome your staff is. Instead, make other people, places, events, and other industry specific happenings the stars of your content. It will come back to you in the form of links, shares, likes, tweets and quite possibly, earned improved rankings.”

    “Give to get. Just like 1994.”

    “Thank you again Chris for the opportunity to share my thoughts and now please everybody go to http://ericward.com/lmp and sign up for LinkMoses Private, my Linking Strategies Newsletter where I provide effective linking strategies, tactics, Q/A, advice, and Link Opportunity Alerts. I’m also available for consults and very specialized link dev projects. I’ve got three kids to put through college, so I won’t be retiring anytime soon :)”

    And thank you, Eric for the great (as always) insight into the state of link building.

    How have your linking and link building practices changed over the years? Please discuss in the comments.

  • Prince Drops Linking Lawsuit Against Fans

    You know how Prince is actively suing his fans for posting links on their blogs and Facebook pages that point other fans in the direction of his music?

    Well, apparently he thought about it and decided to go another route.

    TMZ reports that the 55-year-old actor/musician has dropped the lawsuit, just days after news of it went public.

    Just a couple weeks ago, Prince filed a suit in the Northern District of California, naming 22 defendants who posted links to various concert recordings (some of the linked locations contained torrent downloads) on their Facebook pages and personal blogs. The lawsuit asked for $1 million from each link-sharer.

    The lawsuit didn’t come as a huge shock to those familiar with Prince’s history of going after what he deems as unlawful content sharing on the internet.

    As we previously noted, the Prince suit was filed in the same place where a previous ruling stated that linking is not direct copyright infringement. In the case of Perfect 10, Inc. v. Amazon.com, Inc. the court decided that posting a link is not the same thing as actually hosting the content yourself.

    But according to TMZ, the suit is dropped but not dead. It’s been let go without prejudice, which means that Prince has the ability to refile in the future. It looks like the bad press may have influenced the Purple Rain star to at least delay the suit.

    Now, let’s see what happens with the somewhat similar Tarantino link-suit.

    Image via Wikimedia Commons

  • Google Has Made People Afraid To Link

    Google has made it so people are scared to link to content. That’s what it has come to.

    I don’t think it’s ever been Google’s intention to scare people away from linking when it’s natural and deserving, but its never-ending advice, warnings, rules and policy re-wordings have simply led to mass confusion, and people being afraid to link to legitimate content in a legitimate way for fear that Google will penalize their site in search rankings.

    Are webmasters being overly paranoid about their linking practices or are they legitimately afraid of what Google might do to their sites? Share your thoughts in the comments.

    We’ve written several articles in the past about how fear of Google has led to people frantically rushing to have external links to their sites removed, in some cases even when these links are totally legitimate (meaning playing by Google’s rules) or creating non-Google-related value. Sometimes, they’ve even considered making natural links unnatural.

    Sure, some of it has been overreaction, but a Google penalty or loss of rankings can be a huge deal for a business. Companies have laid off staff because of it.

    While most of the time, we’re talking about people being afraid of Google not liking the links that are pointing to their own site, people are now also worried about linking to other sites.

    Barry Schwartz at Search Engine Roundtable writes, “I see questions popping up left and right. Can I link to this site? If so, should I nofollow it anyway? Should I make sure to not use keyword rich anchor text when linking?”

    “It is making natural linking unnatural because of the fear of linking is now killing natural links,” he adds. “Publishers and webmasters are less likely to link out because of that fear.”

    He points to a WebmasterWorld thread where people are voicing their concerns.

    Simply put, if websites stop linking to each other, the fabric of the web crumbles. Links are what make it a web. Otherwise it’s just a bunch of silos.

    Again, I don’t think Google wants people to stop linking to each other, but people are clearly concerned about what might happen if they do link, and especially without a nofollow. It doesn’t help that Google recently advised that Infographic links be nofollowed. Here you have, at least in some cases, legitimate content that people editorially link to because they like that content and want to share it with their readers. Why should these links not count? Why is it so different? People that include others’ infographics on their sites make an editorial decision to do so. I know because I have made that decision editorially on occasion. And I’m happy to give some link love to the creator for taking the time to put together that content that I found valuable enough to share with my readers.

    If I created an infographic, and an authoritative site like CNN or The New York Times wanted to use it, I would certainly expect a link and its corresponding PageRank juice.

    But there are bigger problems still with people not linking. For one, credit is often not going to be given when due. Traffic to an original source is not going to happen. Readers are going to be deprived of additional, helpful and contextual information.

    From Google’s perspective, it doesn’t make a lot of sense for sites not to link to one another appropriately, because as far as we know, PageRank still carries weight in Google’s organic rankings. That said, Google does appear to be doing everything it possibly can to not have to point users to other websites.
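    Why would inbound links still carry weight at all? At its core, PageRank is a simple iterative calculation over the link graph: each page splits its score among the pages it links to. The following is a textbook power-iteration sketch on an invented three-page web — an illustration of the published concept, not Google’s actual implementation:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank: 'links' maps each page to the pages it links to.
    Each page's score is split among its outbound links; 'damping' models
    the chance a surfer follows a link vs. jumping to a random page."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            targets = outs or pages          # dangling page: spread evenly
            share = damping * rank[p] / len(targets)
            for t in targets:
                new[t] += share
        rank = new
    return rank

# Three-page toy web: A and C both link to B, so B ranks highest.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["A", "B"]})
print(max(ranks, key=ranks.get))  # B
```

    The mechanics make the fear concrete: every link someone declines to give is score that never flows, which is why widespread reluctance to link would ripple through rankings.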

    There have been numerous reports of Google increasingly showing more of its own stuff and less organic results on more and more SERPs. Hell, I even see Google displaying a Google+ link for an article I’ve written rather than the article page itself on SERPs. You know, I wrote an article, then shared it on Google+, and Google decides to show the Google+ link rather than the real link. This happens fairly often, actually.

    So really, it’s going to be interesting to see how long organic rankings really even matter. But they do still matter for now, and some are probably going to suffer from not getting the links they deserve.

    What do you think of all of this linking fear? Reasonable or not? Let us know in the comments.

    Image: Matt Cutts.com

  • Google Discusses Its New Official Link Rules

    Google has some new rules for the kinds of links it allows (or doesn’t allow, rather). The concepts are actually not exactly new, but Google has updated its official documentation to reflect its views of certain kinds of links.

    Are you concerned with following Google’s rules for links on the web? Does Google have too much power over how people treat their content? Let us know what you think in the comments.

    As you may know, one of the things Google says in its Quality Guidelines to avoid is participation in link schemes. Google has updated the link schemes page, as Search Engine Land (tipped by Menaseh) recently reported.

    Now included as things that qualify as link schemes are:

    • Large-scale article marketing or guest posting campaigns with keyword-rich anchor text links
    • Advertorials or native advertising where payment is received for articles that include links that pass PageRank
    • Links with optimized anchor text in articles or press releases distributed on other sites

    Guest posts have been discussed numerous times recently. A recent article at HisWebMarketing.com suggested that “high quality guest posts can get you penalized.”

    Google talked about the topic in several videos (which you can watch here if you want to spend the time doing so).

    In one video, Matt Cutts said that it can be good to have a reputable, high quality writer do guest posts on your site.

    He also said, “Sometimes it gets taken to extremes. You’ll see people writing…offering the same blog post multiple times or spinning the blog posts, offering them to multiple outlets. It almost becomes like low-quality article banks.”

    “When you’re just doing it as a way to sort of turn the crank and get a massive number of links, that’s something where we’re less likely to want to count those links,” he said.

    “Generally speaking, if you’re submitting articles for your website, or your clients’ websites and you’re including links to those websites there, then that’s probably something I’d nofollow because those aren’t essentially natural links from that website,” Google’s John Mueller said in another video.

    In another video, Mueller said, “Think about whether or not this is a link that would be on that site if it weren’t for your actions there. Especially when it comes to guest blogging, that’s something where you are essentially placing links on other people’s sites together with this content, so that’s something I kind of shy away from purely from a linkbuilding point of view. I think sometimes it can make sense to guest blog on other peoples’ sites and drive some traffic to your site because people really liked what you are writing and they are interested in the topic and they click through that link to come to your website but those are probably the cases where you’d want to use something like a rel=nofollow on those links.”
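    For readers unfamiliar with the mechanics, the rel=nofollow Mueller refers to is just an attribute on the HTML anchor tag, signaling crawlers not to pass ranking credit through the link. A minimal sketch of the markup involved (the URL and text are placeholders):

```python
from html import escape

def link_html(href, text, nofollow=False):
    """Render an HTML anchor; rel="nofollow" tells search crawlers not to
    pass ranking credit (PageRank) through the link."""
    rel = ' rel="nofollow"' if nofollow else ""
    return f'<a href="{escape(href, quote=True)}"{rel}>{escape(text)}</a>'

print(link_html("http://example.com/guest-post", "my site", nofollow=True))
# <a href="http://example.com/guest-post" rel="nofollow">my site</a>
```

    The distinction Mueller draws is entirely in that one attribute: the link still works for readers and still drives traffic, but it opts out of the ranking signal.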

    Cutts said in a recent interview with Eric Enge, “The problem is that if we look at the overall volume of guest posting we see a large number of people who are offering guest blogs or guest blog articles where they are writing the same article and producing multiple copies of it and emailing out of the blue and they will create the same low quality types of articles that people used to put on article directory or article bank sites.”

    “If people just move away from doing article banks or article directories or article marketing to guest blogging and they don’t raise their quality thresholds for the content, then that can cause problems,” he said. “On one hand, it’s an opportunity. On the other hand, we don’t want people to think guest blogging is the panacea that will solve all their problems.”

    Advertorials are another thing Google has been cracking down on recently. Cutts put out a video specifically addressing this topic a few months ago.

    “Well, it’s advertising, but it’s often the sort of advertising that looks a little closer to editorial, but it basically means that someone gave you some money, rather than you writing about this naturally because you thought it was interesting or because you wanted to,” he said. “So why do I care about this? Why are we making a video about this at all? Well, the reason is, certainly within the webspam team, we’ve seen a little bit of problems where there’s been advertorial or native advertising content or paid content, that hasn’t really been disclosed adequately, so that people realize that what they’re looking at was paid. So that’s a problem. We’ve had longstanding guidance since at least 2005 I think that says, ‘Look, if you pay for links, those links should not pass PageRank,’ and the reason is that Google, for a very long time, in fact, everywhere on the web, people have mostly treated links as editorial votes.”

    More on all of that here.

    Finally, with regard to the optimized anchor text in articles or press releases thing, Google gives the following example of what not to do:

    There are many wedding rings on the market. If you want to have a wedding, you will have to pick the best ring. You will also need to buy flowers and a wedding dress.

    I wonder if that’s a real sample.

    Barry Schwartz from Search Engine Roundtable jumped into a hangout to ask Mueller some questions about press releases:

    He recaps:

    John Mueller from Google makes it clear that Google wants all links in these press releases to be nofollowed. He did say having a URL at the end should be okay but when he was grilled about it again, he said it is best to nofollow the links. John even said press releases should be treated as advertisements and thus links in those releases should be nofollowed.

    I asked John why all of a sudden the change in policy for press releases and John said that it is because SEOs were using these more and more in a way to promote their site [artificially in the Google search results] and Google needed to clarify their stance on them.

    Google did remove a few to make room for the new ones. Now gone are “linking to web spammers or unrelated sites with the intent to manipulate PageRank” and “links that are inserted into articles with little coherence”.

    I guess it’s game on with those. Just kidding.

    What do you think of Google’s updated language for links schemes? Do any of the changes concern you? Let us know in the comments.

  • Linking Practices That Annoy Matt Cutts, But Don’t Make A Difference To Google

    Google posted an interesting Webmaster Help video today about linking. It’s basically about whether it’s better to link to an original source somewhere at the top of a post, or at the bottom. The answer is essentially that it makes no difference, as far as Google’s algorithm is concerned. The link will flow PageRank either way, so as far as SEO is concerned, it really doesn’t matter.

    After Cutts answers the question directly, he gets into his personal opinions and discusses what he finds annoying about linking practices.

    “I’ll just say, for my personal preference, I really appreciate when there’s a link somewhere relatively close to the top of the article because I really kind of want to know when someone’s talking about it, you know, hey, go ahead and show me where I can read the original source or let me look up more information,” says Cutts. “There are a lot of blogs that will give one tiny little link all the way at the bottom of a big long story, and by that time, it just doesn’t seem like it’s quite as useful, but that’s just a personal preference. That’s not ranking advice as far as it goes.”

    “The only other thing I hate – this is once again just personal – is whenever you’ve got a regular news report, whether it’s in a mainstream newspaper – New York Times, AP, whatever – and they say, ‘Blah Blah Blah said on a popular webmaster blog that blah blah blah,’ and they don’t link to the source,” he continues. “I mean, come on. Link to your sources, whether you’re a journalist, whether you’re a blogger, let people go and look at the original information themselves so that they can suss out what they think about whatever it is that you’re writing about. So if you just say, ‘Oh, it was discovered on a popular forum that blah blah blah,’ then we have to go look for it. That’s really annoying.”

    “Again, not ranking advice,” he reiterates. “Just asking everybody to be considerate on the web, and share credit, and attribute, so that people can, you know, do the research for themselves if they want to.”

    As if anybody on the web would ever be inconsiderate.

  • Should Sites Be Forced To Pay For Linking? Harvey Weinstein Thinks So.

    Harvey Weinstein, an Oscar-winning producer and prominent Obama supporter, told Deadline that he is going to push for legislation that would force websites to pay for linking to news articles. This legislation would require news websites and blogs to pay a monitoring organization a fee for every link to an article written by a journalist.

    Should news sites, bloggers and other sites like Facebook, Twitter and Google pay for the privilege of including snippets and links to news stories? Also, should YouTube or sites that include embedded videos of movie/TV clips pay every time somebody views them?

    Give us your thoughts on this important topic that goes to the heart of the internet in the comments below.

    Weinstein said, “Journalists don’t benefit when their stories are taken, and given a link. It would be like me launching a newspaper–call it Link—where I can have the greatest journalists in the world working for me without paying them. It’s inconceivable. If BMI and ASCAP can monitor the music business, we need a BMI and an ASCAP to monitor these businesses. This will be the one legislation for our industry that I’ll press.”

    This would be part of a broader law under which a monitoring organization would also monitor the web for video clips and require websites like YouTube to pay this organization a fee for each view of a clip of a movie or television show.

    As the publisher of WebProNews and a longtime advocate of the right to link, I believe Weinstein’s idea would destroy the internet as we know it today. The internet is based on the idea of linking; that’s why it was originally referred to as the World Wide Web! If you make publications, blogs, Google, Twitter and Facebook pay for linking to a news story, how many of them would still do it? The answer is none.

    Weinstein may think he’s only talking about making news linking giants like Google News pay, but laws against free linking could not apply just to them. His proposed legislation would also have to apply to Reddit, StumbleUpon, Facebook, Twitter and news publishers and bloggers who routinely republish snippets of news articles with links to the original. Many of these sites also embed video clips.

    Weinstein challenges publishers’ assertion that linking and taking small snippets of articles is not stealing content but is actually promoting it. He treats linking and publishing as one and the same. Weinstein also told Deadline, “When it comes to journalists and journalism, I’m with you. It is important they get paid for good work, and wrong that others just take it, with a link.”

    Since most articles carry social buttons encouraging readers to share them via sites like Facebook and Twitter, you would think it would be obvious to Weinstein that publishers and journalists want their stories to be linked to. The definition of going viral is mass sharing on social media, which pushes huge numbers of people to a journalist’s article if he is so lucky. Linking drives traffic to an article, which the publisher can then, in theory, monetize. If the publisher doesn’t want the traffic, he can put up a paywall and charge visitors to read the site’s content.

    If a news site like Deadline doesn’t want its articles linked to then it shouldn’t publish them on a linking platform called the Web. Weinstein may be surprised to learn that Deadline and most news sites are quite happy that their articles get free traffic driven by links!

    Just like the music industry, which has in the past sued the parents of kids who downloaded music without paying for it, Weinstein proposes that those linking to content should also have to pay up. He wants to do it a bit more tactfully than the RIAA, but still wants to collect nonetheless. His idea, I presume, is first to change the definition of fair use, which under U.S. and many international copyright laws permits a website to take snippets of content and reuse them to a certain extent.

    Theoretically, considering Weinstein’s personal connection with Obama, he could persuade the President to tighten this definition via some minor changes in regulations and rules and bypass Congress. The definition of fair use as written in U.S. copyright laws is vague and could easily be redefined via regulation. This is a scary proposition considering that linking and discussing news articles is integral to free speech.

    Once fair use is redefined to give copyright holders the ability to charge websites a retroactive fee for each time a visitor viewed a news summary and link, that’s when a new organization similar to BMI would emerge to ensure that journalists are paid for their work. BMI has people going into businesses, such as bars and restaurants, all around the country looking to see if music is being played without a license. When it catches a business playing unauthorized music, it forces the business to pay based on a variety of factors, such as the number of seats in a restaurant and the number of songs played.

    If a bar doesn’t join BMI and agree to pay a monthly fee up front, then often BMI will sue for huge amounts. For instance, one restaurant in North Carolina was ordered by a court to pay BMI $30,450 for playing just four unauthorized songs.

    This is what Weinstein wants for publishers and writers of news content! If you are a blogger that makes a small amount of money from ads and you include a snippet from a news article in your story you could be sued if you didn’t already agree to a monthly payment.

    For Facebook, Google and Twitter the ramifications of this kind of heavy handed legislation could be huge. They are the YouTube of written content since so many of us share snippets and links via them. If sites like these need to license links with a BMI type organization, it’s likely that they would just eliminate news links and snippets altogether which would change the web forever… don’t you think?

  • German Publishers Reportedly Won’t Go For A Google Deal Like Those In France

    Google and French President Francois Hollande on Friday announced a deal that the search giant has made with French publishers who want to be paid for the content that Google links to.

    Google agreed to create a €60 million fund called the Digital Publishing Innovation Fund to “help support transformative digital publishing initiatives for French readers.” Google says it will also “deepen” its partnership with French publishers to help increase their online revenues using Google’s ad technology.

    Though Google has indicated that it hopes to reach similar agreements with publishers in other countries, it doesn’t look like those in Germany are going for it. Germany’s The Local reports that German newspapers have rejected the idea of copying the agreement Google made with French publishers:

    The German association of newspaper publishers (BDZV) said the French agreement did have some positive points. The major one was that it established and accepted “that the aggregation of content from third parties as a business model costs them money,” said Anja Pasquay, BDZV spokeswoman, on Sunday.

    But she said a drawback was that the French solution only referred to Google. “The publishers there have no legal recourse against other aggregators who operate in the same fashion – or those who will do so in the future,” she said.

    Back in December, Google made a deal with publishers in Belgium. While not exactly the same as the one it made in France, it seems that German publishers would take similar issue with such a deal.

  • Does Google the Link Lister Equal Google the Publisher?

    Is Google a publisher? Or is Google simply a displayer of links? Are these two things the same?

    Those questions are at the heart of an Australian case that just tipped against Google, and are likely at the heart of many cases to come. An Australian high court has found Google liable for libelous content tying a man to organized crime. Of course, Google didn’t create the article that made the references; it simply provided a link to it within its search results.

    The man’s name is Milorad Trkulja, and he claimed that Google defamed him by associating his name and image with (untrue) claims of ties to organized crime, both in regular search results and in Google Image search. The jury in the case found Google liable, and therefore responsible for the content it links to. Google has been ordered to pay $200,000 in damages, but is in the process of appealing the ruling (as you would expect).

    Is Google responsible for the content that is found using their search engine? Or is this a ridiculous claim to make? Let us know in the comments.

    Here’s what the Judge in the case had to say:

    The question of whether or not Google Inc was a publisher is a matter of mixed fact and law. In my view, it was open to the jury to find the facts in this proceeding in such a way as to entitle the jury to conclude that Google Inc was a publisher even before it had any notice from anybody acting on behalf of the plaintiff. The jury were entitled to conclude that Google Inc intended to publish the material that its automated systems produced, because that was what they were designed to do upon a search request being typed into one of Google Inc’s search products. In that sense, Google Inc is like the newsagent that sells a newspaper containing a defamatory article. While there might be no specific intention to publish defamatory material, there is a relevant intention by the newsagent to publish the newspaper for the purposes of the law of defamation.

    Basically, Google may not want to publish it, but they are publishing the publishers. And since Google’s algorithms are tooled to find said content, they are responsible. Or at least it is plausible that a jury could see it that way. The Judge is clearly unconvinced that this stance is set in stone.

    The Judge also differentiated search results pages from Google Image searches. The plaintiff also complained of images tying him to crime figures. The Judge notes that a Google Image search is a more-sophisticated version of cut-and-paste from magazines, and importantly a Google-created page:

    As was pointed out by counsel for the plaintiff in his address to the jury, the first page of the images matter (containing the photographs I have referred to and each named “Michael Trkulja” and each with a caption “melbournecrime”) was a page not published by any person other than Google Inc. It was a page of Google Inc’s creation – put together as a result of the Google Inc search engine working as it was intended to work by those who wrote the relevant computer programs. It was a cut and paste creation (if somewhat more sophisticated than one involving cutting word or phrases from a newspaper and gluing them onto a piece of paper). If Google Inc’s submission was to be accepted then, while this page might on one view be the natural and probable consequence of the material published on the source page from which it is derived, there would be no actual original publisher of this page.

    You can see just how much of a charlie-foxtrot this is. Which pages are Google’s creation, and which are simply the “consequence of the material published on the source page from which it is derived?”

    The jury concluded that Google was a publisher, and was liable for the defamatory content even if it hadn’t been notified of it yet. Although Google contended that it doesn’t matter whether it was notified of the content or not – it’s not responsible – the Judge rejected that notion as well.

    It follows that, in my view, it was open to the jury to conclude that Google Inc was a publisher – even if it did not have notice of the content of the material about which complaint was made. Google Inc’s submission to the contrary must be rejected. However, Google Inc goes further and asserts that even with notice, it is not capable of being liable as a publisher “because no proper inference about Google Inc adopting or accepting responsibility complained of can ever be drawn from Google Inc’s conduct in operating a search engine”.

    This submission must also be rejected. The question is whether, after relevant notice, the failure of an entity with the power to stop publication and which fails to stop publication after a reasonable time, is capable of leading to an inference that that entity consents to the publication. Such an inference is clearly capable of being drawn in the right circumstances (including the circumstances of this case). Further, if that inference is drawn then the trier of fact is entitled (but not bound) to conclude that the relevant entity is a publisher.[42] Google Inc’s submission on this issue must be rejected for a number of reasons, the least of which is that it understates the ways in which a person may be held liable as a publisher.

    Of course, $200,000 to Google is basically nothing. The appeal really has nothing to do with the monetary damages. Google knows that this kind of decision sets an unsettling precedent for their future defenses in similar cases. Google as “automated news agent that’s responsible for what their algorithms pull out of the depths” is a view of Google that the company can’t afford to have stick.

    We’ve seen this story play out numerous times over the past couple of years with Google’s autocomplete feature. In August of 2011, Google lost a case in Italy and was forced to remove an autocomplete suggestion in its search box that tied a man to the word “truffatore,” meaning con man. A few months later, Google was fined $65,000 because one of its autocomplete suggestions labeled a French man an “escroc,” meaning crook.

    And this year, Google made an out-of-court settlement with French anti-discrimination groups over a “Jewish” autocomplete suggestion.

    Google’s argument in these cases is similar to the argument in the Australian case. We’re not suggesting anything. We’re not defaming anyone. Google’s autocomplete suggestions are based on popularity of terms. That means that if anything, Google users are the ones linking people’s names with unsavory terms. Google’s search results are also based on an algorithm. Just ask Rick Santorum about how much responsibility Google claims in what people find using its search engine.
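    For a sense of what “based on popularity of terms” means in practice, here’s a toy sketch of a frequency-ranked autocomplete. The query log, function, and queries below are entirely made up for illustration; Google’s actual system is vastly more complex, but the core defense is the same: the suggestions fall out of counting, not editorial judgment.

    ```python
    from collections import Counter

    # A made-up query log for illustration; the real signal is aggregated
    # from an enormous volume of user searches.
    query_log = [
        "pizza near me", "pizza recipe", "pizza recipe",
        "pizza oven", "pizza recipe", "pizza near me",
    ]

    def autocomplete(prefix, log, n=3):
        """Return the n most frequent logged queries that start with prefix."""
        counts = Counter(q for q in log if q.startswith(prefix))
        return [query for query, _ in counts.most_common(n)]

    print(autocomplete("pizza", query_log))
    # → ['pizza recipe', 'pizza near me', 'pizza oven']
    ```

    Nothing in that loop knows or cares whether a completion is unsavory, which is exactly the point Google keeps making in court.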

    So, is Google a publisher? If not, what are they, exactly? How much responsibility do you think Google has for what people find using their search engine? Tell us what you think in the comments.

  • Google Continues To Battle Publishers Who Want To Be Paid For Links

    Google prepared a note about a proposal by French lawmakers, backed by French news publishers, that would require search engines to license publishers’ content. In other words, publishers want search engines to pay for the privilege of linking to their content. Obviously, this doesn’t sit well with Google, and the company has threatened to stop linking to such sites.

    Of course, the note is in French, but Google has provided it in its entirety here. On the company’s European policy blog, Google’s Director of Public Policy in France, Olivier Esper, says, “The web has led to an explosion of content creation, by both professional and citizen journalists. So it’s not a secret that we think a law like the one proposed in France and Germany would be very damaging to the internet. We have said so publicly for three years.”

    “In order to shed light on the reasons that lead us to believe that this law is detrimental to French users, innovation on the Internet and ultimately to the news publishers themselves, we decided to post the note in its entirety,” he says. “We have always been and remain committed to collaborate with French Publishers associations as they experiment and develop sustainable economic models on the Internet.”

    Regarding Germany, we discussed the proposed law in that country and the larger ramifications of such a law here. Of course, Google has battled similar criticism and threats from publishers here in the U.S. and abroad.

    AFP reports: “France’s new Socialist government, which is open to helping struggling media companies, warned Google that it should not threaten democratic governments.”

    Google is also having some issues with publishers in Brazil. There, publishers have gone so far as to pull out of Google News altogether. It will be interesting to see how long that lasts.

    The Knight Center for Journalism In The Americas (via PaidContent) reports that all of the 154 newspapers that belong to the National Association of Newspapers in Brazil (ANJ), accounting for a whopping 90% of Brazil’s newspaper circulation, have left Google News.

    Google didn’t budge on publisher requests to be paid, so now Google apparently has far fewer Brazilian news sources in its system. Of course, that doesn’t mean that Google users will necessarily have a hard time finding the content, as pulling out of Google News hardly keeps your content from being crawled by Google. In fact, Google’s regular web results often come from news publishers, and with Google’s increased emphasis on freshness in recent months, there’s a good chance that brand new articles will show up for news-related queries.

    Publishers who don’t want to be crawled by Google at all can keep the search engine from doing so with robots.txt, but they face losing a ton of traffic by doing so. The question then becomes: will users miss these sources enough to go directly to their sites and give them whatever compensation they may be seeking?
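    Blocking a specific crawler with robots.txt is a one-line affair, and Python’s standard library can check such rules locally. The rules and URLs below are hypothetical examples, not any real publisher’s file:

    ```python
    from urllib.robotparser import RobotFileParser

    # A hypothetical publisher's robots.txt: shut out Googlebot entirely,
    # let every other crawler see all but a private area.
    rules = """\
    User-agent: Googlebot
    Disallow: /

    User-agent: *
    Disallow: /private/
    """.splitlines()

    rp = RobotFileParser()
    rp.parse(rules)

    print(rp.can_fetch("Googlebot", "https://example.com/news/story.html"))  # False
    print(rp.can_fetch("NewsBot", "https://example.com/news/story.html"))    # True
    ```

    The trade-off the Brazilian papers are weighing is visible right there: the block works, but it works on every page, for every query.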

    Browsing Google News for Brazil, it doesn’t look like there is a shortage of available content for people to read.

    Last month, Google celebrated the tenth anniversary of Google News, reporting that it is now available in 72 editions and 30 languages, and that it counted 50,000 publications among its news sources. I guess the number is slightly less now.

  • Google Is Considering Discounting Infographic Links

    Matt Cutts spoke with Eric Enge at SMX Advanced, and Enge has now published the entire interview. In that interview, Cutts reveals that Google may start looking at discounting infographic links.

    That doesn’t mean Google is doing this right now, or that they definitely will, but…come on.

    “In principle, there’s nothing wrong with the concept of an infographic,” Cutts says in the interview. “What concerns me is the types of things that people are doing with them. They get far off topic, or the fact checking is really poor. The infographic may be neat, but if the information it’s based on is simply wrong, then it’s misleading people.”

    “The other thing that happens is that people don’t always realize what they are linking to when they reprint these infographics,” he adds. “Often the link goes to a completely unrelated site, and one that they don’t mean to endorse. Conceptually, what happens is they really buy into publishing the infographic, and agree to include the link, but they don’t actually care about what it links to. From our perspective this is not what a link is meant to be.”

    I don’t think it’s much of a surprise to a lot of people that Google would consider not counting these kinds of links. In fact, last month, we ran an article from David Leonhardt, who talked about this very thing.

    There are certainly legitimate infographics, just as there are legitimate directories, but there is always that room for abuse, and it could represent something like what Google considers to be a linking scheme (which is against its quality guidelines).

    “I would not be surprised if at some point in the future we did not start to discount these infographic-type links to a degree,” Cutts told Enge. “The link is often embedded in the infographic in a way that people don’t realize, vs. a true endorsement of your site.”

    I think that says it all. If you have a major infographic strategy that’s built for SEO purposes, I wouldn’t put too much stock into it moving forward. That doesn’t mean, however, that infographics can’t still provide value, and certainly spark some quality social traffic.
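    Cutts’s complaint is easy to check for yourself: before pasting an embed code, look at what it actually links to. Here’s a quick audit using Python’s standard HTML parser; the embed snippet is invented for illustration, of the kind Cutts describes, where the image looks harmless but the wrapping anchor points somewhere unrelated.

    ```python
    from html.parser import HTMLParser

    class LinkAudit(HTMLParser):
        """Collect every href in an HTML snippet, so you can see exactly
        which sites an embed code would have you endorse."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links.extend(value for name, value in attrs if name == "href")

    # Invented embed code: a neat-looking infographic wrapped in an anchor
    # that points at an unrelated site.
    embed = (
        '<a href="http://unrelated-site.example/"><img '
        'src="http://cdn.example/infographic.png" alt="Neat infographic"></a>'
        ' Courtesy of Example Research'
    )

    audit = LinkAudit()
    audit.feed(embed)
    print(audit.links)
    # → ['http://unrelated-site.example/']
    ```

    Thirty seconds of that kind of checking is the difference between a true editorial endorsement and the accidental link Cutts says Google may stop counting.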

    That’s only a small part of Enge’s interview with Cutts. Read the whole thing here.

    Hat tip: Barry Schwartz

  • Go Ahead, You Can Now Tell Bing to Ignore Links

    While the SEO brethren have been waiting for Google to provide a tool that will allow webmasters to disavow specific links, Bing went ahead and punched that ticket first by launching the Disavow Links feature in Bing Webmaster Tools.

    The premise isn’t all that complicated: go to Bing Webmaster Tools, click on the Disavow Links under the Dashboard menu, and – presto – simply submit the page, directory, or domain that you suspect to be coming from spam or poor quality sites.

    Bing Disavow Links

    Bing Webmaster Tools resident SEO samurai Duane Forrester explained in a blog post how the Disavow Links feature can help protect your site from malicious links but cautions that you won’t be able to go a-roving across the internet to cook up a better rank for your site.

    These signals help us understand when you find links pointing to your content that you want to distance yourself from for any reason. You should not expect a dramatic change in your rankings as a result of using this tool, but the information shared does help Bing understand more clearly your intent around links pointing to your site.

    Forrester adds that there isn’t a limit on the number of links you can report with the Disavow Links tool.

    The launch of the feature presents two pretty obvious questions: (1) what’s taking Google so long to launch a similar tool, and (2) what does Bing get out of launching this one?

    The answer to the first question is as elusive as the end of the rainbow, and probably only Google’s webspam avenger Matt Cutts could answer that one. The second question was discussed by Search Engine Land’s Vanessa Fox, who inferred that Bing’s release of this tool may indicate that Bing does in fact penalize websites that have bad backlinks.

    Forrester has always been fairly accessible and welcoming of whatever questions webmasters may have concerning Bing Webmaster Tools, and he’s already been answering questions from users over on Twitter.


    Ultimately, why wouldn’t Bing have a tool like this available to webmasters? It falls in line with what Bing’s been working on to improve the overall quality of its search algorithms, specifically with the search engine’s recent Phoenix Update. Plus, it just helps keep the web that much less janky.

    At any rate, do any webmasters out there plan on using this tool to see how it affects your site’s ranking (if at all)? Let us know what you think.

  • This Webmaster Changed A Page’s Internal Linking Structure, Recovered From Google’s Penguin Update

    Another webmaster has claimed to have recovered from Google’s Penguin update.

    In a WebmasterWorld forum post (via Barry Schwartz), member neildt said that a page they thought had been hit by Penguin had returned to ranking for the keyword phrase they were targeting. “Now it’s either just a coincidence that it has returned, or it is due to the changes we made throughout our site for this page based on why we were hit by Penguin,” neildt wrote.

    “Basically from the 24th April when we appeared to be affected by Google’s Penguin update, I took a totally random page that was affected and changed the internal linking structure that pointed to the page,” neildt wrote. “Before the 24th April this page would rank on page 1 of Google for ‘hotel name’ and ‘hotel name city’ as example phrases we were targeting. After this time, the ranking for those phrases was beyond page 30 of Google’s SERPS.”

    “Until yesterday (Sunday 24 June) when I checked if these phrases had made any progress and there were some changes in Google’s SERPs,” neildt continued. “To my surprise we are ranking on page 1 for ‘hotel name city’ and page 3 for ‘hotel name’.”

    At this point, it’s unclear whether or not Google launched a data refresh for the Penguin update over the weekend. We’ve reached out to the company, and will update when we learn more.

    Schwartz, who linked to the forum thread where this Penguin recovery is being discussed, says he has heard other rumors of a Google update that may have occurred on Thursday night.

    As far as Penguin goes, clearly it is possible to recover. You may or may not need to start with a freshly designed site, but make sure you’re in total compliance with Google’s quality guidelines. For more details on another recent recovery, read here.

    When the webmaster from that story recovered, there had been a Penguin data refresh.

    Image: The Batman Season 4 Episode 2 (Warner Bros.)

  • Google Will Soon Ignore Links You Tell It To

    Google’s Matt Cutts gave a keynote “You and A” presentation at SMX Advanced this week, and mentioned that Google is considering offering a tool that would let webmasters disavow certain links.

    Would you find such a tool useful? Let us know in the comments.

    Matt McGee at SMX sister site Search Engine Land liveblogged the conversation. Here’s his quote of Cutts, which was in response to a question about negative SEO:

    The story of this year has been more transparency, but we’re also trying to be better about enforcing our quality guidelines. People have asked questions about negative SEO for a long time. Our guidelines used to say it’s nearly impossible to do that, but there have been cases where that’s happened, so we changed the wording on that part of our guidelines.

    Some have suggested that Google could disavow links. Even though we put in a lot of protection against negative SEO, there’s been so much talk about that that we’re talking about being able to enable that, maybe in a month or two or three.

    We recently wrote about Google’s wording change regarding negative SEO, which seemed to be an admission from the company that this practice is indeed possible. These words from Cutts seem to be further confirmation.

    Rand Fishkin, CEO of SEOmoz, recently issued a challenge to people to show that if you have a strong enough reputation and link profile, you can’t be hurt by negative SEO. That seemed to go pretty well, but not everyone has SEOmoz’s reputation, even if their own isn’t necessarily bad. Such a tool from Google could go a long way in helping combat negative SEO practices.

    As far as people suggesting that Google could disavow links, Search Engine Land editor Barry Schwartz actually had a pretty good article talking about this last month. “The concept is simple,” he wrote. “You go to your link report in Google Webmaster Tools and have an action button that says ‘don’t trust this link’ or something like it. Google will then take that as a signal to not use that link as part of their link graph and ranking algorithm.”

    “What I can’t understand is why Google hasn’t released it yet,” he wrote. “It is a great way for Google to do mass spam reporting by webmasters and SEOs without calling it spam reporting. You will have all these webmasters rush after a penalty to call out which links they feel are hurting them. Google can take that data to back up their algorithms on links they already know are spam, but also to find new links that they might not have caught.”

    He went on to make the point that Google would find more spam this way.
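    Schwartz’s “don’t trust this link” button amounts to each site maintaining an exclusion list that Google consults before counting links. Here’s a minimal sketch of how such a list might be stored and checked; the file format used here (“#” comments, “domain:” prefixes for whole sites, bare URLs for single links) is an assumption for illustration, not a published Google spec.

    ```python
    from urllib.parse import urlparse

    def parse_disavow(text):
        """Split a disavow-style file into whole domains and individual URLs.

        Assumed format: '#' starts a comment, 'domain:example.com' distrusts
        a whole site, and a bare URL distrusts a single link.
        """
        domains, urls = set(), set()
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("domain:"):
                domains.add(line[len("domain:"):].strip())
            else:
                urls.add(line)
        return domains, urls

    def is_disavowed(url, domains, urls):
        """A link is distrusted if listed directly or hosted on a listed domain."""
        return url in urls or urlparse(url).hostname in domains

    sample = """\
    # links we never asked for
    domain:spammy-directory.example
    http://other-site.example/paid-links.html
    """

    domains, urls = parse_disavow(sample)
    print(is_disavowed("http://spammy-directory.example/page2", domains, urls))  # True
    print(is_disavowed("http://innocent.example/", domains, urls))               # False
    ```

    The interesting part is exactly what Schwartz points out: every such file a webmaster submits doubles as a labeled spam report.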

    Once Google launches this tool, assuming that it actually does, it will be very interesting to see how the rankings shake out. It should be an indication of just how important links actually are these days.

    As you may know, Google has sent out a ton of Webmaster Tools warnings this year, and such a tool would help users take quick “manual action” on links rather than spend a ton of time sending link removal requests to other sites. It might even prevent some lawsuits (and the death of the web as we know it).

    According to Cutts, however, not many of the warnings were actually about links.

     

    @VegasWill that’s the right range. I may pull the stats just to help clarify.

    Update: Here’s his clarification:

    Matt Cutts: Earlier this year, Google revealed that we sent out over 700,000 messages to site owners in January and February 2012 via our free webmaster console at http://google.com/webmasters. I wanted to clarify a misconception about those messages. A lot of people assumed that most or all of the 700K messages were related to "unnatural link warnings" that some site owners received.

    The reason for sending the 700,000 messages via Webmaster Tools was actually because we started sending out warnings about blackhat techniques. The vast, vast majority of manual actions we take are on pages that are engaging in egregious blackhat SEO techniques, such as automatically created gibberish or cloaking.

    In fact, of the messages that we sent out to site owners, only around 3% were for unnatural or artificial links. So just to be clear, of the 700,000 messages we sent out in January and February, well above 600,000 were for obvious blackhat spam, and under 25,000 of the messages were for unnatural links. #smx   #seo  


    Google Sent Over 700,000 Messages Via Webmaster Tools In Past Two Months
    At SMX West last week Tiffany Oberoi from Google shared that Google has sent over 700,000 messages to webmasters via Google Webmaster Tools in January and February 2012. That is more than the total nu…

    By the way, Google only sends those messages when it’s a penalty, and penalties, as far as Google is concerned, are manual action.

    It will be interesting to see if the new link tool helps a lot of sites recover from algorithm updates like Penguin, and/or prevents a lot of sites from getting hit. Will we see less complaining about Google’s algorithm changes? Somehow, I doubt that. I have no reason to believe we will see less finger pointing.

    Will you use the new link tool if Google provides it? Let us know in the comments.

  • Google: We’re Starting To Enforce Paid Links More

    Google’s Matt Cutts has been making light of paid links all week, but in reality, Google isn’t joking when it comes to enforcing this part of its quality guidelines. According to Google, it’s cracking down on this more than ever, and we have seen in recent weeks that Google is indeed cracking down.


    @kerrydean At the Q&A I should be like “Hey everyone, Kerry Dean is buying links in this session, so please get in touch if you’re selling.”


    @SEOAware “Matt Cutts, Linkbuyer Psychologist.”

    Now, the serious stuff.

    Cutts participated in a keynote discussion at SMX Advanced, and while much of the talk was about Penguin, the subject of paid links also came up. SMX sister site Search Engine Land has a liveblogged account of the discussion. Here’s what Cutts said about paid links, according to that:

    We’re always working on improving our tools. Some of the tools that we built, for example, to spot blog networks, can also be used to spot link buying. People sometimes think they can buy links without a footprint, but you don’t know about the person on the other side. People need to realize that, as we build up new tools, paid links becomes a higher risk endeavor. We’ve said it for years, but we’re starting to enforce it more.

    It makes you wonder how safe those directories that charge for “reviews” to potentially get links are.

    The liveblog continues:

    I believe, if you ask any SEO, is SEO harder now than 5-6 years ago, I think they’d say it’s a little more challenging. You can expect that to increase. Google is getting more serious about buying and selling links. Penguin showed that some stuff that may work short term won’t work in the long term.

    On a semi-related note, Cutts also talked about paid inclusion, given that this has been in the news, as it relates to Google’s new sponsored results and Google Shopping.

    “You call it paid inclusion, but it’s a separately labeled box and it’s not in web ranking,” Cutts told Danny Sullivan, according to the liveblog, which continues: “Google’s take on paid inclusion is when you take money and don’t disclose it. Google’s web rankings remain just as pure as they were 10 years ago. We have more stuff around the edges, that’s true, but that stuff is helpful. Matt mentions using Google Flight Search to book his trip here to Seattle. ‘You can’t buy higher rankings. That hasn’t changed. I don’t expect it to change.’”

     

    @aschottmuller another way to say it would be: payment should always be clearly disclosed + payment doesn’t affect web search rankings.

    At the conference, Cutts also revealed that Google is considering launching a tool that would allow webmasters to tell Google to ignore certain links. The idea is already attracting a lot of praise from webmasters, many of whom thought Google should have had something like this long ago. Cutts indicated that such a tool would be several months away.

  • SEOmoz Analyst: Google Will Be Cracking Down On Directories More

    Earlier this month, there was some discussion about Google having de-indexed free web directories. Most of the ones we looked at had not actually been de-indexed, but were simply not ranking well, though there were clearly some that had been de-indexed.

    Since then, SEOmoz has been doing some extensive data gathering, investigating the situation further. Kurtis Bohrnstedt, the company’s “Captain of Special Projects,” gathered a total of 2,678 directories, and found only 200 of them to be banned, but an additional 340 to be penalized (as in not de-indexed, but not ranking for obvious terms where they would be the only result that makes sense).

    Still, that’s only 540 directories out of 2,678. It would seem that there are a lot more directories in the clear, but Bohrnstedt thinks this is only Google sending a warning, and that there is likely more to come.

    “That is not to say the ones left unharmed are safe from a future algorithmic update,” he writes. “In fact, I suspect this update was intended to serve as a warning; Google will be cracking down on directories. Why? In my own humble opinion, most of the classic, ‘built-for-SEO-and-links’ directories do not provide any benefit to users, falling under the category of non-content spam.”

    I wonder if that includes directories that are apparently built for SEO and links and charge webmasters for the chance to get links, but offer some form of editorial oversight.

    Interestingly, when this topic was being discussed a couple weeks ago, one webmaster said he had a paid directory he hadn’t touched in years, which was unexpectedly seeing an increase in PageRank.

  • iAcquire Gets Rid Of Paid Link Offerings Following Google De-Indexing

    Blogger Josh Davis recently put out an investigative report exposing the marketing firm iAcquire for engaging in paid links for clients. Once Google caught wind of it, iAcquire was de-indexed from Google’s search results.

    Now, the company has openly admitted to “financial compensation,” though it says it has been transparent about this with its clients. On Tuesday, iAcquire put out a blog post talking about the ordeal. Here’s a snippet:

    There are many methods to develop link relationships. Based on the client strategy we deploy a variety of approaches to link development, and in some cases we’ve allowed financial compensation as a tool. Removing financial compensation from the link development toolset has been a long term goal for us. We are using these recent events to be a catalyst to expedite those plans effective immediately.

    We do not mislead customers nor operate in any manner contrary to their wishes or directives. Every strategy we develop is done in conjunction with knowledgeable online marketing specialists from iAcquire and our clients. Our process is transparent: every aspect of a campaign is available to our customers. In the past, we have responded to the frequent needs for urgency and speed from our clients. We are going to take this opportunity to discuss with our clients the best approaches to ensure a long term strategy and horizon for their program.

    The company has been engaging in a lot of related conversation on Twitter:

     

    @jonahstein We’re not that concerned with being deindexed–we weren’t driving much traffic through search anyway. But thanks for the…

     

    @righthatseo not exactly getting rocked, just nudging us in the right direction even quicker http://t.co/6RLQedRg

     

    @jonahstein but of course we are working to comply with google in order to return to the search results

     

    @craigaddyman Can’t speak for everyone else but we haven’t stopped working hard to be the best SEOs we can right now
    45 minutes ago via HootSuite · powered by @socialditto
     Reply  · Retweet  · Favorite

    We may soon see other companies being exposed in similar fashion, as Davis recently told WebProNews, “I have come across some other smaller companies which seem to be doing it (maybe one other large one, but I am still researching that).”

    Google penalties from paid links, as we’ve seen in the past, can have big effects on big companies. Overstock.com even blamed Google’s penalty for an “ugly year”.

    Update: Davis now tells us, “I am not currently researching other companies large or small that may be buying undisclosed links. While my research for the initial piece did unearth what appeared to be other paid links, that was just a byproduct of my initial work. I have not further pursued examining any more links. It took so much time to do the ‘Search Secrets’ piece in thorough manner, I don’t intend to duplicate that amount of work again.”

  • What If The Google Penguin Update Inadvertently Killed The Web As We Know It?

    Note: Perhaps the headline of this article is a little sensational, but don’t overlook the “what if” part. I’m not suggesting Google has some plot to kill the web. However, many businesses rely on Google and people are freaking out about backlinks. Some are going so far as to threaten legal action if links are not removed. Links. If such legal action ever resulted in the outlawing of links in any capacity, the web as we know it could be put into great jeopardy. People would be afraid to link. I don’t think Google intends for anything like that to happen, but people don’t always respond to things in the most rational of ways. I don’t believe we will see links outlawed, or that the Penguin update will kill the web. However, reactions to Google penalties are leading to some pretty strong actions from some.

    Google has said on multiple occasions that it thinks the Penguin update has been a success. Do you agree? Let us know in the comments.

    PageRank And The Web

    WWW, as you may know, stands for World Wide Web. It’s a web because it’s connected by links. Sites all over the web link to one another, creating a path for users to click from one page to the next. Often those links lead to different sites. This is the way it has worked for years. Just think what it would be like if sites couldn’t freely link to one another. The web would be broken, and users would suffer.

    When Google launched with its PageRank algorithm, it was a revolution in search. It seemed to be a better way of doing search. It gave a rhyme and reason to the ranking of search results. Today, Google uses over 200 signals to rank its search results, which are becoming more personalized than ever before. PageRank still matters, but it’s far from the only thing that matters.
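    The basic mechanics behind PageRank can be illustrated with a short, self-contained sketch. This is a toy power iteration over a hypothetical three-page link graph; the graph, damping factor, and iteration count are made up for illustration, and it is not Google’s actual implementation, which layers those 200+ other signals on top.

```python
# Toy PageRank power iteration over a small, made-up link graph.
# This is a simplified sketch: dangling pages (no outlinks) would
# leak rank mass here, but the sample graph has none.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets a small baseline, plus shares from its in-links.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph: "a" links to "b" and "c", "b" to "c", "c" back to "a".
graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(graph)
```

    Running this, the page with the most link weight flowing into it ("c" in the toy graph) ends up with the highest score, which is the intuition behind why inbound links carry so much influence over visibility.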

    Yet, it is PageRank that has given links on the web so much power to influence the visibility of web content. Now that just about everyone is on the web, everyone is fighting to have their content seen. Once upon a time, you would have thought: the more links the better. More links can only lead to more chances people will see your content. Now, somewhat ironically, people are finding that the links they have out there are making their content less visible. In some cases, they’re making it practically non-existent in Google, or at least so buried, it might as well be non-existent.

    Freak Out Time?

    Google’s Penguin update has been a major wake up call to webmasters about certain kinds of linking practices. The update was designed to target sites violating Google’s quality guidelines. Among those guidelines are: “Don’t participate in link schemes” and “Avoid hidden text or hidden links.”

    Some of Google’s guidelines are obvious – avoid obviously unethical practices. But in the link schemes department, things can get a little blurry. Just ask WPMU.org, which got hit by Penguin over a few questionable links (interestingly enough, after seemingly benefiting from Google’s Panda update, designed to reward higher quality sites).

    A lot of webmasters have taken to the forums and blogs to complain about the Penguin update, but Google has, on more than one occasion, deemed the update a success. We’ll also be seeing it come back around every so often, much like its Panda predecessor.

    Even before Penguin, Google was sending out tons of messages to webmasters alerting them of questionable links. All of this has gotten webmasters into a frenzy to clean up their link profiles, and reduce the number of links Google considers to be of poor quality, in hopes that their content can find its way back into Google search visibility.

    Legal Action Over Links?

    Some webmasters have even gone so far as to threaten legal action over sites that are linking to them. We referenced this in another article after Barry Schwartz at Search Engine Roundtable mentioned that this was happening. Now, Greg Finn at Search Engine Land has pointed to a specific example where PSKL got a DMCA takedown notice from LifeShield after publishing a positive review of the company.

    Now, to be clear, this DMCA takedown notice is not in reference to any content theft or content use. It’s about links. It threatens legal action. It says:

    I request you to remove from following website (pskl.us)
    all links to www.lifeshield.com website as soon as possible.
    In order to find the links please do the following:
    1) If this is an online website directory, use directory’s search system to find “LifeShield” links.
    2) If there are hidden links in the source code of website, open website’s main page and view its source code. Search for “lifeshield.com” in the source code and you will see hidden links.

    It also says:

    LifeShield, Inc will be perusing [sic] legal action if the webmaster does not remove the referenced link within 48 hours.

    Jeremy at PSKL actually shares the entire conversation around the matter, which did include an apology, indicating that PSKL shouldn’t have been on the list of sites that received a notice. Jeremy, however, took issue that there was a list of sites getting such notices. Throughout the conversation, it is revealed that LifeShield had a site “cloak lifeshield and generate over 700K back links” without LifeShield’s knowledge, and that “Google stepped in and slapped” them with a penalty, which led to layoffs at the company.

    Jeremy responded with, “So you’re saying that somebody went out and bought 700K back links for you, knowing that it would get you penalized by Google? So does that mean you had (Company name) send out 700K DMCA notices? Talk about throwing good money after bad. Report the linkspam to the spam team at Google, then spend that money on an SEO expert rather than on trying to bully people with intimidation.”

    The response was actually longer than that, and included the metaphor of putting out a house fire with manure, but that was the main gist.

    I suggest reading Jeremy’s entire post. It’s pretty interesting.

    Is This Where The Web Should Go?

    He does make another important point in this: A party creating large quantities of backlinks to a site in order to generate SEO (or, in this case, destroy SEO) is unethical. It is not illegal.

    While many may not have a problem with such practices becoming illegal, it’s the idea that the law could intervene with linking in any form that could lead to greater problems. Just consider all of the gray area there already is in fair use law. There will always be different interpretations, and that can get dangerous.

    For the record (granted, I’m no lawyer), I wouldn’t expect any legal action, such as that threatened in LifeShield’s DMCA notice, to hold much water in a court of law. Finn also points to two cases (Ford Motor Company v. 2600 Enterprises and Ticketmaster Corp. v. Tickets.com, Inc.) where the legality of linking prevailed.

    But even if things like this have to go to court, it’s going to be a major inconvenience, and legal fees will have to be paid. If sites practicing legitimate, ethical linking habits get caught up in this, where will that leave the web?

    Is this what linking on the World Wide Web will become? Will you have to worry about getting sued because you linked to a site that doesn’t consider your site strong enough to merit a link? Could you get sued because your page didn’t have a high enough PageRank, and not enough link juice to help the site you’re linking to in its search engine visibility?

    LifeShield seems to be targeting some very specific webspam, but it sent notices to a whole list of sites. It’s likely that LifeShield isn’t the only company panicking and resorting to such action. It’s unfortunate for the company if some negative SEO (it’s unclear if this came from a competitor) was able to have such an impact on its business, but as Jeremy suggests, this may not be the best way of trying to resolve the issue.

    Let’s Give Google Some Credit.

    You can point to Google’s guidelines and its algorithm updates, which clearly do cause some to think this way, but just the same, Google can’t be held entirely to blame for this kind of mentality either. The company has said in the past that people shouldn’t obsess over PageRank, and that it uses over 200 signals to rank content. PageRank is not the only thing that matters. In fact, the company puts out huge lists of signal changes every month.

    It does show the power Google holds over society, though. Businesses rely on Google search so much that they will go so far as to threaten sites that are simply linking to them with legal action.

    Should such legal action ever lead to a victory in court, that could mean very bad news for the Web as we know it, and people could be afraid to link. I would imagine that would spawn more issues of sites not getting the credit (and possible referral traffic) they deserve.

    Do you think Google’s guidelines and penalties can have an influence on the law? Now that would be power, made all the more ironic by the fact that Google is constantly under scrutiny of its own.

    Share your thoughts in the comments.

    Image: Batman Returns (Warner Bros.)

  • Should The Google Penguin Update Hit Sites Like WPMU.org?

    We recently told you about WPMU.org apparently getting hit by Google’s Penguin update. The site went from 8,580 visits (pretty standard for the site, having looked through the Analytics myself) to 1,527 a week later. It’s been hovering around similar numbers ever since, with a pretty clear dip right around Penguin time.

    Do you think this site deserved to get hit by Penguin? Let us know in the comments.

    Penguin drop

    We spoke with James Farmer, Founder and CEO of Incsub, which runs the site. Farmer maintains that WPMU.org engages in no keyword stuffing or link schemes, and has no quality issues. In fact, the site has actually done well throughout Google’s series of Panda updates.

    Farmer tells WebProNews, “We did great after Panda, it was like that update recognized we were decent folk… you can’t win them all huh?”

    “Apart from not being able to guess what Google was going to do in April, 3 years ago, we haven’t done anything wrong,” he says.

    Last week, Farmer received some second-hand info from Google’s Matt Cutts, who reportedly spoke with the Sydney Morning Herald about WPMU.org. According to Farmer, Cutts provided three problem links pointing to the site. These included a site pirating their software and two links from one spam blog using an old version of one of their WordPress themes with a link in the footer. Farmer reported that Cutts “said that we should consider the fact that we were possibly damaged by the removal of credit from links such as these.”

    It’s pretty interesting that, if such links were the problem, they could have such a tremendous impact. It’s no wonder there have been so many discussions about negative SEO (competitors attacking each other with these kinds of tactics) since Penguin launched.

    The site has more than 10,400 Facebook likes, 15,600 Twitter followers, 2,537 +1s and 4,276 FeedBurner subscribers, according to Farmer. Apparently that’s not enough to outweigh some questionable links from third parties.

    “How could a bunch of incredibly low quality, spammy, rubbish (I mean a .info site… please!) footer links have made that much of a difference to a site of our size, content and reputation, unless Google has been absolutely, utterly inept for the last 4 years (and I doubt that that’s the case),” Farmer wrote in his article on the matter.

    When asked how many links he has out there just from footers for WordPress themes, he tells WebProNews, “Given that we stopped adding links years ago, actually not that many at all.”

    “However, the challenge is that given that we provided themes to a lot of multisite installs, which have since become overrun with splogs, there’s an enormous amount of links from not that many actual root domains,” he adds. “I’d guesstimate 1-2K, 99% of clearly low quality sites.”

    We asked if he’s heard from other WordPress theme creators having similar issues.

    “Actually no, although that doesn’t surprise me that much,” he says. “Not many folk are as open as us, and in this field they probably have good reason to be. WordPress terms are very, very competitive so I wouldn’t be surprised if 9/10 competitors had something to hide!”

    Like many webmasters, Farmer just doesn’t know what to expect from Google, in terms of whether or not Google will consider the site to be one of the innocent casualties of Penguin.

    “I have no idea, I would love it if they did. I guess the thing I’m begging for is some sort of qualitative mechanism (NOT the manual webspam web, faster approach) that allows quality operators, like us, to survive and carry on providing Google users exactly the kind of helpful content they need!”

    Google does have a form users can submit to if they think they’ve been wrongfully hit by the Penguin update.

    Google’s Matt Cutts recently told Danny Sullivan that Google considers the Penguin update a success, despite the large number of complaints from those commenting on blogs and in forums. Of course, the Penguin update, much like the Panda update, should be periodically coming back around, giving sites a chance to make fixes and recover. That also means, however, that sites will have more chances to get hit.

    We asked Farmer if he thinks Penguin has helped or hurt search results in general, outside of his site’s issues.

    “Especially in the WP field they have gone wild,” he emphasizes. “For example our flagship site WPMU DEV – if you go to search for that now a competitor writing something ridiculous about us and copyright appears above our massively popular Facebook page. It even looks like our YouTube channel has been demoted. Crazy stuff.”

    We’ve certainly seen some other questionable search results following the update, and others have complained aplenty.

    Do you think the search results have improved since Penguin? Should WPMU have been hit by Penguin? Share your thoughts.

  • Matt Cutts Shares Something You Should Know About Old Links

    Google’s Matt Cutts has put out a new Webmaster Help video discussing something that’s probably on a lot of webmasters’ minds these days: what if you linked to a good piece of content, but at some point, that content turned spammy, and your site is still linking to it?

    In light of all the link warnings Google has been sending out, and the Penguin update, a lot of webmasters are freaking out about their link profiles, and want to eliminate any questionable links that might be sending Google signals that could lead to lower rankings.

    A user submitted the following question to Cutts:

    Site A links to Site B because Site B has content that would be useful to Site A’s end users, and Google indexes the appropriate page. After the page is indexed, Site B’s content changes and becomes spammy. Does Site A incur a penalty in this case?

    “OK, so let’s make it concrete,” says Cutts. “Suppose I link to a great site. I love it, and so I link to it. I think it’s good for my users. Google finds that page. Everybody’s happy. Users are happy. Life is good. Except now, that site that I linked to went away. It didn’t pay its domain registration or whatever, and now becomes maybe an expired domain porn site, and it’s doing some really nasty stuff. Am I going to be penalized for that? In general, no.”

    “It’s not the sort of thing where just having a few stale links that happen to link to spam are going to get you into problems,” he continues. “But if a vast majority of your site just happens to link to a whole bunch of really spammy porn or off-topic stuff, then that can start to affect your site’s reputation. We look at the overall nature of the web, and certain amount of links are always going stale, going 404, pointing to information that can change or that can become spammy.”

    “And so it’s not the case that just because you have one link that happens to go to bad content because the content has changed since you made that link, that you’re going to run into an issue,” he concludes. “At the same time, we are able to suss out in a lot of ways when people are trying to link to abusive or manipulative or deceptive or malicious sites. So in the general case, I wouldn’t worry about it at all. If you are trying to hide a whole bunch of spammy links, then that might be the sort of thing that you need to worry about, but just a particular site that happened to go bad, and you don’t know about every single site, and you don’t re-check every single link on your site, that’s not the sort of thing that I would worry about.”

    Of course, a lot more people are worried about negative SEO practices, and inbound links, rather than the sites they’re linking to themselves.

    More Penguin coverage here.

  • The Blurry Lines Of Google’s Paid Links Policy

    As you probably know, Google isn’t a fan of people paying for links that pass PageRank. It’s considered to be a manipulation of search results and a violation of Google’s quality guidelines, which are the focus of Google’s Penguin update. It’s interesting that there seem to be exceptions to the rule, such as a directory like Best Of The Web, which has users pay for their sites to be considered for links.

    Update: BOTW has gotten back to us since this article was published. Please see BOTW President Greg Hartnett’s comments toward the end of the article.

    Perhaps more interesting is that some similar directory sites, which aren’t necessarily in clear violation of Google’s guidelines, seem to be getting penalized, or at the very least drawing the ire of unhappy webmasters looking to get their link profiles cleaned up after receiving messages from Google.

    Should a directory in which you have to pay to get a listing be treated like other sites that offer paid links? Let us know what you think in the comments.

    Google recently launched a PageRank update, and many directory sites saw their PR plummet. Best Of The Web, meanwhile, has managed to maintain 4s, 5s and 6s. At a time when flustered webmasters are looking to eliminate lower-end links, the topic of directory links on the web seems more relevant than it’s been for quite some time.

    Webmasters Are Angry

    Barry Schwartz at Search Engine Roundtable ran a very interesting story about Google being “the cause of lawsuits over links to web sites.”

    “Can you imagine writing a story, linking that story to other relevant web sites and then years later being hit with a lawsuit over linking to a web site?” he asks.

    The gist is that webmasters who have been receiving those messages from Google about unnatural links are threatening to sue sites that are linking to them. “Some webmasters are taking extreme measures and threatening to sue publishers and webmasters who are linking to them,” he reports.

    I don’t know how often this is actually happening, but I can’t say it’s much of a surprise. If any such lawsuit is successful, then we have a problem.

    I don’t know about the legal threats, but I do know a lot of directories are getting angry emails from webmasters who have links coming from them.

    Google has taken issue with directories in the past – sort of. Here’s what the company told us in 2007:

    There’s no “outright penalty” for being a directory, but we do value, as I’m sure you’ve heard, “unique, compelling content.”

    Directories can run into the problem of not containing original information.

    There do seem to be some directories that have historically received a bit more respect from Google. This includes Best Of The Web, which as I said, charges users for possible inclusion.

    Google has talked about this in the past. Here’s a video about it from Matt Cutts from 2009:

    The user-submitted question Cutts was responding to was:

    Will Google consider Yahoo! Directory and BOTW as sources of paid links? If no, why is this different from another site that sell[s] links?

    He doesn’t entirely answer the question, however. He does say:

    “Whenever we look at whether a directory is useful to users, we say, ‘OK, what is the value add of that directory?’ So, you know, do they go out and find their entries on their own, or do they only wait for people to come to them, you know, how much do they charge and what’s the editorial service that’s being charged?”

    “If a directory takes $50 USD and every single person who ever applies in the directory automatically gets in for that 50 dollars, there is not as much editorial oversight as something like the Yahoo directory, where people do get rejected. So, you know, if there is no editorial value add there, then that is much closer to paid links.”

    So basically, it sounds like if a directory rejects some submissions, this is OK.

    How Best Of The Web Works

    So how does Best Of The Web work, exactly? You go to submit a site, and you’re presented with a page like this:

    Best of the Web

    It’s clear that the main motivation for submitting to this directory is to help your search engine rankings. It says, “Listing your website in the internet’s most respected directory will help increase your website’s visibility in major search engines.”

    The first example of a “link scheme” Google lists on its page about them is: “Links intended to manipulate PageRank.” While I can’t find anything on BOTW that specifically says anything about PageRank, is that not what submitters are after here?

    Best Of the Web presents multiple quotes from various marketing-types, like:

    “After implementing a plan with listings across several BOTW directories, we were able to see immediate and quantifiable improvement in our rankings. Working with BOTW has been a great success for Marriott.” — Benjamin Burns, Search Specialist

    “BOTW provided excellent service for us and our listings. I would hire them over and over again every time we need directory listings.” — Marek Wawrzyniak, SEO Specialist

    “Best of the Web has proven to be a successful strategy for Extra Space Storage when coupled with other local SEO techniques. We have seen a consistent ranking improvement in many areas with our local storage facilities by having Best of the Web part of our organic strategy.” — Tim Eyre, Interactive Marketing Manager

    It’s obvious that the reason one would want to be listed in this directory is SEO. It’s not because people are going to the directory to search for businesses. It’s an SEO strategy – something BOTW seems pretty up-front about.

    Link Schemes

    Let us refer to that “Link Schemes” help center page (linked to from its Quality Guidelines page) for a moment. That says:

    Your site’s ranking in Google search results is partly based on analysis of those sites that link to you. The quantity, quality, and relevance of links count towards your rating. The sites that link to you can provide context about the subject matter of your site, and can indicate its quality and popularity. However, some webmasters engage in link exchange schemes and build partner pages exclusively for the sake of cross-linking, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. This is in violation of Google’s Webmaster Guidelines and can negatively impact your site’s ranking in search results. Examples of link schemes can include:

    • Links intended to manipulate PageRank
    • Links to web spammers or bad neighborhoods on the web
    • Excessive reciprocal links or excessive link exchanging (“Link to me and I’ll link to you.”)
    • Buying or selling links that pass PageRank

    Let’s read that last one again. “Buying or selling links that pass PageRank.”

    Links That Pass PageRank

    As far as I can tell, if you have managed to get listed in Best Of The Web, the link will pass PageRank. The links I looked at do not include the nofollow attribute, which would prevent them from passing PageRank.

    The links marked as “ads” at the top of category pages do include the nofollow attribute.
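    The followed-versus-nofollow distinction is easy to check mechanically. Here is a minimal, hypothetical sketch using Python’s standard html.parser that sorts a page’s anchors into links that can pass PageRank and links marked nofollow; the sample markup is invented for illustration, not taken from BOTW’s actual pages.

```python
# Sketch: classify anchor tags as "followed" (can pass PageRank)
# or nofollow, using only the standard library.
from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.followed = []
        self.nofollow = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        # rel can hold multiple space-separated tokens, e.g. "nofollow sponsored".
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" in rel:
            self.nofollow.append(href)
        else:
            self.followed.append(href)

# Invented sample markup for illustration only.
sample = '''
<a href="http://example.com/listing">Editorial listing</a>
<a href="http://example.com/ad" rel="nofollow sponsored">Ad</a>
'''
audit = LinkAudit()
audit.feed(sample)
```

    Feeding real page source through the same parser would show which outbound links a crawler treats as endorsements and which it ignores for ranking purposes.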

    The category page above has a PageRank of 5. Some pages are higher, and some are lower. The home page has a 6.

    Back To The Submission Process

    If you click to get started, you are prompted to provide your email address (twice), and then to fill out a large form. The last part of that form is for the payment details:

    Best Of The Web Payment Details

    You can choose from two plans: annual fee or one-time fee. Once you click submit, your card will be charged. You must check the box that says you’ve read the ToS and privacy policy. It’s only when you click through to the ToS, and through one more link there, that you find out your site may not even appear in the listings. It says, “There is no guarantee that my site will be added to the directory” and that the charge is non-refundable. You agree that you understand that, “BOTW editors, in their sole and final judgement, shall determine the suitability, placement, title and description of all sites listed in the BOTW Directory.”

    There’s nothing wrong with BOTW wanting to be selective in the editorial process. That’s what Google has indicated in the past is actually what makes directories like this higher quality in Google’s eyes. That said, Google is always preaching about user experience, and encouraging sites to provide what’s best for the user. User trust has been a major theme, particularly since the Panda update.

    BOTW does require submitters to read the ToS before charging them, but the part about potentially not being included, even with no refund, seems a bit buried.

    Is BOTW’s practice OK in Google’s eyes because they’re using enough judgment not to include EVERY link that people are paying for in hopes of a listing?

    Is This What Google Wants From A Directory?

    I’m not going to advise you to sell or pay for links at all, but I feel like Google is sending some very mixed signals here.

    Search engine industry vet Tim Mayer, who worked for Yahoo until 2010, tells WebProNews, “It is interesting as they [BOTW] are positioned similarly to the Yahoo directory of old with editors and payment. Other directories’ such as business.com model failed due to Google changing their treatment of them. Not sure if this was due to quality or the lack of editorial oversight.”

    “Many other directories are or are considered spam sites/directory link farms as they are just pages of paid links,” he adds. “Seems to me this may be legacy treatment. But I have not looked at BOTW and analyzed it in some time. Google probably has a better sense of if this is a good authority hub or not. If it is they should use it. I would bet that they are better quality than most directory sites.”

    But it’s not really even an issue of quality. It seems like more of a double standard on Google’s part, given that the company clearly lists “Buying or selling links that pass PageRank” as an example of a link scheme.

    Editorial judgment is clearly a factor, but is it really the “best” of what the web has to offer, or is it some of the best, with some sites that paid for reviews getting in there too, regardless of whether or not they’re really the best? Update: Hartnett says “an almost imperceptible percentage” of the links are from those who paid for the reviews.

    Look at this listing for Caagal.com on BOTW’s Business Classifieds category page, for example. A quick glance at this site (complete with loading errors) doesn’t suggest “best” of what the web has to offer in this niche, though this is certainly subjective. It doesn’t even seem to be largely business-oriented, but more property and boat oriented. For the record, I have no idea if this site paid or not.

    Granted, the site is nowhere to be found in Google, for the query “business classifieds” (at least within the first six pages). It’s hard to say how much value that site may have gotten from paying to be listed in Best Of The Web, but I guess they at least got a PageRank 4 link out of it (PR for that category page).

    Obsess With Google’s Quality Guidelines or Not?

    Webmasters are frantically trying to distance themselves from some directory sites after getting messages from Google about unnatural links. Even directories that have never offered paid links are getting emails from upset webmasters. Jayde, for example (disclosure: owned by WPN parent iEntry), has gotten quite a few. Jayde has never offered paid links, and recently made all links nofollow.

    If webmasters are looking to start suing sites that are linking to them because they are under the impression that these links are hurting them, that’s pretty bad.

    Interestingly enough, Google used to encourage directory submissions.

    “In fact, if you look at our webmaster quality guidelines, we used to have a guideline that says, you know, submit your site to directories, and we gave a few examples of directories,” Cutts explains in that video. “And what we find, or what we found was happening, was people would get obsessed with that line and go out and look for a lot of directories.”

    “We ended up taking out that mention in our webmaster guidelines so that people don’t get obsessed with directories and think, ‘Yes, I have to go find a bunch of different directories to submit my site to,’” says Cutts in the video.

    I realize this video is 3 years old, but I have to say, this seems to be an example of mixed signals coming from Google again. This would indicate that you shouldn’t obsess over the things in Google’s quality guidelines, but as you probably know, the Penguin update, which launched a couple weeks ago, was all about targeting sites violating the quality guidelines.

    To Sum Up

    – Google used to encourage directory submissions in its quality guidelines.

    – Google decided people shouldn’t obsess about that.

    – Now people are freaking out about links that they have from such directories that they submitted to, and some may even be so angry as to threaten legal action (though I can’t imagine there are any legitimate grounds).

    – Best of the Web, which charges money for the chance at links designed to influence search visibility (something that seems like it would violate Google’s guidelines), isn’t considered a major problem.

    Something seems wrong with that picture.

    We’ve reached out to Google for comment and have not heard back from them.

    Update: We have received a thoughtful response from Best Of The Web President Greg Hartnett.

    On the criteria for sites to be considered the “best” and gain a listing, Hartnett says, “Our guidelines for listing are pretty straightforward: we list sites that contain quality, unique content in the most relevant category within the directory. If the site does not provide a user with informative content then we don’t list it. We have always been focused on providing the user with quality content from trustworthy sources.”

    “When users (humans or spiders) come to BOTW, they know that they can trust that (for instance) all of the listings in a San Francisco real estate category contain relevant information about San Francisco real estate,” he adds. “A human being has been in there and verified it. We’ve got a dedicated team of fantastic editors that ensure that.”

    On the percentage of submissions that are rejected, Hartnett says, “I don’t work the submission queue, so I don’t really have a handle on the specific numbers. However, as a percentage of total submissions, I believe that we reject fewer sites now than we did in the past. The overall quality of submissions has increased as the years have gone by. Perhaps in general, people are now building better sites. Perhaps it’s a matter of more people knowing that BOTW doesn’t accept low quality sites, and they don’t even bother submitting. Whatever it is, I know that it makes our editors happier.”

    We asked: It seems like Google advises against paid links, but doesn’t Best of the Web charge users to have their links reviewed for possible listing?

    “Google certainly advises against paid links,” Hartnett tells us. “We’re not a pay for placement, or link buying platform. Payment for review in no way influences whether or not a site is listed within the directory. The fee is for the review, and is non-refundable. It’s not for a link. We caught a lot of flack about that policy in the early years of the directory, but we did it for a reason. We retain complete editorial control and integrity with each submission and listing. It’s completely up to our editors to decide if the site gets listed, and if listed, the title, description and category placement.”

    “It should also most definitely not be overlooked that the review model accounts for a minuscule amount of the listings within the directory,” he adds. “We have millions of listings, of which our editors have added approximately 95% for free. They work daily scouring the web adding quality sites to relevant categories to build a more comprehensive resource. An overwhelming majority of the listings in the directory have had zero interaction with BOTW at all, let alone paid for a review.”

    “I have no idea why Google does or does not approve of what it is we are doing,” says Hartnett. “I don’t work for or with Google and I don’t have any access to them outside of what Joe Internet does. I’d be surprised if they thought about us at all, but if they did I would like to think that they respect what it is we have been doing for all these years.”

    “We feel we have put together (and continue to build) a fantastic resource for users that are interested in finding resources that they can trust,” he says. “We have always focused on providing the user with quality resources, and figured users appreciated, and will continue to appreciate, that effort. We’ve recently added the ability for editors and site owners to add social information for each listing, as we continue to evolve with the landscape and provide users with additional information about listings as well. It’s really been a fantastic project to have been working on for the last decade or so, and we’re excited to continue on our mission.”

    Do you think Google is sending mixed signals about paid links? Let us know in the comments.

  • The AP Links, But For Those Running AP Stories, It’s Up To Them

    This week, we ran an article about the Associated Press and its linking policies, which pointed out an article that seemed to be doing the kind of thing the organization has historically frowned upon from others – short articles based on someone else’s original reporting, linking to the original.

    While that was largely the point of the article, we also noticed that some of the sites running the AP story had bit.ly URLs, which pointed to the original source, but didn’t actually link. Rather than linking some anchor text, it just had the URL in parentheses, which seems odd to me. Anyway, some of the sites actually linked the URL and some didn’t. The AP pointed out to us that they link it on their own site, and don’t control whether or not the other sites link.
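    The distinction here is between a clickable anchor-text link and a bare URL merely printed in parentheses. A minimal sketch of the two forms, using a hypothetical source URL:

    ```python
    # Two ways a syndicated story might credit its original source.
    source_url = "https://example.com/original-report"  # hypothetical URL

    # Anchor-text link: the reader clicks descriptive words, and the
    # target site receives a crawlable backlink.
    anchor_text_link = f'<a href="{source_url}">the original report</a>'

    # Bare URL in parentheses: visible to the reader but, on sites that
    # don't hyperlink it, not clickable and not a backlink.
    bare_url_mention = f"(via {source_url})"

    print(anchor_text_link)
    print(bare_url_mention)
    ```

    Only the first form passes link credit to the source; the second leaves it up to each republishing site whether the URL becomes a working link at all.
    
    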

    Since backlinks are a pretty significant factor in building credibility on the web, it seemed appropriate for the AP to make it a policy that sites running its content with links to other sources be required to keep those links intact. I asked Director of AP Media Relations Paul Colford why the AP doesn’t make this a policy. Here’s what he says:

    As a cooperative and as a provider of services to our members and commercial customers (that is, we deliver text, photos, video, graphics etc. to papers large and small, broadcasters large and small, plus websites etc., which then consider our goods for their own needs and presentations), we don’t dictate how they utilize the material, or deal with links for that matter. The choices are theirs.

    Moreover, I’m told that longer links have a tendency to break for any number of reasons; some links contain characters that transmit unevenly downstream. Which is why we also use bit.ly.

    The bit.ly thing I get, although it does mask the domain it’s referring to.

    Linking to your sources is common web etiquette. The AP clearly gets this. They do it on their own properties and include the link (although I still don’t see why they don’t just link anchor text like most other news organizations). But apparently sites running AP content aren’t required to keep this etiquette in place.