WebProNews

Tag: SEO

  • Google Webmaster Tools Gets New rel-alternate-hreflang Feature

    Google announced on Monday that it is adding a new feature to Google Webmaster Tools to make it easier to debug rel-alternate-hreflang annotations. These are the attributes Google uses to serve the correct language or regional URL in search results.

    Google’s Maile Ohye talks about using rel-alternate-hreflang in the following video.

    The Language Targeting section of “International Targeting” in Webmaster Tools lets you identify missing return links and incorrect hreflang values.

    Regarding missing return links, Google’s Gary Illyes explains, “Annotations must be confirmed from the pages they are pointing to. If page A links to page B, page B must link back to page A, otherwise the annotations may not be interpreted correctly. For each error of this kind we report where and when we detected them, as well as where the return link is expected to be.”

    For incorrect hreflang values, he says, “The value of the hreflang attribute must either be a language code in ISO 639-1 format such as ‘es’, or a combination of language and country code such as ‘es-AR’, where the country code is in ISO 3166-1 Alpha 2 format. In case our indexing systems detect language or country codes that are not in these formats, we provide example URLs to help you fix them.”
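
    Both of these error types lend themselves to automated checking. Below is a minimal sketch, written by us for illustration rather than taken from any Google tool, that validates hreflang values against the formats Illyes describes and flags missing return links; the input format, function names, and example URLs are all assumptions.

    ```python
    import re

    # Per the formats described above: a two-letter ISO 639-1 language code
    # ("es"), optionally followed by a hyphen and a two-letter ISO 3166-1
    # alpha-2 country code ("es-AR"). "x-default" is the special catch-all
    # value. This only checks the *shape* of the value, not whether the codes
    # are actually assigned in the ISO registries.
    HREFLANG_PATTERN = re.compile(r"^[a-z]{2}(-[a-z]{2})?$", re.IGNORECASE)

    def is_valid_hreflang(value: str) -> bool:
        """Return True if `value` looks like a well-formed hreflang value."""
        return value == "x-default" or bool(HREFLANG_PATTERN.match(value))

    def check_annotations(annotations: dict[str, dict[str, str]]) -> list[str]:
        """Flag invalid values and missing return links. `annotations` maps a
        page URL to its {hreflang: alternate URL} entries (a hypothetical
        input format)."""
        errors = []
        for page, alternates in annotations.items():
            for lang, target in alternates.items():
                if not is_valid_hreflang(lang):
                    errors.append(f"{page}: invalid hreflang value '{lang}'")
                # If page A points to page B, page B must point back to page A.
                if page not in annotations.get(target, {}).values():
                    errors.append(f"{page} -> {target}: missing return link")
        return errors

    if __name__ == "__main__":
        pages = {
            "https://example.com/": {"en": "https://example.com/",
                                     "es-AR": "https://example.com/ar/"},
            # The Spanish page forgets to link back to the English one.
            "https://example.com/ar/": {"es-AR": "https://example.com/ar/"},
        }
        for problem in check_annotations(pages):
            print(problem)
    ```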

    Google has also moved the geographic targeting setting to the International Targeting feature.

    You can learn more about using rel-alternate-hreflang here.

    Images via Google

  • Groupon Finds ‘Direct’ Traffic Is Really Organic Search

    Groupon ran an experiment to try and figure out where “direct” traffic that appears in analytics programs is really coming from. Gene McKenna, director of product management at Groupon, wrote a blog post about the findings for Search Engine Land.

    McKenna leads Groupon’s organic search efforts. He says they completely de-indexed their site one day “for the sake of SEO science” for about six hours. During this time, they examined organic search and direct traffic by hour and by browser to any page with a “long” URL. He gives this one as an example:

    www.groupon.com/local/san-francisco/restaurants

    While traffic “attributable to SEO efforts” dropped to nearly zero, “direct” visits dropped by 60%. That 60%, he says, is really organic. The drop was concentrated on Groupon deal pages, whose URLs are long enough that people are unlikely to type them in manually and new enough that users are unlikely to have them bookmarked or saved in auto-complete.
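
    McKenna’s reasoning can be expressed as a simple reclassification rule. Here is a rough sketch of that heuristic in Python, written by us as an illustration rather than taken from Groupon’s analysis; the row schema, field names, and path-length threshold are assumptions.

    ```python
    from urllib.parse import urlparse

    # Heuristic mirroring the experiment's logic: people rarely type long, deep
    # URLs by hand, so a "direct" visit landing on such a page is probably an
    # organic search visit whose referrer was lost. The 25-character threshold
    # and the row schema are illustrative assumptions, not Groupon's methodology.
    LONG_PATH_CHARS = 25

    def likely_true_channel(visit: dict) -> str:
        """Reclassify one analytics row of the form
        {'channel': 'direct' | 'organic' | ..., 'landing_url': str}."""
        path = urlparse(visit["landing_url"]).path
        if visit["channel"] == "direct" and len(path) >= LONG_PATH_CHARS:
            return "organic (misattributed direct)"
        return visit["channel"]

    if __name__ == "__main__":
        rows = [
            {"channel": "direct",
             "landing_url": "https://www.groupon.com/local/san-francisco/restaurants"},
            {"channel": "direct", "landing_url": "https://www.groupon.com/"},
        ]
        for row in rows:
            print(row["landing_url"], "->", likely_true_channel(row))
    ```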

    They also hypothesize that SEO isn’t the only channel losing traffic credit because of browsers hiding referrers. Link referral campaigns could be suffering too, according to McKenna.

    He doesn’t recommend deindexing your site for your own testing. Check out the rest of Groupon’s findings in the original post. He shares some interesting graphs.

    Image via Groupon

  • Matt Cutts Is Disappearing For A While

    Just ahead of the holiday weekend, Google’s head of webspam Matt Cutts announced that he is taking leave from Google through at least October, which means we shouldn’t be hearing from him (at least about Google) for at least three months or so. That’s a pretty significant amount of time when you consider how frequently Google makes announcements and changes things up. Is the SEO industry ready for three Matt Cutts-less months?

    Cutts explains on his personal blog:

    I wanted to let folks know that I’m about to take a few months of leave. When I joined Google, my wife and I agreed that I would work for 4-5 years, and then she’d get to see more of me. I talked about this as recently as last month and as early as 2006. And now, almost fifteen years later I’d like to be there for my wife more. I know she’d like me to be around more too, and not just physically present while my mind is still on work.

    So we’re going to take some time off for a few months. My leave starts next week. Currently I’m scheduled to be gone through October. Thanks to a deep bench of smart engineers and spam fighters, the webspam team is in more-than-capable hands. Seriously, they’re much better at spam fighting than I am, so don’t worry on that score.

    He says he won’t be checking his work email at all while he’s on leave, but will have some of his outside email forwarded to “a small set of webspam folks,” noting that they won’t be replying.

    Cutts is a frequent Twitter user, and didn’t say whether or not he’ll be staying off there, but either way, I wouldn’t expect him to tweet much about search during his leave. If you need to reach Google on a matter that you would have typically tried to go to Matt Cutts about, he suggests webmaster forums, Office Hours Hangouts, the Webmaster Central Twitter account, the Google Webmasters Google+ account, or trying other Googlers.

    He did recently pin this tweet from 2010 to the top of his timeline:

    So far, he hasn’t stopped tweeting, but his latest – from six hours ago – is just about his leave:

    That would seem to suggest he doesn’t plan to waste much of his time off on Twitter.

    So what will Matt be doing while he’s gone? Taking a ballroom dance class with his wife, trying a half-Ironman race, and going on a cruise. He says they might also do some additional traveling ahead of their fifteenth wedding anniversary, and will spend more time with their parents.

    Long story short, leave Cutts alone. He’s busy.

    Image via YouTube

  • Google Continues Link Network Attack

    It would appear that Google’s attack on European link networks is not over (if it ever will be).

    Google has been penalizing link networks on the Internet with a vengeance over the past year or so, with much of the focus on Europe. The company says it has now penalized two more from Poland.

    This week, Google posted about reconsideration requests on its Poland blog, and then Googler Karolina Kruszyńska told RustyBrick they took action on two networks in Poland:

    She didn’t name the networks (at least publicly). Google’s Matt Cutts also tweeted about it:

    I wonder who those people are.

    Back in February, Google said it was focusing on networks in Poland. Since then, it has gone after various other networks in Europe, and also in Japan.

    Image via YouTube

  • If Links Are Changed Between Crawls, Google Will Apparently Trust Them Less

    Here’s a bit of interesting info about how Google views links. Apparently the search engine trusts them less if they’ve been changed.

    That is according to former Googler Pedro Dias, who used to work on the Search Quality and Webspam team at the company (via Search Engine Roundtable). Here’s a series of tweets from him:

    So take from that what you will.

  • Google Makes Reconsideration Request Responses More Helpful

    It appears that Google is actually making the reconsideration request process more helpful for webmasters by giving them more clues on what the problems are with their sites.

    At the Search Marketing Expo last week, Google’s Matt Cutts said the company was working on improvements to the requests. Now, one webmaster has shared an example of a response he got from the company, which includes a section for “a note from your reviewer”. This is where a Google reviewer can offer some specific advice related to the initial problem.

    John Edward Doyle, the webmaster, shared a screenshot of Google’s response in a tweet aimed at Barry Schwartz.

    As you can see, Google explicitly suggests Doyle check the canonical version of his site to see more link data when auditing links. As Schwartz notes in an article, this is something that many would not think to do, and could therefore be some very helpful, actionable advice for the webmaster.

    It will be interesting to see if a lot of webmasters find Google’s advice to be more helpful, or if this is just an obscure example.

    Image via Twitter

  • Google Launches New Version Of Payday Loan Algorithm

    Last month, Google rolled out two major updates to its algorithm around the same time – new versions of the famous Panda update and the “Payday Loans” update, which is one of its ways of fighting spam.

    A newer version of the latter began rolling out on Thursday afternoon.

    Google’s head of webspam Matt Cutts announced the update at the Search Marketing Expo in front of a packed house.

    “Matt Cutts explained that this goes after different signals,” recounts Barry Schwartz at SMX sister site Search Engine Land, who was in attendance. “The 2.0 version targeted spammy sites, whereas version 3.0 targets spammy queries.”

    It will target queries like “payday loans,” “casinos,” “viagra,” etc., he says.

    According to this recap of Cutts’ announcements (as tweeted by Cutts himself), he referred to the new update as Payday Loan 2.0, with last month’s being 2.0A, if that helps you for any reason whatsoever.

    Also according to that recap, Google is working on improving reconsideration requests so webspam analysts can provide additional feedback, and Google is close to getting IE 8 referrer data back. That traffic will still show mostly as not provided, it says, but will correctly show the visitor as coming from Google search.

    Image via MYA (Twitter)

  • Is Negative SEO Becoming A Bigger Problem For Businesses?

    Negative SEO – the practice of competitors engaging in SEO attacks in order to harm businesses in search results – has been a concern in the search industry for years. It’s rare that we see any concrete evidence of this actually working, but the suspicion is pretty much always there, and seems to be ramping up these days.

    Do you believe negative SEO is working better than it used to? Let us know in the comments.

    Last week, a webmaster started a thread in the WebmasterWorld forum asking “Has the most recent Google update made negative SEO easier?” It goes like this:

    Negative SEO: Possibly one of the webmaster’s worst nightmares. I’d like to speculate that negative SEO is now much easier to do now than it was prior to google’s latest updates. Are we seeing more evidence to confirm this? There’s certainly more talk of negative SEO.

    We all know that links revolving around bad neighbourhoods can cause problems for a site, and it’s relatively easy to generate hundreds, if not thousands of links automatically, and fairly easily putting a site on the automated radar at google. These links can also be generated manually, one-by-one, to slowly creep up on a site.

    It’s now more important than ever to have a GWT account to look out for these issues developing, and to deal with them that much faster if we’re to avoid problems. What’s your view: Is it now easier to run negative SEO campaigns, and how would you deal with the problem?

    The thread spans nearly 50 posts, and there’s not exactly a consensus that negative SEO is now easier, though Search Engine Roundtable says “most SEOs” actually agree that it is indeed easier. Barry Schwartz, who runs that site, even ran a poll asking if it is easier, and nearly 74% of the 359 respondents (as of this writing) said yes.

    Since then, another WMW thread has sprung up titled “The New SEO is Negative SEO – How to Tank a Site in Google 101”. It suggests that the following strategy would work as a negative SEO attack:

    The first month, contract a couple $5 guest blog posts [make sure the posts are in broken English of course], then go back to what you were doing.

    Second month, try a few more [4-8] $5 [broken English] guest blog posts and add some forum link drops to the mix. Go back to what you normally do — Nothing will happen.

    Third month, add even more [broken-English] guest blog links [2x or 3x per week], increase the forum link drops and sign up for long-term [“undetectable”] directory additions.

    If the site hasn’t tanked yet, month 4 hit ’em with 20,000 inbound links all at once — Keep doing it and eventually the site you’re aiming at will tank and they won’t be able to figure out how to recover — It takes almost none of your time and costs very little to tank a site due to the “penalty mentality” Google has decided to run with.

    There’s a bit more to it, which you can read on the forum, but that’s the general gist.

    The original poster includes a disclaimer: “I don’t normally post about ‘how to do negative stuff’, but Google needs to fix this sh*t, so I hope people understand how it’s done and feel free to use it until Google fixes their broken system and mentality — Penalties don’t bring links back to citations; penalties simply change who creates the links and who’s site they point to. Period!”

    Whether this actually works or not can (and will) be debated, but most people probably wouldn’t admit to actually trying it. People in the forum seem mostly convinced of its ability to work though.

    In 2012, Google changed the wording in a Webmaster Tools help center article in response to the question: Can competitors harm my ranking? Once upon a time, it said:

    There’s almost nothing a competitor can do to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages.

    Our search results change regularly as we update our index. While we can’t guarantee that any page will consistently appear in our index or appear with a particular rank, we do offer guidelines for maintaining a “crawler-friendly” site. Following these recommendations may increase the likelihood that your site will show up consistently in the Google search results.

    It was changed to say:

    Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index. If you’re concerned about another site linking to yours, we suggest contacting the webmaster of the site in question. Google aggregates and organizes information published on the web; we don’t control the content of these pages. Emphasis ours.

    Eventually, Google released a video of Matt Cutts discussing negative SEO.

    “So we try really, really hard to design algorithms that are robust, and that are resistant to that sort of thing,” he said. “Any algorithm that we’ve done in recent years – that the web spam team has worked on – we do try to walk through those cases and make sure that we’re resistant to that sort of thing.”

    “In my experience, there’s a lot of people who talk about negative SEO, but very few people who actually try it, and fewer still, who actually succeed,” he said later in the video.

    These words did little to quell concern. Plus, that video is nearly two years old. The topic did come up in another video from Cutts last year:

    This response didn’t sit much better with some viewers.

    Sadly, some businesses are even receiving negative SEO blackmail threats.

    People have long criticized Google for not simply ignoring the links it considers spammy, which would spare site owners from worrying about attacks like this and from spending money, time, and resources trying to figure out why Google doesn’t like their site and getting “bad links” cleaned up. As it stands, Google penalties would seem to double as strategies for negative SEO attacks.

    Is this a real problem, or is it exaggerated, as Google would have webmasters believe? Share your thoughts in the comments.

    Image via Google

  • Google Highlights Guide To Site Moves

    Google has released a new guide on how to handle moving your site in a way that’s “Googlebot-friendly”.

    “Few topics confuse and scare webmasters more than site moves,” say Google Webmaster Trends analysts Pierre Far and Zineb Ait Bahaji.

    As they explain, there are site moves with and without URL changes, which have different guidelines, of course.

    “We’ve seen cases where webmasters implemented site moves incorrectly, or missed out steps that would have greatly increased the chances of the site move completing successfully,” they say. “To help webmasters design and implement site moves correctly, we’ve updated the site move guidelines in our Help Center. In parallel, we continue to improve our crawling and indexing systems to detect and handle site moves if you follow our guidelines.”

    They also have a new page on smartphone recommendations, which you should definitely pay attention to. Google announced earlier this week that it is now calling out websites with faulty redirects in mobile search results, to save users from tapping a search result only to be redirected to a site’s homepage.

    Image via Google

  • Facebook Says Your Organic Reach Would Be Worse If It Showed Everything In The News Feed

    Facebook’s Brian Boland wrote a lengthy blog post about the much-talked-about decline in the organic reach of Facebook Page posts. It’s happening for two main reasons, he said: more and more content is created and shared every day, and News Feed is designed to show users the content that’s most relevant to them.

    It’s not about money, according to Facebook. Do you buy that? Let us know in the comments.

    “To choose which stories to show, News Feed ranks each possible story (from more to less important) by looking at thousands of factors relative to each person,” he wrote. “Over the past year, we’ve made some key changes to improve how News Feed chooses content: We’ve gotten better at showing high-quality content, and we’ve cleaned up News Feed spam. As a result of these changes, News Feed is becoming more engaging, even as the amount of content being shared on Facebook continues to grow.”

    According to at least one third-party measurement, Facebook is indeed sending more and more referrals to websites that are actually appearing in the News Feed.

    Some question why Facebook doesn’t just show people everything from all of their friends and all of the Pages they’ve liked, and let them decide what they want to see.

    “Several other online feed platforms display all content in real time,” he said. “But the real-time approach has limitations. People only have so much time to consume stories, and people often miss content that isn’t toward the top when they log on. This means they often do not see the content that’s most valuable to them.”

    He reiterated a point Facebook has made in the past, that in tests, the ranking system offers people a “more engaging experience”. He also said that using a real-time system for content would “actually cause Pages’ organic reach to decrease further.”

    I’m guessing some could argue with that, especially considering all the research that’s been done about the best times to post content. There is pretty much a whole industry dedicated to maximizing visibility on social media and helping businesses get more out of their social media strategies.

    Obviously many consider the decline of organic reach a money grab on Facebook’s part. There is a fairly widespread mentality that Facebook has dropped it to force people to pay for promoted posts.

    This is false, according to Boland, who said, “Our goal is always to provide the best experience for the people that use Facebook. We believe that delivering the best experiences for people also benefits the businesses that use Facebook. If people are more active and engaged with stories that appear in News Feed, they are also more likely to be active and engaged with content from businesses.”

    He then compared Facebook organic reach to SEO:

    Many large marketing platforms have seen declines in organic reach. Online search engines, for instance, provided a great deal of free traffic to businesses and websites when they initially launched. People and businesses flocked to these platforms, and as the services grew there was more competition to rank highly in search results. Because the search engines had to work much harder to surface the most relevant and useful content, businesses eventually saw diminished organic reach.

    Indeed, Facebook News Feed algorithm changes have been compared to Google’s famous Panda update. The comparison only stretches so far, however, because Google has over 200 signals that it takes into account in ranking content. Facebook, while it has many ranking signals, isn’t looking much beyond source in determining quality, which is problematic. At least Google gave sites a big list of things it thinks about when determining quality.

    Boland then talked about how transparent Facebook is:

    While many platforms experience a change in organic reach, some are more transparent about these changes than others. Facebook has always valued clear, detailed, actionable reports that help businesses see what’s happening with their content. And over time we will continue to expand and improve our already strong reporting tools.

    To be fair, Google might give more hints about what it considers to be quality content, but its transparency is constantly on trial in the court of public opinion. Boland at least described “great content” in his post as “content that teaches people something, entertains them, makes them think, or in some other way adds value to their lives.” The problem is that based on what Facebook has said before, it doesn’t really matter if your content does any of this if you’re not a whitelisted site. That is when it comes to Page posts. The “great content” thing can still work, of course, in terms of people just liking content from your actual site after they get to it from a search engine, Twitter or anywhere else.

    After the near-demise of organic reach on Facebook, many wonder what the point of trying to acquire new Facebook fans is. Boland attempted to answer this next, saying, “Fans absolutely have value,” first and foremost, “Fans make your ads more effective.”

    “When an ad has social context — in other words, when a person sees their friend likes your business — your ads drive, on average, 50% more recall and 35% higher online sales lift,” he wrote. “Fans also make the ads you run on Facebook more efficient in our ads auction. Ads with social context are a signal of positive quality of the ad, and lead to better auction prices. You can use insights about your fans — like where they live, and their likes and interests — to inform decisions about reaching your current and prospective customers.”

    Finally, fans can give your business credibility, he said.

    Later, he compared Facebook to search again:

    Like TV, search, newspapers, radio and virtually every other marketing platform, Facebook is far more effective when businesses use paid media to help meet their goals. Your business won’t always appear on the first page of a search result unless you’re paying to be part of that space. Similarly, paid media on Facebook allows businesses to reach broader audiences more predictably, and with much greater accuracy than organic content.

    Next, Boland said that “of course” businesses can succeed on Facebook with decreased organic reach before running down a handful of brands that have used ads successfully.

    The early reaction to Boland’s post (in the comments) has been mixed. Some appreciated the explanation, but others still feel like it’s a “money grab” on Facebook’s part, and are no less frustrated than they were before the post.

    Do you believe the money isn’t factoring into Facebook’s organic reach decline? Share your thoughts.

    Image via Facebook

  • Google Calls Out Sites In Mobile Results For ‘Faulty Redirects’

    Google is calling out websites with “faulty redirects” in mobile search results to save users from having to deal with the “common annoyance” of tapping a search result only to be redirected to a site’s mobile homepage.

    This occurs when a site isn’t properly set up to handle requests from smartphones. As Google notes, it happens so frequently there are actually comics about it. They point to this one from xkcd:

    Google is simply noting in the search results that the result “may open the site’s homepage,” and provides a link to “try anyway.”

    To avoid this happening to your site, Google recommends first searching on your own phone to see how your site behaves, and then checking Webmaster Tools to see if Google has sent you a message about detecting any of your pages redirecting smartphone users to the homepage. Luckily, Google is kind enough to show you actual faulty redirects it finds in the Smartphone Crawl Errors section.

    After that, Google says to investigate the faulty redirects and fix them by setting up your server so it redirects smartphone users to the equivalent URL on your smartphone site, and if the page on your site doesn’t have such an equivalent, to keep users on the desktop page, rather than sending them to the smartphone site’s homepage.
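
    As a rough sketch of that advice (our own illustration, not code from Google), the server-side logic might look something like the following, where the user-agent check and the mapping of desktop paths to mobile equivalents are simplified assumptions.

    ```python
    # A minimal sketch of the recommendation above (our illustration, not
    # Google's code): send smartphone visitors to the equivalent page on the
    # mobile site when one exists; otherwise keep them on the desktop page
    # rather than dumping them on the mobile homepage.

    # Hypothetical mapping of desktop paths to their mobile equivalents.
    MOBILE_EQUIVALENTS = {
        "/": "https://m.example.com/",
        "/products/widget-123": "https://m.example.com/products/widget-123",
    }

    def is_smartphone(user_agent: str) -> bool:
        # Crude check for illustration only; real detection is more involved.
        ua = user_agent.lower()
        return "iphone" in ua or "android" in ua or "mobile" in ua

    def choose_response(path: str, user_agent: str) -> tuple[int, str]:
        """Return (status, location_or_path): 301 means redirect, 200 means serve as-is."""
        if is_smartphone(user_agent):
            mobile_url = MOBILE_EQUIVALENTS.get(path)
            if mobile_url:
                return 301, mobile_url  # redirect to the *equivalent* mobile page
            return 200, path            # no equivalent: keep the user on the desktop page
        return 200, path                # desktop visitors always get the desktop page

    if __name__ == "__main__":
        print(choose_response("/products/widget-123", "Mozilla/5.0 (iPhone; ...)"))
        print(choose_response("/careers", "Mozilla/5.0 (iPhone; ...)"))  # no mobile equivalent
    ```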

    “Doing nothing is better than doing something wrong in this case,” says Google Webmaster Trends analyst Mariya Moeva.

    She notes that you can also try using responsive design. Google’s full guidelines for building smartphone-optimized websites can be found here. Google also has a help center article specifically on faulty redirects here, which you might find useful.

    The new disclaimer feature is only appearing in English search results in the U.S. for now.

    Images via xkcd, Google

  • Here’s Another Matt Cutts Floating Head Video (About The Most Common SEO Mistake)

    We’ll just keep this one short like the video itself. The most common SEO mistake you can make, according to Matt Cutts, is not having a website. Hopefully you feel you’ve gotten your money’s worth on that one.

    Once again, Cutts uses the ol’ floating head trick.

    I wonder how many more of these things he’s got.

    Image via YouTube

  • Google Talks Determining Quality When There Aren’t Links

    Google has a new Webmaster Help video out talking about how it looks at quality of content that doesn’t have many links pointing to it.

    Specifically, Matt Cutts takes on the following question:

    How does Google determine quality content if there aren’t a lot of links to a post?

    “In general, that sort of reverts back to the way search engines were before links,” he says. “You’re pretty much judging based on the text on the page. Google has a lot of stuff to sort of say OK, the first time we see a word on a page, count it a little bit more. The next time, a little more, but not a ton more. And that after a while, we say, ‘You know what? We’ve seen this word. Maybe this page is about this topic,’ but it doesn’t really help you to keep repeating that keyword over and over and over again. In fact, at some point, we might view that as keyword stuffing, and then the page would actually do less well – not as well as just a moderate number of mentions of a particular piece of text.”

    He continues, “We do have other ways. In theory we could say, ‘Well, does it sit on a domain that seems to be somewhat reputable? There are different ways you can try to assess the quality of content, but typically, if you go back to a user is typing possibly some really rare phrase, if there are no other pages on the web that have that particular phrase, even if there’s not that any links, then that page can be returned because we think it might be relevant. It might be topical to what the user is looking for. It can be kind of tough, but at that point, we sort of have to fall back, and assess based on the quality of the content that’s actually on the text – that’s actually on the page.”
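
    To make the “a little bit more, but not a ton more” idea concrete, here is a toy saturation curve in Python. It is our own illustration of diminishing returns on repeated terms, not Google’s actual scoring, and it deliberately ignores the keyword-stuffing demotion Cutts mentions.

    ```python
    # Toy illustration of diminishing returns on repeated terms, in the spirit
    # of Cutts' description. The saturation formula (BM25-style) is our own
    # choice; it is not Google's scoring, and it does not model the point at
    # which excessive repetition starts to be treated as keyword stuffing.

    def saturating_term_score(occurrences: int, k: float = 1.5) -> float:
        """Each extra occurrence adds less than the one before; the score
        approaches an upper bound of k + 1 instead of growing without limit."""
        return occurrences * (k + 1) / (occurrences + k)

    if __name__ == "__main__":
        for count in (1, 2, 3, 5, 10, 50):
            print(f"{count:3d} occurrences -> score {saturating_term_score(count):.3f}")
    ```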

    A few years ago, after the Panda update was first launched, Google shared a list of questions one could ask themselves about their content to get an idea of how Google might view it in terms of quality. You might want to check that out if you haven’t yet.

    Image via YouTube

  • Google’s Transparency Called Into Question Again

    Though it’s back in Google’s results now, another company is making headlines for being penalized by Google. This time it’s Vivint, which produces smart thermostats, and competes with Nest, which Google acquired earlier this year.

    PandoDaily’s James Robinson wrote an article about it, noting that Vivint had received warnings from Google about external links that didn’t comply with its quality guidelines, but that Google didn’t confirm which links those were. Rather, the company was “left to fish in the dark to figure out what it had done to upset its rival.”

    As Robinson correctly noted, Rap Genius was removed from Google’s search results last year for violating guidelines, and was back in business within two weeks. At the time, Google was accused by some of employing a double standard for letting the site recover so quickly compared to others.

    Google’s Matt Cutts had some comments about the Pando article on Hacker News. He wrote:

    It’s a shame that Pando’s inquiry didn’t make it to me, because the suggestion that Google took action on vivint.com because it was somehow related to Nest is silly. As part of a crackdown on a spammy blog posting network, we took action on vivint.com–along with hundreds of other sites at the same time that were attempting to spam search results.

    We took action on vivint.com because it was spamming with low-quality or spam articles…

    He listed several example links, and continued:

    and a bunch more links, not to mention 25,000+ links from a site with a paid relationship where the links should have been nofollowed.
    When we took webspam action, we alerted Vivint via a notice in Webmaster Tools about unnatural links to their site. And when Vivint had done sufficient work to clean up the spammy links, we granted their reconsideration request. This had nothing whatsoever to do with Nest. The webspam team caught Vivint spamming. We held them (along with many other sites using the same spammy guest post network) accountable until they cleaned the spam up. That’s all.

    He said later in the thread that Google “started dissecting” the guest blog posting network in question in November, noting that Google didn’t acquire Nest until January. In case you’re wondering when acquisition talks began, Cutts said, “You know Larry Page doesn’t have me on speed dial for companies he’s planning to buy, right? No one involved with this webspam action (including me) knew about the Nest acquisition before it was publicly announced.”

    “Vivint was link spamming (and was caught by the webspam team for spamming) before Google even acquired Nest,” he said.

    Robinson, in a follow-up article, takes issue with Cutts calling Pando’s reporting “silly,” and mockingly says Cutts “wants you to know Google is totally transparent.” Here’s an excerpt:

    “It’s a shame that Pando’s inquiry didn’t make it to me,” Cutts writes, insinuating we didn’t contact the company for comment.

    Pando had in fact reached out to Google’s press team and consulted in detail with the company spokesperson who was quoted in our story. It is now clear why Google didn’t pass on our questions to Cutts.

    He goes on to say that Cutts’ assessment of Vivint’s wrongdoing is “exactly what we described in our article — no one is disputing that Vivint violated Google’s search rules.” He also calls Cutts’ comments “a slightly simplistic version of events, given the months-long frustration Vivint spoke of in trying to fix the problem.”

    Robinson concludes the article:

    The point of our reporting is to highlight the unusual severity of the punishment (locked out for months, completely delisted from results until this week) given Vivint’s relationship to a Google-owned company and the lack of transparency Google offers in assisting offending sites. Multiple sources at Vivint told us that the company was told that it had “unnatural links” but was left to guess at what these were, having to repeatedly cut content blindly and ask for reinstatement from Google, until it hit upon the magic recipe.

    To these charges, Cutts has no answer. That’s a shame.

    Now, I’m going to pull an excerpt from an article of my own from November because it seems highly relevant here:

    Many would say that Google has become more transparent over the years. It gives users, businesses and webmasters access to a lot more information about its intentions and business practices than it did long ago, but is it going far enough? When it comes to its search algorithm and changes to how it ranks content, Google has arguably scaled back a bit on the transparency over the past year or so.

    Google, as a company, certainly pushes the notion that it is transparent. Just last week, Google updated its Transparency Report for the eighth time, showing government requests for user information (which have doubled over three years, by the way). That’s one thing. For the average online business that relies on Internet visibility for customers, however, these updates are of little comfort.

    A prime example of where Google has reduced its transparency is the monthly lists of algorithm changes it used to put out, but stopped. Cutts said the “world got bored” with those. Except it really didn’t as far as we can tell.

    Image via YouTube

  • Google Expands App Indexing Into More Languages

    Google launched app indexing globally in English a couple months back after testing it since November. Now, they’re expanding it into more languages.

    The feature enables Google to deliver in-app content in search results on mobile devices (specifically Android devices for now). For example, if you search for “Dee Barnes,” you might get a result from Wikipedia. With app indexing, Google will give you the option to open the app from the result as opposed to going to a mobile web version.
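
    For context, app indexing works by having the publisher declare a deep link from each web page into the corresponding screen of the app. The snippet below is a hedged illustration of that idea; the android-app:// URI scheme follows Google’s developer guidelines of the era, but the package name, scheme, and path here are placeholders rather than any real publisher’s values.

    ```python
    # A hedged illustration of the annotation app indexing relies on: the web
    # page declares an "alternate" deep link pointing into the Android app.
    # The android-app://{package}/{scheme}/{host_path} URI format reflects
    # Google's developer guidelines of the time; the package name and URL
    # below are placeholders, not a real publisher's values.

    def app_indexing_link_tag(package: str, scheme: str, host_path: str) -> str:
        deep_link = f"android-app://{package}/{scheme}/{host_path}"
        return f'<link rel="alternate" href="{deep_link}" />'

    if __name__ == "__main__":
        print(app_indexing_link_tag("com.example.android", "http", "example.com/gizmos"))
    ```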

    The feature requires app developers to be on board, so Google has announced specific publishers with content in different languages that are now taking advantage of app indexing. These include: Fairfax Domain, MercadoLibre, Letras.Mus.br, Vagalume, Idealo, L’Equipe, Player.fm, Upcoming, Au Feminin, Marmiton, and chip.de.

    Google has also translated its developer guidelines into eight more languages (Chinese – traditional, French, German, Italian, Japanese, Brazilian Portuguese, Russian, and Spanish), so that should help too.

    Google has a form here where you can request to participate in App Indexing. The company notes that it has added a few new apps in the U.S., including Walmart, Tapatalk, and Fancy.

    Google promises a session for developers at Google I/O dedicated to “the future of apps and search”.

    Internet giants like Google and Facebook are working to make mobile apps more web-like. At Facebook’s recent developer conference, the company announced App Links, which enable apps to link to content within other apps.

    Image via Google

  • Press Release Sites Take A Hit In Google’s Rankings

    It would appear that some big name press release distribution sites have taken a hit in Google.

    Sean Malseed at Seer Interactive pointed out that PRWeb lost over half of its traffic, and dropped out of the first 20 Google results for over 8,000 keywords, based on data from SEMrush.

    “It looks like there’s guilt by association, as well,” writes Malseed. “Bloomberg, who partners with the press release agencies to disseminate releases, also took a huge hit.”

    Barry Schwartz took things a step further and looked at Searchmetrics data for PR Newswire, PRWeb, BusinessWire, and PRLog, each of which took hits after Google released Panda 4.0. It’s unclear whether this was an effect of that Panda update or something else Google did.

    Last summer, Google updated its guidelines for what it considers link schemes. This included, “Links with optimized anchor text in articles or press releases distributed on other sites.”

    In a Webmaster hangout, Google’s John Mueller said Google wants all links in press releases to be nofollowed, and press releases should be treated like advertisements. He indicated that SEOs were using press releases more for search purposes.
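
    As a practical illustration of Mueller’s advice, here is a naive sketch, written by us, that adds rel="nofollow" to links in press release HTML. A real implementation should use a proper HTML parser; the regex and sample markup are simplifications.

    ```python
    import re

    # A naive sketch of the advice above: make sure links placed in a press
    # release carry rel="nofollow" so they pass no link equity. This is our own
    # illustration; production code should use a real HTML parser, not a regex.

    def nofollow_links(press_release_html: str) -> str:
        """Add rel="nofollow" to <a> tags that don't already declare a rel attribute."""
        def add_rel(match: re.Match) -> str:
            tag = match.group(0)
            if "rel=" in tag:
                return tag  # leave explicitly set rel values alone
            return tag[:-1] + ' rel="nofollow">'
        return re.sub(r"<a\s[^>]*>", add_rel, press_release_html)

    if __name__ == "__main__":
        snippet = '<p>Try our <a href="https://example.com/widgets">best cheap widgets</a> today.</p>'
        print(nofollow_links(snippet))
    ```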

    The press release distributors have not been shy about promoting SEO value either.

    A couple years ago, BusinessWire launched an SEO-enhanced platform after patenting its SEO strategy. Google is literally running an ad from PRWeb (pictured at the top) for press release SEO right now.

    Images via PRWeb, Google

  • Is Google’s Panda Update Helping Small Businesses?

    Early last week, Google pushed out a couple of big algorithm updates: a new version of the so-called “Payday Loans” update and a new generation of the famous/infamous Panda update. Google has been talking up the latter for a while, saying that it would benefit smaller sites and businesses, and be gentler overall. Has it lived up to this promise?

    Are you a small business affected by Google’s Panda update? How has it impacted your site? Let us know in the comments.

    Google’s Matt Cutts spoke at the Search Marketing Expo in March, saying that Google was working on the “next generation” of Panda, which would be softer and more friendly to small sites and businesses. Barry Schwartz, who was in attendance, recapped what he said:

    Cutts explained that this new Panda update should have a direct impact on helping small businesses do better.

    One Googler on his team is specifically working on ways to help small web sites and businesses do better in the Google search results. This next generation update to Panda is one specific algorithmic change that should have a positive impact on the smaller businesses.

    Interestingly, we do seem to be seeing more people claiming they’ve done well with the latest Panda update compared to past updates. If it’s really helping sites this much, that bodes well for the future, because it looks like whatever Google has done with Panda will be carried forward for the foreseeable future.

    PerformanceIN says it’s helping smaller affiliate sites. Sylvia Nankivell writes:

    In the past, Google’s updates may have felt somewhat unjust to some smaller affiliate sites, and there has been much talk of the magical protection of the big brand. I have heard complaints from affiliates with pages of in-depth, rich content, losing out to big brands with a page containing only a short sentence on it.

    Perhaps this new Panda 4.0 update is in response to these sorts of complaints. It seems that now, big name brands, as well as the smaller businesses, need to consider how information rich all of their pages and directories are. If they don’t, then they are in danger of joining the Panda 4 ‘losers list’.

    We haven’t heard about any planned layoffs from the latest update yet, which is a good sign (though we recently heard about layoffs from an update that took place over a year and a half ago).

    Some sites seem to be making recoveries with Panda 4.0 after being hit by previous Panda updates.

    One of our readers commented this week, “My site was hit by the first ever Panda update and only just recovered from last week’s update. So this weaker Panda is confusing..is my content weak but you’re letting me off or was the algo wrong on the first place?”

    In case you missed it, SearchMetrics recently put out its obligatory Panda winners and losers lists for 4.0. These things are never a hundred percent accurate, but they do give you an idea of some sites that saw significant movement when the update was rolled out. eBay was among the top losers, but that ended up being a manual penalty rather than Panda, apparently.

    Named winners include Glassdoor.com, emedicinehealth.com, medterms.com, yourdictionary.com, shopstyle.com, zimbio.com, myrecipes.com, couponcabin.com, buzzfeed.com, consumeraffairs.com, wordpress.com, thinkexist.com, onhealth.com, alternativeto.net, whosdatedwho.com, reverso.net, wikimedia.org, dogtime.com, findthebest.com, eatingwell.com, quotegarden.com, goodhousekeeping.com, everydayhealth.com, simplyhired.com, momswhothink.com, similarsites.com, southernliving.com, theknot.com, allaboutvision.com, openculture.com, babyzone.com, tasteofhome.com, gotquestions.org, movie4k.to, wmagazine.com, ycharts.com, historyplace.com, rcn.com, salary.com, skpdic.com, mediawiki.org, oodle.com, abbreviations.com, homes.com, spokeo.com, hlntv.com, sparkpeople.com, hayneedle.com, and emedtv.com.

    It’s a pretty interesting range of types of sites. It’s good to know that BuzzFeed has won not only the Facebook Panda update but also the Google Panda update.

    Search Engine Roundtable recently ran a poll asking how Panda 4.0 impacted readers’ sites. Over 1,200 people responded. Over 15% said they had recovered from a previous Panda penalty. Over 19% said their rankings increased, but that they were never hurt by Panda. Over 23% said their rankings remained the same, but they were never previously hurt by Panda. Nearly 27% said they were never previously hurt by Panda, but saw their rankings decrease this time. About 11% said they didn’t recover from a previous Panda penalty.

    The poll doesn’t take into account business size, but it’s probably safe to assume that a good amount of those who participated are from or represent small businesses.

    Do you think Panda 4.0 is good for small sites? Let us know in the comments.

    Image via YouTube

  • Matt Cutts Talks Google Link Extraction And PageRank

    In a new video, Matt Cutts, Google’s head of webspam, discussed how Google views two links with different anchor text on one page pointing to the same destination, and how that affects PageRank.

    The explanation is Cutts’ response to the following submitted question:

    What impact would two links on a page pointing to the same target, each using different anchor text, have on the flow of PageRank?

    He said, “This is kind of an example of what I think of as dancing on the head of a pin. I’ll try to give you an answer. If you’re telling me that the most important thing for your SEO strategy is knowing what two links from one page do – you know, I understand if people are curious about it – but you might want to step back, and look at the higher mountain top of SEO, and your SEO strategy, and the architecture of your site, and how is the user experience, and how is the speed of the site, and all of that sort of stuff because this is sort of splitting hairs stuff.”

    “So, with that said,” he continued, “looking at the original PageRank paper, if you had two links from one page to another page, both links would flow PageRank, and so the links – the amount of PageRank gets divided evenly (in the original PageRank paper) between all the outgoing links, and so it’s the case that if two links both go to the same page then twice as much PageRank would go to that page. That’s in the original PageRank paper. If they have different anchor text, well that doesn’t affect the flow of PageRank, which is what your question was about, but I’ll go ahead and try to answer how anchor text might flow.”

    “So we have a link extraction process, which is we look at all the links on a page, and we extract those, and we annotate or we fix them to the documents that they point to. And that link extraction process can select all the links, or it might just select one of the links, so it might just select some of the links, and that behavior changes over time. The last time I checked was 2009, and back then, we might, for example, only have selected one of the links from a given page. But again, this is the sort of thing where if you’re really worried about this as a factor in SEO, I think it’s probably worthwhile to take a step back and look at high order bits – more important priorities like how many of my users are actually making it through my funnel, and are they finding good stuff that they really enjoy? What is the design of my homepage? Do I need to refresh it because it’s starting to look a little stale after a few years?”

    There’s that mention of stale-looking sites again.
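
    For those still curious about the hair-splitting, here is a toy calculation of the “divided evenly between all the outgoing links” model Cutts describes from the original PageRank paper. It is arithmetic for illustration only, not a full PageRank computation, and the page names are made up.

    ```python
    from collections import Counter

    # Toy arithmetic based on the original PageRank paper's model, as Cutts
    # describes it: a page's passable PageRank is split evenly across its
    # outgoing links, so a target linked twice receives two shares. Damping and
    # iteration are ignored; this is not a full PageRank computation.

    def pagerank_shares(page_rank: float, outgoing_links: list[str]) -> dict[str, float]:
        per_link = page_rank / len(outgoing_links)
        shares = Counter()
        for target in outgoing_links:
            shares[target] += per_link
        return dict(shares)

    if __name__ == "__main__":
        # Page A passes PageRank 1.0 through four outgoing links; two of them
        # point to page B (with different anchor text, which doesn't change this math).
        print(pagerank_shares(1.0, ["B", "B", "C", "D"]))  # {'B': 0.5, 'C': 0.25, 'D': 0.25}
    ```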

    The main point here is that you should spend less time nitpicking small things like how much PageRank is flowing from two links on a single page, and what anchor text they’re using, and focus on bigger-picture things that will make your site better. This is pretty much the same message we always hear from the company.

    Perhaps that’s the real reason that Google stopped putting out those monthly lists of algorithm changes.

    Image via YouTube

  • Bing Is Shutting Down Its Webmaster Forums

    Microsoft announced that it is shutting down Bing’s Webmaster Forums as they have apparently not grown in a meaningful way.

    As of the end of this month or early next month, the Bing Webmaster Community Forum will be no more.

    Senior product manager Duane Forrester wrote on the Bing webmaster blog, “Over the last few years, we’ve had our Webmaster forums up and running. They’ve been around a while now in a few iterations, and like any community, the goal is to grow it to be vibrant and engaging. To foster the deep involvement of experts who help others, creating a community that contributes to improvements and makes its own gravity. There comes a time, however, when you sometimes need to re-evaluate, and once in a while, regroup.”

    Bing will be directing users to its Help & How To section and its webmaster blog, which will have weekly posts with open comments.

    Those simply having issues with Webmaster Tools are told to use email support.

    For everything else, Forrester directs people to WebmasterWorld and similar forums. Bing itself participates at WMW, so that’s probably your best bet for Bing-related threads.

    Image via Bing

  • Google’s eBay Hit Was Apparently Manual

    As reported last week, eBay appeared among the biggest apparent losers from Google’s most recent Panda update, which is supposed to be softer than past updates, and make things a little better for smaller sites and businesses. According to reports, it turns out eBay was hit by a manual penalty rather than Panda.

    Just after the update was announced, Moz spotted eBay’s loss of search rankings for numerous keywords and phrases. The main eBay subdomain fell out of Moz’s “Big 10,” which is its metric of the ten domains with the most real estate in the top 10 search results.

    “Over the course of about three days, eBay fell from #6 in our Big 10 to #25,” wrote Dr. Peter J. Meyers at Moz. “Change is the norm for Google’s SERPs, but this particular change is clearly out of place, historically speaking. eBay has been #6 in our Big 10 since March 1st, and prior to that primarily competed with Twitter.com for either the #6 or #7 place. The drop to #25 is very large. Overall, eBay has gone from right at 1% of the URLs in our data set down to 0.28%, dropping more than two-thirds of the ranking real-estate they previously held.”

    He went on to highlight specific key phrases where eBay lost rankings. It lost two top ten rankings for three separate phrases: “fiber optic christmas tree,” “tongue rings,” and “vermont castings”. Each of these, according to Meyers, was a category page on eBay.

    eBay also fell out of the top ten, according to the report, for queries like “beats by dr dre,” “honeywell thermostat,” “hooked on phonics,” “batman costume,” “lenovo tablet,” “george foreman grill,” and many others.

    Then Searchmetrics put out its regular lists of winners and losers from the Panda update, and eBay was in the top 2 for losers.

    Late on Friday, however, just as much of the U.S. was transitioning into a three-day weekend, reports emerged that eBay was actually hit by a manual penalty rather than Panda. Jason Del Rey at Re/code wrote:

    As it turns out, Google did in fact penalize eBay and knock a whole bunch of its pages off Google’s search results, but it wasn’t part of Panda, according to a person familiar with the situation. Rather, it was part of a so-called “manual action” that Google took against eBay early this week; the pages weren’t removed as part of the Panda rollout, which affects entire sites and not individual pages.

    Neither company would comment.

    Del Rey points to a blog post at RefuGeeks, which shows that the pages affected were category pages that users are unlikely to reach by navigating the site, and were designed specifically for search engines, which is precisely the kind of thing that will get you penalized by Google.

    And that appears to be what happened. It looks like eBay has been removing the pages. A specific URL the post points to is no longer returning a page.

    Image via Wikimedia Commons

  • Google Gives Webmasters New Page Rendering Tool

    Last week, Google named some JavaScript issues that can negatively impact a site’s search results, and said it would soon be releasing a tool to help webmasters better understand how it renders their site. The tool has now been announced.

    It comes in the form of an addition to the Fetch as Google tool, which lets you see how Googlebot renders a page. Submit a URL with “Fetch and render” in the Fetch as Google feature under Crawl in Webmaster Tools.

    “In order to render the page, Googlebot will try to find all the external files involved, and fetch them as well,” writes Shimi Salant from Google’s Webmaster Tools team. “Those files frequently include images, CSS and JavaScript files, as well as other files that might be indirectly embedded through the CSS or JavaScript. These are then used to render a preview image that shows Googlebot’s view of the page.”

    “Googlebot follows the robots.txt directives for all files that it fetches,” Salant explains. “If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.”

    Google recommends making sure Googlebot can access any embedded resource that contributes to your site’s visible content or layout in any meaningful way to make it easier to use the new tool. You can leave out social media buttons, some fonts and/or analytics scripts, as they don’t “meaningfully contribute”. Google says these can be left disallowed from crawling.
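
    As a rough companion to the new tool, here is a minimal sketch, written by us, that checks whether embedded resources are blocked for Googlebot by robots.txt using Python’s standard library; the URLs are placeholders.

    ```python
    from urllib.robotparser import RobotFileParser

    # A minimal sketch (ours, not the Webmaster Tools feature itself) of the
    # check described above: confirm that the CSS, JavaScript, and image files
    # a page embeds are not disallowed for Googlebot in robots.txt.

    def blocked_resources(robots_url: str, resource_urls: list[str],
                          agent: str = "Googlebot") -> list[str]:
        parser = RobotFileParser()
        parser.set_url(robots_url)
        parser.read()  # fetches and parses the live robots.txt
        return [url for url in resource_urls if not parser.can_fetch(agent, url)]

    if __name__ == "__main__":
        embedded = [
            "https://example.com/assets/site.css",
            "https://example.com/assets/app.js",
            "https://example.com/images/hero.jpg",
        ]
        for url in blocked_resources("https://example.com/robots.txt", embedded):
            print("Blocked for Googlebot:", url)
    ```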

    Image via Google