WebProNews

Tag: SEO

  • This Google Stat Has Major Implications For Your Site

    This Google Stat Has Major Implications For Your Site

    In May, Google casually noted in a blog post that mobile searches have overtaken desktop searches in ten countries including the United States and Japan. It didn’t elaborate on what the other countries were.

    Do you get more mobile traffic than desktop traffic? What’s the split like? Discuss.

    The following month, Google mentioned another country by name, adding the United Kingdom to the list. Matt Jackson at SocialMediaToday reported at the time:

    During a presentation at London Tech Week, Google’s Eileen Naughton said that not only are more searches conducted on UK mobile devices than on UK desktops, but that more UK YouTube searches were also conducted on mobile devices.

    The YouTube part is interesting as well, as Google hadn’t mentioned that before when talking about this subject, at least to my knowledge.

    The growing mobile search trend obviously illustrates why Google has put so much emphasis on websites being mobile-friendly and begun taking app indexing into account when ranking search results.

    The world is going mobile, and websites that don’t follow are going to be left behind. A recent study found that the mobile-friendly update bumped down about half of pages it threatened to, but it’s still early days. It’s not as if mobile-friendliness is going to become less of a factor going forward.

    Last week, Search Engine Land spoke with Google, and was told that mobile searches have now exceeded desktop searches worldwide. In other words, more than 50% of Google’s searches happen on mobile.

    Danny Sullivan wrote, “It’s important to note that this doesn’t mean that desktop searches have diminished. Stats on desktop search from comScore routinely show the overall amount has risen from month to month. Rather, it’s that mobile searches have been a growing new segment that have caught up and now overtaken desktop search. On the whole, desktop search has grown. As a percentage, it has dropped.”

    Google’s John Mueller said in a Google+ post (via Search Engine Roundtable), “More than half of Google’s searches are now coming from mobile. If you haven’t made your site (or your client’s sites) mobile-friendly, you’re ignoring a lot of potential users.”

    On the app indexing front, Google has also indexed over 100 billion pages within apps so far, and it’s only really getting started with this on iOS.

    Yahoo’s Flurry recently released a report looking at people’s addiction to their mobile devices. In short, addiction is on the rise.

    “On June 29th Bank of America released the findings of its second annual report on Consumer Mobility,” said Simon Khalaf, SVP of Publisher Products at Flurry. “The report showed that the US population is perpetually plugged-in with 71% of those surveyed disclosing they actually sleep with their smartphones. This prompted us to revisit the study we conducted in Q2 of 2014 in which we first uncovered the rise of a new breed of mobile users: the Mobile Addicts.”

    According to the report, worldwide mobile addicts grew 59% over the last year.

    Year over year, the total population of smart devices measured by Flurry grew by 38% from 1.3B to 1.8B. Regular Users (those who use apps between once and 16 times daily) grew by 25% from 784 million to 985 million. Super Users (those who use apps between 16 and 60 times daily) grew 34% from 440 million to 590 million. Mobile Addicts (those who launch applications 60 times or more per day) grew 59% from 176 million to 280 million.

    According to Flurry, if the number of Mobile Addicts were the population of a country, it would be the fourth largest, just behind the United States.

    Flurry shares more analysis on its findings here.

    Related Reading: Will ‘Accelerated Mobile Pages’ Help Google Rankings?

    Is your site in good shape when it comes to reaching mobile users or do you have some work to do? Let us know in the comments.

    Images via Google, Flurry

  • Google Withdraws Previous Crawling Recommendation

    Google Withdraws Previous Crawling Recommendation

    Google announced that it is deprecating its AJAX crawling scheme, and webmasters need to be aware of how things have changed to ensure Google is crawling their site correctly and in the most effective manner possible.

    Does this have any bearing on your site? Let us know in the comments.

    Specifically, Google is no longer recommending the AJAX crawling proposal it made six years ago. That was made at the time to benefit webmasters and users by making content from rich and interactive AJAX-based sites universally accessible through search results. Google said it believed this would “significantly improve the web.”

    In those days (2009), Google was unable to render and understand pages using JavaScript to present content to users, and crawlers couldn’t see any content created dynamically.

    The technology has improved a great deal in six years, as you would probably expect (for perspective, 2009 was the year of the iPhone 3GS).

    Now, as long as you don’t block Googlebot from crawling your JavaScript or CSS files, Google can render and understand your pages as modern browsers do.

    Last year, Google wrote a blog post about this and how it was starting to understand pages better. It also offered some information on things that may lead to a negative impact on search results for your site.

    “If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your site like an average user,” the post, co-written by a trio of Googlers, says. “We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.”
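    To make that concrete, a robots.txt along these lines keeps rendering resources fetchable (the /js/ and /css/ directories are hypothetical; use whatever paths your site actually serves scripts and stylesheets from):

```text
# Let Googlebot fetch the resources it needs to render pages.
User-agent: Googlebot
Allow: /js/
Allow: /css/

# Rules like the following are what the post warns against,
# since they hide rendering resources from the indexing systems:
# User-agent: *
# Disallow: /js/
# Disallow: /css/
```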

    “If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources,” the post continues. “It’s always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can’t execute JavaScript yet.”

    Google also notes that some JavaScript is too complex or arcane for it to execute, which means they won’t be able to render the page fully or accurately. Also, some JavaScript removes content from the page, which prevents Google from indexing it.

    At the time, Google also revealed a new tool in Webmaster Tools (now Search Console) in the form of an addition to the Fetch as Google tool, which lets you see how Googlebot renders a page. Submit a URL with “Fetch and render,” and Google tries to find all the external files involved and fetch them as well. These files include images, CSS and JavaScript files as well as other things that might be indirectly embedded through the CSS or JavaScript. Google uses all of this to render a preview image that shows Googlebot’s view of the page.

    Google updated its technical Webmaster Guidelines about a year ago to recommend against disallowing Googlebot from crawling your site’s CSS or JavaScript files. The company now says that since the assumptions from its 2009 proposal are no longer valid, it recommends following the principles of progressive enhancement, a web design strategy that emphasizes accessibility, semantic HTML markup, and external stylesheet and scripting.

    “For example, you can use the History API pushState() to ensure accessibility for a wider range of browsers (and our systems),” says Google Search Quality Analyst Kazushi Nagayama.
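    As a rough sketch of what Nagayama is describing, a progressively enhanced link can load content via script where the History API is available and fall back to a normal, crawlable page load everywhere else. The URL and the loadProduct() helper below are hypothetical:

```html
<!-- The href is a real, crawlable URL, so crawlers and old browsers
     still get a working page via a full navigation. -->
<a href="/products/42" onclick="return openProduct(this.href)">Product 42</a>
<script>
function openProduct(url) {
  if (window.history && history.pushState) {
    history.pushState({}, '', url); // update the address bar to the crawlable URL
    loadProduct(url);               // hypothetical helper: fetch and render content in place
    return false;                   // suppress the full page load
  }
  return true; // no History API: fall through to the plain link
}
</script>
```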

    Nagayama shares a few Qs and As related to all of this, which should help webmasters better understand the preferred approach:

    Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you’ve deprecated your recommendation?
    A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you’re making the next update for your site. Instead of the _escaped_fragment_ URLs, we’ll generally crawl, render, and index the #! URLs.

    Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
    A: If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.

    Q: I use a JavaScript framework and my webserver serves a pre-rendered page. Is that still ok?
    A: In general, websites shouldn’t pre-render pages only for Google — we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user’s experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.
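    To make the first Q&A above concrete: under the old scheme, a crawler would rewrite a #! (“hashbang”) URL into an _escaped_fragment_ URL before fetching an HTML snapshot from the server. A rough sketch of that mapping, as our own illustration rather than Google’s code:

```javascript
// Sketch of the (now-deprecated) AJAX crawling scheme's URL rewrite:
// http://example.com/page#!key=value
//   -> http://example.com/page?_escaped_fragment_=key%3Dvalue
function toEscapedFragment(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // not a hashbang URL; nothing to rewrite
  var base = url.slice(0, i);
  var fragment = encodeURIComponent(url.slice(i + 2));
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}
```

    With the deprecation, servers no longer need to answer these rewritten URLs with snapshots; per the Q&A, Google now crawls, renders, and indexes the #! URLs directly.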

    If all of this is insufficient in helping you get on the right track, Google suggests posting questions on Nagayama’s blog post or in the Google Webmaster Help forum.

    Are you already doing things the right way, or do you need to make changes based on what Google had to say this week? Let us know in the comments.

  • Google Is About To Let You Have Your iOS App Content Indexed

    Google Is About To Let You Have Your iOS App Content Indexed

    If you have an iOS app, you’ll soon be able to start seeing its content appear in Google search results on iOS devices just like web content.

    Google has been doing the app indexing thing for quite some time at this point, but it’s slowly been expanding the capabilities and giving developers and app owners more to take advantage of.

    Earlier this year, Google announced that it would use app indexing as a ranking signal for search results on Android, and later announced that it was starting the app indexing process on iOS. That was limited. Now, they’re opening things up to any iOS app developer.

    Last week, Google revealed that it was adding a new ranking factor for apps that use the App Indexing API. They also posted new documentation for iOS app content (h/t: Search Engine Roundtable). The company has since acknowledged this in a Google+ update.

    “Getting your app content found on Google just got easier,” the company says. “App Indexing is now compatible with HTTP deep link standards for iOS 9, as it has been on Android from the beginning. That means that you can start getting your app content into the Search results page on Safari in iOS, simply by adding Universal Links to your iOS app, then integrating with our SDK. With this improvement, we will no longer support new integrations on iOS 7 and iOS 8. Users will start seeing your app content in Safari on iOS at the end of October.”
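    For context, adding Universal Links means serving an apple-app-site-association file (JSON, over HTTPS, from your domain) that maps URL paths to your app. A minimal sketch, with a placeholder team ID, bundle ID, and path pattern:

```json
{
  "applinks": {
    "apps": [],
    "details": [
      {
        "appID": "ABCDE12345.com.example.newsapp",
        "paths": [ "/articles/*" ]
      }
    ]
  }
}
```

    Per the announcement above, the App Indexing SDK integration then comes on top of this file.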

    “And, of course, on Android, you can still get your content into the Search results page, autocompletions, and Now on Tap by adding HTTP deep links and integrating with the App Indexing API,” it added.

    Google Now On Tap recently began its initial roll-out on certain devices. The feature should present some new opportunities for businesses to gain more visibility from their app indexing efforts.

    More on getting your app content seen in Google Search here.

    Image via Google

  • Don’t Do This If You’ve Been Hit By Google’s Panda Update

    Don’t Do This If You’ve Been Hit By Google’s Panda Update

    Google’s Panda update has been around since February 2011 and continues to wreak havoc on websites when it finds content issues. Sometimes it’s not clear that the site suffering Panda’s wrath actually deserved to be algorithmically penalized. Either way, some sites have been hit really hard by it over the years, and one tactic that has sometimes been employed has been to delete content. Don’t do that.

    Do you think deleting content is a good idea when you’re trying to recover from Panda? Share your thoughts in the comments.

    Perhaps the most famous victim of Panda over the years has been Demand Media – particularly its eHow property. The site was largely considered to be a content farm, which is precisely the type of site expected to be targeted by the algorithm update. Even though eHow escaped Panda when it first launched, the algorithm eventually caught up to it in a big way. In an effort to recover its Google traffic, Demand Media redesigned the site and deleted a ton of content of questionable quality. In 2012, it looked like things were looking good again for the site, but that didn’t last. The company has since had to become less reliant on Google as such a big chunk of its traffic.

    These days, you can Google “how to fix a toilet,” which would be a prime example of the type of query you might legitimately expect eHow to rank for, and eHow is nowhere in the top results.

    Google is now flat out saying that you might not want to delete content in response to Panda. Google’s Gary Illyes said on Twitter, “We don’t recommend removing content in general for Panda, rather add more highQ stuff.”

    Search reporter Barry Schwartz, who first reported on Illyes’ comments, says, “Now Gary is saying generally it does not make sense to remove content. Generally you should improve your site. But the sites that are hit badly by Panda, often have serious structural issues with the site where they can consolidate content and remove a lot of the pages. I’d say generally, removing or consolidating content is the approach most SEOs take to tackle Panda issues. But Gary is saying otherwise.”

    Illyes went on in a series of tweets to say, “We see way too many people cut the good [content]. Careful what you trim…use search analytics: look for pages that don’t satisfy users’ information need for the queries they rank for…Thin content: make it better, make it…thick and ADD more highQ stuff….Don’t remove content someone might find useful…What you really need is content created with care for the users, that’s it.”

    In other words, just avoid getting rid of stuff and focus on improving the stuff you already have. Depending on how big your site is, that could be easier said than done, but that is the guidance you’re getting right from Google itself.

    Illyes did have additional advice at PubCon. Jennifer Slegg reports (via Search Engine Roundtable):

    While responding, Illyes did make an interesting recommendation for those who are removing thin content for Panda reasons. Rather than simply use a 404 or a 410, he strongly recommends that webmasters should use noindex on those pages, ensure those pages are listed in the sitemap or add them to the sitemap and then submit the sitemap to Google.
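    In practice, that advice translates to something like the following (the URL below is a placeholder):

```html
<!-- In the <head> of each thin page, instead of returning a 404 or 410: -->
<meta name="robots" content="noindex">

<!-- And in sitemap.xml, keep (or add) the page so Googlebot recrawls it
     promptly and sees the noindex: -->
<url>
  <loc>http://www.example.com/thin-page.html</loc>
</url>
```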

    Of course Google has a list of 23 questions to ask yourself about your site and content when it comes to high quality versus thin:

    1. Would you trust the information presented in this article?

    2. Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?

    3. Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?

    4. Would you be comfortable giving your credit card information to this site?

    5. Does this article have spelling, stylistic, or factual errors?

    6. Are the topics driven by genuine interests of readers of the site, or does the site generate content by attempting to guess what might rank well in search engines?

    7. Does the article provide original content or information, original reporting, original research, or original analysis?

    8. Does the page provide substantial value when compared to other pages in search results?

    9. How much quality control is done on content?

    10. Does the article describe both sides of a story?

    11. Is the site a recognized authority on its topic?

    12. Is the content mass-produced by or outsourced to a large number of creators, or spread across a large network of sites, so that individual pages or sites don’t get as much attention or care?

    13. Was the article edited well, or does it appear sloppy or hastily produced?

    14. For a health related query, would you trust information from this site?

    15. Would you recognize this site as an authoritative source when mentioned by name?

    16. Does this article provide a complete or comprehensive description of the topic?

    17. Does this article contain insightful analysis or interesting information that is beyond obvious?

    18. Is this the sort of page you’d want to bookmark, share with a friend, or recommend?

    19. Does this article have an excessive amount of ads that distract from or interfere with the main content?

    20. Would you expect to see this article in a printed magazine, encyclopedia or book?

    21. Are the articles short, unsubstantial, or otherwise lacking in helpful specifics?

    22. Are the pages produced with great care and attention to detail vs. less attention to detail?

    23. Would users complain when they see pages from this site?

    These have been around for years, but it never hurts to take a look again to remind yourself just what Google is looking for when it evaluates quality.

    The latest Panda refresh is still rolling out. Illyes appeared at SMX East last week and said this is the case. Google always said it would be a slow roll-out, and it wasn’t kidding. It began in mid-July. If you were waiting to recover after being hit by a previous Panda update/refresh, you may still have a shot (assuming that you’ve taken steps to fix the problems that got you hit by the update in the first place).

    Penguin is expected to return before the end of the year.

    After seeing these comments from Google, do you still think there’s a case for removing content to recover from Panda? Share your thoughts in the comments.

  • Google Gives Updates On Panda, Penguin

    Google Gives Updates On Panda, Penguin

    Wondering what’s going on with Google’s Panda and Penguin updates? Well, not much has changed, but the company did address both at an industry conference.

    At SMX East, Google’s Gary Illyes reportedly said that the last Panda refresh is STILL rolling out. They had said up front that it was going to be a slow roll-out, and they were not kidding.

    It began in mid-July, and now it’s October, still going. So if you were still waiting to recover after being hit by a previous Panda update/refresh, you may still have a shot (assuming that you’ve taken steps to fix the problems that got you hit by the update in the first place).

    Penguin, on the other hand, has not come back around yet, though it shouldn’t be too much longer. Google’s John Mueller said last week that he expected it to happen before the end of the year.

    According to Search Engine Land, Illyes said it will be in the “foreseeable future” and that he “hopes” it will be before the end of the year. He also reiterated that it will be the real-time version that Google has been talking about for quite some time.

    Image via Google

  • Google Reportedly Adds Another App Indexing-Related Ranking Signal

    Google Reportedly Adds Another App Indexing-Related Ranking Signal

    Back in February, Google announced two major mobile ranking signals. The one that got the most attention, along with the nickname “Mobilegeddon”, was a site’s “mobile-friendliness”. The other one was app indexing: Google would show content from apps to users when they had the apps installed.

    READ: How To Set Up App Indexing For Ranking In Google

    In April, Google announced that it had indexed 30 billion links within apps and that it would start showing Android users apps in search results even if the user doesn’t already have the app installed.
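    On the Android side, the deep links in question are declared as intent filters in AndroidManifest.xml, so Google can open app content directly from search results. A sketch along these lines, with placeholder host, path, and activity names:

```xml
<!-- Hypothetical activity handling http://example.com/recipes/* deep links -->
<activity android:name=".RecipeActivity">
  <intent-filter android:label="View recipe">
    <action android:name="android.intent.action.VIEW" />
    <category android:name="android.intent.category.DEFAULT" />
    <category android:name="android.intent.category.BROWSABLE" />
    <data android:scheme="http"
          android:host="example.com"
          android:pathPrefix="/recipes" />
  </intent-filter>
</activity>
```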

    In July, we learned that Etsy sellers were seeing significant benefits from the signal, and they’re surely not alone.

    Now, Google is reportedly adding a new ranking factor for apps that use the App Indexing API. This was announced at the SMX East conference. Barry Schwartz at Search Engine Roundtable reports:

    On the panel, Mariya Moeva from Google announced several things around App Indexing, but got everyone’s attention pretty quickly when she announced there is an additional ranking boost, on top of the original app ranking boost, for using the App Indexing API.

    Why is there a new boost? Mariya explained that with the new API, Google is able to know start and end times of use of the app and pages within the app, amongst other important data points.

    At first, Google only made app indexing available on Android, but in May, they started indexing some iOS app content.

    This week, the company posted new documentation for that, which you can find here.

    Image via Google

  • Study Finds Wikipedia Still Outperforms Google Properties in Google

    Study Finds Wikipedia Still Outperforms Google Properties in Google

    Last month, there were a number of reports about a significant drop in Wikipedia’s Google traffic. A report from SimilarWeb found that Google “stole over 550 million clicks” from Wikipedia in 6 months. According to Search Engine Journal, the site’s organic search traffic from Google dropped 11% from May to July.

    Search Engine Land reported a few days later that Wikipedia co-founder Jimmy Wales had said there was “a long-term issue with decreasing traffic from Google,” but that the SimilarWeb article was a misrepresentation of how Wikipedia actually needs the clicks in question. The article quotes Wales as saying:

    “It is also false that ‘Wikipedia thrives on clicks,’ at least as compared to ad-revenue driven sites… The relationship between ‘clicks’ and the things we care about: community health and encyclopedia quality is not nothing, but it’s not as direct as some think.”

    Wikipedia later released its own report on the subject saying, “No direct data shows a decrease in Google traffic; in fact, direct referrals from Google have been increasing in the last few months, rather than decreasing. However, we have some fuzziness around indirect referrals that cannot be resolved without the participation of Google. We should seek that participation, and work on tracking these metrics in an automated fashion.”

    The report concluded:

    Based on the data we have we can establish that the most obvious avenues for verifying or dismissing SimilarWeb’s claim show no evidence that Google traffic has declined. However, we do not have the data at our end to eliminate all avenues of possibility.
     
    Our next work should be to reach out to Google themselves and talk to them about the data we’re seeing, and to build out infrastructure to begin tracking metrics like this on a consistent and automated basis, rather than relying on costly ad-hoc analysis.

    Now there’s a new report on this subject. This one comes from Stone Temple Consulting, which has recently delivered interesting findings related to Google’s partnership with Twitter and engagement on Google+. They ran an analysis of the rankings data for over 340,000 search queries.

    According to that, Wikipedia is still prominent on the first pages of search results, but has lost many of its #1 and #2 rank positions.

    “Wikipedia still is far more prevalent than Google properties, so we cannot conclude that Google is favoring its own content,” a spokesperson for Stone Temple says.

    It did find that Wikipedia pages are more prominent in commercial queries than for informational ones. It also found the opposite to be true for Google properties.

    Check out that full report here.

  • Google Will Make It Harder For Repeat Offenders To Get Back Rankings

    Google Will Make It Harder For Repeat Offenders To Get Back Rankings

    Google is not cool with you frequently violating their guidelines. Well, obviously they’re not cool with you violating them at all, but they do give second chances. If you screw up and get slapped with a manual penalty, you can fix the issue and file a reconsideration request, and get back into Google’s good graces.

    This will only go so far, however. If Google accepts your reconsideration request, and you keep violating guidelines after that, it’s not going to be so easy (if it was even easy in the first place) the next time.

    That’s the gist of a message Google is sending webmasters. The company’s search quality team wrote a short blog post on the subject, urging webmasters to take its guidelines seriously. It says:

    In order to protect the quality of our search results, we take automated and manual actions against sites that violate our Webmaster Guidelines. When your site has a manual action taken, you can confirm in the [Manual Actions] page in Search Console which part of your site the action was taken and why. After fixing the site, you can send a reconsideration request to Google. Many webmasters are getting their manual action revoked by going through the process.

    However, some sites violate the Webmaster Guidelines repeatedly after successfully going through the reconsideration process. For example, a webmaster who received a Manual Action notification based on an unnatural link to another site may nofollow the link, submit a reconsideration request, then, after successfully being reconsidered, delete the nofollow for the link. Such repeated violations may make a successful reconsideration process more difficult to achieve. Especially when the repeated violation is done with a clear intention to spam, further action may be taken on the site.

    Long story short, don’t violate the Webmaster Guidelines. If you do for some reason, and you get caught, and action is taken against your site, don’t keep violating them once you get your site back in the game. You’ll get caught again, and you’re going to have a much harder time getting your rankings back.

    If you need a refresher, you can find the guidelines here. Read it over, and know what Google considers spam.

    Image via Google

  • Two SEO Reasons To Use Structured Markup

    Two SEO Reasons To Use Structured Markup

    If you’re not using structured markup on your website, you risk losing out on potential clicks and possibly even a rankings boost. It’s time to take this seriously if you haven’t been already.

    Do you use rich snippets? Do you intend to? Let us know in the comments.

    There are a couple of interesting pieces of news to surface over the past week or so that provide reasons to use structured markup.

    Last week, Google implied that it may start using structured markup as a ranking signal in the future. That’s not to say it is now, or that it will next week, but it’s always been a little odd to me that it wasn’t already a signal, and the fact that they are hinting that this might change is probably reason enough to go ahead and utilize it.

    Google Webmaster Trends Analyst John Mueller said in a Google+ hangout (via Search Engine Roundtable), “If we can recognize someone is looking for a car, we can say oh well, we have these pages that are marked up with structured data for a car, so probably they are pretty useful in that regard. We don’t have to guess if this page is about a car.”

    He said it “definitely” makes sense to use structured data, adding, “So I think in the long run, it definitely makes sense to use structured data where you see that as being reasonable on the web site. But I wouldn’t assume that using structured data markup will make your site jump up in rankings automatically. So we try to distinguish between a site that is done technically well and a site that actually has good content. Just because it is done technically well, it doesn’t mean it is as relevant to the users as content that is not done as technically well.”

    In other words, it would be just another signal (obviously).

    The other piece of news is a study from Blue Nile Research (via Search Engine Land), which found that search results with rich snippets can get more clicks at the number 2 position than the number 1 result without them.

    Structured markup can, of course, enable Google to display rich snippets.

    As Google explains:

    Including structured data markup in web content helps Google algorithms better index and understand the content. Some data can also be used to create and display Rich Snippets within the search results. For example, the Rich Snippet at the right (above here) shows search results for a movie, including review stars, an aggregate rating value, and vote count — very useful to anyone searching for information about this movie.

    To make your pages eligible for Rich Snippets in search results, add structured data of the appropriate type to your content. Make sure to comply with any policies that each type has regarding Rich Snippet display.

    Google supports rich snippets for products (price, availability, review ratings), recipes, reviews, events, and software applications (URL, review ratings, and price).
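    For illustration, a product page might mark those properties up with schema.org microdata roughly like this (all names and values are hypothetical):

```html
<!-- Hypothetical product with the review-rating, price, and availability
     properties Google lists above -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 based on
    <span itemprop="reviewCount">87</span> reviews
  </div>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="19.99">$19.99</span>
    <meta itemprop="priceCurrency" content="USD">
    <link itemprop="availability" href="http://schema.org/InStock">In stock
  </div>
</div>
```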

    Blue Nile, which looked at three different scenarios, says, “When we look at the overall click results across all three scenarios, we see that the rich-media‒ enhanced result in position 2 captures an average of 61% of clicks, versus 48% when the same unenhanced result is in position 1, a lift of 13%. The same unenhanced result in position 2 receives 35% of clicks, a lift of 26%.”

    You can find the full study here.

    You can take all of this with a grain of salt if you like, as nothing here says that adding structured markup is going to automatically make you notice a jump in rankings and clicks. At the same time, there aren’t really any drawbacks beyond the time and effort spent adding the markup. It’s only going to be in your best interest.

    Google has a YouTube playlist of instructional videos on rich snippets and structured data, which I have embedded in full below.

    Here’s Google’s resource on structured data, which includes information about how it pertains to rich snippets, breadcrumbs, and sitelinks. The rich snippets-specific section is here.

    Do you see any reason not to add the markup? Do you think it’s helping your own search results? Discuss.

    Images via Google, Blue Nile

  • New Google Video Discusses Getting Started With App Indexing

    New Google Video Discusses Getting Started With App Indexing

    The Google Developers YouTube channel has a series called “Coffee with a Googler,” in which Googlers share insights into various dev topics in a casual interview setting. The latest one focuses on one of Google’s newest mobile ranking signals – App Indexing.

    App indexing gives businesses a new way to get their mobile apps, and the content within them, in front of more users through Google search. It’s something many haven’t done yet, but more will, thanks to the new opportunity Google is giving them.

    Google Developer Advocate Laurence Moroney writes on the Google Developers Blog:

    In this episode of Coffee with a Googler, Laurence meets with Jennifer Lin from the App Indexing team, who demonstrates the possibilities!

    Jennifer shares that Google has indexed over 50 billion deep links into apps, with searches returning these links to users, taking them directly into your app. She shares how the Daily Mail newspaper in the UK saw a 22% boost in search impressions, and app users spent around 20% more time reading and sharing articles when they came in via a deep link from Search. Additionally, Tabelog, a premier restaurant review app and site in Japan, saw an increase of 9.6% in page views within their app, and a 63% increase in Search impressions after adding their app to the index.

    You can find Google’s App Indexing site here. Our ongoing coverage of App Indexing-related news is available here.
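    For a sense of what participation involves: App Indexing pairs deep links declared in the app with annotations on the corresponding web pages, using an `android-app://` URI. A minimal sketch of the web-side annotation — the package name and path below are placeholders, not from any real app:

    ```python
    def app_indexing_link(package, scheme, host_path):
        """Compose the android-app:// deep-link URI and the
        <link rel="alternate"> annotation a web page carries so Google
        can associate the page with the matching screen in the app."""
        uri = "android-app://{}/{}/{}".format(package, scheme, host_path)
        return '<link rel="alternate" href="{}" />'.format(uri)

    # Hypothetical app and page, purely for illustration
    print(app_indexing_link("com.example.recipes", "https",
                            "example.com/recipes/pancakes"))
    ```

    The app side must declare a matching intent filter for the same scheme and host so the link actually opens the right screen.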

    Image via YouTube

  • Google Ranking Signal Comes With A New Caveat

    Google Ranking Signal Comes With A New Caveat

    Earlier this year, Google began taking into account a site’s mobile-friendliness for ranking search results on mobile devices. It provided sites with a helpful mobile-friendly test tool so that they could make sure their pages were up to snuff. If a page passed the test, it was good as far as that particular signal was concerned. Now, there’s a new factor in mobile-friendliness that will cause some pages that previously passed the test to fail.

    Google has been hinting for a while that app-install interstitials would become a negative ranking signal in search results, and now it’s official. Or at least it will be soon.

    Do you use app-install interstitials on your mobile web pages? What do you think about Google’s latest ranking signal? Share your thoughts in the comments.

    The company announced on Tuesday that it is updating its Mobile-Friendly Test to advise sites against showing app install interstitials “that hide a significant amount of content on the transition from the search result page”.

    Google says its Mobile Usability report in Search Console will show webmasters the number of pages across their site that have the issue.

    While the mobile-friendly test tool has already been updated to take the new signal into account, Google will not actually start counting interstitials negatively until November 1, so that should give webmasters enough time to make the updates they need to avoid being algorithmically penalized. Google says:

    After November 1, mobile web pages that show an app install interstitial that hides a significant amount of content on the transition from the search result page will no longer be considered mobile-friendly. This does not affect other types of interstitials. As an alternative to app install interstitials, browsers provide ways to promote an app that are more user-friendly.

    App install banners are supported by Safari (as Smart Banners) and Chrome (as Native App Install Banners). Banners provide a consistent user interface for promoting an app and provide the user with the ability to control their browsing experience. Webmasters can also use their own implementations of app install banners as long as they don’t block searchers from viewing the page’s content.
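    As a rough illustration of the two banner alternatives mentioned above: Safari’s Smart Banner is a single meta tag, while Chrome’s Native App Install Banner is driven by the site’s web app manifest. The App Store ID, URL, and Play package name below are placeholders.

    ```python
    import json

    def smart_banner_meta(app_id, app_argument=None):
        """Render Safari's Smart Banner meta tag. app_id is the numeric
        App Store ID; app_argument optionally deep-links into the app."""
        content = "app-id=" + app_id
        if app_argument:
            content += ", app-argument=" + app_argument
        return '<meta name="apple-itunes-app" content="{}">'.format(content)

    # Chrome's Native App Install Banner is configured in the web app
    # manifest instead: list the Play package under related_applications.
    manifest = json.dumps({
        "prefer_related_applications": False,
        "related_applications": [
            {"platform": "play", "id": "com.example.app"}
        ],
    }, indent=2)

    print(smart_banner_meta("123456789", "https://example.com/article/42"))
    print(manifest)
    ```

    Unlike a full-page interstitial, both of these leave the page content visible, which is exactly the distinction Google’s new signal draws.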

    Keep an eye on the Webmaster Central forum for chatter about this as time progresses.

    Google recently shared results of some internal testing it did with its Google+ app showing that an app install interstitial negatively impacted the user experience. Yelp CEO Jeremy Stoppelman has been very vocal about his opposition to Google’s position on this matter.

    After Google shared its study results, Stoppelman said on Twitter, “Google says stop pushing App downloads yet its own team push apps using same ‘bad’ designs. Is this about protecting consumers or protecting their search monopoly?”

    He later wrote a guest post for Search Engine Land asking the same question. In that, he said, “While many users find apps by browsing inside an app store, another critical way they discover new apps is through mobile search engines, like Google. In this way, mobile search indeed serves a critical function to users: offering a bridge from the less desirable world of mobile Web browsing to a new world inside apps.”

    He went on to discuss how apps threaten Google’s search business. Since then, LinkedIn has been publicly questioning Google’s findings as well. They started off by saying that nobody wants Google+ for one thing.

    “Naturally, an interstitial that interrupts the user experience to promote something that most people don’t want is bound to backfire,” wrote Omar Restom, mobile product manager at LinkedIn. “Google shouldn’t extrapolate based on this one case.”

    “Google admits that it was showing their interstitial even to users who already have the app – that’s bad mojo and fundamentally bad audience targeting,” he added. “Again, Google should only have shown this promo to people who actually want and need the app. The Google+ Team also violated Google’s own SEO policy by showing this interstitial on SEO Pages.”

    He went on to make the case that LinkedIn’s interstitials work better because of better targeting and better creatives. Restom also backed up his argument with some numbers, comparing clickthrough rate, bounce rate and incremental app downloads driven between Google+ and LinkedIn.

    Stoppelman has since tweeted about LinkedIn’s post a couple times and various other articles on the subject.

    VC Bill Gurley tweeted:

    Stoppelman added:

    He also retweeted this:

    And tweeted this:

    Some are questioning why Google is specifically targeting these types of interstitials as opposed to all interstitials (desktop included) that block content.

    What do you think? Is this about user experience or Google’s self-interest? Share your thoughts in the comments.

    Image via Google

  • Webmasters Suspect A Reverse In Google’s Latest Panda Update

    Webmasters Suspect A Reverse In Google’s Latest Panda Update

    Google began rolling out its latest Panda refresh in July, and made it pretty clear that it would be a very slow roll-out. Presumably it’s still in progress, as the company said it would take a few months to complete.

    Google indicated that it was going so slowly because of some technical issues (as opposed to purposely trying to confuse SEOs). As the slow roll-out continues, some webmasters suspect that Google may have reversed the refresh as sites that were seeing a healthy recovery are starting to lose ground again.

    That includes Barry Schwartz’s Search Engine Roundtable, which had fallen victim to Panda for reasons that are still hard to wrap one’s head around. He’s reporting that his own data, in addition to data from “the SEO community at large,” show that some sites that saw a positive impact from the refresh are seeing that impact reversed.

    Schwartz points to forum threads and tweets from others who are seeing this happen with a variety of websites. He also shares a look at his own stats in which you can see where his organic Google traffic went up after the refresh began to roll out, but has been on the decline more recently.

    A decline in Google traffic doesn’t necessarily mean Panda, but given that this has apparently been a common occurrence among sites who had begun to see recovery, it does seem to be related.

    Google recently indicated that it is shifting its infrastructure towards “more continuous changing and gradual rolling out of Panda, incorporated into their core ranking algorithms.”

    They still have work to do, but that’s the goal. It may still be a while before this is really the case.

    Image via Wikimedia Commons

  • Watch This If You Want To Capitalize On One Of Google’s Newest Ranking Signals

    Watch This If You Want To Capitalize On One Of Google’s Newest Ranking Signals

    Earlier this year, Google announced two major pieces of news with regards to how it ranks search results on mobile devices. The one that got the majority of industry coverage was related to the mobile-friendliness of sites. The other was that Google started using app indexing as a ranking signal, and has even expanded that signal since the initial announcement.

    This week, Google hosted an hour-long “office-hours hangout” (via Search Engine Roundtable) answering numerous questions on the topic, so if this is something you want to take advantage of (and if you have an app, why wouldn’t you?), you’ll probably want to give it a watch.

    App indexing and related efforts from Google should mean increased discovery for new and existing apps and the content within them. Etsy sellers appear to already be benefiting.

    You might also want to take a look at this App Indexing talk from Google I/O.

    Image via Google

  • Google Said A Little More About The Panda Refresh

    Google Said A Little More About The Panda Refresh

    Google has said before that its most recent Panda refresh would roll out slowly, taking months instead of days. It is a global refresh.

    It has also said that it is going so slowly because of technical issues rather than trying to confuse webmasters and SEOs.

    Search Engine Land is sharing some new details about the latest Panda after speaking with unnamed sources at Google.

    For one, Google reportedly said it’s shifting its infrastructure towards “more continuous changing and gradual rolling out of Panda, incorporated into their core ranking algorithms.”

    They still have work to do, but that’s the goal. This is pretty much along the lines of stuff we’ve heard before, but this is an update illustrating that it may still be a while before this is really the case.

    They also told SEL that Panda may hit different pages on a site at different times and in different ways despite it being a site-wide action. Some pages could be hurt more than others. Ultimately, you won’t know if you’re being affected by it for sure until the whole, slow roll-out completes. We’re assuming Google will indicate to webmasters when it is actually done. I’m sure people will hound them about it until they spill the beans either way.

    Finally, the company indicated to SEL that the same Panda advice they gave back in 2011 still applies today. That would essentially be a list of 23 questions to ask yourself about the quality of your site.

    None of what Google is saying is likely to make webmasters feel much better about their situation if they’ve already been impacted by the refresh or if they’re trying to recover from the previous one and not seeing results so far. That said, there have been reports of signs of recovery.

    Given that the roll-out is so slow, we don’t really know what types of sites are suffering the most and getting the most benefit out of the refresh. That will have to be evaluated after it completes.

    Image via Wikimedia Commons

  • The Most Expensive Keywords On Google

    The Most Expensive Keywords On Google

    Ever wondered what the most expensive keywords on AdWords are? If you have, you’ll want to take a look at this new infographic from SEMrush, which shows the top 100 in general as well as prices across a few different categories.

    most-expensive-keywords-infographic

    As SocialMediaToday notes, while this is a look at CPC prices for paid search, it also gives you a good idea of how hard it will be to rank for a keyword organically.

    Google recently released its quarterly earnings report. In that, it reported that aggregate paid clicks were up 18% year-over-year and 7% quarter-over-quarter. That’s up 30% year-over-year on Google sites and 9% year-over-year on network members’ sites. Aggregate CPCs were up 11% year-over-year and 4% quarter-over-quarter. CPC on Google sites was up 16% year-over-year.

    Images via Google, SEMrush

  • Here’s Why Google’s Panda Update Is So Slow

    Here’s Why Google’s Panda Update Is So Slow

    Google’s latest Panda refresh is running really slowly, which pretty much adds insult to injury for sites negatively impacted by the previous one, which launched all the way back in October. At that time, Google implied that things would start moving more smoothly.

    After that, it took a surprisingly long time for Google to finally push out this latest refresh. It began rolling out about two weeks ago, but the company also said that it would take months to complete, though it is a global roll-out.

    We now have some insight into just why Google is being so slow with this one. Google Webmaster Trends analyst John Mueller participated in one of his regular webmaster hangouts and explained the technical difficulties associated with the Panda refresh.

    Search Engine Land points to the relevant section of the hour-long video with this transcript:

    This is [Panda rollout] actually, pretty much a similar update to before. For technical reasons we are rolling it out a bit slower. It is not that we are trying to confuse people with this. It is really just for technical reasons. So, it is not that we are crawling slowly. We are crawling and indexing normal, and we are using that content as well to recognize higher quality and lower quality sites. But we are rolling out this information in a little bit more slower way. Mostly for technical reasons. It is not like we are making this process slower by design, it is really an internal issue on our side.

    Ok, well he didn’t really “explain” the technical difficulties so much as explain that there ARE technical issues at the root of why the refresh is so slow.

    I don’t know that any of this will be of much comfort to sites waiting for a chance to regain lost search visibility, but at least it’s something.

    Image via Wikimedia Commons

  • Google Urges You To Reconsider Using These

    Google Urges You To Reconsider Using These

    Google shared some results of some testing it conducted with interstitials. This is of particular interest since the company has indicated using them will likely start impacting your search rankings in a negative way.

    Have you used interstitials on mobile content? Have you noticed any impact on your search visibility that appears to be related? Discuss.

    Google looked at behavior related to its own use of interstitials, specifically with the Google+ mobile site, which utilized one encouraging users to install the app. 9% of visits to its interstitial page resulted in the “Get App” button being pressed. It did note that “some percentage” of users already have the app installed, so they don’t see it in the first place. 69% of visits abandoned the page, it said. They neither went to the app store nor continued to the mobile website. Presumably they were so annoyed they just didn’t feel like going any further.
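    Putting Google’s reported figures together, the share of visitors who actually pressed through the interstitial to the mobile site is the implied remainder — simple arithmetic on the numbers above:

    ```python
    # Google's reported outcomes for visits to the Google+ app-install
    # interstitial page; "continued" is the implied remainder.
    visits = 100.0
    got_app = 9.0                             # tapped "Get App"
    abandoned = 69.0                          # left without app or site
    continued = visits - got_app - abandoned  # pressed through to the site
    print(continued)  # 22.0
    ```

    In other words, for every visitor the interstitial converted, roughly seven and a half walked away entirely.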

    “While 9% sounds like a great CTR for any campaign, we were much more focused on the number of users who had abandoned our product due to the friction in their experience,” Google said. “With this data in hand, in July 2014, we decided to run an experiment and see how removing the interstitial would affect actual product usage. We added a Smart App Banner to continue promoting the native app in a less intrusive way, as recommended in the Avoid common mistakes section of our Mobile SEO Guide. The results were surprising.”

    1-day active users on the mobile site increased by 17% and Google+ iOS native app installs were mostly unaffected (-2%). They didn’t report the Android numbers because most Android devices come with the app pre-installed.

    “Based on these results, we decided to permanently retire the interstitial,” Google said. “We believe that the increase in users on our product makes this a net positive change, and we are sharing this with the hope that you will reconsider the use of promotional interstitials. Let’s remove friction and make the mobile web more useful and usable!”

    Yelp CEO and frequent Google critic Jeremy Stoppelman had this to say about Google’s post:

    Yelp recently put out its own study showing how Google allegedly manipulates search results in its own favor. It claimed Google is “reducing social welfare” with “lower quality results”.

    Interstitials might actually start hurting your Google rankings if they’re not already. Ahead of Google’s mobile-friendly update in April, there was talk around the SEO industry that interstitials could be viewed as hurting the mobile user experience, and could therefore hurt webmasters in rankings as Google started to take the mobile experience into account.

    Last month, Eric Enge at Stone Temple Consulting posted an interview with Google Webmaster Trends Analyst Mariya Moeva. He asked if implementing an interstitial to drive people to sign up for an app would negatively impact mobile rankings, and if that’s something people should stay away from.

    Moeva responded, “Speaking as a user myself, I have yet to see an interstitial that brought me some useful info and was more important than what I was originally trying to do. They’re disruptive and can be frustrating, especially if you show them right on the first page the user ever sees from your site. Apparently, I’m not the only one who thinks so…We see app install interstitials bother users, so we’re looking into ways of addressing that; stay tuned for more news.”

    As Enge pointed out, Google’s Maile Ohye talked a little about this at the recent SMX Advanced search conference. Jennifer Slegg blogged about her comments:

    We have known for a couple of months that Google was planning to add interstitials as a negative ranking factor in an upcoming mobile friendly algo, but it appears that the same will be coming to the regular search results too.

    Maile Ohye from Google warned webmasters at SMX Advanced that they will also be bringing up the issue of interstitials and how pages that use them will be affected. “Interstitials are bad for users, so be aware this is something we are thinking about,” she said.

    She then continued on to say that content hidden behind interstitials would be devalued.

    As Google itself noted in regard to the new test, the company actually says in its Mobile SEO Guide, which it directed webmasters to ahead of the mobile-friendly update, that they should “avoid interstitials.”

    “Many websites show interstitials or overlays that partially or completely cover the contents of the page the user is visiting,” it says. “These interstitials, commonly seen on mobile devices promoting a website’s native app, mailing list sign-up forms, or advertisements, make for a bad user experience. In extreme cases, the interstitial is designed to make it very difficult for the user to dismiss it and view the real content of the page. Since screen real-estate on mobile devices is limited, any interstitial negatively impacts the user’s experience.

    Interestingly enough, Google itself touts “interactive interstitial ads” on its Think with Google site, saying they can “make your brand stand out”. It says they engage more users than basic text or image ads and offer mobile advertisers “great interactivity at eye-catching placements”.

    As I wrote in a previous article on all of this, interstitials can help the viewability problem in advertising, and a lot of sites use them to get sign ups. They’re also often directly linked to monetizing content.

    Should Google penalize sites that use interstitials? Should it depend on the content of that interstitial itself? What do you think? Tell us in the comments.

  • Is Google’s Latest Panda Refresh Too Slow?

    Is Google’s Latest Panda Refresh Too Slow?

    Last Friday, Google confirmed that it had finally begun rolling out a refresh to its infamous Panda algorithm the prior weekend. It’s still rolling out as the company said it would take “a few months” to complete.

    Are Google’s Panda refreshes too slow and far between? Share your thoughts in the comments.

    The update, Google’s Gary Illyes said, affects 2 – 3% of queries.

    The refresh is particularly noteworthy because it took Google so long to actually launch it despite telling webmasters it was trying to do these things more quickly. And that’s important because websites that are impacted have to wait for a refresh for a chance to recover any lost visibility in the search engines.

    The previous Panda refresh came as long ago as October. At that point, Google indicated Panda would pretty much continue indefinitely. It would seem that this wasn’t quite the case, even if that is still Google’s ultimate goal.

    While you may have no sympathy for sites that are negatively impacted by Panda if they’re producing the type of content that the update was designed to target (thin, low-quality content), there are cases when the update also negatively impacts higher quality stuff. In those cases, the long wait for a chance to recover is a little more disturbing.

    A few years ago, for example, IT discussion community DaniWeb was hit by the algorithm despite being a forum with a solid user base and being the kind of site that provides helpful answers for real problems that people have.

    Not even a popular site like Metafilter is immune to Google’s wrath.

    A more recent and maybe even a better example would be Search Engine Roundtable. This is one of the go-to industry blogs for SEO with content exclusively from a long-term veteran of the industry, often with direct quotes from Google itself. It’s not the type of site you would expect Panda to go after, yet last October, that’s exactly what happened. Under the new refresh, the site is indeed recovering even if gradually.

    Ironically, it was Barry Schwartz, the author of that very site, who broke (and confirmed) the news that Google finally launched its refresh.

    While Schwartz is seeing continued improvement since the latest refresh, he’s still waiting to get back to where he was before the Panda hit in October.

    “There is plenty of room to grow but hopefully as the new Panda scores hit all my pages, recovery will come back in line with what it was before Panda 4.1 hit this site,” he writes.

    Despite Schwartz’s reports, some questioned the authenticity of the news of a Panda refresh actually occurring, but Illyes confirmed it directly on Twitter:

    It’s also a global roll-out, by the way:

    With this one being such a slow rollout, it’s hard (and unnecessary) to say who the real winners and losers are at this point. It’s going to be a while before we really have an idea. In fact, Searchmetrics, which regularly publishes lists of apparent winners and losers for major Google updates, says there’s no pattern yet.

    “It is not yet possible to detect a clear pattern regarding the winners and losers in the rankings, nor is it possible to correlate these results with specific aspects of the Panda update,” said Searchmetrics founder Marcus Tober. “In this regard, we expect to see changes in the SERPs over the coming weeks. We will continue to observe the data and keep you updated on this page about the effects of the Panda update.”

    In the meantime, it might not be a bad idea to review Google’s 23 questions you should ask yourself about the quality of the content on your site.

    Google has said in the past that Panda should help small businesses, but it’s unclear to what extent this has actually happened. We tend to hear more about the businesses that get hurt by it (which are sometimes forced to reduce their staff).

    Have you been waiting for this latest Panda refresh to come? Have you been impacted by it one way or another so far? Let us know in the comments.

    Image via Thinkstock

  • Google Gives An Update On How It Handles New gTLDs

    Google Gives An Update On How It Handles New gTLDs

    Google is letting webmasters know about how it handles new top level domains. The company says that as many new generic TLDs become available, it wants to provide some insight into how they’re handled in Google search as it has seen and heard a lot of misconceptions about the topic.

    The most important thing to note is that Google will generally treat the new gTLDs just like any other gTLDs like .com, .org, etc. Keywords in the TLD do not give it any advantage or disadvantage in search, it says.

    IDN TLDs such as .みんな can be used just like any other TLDs, and Google treats the Punycode version of a hostname as equivalent to the unencoded version. This means you won’t have to redirect or canonicalize them separately. Google does say to use UTF-8 for the path and query string in the URL when using non-ASCII characters.
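    To illustrate the Punycode equivalence and UTF-8 encoding Google describes, here is a short sketch using Python’s built-in codecs; the sample path is arbitrary:

    ```python
    from urllib.parse import quote

    # Punycode (ASCII) form of an IDN label via the built-in 'idna' codec.
    # Google treats the "xn--..." form and the unencoded hostname as
    # equivalent, so no separate redirect or canonical is needed.
    label = "みんな"                    # Google's .みんな gTLD
    punycode = label.encode("idna").decode("ascii")
    print(punycode)                     # xn--q9jyb4c

    # Non-ASCII path and query segments should be percent-encoded as UTF-8.
    path = quote("/レビュー", safe="/")
    print(path)
    ```

    Registrars and DNS operate on the Punycode form; the percent-encoded path is what belongs in links and sitemaps when the URL contains non-ASCII characters.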

    Branded TLDs will not be given any more or less weight. Google says they’ll be treated the same as other gTLDs.

    “They will require the same geotargeting settings and configuration, and they won’t have more weight or influence in the way we crawl, index, or rank URLs,” notes Google Webmaster Trends analyst John Mueller.

    Google will treat those that look region-specific (such as .london) the same as any other gTLDs.

    “This is consistent with our handling of regional TLDs like .eu and .asia,” says Mueller. “There may be exceptions at some point down the line, as we see how they’re used in practice. See our help center for more information on multi-regional and multilingual sites, and set geotargeting in Search Console where relevant.”

    Google will still use ccTLDs to help it geotarget websites. It assumes that if the domain utilizes a country’s ccTLD, it’s probably relevant to that country.

    Google has a section in its help center to help webmasters and SEOs move their site from their current domain to a new TLD. If this is something you plan on undertaking, you’ll probably want to take a good look at that.

    Image via Google

  • Google Shows More Local Business Results Than Bing Or Yahoo

    Google Shows More Local Business Results Than Bing Or Yahoo

    According to a new report from local search marketing company BrightLocal, Google is more generous to local businesses than other search engines. Google, it says, displays more local pack results than Yahoo or Bing, and appears most confident in the quality of its local data compared to the other two.

    Still, local businesses are being squeezed out of organic positions by large sites, it finds. Search terms with a “Geo-modifier” return “much more” local results.

    BrightLocal carried out a series of searches on Google, Bing and Yahoo for a range of keywords covering 7 different types of local businesses in various locations. They analyzed local pack results, large website organic results, and local business website organic results. They also analyzed geo-modified terms versus non-geo terms (such as “plumber chicago” vs. “plumber”) and generic terms versus longtail terms versus product/service terms (“plumber” vs. “emergency 24hr plumber” vs “radiator repair”).

    Here’s the chart showing how Google is showing more local pack results for generic terms:

    local-pack

    The edge goes to Yahoo for long tail terms:

    longtail

    For product/services, Google once again takes the cake:

    Product-Search-Terms

    As does it with generic keywords + location, long tail + location, and product + location:

    Keyword-Location

    longtail-Location

    product-Location

    You can find the full report here for more analysis on each element of this as well as a look at leisure-based business keywords and how the search engines show results for these.

    Images via BrightLocal

  • Is This Google Ranking Signal Getting Stronger?

    Is This Google Ranking Signal Getting Stronger?

    Update: Search Engine Roundtable says there was no update on the date Enge points to.

    While we certainly don’t know for sure, there are signs that Google could be turning up the dial on just how impactful mobile-friendliness is as a ranking signal for websites.

    Do you think it should be a greater signal for ranking? Let us know in the comments.

    As you’ll recall, Google launched an update in April, which it had announced months earlier, that was aimed at giving sites that are mobile-friendly (and pass Google’s mobile-friendly test) a boost in search rankings. It was still meant to only be one of many signals Google uses, but a signal nonetheless. Prior to its launch, the update became synonymous with “Mobilegeddon” as webmasters and SEOs braced for a big shake-up in search results.

    Early reports after the launch, however, suggested that the impact may not have been so great after all. Even Google’s own Gary Illyes suggested that the number of affected sites may have been lower because so many sites became mobile-friendly in anticipation of the update.

    One study we recently looked at came from Stone Temple Consulting, which has been doing some great research into how Google works (see our coverage of their studies on Google’s indexing of tweets).

    According to that, nearly half (46%) of non-mobile-friendly URLs that held top 10 spots on April 17 lost ranking, while fewer than 20% gained. Other findings included:

    – For URLs that dropped in ranking, the drop for non-friendly URLs was more pronounced – an average of 2 spots – than for mobile-friendly URLs – average of .25 spots.

    – Another significant effect was that URLs being favored for mobile-friendly sites are often different from the ones that ranked earlier.

    – Overall, the study found a 1.3% increase in mobile-friendly URLs in search results. While this does not approach the impact of Panda or Penguin algorithm updates, this is the first such change by Google, and we expect more changes and an increased impact over time favoring mobile-friendly sites.

    Stone Temple’s Eric Enge concluded in the report, “In summary, I’d suggest that the impact of this release was indeed significantly bigger than originally met the eye. The trade press did not see it as large because of the slow roll out, and the intervening Search Quality Update. In addition, this is likely just the start of what Google plans to do with this algorithm. It is typical for Google to test some things and see how they work. Once they have tuned it, and gain confidence on how the algo works on the entire web as a data set, they can turn up the volume and make the impact significantly higher. It’s my expectation that they will do that. In the long run, don’t be surprised if the impact of this algorithm becomes even greater, and that people will stop debating whether or not it was greater than Panda or Penguin.”

    Enge shared some additional analysis related to the update on Google+ today. He believes he has found a sign that Google may be giving the mobile friendly signal more weight now. Here’s the post:


    It wouldn’t be much of a surprise to see Google giving the signal more weight. They did make a pretty big deal about it ahead of the launch, and gave webmasters and SEOs a great deal of notice in advance. Google has also been talking about how mobile search volumes are overtaking desktop in a number of countries. As that gap continues to widen in favor of mobile, it only makes sense that Google utilize this signal more.

    In related news, Adobe released its Digital Advertising Report for Q2. While it’s only a small section of the broader report, it does look at the impact of Google’s mobile-friendly update. According to that, organic traffic was up to 10% lower among sites with low mobile engagement.

    “While there wasn’t a precipitous drop among non-friendly sites, the effect is pronounced over the 10 weeks after the event,” said Tamara Gaffney, principal at Adobe Digital Index. “Such continued loss of traffic suggests that immediate emphasis would have been placed on paid search as a quick way to recover traffic. But that strategy is not necessarily sustainable.”

    “Brands that neglected to address their mobile Web strategies are seeing mobile advertising via Google’s network delivering less value at a greater cost, with a growing gap between mobile click-through rates (CTRs) and cost-per-clicks (CPCs),” Adobe says. “ADI reports mobile CPCs are up 16%, while CTRs are falling, down 9%.”

    “Increases in CPC stretch marketing budgets due to what is known as click inflation–advertisers have to spend more just to stay even,” added Adobe Digital Index analyst Joe Martin.

    You can read more on the report’s findings, including why Adobe says Google is “losing ground as a marketing vehicle” here. The full report is here.

    Should Google crank up the dial on the mobile-friendly signal? Have you noticed anything on your end to suggest that they’ve done so? Let us know in the comments.