WebProNews

Tag: John Mueller

  • Google: No More Core Updates This Year

    Google: No More Core Updates This Year

    Site admins and SEOs can breathe a sigh of relief, with Google confirming there will be no new core update this year.

    Core updates are changes to the algorithm Google uses for its search rankings. Depending on the amount of change, a core update can wreak havoc on a website’s standings.

    Going into the holiday season, many SEOs and site admins were likely worried about the possibility of another core update before the end of the year, one that could negatively impact their business during the season.

    Google Search Advocate John Mueller has put everyone’s mind at ease in a tweet:

    FWIW, just to be clear, there’s no core update lined up for the rest of the year. That doesn’t mean there won’t be visible changes; the linkspam & HCS updates are still rolling out ( https://developers.google.com/search/updates/ranking ), and, Search continues to reflect changes across the web.

    John Mueller (@JohnMu), December 16, 2022

  • Google Search Algorithm Update on October 16th Looks Like a Big One

    Google Search Algorithm Update on October 16th Looks Like a Big One

    SEO expert Barry Schwartz released a video this morning stating that he sees “lots of signals” that the recent October 16 Google algorithm update was a “big one.” Barry also talked about a Webmaster Hangout this morning with Google’s John Mueller and a popular site owner who lost 60 percent of his traffic overnight. John and his team actually looked into this one in detail but offered no advice except to keep improving the site.

    Barry Schwartz on the Google October 16 algorithm update:

    Although Google won’t confirm it yet, I asked them, there seems to be a Google search algorithm update on October 16th and it looks like a big one. We have lots of signals from both the SEO community as well as the different tracking tools that there was some sort of big algorithm update. It probably was related to tweaking what they have done over the last couple of months.

    John Mueller said that all Google updates impact every single website in the Google index. If your site didn’t see an impact, it was still evaluated by the updated algorithm; the net change was simply zero. It’s not that specific sites were singled out; the overall algorithm changed.

    This morning, John Mueller held the Hangout he does (roughly) weekly, and the owner of a popular website that had lost 60 percent of its organic traffic joined. He had come to John before, and John had asked for detailed examples, which the owner provided.

    The interesting thing is that John explained he actually took those examples to Google’s engineering team, went over them with the engineers, and came back with the answer that the algorithm is working as expected, things change over time, and the owner should keep making his website better.

    Of course, a traffic drop like that is a big blow to a website and to a business owner who has a large staff and a payroll to meet. Listening to that conversation between John and a site owner who lost half of his business or more to the Google algorithm is somewhat heartbreaking.

    Hopefully, he will be able to figure out what to do. John isn’t giving any specific advice beyond making the site better and reading the quality rater guidelines for ideas.

  • Google Will Announce The Long-Anticipated Penguin Update

    Google Will Announce The Long-Anticipated Penguin Update

    Well, here we are halfway through April, and still no sign of that long-anticipated Penguin update. Google has been hinting at rolling this thing out for many months. At one point it was supposed to happen before the end of last year, but it kept getting pushed back. Google just hasn’t been ready to launch it yet.

    Early last month, Google’s Gary Illyes, who had been hinting at date possibilities for quite some time, said he would no longer give any timeframes as he kept being wrong.

    We still don’t have a timeframe, but at least we’ll know when the time finally does come. Google’s John Mueller said in a Google+ hangout (via Search Engine Roundtable):

    I am pretty sure when we start rolling out [Penguin] we will have a message to kind of post but at the moment I don’t have anything specific to kind of announce.

    Waiting on Google to push the new Penguin has grown increasingly frustrating for businesses impacted by the update in the past, who have lost search visibility and traffic and have no way to recover until the next one comes. Once the next one does come, it will supposedly be continuous, meaning that sites will no longer have to wait so long to recover in the future if they make the necessary adjustments.

    Read this interview Illyes did with Stone Temple a while back for more comments on the pending update.

  • Google Announces JSON-LD Support For Reviews and Products Structured Data Markup

    Google Announces JSON-LD Support For Reviews and Products Structured Data Markup

    Google announced JSON-LD support for Reviews and Products structured data markup.

    Webmaster trends analyst John Mueller wrote in a Google+ post (via Search Engine Roundtable) that along with the launch of support for JSON-LD for Reviews and Products structured data markup, they’ve also “cleaned up” some of their application logic.

    “For example, requirements for explicit reviewed item and correct property name values are now enforced. Check your markup in the Structured Data Testing Tool and Search Console Structured Data Dashboard to see if your site is impacted by these changes.”

    Mueller notes that more info about how to format Reviews and Products markup to comply with Google’s validation rules is available here.

    “We hope these changes make it easier to use structured data on your side!” he says.

    Google released the structured data tool to help webmasters author and publish markup on their sites. It provides validation for all Google features powered by structured data and support for markup in JSON-LD (including on dynamic HTML pages).
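
    For illustration, here’s a minimal sketch of what Review markup looks like in JSON-LD, following schema.org conventions; the product, author, and rating values are hypothetical. Note the explicit itemReviewed property, one of the requirements Mueller says is now enforced.

      <!-- Hypothetical example values; embed in the page's HTML. -->
      <script type="application/ld+json">
      {
        "@context": "http://schema.org",
        "@type": "Review",
        "itemReviewed": {
          "@type": "Product",
          "name": "Example Widget"
        },
        "author": {
          "@type": "Person",
          "name": "Jane Doe"
        },
        "reviewRating": {
          "@type": "Rating",
          "ratingValue": "4",
          "bestRating": "5"
        }
      }
      </script>

    A snippet like this can be pasted into the Structured Data Testing Tool to confirm that the reviewed item and property names validate.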

  • What Google Says About SEO in 2016

    What Google Says About SEO in 2016

    SEO is an ever-changing industry as search engines (Google in particular) evolve to some extent every single day. Google makes algorithm changes on a daily basis, and every now and then it makes major changes that cause massive shake-ups in search results as well as SEO strategies.

    What do you expect to change the most about optimizing for Google in 2016? Share your thoughts in the comments.

    Mobile has been a major focal point for Google for years, and in 2015 it was as big a focus as ever. Early in the year, Google announced two significant ranking factors – app indexing and mobile-friendliness – both aimed at improving the mobile experience for users and getting them the content they want/need in the best way possible.

    This will (unsurprisingly) continue to be a major focus of Google’s heading into 2016.

    In a recent webmaster hangout on Google+, Google webmaster trends analyst John Mueller spoke a little about what to expect for SEO in the coming year (via Barry Schwartz).

    The relevant portion of the video begins at about 26 minutes in, though you’ll probably get more out of watching the entire video.

    Mueller answers a question about general SEO tips for 2016 (as transcribed by Schwartz):

    Oh man… I don’t have any magical SEO tips for next year. I can’t tell you about that high ranking meta tag that we’ve been working on [sarcasm].

    But in general, I think, next year you’ll probably hear a lot about from us about AMP, mobile friendly, we’ve been doing over the years. It is still a very big topic and we still see a lot of sites not doing that properly. Those are probably the bigger changes, but other things will definitely happen as well. More information about JavaScript in sites so that we can really figure out how to handle these better in search and make a better recommendation on what you should do or shouldn’t do.

    But past that, of course, high quality content is something I’d focus on. I see lots and lots of SEO blogs talk about user experience, which I think is a great thing to focus on as well. Because that essentially kind of focuses on what we are trying to look at as well. We want to rank content that is useful for them and if your content is really useful for them, then we want to rank it.

    We’ve covered mobile-friendliness a great deal throughout the year, so if this is something you’re still struggling with, as Mueller implies many sites are, I’d encourage you to read back through the content found here.

    AMP of course refers to Accelerated Mobile Pages, a new open source project that is basically Google’s answer to Facebook’s Instant Articles. The project is supported by a number of other internet players including Yahoo, Twitter, LinkedIn, Pinterest, WordPress.com, ChartBeat, Parse.ly, and Adobe Analytics.

    You can read more about this here, but Google recently said it will begin sending search traffic to AMP pages beginning in late February. So that’s one major change you can expect in 2016 (and early 2016 at that).

    Another big SEO change coming in early 2016 is Google’s next Penguin update which is supposed to update in real time moving forward.

    Regarding the JavaScript changes Mueller mentioned, Google recently changed some recommendations related to that, which you can read more about here.

    What would you like to see Google change or do for webmasters and SEOs in 2016? Share your thoughts in the comments.

  • Here’s Why Google’s Panda Update Is So Slow

    Here’s Why Google’s Panda Update Is So Slow

    Google’s latest Panda refresh is running really slowly, which pretty much adds insult to injury for sites negatively impacted by the previous one, which rolled out all the way back in October. When that one rolled out, Google implied that future refreshes would go more smoothly.

    After that, it took a surprisingly long time for Google to finally push out this latest refresh. It began rolling out about two weeks ago, but the company also said that it would take months to complete, though it is a global roll-out.

    We now have some insight into just why Google is being so slow with this one. Google Webmaster Trends analyst John Mueller participated in one of his regular webmaster hangouts and explained the technical difficulties associated with the Panda refresh.

    Search Engine Land points to the relevant section of the hour-long video with this transcript:

    This is [Panda rollout] actually, pretty much a similar update to before. For technical reasons we are rolling it out a bit slower. It is not that we are trying to confuse people with this. It is really just for technical reasons. So, it is not that we are crawling slowly. We are crawling and indexing normal, and we are using that content as well to recognize higher quality and lower quality sites. But we are rolling out this information in a little bit more slower way. Mostly for technical reasons. It is not like we are making this process slower by design, it is really an internal issue on our side.

    Ok, well he didn’t really “explain” the technical difficulties so much as explain that there ARE technical issues at the root of why the refresh is so slow.

    I don’t know that any of this will be of much comfort to sites waiting for a chance to regain lost search visibility, but at least it’s something.

    Image via Wikimedia Commons

  • Google Gives An Update On How It Handles New gTLDs

    Google Gives An Update On How It Handles New gTLDs

    Google is letting webmasters know about how it handles new top level domains. The company says that as many new generic TLDs become available, it wants to provide some insight into how they’re handled in Google search as it has seen and heard a lot of misconceptions about the topic.

    The most important thing to note is that Google will generally treat the new gTLDs just like any other gTLDs like .com, .org, etc. Keywords in the TLD do not give it any advantage or disadvantage in search, it says.

    IDN TLDs such as .みんな can be used just like any other TLDs, and Google treats the Punycode version of a hostname as being equivalent to the unencoded version. This means you won’t have to redirect or canonicalize them separately. Google does say to use UTF-8 for the path & query-string in the URL when using non-ASCII characters.
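
    The Punycode equivalence and the UTF-8 advice are easy to see from code. Here’s a quick Python sketch; the hostname uses Google’s actual .みんな gTLD, but the rest of the URL is a made-up example.

      from urllib.parse import quote

      # An IDN hostname and its Punycode (ASCII) form are the same host.
      host = "example.みんな"
      print(host.encode("idna").decode("ascii"))  # example.xn--q9jyb4c

      # Non-ASCII characters in the path or query-string should be
      # percent-encoded as UTF-8 bytes.
      print(quote("/メニュー", safe="/"))  # /%E3%83%A1%E3%83%8B%E3%83%A5%E3%83%BC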

    Branded TLDs will not be given any more or less weight. Google says they’ll be treated the same as other gTLDs.

    “They will require the same geotargeting settings and configuration, and they won’t have more weight or influence in the way we crawl, index, or rank URLs,” notes Google Webmaster Trends analyst John Mueller.

    Google will treat those that look region-specific (such as .london) the same as any other gTLDs.

    “This is consistent with our handling of regional TLDs like .eu and .asia,” says Mueller. “There may be exceptions at some point down the line, as we see how they’re used in practice. See our help center for more information on multi-regional and multilingual sites, and set geotargeting in Search Console where relevant.”

    Google will still use ccTLDs to help it geotarget websites. It assumes that if the domain utilizes a country’s ccTLD, it’s probably relevant to that country.

    Google has a section in its help center to help webmasters and SEOs move their site from their current domain to a new TLD. If this is something you plan on undertaking, you’ll probably want to take a good look at that.

    Image via Google

  • Google’s Not Checking Your Facts Just Yet

    Google’s Not Checking Your Facts Just Yet

    A recently released Google research paper has been drawing some attention throughout the search industry. It proposes a signal for ranking search results based upon “the correctness of factual information provided by the source,” rather than links.

    Do you think this would be a good direction for the algorithm to go in? Let us know in the comments.

    As we reported before, just having this paper out does not mean that Google has implemented such a ranking strategy, nor does it necessarily mean that it will. Still, some misleading reports have circulated implying that Google is going forward with it.

    Just to confirm that this is not currently part of the Google algorithm, Google webmaster trends analyst John Mueller said as much in a Google+ hangout (via Search Engine Roundtable).

    A little over 49 minutes in, Mueller responds to a question about facts potentially being included as a ranking factor, and how Google would handle inaccurate information that can’t be fact checked. Mueller didn’t really have an answer for how Google would deal with that, but did say this:

    This was just a research paper that some of our researchers did and not something that we are using in our rankings. We have researchers that do fantastic research that publish tons of papers all the time, and just because they are researching something and trying to see which options are happening there, or because maybe they are patenting something or creating new algorithms, it doesn’t mean that is something we are using in search. At the moment, this is a research paper. I think it’s interesting seeing the feedback around that paper and the feedback from the online community, from the people who are creating web pages, from the SEOs who are promoting these pages, and also from normal web users who are looking at this. At the moment, this is definitely just a research paper and not something that we’re actually using.

    So there you have it. Now, all of that said…

    The paper is still more interesting than your run-of-the-mill Google research paper, for a few reasons. For one, we’re talking about a signal that could be looked at as more valuable than links, which have long been the backbone of Google’s ranking strategy. If implemented, it would represent a fundamental change in how Google ranks web pages.

    Secondly, the way the paper is written essentially calls out links as an outdated way of ranking content. If this is indeed the case, why would Google want to continue placing so much emphasis on that signal, when it has one that it feels is better representative of authoritative content?

    The opening paragraph of the paper pretty much discredits links as a valuable signal. It says:

    Quality assessment for web sources is of tremendous importance in web search. It has been traditionally evaluated using exogenous signals such as hyperlinks and browsing history. However, such signals mostly capture how popular a webpage is. For example, the gossip websites listed in [16] mostly have high PageRank scores, but would not generally be considered reliable. Conversely, some less popular websites nevertheless have very accurate information.

    Fourteen out of fifteen of those sites it refers to, it says, carry a PageRank among the top 15% of websites due to popularity, but for all of them, the Knowledge-Based Trust (KBT), which is the score for trustworthiness of information, is in the bottom 50% of websites.

    “In other words, they are considered less trustworthy than half of the websites,” Google says in the paper.

    So again, why would Google want to continue ranking content that isn’t trustworthy just because it has a lot of links? And we’re just talking about popular websites here. That’s not even taking into consideration black hat SEO practices, which Google has to constantly play whack-a-mole with.

    Thirdly, Google already uses a lot of “knowledge”-based features. You’re no doubt familiar with Knowledge Graph, and more recently Knowledge Vault. The search engine is constantly trying to deliver information directly in search results. This stuff is clearly of great importance to Google. To me, this just adds to the likelihood that Google will eventually use the signal discussed in the research paper, at least to some extent.

    What will really be interesting is whether or not Google will inform webmasters if it does implement such a signal. Will it announce it like it did its recent mobile-related signals? Time will tell.

    Either way, it can’t hurt websites to strive to include information that is as accurate as possible, and to do some fact-checking when appropriate. Who knows? Maybe one day it will mean the difference in whether or not your page lands on the first page of search results. The best part is that there is no downside to this: accuracy lends credibility, which is good for you no matter what.

    Oh, by the way, Mueller has also been advising webmasters against link building.

    Do you think knowledge-based trust would be a better ranking signal than PageRank? Share your thoughts in the comments.

    Image via Google

  • Google Says You Should Avoid Link Building

    Google Says You Should Avoid Link Building

    You know how you’ve been building links to your website for years, trying to get Google to look upon it more favorably? Well, according to Google, you shouldn’t bother doing that.

    Do you think there’s still value to link building? Let us know in the comments.

    Nearly an hour into a Google Webmaster Central Office Hours hangout on Friday, Google’s John Mueller was asked whether or not link building, in any way, is good for webmasters.

    Mueller’s response (via Search Engine Roundtable) was, “That is a good question. In general, I’d try to avoid that. So that you are really sure that your content kind of stands on its own and make it possible for other people of course to link to your content. Make it easy, maybe, put a little widget on your page, if you like this, this is how you can link to it. Make sure that the URLs on your web site are easy to copy and paste. All of those things make it a little bit easier. We do use links as part of our algorithms but we use lots and lots of other factors as well. So only focusing on links is probably going to cause more problems for your web site than actually helps.” Emphasis added.
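
    As an aside, the “little widget” Mueller mentions can be as simple as a read-only text box carrying the page’s permalink so visitors can copy it in one click. A minimal sketch, with a placeholder URL:

      <!-- Hypothetical "link to this" widget; the URL is a placeholder. -->
      <label for="permalink">Like this article? Link to it:</label>
      <input id="permalink" type="text" readonly size="50"
             value="http://www.example.com/articles/my-post"
             onclick="this.select()">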

    This part starts at about 55:40 into the following video:

    We reached out to online marketer Ken McGaffin, who is widely known for his quality link building services, to get his thoughts on what Mueller said. We’ll just give you McGaffin’s words verbatim:

    That all depends on the type of link building you’re doing. Let’s say I’ve just conducted a great piece of research for a client and my prime objective is to get them media coverage. The research and the accompanying press release was so good that it got coverage in the NYTimes, BBC and many others – good job done!

    But my secondary objective is to get links – so AS WELL as conducting the research, and writing the press release, I make sure that the journalist has something to link to, something that his readers will appreciate. That could be:

    – an in-depth blog post giving much more detail than the Journalist could give space to
    – a presentation or infographic of the results
    – a copy of the original research so that readers can check it out.

    In this case, I’m doing my client a service in getting PR coverage. But I’m also doing my best to ensure that editorial links and others links will follow. I can only see Google looking positively on my efforts – because of the value it offers. But if all I did was the ‘link building’ part then I’d be doing my client a disservice – and missing some major opportunities. This means that any online marketing/PR initiative is multi-layered – and one of those layers must be link building.

    Well said.

    There’s no question that link building strategies have had to adapt to the changing search engine climate over the years. We recently had a great conversation with Eric Ward (another prominent name in link building) about that.

    “Many people feel the very act of pursuing links has become evil, which is sad because it’s not even close to true. In 1994 nobody gave any thought to the idea that a link to a website could be a bad thing,” he told us. “The entire concept of a poisoned link profile is simultaneously comic and tragic. Links are not ‘things’. Links are not imbued with the quality of Good or Evil. Links are the visible manifestation of a human’s action and opinion, and in some cases, intent.”

    “I guess if I had to boil down the biggest change of all from a strategy standpoint it would be in trying to help people realize that it is incredibly easy compared to the old days to get URLs to migrate or propagate across the web,” he said. “What I mean by that is today everyone is a Link builder, they just don’t see themselves that way, and many linking strategists overlook this.”

    If you really want to dig into how linking and link building has changed over the years, I suggest reading the rest of our conversation.

    How has your linking strategy changed over the years? Please discuss in the comments.

  • Get Ready For Google’s New Mobile Ranking Signal

    Get Ready For Google’s New Mobile Ranking Signal

    Throughout the course of last year, Google made a bunch of moves showing that it was focusing on improving the mobile search experience for its users by way of getting websites (otherwise known as search results) to make themselves more mobile-friendly.

    In November, Google added a “mobile-friendly” label to mobile search results for sites that deserve such a title. It also said it was experimenting with using the same criteria that would earn a site the label for a ranking signal to give mobile-friendly sites even more love in the search results.

    Are you concerned about Google’s potential mobile-friendly ranking signal? Is this a positive step for Google? Let us know what you think in the comments.

    Now, webmasters are getting warnings from Google when their sites aren’t mobile-friendly, which may suggest that Google is about to implement that ranking signal. According to Barry Schwartz at Search Engine Roundtable, who says several of his clients received the warning over the weekend, these are being sent out at scale via email and Webmaster Tools.

    Schwartz shows a screenshot of one of the warnings, which says, “Fix mobile usability issues found on http://www…..” and then:

    Google systems have tested 3,670 pages from your site and found that 100% of them have critical mobile usability errors. The errors on these 3,670 pages severely affect how mobile users are able to experience your website. These pages will not be seen as mobile-friendly by Google Search, and will therefore be displayed and ranked appropriately for smartphone users.

    The message goes on to tell the webmaster to find problematic pages, learn about mobile-friendly design, and fix the mobile usability issues on the site. There is a link to view a report on the non-mobile-friendly pages, as well as one pointing to Google’s mobile-friendly guidelines. It also has links to a guide to making a CMS mobile-friendly, a page on Google’s Developer site about building mobile-friendly sites, and the Webmaster Central Forum, where webmasters are encouraged to ask more questions.

    Here’s another of the messages Martin Oxby shared on Twitter:

    Schwartz says Google had previously only notified sites that were “supposedly mobile friendly” when they had usability issues, but now they’re targeting sites that just aren’t at all mobile-friendly.

    The mobile-friendly labels should be fully rolled out on a global basis by now. In mid-November, Google said it was rolling it out over the next few weeks. When the company made the announcement, it also laid out some criteria for earning the label as detected by Googlebot.

    For one, a site should avoid software that isn’t common on mobile devices. It specifically mentioned Flash as an example. This actually follows Google’s previous shaming of Flash sites in mobile search results. Last summer, Google started showing messages for results that may not work in mobile results, such as “Uses Flash. May not work on your device.”

    Google says sites should use text that is readable without zooming, and should size content to the screen so that users don’t have to scroll horizontally or zoom. Links should also be placed far enough apart so that the correct one can be tapped easily.

    Google has a Mobile-Friendly Test tool here, which webmasters should find particularly helpful. You can simply enter a URL, and Google will analyze it and report if it has a mobile-friendly design.

    If a URL passes the test, it will tell you that the page is mobile-friendly, and give you some additional resources, including information about how Googlebot sees the page.

    If the URL fails the test, you’ll get reasons why the page isn’t mobile-friendly, as well as info about how Googlebot sees it, and resources to help you fix issues.

    After Google gave the news about using mobile-friendliness as a ranking signal in November, the company said it would continue to use desktop signals for ranking mobile results. Google’s John Mueller said this in a Webmaster Central mobile office hours hangout:

    We need to focus on the desktop page for the search results for the most part. That’s also the one that you use with the rel canonical. As we pick up more information from mobile-friendly pages or from mobile pages in general, then I would expect that to flow into the rankings as well. So that’s something to keep in mind there.

    I’d still make sure that your mobile friendly pages are as fast as possible, that they work really well on mobile devices, that you’re going past just essentially the required minimum that we had with the mobile friendly tool, and really providing a great experience on mobile. Because lots of people are using mobile to kind of make their decisions, to read content, and if your site is kind of minimally usable on mobile, but really a bad user experience, really, really slow, then that’s something that users will notice as well and they’ll jump off and do something else or go to a different site.

    You can listen to him talk about that subject about 18 minutes and 50 seconds into the following video.
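
    For reference, the rel canonical setup Mueller mentions applies to sites serving a separate mobile URL: Google’s documented pattern is a pair of annotations pointing at each other. A minimal sketch with placeholder URLs:

      <!-- On the desktop page (www.example.com/page): -->
      <link rel="alternate" media="only screen and (max-width: 640px)"
            href="http://m.example.com/page">

      <!-- On the mobile page (m.example.com/page): -->
      <link rel="canonical" href="http://www.example.com/page">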

    Google has been asking random mobile users to rate their search results based on a five-star rating system ranging from poor to excellent.


    Last fall, Google Webmaster Tools added mobile usability tracking. This includes graphs that look at mobile issues over time, so you can see any progress you’ve made.

    Mueller had this to say when announcing that: “A mobile-friendly site is one that you can easily read & use on a smartphone, by only having to scroll up or down. Swiping left/right to search for content, zooming to read text and use UI elements, or not being able to see the content at all make a site harder to use for users on mobile phones. To help, the Mobile Usability reports show the following issues: Flash content, missing viewport (a critical meta-tag for mobile pages), tiny fonts, fixed-width viewports, content not sized to viewport, and clickable links/buttons too close to each other.”

    “We strongly recommend you take a look at these issues in Webmaster Tools, and think about how they might be resolved; sometimes it’s just a matter of tweaking your site’s template!” he added.
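
    Of the issues Mueller lists, a missing viewport is the quickest to fix; the standard declaration for a responsive page is a single tag in the page’s head:

      <!-- Tells mobile browsers to render at device width instead of a
           fixed desktop width. -->
      <meta name="viewport" content="width=device-width, initial-scale=1">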

    Google continues to look for ways to improve Webmaster Tools as the nature of search results continues to shift. Last week, it launched a new structured data tool to help webmasters author and publish markup on their sites. The company says it will better reflect Google’s interpretation of your site.

    Google is also asking webmasters for some ideas for new tools and features. The company wants to know what people want from it in 2015, and has a Google Moderator page where you can add your own suggestions or vote on others.

    What would you like to see Google add to Webmaster Tools? Let us know in the comments.

  • Google Penguin Update Will Probably Just Continue Indefinitely

    As you may know, Google’s latest Penguin update, which launched in October, is still rolling out. Google had said that it would be a slow rollout from the beginning, but here we are two months later, and it’s still going. They really weren’t kidding.

    Some may take issue with the fact that Google is pushing such a major update during the holiday shopping season. If you get hit by a Google update this time of year, you risk losing major sales if you’re selling things online that people might want to purchase as gifts.

    Either way, it’s continuing, and you probably shouldn’t count on Google to let you know when it’s done. Google’s John Mueller, whom you might as well consider the new Matt Cutts at this point, for all intents and purposes, conducted a Webmaster Hangout, as he often does. In it, he seemed to indicate that the update will continue, that Google might not publicly say when it stops, and that the goal is for it to continue updating. In other words, much like Panda before it, Penguin will probably just keep going on regularly.

    “We’re hoping that these things will keep updating,” he said.

    The good news with that scenario is that if you get hit by the update it should be a lot easier to recover. Before, you had to make the necessary changes to your site, and then sit patiently and wait for Google to launch another update. The last time, it took them a year to do so. The bad news, I guess, is that there is also a greater chance that you’ll be affected by the update.

    Of course the primary goal of Penguin is to get rid of spam, so if you’re not doing spammy stuff, you should be safe from that part of the algorithm. That does assume that Google’s algorithm is actually doing its job correctly.

    Via Search Engine Roundtable

    Image via Wikimedia Commons

  • Google Still Uses Desktop Page Speed For Mobile Ranking

    As you may know, Google has been focusing on how it can better rank mobile search results. The big news last week was that it added a “Mobile-Friendly” label to results, and would give such sites a ranking boost.

    Google said this week that it will still use desktop signals for ranking mobile results. John Mueller (who we’ll pretty much consider the new Matt Cutts at this point) talked about this in a Webmaster Central hangout (via Search Engine Roundtable):

    The subject comes up at about 18 minutes and 50 seconds into the video, when someone asked him if it is correct that Google uses the page speed of a site’s desktop version as a ranking signal for the mobile version. He said that this is correct at the moment (he thinks…he tends not to completely commit to a lot of these answers).

    Mueller went on to say:

    So we need to focus on the desktop page for the search results for the most part. That’s also the one that you use with the rel canonical. As we pick up more information from mobile friendly pages or from mobile pages in general, then I would expect that to flow into the rankings as well. So that’s something to keep in mind there.

    I’d still make sure that your mobile friendly pages are as fast as possible, that they work really well on mobile devices, that you’re going past just essentially the required minimum that we had with the mobile friendly tool, and really providing a great experience on mobile. Because lots of people are using mobile to kind of make their decisions, to read content, and if your site is kind of minimally usable on mobile, but really a bad user experience, really, really slow, then that’s something that users will notice as well and they’ll jump off and do something else or go to a different site.

    It’s pretty much common sense that you want your site to be as optimal as possible, but it is interesting that Google is still using desktop signals, especially considering that there are likely more signals to glean from mobile devices.

    Image via YouTube

  • Google: We Don’t Plan On Updating PageRank Again

    The writing has been on the wall for quite some time, but if you’re still clinging to any shred of hope that Google will be updating Toolbar PageRank again, it’s probably time to give it up.

    Do you think Toolbar PageRank should go away, or do you still have a purpose for relying on it? Let us know in the comments.

    It was pretty clear last month that PageRank is dead, when Google’s John Mueller (who has pretty much taken over for Matt Cutts when it comes to communicating these types of things to Webmasters), said, “PageRank is something that we haven’t updated for I think over a year now, and we’re probably not going to be updating it going forward, at least in the Toolbar PageRank…”

    The subject came up again in a Webmaster Help thread. This time he said (via Search Engine Roundtable):

    I wouldn’t use PageRank or links as a metric. We’ve last updated PageRank more than a year ago (as far as I recall) and have no plans to do further updates. Think about what you want users to do on your site, and consider an appropriate metric for that.

    He also linked to this post from Google’s Webmaster Central Blog from 2011 titled “Beyond PageRank: Graduating to actionable Metrics”.

    In that post, Google’s Susan Moskwa said to focus on things like conversion rates, bounce rates, and clickthrough rates, rather than PageRank.

    The last PageRank update was actually just under a year ago on December 6th. Even that update was just the result of some other changes. It wasn’t Google just updating PageRank to make it more useful.

    So if you don’t count that, you’re looking at close to two years since a legitimate update. It’s been much longer since most in the industry have considered it a legitimate metric anyway.

    As you may recall, Cutts talked about PageRank in a video over a year ago, discussing how it no longer has a lot of value, and why your site’s PageRank isn’t changing.

    “Over time, the Toolbar PageRank is getting less usage just because recent versions of Internet Explorer don’t really let you install toolbars as easily, and Chrome doesn’t have the toolbar so over time, the PageRank indicator will probably start to go away a little bit,” he said.

    The phasing out of PageRank is hardly a surprise.

    In another video before that one, he said, “Maybe it will go away on its own or eventually we’ll reach the point where we say, ‘Okay, maintaining this is not worth the amount of work.’”

    It seems that Google has reached that point.

    Would you like to see Google continue to update Toolbar PageRank, or is this a good riddance situation as far as you’re concerned? Let us know in the comments.

    Image via Google

  • Google Penguin Update Still Lurking, Keep An Eye Out

    Google Penguin Update Still Lurking, Keep An Eye Out

    It’s been a while since Google launched the most recent version of its famous (or infamous, depending on how you want to look at it) Penguin update. If you haven’t seen any changes in your rankings, don’t assume you’re in the clear just yet. After all this time it’s still rolling out. Keep holding your breath.

    Have you seen any changes since Penguin began rolling out? Positive or negative? Let us know in the comments.

    With Matt Cutts not due back at work any time in the foreseeable future, we’re going to have to continue to rely on Googlers like John Mueller and Pierre Far to give us updates on what’s going on with the ever-changing algorithm.

    About eighteen days ago, Google launched a long-anticipated Penguin update (it took over a year to finally launch).

    Far said a few days after the initial roll-out announcement that this Penguin would be a “slow worldwide rollout” and that it would settle down “over the next few weeks”.

    Confirmation came on Monday that the update is still in fact rolling out. Mueller participated in a Webmaster Central Office Hours hangout (via Alex Graves at David Naylor), and said as much.

    In the video, one person says he saw sites disappear to pages 20 and 30 after the update launched, then apparently return to the first page. He asked if there was a reason the algorithm reversed course on the “spammy sites”.

    Mueller said he didn’t know about any specific sites, but said as far as he knows, the data is still rolling out.

    “You might just be seeing fluctuations from that,” he said, then reiterated that it’s still rolling out “as far as he knows.”

    Mueller also said that Google probably wouldn’t take into account any new disavow files for “this round,” but that it’s never too late to use them. The reason why it won’t take them into account this time is because they have to re-crawl all the links. It’s not the case that these files are processed instantly.

    “We essentially have to re-crawl all those links, and then that data is taken into account the next time the algorithms use that data, so it’s never too late. It’s something where if you see problems, I’d definitely submit that file, and make sure that you have it in there, but it’s probably not going to take effect for this round,” he said.

    Who knows how long it will take for the next round to come? It took over a year last time, but Google has implied that Penguin will be refreshed more regularly going forward. We’ll see.

    By the way, remember when Google basically used to say that most people shouldn’t use the disavow tool?

    Googler Gary Illyes said ahead of the Penguin roll-out that this particular update should make webmasters’ lives easier, and that people would find it to be a “delight”. Considering it’s still rolling out, I guess the jury’s out on that one.

    How’s the Penguin treated you? Have you noticed any significant improvement in search results in general? Let us know in the comments.

    Image via Wikimedia Commons

  • Google Is Not Going To Be Updating Toolbar PageRank Anymore

    It looks like Google Toolbar PageRank may officially be a thing of the past. This will no doubt please some, while upsetting others, but for better or worse, don’t expect it to be updated anymore.

    Do you think Toolbar PageRank should die, or do you still find a use for it? Let us know in the comments.

    Over the last couple of years, Google has already been updating PageRank less frequently. In fact, it’s not even been updated this year at all. The last update came in December.

    Even before that, Google had given indication that it wouldn’t update it before the end of last year, if at all, though it ultimately did. By that point, many had assumed Toolbar PageRank was going away because it had been so long since the previous update after years of regularity. Before the December update, it hadn’t been updated since the prior February. Historically, they had updated it every three or four months.

    Google’s Matt Cutts tweeted a year ago that he would be surprised if there was another PR update before 2014. Well, there was, but that was the last one. It’s now been ten months.

    Google’s John Mueller actually addressed the lack of an update in a Google+ Hangout (via Search Engine Roundtable).

    PageRank is something that we haven’t updated for I think over a year now, and we’re probably not going to be updating it going forward, at least in the Toolbar PageRank…

    He said that at 20 minutes and 30 seconds into this video.

    Of course Mueller is incorrect in that it’s been over a year, but he seems to be under the impression that Toolbar PageRank is dead. He wasn’t exactly making an announcement, but discussing it in relation to somebody’s question about a particular site’s rankings. So it’s probably not out of the realm of possibility that another update could sneak through, but it sounds like it’s not going to happen.

    A year ago, Cutts discussed PageRank in this video:

    “Over time, the Toolbar PageRank is getting less usage just because recent versions of Internet Explorer don’t really let you install toolbars as easily, and Chrome doesn’t have the toolbar so over time, the PageRank indicator will probably start to go away a little bit,” he said.

    In another video earlier in the year, he said, “Maybe it will go away on its own or eventually we’ll reach the point where we say, ‘Okay, maintaining this is not worth the amount of work.’”

    So the writing has been on the wall for quite some time. Still, people have continued to monitor PageRank, and look forward to seeing that data refreshed.

    The last update was actually kind of a side effect of sorts. As Cutts noted at the time, the team was fixing a different backend service, and did a PR update along the way. He said it wasn’t an accident, but that it was just easier for them to push the new PR data rather than keeping the old data. Maybe that will happen again.

    Do you want to see Google continue to update Toolbar PageRank? Let us know in the comments.

  • New Google Penguin Update Coming, Will Refresh Faster

    Google’s Penguin update is notorious for taking an extremely long time to get refreshed, leaving sites negatively impacted by it out of luck until Google finally pushes a refresh through. You can make all the changes you want in an effort to recover, but if Google doesn’t refresh it, it’s not going to make much difference.

    Google has now offered a couple of pieces of news. The third major Penguin update will be here before the end of the year, and it will start receiving quicker refreshes, which means sites (in theory) should be able to recover more quickly than they’ve been able to in the past.

    Is Google on the right track with the Penguin update? How do you think they’ve handled it thus far? Share your thoughts in the comments.

    Google is working on making the refresh process faster, which should please many webmasters and SEOs if and when this actually occurs.

    Google’s John Mueller talked about this in a Webmaster Hangout on Monday (via Search Engine Roundtable). He also noted that they’re still working on the long-anticipated update (the last refresh was nearly a year ago).

    “We are working on a Penguin update, so I think saying that there’s no refresh coming would be false,” Mueller said. “I don’t have any specific timeline as to when this happens…it’s not happening today, but I know the team is working on this and generally trying to find a solution that refreshes a little bit faster…but it’s not happening today, and we generally try not to give out too much of a timeline ahead of time because sometimes things can still change.”

    Asked if Penguin refreshes will come on a regular basis like those for the Panda update, Mueller said, “We’ll see what we can do there, so that’s something where we’re trying to kind of speed things up because we see that this is a bit of a problem when webmasters want to fix their problems, they actually go and fix these issues, but our algorithms don’t reflect that in a reasonable time. So that’s something where it makes sense to try to improve the speed of our algorithms…Some of you have seen this first hand, others have worked with other webmasters who have had this problem, and I think this is kind of something good to be working on.”

    Asked about the size of the impact of the next update, Mueller said, “That’s always hard to say, and I imagine the impact also depends on your website and whether or not it’s affected…if it’s your website, the impact is always big, right? We’re trying to find the right balance there to make sure we’re doing the right things, but sometimes it doesn’t go as quickly as we’d all like.”

    Mueller hinted in another Hangout nearly a month ago that the next Penguin wasn’t too far off. He participated in yet another Hangout on Friday morning (via Barry Schwartz), and said that we can expect the next Penguin update (which would be 3.0, for all intents and purposes) by the end of the year.

    That’s at least somewhat of a timeline. We’ve only got three months left. Granted, even this timeline isn’t certain.

    Asked if Penguin 3.0 will launch in 2014, Mueller said, “My guess is yes, but as always, there are always things that can happen in between. I’m pretty confident that we’ll have something in the reasonable future but not today, so we’ll definitely let you know when things are happening.”

    They’ll definitely let us know? That doesn’t seem like Google’s style these days.

    This week, Mueller also confirmed that a Penguin refresh is indeed required for an affected site to recover. Most people probably already knew that or at the very least expected as much, but it’s always nice to have official word from Google.

    That’s also all the more reason for webmasters to welcome the next update with open arms in some cases.

    Note: This story has been updated to include additional information.

    Do you expect Google to actually roll the update out before the end of the year? Will you welcome a more rapidly-refreshing Penguin? Let us know in the comments.

    Image via YouTube

  • Were Your Google Authorship Efforts All For Nothing?

    Google introduced authorship support over three years ago, leading webmasters and anyone concerned with SEO to jump through a new set of hoops to make sure their faces were visible in Google search results, and hopefully even get better rankings and overall visibility in the long run. Now, Google has decided to pull the plug on the whole thing.

    Do you feel that authorship was a waste of time? Are you glad to see it go? Is Google making the wrong move? Share your thoughts in the comments.

    To be fair, Google called its authorship efforts experimental in the first place, but for quite a while, it looked like it would play more and more of a role in how Google treated search results, and more specifically, the people providing the content that populates them. Of course Google seems to be relying much less on people (at least directly) for search result delivery these days, favoring on-page “answers” over links to other sites.

    Google never came right out and said it would use authorship as a ranking signal to my recollection, but it did go out of its way to really encourage people to take advantage, recording multiple videos on various ways to implement authorship markup on your website. As time went on, they added more ways to implement it, sending a signal that doing so would be in your best interest.

    They also added features, such as display of comments, circle counts, etc. They added authorship click and impression data to Webmaster Tools. They dropped the author search operator in Google News in favor of authorship. They added authorship to Google+ Sign-In less than a year ago. It seemed that Google was only valuing authorship more as time went on.

    A year ago, Google’s Maile Ohye said, “Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic.” Emphasis added.

    Also last summer, Google’s Matt Cutts said, “I’m pretty excited about the ideas behind rel=’author’. Basically, if you can move from an anonymous web to a web where you have some notion of identity and maybe even reputation of individual authors, then webspam, you kind of get a lot of benefits for free. It’s harder for the spammers to hide over here in some anonymous corner.”

    “Now, I continue to support anonymous speech and anonymity, but at the same time, if Danny Sullivan writes something on a forum or something like that I’d like to know about that, even if the forum itself doesn’t have that much PageRank or something along those lines,” he added. “It’s definitely the case that it was a lot of fun to see the initial launch of rel=’author’. I think we probably will take another look at what else do we need to do to turn the crank and iterate and improve how we handle rel=’author’. Are there other ways that we can use that signal?”

    Before that, he had indicated that authorship could become more of a signal in the future, dubbing it a “long term trend.”

    At some point, something changed. Google started making reductions to how it used authorship rather than adding to it. Last fall, Cutts announced that Google would be reducing the amount of authorship results it showed by about 15%, saying that the move would improve quality.

    In June, Google announced it was doing away with authors’ profile photos and circle counts in authorship results, indicating that doing so would lend to a “better mobile experience and a more consistent design across devices.”

    But even then, results would still show a byline and contain a link to the author’s Google+ profile.

    Last week came the death blow. Google’s John Mueller announced that the company had made “the difficult decision” to stop showing authorship in search results, saying that the information wasn’t as useful to users as it had hoped, and that it could “even distract from those results”. Emphasis added.

    You know, because knowing more about a result – like who wrote it – is less useful.

    According to Mueller, removing authorship “generally” doesn’t seem to reduce traffic to sites, though you have to wonder if that’s the case for more well-known authors who stand to be affected by this the most. Mueller wrote:

    Going forward, we’re strongly committed to continuing and expanding our support of structured markup (such as schema.org). This markup helps all search engines better understand the content and context of pages on the web, and we’ll continue to use it to show rich snippets in search results.

    It’s also worth mentioning that Search users will still see Google+ posts from friends and pages when they’re relevant to the query — both in the main results, and on the right-hand side. Today’s authorship change doesn’t impact these social features.

    As Search Engine Land’s Danny Sullivan explains, just because authorship is now dead, that doesn’t mean “author rank” is.

    Cutts said earlier this year that Google uses author rank in “some ways,” including in the In-Depth Articles section. Google’s Amit Singhal has also suggested that the signal could come into play more in the future in terms of regular organic search results.

    Cutts said this late last year: “We are trying to figure out who are the authorities in the individual little topic areas and then how do we make sure those sites show up, for medical, or shopping or travel or any one of thousands of other topics. That is to be done algorithmically not by humans … So page rank is sort of this global importance. The New York times is important so if they link to you then you must also be important. But you can start to drill down in individual topic areas and say okay if Jeff Jarvis (Prof of journalism) links to me he is an expert in journalism and so therefore I might be a little bit more relevant in the journalistic field. We’re trying to measure those kinds of topics. Because you know you really want to listen to the experts in each area if you can.”

    Sullivan also points to an excerpt from Google Executive Chairman Eric Schmidt’s 2013 book The New Digital Age, which says: “Within search results, information tied to verified online profiles will be ranked higher than content without such verification, which will result in most users naturally clicking on the top (verified) results. The true cost of remaining anonymous, then, might be irrelevance.”

    The point to all of this is that even though so-called “authorship” is dead, it still matters to Google who you are, and that could have a much bigger impact on your visibility in the search engine than authorship itself ever did.

    But still, what a big waste of time, right? And how did Google go from thinking authorship information was so useful a year ago to finding it useless now?

    What do you think? Should Google have killed authorship? Do you believe the reasoning the company gave? Let us know in the comments.

    Image via Google+

  • Google Just Killed Authorship Entirely

    Google Just Killed Authorship Entirely

    Google announced that it is no longer using authorship markup or displaying author information in search results, saying that it just wasn’t as useful as expected.

    Actually, it was Google’s John Mueller who announced the change on his personal Google+ page rather than on any official Google blog, which seems odd for something like this that Google pushed on users a great deal a couple years ago. Mueller writes:

    I’ve been involved since we first started testing authorship markup and displaying it in search results. We’ve gotten lots of useful feedback from all kinds of webmasters and users, and we’ve tweaked, updated, and honed recognition and displaying of authorship information. Unfortunately, we’ve also observed that this information isn’t as useful to our users as we’d hoped, and can even distract from those results. With this in mind, we’ve made the difficult decision to stop showing authorship in search results.

    (If you’re curious — in our tests, removing authorship generally does not seem to reduce traffic to sites. Nor does it increase clicks on ads. We make these kinds of changes to improve our users’ experience.)

    He goes on to note that Google will continue to expand support of structured markup like schema.org, and use it to show rich snippets in search results. He also says the changes won’t affect users seeing Google+ posts from friends and pages in search results or publisher markup.

    Asked in the comments if Google will still be using authorship data behind the scenes, and whether or not people should remove the code from their pages, Mueller said, “No, we’re no longer using it for authorship, we treat it like any other markup on your pages. Leaving it is fine, it won’t cause problems (and perhaps your users appreciate being able to find out more about you through your profile too).”

    Asked if there is no longer any value to showing Google (via interlinking with the Google+ profile) what pieces of work have been published online, Mueller responded, “Well, links are links, but we’re not using them for authorship anymore.”

    Some obviously feel like they’ve jumped through various hoops Google has thrown at them, only for it all to have been a waste of time. It’s still not exactly clear why taking it away makes search results more useful.

    Here’s Mueller’s full post:


    Image via Google+

  • New Google Penguin Update Is Getting Closer

    It’s looking like we’ll probably be seeing Google launch a new version of the Penguin update before too long, though Google still won’t give an exact timeframe. Either way, they’re working on it, and it’s coming.

    Google webmaster trends analyst John Mueller, who regularly participates in Google hangout conversations with webmasters, hinted that an update is probably not too far off.

    Here’s the video. He starts discussing it at 21 minutes and 40 seconds in:

    “At the moment, we don’t have anything to announce,” he says. “I believe Panda is one that is a lot more regular now, so that’s probably happening fairly regularly. Penguin is one that I know the engineers are working on, so it’s been quite a while now, so I imagine it’s not going to be that far away, but it’s also not happening this morning…”

    Barry Schwartz at Search Engine Roundtable, who pointed out Mueller’s comments, suggests Penguin 3 could be happening as early as today, saying, “All the tracking tools are going a bit nuts the whole week,” and pointing to some webmaster forum chatter and speculation.

    Of course, such chatter and speculation runs rampant pretty much all the time, so I wouldn’t put a whole lot of stock into that particular aspect, but it does look like if it’s not coming immediately, it’s in the cards soon.

    It has, after all, been over a year since Google pushed Penguin 2.0. Even 2.1 (or whatever you want to call it), which was bigger than the average refresh, was in October. There almost has to be one soon at this point.

    Image via YouTube

  • Google’s ‘Completely Clear’ Stance On Disavowing ‘Irrelevant’ Links

    We knew that when Google launched the Disavow Links tool, people were going to use it more than they should, even though Google made it clear that most people shouldn’t use it at all.

    A person doing some SEO work posted a question in the Google Webmaster Central product forum (via Search Engine Roundtable) that many others have probably wondered: Should I use the disavow tool for irrelevant links?

    In other words, should you tell Google to ignore links from sites that aren’t related to yours? The answer is no.

    Google’s own John Mueller jumped in to say this: “Just to be completely clear on this: you do not need to disavow links that are from sites on other topics. This tool is really only meant for situations where there are problematic, unnatural, PageRank-passing links that you can’t have removed.”
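
    For those who really do have problematic, unnatural links, the tool accepts a plain text file with one URL or domain per line; lines starting with # are comments. A minimal sketch with placeholder domains:

      # Requested removal from these sites; links are still live.
      http://spam.example.com/paid-links.html
      domain:link-farm.example.net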

    Google updates and manual action penalties have caused a lot of webmasters to re-evaluate their link profiles. Many have scrambled to get various links to their sites taken down, often going overboard (or even way overboard).

    For the record, Google still views backlinks as “a really, really big win in terms of quality for search results.”

    In other “how Google views links” news, Matt Cutts just put out an 8-minute video about how Google determines whether your links are paid or not.

    Image via Google.com

  • Google Says It Will Follow Five Redirects At The Same Time When Crawling

    About a year ago, Google put out a Webmaster Help video discussing PageRank as it relates to 301 redirects. Specifically, someone asked, “Roughly what percentage of PageRank is lost through a 301 redirect?”

    Google’s Matt Cutts responded, noting that it can change over time, but that it had been “roughly the same” for quite a while.

    “The amount of PageRank that dissipates through a 301 is currently identical to the amount of PageRank that dissipates through a link,” he explained. “So they are utterly the same in terms of the amount of PageRank that dissipates going through a 301 versus through a link. So that doesn’t mean use a 301. It doesn’t mean use a link. It means use whatever is best for your purposes because you don’t get to hoard or conserve any more PageRank if you use a 301, and likewise it doesn’t hurt you if you use a 301.”

    In a new Webmaster Central office hours video (via Search Engine Roundtable), Google’s John Mueller dropped another helpful tidbit related to redirects: Googlebot will follow up to five at the same time when crawling.

    “We generally prefer to have fewer redirects in a chain if possible. I think GoogleBot follows up to five redirects at the same time when it’s trying to crawl a page, so up to five would do within the same cycle. If you have more than five in a chain, then we would have to kind of think about that the next time we crawled that page, and follow the rest of the redirects…We generally recommend trying to reduce it to one redirect wherever possible. Sometimes there are technical reasons why that’s not possible, so something with two redirects is fine.”

    As Barry Schwartz at SER notes, this may be the first time Google has given a specific number. In the comments of his post, Michael Martinez says it used to be 2.
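
    If you want to see how long your own redirect chains are, here’s a quick Python sketch using the requests library (the URL is a placeholder); it prints each hop and flags chains longer than the five Googlebot reportedly follows per crawl cycle.

      import requests

      # requests follows redirects by default; r.history holds each hop.
      r = requests.get("http://www.example.com/old-page")
      for hop in r.history:
          print(hop.status_code, hop.headers.get("Location"))
      print(r.status_code, r.url)  # final destination

      if len(r.history) > 5:
          print("More than five redirects -- consider flattening the chain.")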

    Image via YouTube