WebProNews

Tag: webmaster tools

  • Google Asks Webmasters About Combining Sites In Search Console (Webmaster Tools)

    Google wants to know if webmasters would find it useful to combine different sites in Search Console (formerly Webmaster Tools). The company has a survey out inquiring about the subject.

    “For example, to view a combined Search Analytics report that includes different URL versions of your site (http and https) and different subdomains (mobile and international subdomains),” Google explains in the survey.

    First, it asks: if Search Console allowed you to group sites together to view a combined report, would you use it? Answers respondents can give include:

    – No
    – Yes, I’d use it to combine different URL versions of my site (i.e. https/http and www/non-www)
    – Yes, I’d use it to track my subdomains (i.e. desktop/mobile sites and international subdomains)
    – Yes, I’d use it to track my entire brand (i.e. all subdomains and apps)
    – Other (with a form for explanation)

    Respondents are asked to state their role, choosing from SEO, Webmaster, Developer, Marketer/advertiser, website owner, or other. They can also provide Google with the websites they manage. Google says it can use this information to understand different use cases for sites of different sizes.

    The next section of the survey asks respondents to explain why they wouldn’t need a feature that groups multiple sites together. Possible responses include:

    – I only have one verified site
    – I only monitor my primary/canonical site
    – Other people manage the remaining sites for my company
    – Other (with a form for explanation)

    That’s pretty much the entirety of the survey.

    The best solution would seem to be for Google to simply give people the option to combine sites if it works for them. Given the company’s interest in feeling out the community, it seems likely that this will become a feature. That is, unless there turns out to be little to no interest in it. Still, I would guess that most webmasters (and the other roles noted) would at least like to have the option, even if it’s not something they immediately need.

    You can take the survey yourself here (via Search Engine Land).

    Image via Google

  • Google Webmaster Tools Changed To Google Search Console

    Recognizing that a lot of different types of people use Webmaster Tools beyond just traditional webmasters, Google has decided to rebrand its popular product to reflect that. From now on, Google Webmaster Tools will be known as Google Search Console. Here’s the logo:

    “For nearly ten years, Google Webmaster Tools has provided users with constantly evolving tools and metrics to help make fantastic websites that our systems love showing in Google Search,” wrote product manager Michael Fink in a blog post. “In the past year, we sought to learn more about you, the loyal users of Google Webmaster Tools: we wanted to understand your role and goals in order to make our product more useful to you.”

    “It turns out that the traditional idea of the ‘webmaster’ reflects only some of you,” he added. “We have all kinds of Webmaster Tools fans: hobbyists, small business owners, SEO experts, marketers, programmers, designers, app developers, and, of course, webmasters as well. What you all share is a desire to make your work available online, and to make it findable through Google Search. So, to make sure that our product includes everyone who cares about Search, we’ve decided to rebrand Google Webmaster Tools as Google Search Console.”

    Google did not announce any new features beyond the rebranding.

    Google webmaster trends analyst John Mueller had this to say on Google+:

    I remember … back when Google Webmaster Tools first launched as a way of submitting sitemap files. It’s had an awesome run, the teams have brought it a long way over the years. It turns out that the traditional idea of the “webmaster” reflects only some of you. We have all kinds of Webmaster Tools fans: hobbyists, small business owners, SEO experts, marketers, programmers, designers, app developers, and, of course, webmasters as well. So, to make sure that our product includes everyone who cares about Search, we’ve decided to rebrand Google Webmaster Tools as Google Search Console.

    The rebranding does seem much more user-friendly than the term Webmaster Tools, which some with limited web experience may have found a little intimidating. In an era where businesses must have and maintain a web presence, the offering is more important than ever, and the rebranding could well lead to more businesses using its features.

    Image via Google

  • Google Deprecates Old Webmaster Tools API

    Back in September, Google launched an update to its Webmaster Tools API to make it more consistent with other Google APIs. Those using other APIs from the company would find the new one easier to implement, the company said.

    Now, Google has announced that with the pending shutdown of ClientLogin, the old version will be shut down on April 20.

    “If you’re still using the old API, getting started with the new one is fairly easy,” says Google webmaster trends analyst John Mueller. “The new API covers everything from the old version except for messages and keywords. We have examples in Python, Java, as well as OACurl (for command-line fans & quick testing). Additionally, there’s the Site Verification API to add sites programmatically to your account. The Python search query data download will continue to be available for the moment, and replaced by an API in the upcoming quarters.”

    When it introduced the new API, Google said it made it easier to authenticate apps or web services, and provided access to some of the main Webmaster Tools features. These are some specific things you can do with it:

    • list, add, or remove sites from your account (you can currently have up to 500 sites in your account)
    • list, add, or remove sitemaps for your websites
    • get warning, error, and indexed counts for individual sitemaps
    • get a time-series of all kinds of crawl errors for your site
    • list crawl error samples for specific types of errors
    • mark individual crawl errors as “fixed” (this doesn’t change how they’re processed, but can help simplify the UI for you)

    You can find the links for the Python, Java, and OACurl examples here.
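    For anyone curious what working with the newer API looks like in practice, here is a rough Python sketch rather than one of Google’s official samples. It assumes OAuth client credentials saved as client_secrets.json (a placeholder name), a verified property at https://www.example.com/ (also a placeholder), and today’s google-api-python-client and google-auth-oauthlib libraries instead of the client libraries of the era.

    ```python
    # Rough sketch of the Webmaster Tools (webmasters v3) API via
    # google-api-python-client. File names and the site URL are placeholders.
    from google_auth_oauthlib.flow import InstalledAppFlow
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

    # Run a local OAuth flow using downloaded client credentials.
    flow = InstalledAppFlow.from_client_secrets_file("client_secrets.json", SCOPES)
    credentials = flow.run_local_server(port=0)

    service = build("webmasters", "v3", credentials=credentials)

    # List the verified sites in the account.
    for site in service.sites().list().execute().get("siteEntry", []):
        print(site.get("siteUrl"), site.get("permissionLevel"))

    # List the sitemaps submitted for one site, with warning/error counts.
    sitemaps = service.sitemaps().list(siteUrl="https://www.example.com/").execute()
    for sitemap in sitemaps.get("sitemap", []):
        print(sitemap.get("path"), sitemap.get("warnings"), sitemap.get("errors"))
    ```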

    Mueller says that comments and questions about the API should be posted in this blog post or this forum.

    Image via Google

  • How To Use Google’s New Blocked Resources Report

    Google just introduced a new Webmaster Tools feature called the Blocked Resources Report, aimed at helping webmasters find and resolve issues where Google can’t use images, CSS, or JavaScript that has been blocked. Blocked resources can prevent pages from rendering properly, and Google wants to make sure you’re only blocking what you really want/need to be.

    The report provides the names of the hosts from which your site is using blocked resources. If you click on a row, it gives you the list of blocked resources and the pages that embed them. This should help you figure out the issues and take care of them so Google can better crawl and index your content.

    Some resources will be hosted on your own site, while others will be hosted elsewhere. Clicking on a host will also give you a count of pages on your site affected by each blocked resource. Clicking on any blocked resource will give you a list of pages that load that resource. If you click on any page in the table hosting a blocked resource, you’ll get instructions for unblocking that particular resource.

    In a help center article, Google runs down five steps for evaluating and reducing your list of blocked resources:

    1. Open the Blocked Resources Report to find a list of hosts of blocked resources on your site. Start with the hosts that you own, since you can directly update the robots.txt files, if needed.

    2. Click a host on the report to see a list of blocked resources from that host. Go through the list and start with those that might affect the layout in a meaningful way. Less important resources, such as tracking pixels or counters, aren’t worth bothering with.

    3. For each resource that affects layout, click to see a list of your pages that use it. Click on any page in the list and follow the pop-up instructions for viewing the difference and updating the blocking robots.txt file. Fetch and render after each change to verify that the resource is now appearing.

    4. Continue updating resources for a host until you’ve enabled Googlebot access to all the important blocked resources.

    5. Move on to hosts that you don’t own, and if the resources have a strong visual impact, either contact the webmaster of those sites to ask them to consider unblocking the resource to Googlebot, or consider removing your page’s dependency on that resource.

    There’s also an update to Fetch and Render, which shows how the blocked resources matter. When you request a URL to be fetched and rendered, it shows screenshots rendered both as Googlebot and as a typical user, so you get a better grasp on the problems.

    “Webmaster Tools attempts to show you only the hosts that you might have influence over, so at the moment, we won’t show hosts that are used by many different sites (such as popular analytics services),” says Google webmaster trends analyst John Mueller. “Because it can be time-consuming (usually not for technical reasons!) to update all robots.txt files, we recommend starting with the resources that make the most important visual difference when blocked.”
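    If the blocked files live on a host you control, the fix is usually a small robots.txt edit: keep the sections you genuinely want hidden, but allow the directories that hold your CSS, JavaScript, and images. A minimal sketch, with purely hypothetical paths:

    ```
    User-agent: Googlebot
    # Keep private areas blocked (hypothetical path).
    Disallow: /admin/
    # Explicitly allow the assets Googlebot needs to render pages.
    Allow: /assets/css/
    Allow: /assets/js/
    Allow: /images/
    ```

    After editing, a Fetch and Render on an affected page should confirm whether the layout now matches what users see.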

    In January, Google called on webmasters to offer suggestions for new features for Webmaster Tools. It set up a Google Moderator page where people could leave and vote on suggestions. Among the most popular suggestions were:

    “I would like to see in WMT data from 12 months, not 3 as it is now :)”

    “An automated action viewer, so webmasters can see if they were impacted by an algorithm such as Panda or Penguin.”

    “Bounce back measuring tool. Did the user go back to Google for a similar search or did they find what they needed?”

    Google has since given webmasters a new structured data tool.

    Image via Google

  • Google Gives Webmasters New Structured Data Tool

    Google announced that it is giving webmasters a new structured data tool to help them author and publish markup on their websites. The new tool, it says, will better reflect Google’s interpretation of your content.

    structured data tool

    google structured data tool

    The tool provides validation for all Google features powered by structured data, support for markup in the JSON-LD syntax (including dynamic HTML pages), clean display of the structured data items on your page, and syntax highlighting of markup problems in your source code.

    They also updated the documentation and policy guidelines for Google features that are powered by structured data.

    “We’ve clarified our documentation for the vocabulary supported in structured data based on webmasters’ feedback,” Google says in a blog post. “The new documentation explains the markup you need to add to enable different search features for your content, along with code examples in the supported syntaxes. We’ll be retiring the old documentation soon.”

    They also clarified policies on using structured data, and are encouraging people who see abuse to report spam.

    “We’ve extended our support for schema.org vocabulary in JSON-LD syntax to new use cases: company logos and contacts, social profile links, events in the Knowledge Graph, the sitelinks search box, and event rich snippets,” Google adds. “We’re working on expanding support to additional markup-powered features in the future.”

    Earlier this week, Google announced that you can now use markup to add social profile links to the Knowledge Graph as seen below:

    social profiles in google knowledge graph

    You can include Facebook, Twitter, Google+, Instagram, YouTube, LinkedIn, and/or Myspace. More on all of that here.
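    For reference, the JSON-LD markup for an organization’s logo and social profile links looks roughly like the snippet below. The company name, logo, and profile URLs are placeholders, and Google’s structured data documentation remains the authority on exactly which properties are required.

    ```html
    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "http://www.example.com/",
      "logo": "http://www.example.com/logo.png",
      "sameAs": [
        "https://www.facebook.com/examplecompany",
        "https://twitter.com/examplecompany",
        "https://plus.google.com/+ExampleCompany"
      ]
    }
    </script>
    ```

    Running a page containing markup like this through the updated testing tool shows how Google interprets each item and highlights any syntax problems.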

    If you want more from Google in Webmaster Tools, the company is also calling on webmasters for suggestions for new features. You can submit your own or vote on those submitted by others.

    Images via Google

  • Google Wants Some Ideas For Webmaster Tools Again. Got Any?

    Google is once again calling on webmasters for ideas on how to improve Google Search and Webmaster Tools. The company put out a brief Google+ update early this morning:

    What would you like to see from Google Websearch & Webmaster Tools in 2015?

    We’d love to hear your feedback & collect your suggestions on the kinds of things you’d wish to see being added, improved, removed, or changed by the Webmaster Tools and websearch teams at Google.

    Google is offering a Google Moderator page where you can submit suggestions and vote on those that others have contributed. Of course, there’s no guarantee that Google will act upon any of these suggestions, even if they’re voted the highest, but it’s interesting to see what people want, and at least this gives Google a good way to get a feel for that.

    “What can we do to make your life – as a webmaster, producer of great content, or SEO – easier this year?” Google asks on the page.

    As of the time of this writing, there are over a hundred suggestions with over 1,100 votes. The top one right now is: “Show more than 1,000 entries for any error report. 5,000 or 10,000 would be helpful.”

    Other popular suggestions include:

    “An automated action viewer, so webmasters can see if they were impacted by an algorithm such as Panda or Penguin.”

    “I would like to see in WMT data from 12 months, not 3 as it is now :)”

    “Alongside the manual penalty notices, a message to say whether or not an algorithmic penalty is applied to a website, and if so, what type of penalty and what action might help resolve it.”

    “Year on year comparison by month for impressions and clicks query data.”

    “Move to account structure so http, https, subdomains (even subfolders) of one site can all be accessed and managed easily.”

    “I would like to see a much more detailed link tool within WMT. It would be nice to see all links, especially the latest ones without having to click around and download a .csv. Show us who is linking to which pages using what anchor text natively.”

    “I would like to see all back links data with nofollow and dofollow declaration separately. If possible also provide bad links and quality links checker.”

    There are plenty more where those came from.

    This is your chance to let your voice be heard, so if you’ve been wanting something from Google in this area, you better let them know. There’s a good possibility you’ll have additional like-minded webmasters backing you up with votes, and who knows? Maybe it will make a difference.

    Before you tell Google what you want, we’d love to hear it too in the comments.

    Image via Google

  • Google: WMT Automated Action Viewer ‘Not On The Horizon’

    Google Webmaster Trends Analyst John Mueller hosted a Webmaster Central office hours hangout on Tuesday, and talked about the prospect of Google adding a feature to Webmaster Tools that would let webmasters know about automated actions taken against their sites, and how they could go about fixing any issues.

    This was buried nearly forty minutes (38 minutes and 36 seconds to be precise) into the video, which itself is over an hour long.

    Barry Schwartz at Search Engine Roundtable pulled this tidbit including a transcript out of the video. Here are some excerpts:

    I think it might make sense at some point to find something that does something similar to what these algorithms are doing and bubble that up to the webmaster and say, hey, our Webmaster Tools quality check has recognized that these and these and these types of pages are generally lower quality. Maybe that’s something you want to look at. But I don’t think it would make sense to take the search ranking algorithms and kind of bubble that information up directly in Webmaster Tools just because it has a very different goal. But I do bring this up with the Webmaster Tools and the engineering team every now and then to make sure that we don’t lose track of that because I think sometimes, some of the information from these algorithms could be really useful to webmasters.

    These are definitely things that we’re always looking into and thinking about how much we could put more of this into Webmaster Tools, how much we have the technical details covered in Webmaster Tools, and say, well, technically, Webmaster Tools covers everything you need. Now we need to focus on kind of the softer factors. They’re always long discussions. I mean, if there’s something we can bubble up in Webmaster Tools there, I think I’d definitely be for that. But we really need to make sure that we’re bubbling up something that’s really actionable for you guys, not something where you see, oh, well, our algorithms think your site is kind of mediocre. And that doesn’t really help you.

    But all of this is probably fairly far off. It’s definitely not something where you’ll see this showing up in Webmaster Tools the next week or so. But we do try to bring these things into the discussions with the Webmaster Tools team and with the engineering teams to try to see how much of the information that we create about a website can be shown to the webmaster to help guide them in the right way.

    Mueller also commented on Schwartz’s post about all of this, adding, “Just to clarify, this is not ‘on the horizon.’ While we talk about ideas like these internally & externally, that doesn’t mean that they’re coming anytime soon, or even at all. I love seeing big steps being taken with our products, and you have to start with an idea, but ultimately not all ideas can be realized :).”

    In other words, keep dreaming.

    Image via YouTube

  • Google Webmaster Tools Gets Updated Robots.txt Testing Tool

    Google has released an updated robots.txt testing tool in Webmaster Tools. The tool can be found in the Crawl section.

    The aim of the new version is to make it easier to create and maintain a “correct” robots.txt file, and to find the directives within a large file that are or were blocking individual URLs.

    “Here you’ll see the current robots.txt file, and can test new URLs to see whether they’re disallowed for crawling,” says Google’s Asaph Amon, describing the tool. “To guide your way through complicated directives, it will highlight the specific one that led to the final decision. You can make changes in the file and test those too, you’ll just need to upload the new version of the file to your server afterwards to make the changes take effect. Our developers site has more about robots.txt directives and how the files are processed.”

    “Additionally, you’ll be able to review older versions of your robots.txt file, and see when access issues block us from crawling,” Amon explains. “For example, if Googlebot sees a 500 server error for the robots.txt file, we’ll generally pause further crawling of the website.”

    Google recommends double-checking the robots.txt files for your existing sites for errors or warnings. It also suggests using the tool with the recently updated Fetch as Google tool to render important pages, or using it to find the directive that’s blocking URLs that are reported as such.
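    If you’d rather sanity-check a file outside of Webmaster Tools, Python’s standard library ships a basic robots.txt parser. It doesn’t reproduce every nuance of Googlebot’s own parsing, so treat the sketch below (which uses a placeholder site) as a rough local check, not a substitute for the tool.

    ```python
    # Rough local robots.txt check using Python's standard library. This does
    # not mirror every detail of how Googlebot interprets the file.
    from urllib import robotparser

    parser = robotparser.RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")  # placeholder site
    parser.read()  # fetch and parse the live file

    # Test whether specific URLs would be allowed for a given user agent.
    for url in ("https://www.example.com/", "https://www.example.com/private/page"):
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "disallowed"
        print(url, verdict)
    ```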

    Google says it often sees files that block CSS, JavaScript, or mobile content, which is problematic. You can use the tool to help you fix that if it’s a problem with your site.

    Google also added a new rel=alternate-hreflang feature to Webmaster Tools. More on that here.

    Image via Google

  • Google Gives Webmasters New Page Rendering Tool

    Last week, Google named some JavaScript issues that can negatively impact a site’s search results, and said it would soon be releasing a tool to help webmasters better understand how it renders their site. The tool has now been announced.

    It comes in the form of an addition to the Fetch as Google tool, which lets you see how Googlebot renders a page. Submit a URL with “Fetch and render” in the Fetch as Google feature under Crawl in Webmaster Tools.

    “In order to render the page, Googlebot will try to find all the external files involved, and fetch them as well,” writes Shimi Salant from Google’s Webmaster Tools team. “Those files frequently include images, CSS and JavaScript files, as well as other files that might be indirectly embedded through the CSS or JavaScript. These are then used to render a preview image that shows Googlebot’s view of the page.”

    “Googlebot follows the robots.txt directives for all files that it fetches,” Salant explains. “If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.”

    To get the most out of the new tool, Google recommends making sure Googlebot can access any embedded resource that contributes to your site’s visible content or layout in any meaningful way. Social media buttons, some fonts, and/or analytics scripts don’t “meaningfully contribute,” so Google says these can be left disallowed from crawling.

    Image via Google

  • Google Names JavaScript Issues That Can Negatively Impact Your Search Results, Readies New Webmaster Tool

    Ever wonder how Google handles the JavaScript on your site? It’s a common question, and one Google’s Matt Cutts has discussed several times in Webmaster Help videos.

    Google took to its Webmaster Central blog on Friday to talk about it even more, and offer a bit of perspective about just how far it’s come when it comes to handling JavaScript since the early days when it basically didn’t handle it at all.

    Beyond patting itself on the back though, Google offers some useful information – specifically things that may lead to a negative impact on search results for your site.

    “If resources like JavaScript or CSS in separate files are blocked (say, with robots.txt) so that Googlebot can’t retrieve them, our indexing systems won’t be able to see your site like an average user,” the post, co-written by a trio of Googlers, says. “We recommend allowing Googlebot to retrieve JavaScript and CSS so that your content can be indexed better. This is especially important for mobile websites, where external resources like CSS and JavaScript help our algorithms understand that the pages are optimized for mobile.”

    “If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. If you’d like to ensure that your pages can be rendered by Google, make sure your servers are able to handle crawl requests for resources,” the post continues. “It’s always a good idea to have your site degrade gracefully. This will help users enjoy your content even if their browser doesn’t have compatible JavaScript implementations. It will also help visitors with JavaScript disabled or off, as well as search engines that can’t execute JavaScript yet.”

    Google also notes that some JavaScript is too complex or arcane for it to execute, which means they won’t be able to render the page fully or accurately. That’s something to keep in mind for sure.

    Also, some JavaScript removes content from the page, which prevents Google from indexing it.

    Google says it’s working on a tool for helping webmasters better understand how Google renders their site, which will be available in Webmaster Tools within days.

    Image via Google

  • Google Launches Official Google+ Page For Webmasters

    There is now an official Google+ page for Google Webmasters. Matt Cutts tweeted a link to it, and the page made its first post last night:


    So far, that’s all that it has to offer, but we can probably expect similar (if not the same) posts as what we see from the Google Webmasters Twitter account:


    That means links to Webmaster Central blog posts, links to Google Webmaster Help videos and various other updates that webmasters should know about.

    If you’re a Google+ junkie, you now have another way to keep up with all this stuff.

    The Google+ page only has 3,700 followers so far. That’s compared to the 111,000 on Twitter.

    Image via Google+

  • Google Launches WordPress Plugin For AdSense, Webmaster Tools Management

    Google announced the launch of its new Google Publisher Plugin for WordPress (in beta). This lets people place AdSense ads on their sites, and verify their sites in Webmaster Tools right from WordPress.

    The plugin links your site to your AdSense account, and lets you place ads without having to manually modify code. If you already have an AdSense account, the plugin will detect it and show your publisher info (make sure it’s correct). If you already have AdSense ads on your site and just want to manage them through the plugin, you can do so, but you’ll have to remove the existing ads first, then place new ones.

    For Webmaster Tools verification, it’s just a matter of a single click. Verification simply happens when you set up the plugin. Then, to open Webmaster Tools, open the plugin, and click the “manage site” button under “Webmaster Tools.”

    Product manager Michael Smith said in a blog post, “If you own your own domain and power it with WordPress, this new plugin will give you access to a few Google services — and all within WordPress. Please keep in mind that because this is a beta release, we’re still fine-tuning the plugin to make sure it works well on the many WordPress sites out there.”

    The plugin can be found in the WordPress plugin directory.

    Image via WordPress.org

  • Google Webmaster Tools ‘Search Queries’ Feature Gets Some New Tweaks

    Google has announced a couple of changes to the Search Queries feature in Webmaster Tools, improving stats for mobile sites and getting rid of rounding.

    For webmasters who manage mobile sites on separate URLs from the desktop versions (like m.example.com), Google will now show queries where the m. pages appeared in results for mobile browsers and queries where Google applied Skip Redirect.

    Skip Redirect

    “This means that, while search results displayed the desktop URL, the user was automatically directed to the corresponding m. version of the URL (thus saving the user from latency of a server-side redirect),” explains developer programs tech lead Maile Ohye. “Prior to this Search Queries improvement, Webmaster Tools reported Skip Redirect impressions with the desktop URL. Now we’ve consolidated information when Skip Redirect is triggered, so that impressions, clicks, and CTR are calculated solely with the verified m. site, making your mobile statistics more understandable.”

    The change enabling users to see search query data without rounding will become visible in Webmaster Tools over the next few days.

    “We hope this makes it easier for you to see the finer details of how users are finding your website, and when they’re clicking through,” says Google webmaster trends analyst John Mueller.

    We wonder if these tweaks are related to Google’s recent call for ideas from users for Webmaster Tools improvements.

    Image: Google

  • Google Improves URL Removal Tool

    Google has launched an improved version of its URL removal tool in Webmaster Tools, aimed at making it easier to request updates based on changes to other people’s sites.

    Google suggests that you could use the tool if a page has been removed completely or if it has changed, and you need the snippet and cached page removed.

    “If the page itself was removed completely, you can request that it’s removed from Google’s search results,” says Google Webmaster Trends analyst John Mueller. “For this, it’s important that the page returns the proper HTTP result code (403, 404, or 410), has a noindex robots meta tag, or is blocked by the robots.txt (blocking via robots.txt may not prevent indexing of the URL permanently). You can check the HTTP result code with a HTTP header checker. While we attempt to recognize ‘soft-404’ errors, having the website use a clear response code is always preferred.”
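    If you’re comfortable with Python, a quick status-code check with the widely used requests library looks something like this (the URL is a placeholder for the page you asked to have removed):

    ```python
    # Minimal check of the HTTP result code for a removed page.
    import requests

    response = requests.head(
        "https://www.example.com/removed-page",  # placeholder URL
        allow_redirects=False,  # report the page's own code, not a redirect target
        timeout=10,
    )
    print(response.status_code)  # 404, 410, or 403 supports a removal request
    ```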

    For submitting a page for removal, just enter the URL and confirm the request.

    “If the page wasn’t removed, you can also use this tool to let us know that a text on a page (such as a name) has been removed or changed,” says Mueller. “It’ll remove the snippet & cached page in Google’s search results until our systems have been able to reprocess the page completely (it won’t affect title or ranking). In addition to the page’s URL, you’ll need at least one word that used to be on the page but is now removed.”

    Webmasters are instructed to enter the URL, confirm that the page has been updated or removed and that the cache and snippet are outdated, and enter a word that no longer appears on the live page, but still appears in the cache or snippet.

    Image: Google

  • Google Webmaster Tools Now Shows Structured Data Errors

    Google announced today that it has launched a new error reporting feature for the Structured Data Dashboard in Webmaster Tools. The company began testing this earlier this year, and has used feedback from webmasters to fine-tune the feature.

    Users can now see items with errors in the dashboard. Items represent top-level structured data elements tagged in the HTML code. Nested items aren’t counted. Google groups them by data type and orders them by number of errors.

    “We’ve added a separate scale for the errors on the right side of the graph in the dashboard, so you can compare items and errors over time,” notes Google webmaster trends analyst Mariya Moeva. “This can be useful to spot connections between changes you may have made on your site and markup errors that are appearing (or disappearing!).”

    Google says it has also updated its data pipeline, so reporting will be more comprehensive.

    When you click on a specific content type, Google will show you the markup error it found for that type. You can see all at once or filter by error type. Google suggests checking to see if the markup meets the implementation guidelines, which can be found here.

    You can click on the URLs in the table to see details about what markup Google detected when it last crawled the page and what’s missing. There’s also a “test live data” button so you can test the markup with Google’s Structured Data Testing Tool.

    After you fix issues, the changes will be reflected in the dashboard.

    Image: Google

  • Google Adds Smartphone Crawl Errors To Webmaster Tools (This Is Important Considering Recent Ranking Changes)

    Google announced that it has expanded the Crawl Errors feature in Webmaster Tools to help webmasters identify pages on their sites that show smartphone crawl errors.

    This is going to be of particular importance because Google recently made several ranking changes for sites not configured for smartphone users.

    “Some smartphone-optimized websites are misconfigured in that they don’t show searchers the information they were seeking,” says Google Webmaster Trends analyst Pierre Far. “For example, smartphone users are shown an error page or get redirected to an irrelevant page, but desktop users are shown the content they wanted. Some of these problems, detected by Googlebot as crawl errors, significantly hurt your website’s user experience and are the basis of some of our recently-announced ranking changes for smartphone search results.”

    The feature will include server errors, “not found” errors and soft 404s, faulty redirects and blocked URLs.

    Mobile crawl errors

    “Fixing any issues shown in Webmaster Tools can make your site better for users and help our algorithms better index your content,” says Far.
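    Faulty redirects are the classic example: a desktop URL that sends every smartphone visitor to the mobile homepage rather than to the equivalent mobile page. For sites serving separate mobile URLs, Google’s documented setup pairs each desktop page with its m. counterpart roughly like this (the URLs are placeholders):

    ```html
    <!-- On the desktop page (http://www.example.com/page-1): -->
    <link rel="alternate"
          media="only screen and (max-width: 640px)"
          href="http://m.example.com/page-1">

    <!-- On the mobile page (http://m.example.com/page-1): -->
    <link rel="canonical" href="http://www.example.com/page-1">
    ```

    Any user-agent-based redirect should then point smartphone visitors to the equivalent m. URL, not to the homepage.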

    Google’s Matt Cutts recently asked what users would like to see Google add to Webmaster Tools in the coming year. Some will no doubt be able to cross one thing off their list now.

    Image: Google

  • Google Wants Some Ideas For Webmaster Tools. Got Any?

    In a recent article, we asked if Google is being transparent enough. While the question was asked broadly, much of our discussion had to do specifically with webmasters. Is Google providing them with enough information?

    I mean, after all, a single algorithm tweak can completely kill a business, or cause one to have to lay off staff. Webmasters want to know as much as possible about how Google works and how it views their sites.

    What do you think Webmaster Tools needs more than anything else? Let us know in the comments.

    We’re not asking that question just for conversation’s sake, though that should be interesting too. Google actually wants to know. Or at least one pretty important and influential Googler does.

    Matt Cutts, head of Google’s webspam team, has taken to his personal blog to ask people what they would like to see Google Webmaster Tools offer in 2014.

    So here’s your chance to have your voice heard.

    “At this point, our webmaster console will alert you to manual webspam actions that will directly affect your site,” he writes. “We’ve recently rolled out better visibility on website security issues, including radically improved resources for hacked site help. We’ve also improved the backlinks that we show to publishers and site owners. Along the way, we’ve also created a website that explains how search works, and Google has done dozens of ‘office hours’ hangouts for websites. And we’re just about to hit 15 million views on ~500 different webmaster videos.”

    I like to think we’ve played some small role in that.

    Cutts lists fourteen items himself as things he could “imagine people wanting,” but notes that he’s just brainstorming, and that there’s no guarantee any of these will actually be worked on.

    Among his ideas are:

    – making authorship easier
    – improving spam/bug/error/issue reporting
    – an option to download pages from your site that Google has crawled (in case of emergency)
    – checklists for new businesses
    – reports with advice for improving mobile/page speed
    – the ability to let Google know about “fat pings” of content before publishing it to the web, so Google knows where it first appeared
    – better duplicate content/scraper reporting tools
    – showing pages that don’t validate
    – showing pages that link to your 404 pages
    – showing pages on your site that lead to 404s and broken links
    – better bulk URL removal
    – refreshing data faster
    – improving the robots.txt checker
    – ways for site owners to tell Google about their site

    Even if we don’t see all of these things come to Webmaster Tools in the near future, it’s interesting to see the things Cutts is openly thinking about.

    The post’s comments from Webmasters are already in the hundreds, so Google will certainly have plenty of ideas to work with. Googlers like Cutts have been known to peruse the WPN comments from time to time as well, so I wouldn’t worry about your response going unnoticed here either.

    What do you think Webmaster Tools needs more than anything? Let us know in the comments. Better yet, let us know what you think it might actually get.

    Image: Google

  • Is Google Being Transparent Enough?

    Many would say that Google has become more transparent over the years. It gives users, businesses and webmasters access to a lot more information about its intentions and business practices than it did long ago, but is it going far enough?

    When it comes to its search algorithm and changes to how it ranks content, Google has arguably scaled back a bit on the transparency over the past year or so.

    Do you think Google is transparent enough? Does it give webmasters enough information? Share your thoughts in the comments.

    Google, as a company, certainly pushes the notion that it is transparent. Just last week, Google updated its Transparency Report for the eighth time, showing government requests for user information (which have doubled over three years, by the way). That’s one thing.

    For the average online business that relies on Internet visibility for customers, however, these updates are of little comfort.

    As you know, Google, on occasion, launches updates to its search algorithm, which can have devastating effects on sites that relied on the search engine for traffic. Sometimes (and probably more often than not), the sites that get hit deserve to get hit. They’re just trying to game the system and rank where they really shouldn’t be ranking. Sometimes, people who aren’t trying to be deceptive and are just trying to make their business work are affected too.

    Google openly talks about these updates. Panda and Penguin are regular topics of discussion for Googlers like Matt Cutts and John Mueller. Google tries to send a clear message about the type of content it wants, but still leaves plenty of sites guessing about why they actually got hit by an update.

    Not all of Google’s algorithmic changes are huge updates like Panda and Penguin. Google makes smaller tweaks on a daily basis, and these changes are bound to have an effect on the ranking of content here and there. Otherwise, what’s the point?

    While Google would never give away its secret recipe for ranking, there was a time (not that long ago) when Google decided that it would be a good idea to give people a look at some changes it has been making. Then, they apparently decided otherwise.

    In December of 2011, Google announced what it described as a “monthly series on algorithm changes” on its Inside Search blog. Google started posting monthly lists of what it referred to as “search quality highlights”. These offered perhaps the most transparency into how Google changes its algorithm that the company has ever provided. It didn’t exactly give you a clear instruction manual for ranking above your competition, but it showed the kinds of changes Google was making – some big and some small.

    Above all else, it gave you a general sense of the kinds of areas Google was looking at during a particular time period. For example, there was a period of time when many of the specific changes Google was making were directly related to how it handles synonyms.

    Google described the lists as an attempt to “push the envelope when it comes to transparency.” Google started off delivering the lists once a month as promised. Eventually, they started coming out much more slowly. For a while, they came out every other month, with multiple lists at a time. Then, they just stopped coming.

    To my knowledge, Google hasn’t bothered to explain why (a lack of transparency on its own), though I’ve reached out for comment on the matter multiple times.

    It’s been over a year since Google released one of these “transparency” lists. The last one was on October 4th of last year. It’s probably safe to say at this point that this is no longer happening. Either that or we’re going to have one giant year-long list at the end of 2013.

    For now, we’re just going to have to live with this reduction in transparency.

    Don’t get me wrong, Google has given webmasters some pretty helpful tools during that time. Since that last list of algorithm changes, Google has launched the Disavow Links tool, the Data Highlighter tool, the manual action viewer, and the Security Issues feature, and it has altered the way it selects sample links.

    Barry Schwartz from Search Engine Roundtable says he’d like to see an “automated action viewer” to complement the manual action viewer. As would many others, no doubt.

    “Don’t get me wrong,” he writes. “Google’s transparency over the years has grown tremendously. But this one thing would be gold for most small webmasters who are lost and being told by “SEO experts” or companies things that may not be true. I see so many webmasters chasing their tails – it pains me.”

    Cutts continues to regularly put out videos responding to user-submitted questions (webmasters find these to be varying degrees of helpful).

    But Google is not doing anything remotely like the search quality highlights lists, which provided specific identifying numbers, project nicknames, and descriptions of what they did, like the following example:

    #82862. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources

    While I haven’t really seen this talked about much, Google has been accused of breaking other promises lately. We talked about the broken promise of Google not having banner ads in its search results recently. Danny Sullivan blogged earlier this week about “Google’s broken promises,” mentioning that as well as Google’s decision to launch the paid inclusion Google Shopping model last year, something the company once deemed to be “evil”.

    “For two years in a row now, Google has gone back on major promises it made about search,” he wrote. “The about-faces are easy fodder for anyone who wants to poke fun at Google for not keeping to its word. However, the bigger picture is that as Google has entered its fifteenth year, it faces new challenges on how to deliver search products that are radically different from when it started.”

    “In the past, Google might have explained such shifts in an attempt to maintain user trust,” he added. “Now, Google either assumes it has so much user trust that explanations aren’t necessary. Or, the lack of accountability might be due to its ‘fuzzy management’ structure where no one seems in charge of the search engine.”

    He later says Google was “foolish” to have made promises it couldn’t keep.

    User trust in Google has suffered in recent months for a variety of reasons, not limited to those mentioned.

    Last year, Google caused quite a dust-up with its big privacy policy revamp, which more efficiently enables it to use user data from one product to the next. Last week, another change in policy went into effect, enabling it to use users’ profiles and pictures wherever it wants, including in ads. The ad part can be opted out of, but the rest can’t. Quite a few people have taken issue with the policy.

    Then there’s the YouTube commenting system. They changed that to a Google+-based platform, which has caused its own share of issues, and sparked major backlash from users.

    The changes were pitched as a way to improve conversations around videos and surface comments that are more relevant to the user, but most people pretty much just see it as a way to force Google+ onto the YouTube community. Some don’t think Google is being very transparent about its intentions there. It’s a point that’s hard to argue against when you see stuff like this.

    Do you think Google is losing trust from its users? Do you think the company is being transparent enough? Is all of this stuff just being overblown? What would you like to see Google do differently? Share your thoughts in the comments.

    Image: Matt Cutts (YouTube)

  • Google Adds ‘Security Issues’ Feature To Webmaster Tools

    Google has announced the launch of a new Webmaster Tools feature called Security Issues, which shows verified site owners information about the security issues on their sites in a single place, lets them find problems faster with detailed code snippets, and lets them request reviews for all issues in one new process.

    Google’s Meenali Rungta and Hadas Fester say in a joint blog post on the Webmaster Central blog, “We know that as a site owner, discovering your site is hacked with spam or malware is stressful, and trying to clean it up under a time constraint can be very challenging. We’ve been working to make recovery even easier and streamline the cleaning process — we notify webmasters when the software they’re running on their site is out of date, and we’ve set up a dedicated help portal for hacked sites with detailed articles explaining each step of the process to recovery, including videos.”

    “Now, when we’ve detected your site may have been hacked with spam or with malware, we’ll show you everything in the same place for easy reference,” they say. “Information that was previously available in the Malware section of Webmaster Tools, as well as new information about spam inserted by hackers, is now available in Security Issues. On the Security Issues main page, you’ll see the type of hacking, sample URLs if available, and the date when we last detected the issue.”

    Security Issues in Google Webmaster Tools

    Google says that it will try to show webmasters HTML and JavaScript code snippets from the hacked URLs whenever possible, and recommend actions to help clean up the problems.

    Feedback about the feature has been overwhelmingly positive so far.

  • Google Webmaster Tools Alters How It Selects Sample Links

    Google announced on Thursday that it has made a change to how it decides what links to show webmasters when they push the “Download more sample links” button. The feature typically shows about 100,000 backlinks.

    “Until now, we’ve selected those links primarily by lexicographical order,” explains Yinnon Haviv, a software engineer with Google’s Webmaster Tools team. “That meant that for some sites, you didn’t get as complete of a picture of the site’s backlinks because the link data skewed toward the beginning of the alphabet.”

    “Based on feedback from the webmaster community, we’re improving how we select these backlinks to give sites a fuller picture of their backlink profile,” Haviv adds. “The most significant improvement you’ll see is that most of the links are now sampled uniformly from the full spectrum of backlinks rather than alphabetically. You’re also more likely to get example links from different top-level domains (TLDs) as well as from different domain names. The new links you see will still be sorted alphabetically.”

    Soon, Google says, when webmasters download their data, they’ll see a more diverse cross-section of links. The goal is for webmasters to more easily be able to separate the bad links from the good.

    More link profile clarity has to be a good thing, because people are freaking out about links these days, and Google itself is even mistakenly telling webmasters that legitimate links are bad in some cases. But like Google’s Matt Cutts said in a tweet, “I think that’s 1 of the benefits of more transparency is that it helps us improve on our side too.”

    Image: Google

  • Are You Getting More Out Of Paid Search Than From SEO?

    As you’ve probably found out, getting your content seen in Google’s organic listings is not as easy as it used to be. It’s no wonder that businesses are getting more out of paid listings than they are organic search traffic.

    Is this the case for your business, or do you get more out of organic SEO? Let us know in the comments.

    Google has launched a new Paid & Organic report in AdWords aimed at helping businesses get more out of their paid and organic search campaigns by offering new comparison options.

    “Previously, most search reports showed paid and organic performance separately, without any insights on user behavior when they overlap,” says AdWords product manager Dan Friedman. “The new paid & organic report is the first to let you see and compare your performance for a query when you have either an ad, an organic listing, or both appearing on the search results page.”

    Google suggests using the report to discover potential keywords to add to your AdWords accounts by looking for queries where you appear only in organic search with no associated ads. It also suggests using it to optimize your presence on high-value queries and to measure how changes to bids, budgets, or keywords affect paid, organic, and combined traffic.

    Paid & Organic Report

    Image: Google

    Digital marketing firm IMPAQT was part of the beta testing, and says, “The paid & organic report has been incredibly useful in understanding the interaction between paid and organic search, and the overall synergy when they are working together. For one of our client’s key branded queries, we saw an 18% increase in CTR when paid and organic work together, as opposed to only having the organic listing.”

    It’s worth noting that Google itself shared this quote.

    To take advantage of the Paid & Organic report, you have to link your AdWords account to Webmaster Tools, and you have to be a verified owner or be granted access by one.

    MarketLive has put out a report finding that its merchants saw “significant changes” in the mix of paid/organic traffic. Paid search visits made up about a third of total search engine visits (up from 26% the previous year), while revenue from paid search grew to 44% of total search engine visit revenue (up from 40% in 2012). Interestingly, search visit growth altogether slowed in the first six months of the year, but paid was up 30% while organic was down 3%.

    Paid/Organic Search Traffic

    Image: Marketlive

    Here’s a side-by-side comparison of conversions, order size, new visits, bounce rate and pages per visit. As you can see, paid performs better across the board, except for new visits, which makes sense if you consider brand familiarity.

    Marketlive: Paid vs. Organic

    Image: Marketlive

    The report delves into performance across verticals, device comparisons and more, if you want to check it out (registration required).

    This is only one study, of course, but the signs are pointing to businesses getting more out of paid search than out of organic search. While Google’s new report feature could help both, it certainly seems geared toward using what you learn from your organic performance to put toward your paid campaigns. And again, Google certainly isn’t making things any easier for those trying to be found in organic results.

    For one thing, Google results simply have a lot more types of results than they used to, and on many pages, that means fewer traditional organic results. For another thing, people are afraid to link out, and to have links pointing toward them, which surely can’t be a great thing for traditional SEO, considering that Google’s algorithm (while including over 200 signals) has historically placed a great deal of its confidence in legitimate linking.

    Between webmaster paranoia, Google’s somewhat mixed messaging and ongoing “advice,” and its ever-changing algorithms, many businesses are finding out the hard way that relying too heavily on organic search is just detrimental. Paid search is less risky. It’s also how Google makes the bulk of its money.

    The AdWords department lost some trust points this week, however, when an account manager’s accidental voice mail recording gained some attention. Basically, he expressed his distaste that the client had upgraded to Google’s Enhanced Campaigns without consulting him, and that he would now have to pitch call extensions and site links. He also noted that he didn’t care about bridge pages or parked domains.

    As Ginny Marvin at Search Engine Land writes, the implications of that are that AdWords account reps are paid to upsell new products/services that may or may not be in clients’ best interests, an account rep was willing to ignore a breach of Google’s own policies, and that AdWords account managers are “sales people first and foremost.”

    Google indicated that this person was not an actual Google employee, but a contractor, and that they had already removed them from the AdWords team, but as Marvin points out, it’s unclear whether this is potentially a bigger issue or if this one person’s attitude is just a rare case. Either way, it hasn’t been great for advertiser perception.

    But what are you gonna do?

    Obviously, when it comes to paid and organic search, the idea is to get them to work together. It’s not necessarily an “either or” situation, but there is always a question of how to balance your resources.

    Do you get better performance from paid search or organic SEO? Let us know in the comments.