WebProNews

Tag: webmaster tools

  • Google Gives You A New Way To Verify Your Domain

    Google has launched a new way to verify that you are the owner of your site or domain for Webmaster Tools. You can now do this using DNS CNAME records.

    Google says the new option is for users who aren’t able to create DNS TXT records for their domains, which until now has been the DNS-based way to get verified with Google.

    To take advantage of DNS CNAME records in Webmaster Tools, add the domain to your account, select the domain name provider option, and select your domain name provider. You’ll then either get provider-specific instructions for setting a CNAME record or a generic “Add a CNAME record” instruction. Then click “Verify”.

    Domain Name Provider

    Verify CNAME

    “When you click Verify, Google will check for the CNAME record and if everything works you will be added as a verified owner of the domain,” says Google software engineer Pooja Wagh. “Using this method automatically verifies you as the owner of all websites on this domain. For example, when you verify your ownership of example.com, you are automatically verified as an owner of www.example.com as well as subdomains such as blog.example.com.”

    “Sometimes DNS records take a while to make their way across the Internet,” adds Wagh. “If we don’t find the record immediately, we’ll check for it periodically and when we find the record we’ll make you a verified owner. To maintain your verification status don’t remove the record, even after verification succeeds.”
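
    For context, a CNAME verification record in a typical DNS zone file looks something like the sketch below. The host and target values here are hypothetical placeholders; Webmaster Tools supplies the actual pair for your domain when you choose this verification method.

        ; Hypothetical example only -- use the exact host and target values
        ; that Webmaster Tools gives you for your domain.
        googleabc123.example.com.    IN    CNAME    gv-abc123xyz.dv.googlehosted.com.

    As with the TXT method, the record does nothing visible to visitors; it simply proves to Google that you control the domain’s DNS.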

    The company notes that you can still use other verification methods like the HTML file, the meta tag or the Google Analytics tag.

  • Structured Data Dashboard Comes To Google Webmaster Tools

    Google has added a new feature to Webmaster Tools called the Structured Data Dashboard. The purpose, Google says, is to provide webmasters with more visibility into the structured data that Google knows about for their site.

    It comes with three views:

    The first view is Site-level view, which aggregates data by root item type and vocabulary schema. “Root item type means an item that is not an attribute of another on the same page,” Google’s Webmaster Tools team explains. “For example, the site below has about 2 million Schema.Org annotations for Books.”

    The second view is Itemtype-level, and provides per-page details for each item. “Google parses and stores a fixed number of pages for each site and item type,” the team says. “They are stored in decreasing order by the time in which they were crawled. We also keep all their structured data markup. For certain item types we also provide specialized preview columns as seen in this example below (e.g. ‘Name’ is specific to schema.org Product).”

    The third view is Page-level view, which shows all attributes for each item type on a given page.
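
    To make the “root item type” idea concrete, here’s a minimal, hypothetical schema.org Product marked up with microdata (one of the formats Google parses); the Product is the root item on the page, and its name is the kind of value that would feed the “Name” preview column mentioned above.

        <!-- Hypothetical microdata markup: the Product is the root item type;
             the itemprop="name" value could populate a "Name" preview column. -->
        <div itemscope itemtype="http://schema.org/Product">
          <span itemprop="name">Acme Coffee Grinder</span>
          <span itemprop="description">Conical burr grinder with 40 mm steel burrs.</span>
        </div>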

    As the company notes, you can use the dashboard to verify that Google is picking up any new markup, or discover any issues Google may be having with old markup.

  • Google Gives You A New Way To See How Many Pages You Have In Its Index

    Google has released a new feature in Webmaster Tools called Index Status. The feature shows you how many pages you have included in Google’s index at any given time.

    The feature appears in the Health menu in Webmaster Tools. It shows how many pages are currently indexed, but also shows you a graph dating back a year:

    Index Status

    “If you see a steadily increasing number of indexed pages, congratulations! This should be enough to confirm that new content on your site is being discovered, crawled and indexed by Google,” Google says in a blog post. “However, some of you may find issues that require looking a little bit deeper. That’s why we added an Advanced tab to the feature.”

    “The advanced section will show not only totals of indexed pages, but also the cumulative number of pages crawled, the number of pages that we know about which are not crawled because they are blocked by robots.txt, and also the number of pages that were not selected for inclusion in our results,” Google adds.

    When you click on the Advanced tab, you will see something like this:

    Index Status Advanced

    Google says the data you get from the tool can be used to identify and debug numerous indexing-related issues your site may have.

    Of course, the company also touts the feature as a way to bring “more transparency” to the table, much like its recent, confusing messages about links.

  • Google Gives Webmasters Just What They Need: More Confusion

    Last week, Google began sending out messages to webmasters, warning them of bad links, much like the ones that many webmasters got prior to the infamous Penguin update. Google said, however, that these messages were different. Whereas the company’s advice in the past was to pay attention to these warnings, this time Google said they’re not necessarily something you need to worry about.

    Google’s head of webspam, Matt Cutts, wrote on Google+, “If you received a message yesterday about unnatural links to your site, don’t panic. In the past, these messages were sent when we took action on a site as a whole. Yesterday, we took another step towards more transparency and began sending messages when we distrust some individual links to a site. While it’s possible for this to indicate potential spammy activity by the site, it can also have innocent reasons. For example, we may take this kind of targeted action to distrust hacked links pointing to an innocent site. The innocent site will get the message as we move towards more transparency, but it’s not necessarily something that you automatically need to worry about.”

    “If we’ve taken more severe action on your site, you’ll likely notice a drop in search traffic, which you can see in the ‘Search queries’ feature Webmaster Tools for example,” Cutts added. “As always, if you believe you have been affected by a manual spam action and your site no longer violates the Webmaster Guidelines, go ahead and file a reconsideration request. It’ll take some time for us to process the request, but you will receive a followup message confirming when we’ve processed it.”

    Obviously, this all caused a great deal of confusion and panic among webmasters and the SEO community. Barry Schwartz, who spends a lot of time monitoring forum discussions, wrote, “It caused a major scare amongst SEOs, webmasters and those who owned web sites, never bought a link in their life, didn’t even know what link buying was and got this severe notification that read, ‘our opinion of your entire site is affected.’”

    Even SEOmoz was getting these warnings. The company’s lead SEO, Ruth Burr, wrote, “We’ve got the best kind of links: the kind that build themselves. Imagine the sinking feeling I got in the pit of my stomach, then, when a Google Webmaster Tools check on Thursday revealed that we’d incurred an unnatural link warning.”

    Cutts eventually updated his post to indicate that Google has changed the wording of the messages it is sending, in direct response to webmaster feedback.


    Google has also removed the yellow caution sign that accompanied the messages in the webmaster console. According to Cutts, this illustrates that action by the site owner isn’t necessarily required.

  • Google Makes Webmaster Tools Data Exporting Easier

    Google announced that it has included a new option for exporting data to a Google Spreadsheet in Google Docs. Users can now choose between the CSV and Google Docs download formats.

    “Choosing ‘CSV’ initiates a download of the data in CSV format which has long been available in Webmaster Tools and can be imported into other spreadsheet tools like Excel,” says Google Webmaster Trends analyst Jonathan Simon. “If you select the new ‘Google Docs’ option then your data will be saved into a Google Spreadsheet and the newly created spreadsheet will be opened in a new browser tab.”

    Download to spreadsheet

    “We hope the ability to easily download your data to a Google Spreadsheet helps you to get crunching on your site’s Webmaster Tools data even faster than you could before,” says Simon. “Using only a web browser you can instantly dive right into slicing and dicing your data to create customized charts for detecting significant changes and tracking longer term trends impacting your site.”

    This is only one of various user experience improvements Google has made to Webmaster Tools in recent memory. Last month, Google updated the navigation of the site, and launched a new dashboard.

  • Bing Unleashes a Ton of New Webmaster Tools for SEO, Links Stats

    As Vice President of Bing Program Management Derrick Connell mentioned at the Search Engine Land roundtable talk earlier today, Bing has launched a bevy of new webmaster tools to better inform your understanding of your site’s data and statistics. The additions, which the Bing Team is calling their Phoenix update, offer up everything from SEO analysis to link analysis tools.

    The new arsenal of analytical tools comes on the heels of Friday’s announcement that the new Bing design has become the standard Bing for users in the United States. Some of these tools are brand new while others are merely getting an update or moving out of beta.

    The Bing Team says that you should consider this announcement a webmaster aperitif, because they’ll be providing a more detailed explanation of each tool in the coming weeks. Since Bing has promised to elaborate on each of these tools in the near future, I’m not going to try to out-Bing them, so I’ll just include a brief description of each of the updates below. We’ll bring you further information about each new feature or tool as Bing makes the information available.

    Probably the most immediate change you’ll see is that Bing has redesigned the Webmaster Tools dashboard. As seen below in the example taken from the Bing blog post, the new look complements the cleaner look to Bing’s search results page.

    Bing Webmaster Tools Phoenix Update

    And now, on to the catwalk.

    New Tools:

  • Link Explorer (beta) – Go spelunking through the internet to discover links associated with any domain.
  • SEO Reports (beta) – Generate SEO analysis reports directly from Bing. The report uses roughly 15 SEO best practices to generate the analysis and runs once every two weeks for all of the domains you have verified with your Webmaster Tools account.
  • SEO Analyzer (beta) – Similar to the SEO Reports tool, Analyzer will use the same best practices criteria to scan a URL in order to tell you whether or not you’re in compliance with each best practice.
  • Fetch as Bingbot (beta) – Ever curious how Bing’s web crawler, Bingbot, sees your site? Now you can find out with this new tool, which allows a webmaster to send Bingbot crawling across a specific page and display it as the bot sees it.
  • Canonical Alerts – A new tool to help keep webmasters from erroneously using rel=canonical tags, so an entire website doesn’t get mistaken for a single page.
    The following tools have existed for a bit but received updates with Phoenix:

  • URL Removal Tool – Simple enough: a tool to allow webmasters to easily block a page from appearing in Bing’s search results.
  • Keyword Research Tool (beta) – Previously, users were only able to enter one single keyword or phrase per keyword request, but now webmasters will be able to add multiple entries within the same request.
  • URL Normalization – Updated the interface to clarify how it works.
    Whew. That should be enough to keep you Bing Webmaster Tools users busy for a while, at least until Bing begins to share more information about the hows and whys of each of these tools. To start playing with the tools, users will need a Bing Webmaster Tools account, so happy webmastering and enjoy.

  • Google Domain Verification Gets Simplified With Registrar Partnerships

    Google announced that it is working with GoDaddy and Demand Media’s eNom on automated domain verification for Webmaster Tools and Google Apps.

    For sites with domains whose records are managed by GoDaddy or eNom, users will see a new verification method that looks something like this:

    eNom verification

    “Selecting this method launches a pop-up window that asks you to log in to the provider using your existing account with them,” explains product manager Anthony Chavez. “The first time you log in, you’ll be asked to authorize the provider to access the Google site verification service on your behalf. Next you’ll be asked to confirm that you wish to verify the domain. And that’s it! After a few seconds, your domain should be automatically verified and a confirmation message displayed.”

    Based on the comments on Google’s announcement, it seems that people are pretty happy with the feature. So far, there’s nothing but praise from users, other than a suggestion (or hope, rather) that more domain registrars are added.

    Google does mention that Bluehost customers will get this same kind of functionality in the near future. The company also says it looks forward to working with more partners, so we should see this initiative greatly expanded upon.

  • Google Informs Us That Instant Pages Works Great

    How comfortable are you with Google’s prerendering capabilities? Do you like the idea of your site loading faster in Chrome thanks to this technology? The idea behind Instant Pages is to prerender the top search result for a particular query so that, when it’s clicked, it immediately loads up in your browser. Naturally, when Google discusses this technology, it refers to its Chrome browser, but the prerendering works in the following browsers:

    Chrome v5 or higher, Firefox v3/4, Safari v5 for Mac and Internet Explorer v8/9.

    As for Instant Pages, a quick look at some of the cons that immediately popped up shows they are awfully weak, if not outright laughable:

    CON

    There are those, however, who believe Google Instant can waste as much time as it saves. “You could end up getting distracted by the suggestions and read an article that you weren’t even looking for,” Heather McClain, 16, a waitress, told the BBC. “It will probably end up costing you more time than it saves you.”

    Or it could be that Heather is like most teen Internet users, who get distracted by just about anything.

    CON

    Another check in the con column against Google Instant is that it may hurt SEO marketers. Searchers will be less likely to click through to a second page of search results, critics say, which will give marketers fewer keywords to work with.

    Since when was it Google’s job to ensure SEO remains a viable industry?

    All of that aside, over at the Google Webmaster Central Blog, there’s a post discussing the benefits of Instant Pages. The amount of time saved particularly stood out:

    We’ve been closely watching performance and listening to webmaster feedback. Since Instant Pages rolled out we’ve saved more than a thousand years of our users’ time. We’re very happy with the results so far, and we’ll be gradually increasing how often we trigger the feature.

    For those of you who keep track of pageviews and are worried about how prerendering will affect those numbers, Google addresses this too:

    …only results the user visits will be counted. If your site keeps track of pageviews on its own, you might be interested in the Page Visibility API, which allows you to detect when prerendering is occurring and factor those out of your statistics. If you use an ads or analytics package, check with them to see if their solution is already prerender-aware; if it is, in many cases you won’t need to make any changes at all.
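
    Google doesn’t spell out the code in the post, but a minimal sketch of the idea looks like this: only record a pageview once the page is actually visible, so prerendered-but-never-viewed pages don’t inflate your numbers. The recordPageView function here is a hypothetical stand-in for whatever your own analytics call is.

        // Minimal sketch (not Google's implementation): count a pageview only
        // once the page is actually visible to the user.
        declare function recordPageView(): void; // hypothetical analytics hook

        function countViewOnceVisible(): void {
          if (document.visibilityState === "visible") {
            // Normal navigation: the page is already visible, so count it now.
            recordPageView();
            return;
          }
          // Hidden or prerendered: wait until the user actually sees the page.
          document.addEventListener("visibilitychange", function onVisible(): void {
            if (document.visibilityState === "visible") {
              document.removeEventListener("visibilitychange", onVisible);
              recordPageView();
            }
          });
        }

        countViewOnceVisible();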

    For those of you lamenting the fact that your site isn’t near the top of the results for the keywords you’re targeting, Instant Pages isn’t going to help or hurt you. With that in mind, it isn’t going to hurt if your site is at the top of the results, either. In fact, it seems a lot more likely to help, especially if your site ranks highly for a competitive keyword.

  • Google Webmaster Tools Gets Better Navigation, New Dashboard

    Google is continually updating its Webmaster Tools to bring the best features to users. This week brings some pretty major changes in the form of an updated navigation, new dashboard, and a compact view for the home page site-list.

    The features that you know and love in Webmaster Tools have been regrouped, which required a change in the navigation structure. Some of the features have been renamed as well; for example, HTML Suggestions is now called HTML Improvements. All the features can now be found in one of four new groups:

    Configuration: Things you configure and generally don’t change very often.
    Health: Where you look to make sure things are OK.
    Traffic: Where you go to understand how your site is doing in Google search, who’s linking to you; where you can explore the data about your site.
    Optimization: Where you can find ideas to enhance your site, which enables us to better understand and represent your site in Search and other services.

    The dashboard has received a complete redesign. With the new design, you’ll find recent, important and prioritized messages regarding your site sitting at the very top. Equally important, there is now a brief summary of your site’s current status just below that. Three of the new feature groups have widgets, with Crawl Errors, Search Queries and Sitemaps representing Health, Traffic and Optimization. Perhaps the best change, however, is that more messages and charts are now on the front page. With this, you can see how your site is doing without having to dive into the tools when you don’t have time for a thorough check.

    The final change is the addition of a “compact” layout. This allows you to see your site list without having to view the site-preview thumbnails. You have a choice between the two, but I personally like the compact view more. It also has the added benefit of loading faster since it doesn’t have to stream in large images.

    These new updates should make it easier to see site statistics at a glance. You can still dive in and get all the detailed information you’ve expected from Google Webmaster Tools, but now a lot more of it is on the front page. If that’s not convenience, I don’t know what is.

  • Google Details How They Crawl Images, Offers Tips

    It’s pretty clear now how Google crawls Web sites to get the results you search for. Heck, we can’t go a week without Google updating us on how they’re changing the algorithm that determines search results. Unfortunately, there hasn’t been much on how Google crawls for images – until now.

    In a lengthy post on the Webmaster Central Blog, Google details how they crawl for images and what they’re looking for when it comes to choosing the most relevant images. Considering how often I and everybody else use Google Images, the information should be extremely relevant for webmasters who want their images to show up in search.

    To confirm what you may have already known, Google crawls images from pretty much every source on the Web, from bloggers to stock photo sites. They also crawl images in pretty much every standard format, including BMP, GIF, JPEG, PNG, WebP and SVG.

    As an example, Google shows how the image algorithm knows the difference between a search for coffee and tea. The feat is accomplished by looking “at the textual content on the page the image was found on to learn more about the image.” They also “look at the page’s title and its body; we might also learn more from the image’s filename, anchor text that points to it, and its ‘alt text’.”

    After all this, you may be wondering how you can get your images to show up better in the results. It’s quite easy, really. For your image to show up in results, make sure that Google can crawl both the HTML page the image is embedded in and the image itself. Also, make sure that your image is in one of the supported formats listed above.

    The next tips aren’t required, but they are recommended for Web sites hoping to get their images crawled. First, make sure the image filename is related to the image’s content. Don’t post an image of a giraffe and call the file “africa.jpg,” for instance. If your image has an alt attribute, make sure it describes the image in a “human-friendly” way. That means using full words and maybe even sentences for your alt attributes. Finally, it really helps if the HTML page’s “textual contents as well as the text near the image are related to the image.”
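
    Putting those recommendations together, a hypothetical image embed that follows them might look like this:

        <!-- Hypothetical example: a descriptive filename, human-friendly alt text,
             and related surrounding text all help Google understand the image. -->
        <p>We photographed this reticulated giraffe at Samburu National Reserve in Kenya.</p>
        <img src="/images/reticulated-giraffe-samburu.jpg"
             alt="A reticulated giraffe browsing on an acacia tree in Samburu, Kenya">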

    One of the more recent updates to the Google Images algorithm was the addition of words like “sex” to SafeSearch. This was because Google updated their algorithm to handle requests like this without returning obscene content. Google now talks about this in more detail, saying they encourage webmasters who publish adult content to mark such content with a metatag that tells Google about it. If you don’t use these metatags, Google can still do a pretty good job of filtering adult content using computer vision or contextual clues.
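
    Google’s post doesn’t reproduce the tag, but the adult-content signal is generally expressed as a simple meta tag in the page’s head, along these lines (check Google’s current SafeSearch guidance for the exact form it expects):

        <!-- Commonly used form of the adult-content metadata signal; confirm the
             exact name/value against Google's SafeSearch documentation. -->
        <meta name="rating" content="adult">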

    Finally, the company answered some burning questions in regards to image search. Some of the details include why Googlebot crawls images instead of Googlebot-Image, and whether there is a file size limit for images that can be crawled (spoiler: there’s not). Google also suggests that webmasters implement an Image Sitemap to tell them more about new images and what they’re about.
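
    An Image Sitemap is an ordinary XML Sitemap that uses Google’s image extension to list the images appearing on each page. A minimal sketch with hypothetical URLs:

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- Minimal image Sitemap sketch with hypothetical URLs. -->
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
                xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
          <url>
            <loc>http://www.example.com/giraffes.html</loc>
            <image:image>
              <image:loc>http://www.example.com/images/reticulated-giraffe-samburu.jpg</image:loc>
              <image:caption>Reticulated giraffe in Samburu National Reserve</image:caption>
            </image:image>
          </url>
        </urlset>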

    If you took anything away from this, it should be that image crawling is just as involved as regular Web crawling. It takes constant vigilance and work to get your image to the top of the heap in terms of relevance. These are just some beginner tips for getting your images crawled more often. If you need more in-depth instruction on image crawling, Google has a handy Image publishing guidelines page set up for just such an occasion.

  • Google To 20,000 Sites: You May Have Been Hacked

    Google has been sending out a lot of messages to webmasters lately. A lot have been getting them based on questionable links pointing to their sites, in relation to Google’s cracking down on paid blog/link networks.

    Now, over 20,000 sites have received messages from Google for a very different reason: hacking (or the possibility of hacking). Matt Cutts tweeted the following today:

    Is your site doing weird redirects? We just sent a “your site might be hacked” msg to 20K sites, e.g. http://t.co/r9jOkiOm

    Barry Schwartz at Search Engine Land claims to have seen some related activity. “I’ve personally seen a spike in the number of sites redirecting from their web site to a non-authorized site recently,” he writes. “The webmaster is typically unaware of this redirect because the redirects only occur when someone clicks from Google’s search results to the web site. Typically the site owner doesn’t go to Google to find his web site, the site owner goes directly to the site.”
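
    Since these hacks typically only redirect visitors arriving from a search results page, one quick diagnostic (not from Google’s message, just a common technique) is to request your own pages with and without a Google referer and compare the responses:

        # Fetch response headers twice -- once plain, once pretending to arrive
        # from a Google search -- and look for an unexpected Location header.
        curl -sI http://www.example.com/
        curl -sI -e "https://www.google.com/search?q=example" http://www.example.com/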

    It’s unclear if Google’s messages are related, but TheNextWeb recently reported on hacking of some sites in which the hacker was sneaking in and inserting backlinks to his or her own spammy content, and even messing with canonical link elements, tricking Google’s algorithm into thinking the hacker was the originator of the content even though he or she was simply scraping it. The hacker was even able to hijack +1’s in search results.

    Google has a help center article in Webmaster Tools about what to do if your site has been hacked. That includes taking your site offline and cleaning it of malicious software, and requesting a malware review from Google.

    “You can find out if your site has been identified as a site that may host or distribute malicious software (one type of ‘badware’) by checking the Webmaster Tools home page (Note: you need to verify site ownership to see this information.),” says Google.

    Google sends out notices to affected sites at the following email addresses: abuse@, admin@, administrator@, contact@, info@, postmaster@, support@ and webmaster@.

    Google bases its identifications of “badware” on guidelines from StopBadware.org, the company says, though it also uses its own criteria and tools to identify sites that host or distribute badware.

    “In some cases, third parties can add malicious code to legitimate sites, which would cause us to show the warning message,” Google says in the help center. “If you feel your site has been mistakenly identified, or if you make changes to your site so that it no longer hosts or distributes malicious software and you secure your site so that it is no longer vulnerable to the insertion of badware, you can request that your site be reviewed.”

    Google has instructions for cleaning your site here. This involves quarantining the site, assessing the damage, cleaning it up and asking for Google to review it.

  • Bing Webmaster Tools Explained

    Microsoft is pretty serious about Bing and competing with Google. They are now the second most used search engine, so they had better step up their game to keep growing their audience. Of course, with more users come more people who don’t know how to use Bing-specific tools. Microsoft has put together a short list that you may want to check out.

    Posting on the Bing Webmaster blog, the team has put together four tips for webmasters to get the most out of the tools Bing provides. They also took the opportunity to make an April Fools’ joke, but I’ll get to that later.

    The Bing team wants to bring your attention to their keyword research tool. It allows a user to, what else, search for keywords, but the method it uses is a bit unique. They claim that instead of pulling its results from paid advertising, the keyword research tool pulls results from organic search queries. You can search for just a phrase or an exact word and it will pull up results from up to six months ago. You can also target specific regions or languages in your quest to find better keywords.

    Everything is getting an API these days and Bing is no different. The Bing Webmaster API is available under settings on the left side of the Bing Webmaster page. Like other APIs, the Bing Webmaster API can be plugged into a dashboard for constant data streams. It’s especially useful if you have to watch multiple Web sites for upcoming trends and keywords.

    There is now an option to have emails sent to you whenever Bing detects something like malware. It’s an alert system that keeps you notified throughout the day in case anything comes up. To ensure that you’re not getting spammed, you can control the number of emails they send you. This option can be found under settings in the preferences section.

    Under the traffic section for your domain, they have added a new column called “Avg. Impression Position.” This is a constantly fluctuating number that tells you how Bing views your Web site against specific queries. When the number increases, it means that your Web site is being pushed higher up the search results; the reverse is true when the number goes lower. Keywords are among the most important drivers of traffic to Web sites, so a strong average impression position is key to driving traffic to your site via keywords.

    Finally, I mentioned that they made an April Fools joke. It’s not a very good one, but at least they tried. Bingbot, the software that runs all the previously mentioned programs, is said to be writing a memoir. Microsoft says he starred in Real Steel, was a stunt double for KITT from Knight Rider and almost got the role of C3PO. To make this joke better, they should make a Bing branded Knight Rider for their next ad. Just an idea, Microsoft. You can pay me when the ad campaign is a success.

  • Google Webmaster Tools Gets New Admin Feature

    Google announced the launch of a new Webmaster Tools feature, which lets verified site owners grant limited access to their site’s data and settings to other people.

    You can do this from the home page, by clicking “Manage Site” and going to the “Add or remove users” option, which has replaced the “Add or remove owners” option. This will take you to a new User admin page. From here, you can add or delete up to 100 users. Users can be identified as “full” or “restricted” depending on the rights you want to assign them.

    Full means they can view all data and take most actions. Restricted means they can view most data and take only some actions, such as using Fetch as Googlebot and configuring message forwarding.

    Here’s who can do what:

    Full vs. Restricted on Webmaster Tools

    “You’ve had the ability to grant full verified access to others for a couple of years,” says Google Webmaster Trends analyst Jonathan Simon on the Webmaster Central blog. “Since then we’ve heard lots of requests from site owners for the ability to grant limited permission for others to view a site’s data in Webmaster Tools without being able to modify all the settings. Now you can do exactly that with our new User administration feature.”

    “Users added via the User administration page are tied to a specific site,” he explains. “If you become unverified for that site any users that you’ve added will lose their access to that site in Webmaster Tools. Adding or removing verified site owners is still done on the owner verification page which is linked from the User administration page.”

    Hopefully the new feature will make site management easier for webmasters with a lot of employees and colleagues, and save a lot of hassle when access needs to be granted, changed or revoked.

  • Google Webmaster Tools Sitemaps Feature Gets Some Updates

    Google announced that it is including some new information in the Webmaster Tools sitemaps feature.

    This includes details broken down by content type, with stats for Web, Videos, Images and News featured more prominently.

    “This lets you see how many items of each type were submitted (if any), and for some content types, we also show how many items have been indexed,” explains Webmaster Tools engineer Kamila Primke. “With these enhancements, the new Sitemaps page replaces the Video Sitemaps Labs feature, which will be retired.”

    There is also now the ability to test a sitemap. “Unlike an actual submission, testing does not submit your Sitemap to Google as it only checks it for errors,” says Primke. “Testing requires a live fetch by Googlebot and usually takes a few seconds to complete. Note that the initial testing is not exhaustive and may not detect all issues; for example, errors that can only be identified once the URLs are downloaded are not caught by the test.”

    Google also has a new way of displaying errors, which the company says better exposes what types of issues a sitemap contains. Rather than repeating the same kind of error numerous times for one sitemap, Google will group errors and warnings, giving a few examples.

    For sitemap index files, Google aggregates errors and warnings from the child sitemaps that the sitemap index encloses, so users won’t have to click through each child one at a time.
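
    For reference, a Sitemap index file is just a small XML file that points at the child Sitemaps, something like the hypothetical sketch below; it’s the errors from those children that Webmaster Tools now rolls up for you.

        <?xml version="1.0" encoding="UTF-8"?>
        <!-- Hypothetical Sitemap index pointing at two child Sitemaps. -->
        <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <sitemap>
            <loc>http://www.example.com/sitemap-articles.xml</loc>
          </sitemap>
          <sitemap>
            <loc>http://www.example.com/sitemap-images.xml</loc>
          </sitemap>
        </sitemapindex>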

    The functionality of the delete button has changed as well. It will now remove the sitemap from Webmaster Tools for both your account and the accounts of the site’s other owners.

  • Google Adds New Duplicate Content Messages to Webmaster Tools

    Google announced today that it is launching new Webmaster Tools messages to notify webmasters when its algorithms select an external URL instead of one from their website, in duplicate content scenarios.

    Google defines “cross-domain URL selection” as when the representative URL (the URL that Google’s algorithm decides to use to represent the content in a duplicate content scenario) is selected from a group of duplicates spanning different sites.

    In other words, the selection Google goes with when two or more sites are showing the same content.

    “When your website is involved in a cross-domain selection, and you believe the selection is incorrect (i.e. not your intention), there are several strategies to improve the situation,” says Google Webmaster Trends Analyst Pierre Far.

    Google highlights three main reasons for unexpected cross-domain URL selections: duplicate content including multi-regional sites, configuration mistakes and malicious site attacks. Far points to various resources for each scenario in this blog post.
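
    One of the standard tools in these situations (for legitimate syndication, not the malicious cases) is a cross-domain rel=“canonical” element on the duplicate copy, pointing back at the URL you want selected. A hypothetical example placed on the syndicating site’s page:

        <!-- Placed in the <head> of the syndicated copy on the partner's domain;
             it tells Google which URL you'd prefer as the representative one. -->
        <link rel="canonical" href="http://www.example.com/original-article.html">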

    “In rare situations, our algorithms may select a URL from an external site that is hosting your content without your permission,” says Far. “If you believe that another site is duplicating your content in violation of copyright law, you may contact the site’s host to request removal. In addition, you can request that Google remove the infringing page from our search results by filing a request under the Digital Millennium Copyright Act.”

    At least the WMT messages should help alert you when it’s happening.

  • Google Encrypted Search Means No Info For Individual Queries

    Google announced that it is going to begin encrypting search queries with SSL (Secure Sockets Layer) as the default experience at Google.com when you search logged into your Google account. http://www.google.com will become https://www.google.com.

    “This is especially important when you’re using an unsecured Internet connection, such as a WiFi hotspot in an Internet cafe,” says Google product manager Evelyn Kao.

    There’s a chance that your Google experience will be slower with SSL because the computer you’re using has to establish a secure connection with Google. This is interesting, considering that Google has put so much effort into speeding things up.

    It’s worth noting that you can just go to https://www.google.com when you’re signed out, and still use encrypted search.

    Naturally, webmasters and SEOs are contemplating the effects this will have on search engine optimization and analytics.

    Sites visited from Google’s organic listings will be able to tell that the traffic is coming from Google, but they won’t be able to receive info about each individual query. They will, however, receive an aggregated list of the top 1,000 search queries that drove traffic to the site for each of the past 30 days in Webmaster Tools.

    “This information helps webmasters keep more accurate statistics about their user traffic,” says Kao. “If you choose to click on an ad appearing on our search results page, your browser will continue to send the relevant query over the network to enable advertisers to measure the effectiveness of their campaigns and to improve the ads and offers they present to you.”

    “When a signed in user visits your site from an organic Google search, all web analytics services, including Google Analytics, will continue to recognize the visit as Google ‘organic’ search, but will no longer report the query terms that the user searched on to reach your site,” says Amy Chang on the Google Analytics blog. “Keep in mind that the change will affect only a minority of your traffic. You will continue to see aggregate query data with no change, including visits from users who aren’t signed in and visits from Google ‘cpc’.”

    “We are still measuring all SEO traffic. You will still be able to see your conversion rates, segmentations, and more,” she adds. “To help you better identify the signed in user organic search visits, we created the token ‘(not provided)’ within Organic Search Traffic Keyword reporting. You will continue to see referrals without any change; only the queries for signed in user visits will be affected. Note that ‘cpc’ paid search data is not affected.”

    Google is making the encrypted search available on all of its search properties except for Maps.

  • Google Analytics Gets Webmaster Tools Data, New Search Reports

    Google is letting Google Analytics users get Webmaster Tools data in their GA accounts, so they can surface Google search data in GA.

    Several months ago, Google launched a pilot program, but now, the new set of reports is available to everyone. “The Webmaster Tools section contains three reports based on the Webmaster Tools data that we hope will give you a better sense of how your site performs in search results,” says Google Analytics Associate Product Manager Kate Cushing. “We’ve created a new section for these reports called Search Engine Optimization that will live under the Traffic Sources section.”

    That includes reports for Queries (impressions, clicks, position, and CTR info for the top 1,000 daily queries), Landing Pages (impressions, clicks, position and CTR info for the top 1,000 daily landing pages) and Geographical Summary (impressions, clicks and CTR by country).

    Google says it has made various improvements to the reports based on feedback from the pilot program.

    Perhaps these reports will help webmasters who have been affected by Google’s Panda update figure out some things.

    Users must, of course, link their WMT and GA accounts. To do so, go to the WMT home page, click “Manage site” next to the site you want, and click “Google Analytics property.” Select the property you want to associate with the site and save.

  • Google Webmaster Tools – Changes To Link Categorization

    Google announced that it is changing the way it categorizes link data in Webmaster Tools.

    “As you know, Webmaster Tools lists links pointing to your site in two separate categories: links coming from other sites, and links from within your site,” says Google Webmaster Trends analyst Susan Moskwa. “Today’s update won’t change your total number of links, but will hopefully present your backlinks in a way that more closely aligns with your idea of which links are actually from your site vs. from other sites.”

    For one, subdomains are now counted as internal links, which makes a great deal of sense. Here’s a chart showing how links have changed:

    Link categorization

    “If you own a site that’s on a subdomain (such as googlewebmastercentral.blogspot.com) or in a subfolder (www.google.com/support/webmasters/) and don’t own the root domain, you’ll still only see links from URLs starting with that subdomain or subfolder in your internal links, and all others will be categorized as external links,” says Moskwa. “We’ve made a few backend changes so that these numbers should be even more accurate for you.”

    She does note that if you own a root domain, your number of external links may appear to go down.

  • Help Google Crawl Your Site More Effectively, But Use Caution

    Google has introduced some changes to Webmaster Tools – in particular, handling of URLs with parameters.

    “URL Parameters helps you control which URLs on your site should be crawled by Googlebot, depending on the parameters that appear in these URLs,” explains Kamila Primke, Software Engineer with the Google Webmaster Tools Team. “This functionality provides a simple way to prevent crawling duplicate content on your site. Now, your site can be crawled more effectively, reducing your bandwidth usage and likely allowing more unique content from your site to be indexed. If you suspect that Googlebot’s crawl coverage of the content on your site could be improved, using this feature can be a good idea. But with great power comes great responsibility! You should only use this feature if you’re sure about the behavior of URL parameters on your site. Otherwise you might mistakenly prevent some URLs from being crawled, making their content no longer accessible to Googlebot.”

    Do you use URL parameters in Webmaster Tools? What do you think of the changes? Comment here.

    Google Webmaster Tools - URL Parameter page

    Google is now letting users describe the behavior of parameters. For example, you can let Google know if a parameter changes the actual content of the page.

    “If the parameter doesn’t affect the page’s content then your work is done; Googlebot will choose URLs with a representative value of this parameter and will crawl the URLs with this value,” says Primke. “Since the parameter doesn’t change the content, any value chosen is equally good. However, if the parameter does change the content of a page, you can now assign one of four possible ways for Google to crawl URLs with this parameter.”

    Those would be: let Googlebot decide, crawl every URL, only crawl URLs with a specific value, or crawl no URLs.

    Users can tell Google if a parameter sorts, paginates, determines content, or other things that it might do. For each parameter, Google will also “try” to show you a sample of example URLs from your site that it has already crawled that contain a given parameter.
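
    As a hypothetical illustration of the behaviors you can now describe, consider a handful of URLs on the same store:

        http://www.example.com/shoes?sessionid=12345        (doesn't change content)
        http://www.example.com/shoes?sort=price             (sorts the same items)
        http://www.example.com/shoes?page=2                 (paginates)
        http://www.example.com/shoes?category=running       (determines content)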

    To bring up the use of caution again, Primke warns about the responsibilities that come with using the No URLs option. “This option is the most restrictive and, for any given URL, takes precedence over settings of other parameters in that URL. This means that if the URL contains a parameter that is set to the ‘No URLs’ option, this URL will never be crawled, even if other parameters in the URL are set to ‘Every URL.’ You should be careful when using this option. The second most restrictive setting is ‘Only URLs with value=x.’”

    She runs through some examples in this blog post, and there is more related information in Google’s Webmaster Help forum.

    Webmasters & SEOs: here’s *tons* of great info on our improved tool to handle url parameters better: http://t.co/TtBs8tp

    Be Careful About Selling the Same Stuff From Multiple Domains

    As long as we’re discussing webmaster issues for Google, I’ll also point to the latest Webmaster Help video from Matt Cutts, who discusses selling products on multiple domains. The user question he sought to answer was:

    “I manage 3 websites that sell the same products across 3 domains. Each site has a different selling approach, price structure, target audience, etc. Does Google see this as spammy or black hat?”

    Cutts says, “On one hand, if the domains are radically different lay-out, different selling approach, different structure – like, essentially completely different, and especially the fact that you said it’s only 3 domains, that might not be so bad. Clearly if it were 300 domains or 3,000 domains – you can quickly get to a fairly large number of domains that can be crowding up the search results and creating a bad user experience…by the time you get to a relatively medium-sized number of sites.”

    “The thing that was interesting about the question is that you said it’s the same products, as in identical. So it’s a little weird if you’re selling identical products across 3 domains. If you were selling like men’s sweaters on one, and women’s sweaters on another, and shoes on a third….I’ve said before, there’s no problem with having different domains for each product, and a small number of domains (2, 3, or 4) for very normally separable reasons can make perfect sense, but it is a little strange to sell the same products, so if they’re really identical, that starts to look a little bit strange – especially if you start to get more than 3 domains.”

    “Definitely, I have found that if you have one domain, you’ve got the time to build it up – to build the reputation for that domain…in my experience, when someone has 50 or 100 domains, they tend not to put as much work – as much love into each individual domain, and whether they intend to or not, that tends to show after a while. People have the temptation to auto-generate content or they just try to syndicate a bunch of feeds, and then you land on one domain vs. another domain, and it really looks incredibly cookie cutter – comparing the two domains, and that’s when users start to complain.”

    Do you think Google takes the right approach to sites selling products from multiple domains? Share your thoughts here.

  • Google Emailing Notices To Webmasters Regarding Unnatural Links

    Many websites have fallen victim to Google’s emphasis on quality, and over the past few months, one of the targets has been unnatural links used to boost indexing and ranking. Google’s Webmaster Tools guidelines clearly demonstrate its intolerance of unnatural linking. Since December 2010, Google has sent out numerous email notices to websites about the unnatural links directed at their sites.

    The email being sent out by Google looks like this:

    Google Webmaster Tools notice of detected unnatural links to (your website)

    Dear site owner or webmaster of (your website),

    We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes. We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results. If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

    Sincerely,
    Google Search Quality Team.


    Many webmasters seem to have been completely caught off guard by this email, and to that end, many have discussed their concerns on the Google webmaster forum. One of the many such queries discussed on the forum was posted by studione on July 16, 2011, regarding http://playarena.nl/:

    “ Hi
    I noticed this message send from Google two weeks ago about detecting unnatural links on my site www.playarena.nl/. My site is an online flash game site, with many games to be played.
    I read the guidelines: links added as part of a link exchange scheme may be considered excessive. I’m using Linkex for finding reciprocal link partners and by now have only two reciprocal links. Is using Linkex banned by google, as I read that exxesive use of reciprocal is not excepted. But in my case its accepted.

    Could somebody help me out on this.

    With Kind regards,

    Playarena”

    How to avoid this:

    • Make sure other websites link to your website in a natural way to increase your ranking
    • Don’t participate in link schemes, which include manipulating, buying or selling links that pass PageRank. According to Google’s Matt Cutts, “remove the paid links that pass PageRank.” In addition, avoid linking to web spammers and engaging in too much reciprocal linking
    • Consider evaluating your link exchanges and discarding the links that wouldn’t be accepted under Google’s guidelines
    • Also consider evaluating the content on your website and on other websites that are along similar lines to yours
    • Furthermore, avoid using lists of keywords in an attempt to “cloak” pages, or uploading “crawler only” pages

    Once you have made the above recommended changes:

    • Sign in to Google Webmaster Tools using your Google account
    • Add and verify the website you want Google to reconsider
    • Request Google to reconsider your website

    Once you’ve requested reconsideration, we suggest you hang in there. Google is usually fair to both webmasters and users, so you shouldn’t have many problems there.

    Check out Page Traffic Buzz for more articles by Navneet Kaushal.

  • Bing Webmaster Tools Refreshed

    Bing has launched some enhancements to Bing Webmaster Tools in an update called “Honey Badger”.

    “Today’s redesign offers webmasters a simplified experience that allows them to quickly analyze and identify trends – while also bringing new and unique features to the industry,” a representative for Bing tells WebProNews. “Our goal is to help webmasters make faster, more informed decisions and drive new insights about their website by presenting them with rich visuals and more organized, relevant content.”

    Enhancements include:

    • Crawl delay management: Lets webmasters configure the bingbot crawl rate for a specific domain.
    • Index Explorer: Gives webmasters the ability to access data in the Bing index regarding a specified domain.
    • User and Role Management: Provides site owners with the ability to grant admin, read/write or read-only access to other users for their site.

    Crawl delay is configurable by hour. Users can ask Bing to crawl slower during peak business hours or faster during off-peak hours. There is drag-and-drop functionality that lets users create a crawl graph by clicking and dragging the mouse pointer across the graph. Individual columns can also be clicked for fine-tuning.

    Bing Crawl Settings

    Index Explorer, Bing says, is a “complete rewrite” of the Index Tracker backend, focusing on freshness, performance, extensibility, reduced machine footprint, and stability and failure detection. New sites will have this data as they sign up.

    Bing Index Explorer

    The company also launched the ability for webmasters to manage deep-links and added over 40 new educational documents and videos to the Toolbox site. The content covers things like: using Webmaster Tools, data explanation, link building, removing/blocking pages from Bing’s index, SEO guidance, managing URL parameters, rich snippets (schema.org), canonicalization, nofollow, managing redirects, 404 page management, etc.

    Bing says you can “count on more monthly content being added” to Webmaster Tools in the near future.