WebProNews

Tag: SEO

  • Webmasters Think Google Has Eased Up On EMD Sites

    Back in 2012, Google launched the EMD update, which the company described as a small update, affecting 0.6% of English-US queries to a noticeable degree. It was designed to demote low quality exact match domain sites.

    We don’t really hear that much about the update anymore, but there were quite a few complaints from webmasters hit by it when it launched.

    There is some new discussion in the webmaster community, with people thinking Google has eased up on this recurring update.

    Barry Schwartz at Search Engine Roundtable points to some chatter in the WebmasterWorld forum, and more people chimed in in the comments of his post, saying they’ve been seeing more EMD sites ranking for various industry-specific searches.

    Gareth Miller suggests that Google didn’t really punish low-quality EMD sites, calling it “another case of Google doing something and then exaggerating its impact to discourage spammers from pursuing the tactic.”

    I don’t know. I wouldn’t say Google exaggerated it, considering they said it was small to begin with, though I’m sure he has a valid point about Google discouraging spammers.

    Apparently, if these sites are appearing as frequently as some say, it didn’t discourage them enough.

    Of course there are plenty of EMD sites that do offer quality content, and deserve to rank just as much as the next site. We talked about this with Todd Malicoat (aka: Stuntdubl) last year.

    Image: DenverLawyers.com

  • Google Improves URL Removal Tool

    Google has launched an improved version of its URL removal tool in Webmaster Tools, aimed at making it easier to request updates based on changes to other people’s sites.

    Google suggests that you could use the tool if a page has been removed completely or if it has changed, and you need the snippet and cached page removed.

    “If the page itself was removed completely, you can request that it’s removed from Google’s search results,” says Google Webmaster Trends analyst John Mueller. “For this, it’s important that the page returns the proper HTTP result code (403, 404, or 410), has a noindex robots meta tag, or is blocked by the robots.txt (blocking via robots.txt may not prevent indexing of the URL permanently). You can check the HTTP result code with a HTTP header checker. While we attempt to recognize ‘soft-404’ errors, having the website use a clear response code is always preferred.”
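
    To make those prerequisites concrete, here is a minimal Python sketch (my own, not a Google tool; the function name and the crude string matching for the meta tag are illustrative assumptions) that checks whether a URL returns 403/404/410, appears to carry a noindex robots meta tag, or is blocked by robots.txt:

    import urllib.request
    import urllib.robotparser
    from urllib.error import HTTPError
    from urllib.parse import urlparse

    def removal_ready(url, user_agent="Googlebot"):
        """Rough checks mirroring the removal prerequisites Mueller describes."""
        checks = {}

        # 1. HTTP result code: 403, 404, or 410 signals the page is gone.
        try:
            with urllib.request.urlopen(url) as resp:
                status = resp.status
                html = resp.read(200_000).decode("utf-8", errors="ignore").lower()
        except HTTPError as err:
            status, html = err.code, ""
        checks["gone_status"] = status in (403, 404, 410)

        # 2. noindex robots meta tag (naive string check, good enough for a spot check).
        checks["noindex"] = "robots" in html and "noindex" in html

        # 3. Blocked by robots.txt (blocking alone may not keep the URL out permanently).
        parts = urlparse(url)
        robots = urllib.robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
        try:
            robots.read()
            checks["robots_blocked"] = not robots.can_fetch(user_agent, url)
        except OSError:
            checks["robots_blocked"] = False

        return checks

    print(removal_ready("https://example.com/removed-page"))

    If none of those checks pass, the removal isn’t likely to stick, given Mueller’s note above.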

    For submitting a page for removal, just enter the URL and confirm the request.

    “If the page wasn’t removed, you can also use this tool to let us know that a text on a page (such as a name) has been removed or changed,” says Mueller. “It’ll remove the snippet & cached page in Google’s search results until our systems have been able to reprocess the page completely (it won’t affect title or ranking). In addition to the page’s URL, you’ll need at least one word that used to be on the page but is now removed.”

    Webmasters are instructed to enter the URL, confirm that the page has been updated or removed and that the cache and snippet are outdated, and enter a word that no longer appears on the live page, but still appears in the cache or snippet.

    Image: Google

  • The Latest From Google On Guest Blogging

    The subject of guest blogging has been coming up more and more lately in Google’s messaging to webmasters. Long story short, just don’t abuse it.

    Matt Cutts talked about it in response to a submitted question in a recent Webmaster Help video:

    He said, “It’s clear from the way that people are talking about it that there are a lot of low-quality guest blogger sites, and there’s a lot of low-quality guest blogging going on. And anytime people are automating that or abusing that or really trying to make a bunch of links without really doing the sort of hard work that really earns links on the basis of merit or because they’re editorial, then it’s safe to assume that Google will take a closer look at that.”

    “I wouldn’t recommend that you make it your only way of gathering links,” Cutts added. “I wouldn’t recommend that you send out thousands of blast emails offering to guest blog. I wouldn’t recommend that you guest blog with the same article on two different blogs. I wouldn’t recommend that you take one article and spin it lots of times. There’s definitely a lot of abuse and growing spam that we see in the guest blogging space, so regardless of the spam technique that people are using from month to month, we’re always looking at things that are starting to be more and more abused, and we’re always willing to respond to that and take the appropriate action to make sure that users get the best set of search results.”

    But you already knew that, right?

  • Google Goes After Yet Another Link Network

    Earlier this month, Google revealed that it would be cracking down on more link networks, following a larger trend that has been taking place throughout the year.

    Google’s Matt Cutts hinted on Twitter that Google was taking action on the network Anglo Rank.

    He went on to note that they’d be “rolling up a few.”

    On Friday, Cutts tweeted similarly:

    This was apparently in reference to another network, BackLinks.com.

    No surprises really, but Google is making it quite clear that it’s going to continue to penalize these types of sites.

    Hat tip to Search Engine Land.

    Image: BackLinks.com

  • Google Webmaster Tools Now Shows Structured Data Errors

    Google announced today that it has launched a new error reporting feature for the Structured Data Dashboard in Webmaster Tools. The company began testing this earlier this year, and has used feedback from webmasters to fine-tune the feature.

    Users can now see items with errors in the dashboard. Items represent top-level structured data elements tagged in the HTML code. Nested items aren’t counted. Google groups them by data type and orders them by number of errors.
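
    For illustration, here is what one top-level item might look like, written as the Python dictionary its JSON-LD would map to (a hypothetical product page; the values are made up). In the dashboard’s terms, the Product is the item that gets counted, and the nested Offer is part of it rather than a separate item:

    # Hypothetical structured data for a product page, shown as a dict.
    product_item = {
        "@context": "http://schema.org",
        "@type": "Product",              # the top-level item the dashboard counts
        "name": "Acme Widget",
        "offers": {                      # nested item: not counted separately
            "@type": "Offer",
            "price": "19.99",
            "priceCurrency": "USD",
        },
    }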

    “We’ve added a separate scale for the errors on the right side of the graph in the dashboard, so you can compare items and errors over time,” notes Google webmaster trends analyst Mariya Moeva. “This can be useful to spot connections between changes you may have made on your site and markup errors that are appearing (or disappearing!).”

    Google says it has also updated its data pipeline, so reporting will be more comprehensive.

    When you click on a specific content type, Google will show you the markup error it found for that type. You can see all at once or filter by error type. Google suggests checking to see if the markup meets the implementation guidelines, which can be found here.

    You can click on the URLs in the table to see details about what markup Google detected the last time it crawled the page and what’s missing. There’s also a “test live data” button so you can test the markup with Google’s Structured Data Testing Tool.

    After you fix issues, the changes will be reflected in the dashboard.

    Image: Google

  • Google Gives You A Checklist For Making Sure Your Mobile Site Is Up To Snuff

    Google has put out a checklist for improving mobile websites. The search engine recently made several ranking changes for sites not configured for smartphone users, so you may want to pay attention.

    The list is broken down into three main steps: stop frustrating your customers, facilitate task completion and convert customers to fans. Don’t worry. The points under each one are more specific.

    For step one, there are tips related to removing extra windows from all mobile user-agents, providing device-appropriate functionality, testing, etc.

    For step two, the tips get into optimizing crawling, indexing and the searcher experience, and optimizing popular mobile persona workflows for your site.

    For step three, tips are about considering search integration points with mobile apps, investigating/attempting to track cross-device workflow and brainstorming new ways to provide value.

    You may also want to take forty minutes or so to watch these two videos from Google’s Maile Ohye discussing improving high-traffic, poor user-experience mobile pages, optimizing the top mobile tasks on your site and quick fixes for mobile website performance.

    Last week, Google added smartphone crawl errors to Webmaster Tools.

  • Google Gives Advice On Speedier Penalty Recovery

    Google has shared some advice in a new Webmaster Help video about recovering from Google penalties incurred as the result of a period of spammy links.

    Now, as we’ve seen, sometimes this happens to a company unintentionally. A business could have hired the wrong person/people to do their SEO work, and gotten their site banished from Google, without even realizing they were doing anything wrong. Remember when Google had to penalize its own Chrome landing page because a third-party firm bent the rules on its behalf?

    Google is cautiously suggesting “radical” actions from webmasters, and sending a bit of a mixed message.

    How far would you go to get back in Google’s good graces? How important is Google to your business’ survival? Share your thoughts in the comments.

    The company’s head of webspam, Matt Cutts, took on the following question:

    How did Interflora turn their ban in 11 days? Can you explain what kind of penalty they had, how did they fix it, as some of us have spent months try[ing] to clean things up after an unclear GWT notification.

    As you may recall, Interflora, a major UK flowers site, was hit with a Google penalty early this year. Google didn’t exactly call out the company publicly, but after reports of the penalty came out, the company mysteriously wrote a blog post warning people not to engage in the buying and selling of links.

    But you don’t have to buy and sell links to get hit with a Google penalty for webspam, and Cutts’ response goes beyond that. He declines to discuss a specific company because that’s typically not Google’s style, but proceeds to try and answer the question in more general terms.

    “Google tends to look at buying and selling links that pass PageRank as a violation of our guidelines, and if we see that happening multiple times – repeated times – then the actions that we take get more and more severe, so we’re more willing to take stronger action whenever we see repeat violations,” he says.

    That’s the first thing to keep in mind, if you’re trying to recover. Don’t try to recover by breaking the rules more, because that will just make Google’s vengeance all the greater when it inevitably catches you.

    Google continues to bring the hammer down on any black hat link network it can get its hands on, by the way. Just the other day, Cutts noted that Google has taken out a few of them, following a larger trend that has been going on throughout the year.

    The second thing to keep in mind is that Google wants to know you’re taking its guidelines seriously, and that you really do want to get better – you really do want to play by the rules.

    “If a company were to be caught buying links, it would be interesting if, for example, you knew that it started in the middle of 2012, and ended in March 2013 or something like that,” Cutts continues in the video. “If a company were to go back and disavow every single link that they had gotten in 2012, that’s a pretty monumentally epic, large action. So that’s the sort of thing where a company is willing to say, ‘You know what? We might have had good links for a number of years, and then we just had really bad advice, and somebody did everything wrong for a few months – maybe up to a year, so just to be safe, let’s just disavow everything in that timeframe.’ That’s a pretty radical action, and that’s the sort of thing where if we heard back in a reconsideration request that someone had taken that kind of a strong action, then we could look, and say, ‘Ok, this is something that people are taking seriously.’”

    Now, don’t go getting carried away. Google has been pretty clear since the Disavow Links tool launched that this isn’t something that most people want to do.

    Cutts reiterates, “So it’s not something that I would typically recommend for everybody – to disavow every link that you’ve gotten for a period of years – but certainly when people start over with completely new websites they bought – we have seen a few cases where people will disavow every single link because they truly want to get a fresh start. It’s a nice looking domain, but the previous owners had just burned it to a crisp in terms of the amount of webspam that they’ve done. So typically what we see from a reconsideration request is people starting out, and just trying to prune a few links. A good reconsideration request is often using the ‘domain:’ query, and taking out large amounts of domains which have bad links.”

    “I wouldn’t necessarily recommend going and removing everything from the last year or everything from the last year and a half,” he adds. “But that sort of large-scale action, if taken, can have an impact whenever we’re assessing a domain within a reconsideration request.”

    In other words, if you’re willing to go to such great lengths and eliminate such a large number of links, Google’s going to notice.

    I don’t know that it’s going to get you out of the penalty box in eleven days (as the Interflora question mentions), but it will at least show Google that you mean business, and, in theory at least, help you get out of it.

    Much of what Cutts has to say this time around echoes things he has mentioned in the past. Earlier this year, he suggested using the Disavow Links tool like a “machete”. He noted that Google sees a lot of people trying to go through their links with a fine-toothed comb, when they should really be taking broader swipes.

    “For example, often it would help to use the ‘domain:’ operator to disavow all bad backlinks from an entire domain rather than trying to use a scalpel to pick out the individual bad links,” he said. “That’s one reason why we sometimes see it take a while to clean up those old, not-very-good links.”

    On another occasion, he discussed some common mistakes he sees people making with the Disavow Links tool. One of the most common: the first time someone attempts a reconsideration request, they take the scalpel (or “fine-toothed comb”) approach rather than the machete approach.

    “You need to go a little bit deeper in terms of getting rid of the really bad links,” he said. “So, for example, if you’ve got links from some very spammy forum or something like that, rather than trying to identify the individual pages, that might be the opportunity to do a ‘domain:’. So if you’ve got a lot of links that you think are bad from a particular site, just go ahead and do ‘domain:’ and the name of that domain. Don’t maybe try to pick out the individual links because you might be missing a lot more links.”

    And remember, you need to make sure you’re using the right syntax. You need to use the “domain:” query in the following format:

    domain:example.com

    Don’t add an “http” or a “www” or anything like that. Just the domain.
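
    If you’re assembling a domain-level disavow file by hand, a few lines of Python can generate it from your audit list. This is just a sketch of the file format described above (the domain names and date are made up); lines beginning with # are treated as comments:

    # Hypothetical output of a link audit: domains to write off entirely.
    bad_domains = ["spammy-forum.example", "paid-links.example"]

    lines = ["# Disavowed after link audit, December 2013"]
    lines += [f"domain:{d}" for d in bad_domains]  # the domain: syntax, no http or www

    with open("disavow.txt", "w") as f:
        f.write("\n".join(lines) + "\n")

    The resulting disavow.txt is what you would upload through the Disavow Links tool.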

    So, just to recap: radical, large-scale actions could be just what you need to take to make Google seriously reconsider your site, and could get things moving more quickly than trying to single out individual links. But Google wouldn’t necessarily recommend doing it.

    Oh, Google. You and your crystal clear, never-mixed messaging.

    As Max Minzer commented on YouTube (or is that Google+?), “everyone is going to do exactly that now…unfortunately.”

    Yes, this advice will no doubt lead many to unnecessarily obliterate many of the backlinks they’ve accumulated – including legitimate links – for fear of Google. Fear they won’t be able to make that recovery at all, let alone quickly. Hopefully the potential for overcompensation will be considered if Google decides to use Disavow Links as a ranking signal.

    Would you consider having Google disavow all links from a year’s time? Share your thoughts in the comments.

  • Google Toolbar PageRank Lives (For Now)

    Just when you thought you were out, they’ve pulled you back in.

    Google has updated its data for Toolbar PageRank, after indicating that it likely wouldn’t happen before the end of the year, if at all. Many of us assumed it was pretty much going away because it had been so long since the last update, after years of regularity.

    Google’s Matt Cutts tells us it came as the result of an update to a backend service that “handles dupes and equivalent names,” and that while he’d hesitate to say, he’d be surprised if regular updates like before started happening again; in general, he’d “expect PR updates to be less of a priority.”

    Are you glad to see a PageRank update? How did you do? Let us know in the comments.

    Reactions to the update are mixed. Some are happy to see the new(er) data, while others wish it would just go away once and for all. As those in the SEO industry have known for years, the data simply isn’t that useful as a day-to-day tool, mainly due to the time that passes between updates, yet others obsess about it.

    Here’s a real time look at what people are saying about the update on Twitter:


    This is the first time Google has updated PageRank since February. Historically, they’ve updated it every three or four months. Cutts tweeted in October that he’d be surprised if there was another PR update before 2014.

    Shortly after that, he discussed the topic in a Webmaster Help video:

    “Over time, the Toolbar PageRank is getting less usage just because recent versions of Internet Explorer don’t really let you install toolbars as easily, and Chrome doesn’t have the toolbar so over time, the PageRank indicator will probably start to go away a little bit,” he said.

    In another video earlier in the year, he said, “Maybe it will go away on its own or eventually we’ll reach the point where we say, ‘Okay, maintaining this is not worth the amount of work.’”

    On Twitter, Cutts acknowledged the update, which perhaps did come as a surprise to him, as it came at the hands of a different team at Google.

    He also mentioned on Twitter that it wasn’t an accident, but “was just easier for them to push the new PR data rather than keep the old data.”

    Cutts tells us:

    Sounds like we’re probably not going to get the frequency of years past.

    Should Google continue to update Toolbar PR in the future? Let us know what you think.

  • Google Takes Action On More Link Networks

    Google has been cracking down on link networks, penalizing the networks and the sites that take advantage of them to artificially inflate their link profiles, all year.

    Google’s Matt Cutts hinted on Twitter that the search engine has taken action on yet another one – Anglo Rank:

    While engaging with the Search Engine Land crew on Twitter, he noted they’ve been “rolling up a few”.

    A giant ad for AngloRank can be seen at BlackHatWorld (h/t: Search Engine Land). It promises “high PR English links from the most exclusive and unique private networks on the web.”

    Back in May, Cutts announced that Google would continue to tackle link networks, and that in fact, they had just taken action on “several thousand linksellers”.

    More recently, Google took out the link network GhostRank 2.0.

    The moral of the story is: stay away from these networks, because Google will figure it out, and make you pay. But you already knew that.

    Image via BlackHatWorld

  • Matt Cutts Talks Content Stitching In New Video

    Google has a new Webmaster Help video out about content that takes text from other sources. Specifically, Matt Cutts responds to this question:

    Hi Matt, can a site still do well in Google if I copy only a small portion of content from different websites and create my own article by combining it all, considering I will mention the source of that content (by giving their URLs in the article)?

    “Yahoo especially used to really hate this particular technique,” says Cutts. “They called it ‘stitching’. If it was like two or three sentences from one article, and two or three sentences from another article, and two or three sentences from another article, they really considered that spam. If all you’re doing is just taking quotes from everybody else, that’s probably not a lot of added value. So I would really ask yourself: are you doing this automatically? Why are you doing this? Why? People don’t just like to watch a clip show on TV. They like to see original content.”

    I don’t know. SportsCenter is pretty popular, and I don’t think it’s entirely for all the glowing commentary. It’s also interesting that he’s talking about this from Yahoo’s perspective.

    “They don’t just want to see an excerpt and one line, and then an excerpt and one line, and that sort of thing,” Cutts continues. “Now it is possible to pull together a lot of different sources, and generate something really nice, but you’re usually synthesizing. For example, Wikipedia will have stuff that’s notable about a particular topic, and they’ll have their sources noted, and they cite all of their sources there, and they synthesize a little bit, you know. It’s not like they’re just copying the text, but they’re sort of summarizing or presenting as neutral of a case as they can. That’s something that a lot of people really enjoy, and if that’s the sort of thing that you’re talking about, that would probably be fine, but if you’re just wholesale copying sections from individual articles, that’s probably going to be a higher risk area, and I might encourage you to avoid that if you can.”

    If you’re creating good content that serves a valid purpose for your users, my guess is that you’ll be fine, but you know Google hates anything automated when it comes to content.

  • Google Adds Smartphone Crawl Errors To Webmaster Tools (This Is Important Considering Recent Ranking Changes)

    Google announced that it has expanded the Crawl Errors feature in Webmaster Tools to help webmasters identify pages on their sites that show smartphone crawl errors.

    This is going to be of particular importance because Google recently made several ranking changes for sites not configured for smartphone users.

    “Some smartphone-optimized websites are misconfigured in that they don’t show searchers the information they were seeking,” says Google Webmaster Trends analyst Pierre Far. “For example, smartphone users are shown an error page or get redirected to an irrelevant page, but desktop users are shown the content they wanted. Some of these problems, detected by Googlebot as crawl errors, significantly hurt your website’s user experience and are the basis of some of our recently-announced ranking changes for smartphone search results.”

    The feature will include server errors, “not found” errors and soft 404s, faulty redirects and blocked URLs.

    Mobile crawl errors

    “Fixing any issues shown in Webmaster Tools can make your site better for users and help our algorithms better index your content,” says Far.
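
    A quick way to spot the faulty-redirect case Far describes is to fetch a deep URL with a desktop user-agent and a smartphone user-agent and compare where each lands. This is a rough sketch of my own (the user-agent strings are simplified and the URLs are hypothetical), not part of Webmaster Tools:

    import urllib.request

    DESKTOP_UA = "Mozilla/5.0 (Windows NT 6.1)"
    SMARTPHONE_UA = "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0 like Mac OS X)"

    def final_url(url, user_agent):
        req = urllib.request.Request(url, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(req) as resp:
            return resp.geturl()  # follows redirects; returns the URL actually served

    url = "http://www.example.com/deep/article"  # hypothetical deep page
    desktop, mobile = final_url(url, DESKTOP_UA), final_url(url, SMARTPHONE_UA)

    # A classic faulty redirect: every smartphone request lands on the m. homepage.
    if mobile != desktop and mobile.rstrip("/").endswith("m.example.com"):
        print("Possible faulty redirect to the mobile homepage:", mobile)
    else:
        print("Desktop serves", desktop, "| smartphone serves", mobile)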

    Google’s Matt Cutts recently asked what users would like to see Google add to Webmaster Tools in the coming year. Some will no doubt be able to cross one thing off their list now.

    Image: Google

  • Google’s Cutts Talks Link Limits

    Google has put out a new Webmaster Help video about how many links you should have on a page.

    In summary, Google used to advise people not to have more than a hundred links on a page, but now you can get by with more than that. According to Cutts there’s no real limit, though there might be. Just don’t have too many. But don’t worry about it too much.

    Confused? That’s because there’s no real answer here. The basic gist is just: be reasonable. Of course that’s subjective, but here’s what Cutts says:

    He says, “It used to be the case that Googlebot and our indexing system would truncate at 100 or 101K, and anything beyond that wouldn’t get indexed, and what we did, was we said, ‘Okay, if the page is 101K, 100K, then, you know, it’s reasonable to expect roughly one link per kilobyte, and therefore, something like 100 links on a page.’ So that was in our technical guidelines, and we said, you know, ‘This is what we recommend,’ and a lot of people assumed that if they had 102 links or something like that then we would view it as spam, and take action, but that was just kind of a rough guideline.”

    “Nonetheless, the web changes,” he continues. “It evolves. In particular, webpages have gotten a lot bigger. There’s more rich media, and so it’s not all that uncommon to have aggregators or various things that might have a lot more links, so we removed that guideline, and we basically just now say, ‘Keep it to a reasonable number,’ which I think is pretty good guidance. There may be a limit on the file size that we have now, but it’s much larger, and at the same time, the number of links that we can process on a page is much larger.”

    “A couple factors to bear in mind,” he notes. “When you have PageRank, the amount of PageRank that flows through the outlinks is divided by the number of total outlinks, so if you have, you know, 100 links, you’ll divide your PageRank by 100. If you have 1,000 links, you’ll divide your PageRank by 1,000. So if you have a huge amount of links, the amount of PageRank that’s flowing out on each individual link can become very, very small. So the other thing is it can start to annoy users, or it can start to look spammy if you have tons and tons and tons of links, so we are willing to take action on the webspam side if we see so many links that it looks really, really spammy.”
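
    The division Cutts describes is easy to see with a back-of-the-envelope calculation (this ignores the damping factor and everything else in the real PageRank formula; it only illustrates the split across outlinks):

    page_rank_to_pass = 1.0  # arbitrary units for illustration

    for outlinks in (10, 100, 1000):
        # Each outlink gets an equal slice of whatever PageRank the page passes.
        print(f"{outlinks:>5} links -> {page_rank_to_pass / outlinks:.4f} per link")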

    If you’re concerned about having too many links on a page, Cutts suggests getting a “regular user,” and testing it out with them to see if they think it has too many links.

    So, in the end, just ask a friend, “Hey man, do you think this page has too many links?”

    Problem solved.

  • Google Wants Some Ideas For Webmaster Tools. Got Any?

    In a recent article, we asked if Google is being transparent enough. While the question was asked broadly, much of our discussion had to do specifically with webmasters. Is Google providing them with enough information?

    I mean, after all, a single algorithm tweak can completely kill a business, or cause one to have to lay off staff. Webmasters want to know as much as possible about how Google works and how it views their sites.

    What do you think Webmaster Tools needs more than anything else? Let us know in the comments.

    We’re not asking that question just for conversation’s sake, though that should be interesting too. Google actually wants to know. Or at least one pretty important and influential Googler does.

    Matt Cutts, head of Google’s webspam team, has taken to his personal blog to ask people what they would like to see Google Webmaster Tools offer in 2014.

    So here’s your chance to have your voice heard.

    “At this point, our webmaster console will alert you to manual webspam actions that will directly affect your site,” he writes. “We’ve recently rolled out better visibility on website security issues, including radically improved resources for hacked site help. We’ve also improved the backlinks that we show to publishers and site owners. Along the way, we’ve also created a website that explains how search works, and Google has done dozens of ‘office hours’ hangouts for websites. And we’re just about to hit 15 million views on ~500 different webmaster videos.”

    I like to think we’ve played some small role in that.

    Cutts lists fourteen items himself as things he could “imagine people wanting,” but notes that he’s just brainstorming, and that there’s no guarantee any of these will actually be worked on.

    Among his ideas:

    • Making authorship easier
    • Improving spam/bug/error/issue reporting
    • An option to download pages from your site that Google has crawled (in case of emergency)
    • Checklists for new businesses
    • Reports with advice for improving mobile/page speed
    • The ability to let Google know about “fat pings” of content before publishing it to the web, so Google knows where it first appeared
    • Better duplicate content/scraper reporting tools
    • Showing pages that don’t validate
    • Showing pages that link to your 404 pages
    • Showing pages on your site that lead to 404s and broken links
    • Better bulk URL removal
    • Refreshing data faster
    • Improving the robots.txt checker
    • Ways for site owners to tell Google about their site

    Even if we don’t see all of these things come to Webmaster Tools in the near future, it’s interesting to see the things Cutts is openly thinking about.

    The post’s comments from Webmasters are already in the hundreds, so Google will certainly have plenty of ideas to work with. Googlers like Cutts have been known to peruse the WPN comments from time to time as well, so I wouldn’t worry about your response going unnoticed here either.

    What do you think Webmaster Tools needs more than anything? Let us know in the comments. Better yet, let us know what you think it might actually get.

    Image: Google

  • Cutts Talks Disavow Links Tool And Negative SEO

    Google has put out a new Webmaster Help video discussing the Disavow Links tool, and whether or not it’s a good idea to use it even when you don’t have a manual action against your site.

    Google’s Matt Cutts takes on the following question:

    Should webmasters use the disavow tool, even if it is believed that no penalty has been applied? For example, if we believe ‘negative SEO’ has been attempted, or spammy sites we have contacted have not removed links.

    As Cutts notes, the main purpose of the tool is for when you’ve done some “bad SEO” yourself, or someone has on your behalf.

    “At the same time, if you’re at all worried about someone trying to do negative SEO or it looks like there’s some weird bot that’s building up a bunch of links to your site, and you have no idea where it came from, that’s a perfect time to use Disavow as well.”

    “I wouldn’t worry about going ahead and disavowing links even if you don’t have a message in your webmaster console. So if you have done the work to keep an active look on your backlinks, and you see something strange going on, you don’t have to wait around. Feel free to just preemptively say, ‘This is a weird domain. I have nothing to do with it. I don’t know what this particular bot is doing in terms of making links.’ Just feel free to go ahead and do disavows, even on a domain level.”

    As Cutts has said in the past, feel free to use the tool “like a machete”.

  • Is Google Being Transparent Enough?

    Many would say that Google has become more transparent over the years. It gives users, businesses and webmasters access to a lot more information about its intentions and business practices than it did long ago, but is it going far enough?

    When it comes to its search algorithm and changes to how it ranks content, Google has arguably scaled back a bit on the transparency over the past year or so.

    Do you think Google is transparent enough? Does it give webmasters enough information? Share your thoughts in the comments.

    Google, as a company, certainly pushes the notion that it is transparent. Just last week, Google updated its Transparency Report for the eighth time, showing government requests for user information (which have doubled over three years, by the way). That’s one thing.

    For the average online business that relies on Internet visibility for customers, however, these updates are of little comfort.

    As you know, Google, on occasion, launches updates to its search algorithm, which can have devastating effects on sites that relied on the search engine for traffic. Sometimes (and probably more often than not), the sites that get hit deserve to get hit. They’re just trying to game the system and rank where they really shouldn’t be ranking. Sometimes, though, people who aren’t trying to be deceptive and are just trying to make their businesses work are affected too.

    Google openly talks about these updates. Panda and Penguin are regular topics of discussion for Googlers like Matt Cutts and John Mueller. Google tries to send a clear message about the type of content it wants, but still leaves plenty of sites guessing about why they actually got hit by an update.

    Not all of Google’s algorithmic changes are huge updates like Panda and Penguin. Google makes smaller tweaks on a daily basis, and these changes are bound to have an effect on the ranking of content here and there. Otherwise, what’s the point?

    While Google would never give away its secret recipe for ranking, there was a time (not that long ago) when Google decided that it would be a good idea to give people a look at some changes it has been making. Then, they apparently decided otherwise.

    In December of 2011, Google announced what it described as a “monthly series on algorithm changes” on its Inside Search blog. Google started posting monthly lists of what it referred to as “search quality highlights”. These provided perhaps the most transparency into how Google changes its algorithm that Google has ever provided. It didn’t exactly give you a clear instruction manual for ranking above your competition, but it showed the kinds of changes Google was making – some big and some small.

    Above all else, it gave you a general sense of the kinds of areas Google was looking at during a particular time period. For example, there was a period of time when many of the specific changes Google was making were directly related to how it handles synonyms.

    Google described the lists as an attempt to “push the envelope when it comes to transparency.” Google started off delivering the lists once a month as promised. Eventually, they started coming out much more slowly. For a while, they came out every other month, with multiple lists at a time. Then, they just stopped coming.

    To my knowledge, Google hasn’t bothered to explain why (a lack of transparency on its own), though I’ve reached out for comment on the matter multiple times.

    It’s been over a year since Google released one of these “transparency” lists. The last one was on October 4th of last year. It’s probably safe to say at this point that this is no longer happening. Either that or we’re going to have one giant year-long list at the end of 2013.

    For now, we’re just going to have to live with this reduction in transparency.

    Don’t get me wrong, Google has given webmasters some pretty helpful tools during that time. Since that last list of algorithm changes, Google has launched the Disavow Links tool, the Data Highlighter tool, the manual action viewer, and the Security Issues feature and altered the way it selects sample links.

    Barry Schwartz from Search Engine Roundtable says he’d like to see an “automated action viewer” to complement the manual action viewer. As would many others, no doubt.

    “Don’t get me wrong,” he writes. “Google’s transparency over the years has grown tremendously. But this one thing would be gold for most small webmasters who are lost and being told by “SEO experts” or companies things that may not be true. I see so many webmasters chasing their tails – it pains me.”

    Cutts continues to regularly put out videos responding to user-submitted questions (webmasters find these to be varying degrees of helpful).

    But Google is not doing anything remotely like the search quality highlights lists, which provided specific identifying numbers, project nicknames and descriptions of what each change did, like the following example:

    #82862. [project “Page Quality”] This launch helped you find more high-quality content from trusted sources

    While I haven’t really seen this talked about much, Google has been accused of breaking other promises lately. We talked about the broken promise of Google not having banner ads in its search results recently. Danny Sullivan blogged earlier this week about “Google’s broken promises,” mentioning that as well as Google’s decision to launch the paid inclusion Google Shopping model last year, something the company once deemed to be “evil”.

    “For two years in a row now, Google has gone back on major promises it made about search,” he wrote. “The about-faces are easy fodder for anyone who wants to poke fun at Google for not keeping to its word. However, the bigger picture is that as Google has entered its fifteenth year, it faces new challenges on how to deliver search products that are radically different from when it started.”

    “In the past, Google might have explained such shifts in an attempt to maintain user trust,” he added. “Now, Google either assumes it has so much user trust that explanations aren’t necessary. Or, the lack of accountability might be due to its ‘fuzzy management’ structure where no one seems in charge of the search engine.”

    He later says Google was “foolish” to have made promises it couldn’t keep.

    User trust in Google has suffered in recent months for a variety of reasons, not limited to those mentioned.

    Last year, Google caused quite a dust-up with its big privacy policy revamp, which more efficiently enables it to use user data from one product to the next. Last week, another change in policy went into effect, enabling it to use users’ profiles and pictures wherever it wants, including in ads. The ad part can be opted out of, but the rest can’t. Quite a few people have taken issue with the policy.

    Then there’s the YouTube commenting system. They changed that to a Google+-based platform, which has caused its own share of issues, and sparked major backlash from users.

    The changes were pitched as a way to improve conversations around videos and surface comments that are more relevant to the user, but most people pretty much just see it as a way to force Google+ onto the YouTube community. Some don’t think Google is being very transparent about its intentions there. It’s a point that’s hard to argue against when you see stuff like this.

    Do you think Google is losing trust from its users? Do you think the company is being transparent enough? Is all of this stuff just being overblown? What would you like to see Google do differently? Share your thoughts in the comments.

    Image: Matt Cutts (YouTube)

  • Matt Cutts Discusses Duplicate Meta Descriptions

    Google has released a new Webmaster Help video featuring Matt Cutts talking about duplicate and unique meta descriptions.

    Cutts answers this submitted question:

    Is it necessary for each single page within my website to have a unique metatag description?

    “The way I would think of it is, you can either have a unique metatag description, or you can choose to have no metatag description, but I wouldn’t have duplicate metatag description[s],” Cutts says. “In fact, if you register and verify your site in our free Google Webmaster Tools console, we will tell you if we see duplicate metatag descriptions, so that is something that I would avoid.”

    “In general, it’s probably not worth your time to come up with a unique meta description for every single page on your site,” he adds. “Like when I blog, I don’t bother to do that. Don’t tell anybody. Ooh. I told everybody. But if there are some pages that really matter, like your homepage or pages that have really important return on investment – you know, your most featured products or something like that – or maybe you’ve looked at the search results and there’s a few pages on your site that just have really bad automatically generated snippets. We try to do our best, but we wouldn’t claim that we have perfect snippets all the time.”

    No, believe it or not Google is not perfect (as Executive Chairman Eric Schmidt also reminded us).

    Cutts concludes, “You know, in those kinds of situations, then it might make sense to go in, and make sure you have a unique handcrafted, lovingly-made metatag description, but in general, rather than have one metatag description repeated over and over and over again for every page on your site, I would either go ahead and make sure that there is a unique one for the pages that really matter or just leave it off, and Google will generate the snippet for you. But I wouldn’t have the duplicate ones if you can help it.”

    Some will probably take Matt’s advice, and start spending a lot less time bothering with meta descriptions. Just remember that part about looking at the search results and making sure that Google isn’t displaying something too weird, particularly if it’s an important page.

  • Matt Cutts Talks Blog Comments And Link Spam

    If you run a blog, you no doubt come across spammy comments with links in them frequently. You may know that this can hurt your page in Google, but sometimes people leave comments with links that are actually relevant to the conversation. Perhaps they want to illustrate a point, or discussed the topic at length in their own blog post that they want to share. Perhaps it’s a relevant YouTube video.

    Are you allowing these types of comments in? Are you putting a nofollow on all comment links? Should they really be nofollowed if they are in fact relevant?

    Google’s Matt Cutts talks about comments with links in a new Webmaster Help video, but from the perspective of the person leaving the comments. A user submitted the following question:

    Google’s Webmaster Guidelines discourage forum signature links but what about links from comments? Is link building by commenting against Google Webmaster Guidelines? What if it’s a topically relevant site and the comment is meaningful?

    “I leave topically relevant comments on topically relevant sites all the time,” says Cutts. “So if somebody posts, you know, an SEO conspiracy theory, and I’m like, ‘No, that’s not right,’ I’ll show up, and I’ll leave, you know, a comment that says, ‘Here’s a pointer that shows that that’s not correct,’ or ‘Here’s the official word,’ or something like that. And I’ll just leave a comment with my name, and I’ll often even point to my blog rather than to Google’s webmaster blog or something like that because I’m just representing myself. So lots of people do that all the time, and that’s completely fine.”

    “The sorts of things that I would start to worry about is, it’s better, often, to leave your name, so someone knows who they’re dealing with rather than you know, ‘cheap study tutorials’. You know, or ‘fake drivers license,’ or whatever the name of your business is,” he continues. “Often that will get a chillier reception than if you show up with your name.”

    “The other thing that I would say is if your primary link-building strategy is to leave comments all over the web to the degree that you’ve got a huge fraction of your link portfolio in comments, and no real people linking to you, then at some point, that can be considered a link scheme,” Cutts adds. “At a very high level, we reserve the right to take action on any sort of deceptive or manipulative link schemes that we consider to be distorting our rankings. But if you’re just doing regular organic comments, and you’re not doing it as a, you know, ‘I have to leave this many comments a day every single day because that’s what I’m doing to build links to my site,’ you should be completely fine. It’s not the sort of thing that I would worry about at all.”

    I doubt that this video will do much to change people’s commenting habits, and prevent excessive comment spam, but at least it’s out there.

    Bloggers are going to have to continue being aggressive with comment moderation and/or use nofollows on comment links if they don’t want spammy links making their pages look bad. Of course, if the spammy comments are there, the page will still look bad to users, and Google doesn’t want that either, regardless of whether or not links are passing PageRank.

    At the same time, if you’re leaving a comment with a link, and aren’t trying to influence Google’s rankings, you shouldn’t really care if your link is nofollowed, right?

  • Google Offers Webmasters New SEO Advice Video

    Google has put out a new video of SEO advice from Developer Programs Tech Lead, Maile Ohye. She discusses how to build an organic search strategy for your company.

    “What’s a good way to integrate your company’s various online components, such as the website, blog, or YouTube channel? Perhaps we can help!” she says in a blog post about the video. “In under fifteen minutes, I outline a strategic approach to SEO for a mock company, Webmaster Central, where I pretend to be the SEO managing the Webmaster Central Blog.”

    Specifically, she discusses: understanding searcher persona workflow, determining company and site goals, auditing your site to best reach your audience, execution, and making improvements.

    You can find the slides she references here.

  • Twitter Closes First Day at NYSE at $44.90 a Share

    Twitter Inc., the company behind the free micro-blogging/social networking service Twitter (a feed of 140-character status updates, or “tweets,” that users can follow, send, and read), began trading on the New York Stock Exchange (NYSE) at 9:30 AM on Wednesday. Shares rose as high as $50.09.

    This means that Twitter Inc. (TWTR) is now a public company, and upon launch, shares were priced at $26 a pop, initially valuing the company at $18.34 billion.

    To put that $18.34 billion in perspective:

    • Yahoo (YHOO) – $33.45 billion
    • Kellogg Company (K) – $22.5 billion
    • Twitter, Inc. (TWTR) – $18.34 billion
    • Macy’s, Inc. (M) – $17.29 billion
    • Bed Bath & Beyond Inc. (BBBY) – $15.91 billion

    After its first day of trading yesterday, Twitter Inc.’s shares closed at $44.90, a 73 percent increase from the $26 IPO price, putting the company’s worth at roughly $31 billion by the end of the day. Twitter’s creators, Jack Dorsey, Evan Williams, and Biz Stone, became instant billionaires. The one-year estimated price per share is $39.98.


    According to Dealogic, a markets data service, Twitter’s is the second largest IPO by an American company, trailing shortly behind Facebook (FB), whose shares, as of this writing, cost $47.56.

    Over the next few days, Twitter will see an extraordinary amount of activity at the stock exchange, but the price will likely come back down a bit. If you’re hell-bent on buying shares, it may be wise to wait it out a little.

    “In a few days after the IPO, you’re going to start seeing the stock price settling down a bit,” says Global X Funds CEO Bruno del Ama.

    Thankfully, as November 7th closed, Twitter’s NYSE debut didn’t run into any technical glitches like the ones that hit Facebook’s public debut in May 2012 (for which the Securities and Exchange Commission fined Nasdaq $10 million). In late October, NYSE performed a successful test run that showed it could handle the volume of buyers when Twitter’s IPO launched.

    So how does Twitter make money?

    According to Will Oremus at Future Tense, Twitter makes its money “primarily by selling ads, which gain a lot of their value from the advertiser’s ability to target specific groups of users. Twitter’s disadvantage relative to Facebook is scale: It has on the order of 200 million users, while Facebook has some 1.15 billion. But its advantage lies in timeliness and topicality. People check Facebook casually, when time allows. Twitter users tend to use Twitter quite actively, and in conjunction with specific events, like TV shows, rallies, concerts, and breaking news. So advertisers can craft ads tailored not only to a Twitter user’s general tastes and demographic profile, but to what that user is doing at the very moment they see the ad.”

    (A pie chart in the original post showed what the 200 million registered users in 2009 were posting on Twitter.)

    As of May 2013, Twitter has 554,750,000 registered users.

  • Matt Cutts Talks Responsive Design Impact On SEO

    Google has put out a new Webmaster Help video. In this one, Matt Cutts discusses responsive design and its impact (or lack thereof) on SEO. He takes on the question:

    Does a site leveraging responsive design “lose” any SEO benefit compared to a more traditional m. site?

    Cutts says, “Whenever you have a site that can work well for regular browsers on the desktop as well as mobile phones, there’s a couple completely valid ways to do it. One is called responsive design, and responsive design just means that the page works totally fine whether you access that URL with a desktop browser or whether you access that URL with a mobile browser. Things will rescale, you know, the page size will be taken into account, and everything works fine. Another way to do it is, depending on the user agent that’s coming, you could do a redirect so that a mobile phone – a mobile smartphone, for example – might get redirected to a mobile dot version of your page, and that’s totally fine as well.”

    He notes that Google has guidelines and best practices here.

    Those include things like having a rel=”canonical” on the mobile version pointing to the desktop version.

    He continues, “In general, I wouldn’t worry about a site that uses responsive design losing SEO benefit(s) because by definition, you’ve got the same URL, so in theory, if you do a mobile version of your site, if you don’t handle that well and you don’t do the rel=’canonical’ and all those sorts of things, you might, in theory, divide the PageRank between those two pages, but if you’ve got responsive design, everything is handled from one URL, and so the PageRank doesn’t get divided. Everything works fine, so you don’t need to worry about the SEO drawbacks at all.”
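
    If you do run a separate m-dot version rather than a responsive site, a quick sanity check along the lines Cutts mentions is to confirm the mobile page’s rel="canonical" points at the desktop URL. Here is a minimal sketch of my own (the example.com URLs are hypothetical, and it only inspects the first canonical link tag it finds):

    import urllib.request
    from html.parser import HTMLParser

    class CanonicalFinder(HTMLParser):
        """Grabs the href of the first <link rel="canonical"> tag it sees."""
        def __init__(self):
            super().__init__()
            self.canonical = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "link" and attrs.get("rel") == "canonical" and self.canonical is None:
                self.canonical = attrs.get("href")

    def canonical_of(url):
        with urllib.request.urlopen(url) as resp:
            html = resp.read().decode("utf-8", errors="ignore")
        finder = CanonicalFinder()
        finder.feed(html)
        return finder.canonical

    mobile_url = "http://m.example.com/widgets"     # hypothetical mobile page
    desktop_url = "http://www.example.com/widgets"  # hypothetical desktop page
    found = canonical_of(mobile_url)
    print("OK" if found == desktop_url else f"Check rel=canonical (found: {found})")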

    And that’s about the size of it.

  • Cutts: Use Schema Video Markup For Pages With Embedded YouTube Videos

    There is a lot of webmaster interest these days in the impact schema markup has on content in search results.

    Today’s Webmaster Help video from Google addresses video markup. Matt Cutts takes on the following submitted question:

    Rich snippets are automatically added to SERPs for video results from YouTube. Is it recommended to add schema video markup onsite in order to get your page w/embedded video to rank in SERPs in addition to the YouTube result or is this redundant?

    Cutts says he checked with a webmaster trends analyst, and they said, “Yes, please get them to add the markup.”

    He says, “In general, you know, the more markup there is (schema, video or whatever), the easier it is for search engines to be able to interpret what really matters on a page. The one thing that I would also add is, try to make sure that you let us crawl your JavaScript and your CSS so that we can figure out the page and ideally crawl the video file itself, so that we can get all the context involved. That way if we can actually see what’s going on on the video play page, we’ll have a little bit better of an idea of what’s going on with your site. So yes, I would definitely use the schema video markup.”
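
    For reference, a bare-bones VideoObject for a page with an embedded YouTube clip might map to something like the following (a hypothetical example written as a Python dict; the values and the embed URL are made up, and schema.org/VideoObject lists the actual properties):

    # Hypothetical schema.org VideoObject markup for an embed page, shown as a dict.
    video_item = {
        "@context": "http://schema.org",
        "@type": "VideoObject",
        "name": "How we built the Acme Widget",
        "description": "A short walkthrough of the widget assembly line.",
        "thumbnailUrl": "http://www.example.com/thumbs/widget.jpg",
        "uploadDate": "2013-11-20",
        "embedUrl": "http://www.youtube.com/embed/VIDEO_ID",  # hypothetical embed URL
    }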

    There you have it. The answers Cutts gives in these videos aren’t always so straightforward, but this one pretty much gives you a direct answer, and one that can no doubt be applied to other types of content beyond video. Use as much markup as you can so Google (and other search engines) can understand your site better.