WebProNews

Tag: SEO

  • Here’s What’s Working For B2B Businesses In Search And Social

    It’s an ongoing struggle for businesses to figure out how to get the most out of their marketing budgets. While email has proven time and time again to be an incredibly effective channel, there are still a lot of questions about how to maximize ROI in channels like search and social. Here, you’ll find a look at what seems to be working for a great many B2B businesses.

    Do you get a bigger bang for your buck from SEO, PPC or social media marketing? Let us know in the comments.

    Webmarketing123 has put out some interesting survey results for its State of Digital Marketing 2012 report. It looks at B2B vs. B2C marketing efforts in terms of SEO, PPC and social media. We’ll focus on the B2B side of things here. Among other things, it looks specifically at the satisfaction levels of in-house/agency search marketing and social media efforts, as well as how much money businesses are putting into different social networks, and how much they’re getting back.

    The survey included companies like Sony, Olympus, Philips, IBM, Hitachi, Cisco, Agilent, Microsoft, Citrix, Medtronic, Merck, Novo Nordisk, Blue Shield, ADP, Pitney Bowes, Monster.com, Angie’s List, GE, John Deere, Aramark, Thomson Reuters, Federal Express, Bose, and Nestlé. In all, over 500 marketing professionals from the U.S. responded to the survey.

    The main B2B takeaways appear to be that lead generation is the top objective among brands, and SEO is found to be twice as effective as either PPC or social media marketing. This is quite interesting, considering that it is getting harder and harder to get on the first page of Google results. Of course, there are some major brands that took this survey, and they likely don’t struggle with rankings as much as some smaller businesses.

    B2B Objectives

    According to the survey, about 50% more B2Bs now consider social media as having the most impact on lead generation, compared to last year, though SEO is still significantly ahead.

    Biggest Impact on lead generation

    Based on the survey’s findings, most B2B businesses engage in SEO, and do so in-house, rather than hiring agencies. Almost all of them either intend to increase their SEO budgets in 2013 or at least maintain their current budgets.

    SEO Budget plans

    According to the survey, the most common measures of SEO performance are the volume of traffic, organic traffic, and the number of keywords appearing on page 1, “which give no insight into financial impact.” Fewer marketers, Webmarketing123 says, are employing “more sophisticated measures,” like number of qualified leads or sales attributable to organic search.

    SEO Measurement
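
    The “more sophisticated measures” the report mentions boil down to attribution: tying qualified leads or revenue back to the organic search channel rather than just counting visits. As a rough illustration (this is not from the report; the session records below are entirely hypothetical), here’s a minimal sketch of how a marketer might compute those numbers alongside plain traffic volume:

    ```python
    from collections import Counter

    # Hypothetical session records; real data would come from an analytics export.
    sessions = [
        {"channel": "organic", "qualified_lead": True,  "revenue": 1200.0},
        {"channel": "organic", "qualified_lead": False, "revenue": 0.0},
        {"channel": "ppc",     "qualified_lead": True,  "revenue": 800.0},
        {"channel": "social",  "qualified_lead": False, "revenue": 0.0},
    ]

    # Simple measure: traffic volume by channel.
    traffic = Counter(s["channel"] for s in sessions)

    # "Sophisticated" measures: leads and revenue attributable to organic search.
    organic = [s for s in sessions if s["channel"] == "organic"]
    organic_leads = sum(s["qualified_lead"] for s in organic)
    organic_revenue = sum(s["revenue"] for s in organic)

    print(f"Organic visits: {traffic['organic']} of {sum(traffic.values())}")
    print(f"Qualified leads from organic: {organic_leads}")
    print(f"Revenue attributed to organic: ${organic_revenue:,.2f}")
    ```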

    Here’s a look at how businesses are doing their PPC, and what their budget plans look like:

    PPC Budgets

    Here’s a similar look at how businesses are doing social media, and what their budget plans look like:

    Social Budgets

    Interestingly, while in-house dominates the efforts of businesses across SEO, PPC and social media, the satisfaction levels are significantly higher when outside agencies are hired, according to the survey:

    Satisfaction

    According to the survey, B2C businesses are getting more engagement than B2B businesses from their social media efforts, but the gap is narrowing. B2B businesses are getting better at social media.

    “B2C marketers are ahead with 70% moderately to highly engaged (40% highly engaged), but B2B is catching up, with 63% at those levels of engagement (27% highly engaged), overall, only 1 in 10 have no social media program,” the firm says.

    It probably helps that the social networks are putting out more business-oriented products. Facebook, since the IPO, has certainly had businesses on the brain (even at the cost of user-friendliness, perhaps), launching more and more ad products and targeting capabilities for posts. Twitter, just this past week, announced new ad targeting options of its own, as well as the Certified Product Program for businesses. Google also announced some new business-specific features for Google+ this past week.

    As these networks continue to cater more to businesses, businesses are likely to find them more valuable, and perhaps find more room in their budgets to take advantage. However, it is the social network that has always been business-oriented that currently seems to provide the biggest bang for B2B businesses’ bucks.

    Last, but not least, Webmarketing123 provides a look at the breakdown of dollars spent among popular social networks, and money made from them. It looks like businesses are getting the most out of their dollars spent with LinkedIn, though for B2C, Facebook blows LinkedIn out of the water. This makes sense, however, if you consider the professional nature of LinkedIn.

    Social Media Revenue

    Businesses may soon be able to get even more out of LinkedIn, as the company just expanded LinkedIn ads into 17 new languages. The company is also improving its developer platform, which could lead to some more business opportunities.

    From which social networks are you getting the most ROI? Are you getting more from SEO or PPC? In-house or agency? Let us know what’s working for your business.

  • Fear Of Google Ironically Has People Considering Making Natural Links Unnatural

    We recently published an article called “Links Are The Web’s Building Blocks, And Fear Of Google Has Them Crumbling”. This was about the panic Google has caused among webmasters with its messages about links. It’s a panic that has led to many webmasters requesting to have links removed from sites that they would otherwise find valuable, if not for fear that Google will not like them and hurt their rankings.

    Is all of this fear over Google an overreaction, or is it justified? Let us know what you think.

    I noticed a post on WebmasterWorld that expresses this point perfectly. The title of the post says it all: “New link to my site worries me — but it’s a good link!” Senior member crobb305 writes:

    Got an unsolicited citation from a media source but they used anchor text that I have been penalized on. FUD! Should I ask them to change it? By doing that I make a natural link unnatural, and Googlebot will detect that change (obvious tinkering). Nevertheless, I do have an OOP and received the infamous link warnings about 5 months ago.

    I hate it that we have to live with this type of fear.

    This person has been a member of WebmasterWorld since 2002, so they’ve clearly been in this world for quite a while. Yet here they are concerned that a completely natural link might draw negative attention from Google. The person is even wondering if they should go out of their way to make the link unnatural to please Google. How’s that for irony? Sadly, it’s highly likely that plenty of other webmasters are thinking similar thoughts.

    As shared in the article mentioned at the beginning, there is plenty of overreaction from webmasters out there, and I would say that Google would rather see the link occur in its natural form. But this is the kind of fear people are dealing with to please Google and maintain some form of visibility in search results (which is getting harder and harder for other reasons entirely). Should people have to be this worried about links (the building blocks of the web)?

    It probably doesn’t help that Google has reportedly indicated that forthcoming algorithm updates will be more “jarring.”

    Another forum senior member later responded, “But seriously, some of the sites of mine that went down the Google drain were clean, ‘link building’ was not done, just attracted some real nice ones and yet the project died due to ‘penalties’. I went out of answers to this somewhere in the middle of 2011 and focus on cool stuff, HTML5, content (I think some tools can be considered good content) and ultimately ranking solid on Bing. Google does whatever Google wants to do.”

    Likewise, Chris Lang from Gadget MVP tells me on Google+, “I never have worried about Google. I just do what seems natural. Never been slapped once…. At least not by Google.”

    WebmasterWorld moderator goodroi tells the user, “One link from a quality, relevant website is not the problem. The hundreds of links with identical anchor text coming from blog spam, directory submission schemes and other short cuts are the problem.”

    “I tend to focus more of my efforts on improving backlink profiles by adding quality links instead of focusing on deleting bad links,” goodroi adds. “Even if you delete every single bad link (and somehow are lucky not to accidentally delete a good link) you still need to build legitimate links. So if you start working on legitimate links you may end up getting enough good links that it naturally defuses the bad link issues.”

    Unfortunately, many are seemingly still eager to kill significantly more links than they really need to. On the flipside, even some publishers are growing leery of including guest content on their sites. This fear apparently stems from the Penguin update.

    Barry Schwartz at Search Engine Roundtable points to a post from Cre8asite Forums, where user EGOL writes:

    Since Penguin, I am getting a flood of article offers. Most of this content is crap. Some of it is “average” quality (which I don’t publish). Some can be excellent, unique, highly desirable. So now I am deciding if I want to accept some of this content, knowing that I could be publishing links to sites that could have past, present or future manipulation.

    I have a potential article that I really like and that would be very popular with my visitors. The author’s site ranks #1 in a difficult niche and they don’t have enough content on their site to hold that position from editorial links (IMO).

    I have not seen any articles or discussion about the cautions that a publisher should be following in these days of post-penguin linking.

    So, not only are people afraid to have links out there that they would find valuable, if not for fear of Google, but some are also afraid to publish quality content, for fear that it might somehow be connected to something Google will not like. Ironically, quality content is what Google wants from sites above all else.

    Are webmasters worrying about Google too much, or are these simply rational concerns, with Google being such a dominant force on the web? Tell us what you think.

  • Google, Takedown Requests & The Unknown

    As you may know, Google is now using the number of takedown requests it receives for a site as a ranking signal. Google publicly announced this change earlier this month, and it’s been something of a controversial topic within the webmaster community.

    Interestingly, according to Google’s Transparency Report, the number of takedown requests has actually decreased since the announcement, but the overall trend shows a very significant rise in requests over the past year. There are a lot of concerns about the vagueness of how Google uses this data, and potential abuse of the signal by competitors. More on all of that here.

    SEOBook‘s Aaron Wall shared some additional thoughts about the whole thing with us. He thinks the change may be good for Google, but is less certain how good it is for the rest of the web.

    The unknowns

    “I don’t think the feature hurts Google at all,” he tells WebProNews. “In fact I think it creates a further competitive advantage for them.”

    “It is the rest of the web that the feature is not so good for,” he adds. “The limitations are not known publicly, the level of pain caused is not known publicly, the recovery process is not known publicly, how and where and why they may change limits or penalties associated with it going forward is also unknown, etc.”

    “It is a way to point at basically anything with any sort of remix of culture and say, ‘well it is spam because this over here,’” Wall says.

    Google, as a company, is changing

    “Now to be fair, I think historically Google has been far fairer than most in their position would be,” he adds. “However as time passes, they become larger, they get more employees & they need to keep growing revenues they become more of a typical company (and that means past exceptionalism might be less exceptional in years to come).”

    “As an example of something they wouldn’t have done 10 years ago, today Google’s homepage has a large graphic ad on it for their tablet,” he points out.

    We actually talked about this in another piece. It certainly is a pretty interesting turnaround from where the company once was. You have to wonder how much more of this kind of thing Google will do. User response hasn’t been incredibly positive. Here are some examples of the comments we received about the ad:

    “Definitely the most bold ad to be shown on the homepage ever. I don’t mind a text link, but an animated image? It is 179Kb as well, that is not a tiny file.”

    “How do I get rid of this annoyance?!!!! I don’t want another annoying push to buy things I couldn’t care less about.”

    “This is a sad day!!”

    “Really don’t like it – very annoyed with google.”

    I’m sure there are plenty who are not really bothered by the ad. Frankly, it doesn’t really bother me at all. I just find it noteworthy that Google would do this now after its long history of homepage simplicity.

    What About User-Generated Content Sites?

    Wall shares a quote from a post Google made on its AdSense blog this week:

    “It’s against our policies to show ads on the same page as links to other sites that are hosting copyrighted materials without authorization. Keep in mind that these sites come in various forms such as forums, blogs or community websites.”

    “Notice that YouTube goes unmentioned in the above tip,” Wall says. “Yet if you wanted to list sites that have been on the receiving end of a billion Dollar lawsuit for copyright infringement YouTube would be right up top. That was sort of the point I was trying to make…that new sites that behave like some of Google’s vertical properties do would have a strong risk of being labeled as spam before they could reach a critical mass.”

    Google has acknowledged that YouTube (and Blogger) aren’t counted among takedown requests in its transparency report, as they have different paths for reporting, but Wall makes an interesting point about other user-generated content sites trying to get off the ground.

    The signal is only one of over 200, but for a site that operates in a similar fashion to YouTube, it could be a strong signal, depending on how users use it, even if the site is diligent about responding to requests of its own.

    We’ve reached out to Google for comment on this, and will update if we receive one.

    Update: Talking with Google, the company, while acknowledging that no algorithm is perfect, indicates that it does not feel like the signal will have much of an impact on user-generated content sites, because of the way the signal has been designed. It’s likely that the types of sites seen at the top of the list of the Transparency Report will be affected most. The company also explains that the bar is pretty high for abuse, given the legal ramifications of submitting false reports, which it says is punishable by penalty of perjury. The signal has also apparently been designed to prevent abuse.

    Google also reminds us that it is only one of over 200 signals, and that it is not using the signal to remove sites from listings.

  • Webmasters Suspect A Significant Google Algorithm Update Just Happened

    There are some rumblings that Google may have made a significant update to its algorithm this week, sometime in the vicinity of August 28. Search Engine Roundtable points to a WebmasterWorld forum thread where some webmasters are commenting on substantial changes in their traffic – some positive, some negative.

    Here are a couple sample comments from the forum:

    “20% google traffic improvement as of yesterday… Only about 500% left to go to get back to normal… But like the direction!”

    “I noticed a massive drop in long tail yesterday. Our site only went live in June, and there have been no indications of Panda/Penguin.”

    One reader commented on the SER post, “I’m seeing a very obvious increase in traffic for several large clients, two of which their traffic has increased by about 25% or so – Obviously something is going on, but at least (for once) it doesn’t seem to have affected me negatively!”

    Dr. Peter Meyers, who recently shared that interesting data about Google SERPs with fewer than ten organic results at SEOmoz, also commented on the article, indicating that he’s seen “no sign of anything major,” based on MozCast numbers.

    There is some question as to whether Google has tweaked the Penguin update. Another person commenting on that post noted that, given the lack of more comments, this change doesn’t seem “jarring and jolting”. This is in reference to what Google’s Matt Cutts recently (reportedly) said at SES about what to expect from upcoming updates.

    Google updates its algorithms over 500 times a year. Some have far greater impact than others. This could be one of those not so major ones.

    We’ve reached out to Google for comment, and will update if we receive one.

  • Matt Cutts Has To Explain Paid Links To Newspaper

    Google’s Matt Cutts has a new blog post up about paid links. He says he was contacted by an unnamed newspaper who saw its PageRank drop from a 7 to a 3, and wanted to know why. The reason, as Cutts explains, was because the site was selling links that passed PageRank, which is, of course, a violation of Google’s quality guidelines.

    Cutts shares an email he sent to the newspaper (leaving out the identifying info). Here’s a chunk of what he had to tell them:

    In particular, earlier this year on [website] we saw links labeled as sponsored that passed PageRank, such as a link like [example link]. That’s a clear violation of Google’s quality guidelines, and it’s the reason that [website]‘s PageRank as well as our trust in the website has declined.

    In fact, we received a outside spam report about your site. The spam report passed on an email from a link seller offering to sell links on multiple pages on [website] based on their PageRank. Some pages mentioned in that email continue to have unusual links to this day. For example [example url] has a section labeled “PARTNER LINKS” which links to [linkbuyer].

    So my advice would be to investigate how paid links that pass PageRank ended up on [website]: who put them there, are any still up, and to investigate whether someone at the [newspaper] received money to post paid links that pass PageRank without disclosing that payment, e.g. using ambiguous labeling such as “Partner links.” That’s definitely where I would dig.

    Cutts goes on to suggest that after the site completes an investigation, and gets rid of any paid links that pass PageRank, it submit a reconsideration request.

    In the comments section of the post, Cutts notes that a drop in toolbar PageRank is an indication of Google’s decreased trust in a site. In this case, the cause was link selling.

    Of course, since the Penguin update (and even before it), people have been getting messages from Google about bad links, and it’s caused a lot of panic. This panic seems to be reflected in the comments of Cutts’ post, with some webmasters wondering if they should simply place nofollow on all of their links to avoid Google penalties.

    Well, if everyone put nofollow on all of their links, it would pretty much render PageRank meaningless, wouldn’t it?
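
    For context, the mechanics here are simple: a link passes PageRank unless it carries a rel="nofollow" attribute, which is what Google expects on paid or sponsored placements. As a rough illustration (this is not Google’s tooling; it assumes the third-party BeautifulSoup library, and the list of paid-partner domains is entirely hypothetical), here’s a minimal sketch of how a site owner might audit a page for paid links that still pass PageRank:

    ```python
    from urllib.parse import urlparse
    from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

    PAID_DOMAINS = {"example-linkbuyer.com"}  # hypothetical list of known paid partners

    def flag_followed_paid_links(html: str):
        """Return paid links that lack rel="nofollow" (i.e. still pass PageRank)."""
        soup = BeautifulSoup(html, "html.parser")
        flagged = []
        for a in soup.find_all("a", href=True):
            rel = a.get("rel") or []          # bs4 returns rel as a list when present
            domain = urlparse(a["href"]).netloc.lower()
            if domain in PAID_DOMAINS and "nofollow" not in rel:
                flagged.append(a["href"])
        return flagged

    page = '<p>PARTNER LINKS: <a href="http://example-linkbuyer.com/">Great Widgets</a></p>'
    print(flag_followed_paid_links(page))  # ['http://example-linkbuyer.com/']
    ```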

    One reader suggests that PageRank shouldn’t even be made visible to the public, as high PageRank blogs draw more spam.

    The subject of paid links also came up in this Webmaster Hangout Google hosted yesterday.

  • Watch This Google Webmaster Hangout From Monday

    Ever wish you had a chance to sit in on one of Google’s Webmaster Central Office Hours hangouts, but just can’t find the time, or they just don’t correspond well with your schedule? Luckily, Google sometimes makes them available for later viewing, and you can skip around and find the parts most relevant to you.

    Here’s a hangout Google’s John Mueller hosted on Monday. It’s over an hour long, but there’s also a transcript available on the actual YouTube video page, if you’d rather simply peruse that.

  • Removal Requests Actually Down, Following Google Algorithm Change

    On August 10, Google announced that it would be updating its algorithm the following week to include a new ranking signal for the number of “valid copyright removal notices” it receives for a given site.

    Do you think Google’s addition of this signal is a good thing for search results? Let us know in the comments.

    “Sites with high numbers of removal notices may appear lower in our results,” said Google SVP, Engineering, Amit Singhal, at the time. “This ranking change should help users find legitimate, quality sources of content more easily—whether it’s a song previewed on NPR’s music website, a TV show on Hulu or new music streamed from Spotify.”

    One might have expected the removal request floodgates to have been opened upon this news, but that does not appear to be the case. In fact, interestingly, it has been kind of the opposite, according to Google’s Transparency Report.

    Barry Schwartz at Search Engine Roundtable points out that from August 13 to August 20, the number of URLs requested to be removed from Google search per week actually decreased, going from 1,496,220 to 1,427,369. It’s only a slight decrease, but the fact that it decreased at all, following this news, is noteworthy.

    URLs requested to be removed

    When Google first announced the algorithm change, it immediately sparked a great deal of criticism from bloggers and webmasters and concern from consumer groups. “In particular, we worry about the false positives problem,” the EFF said at the time. “For example, we’ve seen the government wrongly target sites that actually have a right to post the allegedly infringing material in question or otherwise legally display content. In short, without details on how Google’s process works, we have no reason to believe they won’t make similar, over-inclusive mistakes, dropping lawful, relevant speech lower in its search results without recourse for the speakers.”

    Public Knowledge has spoken out about the change as well. Senior staff attorney John Bergmayer previously said in a statement, “Sites may not know about, or have the ability to easily challenge, notices sent to Google. And Google has set up a system that may be abused by bad faith actors who want to suppress their rivals and competitors. Sites that host a lot of content, or are very popular, may receive a disproportionate number of notices (which are mere accusations of infringement) without being disproportionately infringing. And user-generated content sites could be harmed by this change, even though the DMCA was structured to protect them.”

    “Google needs to make sure this change does not harm Internet users or the Internet ecosystem,” he added.

    Interestingly enough, Public Knowledge actually receives contributions from Google, as indicated in a new court document Google provided in the Oracle case. “Google has contributed to Public Knowledge for years before the complaint in the case at bar was filed,” wrote Google attorney Robert Van Nest.

    Regarding inaccurate and intentionally abusive copyright removal requests, Google says, “From time to time, we may receive inaccurate or unjustified copyright removal requests for search results that clearly do not link to infringing content. An independent, third-party analysis of how frequently improper and abusive removal requests are submitted was conducted in 2006.”

    That was six years ago, and does little to set webmasters’ minds at ease. On an FAQ page, Google lists a number of examples of requests that were submitted that were “clearly invalid,” and notes that it did not comply with any of them.

    In case you’re wondering how many of the requests Google does comply with, the company says on the same page, “We removed 97% of search results specified in requests that we received between July and December 2011.”

    “We remove search results that link to infringing content in Search when it is brought to our attention, and we do it quickly,” Google adds. “As of May 2012, our average processing time across all removal requests submitted via our web form for Search is approximately 10 hours. However, many different factors can influence the processing time for a particular removal request, including the method of delivery, language, and completeness of the information submitted.”

    As far as webmasters being informed of the issue by Google, the company says, “When feasible and legal to do so, we try our best to notify users to give them an opportunity to submit a counter-notice in response to copyright removal requests. For Search, it is extremely difficult to provide meaningful notice to webmasters whose pages have been identified in copyright removal requests, because we do not necessarily know their identities or have an effective means of contacting them. If users have registered with our Webmaster Tools as web site owners, we will notify them there. We also share a copy of qualifying copyright removal requests with the public site Chilling Effects, where a webmaster may inspect it as well.”

    For the past month, Google says 5,680,830 URLs have been requested to be removed from 31,677 domains by 1,833 copyright owners and 1,372 reporting organizations. The top copyright owners in the past month have been Froytal Services, RIAA member companies, Microsoft, NBCUniversal and BPI. The top specified domains have been filestube.com, torrenthound.com, isohunt.com, downloads.nl and filesonicsearch.com.

    You can see all copyright removal requests here. You can see a big list of 133,502 specified domains here. A list of 9,660 reporting organizations is available here. The list of over ten thousand copyright owners is here.

    All data reflects copyright removal notices received for search since 2011, with some omissions, which include requests for products other than Google Search (like YouTube and Blogger), and requests submitted by means other than Google’s web form (such as fax or written letters).

    It’s important to note that while Google is now using the number of removal requests a site receives as a ranking factor, it is still only one of over two hundred factors. But the negative SEO ramifications of the signal still have people worried. Negative SEO was a growing concern before this signal was even announced, particularly as it’s related to bad links and the Penguin update. Now there is concern that competitors can submit notices, and influence Google. Whether this can be done successfully or not really remains to be seen. Google seems to be giving the impression that it cannot, as Google only complies with “valid” requests, but when was the last time Google executed an algorithm update flawlessly?

    Google even recently reworded its help page for the question “Can competitors harm ranking?”. It used to say, “There’s almost nothing a competitor can do to harm your ranking or have your site removed from our index.” It was changed to say, “Google works hard to prevent other webmasters from being able to harm your ranking or have your site removed from our index.”

    But, as the image above shows, it doesn’t appear that Google’s announcement has led to a substantial increase in attempted abuse so far. That doesn’t mean it’s not possible to abuse it, and that people aren’t trying to abuse it. People were probably already trying to abuse it. While the number may be down since the announcement, the greater trend is clearly that of substantial growth in the number of requests. It will be surprising if the trend does not ultimately continue upward. We’re still waiting on the latest numbers to come out.

    Are you worried about URL removal requests as a ranking signal? Share your thoughts in the comments.

  • Now It’s Even Harder To Get First-Page Google Rankings

    It seems like the chances for sites to get their content into organic Google search results are continuing to decrease. In a recent article, we looked at some of the recent changes Google has made to its algorithm, including things to make it better at natural language, decrease its dependence on keywords, and give users more direct answers, so it doesn’t have to direct them to other sites as much.

    Have Google’s results pages gotten better or worse? Let us know what you think in the comments.

    This, alone, makes a lot of webmasters uneasy, and highlights the need for sites to diversify their sources of web traffic. Google only wants to get better and better at this. Google wants to deliver the best user experience possible, and users want to go on about their business as quickly as possible. This is easier to do if Google can provide the answer itself. Lost traffic, however, could be an unfortunate side effect for content providers.

    Wait, didn’t there used to be more search results on this page?

    Now, there’s a separate, but related topic being discussed by the webmaster community. Google appears to be showing fewer organic results for SERPs that contain a result with its sitelinks feature. You know, the ones that look like this:

    Sitelinks

    Specifically, for many SERPs that display these kinds of results, Google is now showing only a total of 7 organic search results (that’s regular results, not including any universal search results that might appear):

    Seven Organic Results

    There has been discussion about this in the WebmasterWorld forums over the past couple weeks. “Google wants to get people to their answer quickly, and if the query has a history of being too ambiguous, they certainly have the ability to measure that and throw a tag to change from the normal SERP. Just as there was QDF (for query deserves freshness) they might have something like “QDD” or query deserves disambiguation,” said forum admin Tedster.

    Danny Sullivan at Search Engine Land shared a statement from Google about the matter, saying, “We’re continuing to work out the best ways to show multiple results from a single site when it’s clear users are interested in that site. Separately, we’re also experimenting with varying the number of results per page, as we do periodically. Overall our goal is to provide the most relevant results for a given query as quickly as possible, whether it’s a wide variety of sources or navigation deep into a particular source. There’s always room for improvement, so we’re going to keep working on getting the mix right.”

    So this may be an experiment, but a lot of people are getting SERPs with fewer organic results, from fewer sites. It doesn’t bode well for organic SEO. It does seem to make sitelinks more important than ever.

    Dr. Peter J. Meyers, President of User Effect, has put out some research at SEOmoz, finding that Google is showing way more SERPs with fewer than ten results than ever before, and for the most part, these SERPs have 7 results apiece. Here are a couple of graphs he shared:

    SERPs with less than ten results

    “SERPs with 7 results were an anomaly prior to 8/13, with the system tracking a maximum of one (0.1%) on any given day. On 8/13, that number jumped to 10.7% and then, the following day, to 18.3%,” he writes. “Almost one-fifth of SERPs tracked by our data now have 7 results.”

    You can read his article for more about the methodology, and his additional findings.

    There has been some talk about this phenomenon being related to brand queries, but as Sullivan points out, there are plenty of examples of non-branded queries where this is happening, where the results contain one with sitelinks. It just so happens that a lot of brands do have sitelinks.

    Taking Advantage Of Sitelinks

    So, how do you get Google to display sitelinks for your site? Well, unfortunately, it’s not that simple. It’s pretty much up to Google.

    “We only show sitelinks for results when we think they’ll be useful to the user,” says Google in its help center. “If the structure of your site doesn’t allow our algorithms to find good sitelinks, or we don’t think that the sitelinks for your site are relevant for the user’s query, we won’t show them.”

    “At the moment, sitelinks are automated,” Google adds. “We’re always working to improve our sitelinks algorithms, and we may incorporate webmaster input in the future. There are best practices you can follow, however, to improve the quality of your sitelinks. For example, for your site’s internal links, make sure you use anchor text and alt text that’s informative, compact, and avoids repetition.”

    If Google is showing sitelinks for your site, but you don’t like certain links it’s showing, you can demote those links, telling Google not to consider them as sitelink candidates. You can do this in Webmaster Tools. Go to the “For this search result” box in “Sitelinks” under “Site Configuration”. You can demote up to 100 URLs, but Google says it may take a while for demotions to be reflected in the search results.

    But that’s about as much control as you have over it right now. At least Google is hinting that it may give webmasters more control over sitelinks in the future. If sitelinks are having such an impact on SERPs these days, perhaps sooner rather than later would be a good idea.

    But back to the point at hand…

    Search has been moving further and further away from the classic “ten blue links” format for years, but now Google is clearly giving you fewer opportunities to just rank on the first page in organic links than it used to, at least for a growing number of queries (and who’s to say that number won’t continue to grow?).

    This probably means that you’ll need to put more focus on getting into Google’s other types of results more than ever, depending on what types of results Google is showing for the queries for which you want to be found. That could mean optimizing for image search, Places, YouTube, Google News, or of course paying for AdWords ads and/or Google Shopping results.

    Interestingly enough, as Google wants to get users answers more quickly (and directly in many cases), the company still faces pressure from publishers who actually don’t want Google benefiting from their content without paying them. It seems pretty backwards, when you consider all of the sites who just want to show up in the results at all.

    Should Google be showing fewer organic search results on its pages? Tell us what you think.

  • Google Panda Update: Data Refresh Launched Monday

    Google tweeted today that it launched a data refresh for the Panda update on Monday. According to the company, about 1% of queries were noticeably affected.

    This is the first known Panda-related tweak we’ve seen from Google in August. In June, Google incorporated new data into the algorithm, and launched two data refreshes. In July, Panda was launched on google.co.jp and google.co.kr. Additionally, Panda data was updated again.

    Keep in mind that this latest adjustment was just a data refresh. These tend to have a much smaller impact on results than actual updates. For more on the difference between an update and a data refresh, refer to what Google’s Matt Cutts has said on the matter.

    More Panda Update coverage here.

  • Google Webmaster Tools Gets Alerts For Search Queries Data

    Google announced today that it is adding alerts for Search Queries data to Webmaster Tools to complement its recently rolled out Crawl Errors alerts.

    You can get the alerts forwarded to your inbox if you sign up for email forwarding in Webmaster Tools.

    Search Queries Alerts

    “We know many of you check Webmaster Tools daily (thank you!), but not everybody has the time to monitor the health of their site 24/7,” says Webmaster Tools tech lead Javier Tordable in a blog post. “It can be time consuming to analyze all the data and identify the most important issues. To make it a little bit easier we’ve been incorporating alerts into Webmaster Tools. We process the data for your site and try to detect the events that could be most interesting for you.”

    “The Search Queries feature in Webmaster Tools shows, among other things, impressions and clicks for your top pages over time,” says Tordable. “For most sites, these numbers follow regular patterns, so when sudden spikes or drops occur, it can make sense to look into what caused them. Some changes are due to differing demand for your content, other times they may be due to technical issues that need to be resolved, such as broken redirects. For example, a steady stream of clicks which suddenly drops to zero is probably worth investigating.”

    He also notes that Google is still working on the sensitivity threshold for the messages.
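
    Google doesn’t say how the alerting actually works beyond that description, but the underlying idea (flagging sudden deviations from a site’s normal pattern) is easy to picture. Here’s a minimal, purely hypothetical sketch of that kind of check: compare each day’s clicks to a trailing average and flag large drops or spikes, including the “steady stream of clicks which suddenly drops to zero” case Tordable mentions. The window and threshold values are arbitrary, not Google’s.

    ```python
    def find_anomalies(daily_clicks, window=7, threshold=0.5):
        """Flag days where clicks deviate from the trailing average by more than
        `threshold` (50% by default). Illustrative only, not Google's system."""
        alerts = []
        for i in range(window, len(daily_clicks)):
            baseline = sum(daily_clicks[i - window:i]) / window
            today = daily_clicks[i]
            if baseline > 0 and abs(today - baseline) / baseline > threshold:
                alerts.append((i, today, round(baseline, 1)))
        return alerts

    # A steady stream of clicks that suddenly drops to zero on day 10.
    clicks = [120, 118, 125, 130, 122, 119, 127, 124, 121, 126, 0, 2, 1]
    print(find_anomalies(clicks))  # flags days 10, 11 and 12
    ```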

  • Google Penguin Update: You Ain’t Seen Nothing Yet

    Last year, Google launched the Panda update, and wreaked havoc across the web on sites doing little to contribute to the quality of content appearing throughout Google’s search engine. This year, it’s been the Penguin update doing the wreaking (with Panda continuing to do its job at the same time). There has been plenty of panic among webmasters caused by the Penguin update, primarily in the inbound links department, and from the sound of it, that’s really just getting started.

    Is Google’s Penguin update making the web better? Is it making Google better? Let us know what you think in the comments.

    Now, some would say that an update like Penguin is good for Google and for the web at large. It’s hard to argue that an algorithm update designed to get rid of spam is truly a bad thing. At the same time, many webmasters feel they are being unjustly punished by Google, and are essentially bringing a rocket launcher to a knife fight in the battle to get back into Google’s good graces. By doing so, they’re trying to exterminate links, which they may even find valuable, if not for fear of Google.

    Based on recent comments from Google’s Matt Cutts, I would not expect this mentality to change anytime soon.

    Cutts appeared at the Search Engine Strategies conference in San Francisco this week, talking about a variety of search-related topics, and of course touting Google’s Knowledge Graph.

    Inevitably, the subject of the Penguin update came up. According to a paraphrased account of Cutts’ talk, he said webmasters could expect updates to be “jarring” for a while.

    Matt Cutts commented on a Search Engine Roundtable blog post about it, saying:

    Hey Barry, I wasn’t saying that people needed to overly stress out about the next Penguin update, but I’m happy to give more details. I was giving context on the fact that lots of people were asking me when the next Penguin update would happen, as if they expected Penguin updates to happen on a monthly basis and as if Penguin would only involve data refreshes.

    If you remember, in the early days of Panda, it took several months for us to iterate on the algorithm, and the Panda impact tended to be somewhat larger (e.g. the April 2011 update incorporated new signals like sites that users block). Later on, the Panda updates had less impact over time as we stabilized the signals/algorithm and Panda moved closer to near-monthly updates. Likewise, we’re still in the early stages of Penguin where the engineers are incorporating new signals and iterating to improve the algorithm. Because of that, expect that the next few Penguin updates will take longer, incorporate additional signals, and as a result will have more noticeable impact. It’s not the case that people should just expect data refreshes for Penguin quite yet. Emphasis added.

    Still in the early stages. Will have a more noticeable impact. In other words, Google is just getting started with Penguin, and you ain’t seen nothing yet.

    Reader Josh Bachynski, responding to Cutts’ comment, said, “Matt, can you please tell us exactly what to fix now then so we are not caught off guard? Don’t give us the secret sauce, just be transparent and say ‘watch your linking text’ or ‘check your HTML for inadvertent alt attributes with keywords in them’ or ‘delete all your old links on ‘put-it-there-yourself’ pages (or nofollow them)’ or whatever this new penguin eats 🙂 That would be awesome transparency that does not give anything new away, just focuses our efforts.”

    Cutts responded to him on Twitter, saying:

    So, I don’t expect the mad rush by webmasters to have links removed to subside anytime soon. Nor do I expect to see fewer instances of people charging to remove links. Yep, this is what the web has come to.

    Of course, webmasters are still waiting on that tool that allows them to tell Google what links to ignore. That is supposedly still coming, and hopefully the next time the Penguin terrorizes its targets, the tool will be available. It would not only make things easier on the webmasters who are trying to clean up their link profiles, but for all the sites that have to deal with webmasters freaking out because they’re afraid of links.

    Are you ready for more Penguin? How do you expect it to change Google results? Share your thoughts in the comments.

    Image: Batman Returns (Warner Bros.)

  • Matt Cutts Clarifies What He Said About Twitter (On Twitter)

    Matt Cutts appeared at Search Engine Strategies this week. In addition to talking up the Knowledge Graph and scaring people about the Penguin update, he talked briefly about Google’s relationship with Twitter.

    First, we linked to a liveblogged account of Cutts’ session from State Of Search, which paraphrased him as saying:

    Danny [Sullivan] asks ‘Can’t you see how many times a page is tweeted? I can see it, I could call you’.

    Cutts: we can do it relatively well, but if we could crawl Twitter in the full way we can, their infastructure wouldn’t be able to handle it.

    In a later article on what SEOmoz CEO Rand Fishkin had to say about Twitter’s impact on SEO, we also referenced Brafton’s version, which paraphrased Cutts as saying:

    People were upset when Realtime results went away! But that platform is a private service. If Twitter wants to suspend someone’s service they can. Google was able to crawl Twitter until its deal ended, and Google was no longer able to crawl those pages. As such, Google is cautious about using that as a signal – Twitter can shut it off at any time.

    We’re always going to be looking for ways to identify who is valuable in the real world. We want to return quality results that have real world reputability and quality factors are key – Google indexes 20 billion pages per day.

    The Brafton piece also indicated that Cutts said that Google can’t crawl Facebook pages or Twitter accounts. It was later updated, but this led to Fishkin asking Cutts about that on Twitter, which led to some more from Cutts on the matter.

  • Matt Cutts: Google Updates Will Be Jarring For A While

    Google Distinguished Engineer Matt Cutts made a surprise appearance at the Search Engine Strategies conference in San Francisco this morning, dropping a fair amount of knowledge on the SEO community.

    Bas van den Beld at State Of Search has a liveblogged recap of the session. He paraphrases Cutts:

    When people asked Cutts about the next Penguin Update he thought: You don’t want the next Penguin update, the engineers have been working hard…We are constantly improving. The updates are going to be jarring and jolting for a while. Emphasis added.

    Cutts also talked briefly about Google’s relationship with Twitter data. More paraphrasing from the liveblog:

    Danny [Sullivan] asks ‘Can’t you see how many times a page is tweeted? I can see it, I could call you’.

    Cutts: we can do it relatively well, but if we could crawl Twitter in the full way we can, their infastructure wouldn’t be able to handle it.

    Cutts talked about a variety of search-related topics, including Knowledge Graph, SEO, the Penguin update, Webmaster Tools, etc. He also noted that the Google Search Quality team is now the Knowledge team.

    Here’s some paraphrasing from attendees on Twitter:

  • Google Reveals Some Recent Changes To How It Ranks Results

    Google released a giant list of 86 “search quality” changes it made in June and July, beyond the various changes it had already blogged about. We’re breaking down the list by various categories to take a closer look at the kinds of things Google has been up to; so far, we’ve covered several of those categories in separate posts.

    Now, let’s look at the changes Google has revealed, which are directly related to how Google ranks search results.

    First, Google says the change referred to as “ng2” better orders top results using a new and improved ranking function for combining several key ranking features.

    Another list entry (Ref-16) involves changes to an “official pages” algorithm to “improve internationalization”. This is part of Google’s “Other Ranking Components” project.

    Another change listed under the project codename “Other Ranking Components” (#82367) helps you find more high-quality content from trusted sources. Also under that label are several changes that Google says make its system for clustering web results “better and simpler.” These include #82541, NoPathsForClustering, and bergen. “We’ve made our algorithm for clustering web results from the same site or same path (same URL up until the last slash) more consistent,” says Google of NoPathsForClustering.
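
    The “same URL up until the last slash” wording describes a simple path-based grouping key. As a rough illustration (this is not Google’s code, just a hypothetical sketch of the idea), here’s how such a key could be derived so that results from the same site or directory end up grouped together:

    ```python
    from urllib.parse import urlparse

    def cluster_key(url: str) -> str:
        """Group results by host plus the path up to (and including) the last slash."""
        parts = urlparse(url)
        path = parts.path.rsplit("/", 1)[0] + "/"
        return parts.netloc.lower() + path

    urls = [
        "http://example.com/blog/post-one",
        "http://example.com/blog/post-two",
        "http://example.com/shop/item",
    ]
    for u in urls:
        print(cluster_key(u))
    # example.com/blog/ (twice) and example.com/shop/ -- the first two would cluster together
    ```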

    Change #81933, Google says, improves use of query synonyms in ranking. “Now we’re less likely to show documents where the synonym has a different meaning than the original search term,” Google says.

    Another change (Manzana2) improves the clustering and ranking of links in the expanded sitelinks feature. Another (“Improvements to Images Universal ranking”) improves Google’s ability to show universal image search results on infrequently searched-for queries, and another improves the efficiency of Google’s Book Search ranking algorithms. It makes them more consistent with Web Search, Google says.

    Have you noticed if any of these have had a direct impact on how Google ranks your pages?

  • 23 Recent Changes Google Made To Give You Better Quick Answers

    On Friday, Google released a list of 86 changes it made to its search engine over June and July. Some of them were directly related to quality, some related to ranking, some related to mobile, some related to how it understands natural language, and others show a decreased dependency on keywords from Google’s algorithms.

    Many of the changes are related to the direct answer-style results Google provides for some types of queries. Each change on the list is accompanied by a related project, and the project codenamed “Answers” was the subject of 23 of the entries.

    Here’s what they did:

    1. Google added a live result showing schedule and scores of the EURO 2012 games (European championship of national soccer teams).

    2. Google improved its dictionary search feature, adding support for more natural language searches.

    3. Google made changes to its calculator feature to improve recognition of queries containing “and,” such as [4 times 3 and a half]. Really, this is about natural language too.

    4. Google made an unspecified improvement for showing the sunrise and sunset times feature.

    5. Google added live results for NASCAR, MotoGP and IndyCar, in addition to the Formula 1 results, which were already available.

    6. Google improved natural language detection for the time feature, so it can better understand queries like, “What time is it in India?”

    7. Google made changes to the movie showtimes feature on mobile, improving recognition of natural language queries, as well as overall coverage.

    8. Google made other improvements to the calculator feature on mobile to improve how it handles queries that contain both words and numbers (like: 4 times 3 divided by 2). A toy sketch of this kind of parsing appears after this list.

    9. Google enabled natural language detection for currency conversion to better understand queries like “What is $500 in Euros?”

    10. Google enabled natural language detection for the flight status feature to better understand queries about flight arrival times and status.

    11. Google improved the triggering for the “when is” feature, and the understanding of queries like “When is Mother’s Day?”

    12. Google made changes to the display of the weather feature.

    13. Google improved natural language detection for the unit conversion feature, so it can better understand queries like “What is 5 miles in kilometers?”

    14. Google changed the display of the finance feature for voice search queries on mobile.

    15. Google improved natural language processing for the dictionary search feature.

    16. Google changed the display of the weather search feature, so you can ask things like [weather in california] or [is it hot in italy].

    17. Google improved natural language detection for the sunrise/sunset feature.

    18. Google improved its detection of queries about weather.

    19. Google changed the display of the local time search feature.

    20. Google improved natural language detection to better understand queries about baseball and return the latest baseball info about MLB, like schedules and the latest scores.

    21. Google changed the display of local business info in certain, but unspecified mobile use cases. Google says it highlights info relevant to the search (like phone numbers, hours, etc.)

    22. Google improved its understanding of calculator-seeking queries.

    Those were all in June. There was only one entry related to “Answers” in the July list:

    23. Google made it so you could search, and find “rich, detailed” info about the latest schedule, medal counts, events and record-breaking moments for the Olympics.
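
    To make the calculator items above a little more concrete (see item 8), here’s a toy sketch of what turning operator words into an arithmetic answer can look like. It is purely illustrative and bears no relation to Google’s actual implementation; it only knows a handful of words and evaluates strictly left to right.

    ```python
    import re

    WORD_OPS = {"divided by": "/", "times": "*", "plus": "+", "minus": "-"}

    def answer(query: str) -> float:
        """Toy natural-language calculator: map operator words to symbols,
        then evaluate the expression strictly left to right."""
        q = query.lower()
        for word, symbol in WORD_OPS.items():
            q = q.replace(word, symbol)
        tokens = re.findall(r"\d+(?:\.\d+)?|[+\-*/]", q)
        result = float(tokens[0])
        for op, num in zip(tokens[1::2], tokens[2::2]):
            n = float(num)
            result = {"+": result + n, "-": result - n,
                      "*": result * n, "/": result / n}[op]
        return result

    print(answer("4 times 3 divided by 2"))  # 6.0
    ```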

    Of course, these lists leave out things Google otherwise blogged about, such as Flight Search in Canada and the interactive weather visualization for tablets.

    Also, just last week, Google made additional improvements to quick answers and the Knowledge Graph.

  • Remember, Google’s Newest Ranking Signal Is Only 1 Of Over 200

    Google announced on Friday that starting this week, it would begin using the number of “valid” copyright removal notices it gets for a site as a ranking signal. This immediately rubbed a lot of people the wrong way.

    In fact, various groups were quick to speak out about Google’s new policy. The EFF, for example, called the policy “opaque,” and expressed its concerns about how Google will make its determinations, and about the road to recourse (or lack thereof) that sites will have.

    “Sites may not know about, or have the ability to easily challenge, notices sent to Google,” said Public Knowledge Senior Staff Attorney, John Bergmayer. “And Google has set up a system that may be abused by bad faith actors who want to suppress their rivals and competitors. Sites that host a lot of content, or are very popular, may receive a disproportionate number of notices (which are mere accusations of infringement) without being disproportionately infringing. And user-generated content sites could be harmed by this change, even though the DMCA was structured to protect them.”

    Others have questioned how Google will deal with these notices with regards to its own properties – namely, YouTube. YouTube, of course, gets plenty of takedown requests, but they go through a different system (which Danny Sullivan has broken down into great detail). In fact, the takedown request form Google pointed to in its announcement of the feature, specifically mentions YouTube:

    “If you have a specific legal issue concerning YouTube, please visit this link for further information. Please do not use this tool to report issues that relate to YouTube.”

    Sullivan says Google told him, however, that “notices filed against YouTube through the separate YouTube copyright infringement reporting system will be combined with those filed against YouTube through the Google Search reporting system,” and that Google will treat YouTube like any other site. However, he reports, Google does not expect YouTube to be negatively affected by this, nor does it expect other popular user-generated content sites. Google, he says, told him that it will take into account other factors, besides the number of notices it receives.

    Well, this makes sense, because Google was pretty clear in its announcement that it was simply adding this as a signal – as in one of over 200.

    “We aim to provide a great experience for our users and have developed over 200 signals to ensure our search algorithms deliver the best possible results,” Google’s Amit Singhal said. “Starting next week, we will begin taking into account a new signal in our rankings: the number of valid copyright removal notices we receive for any given site.”

    YouTube and other popular sites likely have enough other signals working in their favor to counter this one signal. It doesn’t sound like Google’s newest signal is necessarily going to be its weightiest.

  • Google Moves Further Away From Keyword Dependence

    It seems that June, for Google, was all about improving how the search engine deals with natural language. On Friday, Google released a giant list of changes it made over the course of June and July. There were 86 entries on the list. Ten of them were specifically about natural language search improvements, and nine out of those ten were changes made in June.

    In addition to those ten changes, there were also five list entries dealing specifically with synonyms, which one might say are also related to natural language. All five of those were also made in June. The listings are as follows:

    • #81933. [project codename “Synonyms”] This launch improves use of query synonyms in ranking. Now we’re less likely to show documents where the synonym has a different meaning than the original search term.
    • gallium-2. [project codename “Synonyms”] This change improves synonyms inside concepts.
    • zinc-4. [project codename “Synonyms”] This change improves efficiency by not computing synonyms in certain cases.
    • #82460. [project codename “Snippets”] With this change we’re using synonyms to better generate accurate titles for web results.
    • #81977. [project codename “Synonyms”] This change updates our synonyms systems to make it less likely we’ll return adult content when users aren’t looking for it.

    The synonym-related changes indicate that Google is getting better at understanding what we mean, and what we’re looking for.

    The better Google gets at understanding the way users search in terms of the language they use, the more Google is getting away from its dependence on keywords for delivering relevant results, which appears to be one of Google’s main goals as a search engine.

    In fact, that’s exactly what the Knowledge Graph is all about. “Things, not strings,” as Google likes to put it.

    By the way, Google announced last week that it was expanding the Knowledge Graph globally (in English), and adding more to the Knowledge Graph interface on the SERPs.

  • 10 Natural Language Search Improvements Google Has Recently Made

    Late on Friday, Google finally unveiled its “search quality highlights” lists for both June and July in one fell swoop. In all, there were 86 changes to wade through, and we’re still wading. You can see the whole list here.

    We looked at the quality-specific changes here, and the mobile-specific changes here. Now, let’s look at all the natural language search stuff Google had on the lists. There were ten different items.

    1. One change (#82293), Google says, improves its dictionary search feature by adding support for more natural language searches. Unfortunately, it doesn’t get any more specific than that.

    2. Another (timeob), Google says, improves natural language detection for the time feature, so that it will better understand questions like, ‘What time is it in India?’

    3. Change #82496 involves the movie showtimes feature on mobile. Google says it improves recognition of natural language queries, as well as overall coverage.

    4. One change (#82537) is the addition of natural language detection for currency conversion. This will enable Google to better understand queries like, ‘What is $500 in euros?’ the company says.

    5. A similar feature (#82519) has been added for the flight status feature so Google can better understand questions about flight arrival times and status.

    6. With change #81776, Google says it has improved natural language detection for its unit conversion feature, so that it will better understand questions like, “What is 5 miles in kilometers?” (A rough sketch of how this kind of detection might work appears after the list.)

    7. Google says it has improved natural language processing for the dictionary search feature (#82887).

    8. Google has made changes to the natural language detection for its sunrise/sunset feature (#82935).

    9. Google says it has improved natural language detection for its MLB results, to better understand queries about baseball (schedules, scores, etc.). (#82536)

    10. Finally, Google made another natural language-related change to the movie showtimes feature, claiming it has improved the natural language processing. (#82948)

    All but number 10 were changes Google made in June. It appears there was far more focus on natural language queries in June than in July.
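
    Google doesn’t say how any of these detectors actually work, but several of them (currency, units, time) share a recognizable “what is X in Y” shape. Purely as a hypothetical sketch, here is what a simple rule-based detector for that pattern might look like; the regular expression, the conversion table, and the exchange rate are assumptions made for illustration, not anything Google has published.

    ```python
    import re

    # Hypothetical pattern for "what is <amount> <unit> in <unit>" style queries.
    CONVERSION_QUERY = re.compile(
        r"what is\s+\$?(?P<amount>[\d.]+)\s*(?P<from>[a-z]+)?\s+in\s+(?P<to>[a-z]+)",
        re.IGNORECASE,
    )

    # Tiny, made-up conversion table; the euro rate is a rough 2012-era figure,
    # not live data. A real system would cover far more units.
    RATES = {
        ("miles", "kilometers"): 1.609344,
        ("usd", "euros"): 0.81,
    }

    def detect_conversion(query):
        """Return (amount, from_unit, to_unit) if the query looks like a
        conversion question, otherwise None."""
        match = CONVERSION_QUERY.search(query)
        if not match:
            return None
        amount = float(match.group("amount"))
        from_unit = (match.group("from") or "usd").lower()  # a bare "$" implies dollars
        to_unit = match.group("to").lower()
        return amount, from_unit, to_unit

    def answer(query):
        parsed = detect_conversion(query)
        if parsed is None:
            return None
        amount, from_unit, to_unit = parsed
        rate = RATES.get((from_unit, to_unit))
        return None if rate is None else round(amount * rate, 2)

    print(answer("What is 5 miles in kilometers?"))  # 8.05
    print(answer("What is $500 in euros?"))          # 405.0
    ```

    Whatever Google is doing internally is surely far more sophisticated than a regular expression, but the point stands: recognizing the question behind the query matters more than the exact keywords in it.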

    Do you think Google has improved significantly in natural language search?

  • Here’s What Google Has Been Doing For Quality (And Panda) For The Past Two Months

    Google finally released its big lists of algorithm changes for the months of June and July, after an unexplained delay. In all, there were 86 changes on the lists to dig through, and we’re still analyzing them.

    In this post, we’ll look at all of the entries directly related to quality. There are a bunch. Technically, all 86 changes could be described as being related to quality, but these are the ones where Google specifically mentions “quality” in the listings.

    In June, Google had listings called Bamse and Bamse-17L, both under the project codename “Page Quality”. Both launches, Google says, are meant to help you find more high-quality content from trusted sources.

    Also under the “Page Quality” project codename, Google launched GreenLandII, #82353, #82666, and Hamel in June. #82666 is also meant to help you find more high-quality content from trusted sources.

    “We’ve incorporated new data into the Panda algorithm to better detect high-quality sites and pages,” says Google of GreenLandII.

    #82353, Google says, refreshes data for the “Panda high-quality sites algorithm.”

    Hamel updates a model Google uses to help you find high-quality pages with unique content, Google says.

    #82367, under the project codename “Other Ranking Components,” Google says, also helps you find more high-quality content from trusted sources.

    A change called “PandaMay,” under the project codename “Search Quality,” is a data refresh for Panda.

    Finally, a change referred to as ItsyBitsy, under the “Images” codename, is designed to improve the quality of image results. Google says it filters tiny, unhelpful images at the bottom of the image results pages.

    That about covers it for June in the “quality” department.

    In July, there were fewer changes altogether, and fewer specifically mentioning “quality”. There was the Panda JK launch (under the codename “Page Quality”). This was simply the launch of Panda in Japan and Korea.

    There was also JnBamboo (“Page Quality”), which was a data update for Panda.

    I think the main takeaways from all of this are:

    • Google is still very much focused on making improvements to Panda, which is all about surfacing high quality content.
    • Being a “trustworthy” source of content is more important than ever. You must establish yourself as an authority with a good track record of trustworthiness.
    • Unique content is important. Find ways to make your content stand out from the crowd.

    Image: gigglecam (YouTube)

  • Google Algorithm Changes For June, July Finally Released

    I’ve written a couple of times about how Google seemed to be falling behind on letting us know about its monthly algorithm changes. Finally, Google has made it right and posted the lists for both June and July in one giant list of 86 changes.

    We’ll certainly be digging into these more soon, but for now, here are both lists in their entirety:

    June:

    • uefa-euro1. [project codename “Answers”] Addition of a live result showing schedule and scores of the EURO 2012 games (European championship of national soccer teams).
    • #82293. [project codename “Answers”] Improved dictionary search feature by adding support for more natural language searches.
    • Better HTML5 resource caching for mobile. [project codename “Mobile”] We’ve improved caching of different components of the search results page, dramatically reducing latency in a number of cases.
    • ng2. [project codename “Other Ranking Components”] Better ordering of top results using a new and improved ranking function for combining several key ranking features.
    • Ref-16. [project codename “Other Ranking Components”] Changes to an “official pages” algorithm to improve internationalization.
    • Bamse. [project codename “Page Quality”] This launch helps you find more high-quality content from trusted sources.
    • Bamse-17L. [project codename “Page Quality”] This launch helps you find more high-quality content from trusted sources.
    • GreenLandII. [project codename “Page Quality”] We’ve incorporated new data into the Panda algorithm to better detect high-quality sites and pages.
    • #82353. [project codename “Page Quality”] This change refreshes data for the Panda high-quality sites algorithm.
    • SuperQ2. [project codename “Image”] We’ve updated a signal for Google Images to help return more on-topic image search results.
    • #82743. [project codename “Answers”] Changes to the calculator feature to improve recognition of queries containing “and,” such as [4 times 3 and a half].
    • komodo. [project codename “Query Understanding”] Data refresh for system used to better understand and search for long-tail queries.
    • #82580. [project codename “Answers”] This is an improvement for showing the sunrise and sunset times search feature.
    • PitCode. [project codename “Answers”] This launch adds live results for Nascar, MotoGP, and IndyCar. This is in addition to Formula1 results, which were already available.
    • timeob. [project codename “Answers”] We’ve improved natural language detection for the time feature to better understand questions like, “What time is it in India?”
    • #81933. [project codename “Synonyms”] This launch improves use of query synonyms in ranking. Now we’re less likely to show documents where the synonym has a different meaning than the original search term.
    • #82496. [project codename “Answers”] Changes made to the movie showtimes feature on mobile to improve recognition of natural language queries and overall coverage.
    • #82367. [project codename “Other Ranking Components”] This launch helps you find more high-quality content from trusted sources.
    • #82699. [project codename “Other Search Features”] We’ve made it easier to quickly compare places. Now you can hover over a local result and see information about that place on the right-hand side.
    • CapAndGown. [project codename “Image”] On many webpages, the most important images are closely related to the overall subject matter of the page. This project helps you find these salient images more often.
    • #82769. [project codename “Answers”] Improvements to the calculator feature on mobile to improve handling of queries that contain both words and numbers such as [4 times 3 divided by 2].
    • Vuvuzela. [project codename “SafeSearch”] We’ve updated SafeSearch to unify the handling of adult video content in videos mode and in the main search results. Explicit video thumbnails are now filtered more consistently.
    • #82537. [project codename “Answers”] We’ve enabled natural language detection for the currency conversion feature to better understand questions like, “What is $500 in euros?”
    • #82519. [project codename “Answers”] We’ve enabled natural language detection for the flight status feature to better understand questions about flight arrival times and status.
    • #82879. [project codename “Answers”] We’ve improved the triggering for the “when is” feature and understanding of queries like, “When is Mother’s Day?”
    • wobnl0330. [project codename “Answers”] Improvements to display of the weather search feature.
    • Lime. [project codename “Freshness”] This change improves the interaction between various search components to improve search results for searches looking for fresh content.
    • gas station. [project codename “Snippets”] This change removes the boilerplate text in sitelinks titles, keeping only the information useful to the user.
    • #81776. [project codename “Answers”] We’ve improved natural language detection for the unit conversion feature to better understand questions like, “What is 5 miles in kilometers?”
    • #81439. [project codename “Answers”] Improved display of the finance feature for voice search queries on mobile.
    • #82666. [project codename “Page Quality”] This launch helps you find more high-quality content from trusted sources.
    • #82541. [project codename “Other Ranking Components”] This is one of multiple projects that we’re working on to make our system for clustering web results better and simpler.
    • gaupe. [project codename “Universal Search”] Improves display of the flights search feature. Now, this result shows for queries with destinations outside the US, such as [flights from Austin to London].
    • #82887. [project codename “Answers”] We’ve improved natural language processing for the dictionary search feature.
    • gallium-2. [project codename “Synonyms”] This change improves synonyms inside concepts.
    • zinc-4. [project codename “Synonyms”] This change improves efficiency by not computing synonyms in certain cases.
    • Manzana2. [project codename “Snippets”] This launch improves clustering and ranking of links in the expanded sitelinks feature.
    • #82921. [project codename “Alternative Search Methods”] We’ve improved finance results to better understand finance-seeking queries spoken on mobile.
    • #82936. [project codename “Answers”] Improved display of the weather search feature, so you can ask [weather in california] or [is it hot in italy].
    • #82935. [project codename “Answers”] We’ve improved natural language detection for the sunrise/sunset feature.
    • #82460. [project codename “Snippets”] With this change we’re using synonyms to better generate accurate titles for web results.
    • #82953. [project codename “Answers”] This change improves detection of queries about weather.
    • PandaMay. [project codename “Search Quality”] We launched a data refresh for our Panda high-quality sites algorithm.
    • ItsyBitsy. [project codename “Images”] To improve the quality of image results, we filter tiny, unhelpful images at the bottom of our image results pages.
    • localtimeob. [project codename “Answers”] We’ve improved display of the local time search feature.
    • #82536. [project codename “Answers”] We’ve improved natural language detection to better understand queries about baseball and return the latest baseball information about MLB, such as schedules and the latest scores.
    • Improvements to Images Universal ranking. [project codename “Universal Search”] We significantly improved our ability to show Images Universal on infrequently searched-for queries.
    • absum3. [project codename “Snippets”] This launch helps us select better titles to display in the search results. This is a change to our algorithm that will specifically improve the titles for pages that are in non-Latin based languages.
    • #83051. [project codename “Answers”] We’ve improved display of local business information in certain mobile use cases. In particular, we’ll highlight information relevant to the search, including phone numbers, addresses, hours and more.
    • calc2-random. [project codename “Answers”] This change improves our understanding of calculator-seeking queries.
    • #82961. [project codename “Alternative Search Methods”] When you search for directions to or from a location on your mobile device without specifying the start point, we’ll return results starting from your current position.
    • #82984. [project codename “Universal Search”] This was previously available for users searching on google.com in English, and now it’s available for all users searching in English on any domain.
    • #82150. [project codename “Spelling”] Refresh of our algorithms for spelling systems in eight languages.
    • NoPathsForClustering. [project codename “Other Ranking Components”] We’ve made our algorithm for clustering web results from the same site or same path (same URL up until the last slash) more consistent. This is one of multiple projects that we’re working on to make our clustering system better and simpler.
    • Hamel. [project codename “Page Quality”] This change updates a model we use to help you find high-quality pages with unique content.
    • #81977. [project codename “Synonyms”] This change updates our synonyms systems to make it less likely we’ll return adult content when users aren’t looking for it.
    • Homeland. [project codename “Autocomplete”] This is an improvement to autocomplete that will help users to get predicted queries that are more relevant to their local country.

    July:

    • #82948. [project codename “Other Search Features”] We’ve improved our natural language processing to improve display of our movie showtimes feature.
    • yoyo. [project codename “Snippets”] This change leads to more useful text in sitelinks.
    • popcorn. [project codename “Snippets”] We’ve made a minor update to our algorithm that detects if a page is an “article.” This change facilitates better snippets.
    • Golden Eagle. [project codename “Autocomplete”] When Google Instant is turned off, we’ll sometimes show a direct link to a site in the autocomplete predictions. With this change we refreshed the data for those predictions.
    • #82301. [project codename “Indexing”] This change improves an aspect of our serving systems to save capacity and improve latency.
    • #82392. [project codename “Indexing”] This launch improves the efficiency of the Book Search ranking algorithms, making them more consistent with Web Search.
    • Challenger. [project codename “Snippets”] This is another change that will help get rid of generic boilerplate text in Web results’ titles, particularly for sitelinks.
    • #83166. [project codename “Universal Search”] This change is a major update to Google Maps data for the following regions: CZ, GR, HR, IE, IT, VA, SM, MO, PT, SG, LS. This new data will appear in maps universal results.
    • #82515. [project codename “Translation and Internationalization”] This change improves the detection of queries that would benefit from translated results.
    • bergen. [project codename “Other Ranking Components”] This is one of multiple projects that we’re working on to make our system for clustering web results better and simpler.
    • Panda JK. [project codename “Page Quality”] We launched Panda on google.co.jp and google.co.kr to promote more high-quality sites for users in Japan and Korea.
    • rrfix4. [project codename “Freshness”] This is a bug fix to a freshness algorithm. This change turns off a freshness algorithm component in certain cases when it should not be affecting the results.
    • eventhuh4. [project codename “Knowledge Graph”] We’ll show a list of upcoming events in the Knowledge Graph for city-related searches such as [san francisco] and [events in san francisco].
    • #83483. [project codename “Universal Search”] This change helps surface navigation directions directly in search results for more queries.
    • Zivango. [project codename “Refinements”] This change leads to more diverse search refinements.
    • #80568. [project codename “Snippets”] This change improves our algorithm for generating site hierarchies for display in search result snippets.
    • Labradoodle. [project codename “SafeSearch”] We’ve updated SafeSearch algorithms to better detect adult content.
    • JnBamboo. [project codename “Page Quality”] We’ve updated data for our Panda high-quality sites algorithm.
    • #83242. [project codename “Universal Search”] This change improves news universal display by using entities from the Knowledge Graph.
    • #75921. [project codename “Autocomplete”] For some time we’ve shown personalized predictions in Autocomplete for users who’ve enabled Web History on google.com in English. With this change, we’re internationalizing the feature.
    • #83301. [project codename “Answers”] Similar to the live results we provide for sports like baseball or European football, you can now search on Google and find rich, detailed information about the latest schedule, medal counts, events, and record-breaking moments for the world’s largest sporting spectacle.
    • #83432. [project codename “Autocomplete”] This change helps users find more fresh trending queries in Japanese as part of autocomplete.


    Any of these catching your eye? Let us know what you think in the comments.

    Of course, Google also announced a big new ranking change that it will implement next week, which is already generating some controversy. Google will start looking at the number of takedown notices sites receive. More on that here.

  • New Google Algorithm Change Immediately Raises Concerns

    As previously reported, Google announced that it will implement a new ranking signal into its search algorithm next week. The search engine will start taking the number of “valid” copyright removal notices it receives for a site into account when ranking content.

    Are you concerned about this new addition? Let us know in the comments.

    Almost as soon as the Blogosphere was able to react to the news, the Electronic Frontier Foundation (EFF) put out its own post about it. Julie Samuels and Mitch Stoltz with the EFF write, “Earlier this summer, we applauded Google for releasing detailed stats about content removal requests from copyright holders. Now that we know how they are going to use that data, we are less enthusiastic.”

    The two go on to express concerns with how “opaque” Google is being about the process, despite Google’s claim that it will “continue to be transparent about copyright removals.”

    The EFF’s concerns are the vagueness of what Google considers to be a high number of removal notices, how Google plans to make its determinations, and the fact that “there will be no process of recourse for sites who have been demoted.”

    Google does say that it will “continue to provide ‘counter-notice’ tools so that those who believe their content has been wrongly removed can get it reinstated.”

    “In particular, we worry about the false positives problem,” says the EFF. “For example, we’ve seen the government wrongly target sites that actually have a right to post the allegedly infringing material in question or otherwise legally display content. In short, without details on how Google’s process works, we have no reason to believe they won’t make similar, over-inclusive mistakes, dropping lawful, relevant speech lower in its search results without recourse for the speakers.”

    “Takedown requests are nothing more than accusations of copyright infringement,” the EFF adds. “No court or other umpire confirms that the accusations are valid (although copyright owners can be liable for bad-faith accusations). Demoting search results – effectively telling the searcher that these are not the websites you’re looking for – based on accusations alone gives copyright owners one more bit of control over what we see, hear, and read.”

    The EFF concludes by saying that Google’s “opaque policies” threaten lawful sites and undermine confidence in search results.

    The EFF is not the only group to quickly speak out about the announcement. Public Knowledge, a consumer rights group, also put out a much longer response.

    We also received the following statement from Public Knowledge Senior Staff Attorney, John Bergmayer:

    “It may make good business sense for Google to take extraordinary steps, far beyond what the law requires, to help the media companies it partners with. That said, its plan to penalize sites that receive DMCA notices raises many questions.

    “Sites may not know about, or have the ability to easily challenge, notices sent to Google. And Google has set up a system that may be abused by bad faith actors who want to suppress their rivals and competitors. Sites that host a lot of content, or are very popular, may receive a disproportionate number of notices (which are mere accusations of infringement) without being disproportionately infringing. And user-generated content sites could be harmed by this change, even though the DMCA was structured to protect them.

    “Google needs to make sure this change does not harm Internet users or the Internet ecosystem.”

    It’s going to be quite interesting to see how Google’s new policy/signal holds up to abuse, and whether or not we see fair use significantly jeopardized.

    Tell us what you think about the change in the comments.