WebProNews

Tag: Whitelist

  • Google Makes Whitelist Admission

    Google’s often made a big deal about using automated solutions to handle problems.  The search giant hasn’t tried to address every single Google bomb individually, for example; it’s preferred to make algorithm changes that can tackle lots of weaknesses at once.  Now, though, an admission related to whitelists is getting some attention.

    At SMX West, Danny Sullivan put a question to Google’s Matt Cutts: “So Google might decide there’s some particular signal within the overall ranking algorithm that works for say 99% of the sites as Google hopes, but maybe that also hits a few outlying sites in a way they wouldn’t expect – in a way they feel harms the search results – then Google might except those sites?”

    Cutts indicated that Google might indeed make such exceptions (no exact quote is available).

    That admission’s earning Google some negative attention, given the company’s long-standing claims about relying on algorithms.  Cade Metz pointed out that, if Google has been giving some sites special attention, European antitrust regulators might now want to revisit old decisions.

    Lots of small business owners and site administrators may have new – and not so polite – questions and/or accusations for Google, as well.

    Google has tried to address the matter, though.  The company said in a statement:

    “Our goal is to provide people with the most relevant answers as quickly as possible, and we do that primarily with computer algorithms.  In our experience, algorithms generate much better results than humans ranking websites page by page.  And given the hundreds of millions of queries we get every day, it wouldn’t be feasible to handle them manually anyway.

    That said, we do sometimes take manual action to deal with problems like malware and copyright infringement.  Like other search engines (including Microsoft’s Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality.  We don’t keep a master list protecting certain sites from all changes to our algorithms.

    The most common manual exceptions we make are for sites that get caught by SafeSearch – a tool that gives people a way to filter adult content from their results.  For example, “essex.edu” was incorrectly flagged by our SafeSearch algorithms because it contains the word “sex.”  On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.

    Of course, we would much prefer not to make any manual changes and not to maintain any exception lists.  But search is still in its infancy, and our algorithms can’t answer all questions.”
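
    To see why a site like essex.edu can get caught in the first place, a minimal sketch helps.  The code below – a naive keyword filter paired with an exception list, written in Python – is a hypothetical illustration, not Google’s actual implementation; the keyword set, the exception entries, and the is_filtered function are all assumptions made for the example.

        # Hypothetical keywords a crude SafeSearch-style filter might
        # scan for inside a domain name (illustrative, not Google's list).
        ADULT_KEYWORDS = {"sex", "porn", "xxx"}

        # Hypothetical exception list: domains known to be false positives.
        SAFESEARCH_EXCEPTIONS = {"essex.edu", "sussex.ac.uk", "middlesex.edu"}

        def is_filtered(domain: str) -> bool:
            """Return True if the adult-content filter should hide the domain."""
            if domain in SAFESEARCH_EXCEPTIONS:
                return False  # a manual exception overrides the algorithm
            # Naive substring match: the source of the "essex.edu" false positive.
            return any(keyword in domain for keyword in ADULT_KEYWORDS)

        print(is_filtered("essex.edu"))        # False: rescued by the exception list
        print(is_filtered("example-sex.com"))  # True: caught by the keyword match

    The ordering matters: the exception check runs before the keyword match, so the manual list overrides whatever the algorithm would otherwise decide.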

  • Facebook, MySpace, YouTube Named Top Blacklisted Sites Of 2010

    Social media sites like Facebook, MySpace, and YouTube continue to polarize corporations and people in charge of networks, judging by a new report from OpenDNS.  The sites showed up on both "top blacklisted" and "top whitelisted" lists covering the entire year of 2010.

    If you’d like definitions of those terms, OpenDNS provided them in its report: "Blacklists are typically used when there is no desire to block an entire category in principle, but there is a focus on preventing traffic to specific websites based on a combination of their popularity and content.  This top ten list suggests a concern with the use of bandwidth by streaming sites and with privacy concerns from advertising networks."

    Then the company added, "Whitelists are typically used when there is a desire to block entire categories, but access to selected websites is granted on an exception basis.  These sites represent the most trusted sites in their category."
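
    To make those two definitions concrete, here is a minimal sketch of a category-based filter that supports both kinds of list.  The category data, domain names, and is_allowed function below are illustrative assumptions written in Python; OpenDNS hasn’t published this code, and it isn’t the company’s API.

        # Block whole categories, then carve out per-site exceptions.
        BLOCKED_CATEGORIES = {"social-media", "adult"}

        # Whitelist: allowed despite belonging to a blocked category.
        WHITELIST = {"youtube.com"}

        # Blacklist: blocked even though its category is otherwise allowed.
        BLACKLIST = {"streaming.example.com"}

        # Hypothetical category lookup for a handful of domains.
        CATEGORY_OF = {
            "facebook.com": "social-media",
            "youtube.com": "social-media",
            "streaming.example.com": "entertainment",
        }

        def is_allowed(domain: str) -> bool:
            """Apply the blacklist first, then the whitelist, then the category policy."""
            if domain in BLACKLIST:
                return False  # explicitly blocked despite an allowed category
            if domain in WHITELIST:
                return True   # exception to an otherwise-blocked category
            return CATEGORY_OF.get(domain, "unknown") not in BLOCKED_CATEGORIES

        print(is_allowed("facebook.com"))           # False: blocked by category
        print(is_allowed("youtube.com"))            # True: whitelisted exception
        print(is_allowed("streaming.example.com"))  # False: blacklisted site

    The precedence mirrors the report’s definitions: the blacklist blocks a specific site whose category isn’t blocked in principle, while the whitelist grants access to a trusted site inside a blocked category.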

    All sorts of interpretations seem possible as a result.  An unpleasant one for Mark Zuckerberg: perhaps companies like Facebook even less than they like Playboy.  Or, to head in the other direction, maybe the average office worker prefers Facebook to Playboy, thereby necessitating the "blacklist" response.

    Either way, the sites appearing on these lists can at least argue that they’ve succeeded in attracting everyone’s attention.  And home usage is likely to be high as a result, since blacklists and whitelists can’t restrict everyone’s browsing all the time.