WebProNews

Tag: SEO

  • This Google Panda “Victim” Just Posted Record Revenue And Profitability

    The Google Panda update, originally unleashed in early 2011, continues to take its toll on the Internet, for better or for worse (most would probably say better). While there are frequently rumors about new updates or refreshes to Panda, the last one that we’ve had official confirmation on was only last month (Update: Speak of the devil. Google just confirmed one is rolling out). Panda will continue to patrol Google’s search results for the foreseeable future, so webmasters who want to attract visibility in them should pay attention to the kinds of things Panda likes (or doesn’t like).

    Have you been hit by the Panda update at any time since it was first launched? Were you able to recover? Share your story.

    Demand Media paid attention when its major content property eHow fell victim to the update last year. Now, the company has released its quarterly earnings report, posting record revenue and profitability. If that’s not a Panda recovery story, I don’t know what is. It appears safe to say that Demand Media has conquered Panda and is flourishing.

    Revenue was up 20% year-over-year at $98.1 million, with $3.2 million profit (compared to a $4.1 million loss for the same quarter last year).

    CEO Richard Rosenblatt said, “Demand Media’s audience surpassed 125 million monthly unique visitors during the third quarter, as we delivered record revenue and profitability. For the first time in over a year, we increased our content investments for two consecutive quarters as we expanded the distribution of our content platform. We remain focused on our long-term growth initiatives, which include continuing to increase our investment in core content as well as in opportunities across mobile, video, international, and new generic Top Level Domains.”

    Now, to be clear, Demand Media has revenue sources that have little to do with Panda. For one, the company runs a major registrar business. However, the content side of things, and even eHow itself, continue to improve in performance. Make no mistake. Demand Media has come back from the Panda update.

    The company’s owned and operated page views increased 33% year-over-year, driven primarily by strong traffic growth on eHow.com and LiveStrong.com, the company says.

    eHow has historically been the poster child of the Panda update, you might say. Some believe Demand Media was one of the major drivers in Google even creating the update. If you can remember back before the update was first unleashed, there was a lot of discussion in the media and Blogosphere about content farms. eHow was often cited (if not the most often cited) as falling into this category. In fact, many were shocked when Google finally pushed the update, and eHow appeared to escape unscathed.

    That did not last, however. As Google continued to push out more updates for Panda, Demand Media eventually felt the effects, and by then it was a public company, and had to answer to investors. It deleted tons of articles. At first the number it gave was 300,000. In May, Demand Media revealed that it had deleted as many as 600,000 articles. It’s unclear whether they’ve deleted more since then. They didn’t just delete articles they found to be of low quality though. They also sent numerous articles through a more rigorous editing process, and added a feedback tool to all content so users could indicate any problems they come across. They also got rid of a lot of non-professional writers, and added more “expert” and celebrity curators. Essentially, eHow got a big boost in the quality control department.

    Since the clean-up initiative, the company has hardly looked back. eHow has increased its audience steadily. Now, eHow is ranked as the #13 site in the U.S. according to comScore. That’s up even from the previous quarter, when it was ranked #16. eHow had over 100 million unique monthly visitors worldwide for the 11th consecutive quarter, according to Rosenblatt, who cited internal numbers.

    Demand Media’s properties are seeing a billion worldwide monthly uniques, which is a record for the company. Demand has been so pleased with the progress it has made in the content area that earlier this year it promoted Michael Blend, who had been leading its content and media services, to President and COO.

    Rosenblatt discussed the progress during the company’s earnings conference call, attributing the success largely to articles, videos and mobile apps with quality content and engaged communities. “All in all we really raised our game,” he said, noting that they have expanded the diversity of articles and added assignment curators.

    He also noted that almost half of the company’s articles are being published to its network of content partners.

    One important thing to note about all of this, with regards to the Panda update and search referrals, is that this whole quality control initiative has greatly helped the company to gain traffic from social media (especially Facebook). I think it’s safe to say that a decreased dependence on Google is really the cornerstone for a true Panda recovery. That way, if you do get hit by Panda at a later time, it doesn’t kill your traffic entirely. Of course, if you’re producing the kind of content that people want to share on social networks, it’s highly unlikely that you’re doing things that Panda wouldn’t like.

    If you still haven’t taken the time to assess the quality of your site’s content, you may want to do so. The next Panda refresh is likely just around the corner.

    What do you think of Demand Media’s efforts in bouncing back from Panda? Let us know in the comments.

    Here’s Demand Media’s earnings release in its entirety: 

    SANTA MONICA, Calif.–(BUSINESS WIRE)–Nov. 5, 2012– Demand Media, Inc. (NYSE: DMD), a leading digital media company, today reported financial results for the quarter ended September 30, 2012.

    “Demand Media’s audience surpassed 125 million monthly unique visitors during the third quarter, as we delivered record revenue and profitability,” said Richard Rosenblatt, Chairman and CEO of Demand Media. “For the first time in over a year, we increased our content investments for two consecutive quarters as we expanded the distribution of our content platform. We remain focused on our long-term growth initiatives, which include continuing to increase our investment in core content as well as in opportunities across mobile, video, international, and new generic Top Level Domains.”

    Financial Summary
    In millions, except per share amounts

                                           Three months ended September 30,
                                             2011         2012       Change
    Total Revenue                          $ 81.5       $ 98.1          20%
    Content & Media Revenue ex-TAC(1)      $ 47.4       $ 58.8          24%
    Registrar Revenue                        30.7         34.0          11%
    Total Revenue ex-TAC(1)                $ 78.1       $ 92.8          19%
    Income (loss) from Operations          $ (3.3)      $  4.5           NA
    Adjusted EBITDA(1)                     $ 21.7       $ 27.6          28%
    Net income (loss)                      $ (4.1)      $  3.2           NA
    Adjusted net income(1)                 $  5.0       $  9.8          97%
    EPS                                    $ (0.05)     $ 0.04           NA
    Adjusted EPS(1)                        $  0.06      $ 0.11          83%
    Cash Flow from Operations              $ 22.1       $ 24.6          12%
    Free Cash Flow(1)                      $  6.0       $ 16.6         177%
    (1) These non-GAAP financial measures are described below and reconciled to their comparable GAAP measures in the accompanying tables. Effective Q1 2012, the Company began reporting Adjusted EBITDA instead of Adjusted OIBDA. Reconciliations for both measures are available on the investor relations section of the Company’s website.

    Q3 2012 Financial Summary:

    • Content & Media Revenue ex-TAC grew 24% year-over-year, due primarily to strong page view growth on the Company’s owned & operated properties, as well as 50% growth in network RPMs, reflecting higher revenue from our growing network of content partners. Sequentially, Content & Media Revenue ex-TAC increased 6% compared to the second quarter of 2012, driven primarily by network RPM growth.
    • Registrar revenue grew 11% year-over-year and increased 2% compared to the second quarter of 2012. Revenue growth was driven by an increase in number of domains on our platform, due primarily to growth from new partners.
    • Free Cash Flow was $16.6 million compared to $6.0 million a year ago, reflecting growth in cash flow from operations and a year-over-year reduction in intangible asset content spend, primarily on eHow. Sequentially, investment in intangible assets increased 36% compared to the second quarter of 2012.

    “We continued our 2012 financial momentum in Q3 with record adjusted EBITDA and strong free cash flow growth, while increasing our investment in content sequentially,” said CFO Mel Tang. “We are raising our 2012 financial guidance and remain focused on driving Demand Media’s long-term growth through continued disciplined investments.”

    Q3 2012 Business Highlights(1):

    • On a consolidated basis, Demand Media ranked as a top 20 US web property throughout the first nine months of 2012, ranking as #13 in September 2012, up from #17 in January 2012. Demand Media’s web properties reached over 125 million unique users worldwide in September 2012.
    • On a standalone basis, eHow.com ranked as the #13 website in the US in September 2012.
    • LIVESTRONG.COM ranked as the #3 Health property in the US in September 2012.
    • Cracked.com maintained its ranking as the most visited humor site in the US throughout the first half of 2012, with more time spent on the site than any other humor website. The Cracked Network, which includes IndieClick, ranked as the #1 Humor property in the US in September 2012.

    (1) Source: comScore.

    Operating Metrics:

                                                     Three months ended September 30,
                                                        2011        2012     % Change
    Content & Media Metrics:
    Owned and operated
      Page views(1) (in millions)                      2,527       3,363         33%
      RPM(2)                                         $ 15.16     $ 13.49        (11)%
    Network of customer websites
      Page views(1) (in millions)                      5,046       4,965         (2)%
      RPM(2)                                          $ 2.47      $ 3.78         53%
      RPM ex-TAC(3)                                   $ 1.80      $ 2.70         50%
    Registrar Metrics:
    End of Period # of Domains(4) (in millions)         12.2        13.7         12%
    Average Revenue per Domain(5)                    $ 10.20      $ 9.99         (2)%

    ____________________

    (1) Page views represent the total number of web pages viewed across (a) our owned and operated websites and/or (b) our network of customer websites, to the extent that the viewed web pages of our customers host the Company’s content, social media and/or monetization services.
    (2) RPM is defined as Content & Media revenue per one thousand page views.
    (3) RPM ex-TAC is defined as Content & Media Revenue ex-TAC per one thousand page views.
    (4) Domain is defined as an individual domain name paid for by a third-party customer where the domain name is managed through our Registrar service offering.
    (5) Average revenue per domain is calculated by dividing Registrar revenue for a period by the average number of domains registered in that period. Average revenue per domain for partial year periods is annualized.
    Beginning July 1, 2011, the number of net new domains has been adjusted to include only new registered domains added to our platform for which we have recognized revenue. Excluding the impact of this change, average revenue per domain during the three months ended September 30, 2012 would have increased 1% compared to the corresponding prior-year periods.
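    To make the RPM definitions in notes (2) and (3) above concrete, here is an editorial check (not company-provided code) against the Q3 2012 figures reported elsewhere in this release. Revenue is in thousands of dollars and page views are in millions, so the factors of one thousand cancel and RPM reduces to revenue divided by page views.

```python
# Illustrative check of the RPM definitions above, using Q3 2012 figures from this
# release: revenue in thousands of dollars, page views in millions of views.
def rpm(revenue_thousands, page_views_millions):
    # Revenue per one thousand page views; the "thousands" cancel out.
    return revenue_thousands / page_views_millions

owned_operated_rpm = rpm(45_377, 3_363)          # ~13.49, as reported
network_rpm = rpm(18_759, 4_965)                 # ~3.78, as reported
network_rpm_ex_tac = rpm(18_759 - 5_350, 4_965)  # ~2.70; TAC is the revenue shared with network customers
print(round(owned_operated_rpm, 2), round(network_rpm, 2), round(network_rpm_ex_tac, 2))
```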

    Q3 2012 Operating Metrics:

    • Owned & Operated page views increased 33% year-over-year, driven primarily by strong traffic growth on eHow.com and LIVESTRONG.COM. Owned & Operated RPMs decreased 11% year-over-year, due primarily to page view growth from lower RPM properties and traffic sources, including growth in mobile traffic.
    • Network page views decreased 2% year-over-year to 5.0 billion, due primarily to lower traffic from our social media partners. Network RPM ex-TAC increased 50% year-over-year, reflecting higher revenue from our growing network of content partners, primarily YouTube.
    • End of period domains increased 12% year-over-year to 13.7 million, driven primarily by the addition of higher volume customers and continued growth from existing resellers, with average revenue per domain decreasing by 2%, due to a mix shift to higher volume resellers.

    Business Outlook

    The following forward-looking information includes certain projections made by management as of the date of this press release. The Company does not intend to revise or update this information, except as required by law, and may not provide this type of information in the future. Due to a variety of factors, actual results may differ significantly from those projected. The factors that may affect results include, without limitation, the factors referenced later in this announcement under the caption “Cautionary Information Regarding Forward-Looking Statements.” These and other factors are discussed in more detail in the Company’s filings with the Securities and Exchange Commission.

    Excluding up to $3 million of 2012 expenses that the Company expects to incur related to the formation of its generic Top Level Domain (“gTLD”) initiative, the Company’s guidance for the fourth quarter and fiscal year ending December 31, 2012 is as follows:

    Fourth Quarter 2012

    • Revenue in the range of $101.5 – $103.5 million
    • Revenue ex-TAC in the range of $95.5 – $97.5 million
    • Adjusted EBITDA in the range of $27.5 – $28.5 million
    • Adjusted EPS in the range of $0.10 – $0.11 per share
    • Weighted average diluted shares of 89.5 – 90.5 million

    Full Year 2012

    • Revenue in the range of $378.9 – $380.9 million
    • Revenue ex-TAC in the range of $359.8 – $361.8 million
    • Adjusted EBITDA in the range of $101.6 – $102.6 million
    • Adjusted EPS in the range of $0.37 – $0.38 per share
    • Weighted average diluted shares of 86.5 – 87.5 million

    Conference Call and Webcast Information

    Demand Media will host a corresponding conference call and live webcast at 5:00 p.m. Eastern time today. To access the conference call, dial 877.565.1268 (for domestic participants) or 937.999.3108 (for international participants). The conference ID is 48753341. In order to participate on the live call, it is recommended that analysts dial in at least 10 minutes prior to the commencement of the call. A live webcast also will be available on the Investor Relations section of the Company’s corporate website at http://ir.demandmedia.com and via replay beginning approximately two hours after the completion of the call.

    About Non-GAAP Financial Measures

    To supplement our consolidated financial statements, which are prepared and presented in accordance with accounting principles generally accepted in the United States of America (“GAAP”), we use certain non-GAAP financial measures described below. The presentation of this additional financial information is not intended to be considered in isolation or as a substitute for, or superior to, the financial information prepared and presented in accordance with GAAP. For more information on these non-GAAP financial measures, please see the tables captioned “Reconciliation of Non-GAAP Measures to Unaudited Consolidated Statements of Operations” included in this release.

    Effective Q1 2012, the Company began reporting Adjusted EBITDA instead of Adjusted OIBDA. While the dollar value of each measure is the same, a comparison of the historical reconciliation of both measures is provided in our supplemental financial schedules posted on the investor relations section of our corporate website at http://ir.demandmedia.com. The non-GAAP financial measures presented in this release are the primary measures used by the Company’s management and board of directors to understand and evaluate its financial performance and operating trends, including period to period comparisons, to prepare and approve its annual budget and to develop short and long term operational plans. Additionally, Adjusted EBITDA is the primary measure used by the compensation committee of the Company’s board of directors to establish the funding targets for and fund its annual bonus pool for the Company’s employees and executives. We believe our presented non-GAAP financial measures are useful to investors both because (1) they allow for greater transparency with respect to key metrics used by management in its financial and operational decision-making and (2) management frequently uses them in its discussions with investors, commercial bankers, securities analysts and other users of its financial statements.

    Revenue ex-TAC is defined by the Company as GAAP revenue less traffic acquisition costs (“TAC”). TAC comprises the portion of Content & Media GAAP revenue shared with the Company’s network customers. Management believes that Revenue ex-TAC is a meaningful measure of operating performance because it is frequently used for internal managerial purposes and helps facilitate a more complete period-to-period understanding of factors and trends affecting the Company’s underlying revenue performance of its Content & Media service offering.
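    As an illustration of that definition (editorial arithmetic, not part of the release), the Q3 2012 figures from the reconciliation table later in this release tie out as follows, with amounts in thousands of dollars.

```python
# Illustrative recomputation of Q3 2012 Revenue ex-TAC (amounts in thousands of dollars),
# using figures from the reconciliation table later in this release.
content_media_revenue = 64_136
traffic_acquisition_costs = 5_350
registrar_revenue = 34_011

content_media_revenue_ex_tac = content_media_revenue - traffic_acquisition_costs  # 58,786
total_revenue_ex_tac = content_media_revenue_ex_tac + registrar_revenue           # 92,797
print(content_media_revenue_ex_tac, total_revenue_ex_tac)
```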

    Adjusted earnings before interest, taxes, depreciation and amortization (“Adjusted EBITDA”) is defined by the Company as net income (loss) before income tax expense, other income (expense), interest expense (income), depreciation, amortization, stock-based compensation, as well as the financial impact of acquisition and realignment costs, the formation expenses directly related to its gTLD initiative, and any gains or losses on certain asset sales or dispositions. Acquisition and realignment costs include such items, when applicable, as (1) non-cash GAAP purchase accounting adjustments for certain deferred revenue and costs, (2) legal, accounting and other professional fees directly attributable to acquisition activity, and (3) employee severance payments attributable to acquisition or corporate realignment activities. Management does not consider these expenses to be indicative of the Company’s ongoing operating results or future outlook.
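    To trace that definition through the numbers (again, an editorial illustration rather than part of the release), the Q3 2012 reconciliation later in this release works out as follows, in thousands of dollars.

```python
# Illustrative walk from Q3 2012 GAAP net income to Adjusted EBITDA
# (amounts in thousands of dollars, from the reconciliation table later in this release).
net_income = 3_175
adjusted_ebitda = (
    net_income
    + 1_180   # income tax expense
    + 159     # interest and other expense, net
    + 14_333  # depreciation and amortization
    + 8_046   # stock-based compensation
    + 20      # acquisition and realignment costs
    + 707     # gTLD formation expense
)
print(adjusted_ebitda)  # 27620, matching the reported Adjusted EBITDA of $27.6 million
```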

    Management believes that these non-GAAP financial measures reflect the Company’s business in a manner that allows for meaningful period to period comparisons and analysis of trends. In particular, the exclusion of certain expenses in calculating Adjusted EBITDA can provide a useful measure for period to period comparisons of the Company’s underlying recurring revenue and operating costs, which is focused more closely on the current costs necessary to utilize previously acquired long-lived assets. In addition, management believes that it can be useful to exclude certain non-cash charges because the amount of such expenses is the result of long-term investment decisions in previous periods rather than day-to-day operating decisions. For example, due to the long-lived nature of a majority of its media content, the revenue generated by the Company’s media content assets in a given period bears little relationship to the amount of its investment in media content in that same period. Accordingly, management believes that content acquisition costs represent a discretionary long-term capital investment decision undertaken at a point in time. This investment decision is clearly distinguishable from other ongoing business activities, and its discretionary nature and long-term impact differentiate it from specific period transactions, decisions regarding day-to-day operations, and activities that would have an immediate impact on operating or financial performance if materially changed, deferred or terminated.

    Adjusted Earnings Per Share is defined by the Company as Adjusted Net Income divided by the weighted average number of shares outstanding. Adjusted Net Income is defined by the Company as net income (loss) before the effect of stock-based compensation, amortization of intangible assets acquired via business combinations, accelerated amortization of intangible assets removed from service, acquisition and realignment costs, the formation expenses directly related to its gTLD initiative, and any gains or losses on certain asset sales or dispositions, and is calculated using the application of a normalized effective tax rate. Acquisition and realignment costs include such items, when applicable, as (1) non-cash GAAP purchase accounting adjustments for certain deferred revenue and costs, (2) legal, accounting and other professional fees directly attributable to acquisition activity, and (3) employee severance payments attributable to acquisition or corporate realignment activities. Management does not consider these expenses to be indicative of the Company’s ongoing operating results or future outlook.
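    The same exercise for Adjusted Net Income and Adjusted EPS, as an editorial illustration using the Q3 2012 reconciliation figures later in this release (amounts in thousands except per-share figures):

```python
# Illustrative recomputation of Q3 2012 Adjusted Net Income and Adjusted EPS
# (amounts in thousands of dollars; figures from the reconciliation table later in this release).
gaap_net_income = 3_175
adjusted_net_income = (
    gaap_net_income
    + 8_046  # stock-based compensation
    + 2_666  # amortization of intangible assets acquired via business combinations
    + 20     # acquisition and realignment costs
    + 707    # gTLD formation expense
    - 4_822  # income tax effect of the items above and a normalized 38% statutory rate
)
diluted_shares = 88_754  # weighted average diluted shares (in thousands)
adjusted_eps = adjusted_net_income / diluted_shares
print(adjusted_net_income, round(adjusted_eps, 2))  # 9792 and ~0.11
```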

    Management believes that Adjusted Net Income and Adjusted Earnings Per Share provide investors with additional useful information to measure the Company’s underlying financial performance, particularly from period to period, because these measures are exclusive of certain non-cash expenses not directly related to the operation of its ongoing business (such as amortization of intangible assets acquired via business combinations, as well as certain other non-cash expenses such as purchase accounting adjustments and stock-based compensation) and include a normalized effective tax rate based on the Company’s statutory tax rate.

    Discretionary Free Cash Flow is defined by the Company as net cash provided by operating activities excluding cash outflows from acquisition and realignment activities, and the formation expenses directly related to its gTLD initiative, less capital expenditures to acquire property and equipment. Free Cash Flow is defined by the Company as Discretionary Free Cash Flow less investments in intangible assets and is not impacted by gTLD application payments, which were $18.1 million in Q2 2012. Management believes that Discretionary Free Cash Flow and Free Cash Flow provide investors with additional useful information to measure operating liquidity because they reflect the Company’s underlying cash flows from recurring operating activities after investing in capital assets and intangible assets. These measures are used by management, and may also be useful for investors, to assess the Company’s ability to generate cash flow for a variety of strategic opportunities, including reinvestment in the business, pursuing new business opportunities, potential acquisitions, payment of dividends and share repurchases.
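    Putting those two definitions to work on the Q3 2012 figures (an editorial illustration; amounts in thousands of dollars, drawn from the reconciliation table later in this release):

```python
# Illustrative recomputation of Q3 2012 Discretionary Free Cash Flow and Free Cash Flow
# (amounts in thousands of dollars, from the reconciliation table later in this release).
operating_cash_flow = 24_595
capex = 4_982                       # purchases of property and equipment
gtld_expense_cash_flows = 488       # excluded (added back) per the definition above
intangible_asset_purchases = 3_468  # investments in intangible assets, e.g. content

discretionary_free_cash_flow = operating_cash_flow - capex + gtld_expense_cash_flows  # 20,101
free_cash_flow = discretionary_free_cash_flow - intangible_asset_purchases            # 16,633
print(discretionary_free_cash_flow, free_cash_flow)
```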

    The use of these non-GAAP financial measures has certain limitations because they do not reflect all items of income and expense, or cash flows that affect the Company’s operations. An additional limitation of these non-GAAP financial measures is that they do not have standardized meanings, and therefore other companies may use the same or similarly named measures but exclude different items or use different computations. Management compensates for these limitations by reconciling these non-GAAP financial measures to their most comparable GAAP financial measures within its financial press releases. Non-GAAP financial measures should be considered in addition to, not as a substitute for, financial measures prepared in accordance with GAAP. Further, these non-GAAP financial measures may differ from the non-GAAP financial information used by other companies, including peer companies, and therefore comparability may be limited. We encourage investors and others to review our financial information in its entirety and not rely on a single financial measure. The accompanying tables have more details on the GAAP financial measures and the related reconciliations.

    About Demand Media

    Demand Media, Inc. (NYSE: DMD) is a leading digital media company that informs and entertains one of the internet’s largest audiences, helps advertisers find innovative ways to engage with their customers and enables publishers to expand their online presence. Headquartered in Santa Monica, CA, Demand Media has offices in North America, South America and Europe. For more information about Demand Media, please visit www.demandmedia.com

    Cautionary Information Regarding Forward-Looking Statements

    This press release contains forward-looking statements within the meaning of the “safe harbor” provisions of the Private Securities Litigation Reform Act of 1995, as amended. These forward-looking statements involve risks and uncertainties regarding the Company’s future financial performance, and are based on current expectations, estimates and projections about our industry, financial condition, operating performance and results of operations, including certain assumptions related thereto. Statements containing words such as “guidance,” “may,” “believe,” “anticipate,” “expect,” “intend,” “plan,” “project,” “projections,” “business outlook,” and “estimate” or similar expressions constitute forward-looking statements. Actual results may differ materially from the results predicted, and reported results should not be considered an indication of future performance. Potential risks and uncertainties include, among others: changes in the methodologies of internet search engines, including ongoing algorithmic changes made by Google to its search results as well as possible future changes, and the impact such changes may have on page view growth and driving search related traffic to our owned and operated websites and the websites of our network customers; changes in our content creation and distribution platform, including the possible repurposing of content to alternate distribution channels, reduced investments in intangible assets or the sale or removal of content; our ability to successfully launch, produce and monetize new content formats; the inherent challenges of estimating the overall impact on page views and search driven traffic to our owned and operated websites based on the data available to us as internet search engines continue to make adjustments to their search algorithms; our ability to compete with new or existing competitors; our ability to maintain or increase our advertising revenue; our ability to continue to drive and grow traffic to our owned and operated websites and the websites of our network customers; our ability to effectively monetize our portfolio of content; our dependence on material agreements with a specific business partner for a significant portion of our revenue; future internal rates of return on content investment and our decision to invest in different types of content in the future, including premium video and other formats of text content; our ability to attract and retain freelance creative professionals; changes in our level of investment in media content intangibles; the effects of changes or shifts in internet marketing expenditures, including from text to video content as well as from desktop to mobile content; the effects of shifting consumption of media content from desktop to mobile; the effects of seasonality on traffic to our owned and operated websites and the websites of our network customers; our ability to continue to add partners to our registrar platform on competitive terms; our ability to successfully pursue and implement our gTLD initiative; changes in stock-based compensation; changes in amortization or depreciation expense due to a variety of factors; potential write downs, reserves against or impairment of assets including receivables, goodwill, intangibles (including media content) or other assets; changes in tax laws, our business or other factors that would impact anticipated tax benefits or expenses; our ability to successfully identify, consummate and integrate acquisitions; our ability to retain key customers and key 
    personnel; risks associated with litigation; the impact of governmental regulation; and the effects of discontinuing or discontinued business operations. From time to time, we may consider acquisitions or divestitures that, if consummated, could be material. Any forward-looking statements regarding financial metrics are based upon the assumption that no such acquisition or divestiture is consummated during the relevant periods. If an acquisition or divestiture were consummated, actual results could differ materially from any forward-looking statements. More information about potential risk factors that could affect our operating and financial results is contained in our annual report on Form 10-K for the fiscal year ending December 31, 2011 filed with the Securities and Exchange Commission (http://www.sec.gov) on February 24, 2012, and as such risk factors may be updated in our quarterly reports on Form 10-Q filed with the Securities and Exchange Commission, including, without limitation, information under the captions “Risk Factors” and “Management’s Discussion and Analysis of Financial Condition and Results of Operations.”

    Furthermore, as discussed above, the Company does not intend to revise or update the information set forth in this press release, except as required by law, and may not provide this type of information in the future.

    Demand Media, Inc. and Subsidiaries

    Unaudited Condensed Consolidated Statements of Operations

    (In thousands, except per share amounts)

    Three months ended September 30, Nine months ended September 30,
    2011 2012 2011 2012
    Revenue $ 81,473 $ 98,147 $ 240,451 $ 277,436
    Operating expenses
    Service costs (exclusive of amortization of intangible assets shown separately below) (1) (2) 40,109 46,524 115,632 132,153
    Sales and marketing (1) (2) 9,200 11,625 28,069 33,678
    Product development (1) (2) 9,791 10,278 28,684 30,989
    General and administrative (1) (2) 14,837 15,705 45,648 46,854
    Amortization of intangible assets 10,828 9,501 30,781 31,216
    Total operating expenses 84,765 93,633 248,814 274,890
    Income (loss) from operations (3,292 ) 4,514 (8,363 ) 2,546
    Other income (expense)
    Interest income 5 9 52 34
    Interest expense (385 ) (155 ) (710 ) (465 )
    Other income (expense), net (79 ) (13 ) (338 ) (77 )
    Total other expense (459 ) (159 ) (996 ) (508 )
    Income (loss) before income taxes (3,751 ) 4,355 (9,359 ) 2,038
    Income tax (expense) benefit (394 ) (1,180 ) (2,739 ) (611 )
    Net income (loss) $ (4,145 ) $ 3,175 $ (12,098 ) $ 1,427
    (1) Stock-based compensation expense included in the line items above:
    Service costs $ 757 $ 672 $ 1,341 $ 2,141
    Sales and marketing 1,405 1,400 3,441 4,521
    Product development 1,403 1,396 3,649 5,169
    General and administrative 4,190 4,578 13,671 12,155
    Total stock-based compensation expense $ 7,755 $ 8,046 $ 22,102 $ 23,986
    (2) Depreciation included in the line items above:
    Service costs $ 4,112 $ 3,587 $ 12,305 $ 10,789
    Sales and marketing 109 105 296 345
    Product development 399 234 1,158 787
    General and administrative 683 906 2,133 2,703
    Total depreciation $ 5,303 $ 4,832 $ 15,892 $ 14,624
    Income (loss) per common share:
    Net income (loss) $ (4,145 ) $ 3,175 $ (12,098 ) $ 1,427
    Cumulative preferred stock dividends (3) (2,477 )
    Net income (loss) attributable to common stockholders $ (4,145 ) $ 3,175 $ (14,575 ) $ 1,427
    Net income (loss) per share – basic $ (0.05 ) $ 0.04 $ (0.19 ) $ 0.02
    Net income (loss) per share – diluted $ (0.05 ) $ 0.04 $ (0.19 ) $ 0.02
    Weighted average number of shares – basic 83,934 85,182 77,001 84,020
    Weighted average number of shares – diluted 83,934 88,751 77,001 86,895
    (3) As a result of the Company’s initial public offering which was completed on January 31, 2011, all shares of the Company’s preferred stock were converted to common stock.
    Demand Media, Inc. and Subsidiaries

    Unaudited Condensed Consolidated Balance Sheets

    (In thousands)

    As of December 31, 2011 and September 30, 2012, respectively:
    Current assets
    Cash and cash equivalents $ 86,035 $ 112,916
    Accounts receivable, net 32,665 41,118
    Prepaid expenses and other current assets 8,656 8,501
    Deferred registration costs 50,636 57,437
    Total current assets 177,992 219,972
    Property and equipment, net 32,626 33,740
    Intangible assets, net 111,304 88,577
    Goodwill 256,060 256,037
    Deferred registration costs 9,555 11,108
    Other long-term assets 2,566 21,607
    Total assets $ 590,103 $ 631,041
    Liabilities, Convertible Preferred Stock and Stockholders’ Equity
    Current liabilities
    Accounts payable $ 10,046 $ 11,340
    Accrued expenses and other current liabilities 33,932 33,623
    Deferred tax liabilities 18,288 19,586
    Deferred revenue 71,109 78,805
    Total current liabilities 133,375 143,354
    Deferred revenue 14,802 15,966
    Other liabilities 1,660 2,361
    Total liabilities 149,837 161,681
    Stockholders’ equity
    Common stock and additional paid-in capital 528,042 559,689
    Treasury stock (17,064 ) (21,020 )
    Accumulated other comprehensive income 59 35
    Accumulated deficit (70,771 ) (69,344 )
    Total stockholders’ equity 440,266 469,360
    Total liabilities, convertible preferred stock and stockholders’ equity $ 590,103 $ 631,041
    Demand Media, Inc. and Subsidiaries

    Unaudited Condensed Consolidated Statements of Cash Flows

    (In thousands)

    Three months ended September 30, Nine months ended September 30,
    2011 2012 2011 2012
    Cash flows from operating activities:
    Net income (loss) $ (4,145 ) $ 3,175 $ (12,098 ) $ 1,427
    Adjustments to reconcile net income (loss) to net cash provided by operating activities:
    Depreciation and amortization 16,131 14,332 46,673 45,839
    Stock-based compensation 7,727 8,046 21,989 23,986
    Other 294 967 2,363 584
    Net change in operating assets and liabilities, net of effect of acquisitions 2,050 (1,925 ) (802 ) (6,890 )
    Net cash provided by operating activities 22,057 24,595 58,125 64,946
    Cash flows from investing activities:
    Purchases of property and equipment (3,194 ) (4,982 ) (14,024 ) (12,425 )
    Purchases of intangibles (13,927 ) (3,468 ) (43,989 ) (8,590 )
    Payments for gTLD applications (18,202 )
    Cash paid for acquisitions (27,133 ) (1,011 ) (30,972 ) (1,280 )
    Other (855 )
    Net cash used in investing activities (44,254 ) (9,461 ) (88,985 ) (41,352 )
    Cash flows from financing activities:
    Proceeds from issuance of common stock, net 78,625
    Repurchases of common stock (3,728 ) (3,728 ) (3,956 )
    Proceeds from exercises of stock options and contributions to ESPP 2,832 5,160 4,357 11,016
    Other (1,332 ) (1,568 ) (1,547 ) (3,755 )
    Net cash provided by (used in) financing activities (2,228 ) 3,592 77,707 3,305
    Effect of foreign currency on cash and cash equivalents (23 ) 3 (31 ) (18 )
    Change in cash and cash equivalents (24,448 ) 18,729 46,816 26,881
    Cash and cash equivalents, beginning of period 103,602 94,187 32,338 86,035
    Cash and cash equivalents, end of period $ 79,154 $ 112,916 $ 79,154 $ 112,916
    Demand Media, Inc. and Subsidiaries

    Reconciliations of Non-GAAP Measures to Unaudited Consolidated Statements of Operations

    (In thousands, except per share amounts)

    Three months ended September 30, Nine months ended September 30,
    2011 2012 2011 2012
    Revenue ex-TAC:
    Content & Media revenue $ 50,744 $ 64,136 $ 152,418 $ 177,766
    Less: traffic acquisition costs (TAC) (3,381 ) (5,350 ) (9,384 ) (13,109 )
    Content & Media Revenue ex-TAC 47,363 58,786 143,034 164,657
    Registrar revenue 30,729 34,011 88,033 99,670
    Total Revenue ex-TAC $ 78,092 $ 92,797 $ 231,067 $ 264,327
    Adjusted EBITDA(1):
    Net income (loss) $ (4,145 ) $ 3,175 $ (12,098 ) $ 1,427
    Income tax expense/(benefit) 394 1,180 2,739 611
    Interest and other expense, net 459 159 996 508
    Depreciation and amortization(2) 16,131 14,333 46,673 45,840
    Stock-based compensation 7,755 8,046 22,102 23,986
    Acquisition and realignment costs(3) 1,058 20 1,828 132
    gTLD expense(4) 707 1,589
    Adjusted EBITDA $ 21,652 $ 27,620 $ 62,240 $ 74,093
    Discretionary and Total Free Cash Flow:
    Net cash provided by operating activities $ 22,057 $ 24,595 $ 58,125 $ 64,946
    Purchases of property and equipment (3,194 ) (4,982 ) (14,024 ) (12,425 )
    Acquisition and realignment cash flows 1,068 1,068
    gTLD expense cash flows(4) 488 1,224
    Discretionary Free Cash Flow 19,931 20,101 45,169 53,745
    Purchases of intangible assets (13,927 ) (3,468 ) (43,989 ) (8,590 )
    Free Cash Flow(4)(5) $ 6,004 $ 16,633 $ 1,180 $ 45,155
    Adjusted Net Income:
    GAAP net income (loss) $ (4,145 ) $ 3,175 $ (12,098 ) $ 1,427
    (a) Stock-based compensation 7,755 8,046 22,102 23,986
    (b) Amortization of intangible assets – M&A 2,969 2,666 9,799 8,332
    (c) Content intangible assets removed from service(2) 1,818
    (d) Acquisition and realignment costs(3) 1,058 20 1,828 133
    (e) gTLD expense(4) 707 1,589
    (f) Income tax effect of items (a) – (e) & application of 38% statutory tax rate to pre-tax income (2,658 ) (4,822 ) (6,521 ) (13,789 )
    Adjusted Net Income $ 4,979 $ 9,792 $ 15,110 $ 23,496
    Non-GAAP Adjusted Net Income per share – diluted $ 0.06 $ 0.11 $ 0.17 $ 0.27
    Shares used to calculate non-GAAP Adjusted Net Income per share – diluted(6) 87,973 88,754 89,098 87,003
    (1) Effective Q1 2012, the Company began reporting Adjusted EBITDA instead of Adjusted OIBDA. While the dollar value of each measure does not differ, a comparison of the historical reconciliation of both measures is provided in our supplemental financial schedules available on the investor relations section of our corporate website.
    (2) In conjunction with its previously announced plans to improve its content creation and distribution platform, the Company elected to remove certain content assets from service, resulting in $1.8 million of accelerated amortization expense in the first quarter of 2012.
    (3) Acquisition and realignment costs include such items, when applicable, as (1) non-cash GAAP purchase accounting adjustments for certain deferred revenue and costs, (2) legal, accounting and other professional fees directly attributable to acquisition activity, and (3) employee severance payments attributable to acquisition or corporate realignment activities. Management does not consider these costs to be indicative of the Company’s core operating results.
    (4) Comprises formation expenses directly related to the Company’s gTLD initiative that is not expected to generate associated revenue in 2012.
    (5) In April 2012, the Company invested $18.1 million in gTLD applications, which did not impact its recurring Free Cash Flow metric.
    (6) Shares used to calculate non-GAAP Adjusted Net Income per share – diluted include the weighted average common stock for the periods presented and all dilutive common stock equivalents at each period. Amounts have been adjusted in 2011 to reflect the revised capital structure following the Company’s initial public offering which was completed on January 31, 2011, whereby the Company issued 5,175 shares of common stock and converted certain warrants and all of its previously outstanding convertible preferred stock into 62,155 shares of common stock as if those transactions were consummated on January 1, 2011.
    Demand Media, Inc. and Subsidiaries

    Unaudited GAAP Revenue, by Revenue Source

    (In thousands)

    Three months ended September 30, Nine months ended September 30,
    2011 2012 2011 2012
    Content & Media:
    Owned and operated websites $ 38,298 $ 45,377 $ 117,917 $ 129,715
    Network of customer websites 12,446 18,759 34,501 48,051
    Total revenue – Content & Media 50,744 64,136 152,418 177,766
    Registrar 30,729 34,011 88,033 99,670
    Total revenue $ 81,473 $ 98,147 $ 240,451 $ 277,436
    Three months ended September 30, Nine months ended September 30,
    2011 2012 2011 2012
    Content & Media:
    Owned and operated websites 47 % 46 % 49 % 47 %
    Network of customer websites 15 % 19 % 14 % 17 %
    Total revenue – Content & Media 62 % 65 % 63 % 64 %
    Registrar 38 % 35 % 37 % 36 %
    Total revenue 100 % 100 % 100 % 100 %

     

    Source: Demand Media, Inc.

  • Google Panda Update Data Refresh Is Rolling Out

    Update: What do you know? They just formally announced it (and it’s just a data refresh, not an actual update):

    Google has without question been more transparent about algorithm updates over the past year or so. However, the search giant has shown a great deal of inconsistency around this transparency. Take the Panda update, for example. Sometimes, the company will announce that there has been a new update or data refresh. At the end of September / beginning of October, we saw Google announce a handful of updates, and eventually announce a Panda update about a week after it happened.

    According to Danny Sullivan at Search Engine Land, Google confirmed that a Panda update happened yesterday, though this was not announced. Sullivan reports:

    “Google said that worldwide, the update will impact about 0.4% of queries that a regular user might notice. For those searching in the United States in English, the percentage is higher. 1.1%, Google says.”

    Also yesterday, Demand Media (which some might say is the poster child for sites affected by the Panda update) released its quarterly earnings report, posting record profit and revenue.

    With regards to Google algorithm changes and transparency, we’re still waiting to see the company’s list of “search quality highlights” for the month of October, though they might wait and do October and November together.

    Image: Panda Express

  • Should Google Use Link Disavow As A Ranking Signal?

    Last month, as you may know, Google introduced its Link Disavow tool, after dropping a hint that it would do so months prior. What we didn’t know until this past week, however, is that there is a possibility that Google will use the data it gets from the tool as a ranking signal.

    Should data from the link disavow tool be used to rank sites in Google? Let us know what you think.

    First off, to be clear, Google is not currently getting any ranking signals from the tool. In the future, however, that may change. Danny Sullivan shared a Q&A with Matt Cutts in which Cutts did not rule out the possibility. Sullivan asked whether, if “someone decides to disavow links from good sites as perhaps an attempt to send signals to Google these are bad,” Google is mining this data to better understand what bad sites are.

    “Right now, we’re using this data in the normal straightforward way, e.g. for reconsideration requests,” Cutts responded. “We haven’t decided whether we’ll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good.”

    They haven’t decided. It could go either way, but if enough people are submitting links from the same sites, wouldn’t Google want to look at that as some sign that those sites are not reputable?

    Yes, Google does have over 200 signals, and has other ways of deciding what is high or poor quality, but does that mean there is not room for data from the link disavow tool to play some role within the algorithm, even if it’s not the heaviest signal it looks at?

    “We may do spot checks, but we’re not planning anything more broadly with this data right now,” said Cutts. “If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that’s sort of like an IQ test and indicates that we wouldn’t want to give that webmaster’s disavowed links much weight anyway. It’s certainly not a scalable way to hurt another site, since you’d have to build a good site, then build up good links, then disavow those good links. Blackhats are normally lazy and don’t even get to the ‘build a good site’ stage.” Emphasis is ours.

    No, it doesn’t seem like a very plausible strategy for competitors to hurt one another. However, that does not necessarily mean that some sites couldn’t potentially be affected if the data were to become a signal.

    Since the Penguin update was launched, and Google has been sending out messages about links more aggressively, we’ve seen people scramble to get tons of links to their sites removed. Google is not telling you all the links that you should be getting removed. It’s giving you examples. As a result, we’ve seen many webmasters taking an aggressive approach of their own trying to get more links removed than they probably needed to. We’ve seen the letters webmasters have written to other sites asking to have links removed for fear that they could somehow be hurting them in Google, even if they would consider it to be a valuable link otherwise. If it’s a good link (and not one specifically meant for gaming Google), then it stands to reason it’s not something that Google should be frowning upon. Yet, these kinds of links are being requested to be removed.

    So, why would paranoid and/or desperate webmasters not go overboard on the Link Disavow tool?

    Sure, Google has warned repeatedly that the tool should not be used in most cases, and that it should only be used after trying to get all the links removed manually (they won’t even acknowledge your submission if they can see that you haven’t tried). But what is the likelihood that there won’t be numerous people jumping the gun and using it when they really shouldn’t be?

    How many of the webmasters out there that have been hurt by updates like Penguin are tired of jumping through hoop after hoop, and will see the tool as a shortcut?

    SEO analyst Jennifer Slegg writes, “People who have been affected with bad links will very likely take a very heavy-handed approach to the links they disavow in their panic of seeing their traffic drop off a cliff. There is no doubt that some of those good links that are actually helping the site will end up in the list along with poor quality ones because the webmaster is either unclear about whether a link is a bad influence, or just think the starting fresh approach is the best one to go with.”

    “So good websites could also have their sites potentially flagged as a possible bad source of links because of clueless webmasters, even though those clueless webmasters are actually making more work for themselves by disavowing links that are actually helping them,” she adds.

    And that’s exactly the point. If data from Link Disavow were to become a ranking signal, this is where things could get tricky.

    “What happens if someone disavows a link from your website for whatever reason?” asks 352 Media Group Social Media Marketing Director Erin Everhart. “Will your website get flagged as spam? Google has enough leverage over us anyway. Do you want them to have even more?”

    That’s a pretty good question too. Does Google have too much power over webmasters? Tell us what you think.

  • Matt Cutts Talks Subdomains Vs. Subdirectories

    Google’s Matt Cutts posted a new Webmaster Help video about subdomains vs. subdirectories. It’s a topic Google has talked about various times in the past, but as Cutts notes in the video, it’s been a while, so perhaps it’s worth revisiting.

    The user-submitted question he’s responding to is:

    I’m interested in finding out how Google currently views subdomains — whether there’s any difference between a site structured as a group of subdomains and one structured as a group of subdirectories.

    “They’re roughly equivalent,” says Cutts. “I would basically go with whichever one is easier for you in terms of configuration, your CMSs, [and] all that sort of stuff.”

    You can watch the video for a more complete answer.

  • It’s Apparently Not Out Of The Question That Google’s Link Disavow Tool Could Be Used For Ranking Signals

    Earlier this month, Google launched the Link Disavow tool, which webmasters can use to tell Google to ignore certain links they believe to be bad. While Google will only do so at its own discretion, some may be wondering if Google will be using the data it gets from the tool for other purposes (like maybe as a ranking signal).

    If enough sites submit links from a specific site, for example, would Google use that data to determine that the site in question is really a bad site, and therefore use the data as a ranking signal? It seems like a logical question, and Google’s Matt Cutts didn’t exactly rule out the possibility, though he says this is not the case now.

    Danny Sullivan at Search Engine Land posted a Q&A with Cutts, in which he asked whether, if “someone decides to disavow links from good sites as perhaps an attempt to send signals to Google these are bad,” Google is mining this data to better understand what bad sites are.

    “Right now, we’re using this data in the normal straightforward way, e.g. for reconsideration requests,” Cutts responded. “We haven’t decided whether we’ll look at this data more broadly. Even if we did, we have plenty of other ways of determining bad sites, and we have plenty of other ways of assessing that sites are actually good.”

    Google does, of course, have over 200 signals, but that doesn’t mean there isn’t room for the data to play some role in the algorithm, even if it’s not the weightiest signal.

    “We may do spot checks, but we’re not planning anything more broadly with this data right now,” he adds. “If a webmaster wants to shoot themselves in the foot and disavow high-quality links, that’s sort of like an IQ test and indicates that we wouldn’t want to give that webmaster’s disavowed links much weight anyway. It’s certainly not a scalable way to hurt another site, since you’d have to build a good site, then build up good links, then disavow those good links. Blackhats are normally lazy and don’t even get to the ‘build a good site’ stage.”

    It does sound like a pretty dumb strategy, and I doubt that many will go this route to try to hurt other sites, but that doesn’t mean that sites that show up in a lot of people’s Link Disavow files shouldn’t worry about it at all, does it?

    Look at the link removal overreaction among webmasters in the wake of the Penguin update. What makes anyone think that a similar overreaction won’t take place with the Link Disavow tool?

    Even if Google hasn’t decided whether it will use the data as a ranking signal later, one has to wonder if we’ll ever know if they do decide to implement it. I don’t see that one making Google’s monthly lists of changes.

  • Google Is Experimenting With Ways To Make Reconsideration Requests Better

    Google has been experimenting with how to make the reconsideration request process better for webmasters who have been dealt a manual action penalty by Google.

    Google’s head of webspam, Matt Cutts, put out a new Webmaster Help video discussing reconsideration requests and whether or not they’re actually read by humans. The video was a response to the following user-submitted question:

    Right now, when a webmaster sends a reconsideration request, how many chances does it have to be read by a real human? Do you plan to make it possible for webmasters to answer when they get a result back from Google?

    “Whenever you do a reconsideration request, if you don’t have any manual action by the webspam team, so there’s no way we could do anything, in essence, because it’s algorithmically determining where you’re ranking, those are automatically closed out,” says Cutts. “Those aren’t looked at by a human being, but 100% of all the other reconsideration requests are looked at by a real person.”

    “We don’t have the time to individually reply with a ton of detail, and so we do think about ways to be more scalable, and so I understand it might not be as satisfying to get, ‘Yeah, we think you’re okay,’ or ‘No, you still have issues,’ but that is a real human that is looking at that and generating the response that you read back,” he says.

    He goes on to say that if Google still thinks you have issues with your site, you should take the time to investigate and figure out some things you can do before submitting another request. If you just submit it again without doing anything, Google will likely consider you to be “hard headed” and find it “unproductive to continue that conversation.”

    “We’ve actually been trying a very experimental program where when we see someone who’s done a reconsideration request more than once, we’ll sample a small number of those and send those to other people to sort of say, ‘Okay, let’s do a deeper dig here.’ You know, maybe we need to send a little bit more info or investigate in a little bit more detail,” continues Cutts. “It’s just one of the ways we’ve been experimenting. We’ve actually been doing it for quite a while to try to figure out, ‘Okay, are there other ways that we can improve our process? Other ways that we can communicate more?’ So it’s the kind of thing where we don’t guarantee that if you appeal a couple times that you’ll get any sort of more detailed of an answer, but there are people reading all those reconsideration requests.”

  • Hall Of Famer Matt Cutts On Why Google Doesn’t Provide An SEO Quality Calculator

    Google’s Matt Cutts, who was inducted into the University of Kentucky’s Hall of Fame here in Lexington on Friday, has posted a new Webmaster Help video answering a question about why Google doesn’t provide some kind of “SEO quality calculator”. The exact question was:

    Why doesn’t Google release an SEO quality check up calculator? This will help people to optimize their websites in the right direction. Is Google willing to do this or it just wants people to keep guessing what’s wrong with their websites?

    He talks about how much Google does to provide Webmasters with help via Webmaster Tools and the notifications it sends out.

    Then, he says, “If we were to just give an exact score…so back in the early days, InfoSeek would actually like let you submit a page, see immediately where it was ranking, and let you submit another page, and there are stories that have lasted since then about how people would just spend their entire day spamming InfoSeek, tweaking every single thing until they got exactly the right template that would work to rank number one. So we don’t want to encourage that kind of experimentation, and we know that if we give exact numbers and say, ‘Okay, this is how you’re ranking on this particular sort of algorithm or how you rank along this axis,’ people will try to spam that.”

    “But what we do want to provide is a lot of really useful tools for people that are doing it themselves – mom and pops – people who are really interested and just want to dig into it – agencies who want to have more information so that they can do productive optimization – all that sort of stuff,” he continues.

    “We don’t want to make it easy for the spammers, but we do want to make it as easy as possible for everybody else,” he adds. “There’s inherently a tension there, and we’re always trying to find the features that will help regular people while not just making it easy to spam Google.”

    Of course, it’s getting harder to get on the front page of results on Google anyway, because of all of the other elements they’re adding to the SERPs and the reduced number of organic results appearing on an increasing number of them.

  • In Case You Were Wondering, Quoting Isn’t Duplicate Content [Matt Cutts]

    Google’s Matt Cutts has put out his latest Webmaster Help video. This time he takes on a pretty classic topic – duplicate content. There’s not much here that any industry veterans will find to be of particular interest, but he is answering a user-submitted question, so clearly there are people out there unsure of Google’s take on quoting other sources. The question is as follows:

    Correct quotations in Google. How can you quote correctly from different sources without getting penalized for duplicated content? Is it possible to quote and refer to the source?

    “You’re a regular blogger, and you just want to quote an excerpt – you know, some author you like or some other blogger who has good insight – just put that in a blockquote, include a link to the original source, and you’re in pretty good shape,” says Cutts. “If that’s the sort of thing that you’re doing, I would never worry about getting dinged by duplicate content. We do have good ways of detecting that sort of thing without any sort of issue at all.”

    “If, however, your idea of quoting is including an entire article from some other site, or maybe even multiple articles, and you’re not doing any original content yourself, then that can affect the reputation of how we view your site,” he adds.

    Basically, as long as you are adding some kind of value and perspective to what you are quoting, you’re going to be fine as far as Google is concerned.

    “Those sorts of things are completely legitimate and absolutely fine,” Cutts says. “I wouldn’t worry about that.”

    So, if you’re quoting (and linking) rather than scraping, you’re probably okay. You may not want to go overboard on how much text you’re actually quoting from a source, however. Otherwise, you’re liable to run into trouble with the source itself.
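
    For anyone unsure what that looks like in markup, here is a minimal sketch (the URL and excerpt are hypothetical) of quoting an excerpt in a standard HTML blockquote with a link back to the original source:

      <!-- Short excerpt, clearly attributed and linked to the original post -->
      <blockquote cite="http://example.com/original-post">
        <p>A sentence or two quoted from the original author.</p>
      </blockquote>
      <p>Source: <a href="http://example.com/original-post">Example Blog</a>, followed by your own commentary on the excerpt.</p>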

  • Google Panda Update: Rumors Surface Of One Launching Last Week

    This is pretty commonplace now, but there are rumors going around that Google may have launched a Panda update last week.

    Barry Schwartz at Search Engine Roundtable links to WebmasterWorld, where these rumors are often born. Sometimes they turn out to be accurate, and sometimes they don’t. We won’t know until we get confirmation from Google.

    We’ve reached out to Google for comment, and will update the article if and when we receive one. Here’s the part where I give you the spiel about how Google launches changes to its algorithms every day. It makes over 500 changes a year. Many of these have the potential to affect the rankings of your website. Even if a change does not directly impact your site, it may impact other sites in the listings, causing your ranking to go up or down.

    Google did recently confirm a Panda update, but only after announcing a separate, unrelated update. Google often issues “weather reports” (otherwise known as tweets from Matt Cutts about algorithm changes), announcing various updates. However, there is really no consistency to this, and webmasters are still often left guessing what might have happened.

    There’s a chance that Google will inform us whether this was Panda or not, but there’s also a good chance they won’t. There’s not much consistency to that either.

    Update: We haven’t heard official word, but Schwartz is now saying that Google “implied” that there was no update.

    Image: Tekken 5 (via YouTube)

  • Will Google’s Link Disavow Tool Come Back To Haunt Webmasters?

    Back in June, during the height of the Penguin update freakout, Google’s Matt Cutts hinted that Google would launch a “link disavow” tool, so that webmasters can tell Google the backlinks they want Google to ignore. This means links from around the web that are potentially hurting a site’s rankings in Google could be ignored, and no longer count against the site in question. This is something that many webmasters and SEOs have wanted for a long time, and especially since the Penguin update launched earlier this year. On Tuesday, Google made these dreams come true by finally launching the tool, after months of anticipation.

    Is it what you hoped it would be? Do you intend to use it? Let us know in the comments.

    How It Works

    The tool tells users, “If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site.”

    It is worth noting, however, that using the tool and telling Google to ignore certain links is no guarantee that Google will listen. It’s more of a helpful suggestion. Google made this clear in the Q&A section of the blog post announcing the tool.

    “This tool allows you to indicate to Google which links you would like to disavow, and Google will typically ignore those links,” Google Webmaster Trends Analyst Jonathan Simon says. “Much like with rel=’canonical’, this is a strong suggestion rather than a directive—Google reserves the right to trust our own judgment for corner cases, for example—but we will typically use that indication from you when we assess links.” He adds:

    If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.

    If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page.

    With the tool, you simply upload a .txt file containing the links you want Google to disavow. You add one URL per line. You can block specific URLs or whole domains. To block a domain, use this format: domain:example.com. You can add comments by including a # before them. Google ignores the comments. The file size limit is 2MB.
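
    To give a rough idea, a disavow file following that format might look something like this (all domains and URLs below are hypothetical):

      # Contacted the owner of spamdomain1.com on 10/1/2012 to ask for removal; no response
      domain:spamdomain1.com

      # Individual spammy pages that could not be removed
      http://www.spamdomain2.com/forum/profile.php?id=123
      http://www.spamdomain3.com/paid-links.html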

    If you haven’t watched it yet, watch Matt Cutts’ video explaining the tool. If it’s something you’re considering using, it’s definitely worth the ten minutes of your time:

    Cutts warns repeatedly that most people will not want to use this tool, and that you should really only use it if you’ve already tried hard to get the questionable links removed, but haven’t been able to get it done. For more details and minutiae about how this tool works, there is a whole help center article dedicated to it.

    Negative SEO

    Negative SEO, a practice in which competitors attack a site with spammy links and whatnot, has been debated for a long time, and many will see this tool as a way to eliminate its effects. Google has specifically responded to this.

    “The primary purpose of this tool is to help clean up if you’ve hired a bad SEO or made mistakes in your own link-building,” says Simon. “If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you’re unable to get a few backlinks taken down, that’s a good time to use the Disavow Links tool.”

    “In general, Google works hard to prevent other webmasters from being able to harm your ranking,” he adds. “However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.”

    Cutts also talked about the subject at PubCon, where the tool was announced. Search Engine Roundtable has a liveblogged account of what he said, which reads:

    All the negative SEO complaints he sees, or most of it, is really not negative SEO hurting you. It is a much better use of your time to make your site better vs hurting someone else. At the same time, we’ve seen cases of this as an issue. I.e. buying a new domain and needing to clean up that site. There are people who want to go through this process. Plus SEOs that take on new clients that went through bad SEOs.

    Warnings And Overreaction

    Again, you don’t want to use the tool in most cases. It’s pretty much a last resort tactic for links you’re positive are hurting you, and can’t get removed otherwise. Google has warned repeatedly about this, as over-use of the tool can lead to webmasters shooting themselves in the foot. If you use it willy-nilly, you may be hurting your site by getting rid of links that were actually helping you in the first place.

    It seems like common sense, but ever since the Penguin update, we’ve seen plenty of examples of webmasters frantically trying to get links removed that even they admit they would like to keep, were it not for fear that Google might frown upon them (when in reality, it likely did not).

    Aaron Wall from SEOBook makes some other interesting points on the warnings front. He writes:

    The disavow tool is a loaded gun.

    If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.

    Could the use of the tool be seen as an admission of guilt? Matt gives examples of “bad” webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.

    Google Wants To Depend More On Social And Authorship

    If overreaction is an issue, and it seems fairly likely that it will be, despite Google’s warnings, this tool could really mess with how Google treats links, which have historically been the backbone of its algorithm.

    “Links are one of the most well-known signals we use to order search results,” says Simon. “By looking at the links between pages, we can get a sense of which pages are reputable and important, and thus more likely to be relevant to our users. This is the basis of PageRank, which is one of more than 200 signals we rely on to determine rankings. Since PageRank is so well-known, it’s also a target for spammers, and we fight linkspam constantly with algorithms and by taking manual action.”

    It will be interesting to see how Google treats the links webmasters tell it to ignore, which are not actually hurting them in the first place. I would not be surprised to see some in the industry test Google on this.

    Google does not like it when people manipulate the way it counts links, yet they’ve just given webmasters a tool to do so, even if it’s kind of the opposite of the black hat techniques Google has always tried to eliminate (link schemes, paid links, etc.). Now (and we’ve seen this even before the tool existed), you potentially have webmasters trying to get rid of links that actually do have value, even in Google’s eyes. I mean, seriously, what are the odds that this tool will be used 100% how Google intends it to be used, which is apparently in rare circumstances?

    Google seems to be grooming other signals to play a greater role in the algorithm. While they’re not there yet, based on various comments the company has made, social signals will almost certainly play an increasingly weighty role. CEO Larry Page was asked about this at a conference this week.

    He responded, “I think it’s really important to know, again, who you’re with, what the community is – it’s really important to share things. It’s really important to know the identity of people so you can share things and comment on things and improve the search ecosystem, you know, as you – as a real person…I think all those things are absolutely crucial.”

    “That’s why we’ve worked so hard on Google+, on making [it] an important part of search,” he continued. “Again, like Maps, we don’t see that as like something that’s like a separate dimension that’s never going to play into search. When you search for things, you want to know the kinds of things your friends have looked at, or recommended, or wrote about, or shared. I think that’s just kind of an obvious thing.”

    “So I think in general, if the Internet’s working well, the information that’s available is shared with lots of different people and different companies and turned into experiences that work well for everyone,” he said. “You know, Google’s gotten where it is by searching all the world’s information, not just a little bit of it, right? And in general, I think people have been motivated to get that information searchable, because then we deliver users to those people with information.”

    “So in general, I think that’s the right way to run the Internet as a healthy ecosystem,” Page concluded. “I think social data is obviously important and useful for that. We’d love to make use of that every way we can.”

    As Google says, links are a direct target for manipulation, and social could be harder to fake (though there are certainly attempts, and there will be plenty more).

    Another difficult signal to fake is authorship, which is why Google is really pushing for that now. In a recent Google+ Hangout, Matt Cutts said of authorship, “Sometimes you’ll have higher click through, and people will say, ‘Oh, that looks like a trusted resource.’ So there are ways that you can participate and sort of get ready for the longer term trend of getting to know not just that something was said, but who said it and how reputable they were.”

    “I think if you look further out in the future and look at something that we call social signals or authorship or whatever you want to call it, in ten years, I think knowing that a really reputable guy – if Dan has written an article, whether it’s a comment on a forum or on a blog – I would still want to see that. So that’s the long-term trend,” he said.

    “The idea is you want to have something that everybody can participate in and just make these sort of links, and then over time, as we start to learn more about who the high quality authors are, you could imagine that starting to affect rankings,” he pointed out.

    So here you have Google (Matt Cutts specifically) telling you that authorship is going to become more important, and that you probably shouldn’t even use the new link-related tool that the company just launched.

    Danny Sullivan asked Cutts, at PubCon, why Google doesn’t simply discount bad links to begin with, rather than “considering some of them as potentially negative votes.”

    “After all, while it’s nice to have this new tool, it would be even better not to need it at all,” he writes. Cutts did not really answer that question.

    Why do you think Google does not do as Danny suggests, and simply ignore the bad links to begin with? Do you think social and authorship signals will become more important than links? Share your thoughts about Google’s ranking strategy and the new tool in the comments.

    Lead Image: The Shining (Warner Bros.)

  • If You Experience A Manual Action From Google, You Should Hear About It

    Google’s Matt Cutts, as you may know, spoke at PubCon this week. It’s where he revealed Google’s new Link Disavow tool. That seems to have overshadowed just about everything else from the conference (even the news that PubCon founder Brett Tabke has sold WebmasterWorld), including other things Cutts talked about.

    It’s understandable, as webmasters have been waiting months for the tool to be released, but Danny Sullivan points out another piece of significance from Cutts’ speech. Google now claims to be sending out messages to webmasters for pretty much every manual action it takes on a site. Sullivan reports:

    “We’ve actually started to send messages for pretty much every manual action that we do that will directly impact the ranking of your site,” said Matt Cutts, the head of Google’s web spam team, when speaking at the Pubcon conference this week.

    “If there’s some manual action taken by the manual web spam team that means your web site is going to rank directly lower in the search results, we’re telling webmasters about pretty much about all of those situations,” he continued.

    Cutts said there might be a rare “corner case” that might not make it but that reporting is “practically 100%” and “the intent is to get to 100%, and as far as I know, we’re actually there.”

    It’s been quite obvious that Google has been sending out many more messages this year than it has historically, but this is good information for webmasters to know, especially since activities that violate Google’s quality guidelines can draw either a manual action or an algorithmic one, particularly since Penguin launched.

    I suppose this is all part of Google’s effort to be more transparent, which has also included semi-monthly lists of algorithm changes and more tweeting about major updates in recent weeks.

  • Here’s What Google Thinks About Article Marketing, Widgets, Footers And Themes

    Google’s Matt Cutts has released a new Webmaster Help video. In this one, he is answering a question submitted by himself:

    What is Google’s current thinking about getting links from article marketing, widgets, footers, themes, etc.?

    The current thinking, he says, is pretty much the same as the past thinking, but he wants to be a little more “explicit” about it. There’s not a lot of surprising info here, or anything most in the industry don’t know, but it does serve as a reminder that trying to get links in easy (or lazy) ways is usually not the best decision.

    “Whenever you get a link from just a WordPress footer or a random footer or, you know, when someone installs a widget, or they install some theme on their content management system, it’s often the case that they’re not editorially choosing to link with that anchor text,” says Cutts. “And so you sometimes see a lot of links all with the exact same anchor text because, you know, that’s what the widget happened to have embedded in it, or something like that. And even if it’s not the exact same anchor text, it’s relatively inorganic in the sense that the person who made the widget decided what the anchor text should be rather than the person who is actually doing the link by including the widget.”

    “It’s the same sort of thing with article marketing,” he continues. “If you write a relatively low quality article, you know, just a few hundred words, then at the bottom is two or three links of, you know, specifically high keyword density anchor text, then the sort of guy who just wants some content and doesn’t really care about the quality might grab that article from an article bank or something, and he’s not really editorially choosing to give that anchor text. So, as opposed to something that’s really compelling, when he really likes something, and linking to it organically…that’s the sort of links that we really want to count more.”

    “It’s always been the case that these sorts of links that are almost like boiler plate – it’s like not really a person’s real choice to really endorse that particular link or that particular anchor text,” Cutts says. “Those are links that typically we would not want to count as much, so either our algorithms or we do have manual ways of saying, ‘OK, at a very granular level, you know, this is the sort of link that we don’t want to trust.’”

    Google did just release a new Link Disavow tool that will let webmasters tell Google which links they want it to ignore. Cutts says, however, that you really shouldn’t use the tool in most cases, painting it as more of a last resort measure.

  • Jim Boykin (Internet Marketing Ninjas) Buys WebmasterWorld Forum

    WebmasterWorld and Pubcon founder Brett Tabke has sold WebmasterWorld to Jim Boykin, who runs Internet Marketing Ninjas. The announcement was made today at PubCon, which Tabke will continue to run.

    “Internet Marketing Ninjas, led by company founder Jim Boykin, is the ideal match for WebmasterWorld,” said Tabke. “I couldn’t have asked for a better situation than a long-time member acquiring WebmasterWorld.”

    WebmasterWorld is just the latest in a series of forums Boykin has acquired. Other recent pick-ups include Cre8asiteforums, and the Developer Shed Network.

    “2012 has been a major year for IMN and I feel privileged to be a part of WebmasterWorld,” says Boykin. “I’ve been a member of WMW for over 10 years, and I’ve learned so much from so many there. I have so much respect for the community members and it gives me the uttermost pride to be able to work with the community to move WMW into the future. I am very humbled and ready to listen to, and work with, this amazing community.”

    WebmasterWorld Director of Operations, Neil Marshall, offered the following statement: “I’m delighted to be involved with Jim and his team at Internet Marketing Ninjas. This is an exciting new era for WebmasterWorld and I’m really looking forward to continuing to develop the site for the benefit of its members, and retaining the site’s world-renowned, quality discussion forum for webmasters and their businesses. Thank you to Brett Tabke, for giving webmasters an independent vehicle to meet and to discuss current hot topics, turning the site into a major brand for webmasters. I’m sure the whole team is ready to keep that going forward and to develop new ideas for today’s Internet marketing businesses.”

    WebmasterWorld is a place where there is a great deal of discussion from webmasters dealing with Google’s various algorithm updates. Such discussions often give clues to major algorithm changes Google makes before they are officially announced (although there is often a lot of talk that leads to false alarms).

    It will be interesting to see how the forum evolves under new ownership. From the sound of it, the community itself won’t be changing much.

    Tabke says he intends to invest his future efforts into the PubCon conferences.

  • Blekko Launches New Suite Of SEO Tools

    Blekko announced the launch of a new suite of premium SEO tools today. The suite features a re-designed user interface, and a number of new features.

    The suite includes page analysis, SEO report cards, inbound links organized by categories, Blekko crawler sections reports and domain reports, instant SEO data and direct SEO access from Blekko’s search engine, AdSense and IP hosting information, a full list of pages crawled, and PageSource and Cache.

    It also includes full link reports of the following types: inbound links by host, live inbound links, all inbound links from a host, all inbound links to a specific URL, outbound links from a page URL, internal links, and domain comparison of inbound links.

    “The mission behind blekko has always been to offer full transparency on the web,” said Blekko CEO Rich Skrenta. “Blekko’s premium SEO tools give developers unique and often privately kept data about web domain content. Unlike other search engine analytic tools, which only show general data on inbound links, our premium SEO tools provide full data reports that can be used to compare SEO stats between sites and direct SEO access from blekko search result pages.”

    Blekko says it is able to provide its “up-to-the-minute” SEO tools because it is one of only four companies that can index the entire web. Blekko performs 120 million searches per month across 20 billion pages indexed. Over a hundred million of these are updated daily, Blekko says.

  • Matt Cutts Finally Announces Link Disavow Tool For Google Webmaster Tools

    After months of anticipation, Google’s Matt Cutts, at PubCon in Las Vegas today, finally announced a new tool in Webmaster Tools to disavow links.

    Cutts made comments at SMX Advanced back in June, indicating that a tool would be on the way, and it is now here.

    In text on the tool itself, Google says, “If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site.”

    Here is Cutts talking about it in a new Webmaster Help video:

    “You might have been doing blog spam, comment spam, forum spam, guestbook spam…maybe you paid somebody to write some low quality articles and syndicate those all over the place with some very keyword rich anchor text, and maybe Google sent you a message that says, ‘We’ve seen unnatural links to your site or we’ve taken targeted action on some of the unnatural links to your site,’ and so as a result, you want to clean up those backlinks,” Cutts says in the video.

    First and foremost, he says, they recommend getting those links actually removed from the web. Of course, that’s easier said than done.

    Google says in a help center article:

    PageRank is Google’s opinion of the importance of a page based on the incoming links from other sites. (PageRank is an important signal, but it’s one of more than 200 that we use to determine relevancy.) In general, a link from a site is regarded as a vote for the quality of your site.

    Google works very hard to make sure that actions on third-party sites do not negatively affect a website. In some circumstances, incoming links can affect Google’s opinion of a page or site. For example, you or a search engine optimizer (SEO) you’ve hired may have built bad links to your site via paid links or other link schemes that violate our quality guidelines. First and foremost, we recommend that you remove as many spammy or low-quality links from the web as possible.

    If you’ve done as much work as you can to remove spammy or low-quality links from the web, and are unable to make further progress on getting the links taken down, you can disavow the remaining links. In other words, you can ask Google not to take certain links into account when assessing your site.

    Update: Google has now put out an official blog post about the tool. In that, Webmaster Trends Analyst Jonathan Simon writes:

    If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.

    If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page. When you arrive, you’ll first select your site.

    According to a liveblogged account of Cutts’ speech, he says not to use the tool unless you’re sure you need to use it. He mentioned that Google, going forward, will be sending out more messages about examples of links Google is distrusting. He also says not to disavow links from your own site.

    Regarding those link messages, Cutts says in the video that these are only examples of links, and not a comprehensive list.

    The tool works with a .txt file (disavow.txt) containing one URL per line, telling Google which links to ignore. You can also use it to block a whole domain by using a format like: domain:www.example.com.

    Cutts apparently suggests that most sites not use the tool, and that it is still in the early stages. Given that link juice is a significant ranking signal for Google, it’s easy to see why Google wouldn’t want the tool to be over-used.

    It can reportedly take weeks for Google to actually disavow links. In a Q/A session, according to the liveblog from Search Engine Roundtable, Cutts said you should wait 2-3 days before sending a reconsideration request after you submit a disavow file. When asked if it hurts your site when someone disavows links from it, he reportedly said that it typically does not, as they look at your site as a whole.

    Danny Sullivan blogs that “Google reserves the right not to use the submissions if it feels there’s a reason not to trust them.”

    Users will be able to download the file they submitted, and submit it again later with any changes. According to Sullivan’s account, Cutts said the tool is like using the “nofollow” attribute in that it allows sites to link to others without passing PageRank.

    That’s good to know.

    A lot of SEOs have been waiting for Google to launch something like this for a long time. Perhaps it will cut down on all of the trouble webmasters have been going through trying to get other sites to remove links. At the same time, we also have to wonder how much overreaction there will be from webmasters who end up telling Google to ignore too many links, and shooting themselves in the foot. This will be a different era, to say the least.

    Just be warned. Google’s official word of caution is: “If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results. We recommend that you disavow backlinks only if you believe you have a considerable number of spammy, artificial, or low-quality links pointing to your site, and if you are confident that the links are causing issues for you. In most cases, Google can assess which links to trust without additional guidance, so most normal or typical sites will not need to use this tool.”

    The information Google uses from the tool will be incorporated into its index as it recrawls the web and reprocesses the pages it sees.

    Google currently supports one disavow file per site. That file is shared among site owners in Webmaster Tools. The file size limit is 2MB.

  • Google’s Link Disavow: Google Answers Domain Related Questions

    Google launched its Link Disavow tool today. If you haven’t read about it yet, you can do so here.

    There are a few things Google mentions at the end of its blog post that I think are worth highlighting, regarding international domains, subdomains, and www vs. non-www.

    Google ends its announcement with a Q&A section, and the last few are about these items. Here is what Google says:

    Q: Do I need to disavow links from example.com and example.co.uk if they’re the same company?
    A: Yes. If you want to disavow links from multiple domains, you’ll need to add an entry for each domain.

    Q: What about www.example.com vs. example.com (without the “www”)?
    A: Technically these are different URLs. The disavow links feature tries to be granular. If content that you want to disavow occurs on multiple URLs on a site, you should disavow each URL that has the link that you want to disavow. You can always disavow an entire domain, of course.

    Q: Can I disavow something.example.com to ignore only links from that subdomain?
    A: For the most part, yes. For most well-known freehosts (e.g. wordpress.com, blogspot.com, tumblr.com, and many others), disavowing “domain:something.example.com” will disavow links only from that subdomain. If a freehost is very new or rare, we may interpret this as a request to disavow all links from the entire domain. But if you list a subdomain, most of the time we will be able to ignore links only from that subdomain.

    To disavow an entire domain, you’ll want to use a format like: domain:www.example.com.
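
    Put together, entries following the rules above might look something like this (all domains are hypothetical):

      # Same company on two country-code domains: each needs its own entry
      domain:widget-example.com
      domain:widget-example.co.uk

      # Only a subdomain on a well-known freehost
      domain:spammyblog.blogspot.com

      # Specific URLs (www and non-www count as different URLs)
      http://www.links-example.com/spammy-page.html
      http://links-example.com/spammy-page.html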

    Here’s what Google says about the Link Disavow tool and negative SEO.

  • Google’s Link Disavow Tool And Negative SEO

    In case you haven’t heard yet, Google finally released its long-awaited Link Disavow tool. You can get more details about it here.

    In a blog post about the tool, Google includes a Q&A section. One of the questions in that is: Can this tool be used if I’m worried about “negative SEO”? Here is Google’s official response to that:

    The primary purpose of this tool is to help clean up if you’ve hired a bad SEO or made mistakes in your own link-building. If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you’re unable to get a few backlinks taken down, that’s a good time to use the Disavow Links tool.

    In general, Google works hard to prevent other webmasters from being able to harm your ranking. However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.

    Negative SEO also came up during the PubCon session in which Google’s Matt Cutts revealed the tool. Barry Schwartz at Search Engine Roundtable has a liveblog from that. Here is his account of what Cutts had to say about the subject:

    All the negative SEO complaints he sees, or most of it, is really not negative SEO hurting you. It is a much better use of your time to make your site better vs hurting someone else. At the same time, we’ve seen cases of this as an issue. I.e. buying a new domain and needing to clean up that site. There are people who want to go through this process. Plus SEOs that take on new clients that went through bad SEOs.

    Remember, Google updated the language of a help center article addressing negative SEO, seemingly indicating that it is possible.

  • Google Link Disavow Tool Does Not Guarantee Google Will Ignore A Link

    Google finally announced the launch of a Link Disavow tool for webmasters today, after months of anticipation. This is a tool that you can use to tell Google to ignore links if you feel they are hurting your search engine rankings.

    You can see plenty more details about it here (along with a video from Matt Cutts).

    One important thing to note about the tool, however, is that just because you do tell Google to ignore certain links, it is not a 100% guarantee that they will do so. It’s more of a suggestion, and Google will still decide whether or not it wants to follow your instructions.

    In a Q&A section in the official blog post announcing the tool, Google says:

    This tool allows you to indicate to Google which links you would like to disavow, and Google will typically ignore those links. Much like with rel=”canonical”, this is a strong suggestion rather than a directive—Google reserves the right to trust our own judgment for corner cases, for example—but we will typically use that indication from you when we assess links.

    Emphasis added.

    It probably won’t be an issue if you’re using the tool the way it was intended to be used, but it’s something to be aware of.

  • Google Page Layout Algorithm Update Should Be Easier To Recover From

    Remember the days when you could design your site any way you wanted to, and not have to worry about whether or not it would affect the chances of people actually finding your content in the first place? Those days are over. Design matters. More specifically, if you don’t have a substantial amount of content “above the fold,” there’s a chance this will keep you from ranking in search results compared to your competitors who went with a more Google-approved layout.

    On Tuesday, Google’s Matt Cutts announced that Google has launched an update to its Page Layout algorithm. The algorithm was initially announced back in January, and essentially aims to surface relevant pages that have a substantial amount of content “above the fold”.

    Has this algorithm update affected you at all since it was originally launched in January? Did this most recent launch have an impact? Let us know in the comments.

    Basically, Google doesn’t want to point users towards content that they have to scroll down to find. They want you to get to what you’re looking for as quickly as possible. If you and a competitor both have equally relevant content, but your competitor has it closer to the top of the page, and the user has to scroll down on yours to find it, chances are Google will give the edge to your competitor. That is, if the Page Layout algorithm is doing its job. Of course, there are still over 200 signals that Google is taking into account, so it’s entirely possible that you’re doing enough other things better than your competitor that your page could still rank higher. But we don’t know how much weight this particular signal gets in the algorithm.

    Google talked about the general philosophy behind the page layout algorithm back in November of last year. “If you have ads obscuring your content, you might want to think about it,” Cutts is quoted as saying at Pubcon at the time. “Do they see content or something else that’s distracting or annoying?”

    Then in January, the actual update announcement came.

    “As we’ve mentioned previously, we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience,” Cutts wrote. “Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content ‘above-the-fold’ can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.”

    “We understand that placing ads above-the-fold is quite common for many websites; these ads often perform well and help publishers monetize online content,” he continued. “This algorithmic change does not affect sites who place ads above-the-fold to a normal degree, but affects sites that go much further to load the top of the page with ads to an excessive degree or that make it hard to find the actual original content on the page. This new algorithmic improvement tends to impact sites where there is only a small amount of visible content above-the-fold or relevant content is persistently pushed down by large blocks of ads.”

    Of course, Google does not specify any numbers or limits, and ads can vary greatly in design. Some designs can pull off a substantial number of ads above the fold while managing not to distract too much from the content. This is subjective, and one has to wonder if Google’s algorithm can make the right call on something like design, which is really a matter of perspective for the human eye.

    Initially, according to Cutts, the algorithm update from January affected less than 1% of searches globally. “That means that in less than one in 100 searches, a typical user might notice a reordering of results on the search page,” he said at the time. “If you believe that your website has been affected by the page layout algorithm change, consider how your web pages use the area above-the-fold and whether the content on the page is obscured or otherwise hard for users to discern quickly.”

    The most recent update to the page layout algorithm affected about 0.7% of English queries.

    An update like this was kind of hinted at back in the early days of Panda. If you’ll recall that famous list of questions Google thinks you should ask yourself when assessing quality, one of the bullet points was, “Does this article have an excessive amount of ads that distract from or interfere with the main content?”

    That didn’t say “above the fold,” but it’s along these lines.

    If you were hit by this update, it should be easier to recover from than some other Google updates (like Penguin, for example). That is, in theory. This particular algorithm takes your page layout into account every time Google crawls the page, so you don’t have to wait six months for another refresh before Google sees any changes you’ve made.

    Cutts explains in the original January post, “If you decide to update your page layout, the page layout algorithm will automatically reflect the changes as we re-crawl and process enough pages from your site to assess the changes. How long that takes will depend on several factors, including the number of pages on your site and how efficiently Googlebot can crawl the content. On a typical website, it can take several weeks for Googlebot to crawl and process enough pages to reflect layout changes on the site.”

    Even if it takes weeks, that’s still a great deal shorter than the length of time webmasters and SEOs have had to wait for the latest Penguin refresh.

    If you want to see how visitors see your page layout at different browser sizes, Google has a tool for that at browsersize.googlelabs.com. Just enter the URL, and you can see what percentage of users will see different portions of your page. While it is still available, the tool at this URL is going away, because Google has simply added it to Google Analytics.

    To use it from Google Analytics, navigate to the Content section, and click In-Page Analytics. From there, click “Browser Size”. This will shade portions of the page that are “below the fold”. You can also click anywhere on the screen to see what percentage of visitors can see it, or you can control the threshold percentage by using the slider.

    Is Google doing a good job of delivering results in which there is substantial content above the fold? Let us know what you think.

  • Google EMD Update: What Is Its Real Impact?

    As you may have noticed, Google has been announcing a lot of algorithm changes lately. This big round of “weather reports” kicked off a couple weeks ago, when Matt Cutts announced the EMD update. He described it as a “minor” weather report, indicating that it was a “small” change designed to reduce low-quality exact-match domains in search results. He said it would affect 0.6% of English-US queries to a noticeable degree, noting that it was unrelated to Panda/Penguin.

    Do you consider any of Google’s recent updates to be minor? Let us know in the comments.

    The update may have been small as far as Cutts was concerned, but the flood of complaints from webmasters claiming to have been hit suggested otherwise. However, Cutts later revealed that a Panda update had also launched around the same time, and even since then, he has announced a Penguin data refresh and a new update to the Page Layout algorithm. There is plenty going on in Google land that webmasters are finding they need to pay attention to (not to mention those 65 changes Google announced last week that took place in August and September).

    So in light of all of this, how big was the EMD update really? Well, if your site was hit and you do not operate any exact-match domains, it’s probably safe to assume that you were not hit by that update. For the many who do operate EMDs, however, it’s not so simple. Remember, the update is not necessarily going after sites with EMDs. It’s going after low quality sites with EMDs. Much like Panda, it’s really about quality.

    We had a discussion with Todd Malicoat (aka: Stuntdubl), SEO Faculty at MarketMotive.com, who has a fair amount of experience with EMDs and even wrote The Exact Match Domain Playbook: A Guide and Best Practices For EMDs for SEOmoz after the update hit.

    “It’s important to remember that Google does at least a couple changes per day on average,” he says. “A lot of times, they will save up several updates, and release them simultaneously. Exact match domains have been on Matt and his team’s radar for well over 2 years. I think it’s a very difficult thing to ‘draw the line’ of which domains are okay and which aren’t. Google continues to find relevant sites based on page quality, offsite value, domain authority, and keyword relevance. The EMD update is just one in a lot of changes Google has done in the last few weeks, but it is obviously significant.”

    As Malicoat pointed out in his playbook, Cutts actually hinted at this update early last year in one of his Webmaster Help videos. “We have looked at the rankings and weights that we give to keyword domains and some people have complained that we’re giving a little too much weight for keywords in domains,” Cutts said at the time. “And so we have been thinking about adjusting that mix a little bit and sort of turning the knob down within the algorithm so that given two different domains, it wouldn’t necessarily help you as much to have a domain with a bunch of keywords in it.”

    “Anytime an ‘update’ is named it will be a filter or factor that plays a role in how the algo works,” continues Malicoat. “How wide of an impact is not quite as important in trying to determine what changed. Unfortunately, I think even the best of SEO folks are still struggling with exactly what happened in the ‘animal updates’. I try not to make too many assumptions about an update before there’s some time to really experience how it changes a handful of sites and the search results experience. I think as a consultant you can only react to best practices after you understand what they are.”

    “People have been using EMD’s and anchor text for the last few years as a best practice, and I believe it was,” he says. “Those practices have definitely changed, and I think those who move quickly are trying to figure out just HOW MUCH these things have changed. It’s very difficult to tell with a very limited release where only a small percentage of queries are originally affected. Sometimes even seemingly small changes have lasting effects. The bigger issue at play is how significant the changes to keyword anchor text will be.”

    Google has reportedly confirmed that it will launch refreshes for the EMD update periodically, much like it does for Panda and Penguin.

    All of these updates are designed to increase the quality of Google’s search results. Beyond the EMD update, Google has recently made other changes to how it handles domains in different cases. Before the EMD update, Google announced the Domain Diversity update, for example. In its recently announced list of 65 changes from the past two months, Google revealed another domain-related tweak related to freshness to help users find the latest content from a given site when two or more documents from the same domain are relevant for a given query. Is Google getting better at delivering relevant results thanks to such changes?

    “I don’t think anyone can argue that Google results are becoming LESS relevant in most verticals – Google’s results have always shown consistent improvements overall,” says Malicoat. “Relevance is rather subjective depending on who you ask though. Unfortunately, there’s always issues for someone. There’s only so many results, and organic search has become an important part of the marketing mix. It’s hard to support a business without Google sending at least some relevant users to your website.”

    “I don’t always agree with relevance changes, but I come at it from a much different perspective than most,” he notes. “It’s important to embrace the changes and be able to change your strategy with them if you’re going to be an SEO practitioner.”

    When asked if he believes Google’s results have improved in general, in light of recent updates, he says, “I really don’t think I’m the ‘average user’ to ask that sort of question unfortunately. I would come to the conclusion of what makes ‘relevant’ search results with a much different bias than most after being a search user for well over a decade. I’m also the co-owner of Marauder Sport Fishing which uses MiamiFishing.com as our domain, so my opinion is certainly biased.”

    “In my opinion, there are plenty of conflicting interests under the G umbrella these days,” he adds. “That means relevance alone can’t really ALWAYS be the main priority. The one thing they are not lacking is data. They have data and intelligence to make relevance decisions like no other company or entity on earth.”

    “Panda and penguin are both upgrades that raised the bar on the quality a website needs to demonstrate to receive organic search traffic,” he says. “That can be good or bad depending on perspective. It means more authoritative sites are ranking, and websites that don’t display all the quality signals necessary will not attract the traffic. The barrier to entry for new sites is higher, but the occurrence of spam is lower. There’s always some tradeoff in those two things I think.”

    And really, regardless of all of these updates and their various functions and names, they tend to have one main thing in common. They’re designed to improve search results’ quality. Panda is flat out about quality content. Penguin is about getting rid of spam (which makes for a low quality experience). The EMD update goes after EMDs with low quality content. Google’s main message is that you should just produce quality content, and you’ll be fine. Still, quality is subjective, and there are plenty of webmasters getting hit by algorithm updates who would argue that it’s not that simple – webmasters who really believe they do provide quality content.

    “Google is forcing sites to EARN traffic rather than just get it,” says Malicoat. “I think we’ve seen this before, and we’ll see it again. As an optimizer, I don’t look at many of the changes as good or bad – only a change that requires a change in strategy to keep relevant traffic flowing to a website.”

    Businesses and sites need to decide how important Google traffic is. For instance, do Google referrals outweigh the other potential benefits that could come from not going the “please Google” route? Since the Penguin update, we’ve seen a lot of sites frantically asking other sites to stop linking to them. In some cases, the sites asking for the removals admit that they would like to have these links out there, but are having them removed for fear of Google not liking them (even when there is no direct evidence that these links in particular are hurting their Google rankings). In other words, they have become so desperate to combat the negative Penguin experience that they’re overreacting and removing genuine, natural links.

    As Malicoat points out, there are benefits to having EMDs.

    “EMD’s definitely have lots of benefits – though you have to take my opinion with some bias – I own more than a few of them,” he tells us. “In the current Google climate, EMD’s are the symptom of a problem, and therefore an easy target. Link anchor text was a very large part of the Google algo, and is being slowly dialed down. EMD’s were where anchor text problems were MOST apparent. Most competitors were amazed how easily EMD’s ranked in the last few years, and complaints started.”

    “There’s still lots of benefits in EMD’s,” he reiterates. “They are great for: attracting keyword anchor text, attracting social mentions with targeted keywords, better for dominating a small niche, saying what you do in a geo vertical (DenverLawyers.com, DuluthDentists.net, etc.), targeting long tail variations in a small keyword set, and making brand mentions and keyword mentions the same.”

    Not to beat a dead horse, but the key seems to be making sure the quality of your site and its content are as good as they can be. You can have a domain like DenverLawyers.com. Just don’t treat it like a useless piece of crap, and perhaps Google will not either.

    If you want to review the things Google is thinking about when it comes to quality, I’d suggest running through these bullet points Google put out after the Panda update last year.

    Out of Google’s recently announced updates, which do you believe has had the greatest impact on webmasters? On search results? Which has had the greatest impact on you? Share your thoughts in the comments.

  • Google Changes How YouTube Ranks Search Results

    YouTube, often touted as the second largest search engine, just made a change to how it ranks content. The company announced in a blog post that it has just started adjusting the ranking of videos in YouTube search to “reward engaging videos that keep viewers watching.”

    In other words, the more of the video that viewers are actually watching, the more likely it is to rank higher in search results compared to its competitors.

    “This is a continuation of ongoing efforts to focus our video discovery features on watch time, and follows changes we made to Suggested Videos in March, and recent improvements to YouTube Analytics,” YouTube says.

    We reported on the Analytics changes here. In the “Views” report, YouTube will now show you more time watched data. “Estimated minutes watched” can be seen from this report now, and users can choose other data options from the “Compare metric” drop-down.

    “The experimental results of this change have proven positive — less clicking, more watching,” says YouTube of the rankings adjustment. “We expect the amount of time viewers spend watching videos from search and across the site to increase. As with previous optimizations to our discovery features, this should benefit your channel if your videos drive more viewing time across YouTube.”

    It stands to reason that if the videos getting more viewing time from users appear near the top of search results more frequently, more people will spend more time watching YouTube videos. Sounds like a win for advertisers.

    It makes you wonder how much weight Google gives to the time spent on site metric in its web search results.