WebProNews

Tag: Web Design

  • Facebook And Bolt Peters Join Forces

    I wouldn’t say that the current Facebook design is hostile to users, but it could use some improvements. Facebook’s latest hiring suggests that the company is taking user experience very seriously.

    You might not be aware of Bolt Peters, but they are a group that focuses on user experience and design for Web sites, devices, cars and other promotional materials. They’ve been very successful with multiple clients from all over the world like Sony, Volkswagen and the New York Times, but the company is closing on June 22. At that time, employees from the company will be moving to the Facebook design team.

    Nate Bolt, one of the co-founders, announced the closure on the company blog. The company obviously has fans, and they might be a little worried about Bolt Peters’ other projects once it closes up shop. There’s no need to worry, though, as each of its side projects will continue on under different leadership.

    First up is ethnio, a UX research recruiting firm. Ethnio became independent on May 17, and the closure of Bolt Peters will not have an effect on it. A couple of changes coming to the service were announced at that time. If you want to know more, check out their blog post.

    Bolt Peters also used to manage an event called User Research Friday. The event was billed as a “casual conference” where UX, research and design professionals could meet up and discuss the latest trends in the industry. Bolt says that the conference is now in the good hands of User Interface Engineering.

    They also hosted a mobile photography conference called 1197. That event will now be hosted by the New York Soho Gallery for Digital Art. So as you can see, the closure of Bolt Peters will not have any effect on the projects and conferences they curated over the years.

    It sounds like the team will be creating a far more interesting user experience at Facebook. We probably won’t see the fruits of their labor until later on down the road, but you can check out some examples of their previous work from videos to books.

    [h/t: All Things D]

    [Lead Image: Boltron via flickr]

  • Exclusive: Adobe on CS6, Creative Cloud, and New Focus


    On Monday, Adobe made some big announcements including the next version of its professional design software suite and a new cloud service for syncing, sharing, and storing files. Although there is always excitement surrounding news from the leading company in digital experiences, this week’s announcements were especially intriguing since they included 14 new products and 4 new Creative Suite additions.

    What do you think of Adobe CS6 and Creative Cloud? Let us know.

    Heidi Voltmer, Adobe’s Director of Product Marketing, spoke with WebProNews and told us that Adobe focused its efforts on 4 main areas with these products. As she explained, the company emphasized speed and performance, improving features in its tools, making sure that the content produced in CS6 is ready for devices, and enhancing the user interface.

    In CS6, Photoshop, Illustrator, and InDesign are all powered by Adobe Mercury Graphics Engine, which will dramatically improve the performance of the tools. Voltmer told us that Adobe wanted to make it “really easy for our customers to use our products and to focus on what they’re doing creatively in the tool… and not so much about where they have to find a particular item in a panel.”

    Photoshop CS6 is, of course, one of the big draws of the application bundle, and it is particularly noteworthy since this version is the first completely new release in 2 years. The performance boost lets users see near-instant results as they edit.

    “Now, when you’re editing images or making changes, you’ll actually see them appear just really quickly on the screen instead of having to wait for it to redraw,” said Voltmer.

    Although the Creative Suite alone would have been big news from Adobe, the company also announced Creative Cloud, through which users can access its suite of desktop tools – normally $2,599 for a full license – for $49.99 per month. The cloud offering also adds online services for sharing and publishing content created through CS6. As a result, customers have much more flexibility in how they use the software.

    This subscription-based service is also useful for customers who only need the software for a certain period of time. It also gives them access to all the updates Adobe makes, so they don’t have to buy the newest version every time it rolls out.

    “It’s not just like today, where you buy a single box and you don’t see anything new from Adobe for 12-24 months,” said Voltmer. “With the Creative Cloud, you actually get those updates on an ongoing basis.”

    Last year, when Adobe announced CS5.5, Scott Fegette, Senior Product Manager on the Creative Suite Web team, talked to us about the company’s first attempt at a new pricing model. The company wanted to give customers both long-term and short-term options.

    After listening to the customer feedback from last year’s pilot attempt at changing the pricing model, Voltmer told us that Adobe decided to take the model further this year.

    “We evolved the model to lower the price, first of all, and second of all, to add in additional value,” she said.

    In terms of video, CS6 includes major improvements to both Premiere and Flash. Incidentally, after a long battle with Apple over Flash’s significance on mobile devices, Adobe announced in November that it was re-positioning Flash for use, primarily, in premium video and high-end gaming. While CS6 does include updates to Flash, Voltmer told us that Adobe really wanted to help bridge creators from Flash to HTML5.

    “We’re really trying to help our Flash customers to transition into animating and creating interactive activity with HTML,” she pointed out.

    With this greater emphasis on HTML5, Dreamweaver also received several improvements for incorporating HTML5 animations and more.

    With all these developments, Voltmer told us that Adobe ultimately wants to streamline complex workflows for creative professionals. The company recently combined its digital media group with its marketing group in an effort to support this goal.

    “By bringing those two pieces together, we offer a much more broad and integrated solution that not only sells to say, our customers in a creative department or an agency, but also people on the business side,” she said.

    According to Adobe, the new products will be available within 30 days of the April 23 announcement, but the company is accepting pre-orders now.

  • Leverage Browser Strengths For A Faster Site

    We recently talked about reducing HTTP requests. Here’s a quick recap:

    • Slow web pages impede your website’s goals;
    • 90% of a typical web page’s slowness happens after the HTML has been downloaded;
    • Reducing the number of HTTP requests triggered by your page is usually the best first step to making your pages faster;
    • We reviewed some specific techniques for reducing the number of HTTP requests in a given page;
    • We noted that automation can ease or remove the maintenance burden for more invasive optimization techniques

    Next up on the list: taking advantage of the browser’s capabilities to make your web pages faster and more efficient.

    But are they even “pages” any more?

    Modern web pages have outgrown their humble origins and are not really recognizable as “pages” anymore. Except for the simplest and most old-fashioned brochure-ware sites, visiting a website means executing a complex application that is distributed — and executed — across the web.

    Viewed as such, these web applications are composed of many parts: a client (the browser); one or more origin servers (where the site is hosted); CDN nodes (where static assets are cached); reverse proxy nodes (e.g. for next-gen whole site acceleration services); third-party assets (hosted on various servers); and the networks that connect them all. So it’s time to stop acting like the origin server has to do all the work and the browser can only present the page to the user. The server is just one part of the application, and it’s playing a shrinking role.

    Performance-minded website architects are showing an increasing tendency to shift the burden of work from the (overloaded) server to the (powerful, underutilized) client, and with good reason. In this article I’ll review some of the ways you can make your website faster by easing the burden on your server and giving the browser more responsibility.

    “Put Me In, Coach, I’m Ready To Play!”

    Modern web browsers run on hardware which is staggeringly powerful by historical standards, and which is simply massive overkill for the uses to which most users put them. It is very common for a user to interact with a site without even beginning to strain the RAM or CPU on his or her computer, yet wait far longer than necessary as an overloaded server (often a poorly configured virtual server on shared hardware in a cheap hosting center) struggles to allocate memory and keep up with the flow of requests without crashing under the load. Distributing more work to the client helps keep the server from getting swamped, can help save on bandwidth and hosting costs, makes the application faster and more responsive, and is generally a better architecture. It’s simply a more efficient allocation of available resources. (And even for less powerful clients, like some mobile devices, the high latency costs of HTTP round trips over mobile connections can still make it worthwhile to offload work from the server to the client.)

    But too many web developers continue to treat the browser – the client side of the client-server interaction – as just a simple “view” of the application. It’s better understood as residing at the heart of the application that is the modern web page. The server has its place, but the browser is increasingly where the action is. It’s got tons of under-utilized processing and memory resources, and its capabilities should be respected and used to their fullest.

    Ok, if you’re ready to leverage the client, the first thing you’ll need to do is clean up your client-tier code. Seriously.

    Use web standards.

    Using web standards is essential for creating maintainable, accessible, future-proof websites. A great side effect is it’s also the best foundation for maximizing performance. Use of modern web standards encourages the separation of content (HTML), styling (CSS), and behavior (JavaScript). Of course, what constitutes “standards” is a surprisingly tricky question to answer. Debates rage around use of vendor prefixes; formal W3C recommendations lag behind the real world by years; religious wars are fought on the topic of abstract specifications vs de facto standards of what browser manufacturers actually implement… you get the point. But — pedantry aside — in general, strive to write front-end code that validates. And be aware of the places where you trigger warnings or errors.

    Recommended validators include http://validator.w3.org/ (for HTML), http://www.jshint.com/ (for JavaScript), and http://jigsaw.w3.org/css-validator/ (for CSS). Read and follow heroes like Jeff Zeldman and Paul Irish and you’ll be well on your way. Leveraging open source UI frameworks and/or boilerplate templates is a smart path to a solid foundation in standards-based front-end code, too. Using web standards alone doesn’t suffice to make your site fast (though it’ll help), but it will make optimization much more practical and achievable.

    Apply MVC in the page.

    The venerable “MVC” (Model/View/Controller) design pattern has long been the well-established best practice for web applications. Traditionally, “model” maps to the structured data you’d put in your database, “controller” is the application tier on the server that handles requests, applies business logic and generates responses, and “view” is everything the server sends back to the browser. But what some developers overlook is that this same MVC pattern can properly be applied in the front end of your website’s code too. Think of the HTML (the DOM, really) as the model, the CSS as the view, and the JavaScript as the controller. Adhering to this conceptual separation – keeping the HTML model (“what it is”) separate from the CSS view (“what it looks like”) and separate from unobtrusive JavaScript controller (“how it behaves”) – tends to make code more efficient and maintainable, and makes many optimization techniques much more practical to apply.
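
    To make that separation concrete, here is a minimal sketch (the element and class names are illustrative): the markup describes what the button is, the stylesheet describes what it looks like, and unobtrusive JavaScript describes how it behaves.

    <!-- model: the markup says what it is -->
    <button id="save-btn" class="btn">Save</button>

    <style>
      /* view: the stylesheet says what it looks like */
      .btn { padding: 6px 12px; }
      .btn.busy { opacity: 0.5; }
    </style>

    <script>
      // controller: behavior attached unobtrusively, with no inline onclick=""
      document.getElementById('save-btn').onclick = function () {
        this.className += ' busy';  // only toggles state; the CSS decides how "busy" looks
        // ... kick off the actual work here ...
      };
    </script>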

    Leverage Ajax techniques. Properly.

    Don’t refresh the whole page if you don’t have to! Use Ajax. By only requiring small parts of the page to change in response to user actions, you make your site or web application much more responsive and efficient. But be aware, there are different Ajax approaches.

    For example, fetching complete, styled HTML fragments via Ajax may be appropriate for implementing a sophisticated “single-page interface” (SPI) [https://en.wikipedia.org/wiki/Single-page_application]. That’s a powerful approach, but don’t take it lightly – serious SEO and usability gotchas abound. If you’re not doing SPI, retrieving chunks of styled HTML from the server is probably not the right thing to do.

    For most common use cases, it’s better and faster to just pull pure data from the server. Client side templating libraries help solve the problem of turning that data into HTML that can be injected into the DOM and displayed. (Here’s a helpful template chooser.) But with or without client-side templates, fetching serialized data is usually the best Ajax approach for performance.
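
    To make the data-first approach concrete, here is a minimal sketch. It assumes jQuery is loaded, a hypothetical /api/products endpoint that returns JSON, and an empty <ul id="product-list"> in the page; none of these names come from a real API.

    $.getJSON('/api/products', function (products) {
      var html = '';
      for (var i = 0; i < products.length; i++) {
        // the server sent only raw data; the client turns it into markup
        html += '<li>' + products[i].name + ': $' + products[i].price + '</li>';
      }
      document.getElementById('product-list').innerHTML = html;
    });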

    Validate in the client.

    At the risk of insulting you smart readers, I have to mention the most obvious case for pushing work to the client, just because so many sites get it wrong: form validation. Picture a user, taking the time to fill out your signup or order form. They painstakingly complete the form and submit it. And then they wait. They look at a blinding white blank screen while the form is posted to the server… and processed…and a new page is generated… and sent back… and rendered… until finally… yes, they see — an error? What a waste of time! That’s an unhappy user and a likely candidate to bail out, abandon your site and go to a competitor.

    Whenever possible, validate the user’s form input from within the page, right where the input is happening. In some cases (such as checking for the availability of a username), doing an async request to the server is appropriate. But in many cases all of the validation rules can be implemented in JavaScript and included with the form in the page. This allows you to give the user instantaneous feedback as they complete the form, and it saves the server a lot of unnecessary work.

    Note for security reasons, web applications should always also validate on the server side. (Rule #1 of web app security is that user input cannot be trusted.) So, validate in the client for reasons of performance and UX, and validate on the server for security.
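
    Here is a minimal sketch of in-page validation (the field name and the email pattern are illustrative, and the server-side check still happens on submit):

    <form id="signup" action="/signup" method="post">
      <input type="text" name="email" id="email">
      <span id="email-err" style="color: #c00;"></span>
      <input type="submit" value="Sign Up">
    </form>
    <script>
      document.getElementById('signup').onsubmit = function () {
        var email = document.getElementById('email').value;
        // instant feedback with no round trip; the server re-validates regardless
        if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(email)) {
          document.getElementById('email-err').innerHTML = 'Please enter a valid email address.';
          return false;  // block the submit until the input looks sane
        }
        return true;
      };
    </script>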

    Let the browser do the data viz.

    One last specific scenario I want to mention is the visual display of quantitative information. Generating charts and graphs — any sort of pretty-looking data visualization — used to be the sole province of the server. Those days are long gone.

    Now, it makes much more sense to push just the raw data from the server to the browser, in the initial page request. If the data set is too large to include in the initial view, it can be updated via Ajax, in response to user interaction. With modern client libraries (like Processing, D3, and Flot), you can create all kinds of stunning interactive data visualizations right there in the browser. Their capabilities go way, way beyond sorting table columns or rendering a pie chart.
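
    As a taste of how little code this takes, here is a minimal D3-style sketch (the numbers are made up, and it assumes D3 is loaded and an empty <div id="chart"> exists in the page) that turns raw data into a bar chart with no server-side rendering at all:

    var data = [4, 8, 15, 16, 23, 42];
    d3.select('#chart').selectAll('div')
        .data(data)                  // bind one div per data point
      .enter().append('div')
        .style('width', function (d) { return d * 10 + 'px'; })  // bar length encodes the value
        .style('background', '#4d90fe')
        .style('color', '#fff')
        .text(function (d) { return d; });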

    In this way, many user interactions avoid hitting the server at all. And when they do, it’s a small request and response, consuming the minimum amount of network bandwidth and requiring the least possible amount of work from the poor overworked server.

    To recap:

    • Web “pages” aren’t really pages any more, they’re distributed applications
    • Pushing work from the server to the client is a great way to make your site faster
    • Use best practices (web standards and MVC separation in HTML, CSS and JS)
    • Use the right Ajax approach for the job
    • Powerful client-side templating libraries and dataviz libraries abound

    That’s it for this second article. Next time I’ll dive into another area of web performance optimization. In the meantime I’m always interested in feedback and others’ thoughts on web performance.

  • How to Develop a Next-Generation Mobile Strategy

    According to an IDC forecast, by 2015 more U.S. Internet users will access the Internet through mobile devices than through PCs or other desktop devices. As smartphones begin to outsell simpler feature phones, and as tablet sales continue to explode, the number of mobile Internet users will grow at a compound annual rate of 16.6% through 2015.

    Supporting customers and staff on mobile is now a basic requirement for all organizations that currently have an online presence. Perfecting that strategy starts with getting the basics right and setting a strong mobile foundation to build upon.

    Be Comprehensive: Support on Mobile What You Support on the PC

    Since many Internet users are replacing their use of the PC with mobile, it is key for businesses to ensure that whatever users can do on their keyboard, they can do from the palm of their hand. Brands must make sure that every important feature of their website is also implemented on their mobile site and/or apps.

    However, the content and functionalities need to be optimized to properly fit the specific mobile device, rather than just offering a smaller version of the website. The user’s location, screen size, network speed and other key aspects unique to mobile need to be taken into account in the design and delivery of your mobile site or app. Productivity, speed, and simplicity are all essential to achieve maximum usability and customer satisfaction.

    The mobile site needs to support the site’s natural traffic coming from various sources, including SEO, email marketing and advertising. All of this inbound traffic should be redirected to a comparable, mobile-optimized experience. By getting this aspect right, brands will increase traffic and mobile ROI, and lay the groundwork to expand their mobile strategy.

    Staples.com has successfully achieved a consistent mobile experience on all sources of traffic by automatically directing customers to the mobile optimized view of its site regardless of entry point. If a user is searching for paper products via Google, for example, they will be led to an optimized mobile experience of that product page, despite coming from an outside source.

    Brands should also integrate mobile equivalent third party solutions that consumers trust, such as mobile payment provider PayPal and analytics platform Adobe. Integrating PayPal into your brand’s mobile strategy will make the checkout process that much easier, as users won’t be faced with the hassle of typing in their billing and shipping information for every purchase, or trying to remember their account log-in information for each site they access. Making the checkout process as simple as possible will result in more purchases and repeat visits.

    HTML5 Enhances the Mobile User Experience

    Leveraging HTML5 technology is a great way to improve the consumer’s experience with your brand in the mobile browser environment. By developing with next-generation HTML5 technologies, brands can deliver users a rich, app-like experience without having to develop downloadable apps for each specific mobile platform (iOS vs. Android and others). HTML5 enables brands to offer a consistent experience across all major mobile operating systems.

    Additionally, HTML5-based mobile sites enable users to take their mobile shopping experience to the next level with innovative features such as location-aware capabilities, high-resolution image galleries that enable you to zoom in to view products in detail, expandable navigation, collapsible menus, and advanced shopping carts that streamline and reduce the number of steps users need to follow to complete a transaction.

    Travel giant Expedia.com is a brand dedicated to its innovative mobile strategy. The company’s HTML5-enabled mobile site, for example, takes “location-aware” to the next level by leveraging the device’s internal GPS to offer travelers the ability to search for nearby hotels with same day vacancies, as well as push notifications based on their location.


    Scaling your Mobile Site to Reach an International Audience

    Rapid smartphone adoption is a global phenomenon, with more and more users around the world replacing their feature phones with smartphones. In fact, global smartphone sales grew 53.3% in 2011 and made up 34% of all mobile handsets sold in the year, according to Informa Telecoms research agency.

    Further, it is estimated that over a billion people will own a smartphone by 2013, reiterating the massive market potential. According to a recent PhoCusWright report, smartphone use now exceeds 50% in the U.S. and Europe. While Asia’s smartphone adoption has been a bit slower, the smartphone market there is expected to double in size by 2016, according to research firm Ovum.

    Because of this international smartphone surge, it is becoming more and more important for organizations to extend their mobile channel to match regional web channels in order to leverage worldwide smartphone adoption. It is vital to ensure that your company’s mobile site can adapt to regional differences in mobile user context and other key features.

    Engaging consumers on a worldwide scale has many benefits to both the company and the consumer. These include increased mobile channel revenues and maximized usage and repeat visits. Offering consumers access to online features and functionalities in their preferred language and currency will create a positive experience and encourage repeat visits.

    FedEx has implemented multiple language support to help reach its global user base. The shipping company’s site currently supports 242 regions and 25 languages worldwide, a number that signifies the importance FedEx places on scaling its mobile site for an international audience.

    Looking Forward

    In our new smartphone-driven world, it is no longer enough for organizations to offer a simple optimized mobile site. Instead, brands must develop a strategy that leverages next-generation features and functionality to make the customer experience just as comprehensive and easy to use as the traditional website. By executing a well-planned mobile strategy, companies will see an increase in conversions, repeat visits, and overall positive brand awareness from users engaging with their brand via mobile devices.

  • Optimizing A Site For Mobile: Google Provides 70 Minutes Worth Of Tips

    Google has posted a pair of webinars from its “GoMo” campaign, which is an initiative to get people to create mobile-friendly sites. One of the webinars is for advertisers and one is for publishers. They both include tips and case studies on sites that have gone mobile.

    Frankly, you shouldn’t need Google to tell you that you need to be optimized for mobile these days, but there is still plenty to be learned.

    In the first one (the one for advertisers), Google discusses mobilizing your site, maximizing mobile ads, and tracking/measuring them. The second one discusses why you should go mobile, provides tips for building a mobile site, talks about best practices and explains how to get started.

    As a bonus, Google has also made the slides from the webinars available to download here (pdf).

  • Infographic Looks At Direct And Indirect Costs Of Testing Your Site

    Monetate has put out an interesting infographic looking at the total cost of website testing. The top challenges, according to Monetate, are: deciding what to test, prioritizing testing initiatives, conducting the actual tests, obtaining enough traffic for statistical significance and acting on the test results.

    “To accomplish these and other challenges, marketers are turning to website testing tools that promise to deliver higher conversions and a positive return on investment,” the company says. “But how much do they really cost? The cost of paid website testing tools can be higher than just the ‘sticker price.’ And are ‘free’ tools really free?”

    “The near century-old financial estimate known as Total Cost of Ownership (TCO) can help determine direct and indirect costs of a purchase,” the infographic says. “Adopted by Gartner to help measure the true cost of software or hardware investments over time, companies should consider TCO when deciding which website testing tool to use.”

    Check out the image for a breakdown of direct and indirect costs.

    Total Cost of Website Testing

  • Want A Faster Website? Reduce Requests.

    If your website consistently loads in under two seconds, congratulations!  It is in the small minority of sites able to meet the new threshold for patience among Internet users. But if yours is like the vast majority of websites that take longer than two seconds to load — even for users with a decent browser and a fast connection — you might want to keep reading.

    The sad truth is, after three or four seconds, nearly half of your website’s visitors may be gone, having bounced away in frustration.  And for those users who do suffer through a slow page load, the slowness will negatively impact their satisfaction and their engagement with your site. Delays as small as a hundred milliseconds have been shown to decrease the amount of time users spend on a site, reduce conversion rates and reduce average order size. Slow pages also rank lower in searches, are less likely to get indexed, and are less likely to be recommended via word of mouth. In every conceivable way, when it comes to your goals for your website, slowness is the enemy.

    Where does the slowness come from? Perhaps surprisingly, over 90% of the time users spend waiting for a given page to load typically occurs after the main HTML page has been retrieved from the server! So for a huge number of slow sites, the single biggest culprit is bloat and inefficiency in the page. Specifically, the main offender is often too many HTTP requests triggered by the page. Each stylesheet, script and image found in a typical HTML page requires a separate round trip from the browser, to request and receive the resource from a web server.  The latency inherent in these HTTP requests delays the display of the page and users’ ability to interact with it. The overhead of these round trips can be massive, dwarfing the time it took to obtain the HTML document itself. So one effective rule of thumb for making your pages faster is to find ways to reduce the number of HTTP requests it requires.  This principle is the basis for many of the front-end optimization techniques recommended by web performance experts. Below are a few such techniques for reducing HTTP requests.

    A quick disclaimer about the following recommendations: web performance optimization is a complex and evolving discipline, and brief articles like this one require glossing over some subtleties, details and trade-offs. So, please consider these points as an introduction and a starting point, rather than a definitive or prescriptive solution.

    1. Stylesheets

    Combine them:

    Simply “concatenate,” or combine, multiple stylesheet files into one. If you have four .css files, for instance, instead of referencing them individually:

    <link rel="stylesheet" type="text/css" media="all" href="/css/one.css">
    /** contains: css rules in one.css **/
    <link rel="stylesheet" type="text/css" media="all" href="/css/two.css">
    /** contains: css rules in two.css **/
    <link rel="stylesheet" type="text/css" media="all" href="/css/three.css">
    /** contains: css rules in three.css **/
    <link rel="stylesheet" type="text/css" media="all" href="/css/four.css">
    /** contains: css rules in four.css **/

    combine their contents into a single .css file like this:

    <link rel="stylesheet" type="text/css" media="all" href="/css/concat-v2.css">
    /** css rules in one.css **/
    /** css rules in two.css **/
    /** css rules in three.css **/
    /** css rules in four.css **/

    … so the content of each of these four files is included in the same order in the combined file. The same styling will apply to your page, but it costs a fraction of the HTTP requests to fetch it.
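
    In practice you would not merge these by hand on every change; a small build step keeps the combined file in sync. Here is a minimal sketch in Node.js, assuming the file layout from the example above:

    // concat-css.js: rebuild css/concat-v2.css from the source stylesheets
    var fs = require('fs');
    var files = ['one.css', 'two.css', 'three.css', 'four.css'];
    var combined = files.map(function (name) {
      return '/** css rules in ' + name + ' **/\n' + fs.readFileSync('css/' + name, 'utf8');
    }).join('\n');
    fs.writeFileSync('css/concat-v2.css', combined);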

    Put CSS in the head:

    Also, be sure to put CSS in the HTML document’s <head> where it belongs, to speed up rendering. Inline <style> tags can delay rendering and force the browser to re-paint the whole page, which is costly from a performance perspective. Minification and compression are highly recommended, to reduce file size. Making your stylesheets cacheable will also greatly speed up subsequent page loads. (I’ll discuss caching and related topics in a future article in this series.)

    2. Scripts

    Combine them:

    Just like with stylesheets, concatenating JavaScript files is a good way to reduce the number of HTTP requests in your page. If you have multiple scripts, combining them into a single script that is minified and compressed is recommended.

    Note there are tradeoffs with concatenation, for both CSS and JS. For example, you may want to discriminate between styles and scripts needed in every page of your site versus those unique to a given page or section of your site. You may also want to prioritize early loading for certain scripts, while others are fine to delay until everything else is done. And you may choose to reference certain popular libraries like jQuery on a commonly used public CDN URL in an attempt to leverage the browser’s cache from other sites the user has visited. So there are valid reasons not to combine every single .css or .js file in every page. But in general, given a choice between loading a bunch of smaller files separately and loading a single larger combined file, the latter is usually the better choice for performance.

    Put scripts at the bottom:

    Scripts should be loaded at the very bottom of your HTML document before the closing </body> tag. The old-school practice of putting them in the <head> introduces severe performance problems, in part because scripts are “blocking” resources. In other words, when the browser encounters a script tag, it generally stops doing much of anything else at all until it has fetched and executed the script. So putting scripts at the end of the document and/or fetching them asynchronously is important for performance. There are different ways to account for dependencies of the page on external scripts or interdependencies among scripts, and to preserve their order of execution in a cross-browser compatible way. For complex scripting and client-heavy web applications, using an open source script loader can be a good solution. But the simplest approach is usually just to put them at the bottom, in the desired execution order. [To learn more about blocking scenarios and script loading approaches, I highly recommend the excellent Steve Souders book “Even Faster Web Sites”.]
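
    For scripts that do not need to run right away, here is a minimal sketch of asynchronous loading (the script URL is illustrative). Because the element is created programmatically, the download never blocks parsing or rendering:

    <script>
      var s = document.createElement('script');
      s.src = '/js/deferred-stuff.js';
      s.async = true;
      // insert before the first existing script tag, which is guaranteed to exist
      var first = document.getElementsByTagName('script')[0];
      first.parentNode.insertBefore(s, first);
    </script>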

    3. Images

    Having too many images on a webpage is a very common performance problem that’s nearly as old as the <img> tag. Fortunately, there are optimization techniques which allow you to implement your chosen design with fewer HTTP requests. This can be achieved by combining image files with CSS sprites, by in-lining them with Data URIs, or by using pure CSS to eliminate the use of an image file altogether. Read on for details.

    Combining with Sprites:

    Combining images with “CSS spriting” has become a mainstream optimization technique. The idea is to combine a number of common images into a single, larger image file—and thus a single request.  This typically includes images like navigation elements, buttons, logos, icons or any other static images which are central to a design and which rarely change. When the big master image containing the smaller ones in it is fetched, CSS is then used to precisely position specific parts of the same image in the right places in the page while hiding the other parts from view. The result is a page that looks the same as if it had loaded each image separately, but which only required one HTTP request instead of dozens.
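
    The CSS side of spriting looks like the following minimal sketch (the file name, class names and offsets are all illustrative); each icon is just a different window onto the same master image:

    .icon {
      background-image: url(/img/sprite.png);  /* one request covers every icon */
      background-repeat: no-repeat;
      display: inline-block;
      width: 16px;
      height: 16px;
    }
    .icon-home   { background-position: 0 0; }      /* top-left 16x16 region */
    .icon-search { background-position: -16px 0; }  /* the region just to its right */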

    Note this technique is fairly invasive, and can introduce a considerable maintenance burden. Since the images are no longer in separate files, editing even one of them can require generating a new master sprite image and sometimes editing the HTML and CSS that use the sprite. Tools for automating CSS sprite generation do exist to help ease their maintenance costs [e.g. http://spriteme.org and http://compass-style.org/help/tutorials/spriting/]. However, it’s worth comparing alternative solutions for reducing HTTP requests associated with images.

    Inlining with Data URIs:

    It is possible to directly embed the contents of an image file in HTML or CSS, instead of referencing it as a separate file. For example, you can replace this:

    <img src="tiny_image1.png" …>

    with this:

    <img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==" …>

    This technique has some advantages over spriting. It involves zero extra HTTP requests, and it also makes the in-lined image immediately available for rendering as soon as the browser discovers it. It does increase the file size of the HTML page or the CSS file that includes it, so it’s definitely not appropriate for large images, but this tradeoff is often worth making for smaller files (e.g. < 4KB). For sites with a bunch of small images, this can be extremely effective for reducing extra requests. As for caching, if the Data URI reference is in a CSS file or HTML page that can be cached, the image is effectively cached as well.

    Data URIs have maintenance overhead too. Changing an image requires changing every HTML or CSS file that references it. Also, some older and less capable but still-popular browsers don’t support Data URIs, so some conditional processing is required. This is a primary reason why Data URIs are not as popular as CSS sprites in the web performance world. But with the advent of automated performance optimization services (like Yottaa Site Speed Optimizer), the feature detection and maintenance burden can be removed altogether.

    Replacing Images with Pure CSS:

    One smart approach to reducing HTTP requests is to forego image files, in favor of styling HTML elements with CSS. CSS can be used to great effect in implementing background colors, borders, buttons, hover effects, styled text, and even vector-scalable icons (via “icon fonts”). Using pure CSS is virtually always better for performance than using image files, and it can help make your site more maintainable too. Of course, cross-browser compatibility issues and “graceful degradation” of a complex design [or better, proper “progressive enhancement” of a simple basic design] are a challenge for every web developer and designer… but there are some fantastic open-source projects and communities that can really help. For example, HTML5Boilerplate and Twitter-Bootstrap provide great starting points for excellent, highly tested templates and user interfaces, and are essentially reference implementations for modern web development best practices. They are a great way to leverage the accumulated knowledge of experts and to learn about best practices for your markup.
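
    For example, a perfectly presentable button needs no image file at all. A minimal sketch (all colors and sizes illustrative):

    .btn {
      padding: 6px 12px;
      color: #fff;
      background-color: #4d90fe;   /* flat color instead of a sliced background image */
      border: 1px solid #357ae8;
      border-radius: 3px;          /* rounded corners with zero image requests */
    }
    .btn:hover { background-color: #357ae8; }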

    Splitting up the page:

    Finally, for cases where the number of images can’t easily be reduced with one of these techniques, it’s worth considering breaking up the page itself into smaller pieces. Pagination, with use of Ajax to fetch additional content when the user requests it, is a way to support a smaller initial page with fewer HTTP requests. That is, instead of a massive page triggering large numbers of image requests, have the initial page require just a few such images. Then use pagination to incorporate the additional content and images in the user experience. This isn’t a casual decision — if not implemented carefully, splitting up a page can cause problems for SEO, accessibility, bookmarking and user-friendliness. But by following best practices, these issues can be surmounted, with the desired result of faster pages delivered to your site’s visitors.

    To recap:

    * Slow web pages impede your website’s goals;
    * 90% of a typical web page’s slowness happens after the HTML has been downloaded;
    * Reducing the number of HTTP requests triggered by your page is usually the best first step to making your pages faster;
    * We reviewed some specific techniques for reducing the number of HTTP requests in a given page;
    * We noted that automation can ease or remove the maintenance burden for more invasive optimization techniques

    Next time I’ll dive into another area of web performance optimization: doing more to leverage the browser’s capabilities, moving toward the client side of the client-server relationship. In the meantime I’m always interested in feedback and others’ thoughts on web performance. Please reply in the comments.

  • Thesis WordPress Theme Creator Talks Golden Ratio and Typography

    Have you ever thought about applying the Golden Ratio to your website? Let me back up and ask – do you even know what the Golden Ratio is? According to Wolfram MathWorld, the Golden Ratio is:

    The golden ratio, also known as the divine proportion, golden mean, or golden section, is a number often encountered when taking the ratios of distances in simple geometric figures such as the pentagon, pentagram, decagon and dodecahedron. It is denoted phi, or sometimes tau.

    The Golden Ratio is often associated with the Greeks and arts and architecture. However, Chris Pearson, the Creator of Thesis and Founder of DIYthemes, has a pretty convincing argument for applying it to typography and Web design. He said that if people relate aesthetic beauty and efficiency to the Golden Ratio, then why not use it in typography?

    “There’s a whole lot of conjecture and assumption and arbitrary decisions that have been made throughout history to say this is the right way to set text,” he said. “I’m not satisfied with arbitrary choices and preferential selection.”

    As Pearson explained it to WebProNews, looking at the Golden Ratio in this way gives proper placement to every single pixel on a website. He built a calculator to simplify the process to allow others to take advantage of this concept too.
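
    The arithmetic behind the idea is simple. Here is a minimal sketch (the rounding and adjustment details of Pearson’s actual calculator may well differ):

    var PHI = (1 + Math.sqrt(5)) / 2;  // the golden ratio, roughly 1.618
    function goldenLineHeight(fontSizePx) {
      // e.g. 16px body text gets a line height of about 26px
      return Math.round(fontSizePx * PHI);
    }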

    “The idea is that you’re gonna produce the most aesthetically beautiful, aesthetically pleasing, and easy-to-read text that you can have,” Pearson pointed out.

    He went on to say that applying this idea to a website is extremely important since content is the primary mode of communication for publishers.

    “Since content is the only way that you’re communicating with people online… I don’t want to leave to arbitrary selection and choice,” he said.

    Pearson added that applying the Golden Ratio to typography also helps publishers organize their content proportionately, which is critical since people view content on so many different devices.

    Are you open to trying Pearson’s idea for your website? Let us know.

  • Redesigning Your Site? Don’t Make it Harder for Google to Extract the Text

    Google posted one of the Matt Cutts Q&A videos today, where he talks about the effects of site redesigns with redirects on search rankings. Here’s the specific question as it was posed to Matt:

    I’m changing the platform of my blog. All old URLs will redirect to new ones. But, since the HTML code and layout of the pages are different, do you lose search engine rankings?

    “Well, search engine rankings can change when the page changes itself,” Cutts responds. “If you’re doing the 301s correctly – a permanent redirect from the old site to the new site, and if you’re doing it at a page level – so from the old page to the new page – you should be in relatively good shape, but it’s not just incoming links.”

    “It’s also the content of the page itself,” he continues. “So if you had a really good layout with a really clean design, where all the text was really easily indexed, and you move to something that was a lot more confusing, and maybe the text wasn’t as easy for us to extract, that could change your search rankings for the downside, or for the negative.”

    “In general, we’re relatively good about changing layouts and still being able to discern what that page is about, but here’s one test that you could do: as long as you haven’t done the transition yourself, if you can try making a few tests, where you can take the layout of the new page or the new site, and see if you can apply it in some very simple ways to the old site, then that’s a way to isolate those, because it’s just like any scientific experiment,” he says. “If you do two things at once, and your rankings go down, you can’t decouple what caused it. Whereas if you can change just the layout – even if it’s only on a few pages, to try out and see whether your rankings change with that, then you’ll know – was it more likely to be because of the redirects or because I was changing my HTML layout.”
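
    For reference, the page-level 301s Cutts describes are often a one-liner per URL on the server. A minimal sketch for Apache’s .htaccess (the domain and paths are hypothetical):

    # permanent, page-level redirects from the old blog platform to the new one
    Redirect 301 /blog/old-post.html http://www.example.com/posts/old-post/
    # or, when the old and new URLs map by a simple pattern:
    RedirectMatch 301 ^/blog/(.*)\.html$ http://www.example.com/posts/$1/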

    In terms of layouts, you may also do well to consider the role design plays in how Google determines quality content. Would people be comfortable giving your site their credit card info? Design can play a big role in this. Another question on Google’s list of “questions that one could use to assess the ‘quality’ of a page or an article,” is “Are the pages produced with great care and attention to detail vs. less attention to detail?” Then there’s the whole load time factor. Google does count page speed as a ranking signal.

  • Adobe Launches Creative Suite 5.5

    Today, Adobe announced a new version of its popular Creative Suite software. The company has always aimed to enable Web designers and developers to produce quality products, and the new Creative Suite 5.5 Web Premium is no exception.

    Do you use Adobe’s Creative Suite? Tell us what you like about it.

    According to Scott Fegette, Adobe’s Senior Product Manager on the Creative Suite Web team, CS5.5 “helps Web pros work more efficiently and quickly without having to go back to school and learn a whole new slew of skills or technology.”

    While CS5.5 includes updates to Adobe Flash Catalyst, Adobe Flash Builder, Adobe Device Central, and Adobe Acrobat X Pro, the biggest updates come through Adobe Dreamweaver and Adobe Flash Pro. Through CS5.5, both Dreamweaver and Flash Pro enhance the workflow and sharing experience for Web developers.

    “We really sort of tried to gear all of our work in CS5.5 just to make it much more easier for Web professionals to either transition their existing projects, sites, and applications to a multiscreen experience, or in some cases, just literally start from scratch with a green field, dedicated mobile project,” said Fegette.

    In terms of Adobe Flash Pro, CS5.5 provides a new feature called Scale Content with Stage that allows users to scale proportionately and quickly. In the past, it could take a long time for users to convert animations and movie clips. With this feature, the same action can take a matter of seconds.

    Adobe also made significant advances in HTML5, specifically its authoring tools and its support of jQuery mobile framework integration for browser-based content. In addition, it allows PhoneGap integration for the development of native and mobile apps. In other words, these improvements eliminate extra steps for developers and make their experience more seamless.

    The company has also revamped its pricing and subscription models to give customers the latest versions of their software without being locked into 18-month cycles. Adobe is giving customers both long-term and short-term options with prices starting at $89 per month.

    “The hope is that, literally, as soon as people get it installed and take a look at it, it’s just immediately going to feel like a natural extension of the skills they already learned,” Fegette added.

  • Getting Your Site Ready for TV


    If you’re not already, it’s time to start thinking about optimizing your site for televisions. With more connected devices coming out and gaining popularity (not to mention people simply hooking their computers up to their TVs), you’re going to want to have a site that is presentable on large TV screens, as well as small ones. 

    Now that Google’s own Google TV is here, the company is talking more about TV optimization itself. "Because Google TV has a fully functioning web browser built in, users can easily visit your site from their TV," says Google Developer Programs Tech Lead Maile Ohye. "Current sites should already work, but you may want to provide your users with an enhanced TV experience — what’s called the ’10-foot UI’ (user interface). They’ll be several feet away from the screen, not several inches away, and rather than a mouse on their desktop, they’ll have a remote with a keyboard and a pointing device."

    Ohye says that text should be large enough to be viewable from the sofa-to-TV distance, site navigation should be able to be performed through button arrows on the remote, selectable elements should provide a visual cue when selected (it should be obvious what sections are highlighted), etc.

    There is an entire Google TV site optimization guide here and a checklist here. I would get familiar with these. Google actually has a gallery of sites that are optimized for TV, though it’s not very big, and it’s very video-based.

    Google says you can get a general idea of what your site looks like on TV by using a large monitor, making the window size 1920 x 1080, visiting your site in a browser at full screen, zooming the browser to 1.5X the normal size, moving back, and looking at it.
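
    That same “1.5X bigger” intuition can be baked into your stylesheet. A minimal sketch (the breakpoint and sizes are illustrative assumptions, not Google’s numbers):

    /* roughly approximate the 10-foot UI on full-HD screens */
    @media screen and (min-width: 1920px) {
      body { font-size: 150%; }  /* readable from the sofa */
      a:focus, button:focus { outline: 4px solid #fc0; }  /* an obvious highlight for remote navigation */
    }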

    It’s also worth noting that people using Google TV are required to use a Google account, as Danny Sullivan points out in his review of the new Sony Google TV-ready Blu-Ray Player. Interestingly, it also asks users if they want to send usage stats to Google (this is aggregate data used for detecting bugs, according to the company, which also says it doesn’t collect any viewing history). According to Sullivan, Google still has a lot of work to do with search on Google TV.

    Is your site ready for TV? 

  • Google Launches New Preview Feature for Font Directory

    Google has launched a new feature for its font directory that lets users preview fonts and generates code to use them. The font directory contains fonts that are part of Google’s font API.

    "Now, whenever you visit the font family page of any of the fonts, you will see a link saying ‘Preview this font’ that will load your font selection into the font previewer," says Marc Tobias Kunisch with the Google Font API Team. "Here you can edit the text, change its size and line height, and add decorations and spacing among other things. You can even apply text shadow to your text."


    The preview will then generate code that you can stick in your style sheet.
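
    The generated snippet follows the standard Font API pattern: one stylesheet link, plus a font-family rule in your own CSS. A minimal sketch (the font family here is just an example):

    <link rel="stylesheet" type="text/css"
          href="http://fonts.googleapis.com/css?family=Tangerine">
    <style>
      h1 { font-family: 'Tangerine', serif; font-size: 48px; }
    </style>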

    "If you want to see the font sample without any distractions from the font previewer controls, you can do that as well simply by clicking ‘Toggle controls’ in the upper right corner," notes Kunisch. "This will show you a nice clean example of what the font would look like in your design."

    The Font API (in beta) lets users add web fonts to any web page.

  • Google Ditches Local Listings for SEOs and Designers

    As 2009 came to a close, Google managed to get SEOs riled up over one last controversial topic. For some time, SEOs and web designers have been noticing that Google has not been showing local listings in search results for queries related to their businesses – even location-specific ones.

    Should SEOs and designers be worried about local listings?
    Comment here.

    As Matt McGee mentions in a Search Engine Land piece, even a query like "candy" without any geographical indicator will bring up a seven-pack of local results, but a query for "seo" or "web design" or even something as specific as "web design vancouver" will bring up no local listings whatsoever (although the organic results still heavily favor local businesses in location-specific queries).


    Needless to say, some SEOs and designers are taking this as something of a slap in the face, justified or not. Search engine optimization and web design are both services after all, and just about every other type of service you can think of will yield local listings in a Google search.

    While this phenomenon was originally thought to be a bug, Barry Schwartz of RustyBrick fame points to a Google Maps Help thread where a Googler going by Joel H. tells a different story:

    Today, we’re intentionally showing less local results for web design / SEO queries. For example, [web design sacramento] doesn’t display local listings today. We believe this is an accurate representation of user intent. In some cases, we do show local listings, however (as NSNA/php-er noted) [web design in bellingham]. I’m sure some of you feel we should be displaying local results for queries like [Web Design Vancouver]. I understand that concern, but based on our understanding of our users, we feel this is the right decision for now.

    I’ll give the usual disclaimer that we’re constantly working on improving the user experience and results will vary over time. So, this could change in the future, but I wanted to be explicit about what we’re doing today.

    So if you use the word "in" in your query, you are more likely to get the local results. Some still have a hard time finding the logic in this move.


    "I’m all for their interest in balancing for user intent – it’s their business, their product – but I’m missing the logic here," comments Bill Sebald. 

    "I find this disturbing," says Scott Clark. "If I have a physical location in a given area, offer a service to customers in that area that is close to their query, then onebox listings should appear as they do for other creative-class industries."

    Not all SEOs have such a problem with what Google is doing though. "I want to be found by people everywhere, not just in the small city I happen to live in at the moment," a content writer comments.

    "But you would think that if people typed in a city name or other location, they are actually looking for local results and the maps could be useful," they add. "Although if you have optimized your website for your location, you should get found anyway. And I do all my work online, people don’t need to visit me or even know where I am located so in that sense the maps aren’t always useful or necessary."

    People are saying that in some countries, they are still seeing local results for the type of query in question. It is possible that Google has just not rolled out the changes everywhere yet. The quoted content writer suggests that Google may simply not understand user intent in those countries as well as it does in the countries where the change is already live.

    What do you make of Google showing less local results for SEOs and web designers? Will it hurt local businesses? Share your thoughts.

    Related Articles:

    > Google Adds Place Pages to Google Earth

    > Google Comes to Brick and Mortar Store Windows

    > Critical Local Search Factors To Pay Attention To