Even Faster Web Sites: Performance Best Practices for Web Developers

Steve Souders

Language: English

Pages: 256

ISBN: 0596522304

Format: PDF / Kindle (mobi) / ePub


Performance is critical to the success of any web site, and yet today's web applications push browsers to their limits with increasing amounts of rich content and heavy use of Ajax. In this book, Steve Souders, web performance evangelist at Google and former Chief Performance Yahoo!, provides valuable techniques to help you optimize your site's performance.

Souders' previous book, the bestselling High Performance Web Sites, shocked the web development world by revealing that 80% of the time it takes for a web page to load is on the client side. In Even Faster Web Sites, Souders and eight expert contributors provide best practices and pragmatic advice for improving your site's performance in three critical categories:

  • JavaScript--Get advice for understanding Ajax performance, writing efficient JavaScript, creating responsive applications, loading scripts without blocking other components (see the sketch after this list), and more.
  • Network--Learn to share resources across multiple domains, reduce image size without loss of quality, and use chunked encoding to render pages faster.
  • Browser--Discover alternatives to iframes, how to simplify CSS selectors, and other techniques.
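
As a concrete illustration of the "loading scripts without blocking" point above, here is a minimal sketch of loading a script dynamically so it does not block other downloads or rendering. The URL and callback below are placeholders, not examples taken from the book:

    // Create a script element from JavaScript so the download does not block
    // other components or page rendering. The URL and callback are placeholders.
    function loadScriptAsync(src, callback) {
      var script = document.createElement('script');
      script.src = src;
      script.onload = callback; // runs after the script has been fetched and executed
      document.getElementsByTagName('head')[0].appendChild(script);
    }

    loadScriptAsync('/js/widgets.js', function () {
      // Anything defined by widgets.js is safe to use from here on.
    });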

Speed is essential for today's rich media web sites and Web 2.0 applications. With this book, you'll learn how to shave precious seconds off your sites' load times and make them respond even faster.

This book contains six guest chapters contributed by Dion Almaer, Doug Crockford, Ben Galbraith, Tony Gentilcore, Dylan Schiemann, Stoyan Stefanov, Nicole Sullivan, and Nicholas C. Zakas.

A/B Testing: The Most Powerful Way to Turn Clicks Into Customers

Shell Scripting: Expert Recipes for Linux, Bash and more

Project 2016 For Dummies

Augmented Reality: An Emerging Technologies Guide to AR

Design by Evolution: Advances in Evolutionary Design (Natural Computing Series)

Byte (March 1986)

Chapter 13, Using Iframes Sparingly, explains the downsides of iframes and offers a few alternatives. Chapter 14, Simplifying CSS Selectors, presents the theories about how complex selectors can impact performance, and then does an objective analysis to pinpoint the situations that are of most concern. The Appendix, Performance Tools, describes the tools that I recommend for analyzing web sites and discovering the most important performance improvements to work on.

Because JavaScript is a single-threaded language, only one script can run at a time per window or tab. This means that all user interaction is necessarily halted while JavaScript code is being executed. This is an important feature of browsers, since JavaScript may change the underlying page structure during its execution, possibly nullifying or altering the response to user interaction. If JavaScript code isn't carefully crafted, it's possible to freeze the web page for an extended period of time.
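
One common way to keep the page responsive during heavy work is to split a long loop into chunks and yield back to the browser between them with a timer. This is only a sketch of that idea; the items array and processItem function are hypothetical placeholders:

    // Process a large array in small chunks so the UI thread is never blocked
    // for long. items and processItem are hypothetical placeholders.
    function processInChunks(items, processItem, chunkSize) {
      var index = 0;
      function doChunk() {
        var end = Math.min(index + chunkSize, items.length);
        while (index < end) {
          processItem(items[index]);
          index++;
        }
        if (index < items.length) {
          // Yield so the browser can repaint and handle user input,
          // then continue with the next chunk.
          setTimeout(doChunk, 0);
        }
      }
      doChunk();
    }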

…significantly to the cost. The analysis of applications is closely related to the analysis of algorithms. When looking at execution time, the place where programs spend most of their time is in loops. The return on optimizing code that is executed only once is negligible; the benefits of optimizing inner loops can be significant. For example, if the cost of a loop is linear with respect to the number of iterations, we can say it is O(n), and its performance graphs as a straight line.
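
To make the inner-loop point concrete, here is a minimal sketch of reducing the cost of each iteration by hoisting invariant work out of a loop. The element id is made up for illustration:

    // Costly: the DOM lookup and the length read are repeated on every iteration.
    for (var i = 0; i < document.getElementById('list').children.length; i++) {
      document.getElementById('list').children[i].className = 'item';
    }

    // Cheaper: hoist the invariant lookups out of the loop so each iteration
    // does only the work that actually varies.
    var children = document.getElementById('list').children;
    for (var i = 0, n = children.length; i < n; i++) {
      children[i].className = 'item';
    }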

…the Failure line. This is when the user refreshes or closes the browser because the application appears to have crashed, or the browser itself produces a dialog suggesting that the application has failed and that the user should take action.

Figure 1-2. The Axes of Error

There are three ways to avoid intersecting the Axes of Error: reduce the cost of each iteration, reduce the number of iterations, or redesign the application. When loops become nested, your options are reduced. If the cost of the…
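
To illustrate the "reduce the number of iterations" option, a nested loop that matches two lists can often be replaced by a single pass over each list. This sketch uses made-up data shapes (orders and customers), not an example from the book:

    // O(n * m): for every order, scan the whole customer list.
    function attachCustomersSlow(orders, customers) {
      for (var i = 0; i < orders.length; i++) {
        for (var j = 0; j < customers.length; j++) {
          if (orders[i].customerId === customers[j].id) {
            orders[i].customer = customers[j];
          }
        }
      }
    }

    // O(n + m): build a lookup table once, then resolve each order directly.
    function attachCustomersFast(orders, customers) {
      var byId = {};
      for (var j = 0; j < customers.length; j++) {
        byId[customers[j].id] = customers[j];
      }
      for (var i = 0; i < orders.length; i++) {
        orders[i].customer = byId[orders[i].customerId];
      }
    }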

…time. To reduce this overhead, HTTP/1.1 uses persistent connections and performs multiple requests and responses using a single connection. Persistent connections are typically held open longer and thus impose a greater burden on servers that have a finite number of connections available. Hence, the recommended number of connections per server is reduced to two for HTTP/1.1. By downgrading to HTTP/1.0, AOL and Wikipedia achieve a higher level of parallel downloads, but this benefit is gained at…
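
One way to get more parallel downloads without giving up HTTP/1.1 is to shard static resources across a small number of hostnames, mapping each URL to the same host every time so the browser cache stays effective. This is only a sketch; the shard hostnames below are placeholders, not domains mentioned in the book:

    // Map each resource path deterministically to one of a few shard hostnames.
    var SHARDS = ['static1.example.com', 'static2.example.com'];

    function shardUrl(path) {
      var hash = 0;
      for (var i = 0; i < path.length; i++) {
        // Simple rolling hash kept within the number of shards.
        hash = (hash * 31 + path.charCodeAt(i)) % SHARDS.length;
      }
      return 'http://' + SHARDS[hash] + path;
    }

    var img = new Image();
    img.src = shardUrl('/images/logo.png');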
