Sunday, May 13, 2007

Always Available High Volume Sites & Faster Performance

Architecting high-volume sites is a challenging task. Many engineering disciplines come together in creating a highly available site that users around the world can reach routinely just by typing a URL. Recently, I called the architectures of high-performance sites such as eBay, Amazon and Google modern wonders of the world. Web site performance is becoming more and more important. When downloading large files, for instance, users are often asked to select the server from which to download, and geographical proximity and bandwidth information usually guide that choice. For an ordinary user, site availability and loading speed are of paramount importance; whatever bandwidth is available, users very often treat a fast-loading site as a measure of its quality. Jacob Rosenberg points out the range of variance users may experience owing to location; a small timing sketch in the same spirit follows his numbers below.
Keynote tests of a single-site web application, Digg.com, delivered from the US West Coast to:
- San Jose, CA: 403 ms (0.4 seconds)
- New York, NY: 1,993 ms (2.0 seconds)
- Frankfurt, Germany: 3,658 ms (3.7 seconds)
- Bangalore, India: 6,224 ms (6.2 seconds)
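
Such a comparison is easy to try informally. Here is a minimal sketch, assuming Python 3 and nothing beyond its standard library, that times a single GET against a few candidate hosts (the host list is purely illustrative) and reports the quickest. One request per host keeps it simple, so the numbers are noisy, but it captures the same "pick the closest or fastest server" idea:

import time
import urllib.request

# Hypothetical candidate servers; replace with the URLs you actually want to compare.
CANDIDATES = [
    "http://digg.com/",
    "http://www.example.com/",
    "http://www.example.org/",
]

def time_fetch(url, timeout=10):
    """Return the wall-clock seconds taken to fetch the URL once."""
    start = time.time()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # pull the whole body, as a browser would
    return time.time() - start

if __name__ == "__main__":
    results = []
    for url in CANDIDATES:
        try:
            results.append((time_fetch(url), url))
        except OSError as exc:  # DNS failure, timeout, connection refused...
            print(f"{url}: failed ({exc})")
    for seconds, url in sorted(results):
        print(f"{url}: {seconds * 1000:.0f} ms")
    if results:
        print("fastest:", min(results)[1])

Running the same script from different locations would reproduce the kind of spread Keynote measured above.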
Some believe that faster-loading sites get better traffic. Yahoo's Chief Performance Yahoo!, Steve Souders, is writing a series of posts on the Yahoo! Developer Network describing best practices he has developed at Yahoo! for improving performance. He points out that 80-90% of end-user response time is spent downloading the components in the page: images, stylesheets, scripts, Flash and so on. Rather than starting with the difficult task of redesigning your application architecture, it is better to first disperse your static content. This not only achieves a bigger reduction in response times, it is also easier, thanks to content delivery networks.

A content delivery network (CDN) is a collection of web servers distributed across multiple locations to deliver content more efficiently to users. The server selected for a specific user is typically chosen by some measure of network proximity, for example the server with the fewest network hops or the one with the quickest response time.

Steve's recent presentation on High Performance Web Sites (made along with Tenni Theurer) emphasizes optimizing website performance by focusing on front-end issues. The content-rich presentation is full of data and covers the underlying principles. The golden rules of performance are listed below (a quick external check of several of them is sketched after the list):
1. Make fewer HTTP requests
2. Use a CDN
3. Add an Expires header
4. Gzip components
5. Put CSS at the top
6. Move JS to the bottom
7. Avoid CSS expressions
8. Make JS and CSS external
9. Reduce DNS lookups
10. Minify JS
11. Avoid redirects
12. Remove duplicate scripts
13. Turn off ETags
14. Make AJAX cacheable and small
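
Several of these rules can be spot-checked from the outside, just by inspecting a page's response headers and markup. The sketch below again uses only the Python 3 standard library, with digg.com purely as an example URL and deliberately crude regular-expression parsing; it looks at whether components are gzipped (rule 4), whether an Expires or Cache-Control header is set (rule 3), whether ETags are still on (rule 13), and roughly how many scripts, stylesheets and images the page pulls in (rules 1 and 8):

import gzip
import re
import urllib.request

URL = "http://digg.com/"  # example page only; substitute the site you want to check

req = urllib.request.Request(URL, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req, timeout=10) as resp:
    headers = resp.headers
    body = resp.read()

# Rule 4: gzip components
print("Content-Encoding:", headers.get("Content-Encoding", "none"))
# Rule 3: add an Expires header (or a far-future Cache-Control)
print("Expires:", headers.get("Expires", "missing"))
print("Cache-Control:", headers.get("Cache-Control", "missing"))
# Rule 13: turn off ETags
print("ETag:", headers.get("ETag", "none"))

# Rules 1 and 8: count the external components the page will request
if headers.get("Content-Encoding") == "gzip":
    body = gzip.decompress(body)
html = body.decode("utf-8", errors="replace")
scripts = re.findall(r'<script[^>]+src=', html, re.I)
styles = re.findall(r'<link[^>]+rel=["\']stylesheet', html, re.I)
images = re.findall(r'<img[^>]+src=', html, re.I)
print("external scripts:", len(scripts),
      "stylesheets:", len(styles), "images:", len(images))

It only checks the top-level HTML document, but pointing it at individual images, scripts or stylesheets gives the same header checks for each component.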

In the end, the advice boils down to:
• Focus on the front-end
• Harvest the low-hanging fruit
• User response times can be controlled

Highly recommended reading.
