

Thursday, December 02, 2004

The magic that makes Google tick

The numbers alone are enough to make your eyes water.
Excerpts:

• Over four billion Web pages, each an average of 10KB, all fully indexed
• Up to 2,000 PCs in a cluster
• Over 30 clusters
• 104 interface languages including Klingon and Tagalog
• One petabyte of data in a cluster -- so much that hard disk error rates of 10^-15 begin to be a real issue
• Sustained transfer rates of 2Gbps in a cluster
• An expectation that two machines will fail every day in each of the larger clusters
• No complete system failure since February 2000
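The petabyte figure makes the error-rate point concrete. A rough back-of-envelope check (my own arithmetic, not from the article) shows why a 10^-15 bit error rate stops being negligible at this scale:

```python
# Why a 10^-15 error rate matters at petabyte scale: reading the whole
# cluster once can be expected to hit several silent bit errors.
petabyte_bits = 1e15 * 8     # one petabyte of data, expressed in bits
bit_error_rate = 1e-15       # unrecoverable read errors per bit read

expected_errors = petabyte_bits * bit_error_rate
print(expected_errors)       # about 8 expected errors per full read
```

At desktop scale the same rate would mean one error per hundred terabytes read, which is why ordinary software can pretend disks are perfect and Google's cannot.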
It is one of the largest computing projects on the planet, arguably employing more computers than any other single, fully managed system (we're not counting distributed computing projects here), some 200 computer science PhDs, and 600 other computer scientists. And it is all hidden behind a deceptively simple, white, Web page that contains a single one-line text box and a button that says Google Search.

Recently, Google's vice-president of engineering, Urs Hölzle, gave an insight to would-be Google employees into just what it takes to run an operation on such a scale, with such reliability. Google's vision is broader than most people imagine, said Hölzle: "Most people say Google is a search engine but our mission is to organise information to make it accessible." Behind that, he said, comes a vast scale of computing power based on cheap, no-name hardware that is prone to failure. There are hardware malfunctions not just once, but time and time again, many times a day. Yes, that's right, Google is built on imperfect hardware. The magic is writing software that accepts that hardware will fail, and expeditiously deals with that reality, says Hölzle.
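Hölzle's point about expecting failure can be sketched in a few lines. This is my own illustration of the general pattern, not Google's code: send a request to one of several replicas, treat a dead machine as routine, and fail over to the next.

```python
import random

def query_with_failover(replicas, request, send):
    """Try replicas in random order; return the first successful answer.

    A machine being down is treated as normal operation, not an
    exceptional event -- the request simply moves to the next replica.
    """
    for replica in random.sample(replicas, len(replicas)):
        try:
            return send(replica, request)
        except ConnectionError:
            continue  # this machine failed; expected, move on
    raise RuntimeError("all replicas failed")
```

With two machines expected to die every day per large cluster, this "route around the corpse" behaviour has to be the default path, not an error handler bolted on afterwards.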
The problem
Google indexes over four billion Web pages, using an average of 10KB per page, which comes to about 40TB. Google is asked to search this data over 1,000 times every second of every day, and typically comes back with sub-second response rates. If anything goes wrong, said Hölzle, "you can't just switch the system off and switch it back on again." The job is not helped by the nature of the Web. "In academia," said Hölzle, "the information retrieval field has been around for years, but that is for books in libraries. On the Web, content is not nicely written -- there are many different grades of quality." Some pages, he noted, may not even have text. "You may think we don't need to know about those but that's not true -- it may be the home page of a very large company where the Webmaster decided to have everything graphical. The company name may not even appear on the page."
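Sub-second answers over 40TB only make sense if no single machine ever scans the whole corpus. A toy sketch of the standard approach (my own illustration; it assumes nothing about Google's real index format): partition the inverted index into shards, query them concurrently, and merge the best-scoring hits.

```python
from concurrent.futures import ThreadPoolExecutor

def search_sharded(shards, term, top_k=10):
    """Query every index shard in parallel and merge the top hits.

    Each shard is modelled as a dict mapping a term to a list of
    (doc_id, score) postings -- a stand-in for a real inverted index.
    """
    with ThreadPoolExecutor(max_workers=len(shards)) as pool:
        partials = pool.map(lambda shard: shard.get(term, []), shards)
    hits = [hit for part in partials for hit in part]
    # Sort merged hits by score, highest first, and keep the best few.
    return sorted(hits, key=lambda hit: hit[1], reverse=True)[:top_k]
```

Because each shard holds only a slice of the data, query latency is set by the slowest shard rather than by the total index size, and adding machines adds capacity almost linearly.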
Sadagopan's Weblog on Emerging Technologies, Trends,Thoughts, Ideas & Cyberworld
"All views expressed are my personal views are not related in any way to my employer"