


Sunday, October 24, 2004

Efforts Are Under Way To Create A Computer The Size Of The World via The Economist

The Economist writes that the stated goal of grid computing is to create a worldwide network of computers interconnected so well and so fast that they act as one. The term is mostly used to describe rather mundane improvements that allow companies to manage their workload more flexibly by tapping into idle time on their computers.

Physicists' demand for computing power is being spurred by the flood of data that will pour out of the Large Hadron Collider (LHC), the next-generation particle smasher due to start operation in 2007 at CERN, the European particle physics laboratory near Geneva. This machine will produce some 15 petabytes (millions of billions of bytes) of data a year, or the equivalent of about 3m DVDs, which physicists will store and sift through for at least a couple of decades in search of those few rare collisions where exotic new particles are created. To put this in perspective, current estimates of the annual production of information on the planet are on the order of a few thousand petabytes, so the LHC will be producing nearly 1% of that total. Some 100,000 of today's fastest personal computers—with accompanying bits and bobs such as tape and disk storage and high-speed networking equipment—will be needed to analyse all this data.
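The "about 3m DVDs" figure is easy to sanity-check with a bit of arithmetic (a quick sketch, assuming single-layer 4.7GB DVDs and decimal petabytes):

```python
# Sanity-check the Economist's figure: 15 PB/year expressed as DVDs.
petabyte = 10**15              # bytes (decimal definition)
lhc_data_per_year = 15 * petabyte
dvd_capacity = 4.7 * 10**9     # bytes, single-layer DVD

dvds = lhc_data_per_year / dvd_capacity
print(round(dvds / 10**6, 1))  # ≈ 3.2 million DVDs per year
```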
The decision to build a distributed computing system to deal with this deluge of data predates the hype about grid technology and is purely pragmatic: it would be difficult to fund the necessary computational power and storage capacity if it were concentrated on one site. If, on the other hand, the computations are distributed among the hundreds of institutes worldwide that are involved in the LHC, each institute can tap into national or regional funding sources to raise cash, spreading the pain. The LHC Computing Grid (LCG) project now involves some 80 computing centres in 25 countries contributing over 7,000 computers, and is reckoned to be the biggest—and most global—computing grid around.

Not all problems are best solved using the distributed clusters that underpin grids. True supercomputers are irreplaceable for some scientific problems, such as weather forecasting, where many processors must communicate frequently with one another. At the other extreme, scavenging spare computer power from personal computers on the internet is proving an increasingly effective approach for problems that can be split into a large number of small, independent parts.
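The "many small, independent parts" pattern can be sketched in a few lines: each chunk of data is analysed on its own, with no communication between workers, so the work parallelizes trivially. This is an illustrative toy only—the LCG and internet-scavenging projects use dedicated grid middleware, not Python's `multiprocessing`, and `analyse_chunk` here is a made-up stand-in for real event analysis.

```python
# Toy illustration of an embarrassingly parallel workload: chunks are
# independent, so they can be farmed out to any set of idle machines.
from multiprocessing import Pool

def analyse_chunk(chunk):
    # Stand-in for per-event analysis: count "interesting" values.
    return sum(1 for x in chunk if x % 7 == 0)

if __name__ == "__main__":
    data = list(range(1000))
    # Split into independent chunks -- no worker ever needs another's data.
    chunks = [data[i:i + 100] for i in range(0, len(data), 100)]
    with Pool(4) as pool:
        results = pool.map(analyse_chunk, chunks)
    print(sum(results))  # 143 multiples of 7 in 0..999
```

Because no chunk depends on another, adding machines speeds things up almost linearly—exactly why this style suits scavenged idle PCs, and why tightly coupled problems like weather forecasting do not fit it.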

Sadagopan's Weblog on Emerging Technologies, Trends, Thoughts, Ideas & Cyberworld
"All views expressed are my personal views are not related in any way to my employer"