Cloud, Digital, SaaS, Enterprise 2.0, Enterprise Software, CIO, Social Media, Mobility, Trends, Markets, Thoughts, Technologies, Outsourcing

Monday, November 08, 2004

Putting Computing Power on Tap - Part 1

BusinessWeek writes: "Running corporate servers is still nowhere near as easy as turning on a faucet. Now lots of companies are working hard to change that." Excerpts:

Some of mankind's greatest accomplishments have been all about eliminating complexity. Take water. For thousands of years, humans have been perfecting ways to let people easily grab a drink, regardless of how many miles of aqueducts, how many reservoirs, or how much water treatment are needed to make it possible. And then there's electricity. Inventing the light bulb was nice. But the truly miraculous creation is the gigantic infrastructure of power plants and grids that allows consumers to merely flip a switch to turn on that light bulb.

But running computer networks is not easy. CTOs aren't removed from the computing process; they still have to worry about the day-to-day intricacies of using technology. It's a thankless job burdened with considerable expense. Companies around the world will spend $95 billion in 2004 just to maintain their server computers -- 80% more than they'll spend to actually buy servers over the entire year. Even the simple task of reassigning servers from one job to another takes a lot of grunt work. Imagine this scenario: an online promotion takes off at a retailing Web site, and its tech manager wants to assign more computing power to handle the spike of shoppers rushing to the site. "Today, it's about as easy as moving a family into a new house," says Jay Kidd, chief technology officer of Brocade Communications Systems. "It would probably take more than a day. And meanwhile, your customers are waiting." Computer execs know that's a problem. Even during the dot-com bubble, visionaries like Sun Microsystems CEO Scott McNealy and Netscape Communications co-founder Marc Andreessen were talking about the need to make computer networks more like utility networks such as the power grid. While the industry has made progress in spots, it hasn't gone very far toward the ultimate goal of making computing equipment as easy to use as a light switch -- what has come to be known as "utility computing."

Computer makers have created servers and management software designed to let them run more easily. The same goes for makers of data-storage gear, software, and even computer networking equipment. "There's a holy war going on," says Jason Donahue, chief executive of software maker Meiosys, one of the many small startups trying to solve the utility-computing problem. "All of these groups are developing [utility computing] technology, but there's nothing unifying it." Making computer networks run like a utility is a daunting task. For utility computing to work, servers, storage gear, networking equipment, and the software that manages it all have to work like the four legs of a stool: if one leg isn't cooperating with the others, the whole thing doesn't work as well as it should. Analysts say a transition to true utility computing, where tech managers or even consumers can assign the right amount of computing power to a particular task whenever they want, could take another decade or more.
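The "assign the right amount of computing power to a particular task whenever they want" idea can be sketched in a few lines of code. This is purely an illustrative toy -- the ServerPool class and all its names are invented for this post, not any vendor's actual product or API:

```python
# A toy sketch of the "utility computing" idea: a shared pool of servers
# that can be reassigned to whichever workload needs capacity, instead of
# each workload owning fixed machines that sit idle the rest of the time.

class ServerPool:
    def __init__(self, total_servers):
        self.free = total_servers     # unassigned machines in the pool
        self.assigned = {}            # workload name -> server count

    def demand_changed(self, workload, servers_needed):
        """Grow or shrink a workload's allocation to match demand."""
        current = self.assigned.get(workload, 0)
        delta = servers_needed - current
        if delta > 0:
            granted = min(delta, self.free)   # can't exceed the pool
            self.free -= granted
            self.assigned[workload] = current + granted
        else:
            self.free += -delta               # return surplus to the pool
            self.assigned[workload] = servers_needed
        return self.assigned[workload]

pool = ServerPool(total_servers=20)
pool.demand_changed("web-store", 4)    # steady-state traffic
pool.demand_changed("web-store", 12)   # promotion takes off: scale up
pool.demand_changed("web-store", 4)    # spike over: servers go back
```

The hard part, of course, is everything this toy leaves out -- which is exactly the point the article makes: in real systems the storage, networking, and management software all have to cooperate for a reassignment like this to take minutes instead of days.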
Here's another scenario: a company that makes water boilers needs to find the best price on copper piping to fulfill a big order, and it needs to get the job done fast. The company starts by reassigning servers to run e-commerce software that checks availability and price with various suppliers. That's just the start. Those newly assigned servers have to tap into the data banks where pricing information is stored. This requires a fast connection to the computer networks of suppliers so queries don't end up in a slow-moving queue. "This all has to happen quickly, because if you can't get the parts from one supplier, you need to go somewhere else. If you only have three days of inventory left and it takes a day to find out who's got more parts, you're running out of time."

The good news is that the work necessary for true utility computing is already being done. Computer makers like IBM, HP and Sun have unveiled "virtualization" schemes that let companies manage hundreds or thousands of computers like one giant machine. Using software to manage all that gear, they can quickly shift work from machine to machine, which allows some of their corporate customers to get more out of the equipment they own. In some cases, server utilization has jumped from a crummy 15% to 60% or more.

Progress is also being made in the storage leg of the utility stool. Ten years ago, almost all data were stored inside servers, making info largely unavailable to other machines. Since then, most companies have created storage networks that separate the data from the servers. Besides making the data available to more than one server, this also lets companies reduce the amount of unused space in storage drives.
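The utilization jump from 15% to 60% is worth pausing on, because the consolidation math is dramatic. The function below is just back-of-the-envelope arithmetic (the work-unit figures are made up for illustration; only the utilization percentages come from the article):

```python
import math

# If virtualization lifts average server utilization from 15% to 60%,
# the same amount of useful work needs roughly a quarter of the machines.

def servers_needed(total_work_units, capacity_per_server, utilization):
    """Machines required when each server runs at only `utilization`."""
    effective = capacity_per_server * utilization
    return math.ceil(total_work_units / effective)

# Hypothetical workload: 600 work units, 100 units of capacity per server.
before = servers_needed(600, 100, 0.15)   # fleet size at 15% utilization
after = servers_needed(600, 100, 0.60)    # same work at 60% utilization
```

At those numbers the fleet shrinks fourfold, which is why the vendors pitch virtualization as getting "more out of the equipment they own" rather than buying more of it.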
(Part II shall follow)

Sadagopan's Weblog on Emerging Technologies, Trends,Thoughts, Ideas & Cyberworld
"All views expressed are my personal views are not related in any way to my employer"