Thursday, December 30, 2004

Great Moments In Microprocessor History :IBM Perspective

(Via Jeff Barr) IBM just published an informative article titled Great moments in microprocessor history. The microprocessor heralded the digital revolution, perhaps the greatest of all the positive revolutions the world has seen. Excerpts with edits and my comments added:

- Before the flood: The 1960s
Just a scant few years after the first laboratory integrated circuits, Fairchild Semiconductor introduced the first commercially available integrated circuit (although at almost the same time as one from Texas Instruments).
- Development explosion: The 1970s
The idea of a computer on a single chip had been described in the literature as far back as 1952 (see Resources), and more articles like this began to appear as the 1970s dawned. Finally, process had caught up to thinking, and the computer on a chip was made possible. The air was electric with the possibility.
- The first three
At the time of this writing, three groups lay claim to having been the first to put a computer on a chip: the Central Air Data Computer (CADC), the Intel® 4004, and the Texas Instruments TMS 1000.

- Early Intel: 4004, 8008, and 8080
Intel released its 4-bit all-purpose chip, the Intel 4004, in November 1971. It had a clock speed of 108 kHz and 2,300 transistors, with ports for ROM, RAM, and I/O. The chip was originally designed for use in a calculator, and Intel had to renegotiate its contract to be able to market it as a stand-alone processor. Its ISA had been inspired by the DEC PDP-8. The Intel 8008 was introduced in April 1972 and didn't make much of a splash, being more or less an 8-bit 4004. Its primary claim to fame is that its ISA -- provided by Computer Terminal Corporation (CTC), who had commissioned the chip -- was to form the basis for the 8080, as well as for the later 8086 (and hence the x86) architecture. Lesser-known Intels from this time include the nearly forgotten 4040, which added logical and compare instructions to the 4004, and the ill-fated 32-bit Intel 432.

- AMD clones the 8080
Advanced Micro Devices (AMD) was founded in 1969 by Jerry Sanders. Like so many of the people who were influential in the early days of the microprocessor (including the founders of Intel), Sanders came from Fairchild Semiconductor. AMD's business was not the creation of new products; it concentrated on making higher-quality versions of existing products under license. For example, all of its products met MILSPEC requirements no matter what the end market was. In 1975, it began selling reverse-engineered clones of the Intel 8080 processor.

- Moto 68000
In 1979, Motorola introduced the 68000. It had internal 32-bit registers and a 32-bit address space, though its bus was still 16 bits wide due to hardware costs. Although originally designed for embedded applications, its DEC PDP-11 and VAX-inspired design meant that it eventually found its way into the Apple Macintosh, Amiga, Atari, and even the original Sun Microsystems® and Silicon Graphics computers.

- A new hope: The 1990s
The 1990s dawned just a few months after most of the Communist governments of Eastern and Central Europe had rolled over and played dead; by 1991, the Cold War was officially at an end. Those high-end UNIX workstation vendors who were left standing after the "microprocessor wars" scrambled to find new, non-military markets for their wares. Luckily, the commercialization and broad adoption of the Internet in the 1990s neatly stepped in to fill the gap. For at the beginning of that decade, you couldn't run an Internet server or even properly connect to the Internet on anything but UNIX. A side effect of this was that a large number of new people were introduced to the open-standards Free Software that ran the Internet. The popularization of the Internet led to higher desktop sales as well, fueling growth in that sector. Throughout the 1990s, desktop chipmakers participated in a mad speed race to keep up with "Moore's Law" -- often neglecting other areas of their chips' architecture to pursue elusive clock rate milestones.

- The 2000s
The 2000s have come along, and it's too early yet to say what will have happened by decade's end. As Federico Faggin said, the exponential progression of Moore's law cannot continue forever. As the day nears when process will be measured in angstroms instead of nanometers, researchers are furiously experimenting with layout, materials, concepts, and process. After all, today's microprocessors are based on the same architecture and processes that were first invented 30 years ago -- something has definitely got to give. Exciting days are ahead, with more and more speed and power, and newer and newer potential uses.
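To get a rough feel for the exponential progression Faggin is talking about, here is a small back-of-the-envelope sketch of my own (not from the IBM article). It assumes a fixed two-year doubling period -- an assumption, since real doubling periods varied -- and starts from the 4004's 2,300 transistors in 1971:

# Back-of-the-envelope illustration of Moore's law.
# Assumption: transistor counts double every two years, starting from
# the Intel 4004's 2,300 transistors in 1971.

def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project a transistor count under a fixed doubling period."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001):
    print(f"{year}: roughly {projected_transistors(year):,.0f} transistors")

Running it gives about 75 million transistors for 2001, which shows just how steep the curve is after three decades -- and why the speed race of the 1990s and the scaling worries of the 2000s both follow naturally. Treat the numbers as illustrative only.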
