(Via McKinsey Quarterly). Well-known consultants John Hagel and John Seely Brown write that Western companies think too narrowly about the emerging world. If they aren't careful, they may end up as defenders, not attackers. In Part 1 we saw that emerging markets are generating a wave of disruptive product and process innovations that are helping established companies and a new generation of entrepreneurs to achieve new price-performance levels for a range of globally traded goods and services. In this second and concluding part, we shall see some examples and the implied approaches and principles that are changing the nature of competition. Excerpts with heavy edits and my comments added:
The new models to follow:
- Chinese motorcycle makers, by embracing new production processes, have made rapid gains in motorcycle export markets, especially in Africa and Southeast Asia, and now account for 50 percent of all global production of motorcycles. The average export price of Chinese models has dropped from $700 in the late 1990s (already several hundred dollars less than the cost of equivalent Japanese models) to under $200 in 2002. The impact on rivals has been brutal: Honda's share of Vietnam's motorcycle market, for instance, dropped from nearly 90 percent in 1997 to 30 percent in 2002. Japanese companies complain about the "stealing" of their designs, but the Chinese have redefined product architectures, in ways that go well beyond copying, by encouraging significant local innovation at the component and subsystem level.
- Cummins, the producer of diesel engines and power generators, recently showed what serving the mass market takes in India. By modularizing a product for the distinct needs of different kinds of customers and channel partners, the company cut the total cost of ownership and of sales in the channel. The result: higher demand for Cummins products. The new genset engines have been an unqualified success in India, where Cummins has won 40 percent of the market over the past three years. Genset sales now account for 25 percent of the company's total power-generation sales there. Despite the much lower unit prices of the new range, its net profitability is comparable to that of the high end. Exports began in 2002 to other parts of Asia and were later extended to Africa, Latin America, and the Middle East. Innovation in emerging markets won't be limited to manufactured goods.
- The desire to reach vast low-income segments of Asia's population is also pushing service organizations to new levels of achievement. One vivid example comes from the Aravind Eye Care System, at Madurai, in the south Indian state of Tamil Nadu. The Aravind system—dedicated to eradicating "needless blindness by providing appropriate, compassionate, and high-quality eye care for all"—includes a chain of hospitals and a manufacturing center for sutures, synthetic lenses, and eye pharmaceuticals. Aravind hospitals perform 200,000 operations a year—nearly 45 percent of all such operations in Tamil Nadu and 5 percent of those throughout India. High volumes are dictated by the affliction's scale and by the need to make the network's nonprofit hospitals viable and to generate funds for expansion. Over the years, Aravind has carefully honed the flow of work through its outpatient departments and surgical wards—and both have reached impressive levels of efficiency.
The implications for Western companies - These models of innovation spell out a clear message for many companies in the developed world: if you're not participating in the mass-market segment of emerging economies, you're not developing the capabilities you will need to compete back home.
- The first recommendation to Western companies is therefore to go offshore: not just to the affluent segments, and not just for wage-cost differentials, but to serve the mass market. Only there will you be forced to innovate in the ways required to succeed in the future. The recommendations that follow build on this basic idea.
- A broader view of innovation that values the role of incremental change communicates the power of bootstrapping. Companies that start out with limited capabilities, such as those in many developing economies, can rapidly build them over time through a series of modest process and product innovations. Ultimately, individual innovations may matter less than the institutional capacity to sustain a rapid series of improvements and the pace at which they are developed and disseminated through the network. The principles and examples are very illuminating, but Asia (nay, India) may need to do more to change the rules of the game decisively across all shades of the spectrum. A recent article in The Economist says that in recent years the Chinese industry, always seen as a predator and never as a victim, has gone through wrenching changes that America's textile lobby could scarcely contemplate. Its textiles sector has shed almost 2 million workers since 1995, and employment in the clothing industry has levelled off as productivity has soared. China will have to wait for its fair share of the global market. It has already received more than its quota of rich-world protectionism.
In the days of the rudimentary pistol, unlucky shooters were now and then hurt when unburned gunpowder escaped backward toward their faces. They came to describe this unpleasant experience as "blowback," a term that has subsequently gained wider application in military affairs—to any event that turns on its maker. Blowback is an apt term for the unexpected consequences of the investments that Western companies have made in emerging markets. Here, the thinking goes, companies can expect to harvest the fruits of the R&D and innovation skills painstakingly developed in their home countries. That view is dangerously complacent. The very presence of Western intruders and the competition they create have inspired the emerging world's companies to raise their game in response. Far from being easy targets for exploitation, emerging markets are generating a wave of disruptive product and process innovations that are helping established companies and a new generation of entrepreneurs to achieve new price-performance levels for a range of globally traded goods and services. Eventually, such companies may capture significant market share in Europe and the United States. Farsighted vanguard Western companies are not only acquiring key capabilities by serving low-income customers in emerging markets but also preparing to use that experience to attack the growing value segments of developed markets.
Two powerful factors are converging to transform them into catalysts of this kind. One is the low incomes of consumers in China and India—a total of 457 million households in 2002, with an average annual income of less than $6,000. The other is the spending behavior of this immense group of consumers, who, by Western standards, are unusually youthful, demanding, open-minded, and adventurous. These demographics and consumer traits pose a stern test. To penetrate this vast market, companies must charge prices that the majority of its consumers can afford. Mobile technology demonstrates both the opportunity and the challenge. China and India, thanks to their army of early adopters, have become two of the world's largest markets for mobile phones. But these markets differ from Western ones in important ways. Experts say the cost of equipment for mobile-telephone networks must fall by a factor of five for the technology to succeed in the Indian market. Pricing for mobile-network operators must also be restructured, with smaller up-front license fees and more emphasis on performance-based payments. Established technology vendors such as Nokia or Sony Ericsson must decide whether products designed for more developed countries will succeed if merely adapted for Asia's emerging markets, or whether a radically new approach to product and process design is required. A growing number of such companies now acknowledge that going back to the drawing board is the only choice in Asia.
Part II shall follow shortly.
Now, after years of unmet expectations, the pieces may finally be in place for cellphone games and other mobile-content services to catch on. Carriers are investing in higher-speed networks. Cellphone makers are selling flashier handsets. Simpler systems are in place to charge users for content. "Year 2004 was the year that the broader industry woke up to the idea of mobile content," says Mitch Lasky, chief executive of Los Angeles-based Jamdat. "We had to overcome a lot of skepticism early on." Fancy Phones, Fast Networks - There was a time when cellphones were, quite simply, phones. But technology improvements have transformed them into all-purpose communication and entertainment devices. With the push of a button, users can pay to play Tetris, read news headlines, or subscribe to a service that provides driving directions. Soon, they'll be able to do more. Handset makers are releasing cellphones in the U.S. with more vivid color screens, higher-quality sound, and more memory, and it is becoming easier to bill consumers directly through the cellphone. Samsung Electronics Co.'s VM-A680 model, which features high-tech polyphonic ringtones and built-in video-recording capabilities, sells for as little as $79.99 on Amazon.com.
Several U.S. carriers, including Sprint Corp. and Cingular Wireless, this month announced plans to upgrade their wireless networks, allowing people to connect to the Internet through their cellphones at faster speeds. Verizon Wireless, a joint venture of Verizon Communications Inc. and Vodafone Group PLC, has begun offering a high-speed network in some markets and plans to expand that service next year. The use of mobile content is greater in Asia and Europe, but the U.S. market is growing rapidly. Boston-based Yankee Group expects U.S. sales of wireless data, such as text messaging, games, and ringtones, to nearly double by 2006.
Merger Mania in Mobile Content - Start-ups are looking to capitalize on the interest in mobile content. Some, like Jamdat, focus on original content. Others partner with companies to distribute branded content, such as games based on movies, to cellphone carriers. Typically, services cost a few dollars for one-time use, and less than $10 for a monthly subscription. Cellphone carriers take a cut of the sales -- as little as 10% and as much as 75% -- then pass the rest on to the content distributors. As interest in mobile content rose this year, the market experienced a flurry of consolidation. Among the dozens of acquisitions, cellphone-game maker Sorrent Inc. announced this month that it will acquire London-based Macrospace Ltd., on the same day that Bellevue, Wash.-based InfoSpace Inc. announced its purchase of Iomo Ltd., a U.K. publisher of mobile games. The industry, executives say, is better suited to a few large players than to many small ones. One reason is that cellphone technology is still not standardized, which means companies must ensure their products work with hundreds of different handsets, networks, and languages -- an undertaking that is often cost-prohibitive for small companies. Also, mobile-content providers rely on cellphone carriers to pick up their products. The size of a cellphone's screen limits the amount of content that users can scroll through, so content providers must fight for the attention of carriers, who control the top spots on cellphone menus. Most carriers want to deal with fewer providers to cut costs and avoid logistical troubles. "They increase their profit margins partially by reducing the number of relationships," says Lewis Ward, an analyst at research firm IDC. Mobile-content firms may tout cellphones as mini-PCs, but to many, they're still just phones. Ringtones and simple games still make up the vast majority of mobile-content sales, and some users don't even realize they can download content onto their cellphones.
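The revenue split quoted above is simple arithmetic, but it is worth seeing how wide the range really is. A minimal sketch: only the 10%-75% carrier-cut range comes from the article; the $2.99 price point and the function name are illustrative assumptions.

```python
# Hypothetical sketch of the carrier/provider revenue split described above.
# Only the 10%-75% carrier-cut range is from the article; the $2.99 price
# point is an illustrative assumption.
def provider_revenue(sale_price, carrier_cut):
    """Content provider's share of one mobile-content sale after the carrier's cut."""
    return sale_price * (1 - carrier_cut)

# A $2.99 game download under the two extremes quoted above:
print(round(provider_revenue(2.99, 0.10), 2))  # 2.69 -- carrier takes 10%
print(round(provider_revenue(2.99, 0.75), 2))  # 0.75 -- carrier takes 75%
```

At the high end of the carrier's cut, the provider keeps barely a quarter of each sale, which helps explain why scale and consolidation matter so much in this market.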
A crew of roving salespeople and consultants from an enterprise software company could surprise IT departments that labor under broken-down legacy systems -- and customize those systems to perfection, sending a horde of skilled developers to fulfill every desire. The VP of sales wants the CRM system to do something it has never done before? It's all about customization, so it's done. The CFO wants to operate accounts payable from a Wi-Fi-enabled Pocket PC? No problem -- you want it, it can be done, so presto, it's done. In the heat of the moment, no one thinks about what these one-of-a-kind customizations are going to mean down the road. More often than not, they end in frustration. The consultants finish their custom work, and, to maintain the system, existing staff must fully understand the customizations -- which is difficult and rare -- or else the consulting sun never sets. When the customizations get really close to the software metal, even worse things can happen. I've seen cases in which enterprise solutions have undergone such intense customization that the vendor who sold the original software can't even upgrade the system without consulting engagements that are more expensive than the initial purchase.
Software expert Robert L. Glass notes in his excellent book Facts and Fallacies of Software Engineering that every 25-percent increase in problem complexity results in a 100-percent increase in the complexity of the solution. Smart IT shops should limit unneeded complexity at every turn, choose their customizations carefully, and turn a deaf ear to the siren song of the perfectly customized solution. Remember, when a solution is truly "yours," it can end up being "yours" in the worst way possible: your problem. My take: No wonder that in established IT setups close to 70 percent of spending goes toward maintenance, infrastructure, and resource costs. The whole movement toward packaged software hinges on this, besides the ability to embed superior processes and deploy COTS applications faster. Unmindful customizations, owing to limited perspectives and stakeholder traps, increase the cost of maintenance and upgrades quite dramatically.
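Glass's rule of thumb can be turned into a rough scaling law: if a 25 percent increase in problem complexity doubles solution complexity, then solution complexity grows as roughly the 3.1st power of problem complexity. A minimal sketch, with the function name my own:

```python
import math

# Glass's rule: a 25% rise in problem complexity doubles solution complexity.
# Solving 2 = 1.25**k gives k = ln(2)/ln(1.25) ~= 3.1, so solution complexity
# scales as roughly the 3.1st power of problem complexity.
EXPONENT = math.log(2) / math.log(1.25)

def solution_complexity(problem_growth):
    """Relative growth in solution complexity for a given relative growth in problem complexity."""
    return problem_growth ** EXPONENT

print(round(solution_complexity(1.25), 2))  # 2.0  -- 25% harder problem, 2x the solution
print(round(solution_complexity(2.0), 1))   # ~8.6 -- double the problem, nearly 9x the solution
```

Which is the point: modest-looking increments of customization compound into disproportionately complex systems to maintain.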
With the recent tsunami that hit Asia, and considering the magnitude of the damage, there is a lot of talk about tsunami warning systems and global cooperation. We don't need governments and huge sensor arrays to warn people on the beach about the next huge wave approaching at 400 miles per hour. Thanks to the Internet, we can probably do it by ourselves.
What we need is a tsunami warning system not just for parts of Asia, but for anywhere in the world that might be subject to such conditions. And the decision about which beaches to protect ought to come not from Washington, D.C., or Jakarta, or any other capital city, but from the beach people themselves. If you are concerned about a giant tidal wave taking out your village, it might be a good idea to build your own warning system. It can be done. The Tsunami Warning System (TWS) in the Pacific Ocean shows us how such a warning system can be run with the cooperation of 26 countries. TWS is based on crunching two kinds of data -- seismic activity and changes in sea level measured by tide gauges. Most tsunamis begin with an earthquake, the severity and epicenter of which can tell a lot about whether a tsunami is likely, how strong it will be, and in what direction it is likely to go. From the TWS, the first warning is based purely on such seismic data. But once the big wave starts rolling it will have an effect on the level of the sea itself, which is routinely monitored by weather stations of many types. This additional data gives a better idea of how bad the wave is really going to be, so in the TWS system it is used to justify expanding the warning to other communities beyond those warned purely on the basis of seismic data.
Depending on where the originating earthquake is, the tsunami can be minutes or hours from crashing into a beach. This week's wave took about 90 minutes to reach Sri Lanka, just over 600 miles from the epicenter. That not only means the wave was traveling at over 400 miles per hour; it also means that, had a warning system been in place, there would easily have been time to get the people who were affected in Sri Lanka to higher ground. So to start, we need raw seismic data. Thanks to the Pacific Northwest Seismograph Network, here is one place where you can find real-time data from 199 seismographs around the world. There are also links to a dozen regional operations that consolidate such data. The data is available. Tide-gauge data is available too, though there is less of it and aggregation will require more effort, so let's just stick to seismic data for our warning system.
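The arithmetic behind that lead time is trivial to check, using the figures from the paragraph above:

```python
# Sanity-check the wave-speed figure quoted above.
distance_miles = 600    # epicenter to Sri Lanka, approximately
travel_minutes = 90     # reported travel time of the wave

speed_mph = distance_miles / (travel_minutes / 60)
print(speed_mph)  # 400.0 -- the open-ocean speed cited in the article
```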
Here's where we need the help of a tsunami expert, someone who can help us calculate the size and direction of a likely tsunami based on the available seismic data. Fortunately, quite a bit of work has been done in this area of study, and appropriate computer codes that can be run on a personal computer either exist or can be derived, perhaps by retrospectively evaluating seismic data from known tsunami events. But remember that what we care about here is not global tsunami warning but LOCAL tsunami warning, so the required seismic data sources can pretty easily be limited to those with an uninterrupted aspect of the target beach, which means half a dozen seismographs, not 199.
The seismographs are online; we gather the data using XML and continuously crunch it using the codes that already exist; then we need the warning, which could be flashed on the screen of the PC. Looking just like a TV weather map, a PC widget such as Konfabulator would flash a warning and even include a countdown timer, just like in the movies.
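The loop described above can be sketched in a few lines. Everything below is illustrative: the station names, the magnitude threshold, and the simple speed-based countdown stand in for the real propagation codes a tsunami expert would supply.

```python
# Hedged sketch of the local warning loop: poll seismic readings, trigger on a
# large quake, and estimate a countdown from distance and open-ocean wave speed.
TSUNAMI_SPEED_MPH = 400       # open-ocean speed cited earlier in the article
MAGNITUDE_THRESHOLD = 7.0     # illustrative trigger; a real system would use validated models

def check_event(magnitude, epicenter_distance_miles):
    """Return a warning string with an estimated countdown, or None if no alert is needed."""
    if magnitude < MAGNITUDE_THRESHOLD:
        return None
    minutes_to_impact = epicenter_distance_miles / TSUNAMI_SPEED_MPH * 60
    return f"TSUNAMI WARNING: est. {minutes_to_impact:.0f} minutes to impact"

# Simulated readings (station id, magnitude, distance from our beach in miles);
# a real widget would parse these from the seismograph network's XML feeds.
readings = [("stn-A", 4.2, 350), ("stn-B", 9.0, 600)]
for station, magnitude, distance in readings:
    alert = check_event(magnitude, distance)
    if alert:
        print(f"{station}: {alert}")
```

This mirrors the 90-minute lead time Sri Lanka would have had; even a crude countdown of that length is enough to clear a beach.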
No international consortium and no broadband are needed to build such a local tsunami warning system. The data is available, and processing power is abundant and cheap. With local effort, there is no reason why every populated beach on earth can't have a practical tsunami warning system up and running a month from now. That's Internet time for you, but in this case its application can protect friends everywhere from senseless and easily avoidable death.
Amazing. This should be tried out immediately. Essentially, decentralized processing using powerful personal computers, the ability to process significant amounts of data locally, and the availability of the huge mass of data already being collected form the basis on which this framework would work. Let some of the funds being raised for tsunami relief be routed toward piloting this tsunami warning framework. Scobleizer wrote earlier today that Bill and Melinda Gates are writing checks for three million dollars for tsunami relief; maybe they can take the lead in driving this pilot, or perhaps the Indian service providers like TCS, Infosys, and Wipro, or the Japanese and Korean majors (maybe there is a commercialization opportunity down the line: a specialized turnkey offering for most of the beaches in the world!), all of whom have benefited significantly from advancements in the technology world. This idea really requires serious consideration.
The Korean Information Security Agency says that in the first 10 months of 2004 it received 244,151 complaints about unsolicited text messages and calls to mobile phones, seven times more than in 2003, compared with just 78,063 complaints about spam e-mail. A KISA official went on to tell The Korea Times that junk mail to phones now outnumbers e-mail spam, a trend that's expected to continue. Mobile spam in South Korea takes two forms: the familiar unwanted text messages, and automated voice calls. Blocking the unwanted voice calls can be problematic, as firms using them aren't required to comply with users' demands to be removed from their lists. There is pending legislation, however, that would outlaw all automated calls "that fail to obtain prior agreement," a move of the sort that hasn't stopped too many e-mail spammers. It's hard to tell whether mobile spam really outnumbers e-mail spam in South Korea from the number of complaints it generates. A likely scenario is that users are more resigned to e-mail spam and have better technology to deal with it, and that mobile spam is far more intrusive, offensive, and costly than spam on the desktop.
Takeaway: Operators and service providers must protect their users from spam; otherwise mobile messaging will be rendered useless. Likewise, marketers legitimately using text messaging and other mobile services must carefully orchestrate their campaigns, making sure they're sending them only to users who have asked for them, and also making it easy to stop receiving the messages. I agree with the data and the recommendation. During my recent visit to Korea, 90 percent of the messages I received on my mobile were junk, and service providers and legislation can prevent spam, as countries like Singapore have shown!
Caterina Fake and Stewart Butterfield, founders of Flickr.com, tell Fast Company that Flickr has grown from 0 to 150,000 users while still in beta and with zero dollars ($0.00) in marketing investment. This growth has been entirely organic, based on word of mouth, blog postings, and positive press. We have quickly created the largest and best-organized online photo library in existence, with 1.8 million images, of which 81% are public and 85% have some human-added metadata. What this means is you can find photographs of anything that strikes your fancy. Vintage cars, butterflies, Parisian graffiti, Halloween costumes -- you can get lost for hours exploring Flickr. Flickr has also provided a place for people to both express themselves and be a part of a larger community by creating self-organizing photo albums that they can share privately with friends or family, or with the world; they can curate collections of other members' photos and create dynamically assembled galleries. People all over the world upload photos from the war in Iraq, from the middle of the Florida hurricanes, from the polling booth, from the midst of the revolution in Ukraine. Flickr is infinitely shareable and easily searchable. As Paul Allen points out, the fastest-growing sites are built on user-generated content.
Paul writes that one of the most powerful ways to develop web site traffic is to enable users to share their content through your web site with others: to create community around user-generated content. Many of the fastest-growing web sites of all time did this (or do it now): MyFamily.com, eBay, GeoCities, Xoom, Homestead, MySpace, Epinions, Hotshots, LinkedIn.com, Meetup.com, Friendster, and more. If sites are used to get customers to blog, use message boards, or upload photos or reviews, the effect can be dazzling. With open source software (for message boards, blogs, uploading photos, and more) and with the cost of hard drive storage a tiny fraction of what it was five years ago, the time has never been better to try a user-generated content strategy.
We recently covered how the fibre infrastructure survived and outsourcing data traffic was not affected. Infoworld reports that despite the tsunamis, outsourcers are staying in Chennai. Chennai is the closest competitor to Bangalore as a mainstay outsourcing center in India. All the outsourcing biggies (Infosys, TCS, Wipro, Satyam) have a major presence in Chennai and have reported complete normalcy in operations, without any disruptions. Infoworld adds that Wipro Ltd. in Bangalore, for example, uses its facility in Chennai as one of two backup sites in the country for its own corporate data as well as customer data from its facilities across India. Its other backup site is in Bangalore. "We have no plans to move the backup facility from Chennai, because the facility is sufficiently inland and not at risk," said a spokesman for the company. For customers who want the service, Wipro also offers backup of their data at centers abroad, the spokesman added. The Marina beach at Chennai was hit by the tsunamis, and a large number of people were killed there. But the largest impact of the tsunamis was on other coastal areas of eastern India, where the primary economic activities are fishing and tourism. A number of Indian outsourcing companies, including Wipro, Infosys Technologies Ltd., and Tata Consultancy Services Ltd., have major software development operations in Chennai.
Infosys does not have plans to move its software development facility out of Chennai in the wake of the devastation by the tsunamis, according to a spokeswoman of the company. The facility is about five miles from the coast, and there was no damage or water seepage in the center, according to a statement from Infosys. The company's disaster recovery facility is inland in Mauritius, the spokeswoman said. Most other companies with facilities in Chennai have indicated they will continue to operate in the city.
The coming 12 months will bring a continued focus on security, app dev tools, on-demand, storage, and desktop search. 2004 closed with a veritable blizzard of mergers and a downpour of desktop search offerings - events and products that may well dominate IT managers’ thoughts well into the new year. With analyst companies such as IDC and Forrester Research predicting an increase in IT spending, 2005 will also witness additional developments in operating systems, SOAs (service-oriented architectures), on-demand computing, storage, open source, and - of ever increasing importance - security.
Setting up the defense - Security: Microsoft is expected to announce a series of new products and an increased emphasis on protection in its current product line. Meanwhile, traditional security vendors will have plenty to offer in 2005, and networking and storage companies will continue to integrate security features into their lineups. Cisco’s year-end purchase of Protego, a network edge security appliance company, presages the direction the industry will take, with consolidation likely to continue, as point products get swooped up by larger security vendors.
App dev matters - Microsoft is also expected to ship next-generation upgrades to its Visual Studio developer toolbox and to its SQL Server 2005 database. Code-named Yukon and talked about as far back as 2001, SQL Server 2005 will feature enhancements in BI, database administration, development, and security. Visual Studio 2005, code-named Whidbey, will come in a new flavor, Team System, which features application lifecycle management and collaborative development. Competing with Microsoft, Sun Microsystems plans to upgrade its Java Studio Creator tool, with additional application server support planned for spring.
Waiting for Longhorn - A beta of Longhorn is expected by the end of 2005, with improved security capabilities; a built-in Web services architecture, Indigo; and a brand-new graphics subsystem, code-named Avalon. The overall market revenue for desktops, servers, and packaged software running on Linux will top $35 billion by 2008, according to IDC. Linux-compatible packaged software is expected to reach $14 billion in that same time frame.
In the emerging open source database market - MySQL in 2005 plans to ship Version 5.0 of its MySQL database, with stored-procedure and trigger capabilities. Enterprise-level commercial database companies have declared their open source rivals not ready for the enterprise, but MySQL is looking to change that.
On-demand and SOAs - A technology trend that will take an even deeper hold in the enterprise during the course of this year is the "dynamic IT" environment. "There will be a new focus on a new foundation in IT, which the vendors call 'on-demand' or 'adaptive.' It's about the ability to apply flexible approaches based on things like SOAs, Web services, virtualization, and standard components. It is this technical foundation underneath the enterprise that will be the driver for change," said Frank Gens, senior vice president of research at IDC.
SOA - BEA Systems, IBM, Microsoft, Oracle, and Sun are vying in the SOA space, promoting the use of component-based, interchangeable application architectures as the new wave of IT infrastructure.
Computing without wires - In wireless, 2005 will be a year of pilot projects and evaluations. To comply with the first Wal-Mart and Department of Defense mandates for RFID tags, in January thousands of suppliers will deploy RFID tags, but only in what is being called a "slap-and-ship" model. After the tags are applied to satisfy customers' requirements, however, suppliers will start their own pilot projects to see how RFID might reduce costs in their supply chains. RFID will not be generally deployed until late 2006 or 2007.
Storage spreads out - In 2005, expect more acquisitions among storage software vendors, as storage management increases in importance and tiered storage continues to come of age. Storage resource management software will continue to grow, as will storage archive software. Dell's success with its AX100 storage device will no doubt translate to more storage systems with ease-of-use features. Microsoft will increase its stake in storage by adding products that feature tighter integration with Windows to its storage lineup. Other products to look for in 2005 include EMC's storage switch, which the company plans to release early in the year. Tape vendors will continue to integrate disk into their products and will offer better integration into storage networks. In the same vein, expect to hear a lot about ILM (information lifecycle management) from the major storage vendors and to see plenty of products with iSCSI (Internet SCSI), as the technology begins to take off this year.
Networks will grow faster, more complex, and larger in 2005. As more 10 Gigabit Ethernet products come to market, the network core will see a marked increase in performance. At the same time, the network edge will see a performance boost as Gigabit Ethernet extends to the outer edge.
Search gains significance - Enterprise search platforms will grow in prominence in 2005, fueled by skyrocketing volumes of unstructured content and the closing in of government regulations that mandate quick discovery of a wide range of corporate content -- from e-mails, to documents, to chat conversations. Led by IBM and its Masala project, large software vendors will accelerate the commoditization of full-text search in the enterprise. In addition, enterprise search will continue to blur more and more with content management and business analytics. Beyond continued platform expansions by Autonomy, Fast Search & Transfer, and Verity, new search-related products are expected to be launched by Oracle and Sun in 2005. Meanwhile, the emerging desktop search market is expected to heat up in both enterprise and consumer markets throughout the year.
Hosted search, another nascent space, will gain more prominence in the enterprise this year, as vendors such as Atomz and CrownPeak push the on-demand model's benefits of quicker startup time and lower cost of ownership. A few exceptions: consolidation into platforms, flux in the EAI space, portals becoming platforms, and master data management have not been covered. But overall, a balanced set of expectations, very likely to happen.
Techweb reports that online auction site eBay announced it will soon drop support for Microsoft's Passport for log-in to the site and discontinue alerts sent via Microsoft's .Net Alerts. Microsoft says it will stop marketing Passport to sites outside its own stable. As of late January, eBay will no longer display the Passport button on sign-in pages nor allow users to log in using their Passport accounts. Instead, members must log in directly through eBay. Likewise, eBay is dumping .Net Alerts, which means that eBay customers who want to receive alerts - for such things as auction closings, outbids, and auction wins - will have to make other arrangements. The free-of-charge eBay Toolbar, for instance, can be used to set up alerts going to the desktop, while alerts to phones, PDAs, or pagers can be created from the user's My eBay page.
eBay was one of the first to jump on the Passport bandwagon in 2001, but it is only the latest site to leap off. Job search site Monster.com dropped Passport in October, and Microsoft has decided to stop marketing its sign-on service to other Web sites. The pull-back, long predicted by various analysts, follows a stormy life for Passport, which, among other things, suffered a pair of security breakdowns in the summer of 2003 that could have let hackers steal users' IDs. Microsoft also pulled its online directory of sites using Passport -- perhaps because the list would have been depressingly short -- stating in the online notice that "We have discontinued our Site Directory, but you'll know when you can use your Passport to make sign-in easier. Just look for the .NET Passport Sign In button!" Passport will continue to be the sign-on service for various Microsoft properties, including the Hotmail e-mail service and MSN.com. Two things to note: A. Creating a Passport account was never a friendly exercise. B. All the talk of community building and fortification by Microsoft is gone. Would this impact Microsoft's stock valuation?? When prices rose, nominal value was attached to building and binding a community!! Maybe Microsoft is too big and well spread, and this will be seen as just a blip!! How many blips does the company need to exhibit before people sit up and take note??
- Before the flood: The 1960s. Just a scant few years after the first laboratory integrated circuits, Fairchild Semiconductor introduced the first commercially available integrated circuit (at almost the same time as one from Texas Instruments).
-Development explosion: The 1970s The idea of a computer on a single chip had been described in the literature as far back as 1952 (see Resources), and more articles like this began to appear as the 1970s dawned. Finally, process had caught up to thinking, and the computer on a chip was made possible. The air was electric with the possibility.
-The first three. At the time of this writing, three groups lay claim to having been the first to put a computer on a chip: the Central Air Data Computer (CADC), the Intel® 4004, and the Texas Instruments TMS 1000.
-Early Intel: 4004, 8008, and 8080. Intel released its all-purpose 4-bit chip, the Intel 4004, in November 1971. It had a clock speed of 108KHz and 2,300 transistors, with ports for ROM, RAM, and I/O. The 4004 was originally designed for use in a calculator, and Intel had to renegotiate its contract to be able to market it as a stand-alone processor. Its ISA had been inspired by the DEC PDP-8. The Intel 8008 was introduced in April 1972 and didn't make much of a splash, being more or less an 8-bit 4004. Its primary claim to fame is that its ISA -- provided by Computer Terminal Corporation (CTC), which had commissioned the chip -- formed the basis for the 8080, as well as for the later 8086 (and hence the x86) architecture. Lesser-known Intels from this time include the nearly forgotten 4040, which added logical and compare instructions to the 4004, and the ill-fated 32-bit Intel 432.
- AMD clones the 8080 Advanced Micro Devices (AMD) was founded in 1969 by Jerry Sanders. Like so many of the people who were influential in the early days of the microprocessor (including the founders of Intel), Sanders came from Fairchild Semiconductor. AMD's business was not the creation of new products; it concentrated on making higher quality versions of existing products under license. For example, all of its products met MILSPEC requirements no matter what the end market was. In 1975, it began selling reverse-engineered clones of the Intel 8080 processor.
- Moto 68000. In 1979, Motorola introduced the 68000. It had 32-bit internal registers and a 32-bit address space, though its external bus was still 16 bits to keep hardware costs down. Originally aimed at embedded applications, its DEC PDP-11- and VAX-inspired design meant that it eventually found its way into the Apple Macintosh, Amiga, Atari, and even the original Sun Microsystems® and Silicon Graphics computers.
- A new hope: The 1990s. The 1990s dawned just a few months after most of the Communist governments of Eastern and Central Europe had rolled over and played dead; by 1991, the Cold War was officially at an end. Those high-end UNIX workstation vendors who were left standing after the "microprocessor wars" scrambled to find new, non-military markets for their wares. Luckily, the commercialization and broad adoption of the Internet in the 1990s neatly stepped in to fill the gap. For at the beginning of that decade, you couldn't run an Internet server or even properly connect to the Internet on anything but UNIX. A side effect of this was that a large number of new people were introduced to the open-standards Free Software that ran the Internet. The popularization of the Internet led to higher desktop sales as well, fueling growth in that sector. Throughout the 1990s, desktop chipmakers participated in a mad speed race to keep up with "Moore's Law" -- often neglecting other areas of their chips' architecture to pursue elusive clock rate milestones.
- The 2000s. The 2000s have come along, and it's too early yet to say what will have happened by decade's end. As Federico Faggin said, the exponential progression of Moore's law cannot continue forever. As the day nears when process will be measured in angstroms instead of nanometers, researchers are furiously experimenting with layout, materials, concepts, and process. After all, today's microprocessors are based on the same architecture and processes that were first invented 30 years ago -- something has definitely got to give. Exciting days are ahead, with more and more speed and power, and newer and newer potential usages.
Casino owners may become the newest members of the RFID bandwagon. They show a keen interest in the technology as a way to track customers from the moment they hit the gaming tables. And a large gaming-supply manufacturer's purchase of two patents shows the industry does not intend to miss out on this 21st-century technology. Shuffle Master, a gaming supply company headquartered in Las Vegas, Nevada, recently purchased two RFID-related patents for $12.5 million. The company specializes in providing utility products to casinos, such as automatic card shufflers, but it also carries its own proprietary table games for casinos. The patents' seller was Enpat, a licensing agent for the inventors, explained Paul Meyer, Shuffle Master's president and chief operating officer. "We did a lot of due diligence. We looked at over 400 patents to make sure we identified the correct ones," he added.
-The first patent is for RFID-enabled gaming chips. A casino could track a chip from the time it is first played to the time it is cashed in.
- The second patent is for the gaming table tracking system that monitors and records all gaming chip transactions in a casino. Both patents were originally filed in 1995.
"At last year's Global Gaming Expo, we were showing an intelligent table system," said Mr. Meyer. "At that time our product was based on optical technology, but we later decided that one size fits all made no sense because many casinos had different needs. We went down a modular development route and earlier this year concluded that the technology of the future was RFID. It offers 100% accuracy and is superior to optical. The purchase of these patents demonstrates our commitment to bring reliable and integrated RFID technology to market for the benefit of our casino customers." Gaming Partners International Corporation (which makes RFID readers and chips) will retain its exclusive license for the use of these patents in the manufacture and sale of RFID gaming chips and readers, according to Shuffle Master. "Mikohn (a Shuffle Master competitor) has a few placements of RFID. They've been trumpeting RFID for quite some time. With our acquisition, Mikohn would be a licensee of Shuffle Master," said Mr. Meyer.
Even with the minimal placements of RFID chips and tables in casinos thus far, Mr. Meyer thinks the field is still very wide open. "There are some 42,000 gaming tables in the world. That means lots of room to grow." He's looking at the end of the fiscal year, which for Shuffle Master is October, "that we'll certainly have something to demonstrate at the Global Trade Show." The downside of an RFID implementation for a casino, he said, is that it "would require an investment because they would have to re-rack their casinos." Case in point: a conventional chip without RFID costs about 90 cents, he said. With RFID, that cost would jump a dollar, to $1.90 a chip. "I've been told the average cost to a casino (to upgrade) is $250,000." So how do RFID-enabled chips and tables work? "Say I sit down at a blackjack table and I have a player's card. I place it and a $100 bill on the table. My card is swiped, which places me at that table," explained Mr. Meyer. (A player's card is another way for casinos to track frequent gamblers. They earn points on the card for free meals or other rewards.) Without RFID, "as I play over time, the only way the casino can estimate the kind of player I am is by using pit boss estimates. That's a pretty rough estimate. That's where table tracking comes in. Every chip is associated with me and is tracked using a reader. Exactly what I'm betting and losing or winning is tracked automatically. Without tracking, they (the casino) don't know what I'm betting." In other words, the reasoning behind RFID utilization is that the casino will know what every player is doing at every table.
"Say you move away from one table with $500 in chips. You now go to cash in those chips. Those RFID chips can be read at the cage and associated with you. In your moment of generosity, you give a cocktail waitress a $25 chip. When she cashes it in, we know how generous a tipper you are." But not only does RFID allow casinos to track players, it also can track employees.
Another example: "If a dealer during a shift change tries to leave the table with a $100 chip, he'd be caught," said Mr. Meyer. "Every chip has a unique serial number, which is tied to the player card. If I don't have a player card, it's difficult to associate an RFID chip with me, but a very high percentage have player cards," he added. If the player doesn't have such a card, he would simply give the dealer his first name and last initial before buying the chips and they'd be tied to him in that fashion. In conjunction with RFID table tracking, "that literally means tracking player performance at the table and all around the casino," said Mr. Meyer. "There has been an enormous amount of attention given to this," he added. "New technology takes time to be adopted and I think it's very clear that Shuffle Master's reputation as the leader at deploying technology in the table area" could drive the move towards this new technology. "There has been a lot of buzz. People look at the adoption by Shuffle Master favorably because they know it will be deployed quickly."
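The mechanics Mr. Meyer describes -- a unique serial number per chip, tied to a player the moment chips are bought and read again at the cage -- can be pictured with a toy ledger. This is a minimal sketch of the idea only, not Shuffle Master's actual system; the class, method names, and serial format are all invented:

```python
# Toy model of RFID chip tracking: each chip serial is tied to the player
# who bought in with it, and every read event goes into an audit trail.

class ChipLedger:
    def __init__(self):
        self.owner = {}        # chip serial -> player who bought it
        self.history = []      # (event, serial, presenter) audit trail

    def buy_in(self, player, serials):
        """Tie each purchased chip's serial number to the player."""
        for s in serials:
            self.owner[s] = player
            self.history.append(("buy_in", s, player))

    def cash_in(self, presenter, serials):
        """Read serials at the cage; return chips bought by someone else."""
        flagged = [s for s in serials
                   if self.owner.get(s) not in (presenter, None)]
        for s in serials:
            self.history.append(("cash_in", s, presenter))
        return flagged

ledger = ChipLedger()
ledger.buy_in("J. Smith", ["A1", "A2", "A3"])
# A cocktail waitress cashes in a chip originally bought by J. Smith:
print(ledger.cash_in("waitress-07", ["A2"]))   # ['A2'] -- traced to its buyer
```

The same audit trail is what would catch the dealer in Mr. Meyer's shift-change example: a chip serial read at the exit that was never cashed in by its registered owner.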
Walter Mossberg writes in the WSJ that Microsoft hasn't made any important functional improvements in Internet Explorer for years. The software giant has folded IE into the Windows operating system, and the browser only receives updates as part of the "Windows update" process. In recent years, most upgrades to IE have been under-the-hood patches to plug the many security holes that have made IE a major conduit for hackers, virus writers and spyware purveyors. The only visible feature added to IE recently: a pop-up ad blocker, which arrived long after other browsers had one.
Meanwhile, better browsers were being built, and the most significant of these challengers is Firefox, a free product of an open-source organization called Mozilla, available for download at www.mozilla.org. Firefox is both more secure and more modern than IE, and it comes packed with user-friendly features the Microsoft browser can't touch. Firefox still has a tiny market share, but millions of people have downloaded it recently. There are some other browsers that put IE to shame. Apple's elegant Safari browser, included free on every Mac, is one, but it isn't available for Windows. The Opera browser is loaded with bells and whistles, but is pretty complicated. And NetCaptor is very nice, but since it's based on the IE Web-browsing engine, it's vulnerable to most of IE's security problems. Firefox, which uses a different underlying browsing engine called "Gecko," also has a couple of close cousins based on the same engine. One is Netscape, now owned by America Online. The other is a browser called Mozilla, from the same group that created Firefox. But Firefox is smaller, sleeker and newer than either of its relatives, although a new Netscape version is in the works. Firefox isn't totally secure -- no browser can be, especially if it runs on Windows, which has major security problems and is the world's top digital target. But Firefox has better security and privacy than IE. One big reason is that it won't run programs called "ActiveX controls," a Microsoft technology used in IE. These programs are used for many good things, but they have become such powerful tools for criminals and hackers that their potential for harm outweighs their benefits.
Firefox also has easier, quicker and clearer methods than IE does for covering your online tracks, if you so choose. And it has a better built-in pop-up ad blocker than IE. The best aspect of Firefox is tabbed browsing, a Web-surfing revolution that is shared by all the major new browsers but is absent from IE. With tabbed browsing, you can open many Web pages at once in the same browser window, each accessed by a tab. The benefits of tabbed browsing hit home when you create folders of related bookmarks. And Firefox can recognize and use Web sites that employ a new technology called "RSS" to create and update summaries of their contents. When Firefox encounters an RSS site, it displays a special icon that allows you to create a "live" bookmark to the site. These bookmarks then display updated headlines of stories on the sites. Firefox also includes a permanent, handy search box that can be used to type in searches on Google, Yahoo, Amazon or other search sites without installing a special toolbar.
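Under the hood, a "live" bookmark is essentially a poll of the site's RSS feed, with the item titles displayed as bookmark entries. A rough illustrative sketch in Python follows; the sample feed and function name are invented, and a real browser would fetch and periodically re-fetch the feed over HTTP rather than read an inline string:

```python
# Parse an RSS 2.0 feed and pull out the data a live bookmark displays:
# the channel title and the current item headlines.
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>First headline</title><link>http://example.com/1</link></item>
  <item><title>Second headline</title><link>http://example.com/2</link></item>
</channel></rss>"""

def live_bookmark(feed_xml):
    """Return (feed title, [item titles]) from an RSS 2.0 document."""
    channel = ET.fromstring(feed_xml).find("channel")
    titles = [item.findtext("title") for item in channel.findall("item")]
    return channel.findtext("title"), titles

title, headlines = live_bookmark(SAMPLE_RSS)
print(title, headlines)   # Example Blog ['First headline', 'Second headline']
```

Re-running the parse on a fresh copy of the feed is all it takes to keep the headline list "live."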
And it has a cool feature called "Extensions." These are small add-on modules, easy to download and install, that give the browser new features. The extensions include one that automatically fills out forms and another that tests the speed of your Web connection. You can also download "themes," which change the browser's looks. There is only one significant downside to Firefox. Some Web sites, especially financial ones, have chosen to tailor themselves specifically for Internet Explorer. They rely on features only present in IE, and either won't work or work poorly in Firefox and other browsers. Luckily, even if you switch to Firefox, you can still keep IE around to view just these incompatible sites. (In fact, Microsoft makes it impossible to fully uninstall IE.) There's even an extension for Firefox that adds an option called "View This Page in IE." So Firefox is the current best choice of a Windows Web browser. It is to IE in 2004 what IE was to Netscape in 1996 -- the upstart that does a better job.
Light Reading writes that telecom services across South Asia are gradually being restored after the devastating tsunami that hit the region last weekend. With tens of thousands dead and more than a million homeless, subsea communications links will be vital as aid agencies the world over continue the work of assessing the damage and providing assistance. The major undersea cables, operated by consortiums of telecom providers, survived largely unscathed -- Videsh Sanchar Nigam Ltd. says the Tata Indicom Chennai-Singapore cable, SEA-ME-WE-2, SEA-ME-WE-3, and the West Africa Submarine Cable (WASC) were not affected; neither was Bharti Tele-Ventures Ltd.'s 3,200-kilometer cable connecting Chennai, India, with Singapore. Om Malik points out that these are the cables over which much of the outsourcing traffic to and from India travels. Many US retailers, banks, and credit card companies operate call centers that use these cables. In a sense, this tragedy also proved that even in the worst-case scenario, the work of money continues. Sad, but true! The Malaysian leg of the South Africa Far East (SAFE) submarine cable had been disrupted, and traffic was being rerouted via VSNL's redundancy cables. There was no word on when the cable could be restored, as repairs have to be made by the Malaysian landing operator. The SAT-3/WASC cable links Europe with West Africa, and SAFE continues the connection on to India and Malaysia. The Indian link remains operational. A map showing the major undersea cables in Asia is available here.
(Via Mohan Srinivasan) Michael Porter says it is way too early for India to think it has been successful, or even partially successful!! We recently covered in this blog India's lack of competitiveness and the gap that India needs to bridge to be competitive. Manjari Raman interviews Michael Porter for Business Standard and writes that Porter signalled it was way too early for India to think it had been successful, or even partially successful. Concerned that India's globalisation story might be aborted by short-sightedness in policy or blind-sided by misguided ambition, Porter prescribes a healthy dose of self-criticism. As a frequent visitor to India and as head of the Institute for Strategy and Competitiveness at Harvard, Porter has a clear understanding of India's potential as well as Indian companies' latent aspirations for global growth. At the same time, as the architect of the Business Competitiveness Index in the World Economic Forum's annual Global Competitiveness Report, he knows how far Indian companies have to go before they can rightfully claim their place in the league of multinationals. He also knows, first hand, how hard it is to make Indian CEOs realise that a critique is not the same as criticism. "India has a tremendous tendency for overstatement," says Porter, and "Indians don't take criticism well. They get very offended." To compete in an aggressive global environment, Indian companies must not only learn to invite criticism, but also find ways to use it to strengthen strategy and twist it into competitive advantage. Excerpts with edits and comments added:
On Indian Companies Not Being in the Top 100 in International Business: The ability of Indian companies to prosper and be competitive internationally has a lot to do with the home base, and whether India offers an attractive business environment. If companies don't have to compete at home and don't have a vibrant, dynamic environment at home, it's very, very hard for them to compete internationally.
What must be done to be in the Top 100 global lists: A good place to start is to think about the nature of the business environment in India and where India stands internationally. Certainly, India is on the right track and is improving its economic performance. The growth in GDP per capita has been quite good. The growth in productivity is still low, but there is some evidence that it has picked up a bit. India's exports are growing, but that growth is dominated by growth in service exports and in particular IT-related services. India is doing quite well in IT-enabled services, but to a considerable extent, that's it! It's a one-trick pony. India is getting tremendous international profile from IT service exports, but they aren't indicative of the broader economy. The export clusters that are growing rapidly are jewelry and precious metals, textiles/apparel, fishing, construction, metal manufacturing and agriculture. Pharmaceuticals are very small, and as per the data, the sector is growing at a slower rate than India's average growth rate of exports of goods. In automotive products, India has a 0.15 per cent share of world exports, and it has not grown its share. Components are one area that has been doing a little bit better: India has a 0.3 per cent world export share in automotive parts, and it has grown slightly. But automotive components exports from India in 2002 amounted to just $460 million. In terms of the business environment, IT service exports are growing, but India's service exports, in general, are not growing that fast. Exports of goods are growing, but, again, not that fast, and the big areas in goods exports are still traditional clusters like textiles. There is certainly movement in the right direction, but the magnitude of that improvement is still tiny. In terms of assessing where India really is, we have to understand that there's a long way to go.
On How to globalize faster & better and Government’s role: To build a competitive economy, first, you need to have sound overall contextual conditions, such as macroeconomic policy, a sound legal system, etc. Those are cross-cutting factors, and include macro, legal, social, and political factors. They need to be sound, stable, and trusted for an economy to be competitive. But in and of themselves, those are not enough. In order to have a competitive economy, you also have to have competitive firms. To have competitive firms, you need to have an efficient and appropriate business environment, which creates the right inputs, the right incentives, and the right competitive pressure to allow firms to improve their productivity. Governments shouldn't work with individual firms--that's almost always a mistake. Government should work, first, to enhance and improve the overall business environment--the cross-cutting business environment that affects many clusters. Then, government ought to work with established or emerging clusters to remove the obstacles and constraints that prevent those clusters from becoming more dynamic. If government does those two things, we find that exports and outward foreign direct investment follow. But it's inappropriate and inefficient for government to engage with individual companies. When it is engaging in cluster development, the government's role is really to support the efforts of all existing and emerging clusters to upgrade productivity rather than to make choices about which clusters need specific support. There has long been a tendency in India of distorted support through subsidies. The mentality needs to shift from "we need to support some clusters" to "we need to create a policy framework that allows all clusters to flourish."
On the links between country, clusters, and companies, and on global competitiveness: Competitiveness is defined as having companies that can be productive and meet the test of international competition. A company has to be globally competitive, or it's simply going to die. From a company's point of view, competitiveness is a matter of survival. Having competitive companies is the way a country supports a high and rising standard of living, because those companies can afford to pay high and rising wages. They create new jobs. And by the way, India has a crisis of jobs in the formal economy. When we think about cluster development, we can't think national; we have to think regional. The locus of economic development, particularly in a country of the scale and size of India, needs to be driven down to the state level, and within the state, down to the metropolitan and urban areas. The fact that some states are fairly advanced and organised in terms of that kind of thinking is one reason that India as a nation is successful.
On trends in the competitiveness of the Brazil, Russia, India, and China economies, and on Indian competitiveness: Emerging economies are becoming more significant players in the global economy. We are seeing increasing outbound foreign investment from the emerging economies, and India is an example of that. Foreign investment out of India is up to roughly $1 billion a year, and that's a meaningful amount of external investment by Indians. That would be one trend. Secondly, the global economy has been shifting a little from the traditional West to the emerging economies in terms of sheer weight. Although the Indian business environment is improving in multiple respects, it has some fundamental weaknesses. A. The capital markets remain relatively weak and undeveloped. B. The physical infrastructure is abysmally ranked.
Indian firms face a really compelling logistical disadvantage compared with companies in China in terms of getting goods and services to market. But the most pernicious problems in India--which are still not being confronted head-on--are the pervasive barriers to competition. A lot of Indian companies are investing abroad partly to, if you will, escape weaknesses in the domestic business environment, and to build assets and skills that are slow to develop at home. It's interesting that the most successful Indian clusters are ones where the government didn't really have any (contribution). A fundamental shift is still required in the nature of the business-government relationship.
On why globalize when there is a large domestic market: If India opens up its domestic market and has a lot of competition in the domestic market--as in the United States--then it will begin a more positive cycle. Companies will get to ramp up and build some capability in the domestic market, and competition will drive them to start looking abroad. That dynamic could happen in India if the fundamental characteristics of the business environment are systematically addressed. The other consequence of a large domestic market--which affects both India and China--is that what little foreign investment comes into India comes not because India is a great business platform; it's there because of the consumers. China has taken better advantage of that than India has because China is in many ways more open, more dynamic. We've seen many more companies come into China because that's such a dynamic place. The business environment is a bit more efficient, which is why multinationals use China as an export platform. But we don't see that much in India. The multinationals are there primarily just to do business in India and sell to the Indian market. Another really big challenge for India, if she is going to develop the more advanced clusters, is the issue of intellectual property (IP) protection. Until India can be really credible on that, I think the growth of biotech will be limited. Can't say it any better than this!!
Wired has an interesting article on Bram Cohen and BitTorrent, the filesharing software that has tens of millions of users and generates about a third of all internet traffic. Excerpts with edits and my comments added:
Bram Cohen is the creator of BitTorrent, one of the most successful peer-to-peer programs ever. BitTorrent lets users quickly upload and download enormous amounts of data, files that are hundreds or thousands of times bigger than a single MP3. Analysts at CacheLogic, an Internet-traffic analysis firm, report that BitTorrent traffic accounts for more than one-third of all data sent across the Internet. Cohen showed his code to the world at a hacker conference in 2002 as a free, open source project aimed at geeks who needed a cheap way to swap Linux software online. But the real audience turned out to be TV and movie fanatics. It takes hours to download a ripped episode of Alias or Monk off Kazaa, but BitTorrent can do it in minutes. As a result, more than 20 million people have downloaded the BitTorrent application. If any one of them misses their favorite TV show, no worries. Surely someone has posted it as a "torrent." As for movies, if you can find it at Blockbuster, you can probably find it online somewhere - and use BitTorrent to suck it down.
Cohen’s idea: Breaking a big file into tiny pieces might be a terrific way to swap it online. The problem with P2P file-sharing networks like Kazaa, he reasoned, is that uploading and downloading do not happen at equal speeds. Broadband providers allow their users to download at superfast rates, but let them upload only very slowly, creating a bottleneck: If two peers try to swap a compressed copy of Meet the Fockers - say, 700 megs - the recipient will receive at a speedy 1.5 megs a second, but the sender will be uploading at maybe one-tenth of that rate. Thus, one-to-one swapping online is inherently inefficient. It's fine for MP3s but doesn't work for huge files. Cohen realized that chopping up a file and handing out the pieces to several uploaders would really speed things up. He sketched out a protocol: To download that copy of Meet the Fockers, a user's computer sniffs around for others online who have pieces of the movie. Then it downloads a chunk from several of them simultaneously. Many hands make light work, so the file arrives dozens of times faster than normal. Paradoxically, BitTorrent's architecture means that the more popular the file is, the faster it downloads - because more people are pitching in. Better yet, it's a virtuous cycle. Users download and share at the same time; as soon as someone receives even a single piece of Fockers, his computer immediately begins offering it to others. The more files you're willing to share, the faster any individual torrent downloads to your computer. This prevents people from leeching, a classic P2P problem in which too many people download files and refuse to upload, creating a drain on the system. "Give and ye shall receive" became Cohen's motto, which he printed on T-shirts and sold to supporters.
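The chopping-and-reassembly idea can be sketched in a few lines. This is a toy illustration, not the real BitTorrent protocol (which adds piece hashing, trackers, and choking); the piece size and the peer sets are invented for the example:

```python
# Split a payload into fixed-size pieces so a downloader can fetch
# different pieces from different peers at once, then stitch them back
# together in index order.

PIECE_SIZE = 4  # bytes here; real implementations use pieces of 256KB or more

def split_pieces(data, size=PIECE_SIZE):
    """Chop `data` into a list of consecutive pieces."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def reassemble(pieces_by_index):
    """Join pieces back together in index order."""
    return b"".join(pieces_by_index[i] for i in sorted(pieces_by_index))

payload = b"Many hands make light work!"
pieces = split_pieces(payload)          # 7 pieces for this 27-byte payload

# Three hypothetical peers, each holding a different subset of piece indices:
peers = {"peer-a": {0, 3, 6}, "peer-b": {1, 4}, "peer-c": {2, 5}}

# The downloader asks whichever peer holds each missing piece; in a real
# client these requests run in parallel over the network.
have = {}
for idx in range(len(pieces)):
    source = next(p for p, held in peers.items() if idx in held)
    have[idx] = pieces[idx]             # fetched "from" `source` over the wire

assert reassemble(have) == payload
```

Because no single peer has to serve the whole file, the slow upload link of any one sender stops being the bottleneck.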
Bram Cohen's approach is faster and more efficient than traditional P2P networking:
1. A single source file within a group of BitTorrent users, called a swarm, spreads around pieces of a film or videogame or TV show so that everyone has a chunk to share.
2. After the initial downloading, those pieces are then uploaded to other needy users in the swarm. The rules require every downloader to also do some uploading. Thus the more people trying to download, the faster everything is uploaded.
3. Before long, the swarm has shared all the pieces, and everyone has their own complete source.
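The three steps above can be simulated crudely. In this sketch, one seed starts with every piece, and each round every peer that holds pieces uploads one piece to a peer that still needs one; the numbers and the random selection policy are invented for illustration (real BitTorrent uses rarest-first piece selection and tit-for-tat peer choking):

```python
# Crude swarm simulation: count the rounds until every peer has every piece,
# given that downloaders immediately become uploaders of what they hold.
import random

def rounds_to_complete(num_peers, num_pieces, rng_seed=1):
    random.seed(rng_seed)
    # Peer 0 is the seed with the complete file; everyone else starts empty.
    have = [set(range(num_pieces))] + [set() for _ in range(num_peers - 1)]
    rounds = 0
    while not all(len(h) == num_pieces for h in have):
        rounds += 1
        for uploader in range(num_peers):
            if not have[uploader]:
                continue                      # nothing to offer yet
            # Peers still missing something this uploader holds:
            needy = [i for i in range(num_peers) if have[uploader] - have[i]]
            if needy:
                target = random.choice(needy)
                piece = random.choice(sorted(have[uploader] - have[target]))
                have[target].add(piece)
    return rounds

print(rounds_to_complete(2, 8), rounds_to_complete(16, 8))
```

With two peers, the seed can only hand over one piece per round, so completion takes exactly eight rounds; with sixteen peers, every downloader re-uploads what it receives, so completion time grows far more slowly than the swarm size. That is the "more popular means faster" effect described above.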
Traditional peer-to-peer networks are slow because they suffer from supply bottlenecks. Even if many users on the network have the same file, swapping is restricted to one uploader and one downloader at a time. And since uploading goes much slower than downloading, even highly compressed media can take many hours to transfer. BitTorrent is something deeper and more subtle. It's a technology that is changing the landscape of broadcast media. "All hell's about to break loose," says Brad Burnham, a venture capitalist with Union Square Ventures ... BitTorrent does not require the wires or airwaves that the cable and network giants have spent billions constructing and buying ... BitTorrent transforms the Internet into the world's largest TiVo. If enough people start getting their TV online, it will drastically change the nature of the medium ... The whole concept of must-see TV changes from being something you stop and watch every Thursday to something you gotta check out right now, dude. Just click here. What exactly would a next-generation broadcaster look like? The VCs at Union Square Ventures ... suspect the network of the future will resemble Yahoo! or Amazon.com - an aggregator that finds shows, distributes them in P2P video torrents, and sells ads or subscriptions to its portal. The real value of the so-called BitTorrent broadcaster would be in highlighting the good stuff, much as the collaborative filtering of Amazon and TiVo helps people pick good material. Man... this is amazing... True convergence at work.
The most remarkable change in the 2005 A.T. Kearney study of outsourcing is likely to be a quantum leap in the number of countries vying for back-office work, which is increasingly being outsourced by U.S., Western European, and Japanese companies. Excerpts with edits and my comments added:
As many as 40 countries are being considered for next year's list, says Janet Pau, policy analyst for A.T. Kearney's Global Business Policy Council, a roundtable for corporate chief executives. The trend is toward "more offshore activity in countries besides India, such as some of the smaller players like the Czech Republic and Malaysia," Pau notes. Moreover, she adds, there is likely to be a farther eastward push within Eastern Europe, to countries such as Romania and Bulgaria, as work spills over from places such as the Czech Republic and Hungary. A similar expansion is expected in other regions of the world. Several of the nations the A.T. Kearney experts are currently considering for inclusion in their spring 2005 survey are unlikely to leap to mind when Western executives contemplate potential off-shoring locations. Among these are Morocco, Tunisia, Ghana and Uruguay.
A longer list of contenders will only reflect the rapidly spreading interest around the world in getting a piece of the BPO action. While the amount of back-office work currently sent outside their national borders by U.S., Western European and Japanese companies is still a tiny fraction of the total such work that these companies outsource, it can constitute a substantial source of revenue and employment for recipient countries, as India has demonstrated. Locations as disparate as Dubai, Mauritius and Sri Lanka are making plans to capture some of this economic upside. Dubai, a component of the United Arab Emirates, is setting up a Dubai Outsourcing Zone where wholly foreign-owned companies can operate tax-free. Dubai is also promoting its efficient transportation infrastructure and westernized lifestyle to potential outsourcers. Mauritius, already a destination for tourists drawn to its sparkling Indian Ocean beaches, is working on creating a high-tech enclave and touting the multilingual skills of its population. Because of periods of both French and English rule, many of its 1.2 million people are comfortable in both French and English. "I think the next big emerging phenomenon is a hub-and-spoke model in the globalization of services," Aron says. Singapore already has become a BPO off-shoring hub whose spokes extend to India, China and the Philippines, and some day could reach out to Sri Lanka and Vietnam, he says. Dubai has the potential to be a Singapore-like hub in the Middle East. It is "stable, forward looking and technologically advanced," he says.
Singapore and India bear a symbiotic relationship, Aron adds. So far as business process off-shoring is concerned, one couldn't exist without the other. India has depth and breadth of technologically skilled labor, available at a low cost; English language skills that facilitate communications with the developed English-speaking countries; an improving telecommunications infrastructure; and extensive IT experience. Singapore has a far smaller population but deeper ties to the West, including a free trade agreement with the U.S., and a highly advanced transportation and internet infrastructure. It also has "squeaky clean markets and judicial systems," Aron says, adding that frequently, when Indian companies export software, the contracts include clauses that provide for arbitration of disputes in Singapore courts. Singapore also has sophisticated managerial talent attuned to Western standards and provides a standard of living that is more attractive to expatriate Western managers than locations in India. Add its geographical location, virtually "next door" by air, and Singapore increasingly is serving as the "natural hub for long-term strategic control," Aron says, providing services in such areas as regulatory compliance, auditing, data security, and risk mitigation that overlay nitty-gritty call center and back-office work. More than 300 Indian IT-services companies - including giants Satyam Computer Services and Tata Consultancy Services - have located in Singapore, in part to insure themselves against adverse U.S. legislation on trade issues. "Singapore is a natural shelter because of its free trade agreement with the U.S.," Aron says. If needed, Indian BPO companies can route their TCP/IP data packets to the United States via Singapore.
Besides, Singapore's extraordinary telecom and physical infrastructure also makes it a prime location for business data continuity and disaster recovery operations of Indian and other companies offering BPO services. Latin America and Eastern Europe shall also be powerful contenders. My Take: I still believe that India shall retain a market share of 65% to 70% for at least the next 5-6 years. Maturity, process advantage, manpower availability, mindshare - all shall help India maintain the momentum - but the race is getting tougher with every passing day. India has to create world-class infrastructure and become more and more business friendly - otherwise this could become the case of another missed opportunity.
The blog (short for weblog) can indeed be, as Scoble and Gates say, fabulous for relationships. But it can also be much more: a company's worst PR nightmare, its best chance to talk with new and old customers, an ideal way to send out information, and the hardest way to control it. Blogs are challenging the media and changing how people in advertising, marketing, and public relations do their jobs. A few companies like Microsoft are finding ways to work with the blogging world - even as they're getting hammered by it. So far, most others are simply ignoring it. That will get harder: According to blog search-engine and measurement firm Technorati, 23,000 new weblogs are created every day - or about one every three seconds. Each blog adds to an inescapable trend fueled by the Internet: the democratization of power and opinion. Blogs are just the latest tool that makes it harder for corporations and other institutions to control and dictate their message. An amateur media is springing up, and the smart are adapting. Says Richard Edelman, CEO of Edelman Public Relations: "Now you've got to pitch the bloggers too. You can't just pitch to conventional media."
In a blog, whatever the topic, the discussion of business isn't usually too far behind: from bad experiences with a product to good customer service somewhere else. Suddenly everyone's a publisher and everyone's a critic. Says Jeff Jarvis, author of the blog BuzzMachine, "There should be someone at every company whose job is to put into Google and blog search engines the name of the company or the brand, followed by the word 'sucks,' just to see what customers are saying." It all used to be so easy; the adage went "never pick a fight with anyone who buys ink by the barrel." But now everyone can get ink for free, launch a diatribe, and - if what they have to say is interesting to enough people - expect web-enabled word of mouth to carry it around the world. Unlike earlier promises of self-publishing revolutions, the blog movement seems to be the real thing. A big reason for that is a tiny innovation called the permalink: a unique web address for each posting on every blog. Instead of linking to web pages, which can change, bloggers link to one another's posts, which typically remain accessible indefinitely. This style of linking also gives blogs a viral quality, so a pertinent post can gain broad attention amazingly fast - and reputations can get taken down just as quickly. As the impact of blogs spreads through global business, that pain - and promise - will be something companies will have to deal with. And if they don't? You're bound to read about it in a blog. This piece provides a good overview of the ways blogs are impinging on the business world. And watch the bloggers in action on the tsunami - they have become one of the most significant communities in the world for fund raising, communication and aid measures.
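The permalink idea is simple enough to sketch. A hypothetical illustration (the URL, function name and slug scheme are invented for this example, not taken from any particular blogging tool): each post gets a stable, unique address derived from its date and title, which is what lets other bloggers link to the post itself rather than to a front page that changes daily.

```python
# Minimal permalink sketch: derive a stable, unique URL for a blog post
# from its publication date and a "slug" built from its title.
import re
from datetime import date

def permalink(base_url, posted, title):
    # lowercase the title and collapse every run of non-alphanumerics to "-"
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")
    return f"{base_url}/{posted:%Y/%m/%d}/{slug}"

print(permalink("http://example.com/blog", date(2005, 1, 10),
                "Blogs Will Change Your Business"))
# → http://example.com/blog/2005/01/10/blogs-will-change-your-business
```

Because the date and slug never change after publication, the address stays valid indefinitely - which is exactly what makes post-to-post linking, and the viral spread the article describes, possible.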
Open-source components inside software products can be a potential "deadly room of death" that one may not like to have inside the product. Engineers at many software companies and other businesses are frantically searching their files for something they hope not to find: open-source components. The improper use of open-source components, in the worst-case scenario, could subject companies to costly litigation from parties like SCO Group of Lindon, Utah. SCO says it owns intellectual property in the Linux open-source operating system and has set off alarm bells in executive suites by suing International Business Machines and three other Linux-using companies over the past year. "It's almost like you've got to be a lawyer now to develop software," grumbled Jothy Rosenberg, chief executive and chief technical officer of Service Integrity, who this month ordered a 24-hour scan of his company's Sift 3.5 software during a "code freeze" before its introduction. "In this day and age, anybody building a commercial piece of software has got to do this. It's like buying insurance on your building."
There are no hard numbers on how much U.S. businesses are spending to protect themselves from possibly infringing on open-source licenses. While few say that the problem rises to the level of the "Y2K" problem - adapting numerous programs to display four-digit years after 1999 - many say it has become pressing and costly. Some liken it to the Sarbanes-Oxley financial reporting requirements that have rattled executives at publicly traded companies. And the problems are related, in that Sarbanes-Oxley requires public companies to value their software and assess their litigation risks. Open-source software is freely available to use, distribute and modify, but it is subject to large and small restrictions set forth in dozens of open-source licenses. Some companies, like Avid Technology, which makes digital film editing machines, have sought to avoid license conflicts by banning open-source software. Others have persisted in using open-source code but have purchased scanning software or set up search engines to hunt for license conflicts they can resolve through proper identification or attribution. The most serious conflicts involve code covered by the GNU General Public License. Under that license, anyone who modifies GPL-covered code and distributes the result must make the modified source freely available under the same terms. Depending on how many files of code are covered and what is in them, such a requirement can be a major impediment for a proprietary software company. Among the scariest aspects of the problem is that many business executives do not know whether open-source code is in their software, or they mistakenly presume that they have none. Either way, they could be setting themselves up for a lawsuit.
Software developers working on "value-added" applications routinely borrow pieces of open-source code as building blocks for such functions as encryption, security or platform interfacing. Offshore programmers for American companies have become especially adept at grabbing lines of open-source code and mixing them with proprietary code in progress. "There are corporations that literally don't know what lurks in their code," said Douglas Levin of Black Duck, a start-up company. Black Duck developed its scanning software partly by assembling a giant repository of open-source code, employing a young team of "spiders" to sift through Web sites looking for open-source lines and patterns. A related article here highlights the potential for litigation in using open-source. Law firms, consultants, software developers and technology service companies are also moving to capitalize on the jitters that have been spreading in the business world. Optaros, a consulting start-up, is offering to provide its clients with open-source audits, examining how they use the software and advising on licenses. Levin, president and chief executive of Black Duck Software, estimated that the market for all companies addressing open-source litigation risks could total $500 million by 2005. "There are a lot of challenges for companies working with open-source software, but they're manageable," say some in the industry. "Open-source is here, and companies have to deal with it, just like you have to deal with snow in New England." Open-source has been around for two decades as a favorite tool of computer scientists and technology-minded college students, but it has only recently moved into the business world.
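As a rough illustration of what the scanning tools above do (this is not Black Duck's actual method - their matching against a code repository is far more sophisticated - just a minimal sketch with example phrases and file extensions): walk a source tree and flag files containing telltale wording from well-known open-source licenses.

```python
# Minimal license-marker scan: flag source files containing phrases that
# usually indicate code under a well-known open-source license.
import re
from pathlib import Path

# Example marker phrases only - real tools match against code itself,
# not just license headers.
LICENSE_MARKERS = {
    "GPL": re.compile(r"GNU General Public License", re.IGNORECASE),
    "LGPL": re.compile(r"GNU Lesser General Public License", re.IGNORECASE),
    "Apache": re.compile(r"Apache License", re.IGNORECASE),
    "BSD": re.compile(r"Redistribution and use in source and binary forms",
                      re.IGNORECASE),
}

def scan_tree(root):
    """Return {file path: [license names]} for files with known markers."""
    hits = {}
    for path in Path(root).rglob("*"):
        if path.suffix not in {".c", ".h", ".py", ".java", ".js"}:
            continue  # only look at likely source files
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip it
        found = [name for name, pat in LICENSE_MARKERS.items()
                 if pat.search(text)]
        if found:
            hits[str(path)] = found
    return hits
```

Run against a product's source tree before a code freeze, a report like this is the starting point for the "proper identification or attribution" the article mentions - or for pulling the code out entirely.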
IBM's decision to support Linux in 1999, partly as a counterweight to the dominant Windows operating system sold by its rival, Microsoft, brought open-source software into corporate data centers, where it has gained momentum among users of large servers, the machines that form the backbone of business computer networks. But the corporate love affair with open-source cooled in March 2003, when SCO sued IBM for more than $1 billion, alleging that it had introduced into Linux proprietary code misappropriated from SCO. And SCO has since sued DaimlerChrysler, AutoZone and Novell, the company that sold SCO the source code and patents from the Unix operating system that was a model for Linux. About 1,500 other Linux-using companies received warning letters from SCO. Businesses fear that SCO's flurry of lawsuits may be a sign of trouble to come. "What SCO has done is to throw down the gauntlet," Scott Nathan, a lawyer, said. "If SCO is successful, there are going to be copycats." Nuisance suits related to open-source could prove a worrisome distraction for companies that have belatedly embraced the technology as a cost-saving measure. "If you're Wal-Mart and you have embedded Linux in every cash register, you might be seen as a deep pocket" by litigious SCO copycats, said Thomas Carey, an attorney with the Boston law firm Bromberg & Sunstein. Interesting developments - watch this space.
Wal-Mart's experience so far has served as a reminder that creating the future is not all that easy. With Jan. 1 just days away, the technology is not yet ready to meet the needs of either Wal-Mart or its suppliers. The tags, which are typically about the size of a credit card and contain an antenna and microchip encased in plastic, receive query signals from scanning devices called readers. Using the energy captured from those signals, they broadcast a snippet of code identifying the goods to which they are attached.
To date, most of Wal-Mart's suppliers have not figured out inexpensive ways to automate the printing and application of the tags. Although read rates are improving, no one who uses the technology has systems that can reliably read the information 100 percent of the time in factories, warehouses and stores; Wal-Mart said the rate was around 60 percent in its stores. Nor is the data currently integrated well enough with other technology to initiate changes in manufacturing or shipping schedules that could actually save the large sums of money that would make the investment worthwhile. Wal-Mart's official position is that it is working closely with suppliers, meeting its goals and learning valuable lessons that will pay off as the technology continues to roll out. But analysts who regularly survey major consumer goods companies said that most participants were cooperating with Wal-Mart out of fear of offending the retailer and were, as much as possible, putting off investments in the technology.
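The read-rate numbers above can be put in perspective with a little arithmetic. Assuming each scan pass independently reads a tag with probability p (a simplifying assumption - real-world misses are often correlated, e.g. tags shielded by liquids or metal will fail repeatedly), the chance of seeing a tag at least once in n passes is 1 - (1 - p)^n:

```python
# Back-of-envelope view of why single-pass read rate matters so much:
# with independent passes, miss probability shrinks geometrically.

def passes_needed(p, target):
    """Smallest number of independent passes whose combined read
    probability 1 - (1 - p)**n reaches `target`."""
    n, prob = 0, 0.0
    while prob < target:
        n += 1
        prob = 1 - (1 - p) ** n
    return n

print(passes_needed(0.60, 0.99))  # → 6 passes at the reported ~60% rate
print(passes_needed(0.95, 0.99))  # → 2 passes once per-scan reliability improves
```

Under these assumptions, a 60% read rate means cartons must pass readers several times before inventory data is trustworthy, which is part of why the per-pass rate has to improve before the supply-chain savings materialize.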
"The big manufacturing companies have advocates for the technology who are very positive, but the people on the floor who are implementing it are much more negative," Kara Romanow, an analyst at AMR Research, said. Wal-Mart's goal was to wring billions of dollars from the supply chain by using the tags to keep shelves filled with whatever consumers were buying, cut back on shipments of other goods and combat theft. The mandate was soon defined in narrower, more practical terms: supplying tagged cartons and pallets, not individual items, to a limited number of stores through just three Texas distribution centers by the Jan. 1 deadline. Wal-Mart said recently that more than 100 suppliers would be tagging bulk shipments to the three Texas centers next month. But only 40 will be tagging everything they send. Of the remainder, two have been so tied up in a complete overhaul of their information technology infrastructure that they have put off introducing radio tagging. Some suppliers will be tagging as little as 2 percent of the goods going to the centers.
"We think the average supplier will be tagging about 65 percent of the volume they ship to the three centers," Linda Dillman, the chief information officer of Wal-Mart, said. AMR, the research firm, said it had found that companies were investing $1 million to $3 million to comply with Wal-Mart's program, far less than the $13 million to $23 million that AMR had estimated in August would be needed for fully integrated systems that generated useful data. Some companies delayed getting started for so long that they are now having trouble getting tags, according to the analysts and Wal-Mart. That problem is expected to recede next year as tag manufacturers expand their production lines. An important stimulant came last week, when a next-generation standard for tags and readers was ratified by EPCglobal, a nonprofit industry group. Heavyweights like Texas Instruments and Philips that had not made the first-generation tags plan to enter the market with the newer technology.
Although the progress has been slow, it has an air of inevitability. Radio tagging, known as RFID (radio frequency identification), has been spreading through the economy for decades in applications like automated toll collection, tracking tags for animals and wireless cards controlling access to buildings. But the technology was not widely publicized until Wal-Mart announced its deadline. Subsequent decisions by other merchants like Target and Best Buy to push for radio tagging made it unmistakable which way the wind was blowing, at least among retailers. The movement toward radio tags on consumer products gathered momentum when the Defense Department also set a Jan. 1, 2005, deadline for its major suppliers of a broad range of general merchandise and endorsed the tag and scanner standards being developed by a consortium of retailers and major suppliers like Procter & Gamble and Hewlett-Packard. In addition, drug companies are expanding pilot projects applying radio tags to pharmaceutical shipments. The Food and Drug Administration has set 2007 as its goal for general use of the technology. Separately, Boeing and Airbus are working together on standards for tagging the 5,000 or so aircraft parts that are most frequently handled by airline maintenance crews.
Wal-Mart and other retailers, and many manufacturers, are excited about the technology because the tags can store more information than bar codes, and large numbers of them can be scanned at one time. In addition to its top 100 suppliers, Wal-Mart is working with 38 others that have volunteered to be in the first wave of vendors complying with its mandate. But the pilot testing this year has offered evidence that, before most businesses can justify big investments in the technology, its costs must fall sharply and the scanners must be able to read tags faster and in more varied conditions. To drive down costs, manufacturers want the recently adopted American standards to be made compatible with those being developed elsewhere. Still, if the size of the challenge became apparent in 2004, so did the ways in which it could be tackled. Wal-Mart and others say that, in 2005, not only will tagging be expanded, but there will also be a sharp increase in the testing of software and business strategies that use the data captured from the tags.
Recently we covered the future growth prospects and the direction that Cisco would be taking moving forward. Oligopolywatch writes that Cisco Systems' appetite for acquisitions remains unabated. Cisco Systems, the dominant player in the market for network routers and switches, is a major success story. While it was the darling of the 1990s bubble, it is still a very successful firm with $22 billion in revenue and $4.5 billion in net income, and is #100 in the Fortune 500. Much of Cisco's growth has come through acquisitions, numbering almost 100 over the past decade. The company acquired when its stock price was stratospheric, and it continues to acquire now that its stock price is more reasonable. In spite of its hunger to acquire, Cisco has stayed very close to its core strengths: communication and security hardware and software. But that core area has extended in recent years to such areas as teleconferencing, Internet telephony (VoIP), and home networking. Writing about Cisco, CNET.com wrote, "Acquisitions have been the lifeblood of Cisco's growth. From its early days, when it branched out from its router focus to build a multibillion-dollar switching business through acquisitions, to its current strategy of buying its way into the optical networking market, the company has used its high-flying stock to access new market niches". The stock isn't flying so high anymore, but with 14 acquisitions in 2004, the company philosophy hasn't changed much. It's possible to see Cisco as an engine for acquisitions, swallowing up nearly all the new relevant ideas that pop up in small start-ups, then integrating those ideas into its product line. Many of the acquisitions have been relatively inexpensive, though over the years there have been several multibillion-dollar deals, including those for companies like Cerent and ArrowPoint Communications. The 2004 acquisitions include:
- Protego, which makes data security monitoring equipment aimed at small to
- BCN Software, which offers network routing software
- Jahi Networks, which sells networked device management tools
- NetSolve, with its remote network management software
- P-Cube, which offers management software for IP service providers
- dynamicsoft, which makes software for providing VoIP services
- Perigo, a company that makes something called admission control software, used to defend networks from attacks
- The UK's Parc Technologies, Ltd., which makes software for data traffic
- Actona Technologies, which makes wide-area file management software
- Twingo Systems, a maker of software for protecting secure network transactions
- Riverhead Networks, a maker of software to protect servers from "denial of service" attacks
- Procket Networks, which developed high-end routers
- Latitude Communications, a provider of online conferencing software
As the venerable Om Malik points out, only Procket was a hardware vendor. Perhaps like all hardware vendors, Cisco is realizing that the future is software, not hardware, which is becoming increasingly commoditized.