The Shore group has come out with an excellent report on the outlook for the content industry in 2005. As 2005 starts, the rubble of old business models that began to come tumbling down in 2004 is being pushed aside to make room for new business models that span old categories and define highly profitable niches where profits were never imagined before. Shore sees four key areas where the rapidly shifting action will unfold in creating and expanding these new models: cooperation, commercialization, containerization and consolidation. Excerpts with edits:
In 2004 we watched the "walls come tumbling down" as new and reconfigured players asserted themselves in professionally-oriented content. Barriers crumbled between publishers and the content and technology partners that are succeeding where many traditional providers have stumbled or stalled, creating opportunities for new and more potent business models to assert themselves far more aggressively than ever before.
Cooperation: As so many new models for success have migrated from the control of major publishers and aggregators to new suppliers of content value, pure power plays are going to be few and far between in 2005. Content value is increasingly unchained from specific platforms and technologies in most instances. This makes specific technology alliances very fluid instruments in a greater model.
Commercialization: Consideration of new and additional monetization models is a must for these vendors in 2005. The wary stance taken by many publishers contrasts with outlets such as Google that pushed head-on in 2004 with fearless new ways of monetizing content. This underscores the need for content suppliers of all kinds to leave no money on the table in 2005 – a concept that applies not only to premium content.
Containerization: With content flowing into more types of user venues than ever before, having content that is not locked into specific venues is more important than ever. Peer-to-peer networks, RSS feeds, XML-based Web services, eBooks and portal-driven intranets are successful venues in which the formats provided for content allow it to flow easily from venue to venue at the behest of subject experts, content enthusiasts and content technologies.
Consolidation: While innovative content and content technology companies are proliferating again, key parts of the content story in 2005 will be written by companies that are able to absorb and convert content from behind-the-times suppliers into more efficient and profitable services and venues. Many journal, newspaper, database and magazine publishers failed to anticipate the rise of online production and delivery in 2004 as the irrevocable center of future profits.
Rights management: This area will escalate into a full-scale free-for-all between authors, producers, publishers, libraries, distributors and technology players, accelerated by Google initiatives in expanding the realm of searchable content, from books to scientific publications and video. From a technical viewpoint, search engines open up access to a much broader range of content, but the bigger issues are the ownership and monetization of that content. Creating content costs money, so the owners rightly expect compensation for their labor, but this reality blurs in the digital world. “Free” access to content for marketing and advertising purposes (the Amazon model) is hard to distinguish from checking out a purchased eBook from a library. Even with purchases, monetization in the form of royalties takes on new aspects. In the traditional publishing world, subsidiary rights pay lower royalties to authors than hardback book rights, but in fact may be more valuable in an electronic world.
Importance of policy: Organizations, particularly businesses, will be forced to set explicit policies governing use of the easy-to-use technologies for creating blogs and RSS feeds. The issues are very different depending on the context. The most visible form has been citizen journalism, which has the same feel as the earlier Free Speech Movement in Berkeley, celebrating individuals and their right to freely express an opinion.
Existing business models are not going to disappear in 2005, but 2005 may be the last year in which vendors can capitalize on new revenue flows - based on the rapid growth of contextual online advertising, rich data, and technology-oriented product and services sales - in time to stave off future revenue crises. Many of the more advanced content vendors have counted on smart workflow integration products to get their content value propositions back in line in the institutional space, for example, which has helped to steady sales and cement relationships. But 2005 will be a year in which the question, "What business are we in, anyway?" increasingly calls successful workflow products and rich data offerings into question as long-term solutions to the more fundamental revenue issues surrounding licensed content and traditional delivery services.
Strategy+Business writes that digital terrestrial television, which uses analog technology to deliver digital programming, may soon affect everything from home entertainment to mobile communications. Digital terrestrial television (DTT), which uses the analog infrastructure of traditional broadcast television to deliver digital programming, has the potential to bring interactive TV, multichannel capabilities, and TV-based online shopping, banking, and other services to the hundreds of millions of people who don’t have access to cable or satellite systems. In the process, DTT will create an array of new business opportunities that could affect everything from home entertainment to mobile communications. Excerpts with edits from an interesting article:
DTT is either already making inroads in several countries or poised to do so, because of regulatory agency mandates that will require broadcasters to switch from analog to digital terrestrial TV by around 2010. Because consumers can use conventional TV sets to access DTT and usually don’t have to pay a subscription fee to view basic stations, adoption of DTT is generally very rapid in almost every country where it is introduced. In the United Kingdom, more than 4 million households use DTT after only two years in the market. Initially, DTT will have its greatest effect in the pay-TV sector, which is, thus far, controlled by cable and satellite companies. A comparable cable/satellite offering would require a monthly subscription for basic service plus digital or some other premium service. The content provider would have to maintain complex bookkeeping and customer management systems as well as its vast TV distribution network.
Cellular operators also have something to fear from digital terrestrial TV. Using DTT platforms, media companies will be able to deliver entertainment or information on new dual-mode handsets that recognize both mobile phone and DTT technical standards. In effect, these handsets will let media companies bypass traditional mobile technology. DTT is a far more efficient transmission protocol for mobile entertainment than cellular networks, because it is capable of delivering content to millions of individual connections at once without network interference or overload. For instance, tens of thousands of people in a football stadium could simultaneously and reliably access DTT-provided data or entertainment through a mobile device, whereas a cellular network would be too congested to handle a load even 1/1000th that size. DTT’s most intriguing impact could be on interactive or two-way advertising campaigns; because DTT is a more universal delivery system than cable or satellite, it has the potential to make interactive advertising more common. The future of DTT comes down to numbers. The terrestrial open platform provides a strong enough signal to reach everyone without the need for a satellite dish or cable lines, and it can potentially offer more than 50 free digital channels, depending on a country’s geography and available terrestrial frequencies. Cable and satellite are more powerful — most cable TV systems offer about 150 channels, and satellite TV delivers 500 or more channels — but they are based on proprietary platforms and monthly subscription fees. Perhaps the more important numbers have to do with investments of time and money. It will be a lot quicker and, over time, a lot less expensive for small content providers to offer high-quality, Internet-age, in-home and mobile programming and applications on DTT than on any other medium. That may just be enough to begin the new revolution in old TV.
Jonathan Schwartz writes that computers aren't the commodity - computing (and bandwidth) are - just as the power generators built at GE aren't the commodity, whereas electricity is. In his recent blog, Jonathan extends his reasoning to cover evolving developments in the technology industry. He argues that standardisation precedes commoditization, that processing power available on tap is setting the standards and, more importantly, that commodity-selling enterprises are among the most successful and profitable enterprises in the world. Excerpts with edits and my comments added:
Early in its evolution, only the wealthy could afford electricity (along with the requisite generator and electrician), and the technologies were fragile. Businesses that wanted power generation facilities were similarly wealthy enough to afford large-scale versions of the same. It took about a decade for those deploying electricity to settle on a few standards - from voltage to cycle to plug configuration - that ultimately accelerated consolidation. Once the standards existed, businesses could plug into a grid; labor markets went through a fairly sizable dislocation, but electricity was firmly established as a ubiquitous service. Scale efficiencies and the resulting massive decrease in price allowed the government to bridge the power divide through rural electrification. Electricity that started out at 20 times the price of gas lighting obviously got a lot cheaper. Once the standards were set and the grid powered up, electricity finally established a transparent price - the hallmark of a true commodity. If pricing isn't transparent, a product can't be deemed a commodity - transparency means a standard unit of measurement, like "5 cents per kilowatt hour" or "2 dollars per gallon." It's either a standardized physical delivery (gallon, barrel, ton) or a unit of consumption (typically time based: 100 megabit-hours, megawatt-hours, etc.) - but it's the same across the industry. Jonathan argues that, by this definition, a 4-way x86 dual-core Opteron server running an open source indemnified Solaris OS with a J2EE 1.4-compatible Java Enterprise System web services stack, optimized JVM and 256M of RAM is not a commodity.
Sun is all set to announce the evolution of a true computing grid, priced at $1/cpu-hr; the business and technology models underlying such a transformation; and the impending impact on the marketplace for computing power and value. Jonathan adds that while power operators can't add a lot of value to a power line, things are a tad more hopeful for network operators.
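Part of the appeal of a posted commodity price like $1/cpu-hr is that billing becomes trivially transparent. As a minimal sketch (the function name and job format here are my own illustration, not Sun's actual billing API), metered grid billing reduces to multiplying consumption by the posted rate:

```python
# Hypothetical sketch of pay-per-use metering at a flat commodity rate.
# The names and the $1/cpu-hr rate are illustrative assumptions.

def bill(jobs, rate_per_cpu_hr=1.0):
    """Compute a transparent, commodity-style bill from metered usage.

    jobs: list of (cpus, hours) tuples, one per submitted job.
    """
    total_cpu_hours = sum(cpus * hours for cpus, hours in jobs)
    return total_cpu_hours * rate_per_cpu_hr

# A 4-CPU job for 2 hours plus a 16-CPU job for half an hour consumes
# 4*2 + 16*0.5 = 16 cpu-hours, i.e. $16 at $1/cpu-hr.
print(bill([(4, 2.0), (16, 0.5)]))  # 16.0
```

The point of the sketch is exactly Jonathan's: once the unit of consumption is standardized, any customer can check the invoice with one multiplication - which is what makes the price transparent.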
The great thing about commodities, whether financial services, telecommunications, oil and gas, and now computing - is that the companies whose business it is to monetize those commodities, along with the businesses that supply the technologies necessary to compete in a commodity market, are among the largest on earth. Vodafone, Citigroup, Exxon Mobil. What do they have in common?
1) They're among the most valuable businesses on earth.
2) They're primarily technology companies.
3) They're among the largest buyers of technology in the world. And
4) They're all in commodity businesses.
They all thrive on Innovation.
My Take: While Jonathan has made his point about the nature of commoditization, I think that neither computing nor applications can ever become commodities. Wherever high technology is involved, wherever faster growth is seen, wherever innovation keeps happening and scope for further innovation exists, and wherever multiple fields of study meld and fuse, the chances of commoditization are very low. In fact, going by this logic, one will have to wait and see how successful offering computing power on tap at predetermined prices as a packaged offering turns out to be - we have to see how many customised offerings exist and how many account managers need to gain insights into customers' unique demands and offer differentiated services. The best example is aircraft and airline services: aircraft manufacturing is trailblazing - like the recent A380 double-decker launch - alongside innovative airlines like JetBlue and distinction by better service like Singapore Airlines. Wherever there is a fusion of different experiences, things can never get commoditised. Since the technology industry is highly innovative and involves advances in multiple disciplines, I think neither computing nor computers can ever get commoditized (at least for the next fifteen years). With infinite opportunities for the technology industry to grow, I only see differentiation and premium pricing becoming more and more well entrenched - cutting through the commoditization trap.
(via Silicon Valley Watcher) Human stories illustrate the economic disparities that characterize Silicon Valley as the region struggles to bounce back. By the numbers, the economy is getting better and worse - depending on who you are. Silicon Valley has developed two separate economies that have drifted further apart ever since the dot-com bubble burst in 2000. In the valley's technology economy, profits, revenues and average pay are up dramatically. But fewer people are sharing in the good fortune because tech companies are doing more with less - they have cut tens of thousands of jobs and continue to do so, boosting the productivity of their remaining workers. Guys - generally speaking, isn't this true of all places?
Eric Pfanner visualizes a future in which media consumers, empowered by new technology, demand everything for free - New technologies offer more opportunities than threats. The spread of broadband, for instance, provides video and interactive experiences of a quality unimagined a few years ago. Digital distribution of music and other media via the Internet creates a whole new business model, not just a vehicle for runaway piracy. Excerpts from various leaders views:
- Richard Gelfond, co-chairman and CEO of Imax: Imax licenses the technology for movie theaters with giant, wraparound screens and 3-D effects and provides films to show in them. In the past, these theaters tended to be housed in museums and were filled with schoolchildren awed at seeing wild animals seeming to bound forth from the screen. But with new technology, Imax is changing from the stuff of Tuesday morning field trips to Saturday night dates. An Imax theater used to cost about $5 million, Gelfond said. But a lower-cost system called MPX has cut the cost to as little as $1.6 million, allowing multiplex cinemas to upgrade their offerings. Another technology lets the company convert standard studio releases into Imax format, allowing theaters, on average, to charge a 30 percent premium.
- The Media Owner: Hubert Burda, publisher and CEO, Munich-based Hubert Burda Media: "The traditional magazine business will be flat from last year ... so we will have to take advantage of new advertising models." He says the company has seen rapid growth in keyword advertising on its news magazine site.
- The Middleman: Brian Wood, president of Columbia House: Owned by private investment firm Blackstone Partners, music seller Columbia House so far has avoided the online download business; the implication is there's no guarantee of profitability and no Apple-like hardware rationale. Says Wood, "The tricky thing is coming up with a sustainable business model at a time when everything is changing."
- The Futurist: John Battelle, author. He runs several Web sites, including a blog, and plans to start a business that will sell advertising for other blogs: "Big media's revenue premise is based on the delivery of advertising on a platform that's no longer necessary." He thinks blogs can benefit from pooling ad sales, retaining editorial independence while offering target audiences for advertisers.
- The Investor: Glenn Hutchins, managing member, Silver Lake Partners, and stakeholder in Thomson: "Just as the DVD didn't ruin the content industries, neither will broadband."
Om Malik made the famous statement some time back, "The axis of technology has shifted to the South China Sea," and here he writes about the reach of the internet in Asia, predicting that internet traffic in Asia will grow further. Om Malik writes, "Japan’s influence on mobility, South Korea’s leadership in broadband deployment, China’s growing influence in manufacturing and net-enabled economy, and India’s services sector are creating a whole new set of dynamics for the technology industry". He quotes Telegeography’s Global Internet Report (Aug 2004) and spots that international IP traffic grew 115 percent between 2003 and 2004, outpacing the underlying capacity growth of 46 percent. But the most surprising number is the sharp increase in the traffic between Asian countries - 434 percent between 2003 and 2004, compared to 82 percent between European countries. However, one could easily discount that because it is coming off a smaller base. The number to watch is the traffic growth between the US and Europe and the US and Asia. Average internet traffic across the Atlantic and Pacific grew 110 and 119 percent, respectively, between 2003 and 2004. It used to be exactly the opposite a few years ago. Unscientifically speaking, this stat shows that the US is conducting more "net" business with Asia and can expect this to increase further. As India, Singapore and other new hubs become more and more fiber-enabled, there will be growth in the traffic, and hence contact, between the US and Asia, concludes Om Malik.
Smalla.Net points to the paper "Breaking the Ice: Rethinking Telecommunications Law for the Digital Age", which talks about the changes that need to take place in the telecom environment. The paper notes, "Telecommunications is a trillion-dollar industry undergoing a massive transformation. As technology and market developments undermine long-standing business models and value chains, existing legal frameworks are failing," and recommends a few ideas to facilitate the transformation.
I was wondering: as Asia is really moving fast in this space, we should be seeing a lot of IP-based phones and related enterprises - IP telephony providers offering flat-rate services springing out of Asia. I visit all the Asian capitals regularly and could not find an Asia-headquartered one across Shanghai, Seoul, Singapore and Chennai.
As James Seng points out, there are several hurdles in making this happen.
- The first and foremost problem is the lack of harmonization of regulatory frameworks across Asia. Licensing, getting phone numbers, negotiating interconnections, implementing emergency services, wiretapping and universal service obligations would be very different across each economy, unlike in the US or EU. Regulators' poor understanding and the lack of open markets are also issues.
- In many countries, providing internet access is a job for the incumbent, with very few alternatives. Even in countries which are supposedly wired up, issues like poor quality of copper wires or lack of maintenance have to be grappled with.
- The backbone would also be a nightmare if you are gung-ho to "give better voice quality" via your own network. Due to closed-market regulation and competition on other routes, an STM-1 from neighbouring countries can cost several times the Singapore-to-US rate. (Sea routes are often cheaper than land routes, even over greater distances.)
Asia also has some of the highest broadband-penetration countries in the world. Broadband is one of the basic requirements for IP telephony services to boom, so this represents an immediate market for anyone willing to explore it. Asia also represents 250M internet users, or 32% of the internet population, with huge room for growth given it holds 60% of the world's population. This means a huge long-term potential for a pan-Asia player.
(Via Om Malik) Nicholas Negroponte, author of Being Digital and the Wiesner Professor of Media Technology at MIT, says he has obtained promises of support from a number of major companies, including Advanced Micro Devices, Google, Motorola, Samsung, and News Corp. The low-cost computer will have a 14-inch color screen, AMD chips, and will run Linux software. AMD is separately working on a cheap desktop computer for emerging markets. It will be sold to governments for wide distribution. An engineering prototype is nearly ready, with alpha units expected by year’s end and real production around 18 months from now, he said. The portable PCs will be shipped directly to education ministries, with China first on the list. Only orders of 1 million or more units will be accepted. Mr. Negroponte’s idea is to develop educational software and have the portable personal computer replace textbooks in schools. Major companies from Hewlett-Packard to Microsoft to DuPont, facing saturated markets in the richest industrial countries, have shown an interest in developing less expensive products to sell in low-income countries in south Asia, Africa, and Latin America.
My Take: Throw in some communication capabilities and some more open source desktop tools to run on top of Linux - this would serve multiple objectives: reaching a new set of masses, creating a larger pool of computer literates and, of course, sowing seeds for deploying alternatives to monopolies in the desktop segment to provide true options to end consumers. With saturated markets in the west, the potentially mind-boggling latent demand in emerging markets, and mobile reaching almost 25% of the world's population, the personal computer's reach and penetration need to be enhanced quite significantly, and initiatives like this would go a long way. In any case, servicing the bottom of the pyramid is the way to go - courtesy C.K. Prahalad - even in the seemingly different high-tech industry.
Modern-day writers often ask the question: how did yesteryear writers manage to publish with typewriters? Today, word processors let us create sentences easily, and our hard drives are better suited for storing and retrieving documents than file cabinets. But writers don't normally rely on the computer for the more subtle arts of inspiration and association. The word processor has changed the way we write, but it hasn't yet changed the way we think. Changing the way we think, of course, was the cardinal objective of many early computer visionaries like Vannevar Bush and Howard Rheingold.
2005 may be the year when tools for thought become a reality, thanks to the release of nearly a dozen new programs all aiming to do for your personal information what Google has done for the Internet. These programs all work in slightly different ways, but they share two remarkable properties: the ability to interpret the meaning of text documents, and the ability to filter through thousands of documents in a few seconds. Together they make for a tool that will have as significant an impact on the way writers work as the original word processors did. Steven Johnson writes, "The raw material the software relies on is an archive of my writings and notes, plus a few thousand choice quotes from books I have read over the past decade: an archive, in other words, of all my old ideas, and the ideas that have influenced me. Having all this information available at my fingertips does more than help me find my notes faster. Yes, when I'm trying to track down an article I wrote many years ago, it's now much easier to retrieve. But the qualitative change lies elsewhere: in finding documents I've forgotten about altogether, documents that I didn't know I was looking for. In practice it works like this: I would write a paragraph that addressed the human brain's remarkable facility for interpreting facial expressions. I'd then plug that paragraph into the software, and ask it to find other, similar passages in my archive. Instantly, a list of quotes would be returned: some on the neural architecture that triggers facial expressions, others on the evolutionary history of the smile, still others that dealt with the expressiveness of our near relatives, the chimpanzees. Invariably, one or two of these would trigger a new association in my head - I'd forgotten about the chimpanzee connection - and I'd select that quote, and ask the software to find a new batch of documents similar to it.
Before long a larger idea had taken shape in my head, built out of the trail of associations the machine had assembled for me"
In the traditional way of exploring your files, the computer is like a dutiful, but dumb, butler; the evolution is from searching to exploring or brainstorming. The fuzziness of the results is part of what makes the software so powerful. These tools are smart enough to get around the classic search engine failing of excessive specificity: searching for "dog" and missing all the articles that have only "canine" in them. Modern indexing software learns associations between individual words by tracking the frequency with which words appear near each other. This can create almost lyrical connections between ideas. They're associative tools, ultimately; they don't do cause-and-effect as well. So they're ideally suited for books organized around ideas rather than single narrative threads. There's a fundamental difference between searching a universe of documents created by strangers and searching your own personal library. When you're freewheeling through ideas that you yourself have collated - particularly when you'd long ago forgotten about them - there's something about the experience that seems uncannily like freewheeling through the corridors of your own memory. It feels like thinking.
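The retrieval step these tools perform can be sketched in a few lines. This is a toy illustration, not any product's actual code, and it deliberately omits the learned word associations described above ("dog" ~ "canine"); it only ranks archive passages by how much vocabulary they share with a query paragraph, which is the ranking half of the idea:

```python
# Toy sketch: rank an archive of personal notes by cosine similarity to a
# query paragraph, using plain bag-of-words vectors. Real "tools for thought"
# also learn co-occurrence so that synonyms match; this sketch does not.
import math
import re
from collections import Counter

def vectorize(text):
    """Turn a passage into a bag-of-words frequency vector."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two word-frequency vectors."""
    dot = sum(a[w] * b[w] for w in a if w in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def find_similar(query, archive, top_n=3):
    """Return the top_n archive passages most similar to the query."""
    q = vectorize(query)
    ranked = sorted(archive, key=lambda doc: cosine(q, vectorize(doc)),
                    reverse=True)
    return ranked[:top_n]

# Notes echoing Johnson's example archive:
notes = [
    "The neural architecture that triggers facial expressions",
    "The evolutionary history of the smile",
    "Chimpanzees and the expressiveness of our near relatives",
    "STM-1 backbone pricing between Asian countries",
]
paragraph = "the brain's facility for interpreting facial expressions"
for doc in find_similar(paragraph, notes, top_n=2):
    print(doc)
```

Feeding in the facial-expressions paragraph surfaces the neural-architecture note first, since it shares the most vocabulary, while the unrelated telecom note scores zero. Swapping the bag-of-words vectors for co-occurrence-trained vectors is what lets the real tools leap from "dog" to "canine".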
Simson Garfinkel bought 235 used hard drives between November 2000 and January 2003 from eBay, computer stores, and swap meets. He set up a technical infrastructure to mount the drives, image them (using FreeBSD), store the images on a RAID server, store the metadata in a MySQL database, and then mine the data. He found a huge amount of data, including confidential information such as medical records, HR correspondence, and financial data - including a hard disk from an ATM that contained one year’s worth of transactions, with over 3,000 card numbers. The drives weren’t sanitized correctly, and the data was still on them for Simson to play around with.
In addition to explaining the problem and substantiating it with real data, Simson makes a number of suggestions for how to address the issue. Two of his more severe (but logical) suggestions for cleaning all the data off used drives are:
(a) degauss them with a Type I or Type II degausser, or
(b) destroy, disintegrate, incinerate, pulverize, shred, or melt the drive. For less than $1,000 and working part time, he was able to collect thousands of credit cards, detailed financial records on hundreds of people, and confidential corporate files. He concludes by asking, "Who else is doing this?" Simson's presentation is available here. Every system administrator, IS security expert, CIO and business manager should read this excellent presentation.
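To get a feel for why such drastic measures are warranted, here is a hedged sketch (my own illustration, not Simson's actual tooling) of how easily the raw bytes of an unsanitized drive image can be swept for payment card numbers, using the Luhn checksum that card numbers satisfy:

```python
# Sketch: scan raw bytes (e.g. a dd image of a used drive) for 16-digit
# sequences that pass the Luhn checksum used by payment cards.
# The sample blob and card numbers below are illustrative test data.
import re

def luhn_valid(digits):
    """Return True if the digit string passes the Luhn checksum."""
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:       # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9       # equivalent to summing the two digits
        total += d
    return total % 10 == 0

def find_card_numbers(raw):
    """Scan a bytes blob for plausible 16-digit card numbers."""
    text = raw.decode("latin-1")  # 1:1 byte-to-char mapping, never fails
    candidates = re.findall(r"(?<!\d)\d{16}(?!\d)", text)
    return [c for c in candidates if luhn_valid(c)]

# 4111111111111111 is the well-known Visa test number (Luhn-valid);
# 1234567890123456 fails the checksum and is filtered out.
blob = b"...log 4111111111111111 txn...junk 1234567890123456..."
print(find_card_numbers(blob))  # ['4111111111111111']
```

A part-time attacker with nothing more sophisticated than this can triage gigabytes of "deleted" data in minutes, which is why overwriting, degaussing or physical destruction, rather than a quick format, is the only safe disposal path.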
- Half of the three-million-vehicle increase in global car industry output since 2000 came from Toyota.
- Toyota will soon be making more cars abroad than at home. It has overtaken Ford in global production terms and is set to pass Chrysler in sales to become one of America's Big Three.
- In an industry where hardly any volume producer makes a real return on its capital, Toyota is exceptional in that it consistently makes good returns.
- Market capitalisation says it all. Toyota is worth more than the American Big Three put together, and more than the combination of its successful Japanese rivals, Nissan and Honda.
- Toyota taught the modern car industry how to make cars properly. At the core of TPS is the elimination of waste and absolute concentration on consistent high quality through a process of continuous improvement (kaizen). The catchy just-in-time aspect of bringing parts together just as they are needed on the line is only the clearest manifestation of the relentless drive to eliminate muda (waste) from the manufacturing process. The world's motor industry, and many other branches of manufacturing, rushed to embrace and adopt the principles of TPS.
- Toyota's success starts with its brilliant production engineering, which puts quality control in the hands of the line workers, who have the power to stop the line or summon help the moment something goes wrong. Walk into a Toyota factory in Japan or America, Derby in Britain or Valenciennes in France and you will see the same visual displays telling you everything that is going on. You will also hear the same jingles at the various work stations telling you a model is being changed, an operation has been completed or a brief halt called. Everything is minutely synchronised; the work goes at the same steady cadence of one car a minute rolling off the final assembly line. Each operation along the way takes that time. No one rushes and there are cute slings and swivelling loaders to take the heavy lifting out of the work. But there is much more to the soul of the Toyota machine than a dour, relentless pursuit of perfection in its car factories.
- Spend some time with Toyota people - there is something different about them. The rest of the car industry raves about engines, gearboxes, acceleration, fuel economy, handling, ride quality and sexy design. Toyota's people talk about “The Toyota Way” and about customers.
- There is one more ingredient that adds zest to all this. Toyota people always put themselves “outside the comfort zone”: whenever they hit one target, they set another, more demanding one. That relentless pursuit of excellence certainly explains much of what has been happening to the company in recent years, at home and abroad.
- In 1980 Toyota had 11 factories in nine countries; in 1990 it had 20 in 14 countries; today it has 46 plants in 26 countries. In addition, it has design centres in California and in France on the Côte d'Azur, and engineering centres in the Detroit area and in Belgium and Thailand. Such international growth and globalisation is the biggest change happening to the company.
- The greatest challenge is maintaining Toyota's high standards in such areas as quality while it grows so fast across the globe. For Toyota has only recently started to transform the way it is run to make itself a truly global company rather than a big exporter with a string of overseas plants.
As GM's bonds sink towards junk status, and as Japanese carmakers steadily overhaul America's Big Three, it must be a chilling thought that Detroit's nemesis is working on ways to improve its performance.
Jeremy Wagstaff's blog LOOSE wire has this interesting post, covered from a BBC report, about Lucian George, a young British kid who discovered not one but five errors in this year's edition of the Encyclopaedia Britannica. Lucian, whose favourite subjects are history and nature, found inaccuracies regarding eastern Europe. Lucian wrote to the Encyclopaedia Britannica editor, who, after examination, admitted the errors, said they were grateful to Lucian, and promised to have the mistakes rectified. Jeremy also points to this link listing other known errors in Encyclopaedia Britannica.
Jeremy writes in defence of Wikipedia against criticisms of inaccuracy, saying these overlook its mind-boggling advance and its inestimable value as a reference when the most revered encyclopedia in the world can be just as prone to inaccuracies. The key difference, Jeremy says, is that Encyclopaedia Britannica is a printed volume, preserved in its form for some time, so revisions have to wait for the subsequent reprint. Needless to say, somebody with an older version may choose not to look at the latest release. In the case of Wikipedia, such issues can be immediately rectified, and there is one current version available to all at the same time.
My Take: Recently, Wikipedia said it is in the process of a complete revision of content by baselining existing content. Technology has hit Encyclopaedia Britannica with a very hot blast; perhaps Britannica will have to start a wiki listing errors and revisions. That may be the solution in this instant spot-rectify-upgrade-review world.
Despite the technical advances, a cellphone/PDA won’t let one run one's life; it does a nice job of helping manage things. One can do email, which untethers one from the desk. One can take, send, watch and listen to movies and audio. It’s the next generation or two that is more exciting and makes one curious. The expectations from the next-generation phone/PDA would be:
1. At least 1 GB of storage via flash memory, or 5 GB through a miniature drive.
2. A standard USB port to enable copying any standardized files to or from an external storage device
3. The ability to recognize that storage device as a drive accessible to phone apps.
4. The ability to call a number or use Bluetooth to replace credit or debit cards and automatically record the transaction in a money management program
5. The ability to watch videos in MPEG-4/VC-1/DivX format. It's going to come in handy when the car manual is in the glove compartment on a USB flash drive and one can just watch the video on how to fix whatever breaks. One should be able to use the phone to watch directions for whatever complex operations one encounters.
5a. A home/kitchen operating manual that can be plugged in via USB or Bluetooth, so one can see a demo of it on the phone while in the store.
5b. Being able to Froogle it for pricing based on the bar code would help as well, as would knowing if they have it in stock without having to get a clerk, placing my order and picking it up or having it shipped to my house. Naturally, storing all receipts in the phone in case there is a problem.
6. Able to save and store my IMs and Text Messages
7. Able to download tickets to events and just let them scan my phone rather than having a ticket.
I may call it my phone, but in reality, it’s my portable transaction device. Anything I can collect, create, transact or transmit digitally, I want the ability to do through my phone. It’s a digital world, why not? Same applies to being able to plug in the USB wire hanging from the ATM terminal with instructions.
The key trends here are effectively leveraging communication technologies, advances in storage and tons of mobile apps.
Businessweek publishes an interview with Bill Gates where he shares Microsoft's vision for the IPTV business. Excerpts with edits:
Microsoft first invested in TV technology more than a decade ago but found little success over the years. Now it seems the market that Microsoft and others first envisioned is close to becoming a reality. Soon, channel surfers will get vast new choices in programming and the ability to customize their selections. And Microsoft's decade-long pursuit gives it much experience to draw on. Bill Gates explains, "TV started as analog, went to digital broadcast, but now is making the evolution to IPTV [Internet protocol TV, which uses Web technology to deliver programming]. And when you go to that generation, you can do something dramatic. We believe that software can improve the TV experience".
One of the important changes is that Microsoft's set-top box software no longer needs to run on Windows. TV viewers care about the TV viewing experience; if you can really make it better, it has a very profound impact. Whether it's game shows or sports or news or advertising, you should have a show that fits what you're interested in, and it's only when you get that IPTV framework that you get the chance for that kind of innovation. Microsoft is a believer in the power of software: anywhere it sees a chance for great software, it is going to invest in it and stick with it. It was very important for Microsoft to have gotten in early to build a TV platform.
Microsoft believes that it could win a substantial number of the cable and telecom deals around the world and could see a pretty good-sized business. Not a Windows-sized business, but a very, very good-sized business. And that is reinforcing other related work: media [compression technology] and digital rights management. It's a great thing for getting other efforts around the company to critical mass. Gates concludes by saying, "We see this [Microsoft TV] as being quite a profitable business".
My Take: It is clear that Microsoft has strategised well about its moves in the consumer electronics sector. We recently covered digital convergence in the living room, noting that, inspired by a residential consumer electronics market potentially several times larger and more lucrative than the PC business, armies from at least ten different tech-related industries, including PC makers, CE makers, cable companies, telcos, utilities, media companies and software developers, have amassed in the desert to plan their assaults on the living rooms of the world. We also covered Russell Beattie's viewpoint that it's game over for Microsoft's competitors in the consumer electronics sector, where we concluded, "Hats off to Microsoft - despite several criticisms on multiple fronts, they are repeatedly exhibiting the ability to innovate (in some cases imitate) and execute well". Microsoft's recent deals in the IPTV market further fortify its advances in the consumer electronics sector.
More than a century ago, King Gillette invented both the safety razor and a new way of marketing consumer goods. Before Gillette, men shaved with straight razors, which required skill to both make and use, and lasted almost forever. Gillette's safety razor was mass-produced and required little skill to make OR use, but couldn't be re-sharpened, so the removable blades had to be discarded when they became dull. His marketing breakthrough was selling the razor handles at little or no profit while making huge profits on the consumable - the blades. This same technique is used today to promote mobile phones and inkjet printers. And it is supposedly behind Apple's success with the iPod music player.
But in the case of Apple, is the iPod a razor or a blade? In other words, is Apple a hardware company or a media company? Cringely, while holding the view that Apple is a hardware company, also examines the well-articulated alternate perspective.
It’s the classic distribution vs. production issue rearing its head again. The razor-vs-blades iPod argument is founded on the premise that distribution is the bottleneck for the media industry. While churn across content devices may remain low despite changing technologies at the physical level, there will be a small number of key conduits from content generation to media distribution enterprises (wireless, fibre, terrestrial waves, cable, etc.). A similar trend is observed at the enabling software layer, owing to evolving DRM and related frameworks.
A chemical company with small suppliers can’t push B2B integration like Wal-Mart's "only my way" integration. The expectations of using standards like CIDX and Web services to ease B2B integration are high, but reality is several steps away.
The general expectation is that Web services and the adoption of standard business document formats based on UN/CEFACT Core Components will make the problem go away: write WSDL definitions for your Web services, give them to the company you want to connect with, they bind their solution to the WSDL, and there comes connectivity, a simple solution. The issues are:
- Firstly, the company you want to connect to must also provide you with a WSDL definition for their Web services ... if they have them.
- Secondly, eCommerce connections using Web services need to use multiple different specifications. Do you use WS-Addressing or WS-Context to describe the metadata about your service? Do you use WS-Reliability or WS-ReliableMessaging to make sure your message is delivered? How do you do message security? Do you use HTTP/S or XML Signature and WS-Security? If you agree on WS-Security, which of the signature algorithms do you use for generating your signatures?
The key point is that any variability requires additional effort when building your solution. So even if you are using WSDL to define your connections, there is potentially still a lot of work to do before you can implement that connection. A small business can't afford to do this, or if they do, they will only do it for their major customers. If you are not one of their major customers then you are stuck. So the less variability the better. Ideally you probably want to get to the point where the only variability you have to support is the URL used for receiving messages and the digital certificate to use for encrypting messages and validating signatures. If everything else is standardized then you stand a much better chance of being able to add connections to new partners quickly and easily. So really, for interoperability "less is more"! The solution:
- Firstly, some of the Web services standards need to mature - standards such as WS-Addressing are still in development. Debates over competing standards such as WS-Reliability and WS-ReliableMessaging need to be resolved.
- We also need to develop a profile of how all the different Web services standards are used together, so that the amount of variability small organizations have to handle is significantly reduced. While groups such as WS-I are working on this, we're not there yet, but we're getting there.
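The "less is more" endpoint described above can be sketched concretely. The sketch below is a hypothetical illustration, not any real toolkit's API; all names and the profile contents are invented. The idea is that once the messaging profile is fixed for everyone, onboarding a new trading partner reduces to supplying a URL and a certificate:

```python
from dataclasses import dataclass

# Fixed once, for all partners: the agreed interoperability profile.
# (Spec choices here are illustrative, not a recommendation.)
STANDARD_PROFILE = {
    "addressing": "WS-Addressing",
    "reliability": "WS-ReliableMessaging",
    "security": "WS-Security with XML Signature",
    "signature_algorithm": "RSA-SHA1",
}

@dataclass
class Partner:
    name: str
    endpoint_url: str       # where to deliver messages to this partner
    certificate_pem: str    # for encrypting messages and validating signatures

def connection_config(partner: Partner) -> dict:
    """Everything except the URL and certificate comes from the shared profile."""
    return {**STANDARD_PROFILE,
            "endpoint": partner.endpoint_url,
            "certificate": partner.certificate_pem}

# Adding a new partner is now two fields of data, not a per-partner
# negotiation over addressing, reliability and signature algorithms.
acme = Partner("Acme Chemicals", "https://b2b.example.com/acme",
               "-----BEGIN CERTIFICATE-----...")
cfg = connection_config(acme)
```

The contrast is with the status quo in the bullets above, where each of those profile entries is a per-partner decision and the integration effort multiplies accordingly.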
Google is now indexing more than 8 billion Web pages, against 2 billion three years ago and 3 billion two years ago. That's a lot of pages. As David Weinberger of Harvard University's Berkman Center puts it: "We've been struggling for several years with the Internet's size and complexity." So is there a better way of finding stuff? Google, after all, merely indexes the words it finds on a Web page, and those on pages linked to it. But Google doesn't try to figure out what the words actually mean, or what the pages are about. In short, using Google is like going into a library and hiring a runner, who isn't smart but happens to be a very fast reader, to sprint around finding all the books that contain a particular word. It would be better to just wander over to the catalog and look up the subject of the searched item. It would, but so far there's no catalog like this. There is, however, the idea of one. It's called the "semantic Web", and it's simple enough: categorize information on the Web by adding tags to Web pages. But with billions of pages out there, and thousands more added every day, this is not a task that anyone is volunteering to do.
Last year a couple of free Internet services started doing something interesting, entirely independently of each other. Flickr is a Web site for storing photographs; del.icio.us lets you store bookmarks to your favorite Web pages. They share two features: both let users add tags to what they are storing, and by default share that data with any other user. So, say you upload a photo to Flickr: you might add a word or two to categorize it - say, scuba, or marzipan. The same applies if you add a Web page to your del.icio.us bookmarks. But because both of these tools are public, you can also see what other pictures, in the case of Flickr, or Web page links, in the case of del.icio.us, have the same tags. Tags are a good way to keep your bookmarks (what Microsoft calls "favorites") in a place you can find them.
And there are alternative sites that offer this service, like Simpy, Powermarks, and Spurl. All of these solve two basic problems:
- first, how to keep tabs on your bookmarks if you use more than one browser, or more than one computer, and,
- second, how to find them again easily.
Still, tagging is the future and once you see it in one place you see it, and its potential, everywhere. The beauty of labels, or tags, is that you can assign more than one.
We recently covered Peter Merholz's view on building metadata for the masses, where we noted that many classification systems suffer from an inflexible top-down approach, forcing users to view the world in potentially unfamiliar ways. But what if we could somehow peek inside our users’ thought processes to figure out how they view the world? One way to do that is through ethnoclassification: how people classify and categorize the world around them. Instead of a committee sitting down and deciding on some hierarchical system of categorizing stuff, it is ordinary people adding whatever tags spring to mind, on the fly. A sort of "egalitarian taxonomy" - which is why some people are calling it "folksonomy", a term which may or may not catch on.
Imagine that you're interested in scuba diving. You add a few relevant Web sites to del.icio.us and tag them "scuba." Suddenly, on your del.icio.us bookmark page, you can see not only all your tags, but how many others have tagged the same pages. And you can see what other pages have also been tagged "scuba." You've not only stored your bookmark somewhere you can find it later, but you've helped point others to the same page. And, most important, you can then see a whole library of pages others have considered worth bookmarking. Suddenly tagging becomes something simple, social - and useful.
This month, Technorati started using tags from Flickr and del.icio.us to categorize the millions of blogs, or online journals, that it indexes. That turns Technorati into a kind of homepage for every conceivable topic you can imagine people writing about. Most important, this social tagging thing, if it takes off, could make finding information much easier. Instead of relying on search engines, we can rely on other surfers submitting interesting sites as they find them. A bit like having some seriously fast, smart speed-readers running around the Internet on our behalf armed with piles of index cards. Jeremy provides a directory of bookmark managers here.
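The shared-tagging mechanic that makes del.icio.us and Flickr interesting can be sketched in a few lines. This is a minimal, hypothetical in-memory model, not any real service's API; URLs and user names are invented. The key property is that each user's tags feed one public index, so tagging for yourself simultaneously builds a catalogue for everyone:

```python
from collections import defaultdict

tag_index = defaultdict(set)        # tag -> set of URLs, shared by all users
user_bookmarks = defaultdict(set)   # user -> set of (url, tag) pairs

def bookmark(user, url, *tags):
    """Store a bookmark for one user, and feed every tag into the public index."""
    for tag in tags:
        tag_index[tag].add(url)
        user_bookmarks[user].add((url, tag))

def pages_tagged(tag):
    """What the whole community has filed under this tag."""
    return sorted(tag_index[tag])

# Two users tag independently, each for their own later retrieval...
bookmark("alice", "http://scubadiving.example/reefs", "scuba", "travel")
bookmark("bob", "http://divegear.example/regulators", "scuba")

# ...yet either of them can now browse everything tagged "scuba".
scuba_pages = pages_tagged("scuba")
```

The point of the sketch is the shared index: Alice tags for her own retrieval, but Bob's query under the same tag benefits from her work, which is the "egalitarian taxonomy" described above.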
Tim Oren writes with a lot of insight about how entrepreneurs wanting to build value in software startups, and venture funders requiring both defensibility and eventual liquidity, can respond to the rise of open standards, open source, and offshoring. He provides an excellent outline of a practical approach that would help stakeholders take funding decisions. Excerpts with edits:
The shape of the response is emerging from the fog - perhaps an early indicator of the shape of symbiosis between community and commercial processes - the two-stage software startup. Unlike a two-stage rocket, the first stage is light and runs on little fuel; it's the second stage that has the big burn, if it ignites. The first stage of the new-model software venture builds a useful product as cheaply as possible. Actual engineering is focused ruthlessly on the unique value and differentiating features. In most cases, open standards are exploited to address as large a market as possible using off-the-shelf APIs and libraries. In many cases, the software is written on top of open source platforms, such as LAMP, to keep down development and initial customer costs. Code is usually written to published interfaces, rather than integrated into the open source itself, to avoid 'contamination' by GPL and other OS licenses. Often, a portion of the development will be sent offshore, particularly if the founders have prior experience or cultural connections with a reliable venue. Build as little as possible, as fast and cheaply as possible, while demonstrating some unique value.
Many of these efforts will result in a product, or even a feature, rather than a sustainable company. But that may be OK for the first stage, because the development time and expense are small enough to be funded by the founders, friends and family, or a few angels. The go-to-market is similarly light. Rather than a sales channel, the venture will buy ad words on Google, promote itself via word of mouth on blogs and via user communities, and penetrate enterprises by pricing low enough to fall within the purchasing power of a department, or even an individual. Being in early and continuous touch with its market, the venture can course-correct early and often. The time value of having a functioning product with newly proven value may be sufficient for a quick sale to a larger company which has sales-channel synergy, or products in a touching function which can quickly integrate the new functionality. While the sale may result in only a few million dollars, that outcome may be quite profitable to the founders and the individual backers. This may even be true on a risk-adjusted basis, and that may be a new thing.
Second-stage activities will consume cash in advance of the sales to fund them, as they must occur before imitators arrive. They may include adding functionality to meet customer requests; rebuilding parts of the product for greater efficiency and defensibility; and adding the sales force, scalability, and system integration necessary to sell to a higher-end market, such as the CXO enterprise level, or carriers. At the point of making the second-stage decision, the technology risks have been greatly reduced, and a portion of market risk eliminated. The company has already been learning from the market, though it will undoubtedly need to relearn some things as it shifts focus. Entrepreneurs who choose to enter this stage will receive valuations well above what they would have commanded before achieving a first-stage takeoff, though perhaps not as much as they might hope. The PC movement followed an almost similar approach to getting funded several years back.
WiMax, an emerging wireless-broadband technology, is akin to a long-range version of the popular Wi-Fi technology that allows computers close to a small base-station to surf the internet without wires. Whereas Wi-Fi's range is limited to a few tens of metres, WiMax can, in theory, work over tens of kilometres, allowing huge areas to be blanketed with wireless coverage. Hence the claims that WiMax will bring internet access to the 5 billion people who currently lack it, or that it will render expensive "third-generation" (3G) mobile networks redundant. The reality is that WiMax has been hugely overhyped. Today, the actual number of WiMax devices on the market is precisely zero. The hype is now giving way to much scepticism about the technology's prospects. WiMax may be used by telecoms firms in rural areas to plug holes in their broadband coverage. In urban areas WiMax does not make sense, since it will be uneconomic compared with cable and DSL. It is also too expensive for use in the developing world, since early WiMax access devices (which must be fixed to the outside of a building) will cost around $500; other forms of wireless link, such as mobile-phone networks, will remain a cheaper way to connect up remote villages.
Intel regards WiMax as a promising source of future growth. Intel plans to establish a franchise similar to its hold on desktops in mobile devices through WiMax, a far larger market. Equipment-makers, for their part, are counting on Intel to deliver: WiMax will become widespread only if the price of access devices falls, which in turn depends on the availability of cheap, mass-produced WiMax chips.
In today’s world of tight IT budgets, increased regulation, global competition, and accelerating change, companies (and governments) require quantifiable results from their investments in technology. No executive will sign off on any investment in new technology without a solid expectation for how it will deliver value to the business. When people understand an established technology and how it will provide value over time, calculating the return on investment (ROI) for an IT expenditure will often be a straightforward process. Calculating ROI on projects involving new technologies or emerging IT approaches like Service-Oriented Architecture (SOA) is frequently more of an art than a science. What makes calculating the ROI of SOA even more challenging is that architecture, by itself, doesn’t offer specific features that companies can readily identify with some particular return. Only by understanding the full range of SOA value propositions can companies begin to get a handle on calculating the ROI of SOA, and even then, it may be impossible to understand SOA’s true ROI before the project is completed, because SOA addresses issues of fundamentally unpredictable business change.
SOA provides benefits in four basic categories: reducing integration expense, increasing asset reuse, increasing business agility, and reducing business risk. These four core benefits actually offer return at many different levels and parts of the organization, depending on which set of business problems the company is applying SOA to. First, implementing loosely coupled integration approaches can reduce the complexity, and hence the cost, of integrating and managing distributed computing environments. While moving to standards-based interfaces such as Web Services reduces integration cost somewhat, the real win with SOA is in replacing multiple fine-grained function calls with coarser-grained, loosely coupled Services that can handle a wider range of interactions in a more flexible manner than API-based integration.
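The fine-grained versus coarse-grained contrast can be illustrated with a toy sketch. All names and data here are invented, and a real SOA Service would sit behind SOAP or similar rather than an in-process Python call; the point is only the shape of the coupling:

```python
# Invented sample data standing in for two back-end systems.
CUSTOMERS = {"C1": {"name": "Acme", "credit_limit": 5000}}
ORDERS = {"C1": [{"order_id": "O-17", "status": "open"}]}

# Fine-grained integration: the caller makes one call per fact and is
# coupled to every one of these signatures.
def get_customer_name(cid): return CUSTOMERS[cid]["name"]
def get_credit_limit(cid): return CUSTOMERS[cid]["credit_limit"]
def get_open_orders(cid): return [o for o in ORDERS[cid] if o["status"] == "open"]

# Coarse-grained Service: one request document in, one response document out.
# The internals (which systems are called, in what order) can change without
# breaking the caller, which is the loose coupling being sold.
def customer_summary_service(request: dict) -> dict:
    cid = request["customer_id"]
    return {
        "name": get_customer_name(cid),
        "credit_limit": get_credit_limit(cid),
        "open_orders": get_open_orders(cid),
    }

summary = customer_summary_service({"customer_id": "C1"})
```

A caller of the coarse-grained Service depends on one document contract instead of three function signatures, which is where the integration-cost reduction claimed above comes from.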
Companies can compare their investment in Web Services-based SOA to an equivalent traditional integration middleware approach and then compare both the immediate licensing and configuration cost reductions as well as the longer term maintenance and change costs. As detailed in Understanding the Real Costs of Integration, companies can realize significant and immediate ROI from simply moving from tightly-coupled forms of integration to loosely-coupled ones. Eventually, companies can phase out their more expensive integration approaches altogether, without suffering from the traditional pain associated with "ripping" out the infrastructure. Companies are now implementing SOA side-by-side with their existing EAI and ETL projects, providing immediate cost reduction, while over the long term, SOA can lead to significant complexity reduction, as companies gradually replace their legacy middleware technologies.
Increased business visibility in the face of changing regulations is a concrete instance of the business agility benefit that SOA can provide. Implementing SOA for the purpose of controlling business processes, establishing corporate-wide security, privacy, and implementation policies, and providing auditable information trails, are all examples of ways that SOA can reduce several of the risks facing companies today. While the reduction in risk that SOA provides is tangible, it is difficult to quantify the true ROI of an SOA implementation where risk reduction is a primary benefit. Companies will find value in implementing SOA to reduce risk to some arbitrarily acceptable level, and base the ROI of that implementation on the perceived avoidance of loss that the implementation addresses. Because of the multi-faceted nature of the SOA value proposition, ROI calculations for SOA projects can vary greatly from one project to another. Rather than seeking a single ROI goal for an SOA implementation, companies should take the same iterative, composite approach to ROI that they take for SOA implementation itself. For example, every time they define a Service as part of a company’s Service model, they should also define a corresponding ROI objective for that Service.
- How much will they spend on this Service?
- What direct and indirect returns can they realize from this Service’s implementation, in terms of reduced integration costs, improved asset reuse, or greater business agility?
- How will the composition of the Services into processes realize additional ROI for the business?
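As a hypothetical illustration of this iterative, composite approach (all Service names and figures are invented), each Service can carry its own cost and expected returns, with ROI assessed per Service and again over the composition:

```python
# Invented figures for two Services in a company's Service model.
services = [
    {"name": "CustomerLookup", "cost": 40_000,
     "returns": {"integration_savings": 25_000, "reuse": 30_000}},
    {"name": "OrderStatus", "cost": 60_000,
     "returns": {"integration_savings": 45_000, "agility": 40_000}},
]

def service_roi(svc):
    """Per-Service ROI: (total return - cost) / cost."""
    total_return = sum(svc["returns"].values())
    return (total_return - svc["cost"]) / svc["cost"]

def composite_roi(svcs):
    """ROI of the composed process, reassessed as Services are added."""
    cost = sum(s["cost"] for s in svcs)
    ret = sum(sum(s["returns"].values()) for s in svcs)
    return (ret - cost) / cost

rois = {s["name"]: round(service_roi(s), 2) for s in services}
overall = round(composite_roi(services), 2)
```

Because each Service keeps its own cost and return figures, the ROI objective can be revisited every iteration as Services are added or recomposed, rather than being a single up-front number for the whole SOA project.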
Ronald Schmelzer concludes,"In many cases, SOA implementations can provide a clear, positive ROI from the first day a Service goes live. However, it is more likely that ROI expectations, like SOA implementations, should be iterative in nature, frequently assessed, and composite. In doing so, users can not only quantify, but also realize, the return on investment of their SOA implementations".
(Via ZDNet) Software as a service, a corner of the computing industry, is seeing faster growth than traditional software sales. The creation of an IBM-hosted application bundle is one of several initiatives at IBM to promote the notion of software as a service, or having applications delivered over the Internet. This model for buying software "on demand" is taking hold after years of missteps and failures, which were due to both technical challenges and faulty business models. IBM recently acquired Corio, an application service provider serving medium-size businesses. IBM also has programs to encourage independent software vendors, or ISVs, to convert their applications to run effectively over the Internet. IBM's growing interest in hosted services reflects a belief among software companies that spending on hosted applications offers brighter growth prospects than traditional software sales. The latest convert is Siebel Systems; others include the likes of PeopleSoft, Oracle, SAP, Epiphany and Ariba.
Now IBM is working with a variety of software companies so that it can offer hosted software bundles - horizontal and vertical on-demand building blocks on its hardware and software infrastructure. This may be viewed as infrastructure-as-a-service: creating huge data centers running pre-configured applications, tuned to its middleware and hardware, and delivered over the high-bandwidth Net.
Rather than purchase a license and spend months installing software, a hosted offering lets people get started immediately with applications delivered via a Web browser. The purchasing model is different as well, with customers paying a monthly fee for a specified number of users. This is particularly attractive to smaller companies wary of large up-front costs. Virtualization software lets a data center operator partition off dedicated portions of a single large server to separate customers. Also, software companies are increasingly building their applications in a more modular form around standardized protocols, such as Web services, which greatly simplifies the task of moving data between different applications. The services themselves are getting more mature as well, with better management tools and even the option to have applications run in-house while being remotely managed by a third-party service provider. A preconfigured combination of different applications will help drive the market for hosted applications from one-off purchases for an individual department to broader usage, said analysts.
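The subscription economics described above can be sketched with a toy comparison. All prices here are invented for illustration: a monthly per-user fee on one side, an up-front perpetual licence plus annual maintenance on the other:

```python
# Hypothetical price points, invented for illustration only.
MONTHLY_FEE_PER_USER = 65          # hosted, pay-as-you-go
PERPETUAL_LICENCE_PER_USER = 1_800 # traditional up-front licence
ANNUAL_MAINTENANCE_RATE = 0.20     # maintenance as a fraction of licence cost

def hosted_cost(users, months):
    """Total spend under the hosted model: no up-front outlay."""
    return users * months * MONTHLY_FEE_PER_USER

def licensed_cost(users, months):
    """Total spend under the licence model: big outlay plus annual maintenance."""
    years = months / 12
    licence = users * PERPETUAL_LICENCE_PER_USER
    return licence + licence * ANNUAL_MAINTENANCE_RATE * years

# A 25-person department over one year:
h = hosted_cost(25, 12)    # 25 * 12 * 65 = 19,500, spread monthly
l = licensed_cost(25, 12)  # 45,000 licence + 9,000 maintenance = 54,000
```

For a small department over a single year the hosted model avoids the large up-front licence entirely; run the same comparison over many years and the balance shifts toward the licence, which is exactly why the maintenance base matters so much to established vendors.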
"It's becoming less and less one-item shopping; providers are combining elements into a broader solution." Over the next few years, as enterprises get capital expenditures off their books, the concept of owning your own infrastructure for basic applications should appear ridiculous and preposterous when telecom, server, storage and application capabilities have improved several times over. This can possibly facilitate the birth of several new specialised applications as well, as entry barriers and distribution costs can come down dramatically. What impact this can have on the maintenance revenue of established vendors remains to be seen, while Jeff Nolan says he is nonplussed, as the thing enterprise software companies lust for is not license revenue, it's the maintenance base. In fact, he argues, mature software companies will almost always generate more revenue from maintenance than they will from new license sales, because an enterprise license contract is a form of annuity. With a green flag from all enterprise majors, and enterprises seriously considering on-demand, this movement can only go from strength to strength.
Update: Joe Wilcox adds that if uptake for Outlook Live meets Microsoft's expectations, more software-rentals-with-attached-services offerings are a distinct possibility. In a sense, some of the software features packed into MSN, like those from Money or Picture It, already are there. With consumers, small and medium businesses and enterprises slowing down software upgrades in many categories, Microsoft needs to find new ways to entice adoption of newer products. Since some of those products have reached the perceived "good enough" threshold, the services approach would act as a carrot encouraging newer-version adoption. Microsoft would also benefit from any shift to recurring revenue, as it does with enterprise contractual licensing.
In Part I of this article, we covered IBM's Paul Horn's perspective on the emerging discipline of "Services Science". In this part, we shall look at this through a different lens - My Take on his view: Tom Peters once said, "The professional service firm is the best model for tomorrow's organization in any industry". The logic of Paul Horn is right, but the approach is debatable. Service companies in existence for several decades should have this knowledge codified in their repositories. To me it seems that big service organisations are candidly admitting that they are not able to train their consultants appropriately in their "solidified proprietary methodologies"! I would think that, precisely for the reasons advanced in Paul's article, this courseware must be made open source. We should experiment with this idea using a new wiki-like technology and not make it available just in elite institutions, in order to ensure that it rolls out to professionals who can stabilize the price equation in labour markets, a key concern for Paul. Care should be taken to ensure that we produce these in good numbers and that they are not hijacked by big entrenched players. However, I firmly believe this intersection is better understood with more real-life experience, which is precisely why in-house programs, repositories and coffee-cooler talks on this happen and can provide inarguably better results. We have published on this topic at greater length in Managing the Professional Service Firms, where we covered Tom Peters' outline of 25+ themes along three dimensions that professional firms and consultants need to embrace and master. We also covered, in a follow-up post, David Maister's view that two aspects of professional work create the special management challenges of the professional service firm.
- First, professional services involve a high degree of customization in their work. Professional firms must manage customized activities where little, even management information, can be reliably made routine. Management principles and approaches from the industrial or mass-consumer sectors, based as they are on the standardization, supervision, and marketing of repetitive tasks and products, are not only inapplicable in the professional sector but may be dangerously wrong.
- Second, most professional services have a strong component of face-to-face interaction with the client. This implies that definitions of quality and service take on special meanings and must be managed carefully, and that very special skills are required of top performers.
This is also very important for emerging IT technology majors. Similarly, most of the emerging consulting houses today have skewed levels of distribution between technology people and business people (I for one believe that these are intertwined and cannot be separated in the case of service consultants) - there are technology ascetics and business dunces - and the problem is making both these groups leverage each other very well in a modulated manner. This process and framework will determine the enduring success or eventual failure of consulting houses - not academic courses, but the mindful creation of business value through well-thought-out and well-executed methodologies.
(Via Businessweek) Paul Horn of IBM writes that Services Science, a melding of technology with an understanding of business processes and organization, is crucial to the economy's next wave. Excerpts with edits and my comments added:
It's a melding of technology with an understanding of business processes and organization - and it's crucial to the economy's next wave. Services have come to represent more than 75% of the U.S. economy (and perhaps two-thirds of the global economy), and the field is growing rapidly. In the information-technology business, services have become even more important. But there's a shortage of skills where they're needed most - at the intersection of business and IT. As companies build more efficient IT systems, streamline operations, and embrace the Internet through wholesale changes in business processes, a huge opportunity exists. Nonetheless, little or no focused effort is preparing people for this new environment, or even to thoroughly understand it. The IT-services sector is in dire need of people who are talented in the application of technologies to help businesses, governments, and other organizations improve what they do now - plus tap into totally new areas. The complex issues surrounding the transformation of businesses at such a fundamental level require the simultaneous development of both business methods and the technology that supports those methods. This is the seedbed for a new discipline that industry and academia are coming to call "services science." Services science would merge technology with an understanding of business processes and organization - a combination of recognizing a company's pain points and the tools that can be applied to correct them. To thrive in this environment, an IT-services expert will need to understand how that capability can be delivered in an efficient and profitable way, how the services should be designed, and how to measure their effectiveness. This discipline would bring together ongoing work in the more established fields of computer science, operations research, industrial engineering, management sciences, and social and legal sciences, in order to develop the skills required in a services-led economy.
Today, IT-services training is mostly accomplished through individual companies' on-the-job programs. This may have been adequate before, but it's not any longer, especially with increasing globalization. We're now entering a new phase where value will be found in what we do with information to improve business, government, and people's lives. Call it an innovation-based economy, where profits and jobs will go to those who have the skills to capitalize on the explosion of new opportunities at the intersection of business and technology.
Part II with my views on this topic shall be published shortly.
Peter Drucker gave the cadre of professionals like doctors, lawyers, engineers etc. an enduring name: knowledge workers. These are, he wrote, "people who get paid for putting to work what one learns in school rather than for their physical strength or manual skill." What distinguished members of this group, and enabled them to reap society's greatest rewards, was their "ability to acquire and to apply theoretical and analytic knowledge." The world has changed. The future no longer belongs to people who can reason with computer-like logic, speed, and precision. It belongs to a different kind of person with a different kind of mind. Today, amid the uncertainties of an economy that has gone from boom to bust to blah, there's a metaphor that explains what's going on - and it's the right brain. The Information Age that Americans prepared for is ending. Rising in its place is the Conceptual Age, an era in which mastery of abilities that we've often overlooked and undervalued marks the fault line between who gets ahead and who falls behind. This shift - from an economy built on the logical, sequential abilities of the Information Age to an economy built on the inventive, empathic abilities of the Conceptual Age - sounds delightful. The causes: Asia, Automation and Abundance. The effect: the scales tilting in favor of right-brain-style thinking.
Asia: Those squadrons of white-collar workers in Asia are spooking their counterparts across North America and Europe. According to Forrester Research, 1 in 9 jobs in the US information technology industry will move overseas by 2010. And it's not just tech work: it extends to chartered accountants preparing American tax returns, lawyers researching American lawsuits, and radiologists reading CAT scans for US hospitals. Outsourcing to Asia is overhyped in the short term, but underhyped in the long term. Americans are not going to lose their jobs tomorrow. (The total number of jobs lost to offshoring so far represents less than 1 percent of the US labor force.) But as the cost of communicating with the other side of the globe falls essentially to zero, as India becomes (by 2010) the country with the most English speakers in the world, and as developing nations continue to mint millions of extremely capable knowledge workers, the professional lives of people in the West will change dramatically. If number crunching, chart reading, and code writing can be done for a lot less overseas and delivered to clients instantly via fiber-optic cable, that's where the work will go. But these gusts of comparative advantage are blowing away only certain kinds of white-collar jobs - those that can be reduced to a set of rules, routines, and instructions. That's why narrow left-brain work such as basic computer coding, accounting, legal research, and financial analysis is migrating across the oceans. But that's also why plenty of opportunities remain for people and companies doing less routine work - programmers who can design entire systems, accountants who serve as life planners, and bankers expert less in the intricacies of Excel than in the art of the deal. Now that Asians can do left-brain work cheaper, the US must do right-brain work better.
Automation: Last century, machines proved they could replace human muscle. This century, technologies are proving they can outperform human left brains - they can execute sequential, reductive, computational work better, faster, and more accurately than even those with the highest IQs. Stockbrokers, lawyers and financial agents are all feeling the heat of automation. Consequently, legal abilities that can't be digitized - convincing a jury or understanding the subtleties of a negotiation - become more valuable. The routine functions are increasingly being turned over to machines. The result: as the scut work gets offloaded, engineers will have to master different aptitudes, relying more on creativity than competence. Any job that can be reduced to a set of rules is at risk. Now that computers can emulate left-hemisphere skills, Americans have to rely ever more on their right hemispheres.
Abundance: Left brains have made Americans rich. The Information Age has unleashed a prosperity that in turn places a premium on less rational sensibilities - beauty, spirituality, emotion. For companies and entrepreneurs, it's no longer enough to create a product, a service, or an experience that's reasonably priced and adequately functional. In an age of abundance, consumers demand something more. Liberated by this prosperity but not fulfilled by it, more people are searching for meaning. From the mainstream embrace of such once-exotic practices as yoga and meditation, to the rise of spirituality in the workplace, to the influence of evangelism in pop culture and politics, the quest for meaning and purpose has become an integral part of everyday life. This will only intensify as the first children of abundance, the baby boomers, realize that they have more of their lives behind them than ahead. In both business and personal life, now that left-brain needs have largely been sated, right-brain yearnings will demand to be fed.
As the forces of Asia, automation, and abundance strengthen and accelerate, the curtain is rising on a new era, the Conceptual Age. If the Industrial Age was built on people's backs, and the Information Age on people's left hemispheres, the Conceptual Age is being built on people's right hemispheres. Americans have progressed from a society of farmers to a society of factory workers to a society of knowledge workers. And now comes the next level of progress: a society of creators and empathizers, pattern recognizers, and meaning makers. To flourish in this age, Americans need to supplement their well-developed high-tech abilities with aptitudes that are "high concept" and "high touch." High concept involves the ability to create artistic and emotional beauty, to detect patterns and opportunities, to craft a satisfying narrative, and to come up with inventions the world didn't know it was missing. High touch involves the capacity to empathize, to understand the subtleties of human interaction, to find joy in one's self and to elicit it in others, and to stretch beyond the quotidian in pursuit of purpose and meaning. Developing these high-concept, high-touch abilities won't be easy for everyone; for some, the prospect seems unattainable. Forget what parents told young Americans in the past. Instead, do something Asians can't do cheaper, something computers can't do faster, and something that fills one of the nonmaterial, transcendent desires of an abundant age. In other words: go right, young man and woman, go right. Indeed an excellent, thought-provoking article - an elaboration of the future path to abundance and survival for Western society would make it complete. This is one of the finest pieces published this year.
We recently covered the spectacular growth of China, and also covered Om Malik's famous line: "The axis of technology has shifted somewhere to the South China Sea". The size of the Chinese market is obviously a major pull, says this article. The sale of IBM's PC division to Chinese giant Lenovo was not the only sign of an ominous shift of the IT industry's centre of gravity towards the emerging economic superpower and its neighbours. Networking giant Cisco, blaming increasing competition from Asian manufacturers, announced that it is to move from selling individual devices to becoming a systems provider, offering solutions rather than products. This will put it into direct competition with IBM, whose China deal reflects a similar shift. The software industry, itself moving towards providing services rather than products, is also feeling the eastern wind of change. PalmSource, the software spin-off from Palm, announced that it is buying China MobileSoft, a developer of software for mobile phones. The deal gives PalmSource an entry into the Chinese market; but more significantly from the West's point of view, the company also announced that it is to implement the Palm OS interface, its biggest asset, on the Linux operating system.
China is already a big Linux user and may provide a critical mass of applications and users that could make open source a major player in the mobile and desktop markets. Microsoft, meanwhile, faces a separate threat from clones of its Office products. A Chinese company called Evermore launched what it called an advanced English-language edition of its Evermore Integrated Office (EIS) into the US, Japanese and Chinese markets. Written in Java, it runs under both Windows and Linux and claims to be more tightly integrated than the Microsoft product. The spreadsheet, word-processing and business-graphics functions are accessible from a single module rather than being separate programs; and linked data is more easily synchronised in EIS than in Office - changes to a spreadsheet, say, can easily be reflected in a table in a word-processor document. EIS can also create PDF files natively, whereas Microsoft Office requires a plug-in. Some time back the West was worried that China might pursue its own standards in communications, including mobile technology, chip standards and emerging technologies like RFID. Now there are signs of this being repeated in the software product market as well. The size of the market, its fast growth and China's determination to exhibit its might make the difference.
We have covered recent advances in search technology through a number of posts in the recent past. In Advances in video and multimedia search, covering the plans of major search engines and research groups, we wrote: "With broadband, content explosion and increasing use of the internet for day-to-day activities, search technologies that were developed for searching flat HTML files are no longer sufficient to meet current requirements. Special technologies that search multimedia files, provide non-linear search capabilities, find patterns and produce search results by factoring in multidimensional attributes are the focal area of the search industry for now and the near future". We also covered the picture search tool Montage, and initiated coverage of IBM's plan for the corporate search market. Ramesh Jain's insightful perspective on search, questioning the need for yesteryear search mechanisms, and his views on advancing search technology were also covered in this blog recently. Jeff Nolan points out this interesting article about the advances happening in search technology. Excerpts with edits:
As "Googling" has become synonymous with doing research, online search engines are poised for a series of upgrades that promise to further enhance how we find what we need. New search engines are improving the quality of results by delving deeper into the storehouse of materials available online, by sorting and presenting those results better, and by tracking your long-term interests so that they can refine their handling of new information requests. In the future, search engines will broaden content horizons as well, doing more than simply processing keyword queries typed into a text box. They will be able to automatically take into account your location - letting your wireless PDA, for instance, pinpoint the nearest restaurant when you are traveling. New systems will also find just the right picture faster by matching your sketches to similar shapes. They will even be able to name that half-remembered tune if you hum a few bars.
Much of today's digital content remains inaccessible because many of the systems hosting (holding and handling) that material do not store Web pages as users normally view them. These resources generate Web pages on demand as users interact with them. Typical crawlers are stumped by such resources and fail to retrieve any content. This keeps a huge amount of information - approximately 500 times the size of the conventional Web, according to some estimates - concealed from users. Efforts are under way to make the "hidden Web" as easy to search as the visible one. Some search engines attempt to identify patterns among the pages that most closely match a query and group the results into smaller sets. These patterns may include common words, synonyms, related words or even high-level conceptual themes identified using special rules - Northern Light and Clusty, for example. Another way computer tools will simplify searches is by looking through your hard drive as well as the Web. "Implicit search" capabilities can retrieve relevant information without the user having to specify queries. The implicit search feature reportedly harvests keywords from textual information recently manipulated by the user, such as e-mail or Word documents, to locate and present related content from files stored on the user's hard drive. Microsoft may extend the search function to Web content and enable users to transform any text displayed on screen into queries more conveniently. Sophisticated software will collect interaction data over time and then generate and maintain a user profile to predict future interests. Another class of context-aware search systems would take into account a person's location; GPS and RFID may be integrated with search. A key problem in finding a specific tune is how best to formulate the search query.
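The implicit-search idea described above - harvesting keywords from text the user is currently working on and matching them against local files - can be sketched in a few lines. This is a minimal illustration of the general technique, not any vendor's actual algorithm; the function names, stopword list and overlap scoring are my own assumptions:

```python
from collections import Counter
import re

# A tiny illustrative stopword list; real systems use much larger ones.
STOPWORDS = {"the", "a", "an", "of", "to", "and", "in", "is", "for", "on", "that"}

def harvest_keywords(text, k=5):
    """Pull the k most frequent non-stopword terms from recently edited text."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return [w for w, _ in counts.most_common(k)]

def implicit_search(active_text, local_files):
    """Rank stored documents by how many harvested keywords they share."""
    keys = set(harvest_keywords(active_text))
    scored = [(len(keys & set(re.findall(r"[a-z]+", doc.lower()))), name)
              for name, doc in local_files.items()]
    return [name for score, name in sorted(scored, reverse=True) if score > 0]

files = {"notes.txt": "notes on search engine design",
         "recipe.txt": "chocolate cake recipe"}
print(implicit_search("search engine ranking search engine", files))
```

A production system would weight terms (e.g. with TF-IDF), consult a prebuilt index instead of scanning file contents, and refresh the keyword set continuously as the user types.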
One type of solution is to use musical notation or a musical-transcription-based query language that permits a user to specify a tune by keying in alphanumeric characters representing musical notes. The string-matching function must accommodate a certain amount of "noise," since few users will hum or transcribe a melody perfectly.
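A common way to tolerate that noise is approximate string matching: encode each tune as a string of note letters and accept catalog entries within a small edit distance of the query. A minimal sketch, in which the note encoding and the `max_noise` threshold are illustrative assumptions rather than a description of any particular product:

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming with a rolling row."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # delete a note
                           cur[j - 1] + 1,              # insert a note
                           prev[j - 1] + (ca != cb)))   # substitute a note
        prev = cur
    return prev[-1]

def match_tune(query, catalog, max_noise=2):
    """Return titles whose note sequence is within max_noise edits of the query."""
    return [title for title, notes in catalog.items()
            if edit_distance(query, notes) <= max_noise]

catalog = {"Ode to Joy": "EEFGGFED", "Twinkle": "CCGGAAG"}
print(match_tune("CCGGABG", catalog))  # one wrong note still finds "Twinkle"
```

Edit distance treats a dropped, added or mis-sung note as a single unit of "noise"; real query-by-humming systems typically also normalise for key and tempo by matching relative pitch contours rather than absolute notes.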
- Future search services will not be restricted to conventional computing platforms, but could be extended to telematics systems, with search capabilities also embedded in entertainment equipment such as game stations, televisions and high-end stereo systems.
- Search technologies will play unseen ancillary roles, often via intelligent Web services, in activities such as driving vehicles, listening to music and designing products.
- Another big change in Web searching will revolve around new business deals that greatly expand the online coverage of the huge amount of published materials, including text, video and audio, that computer users cannot currently access.
- Next-generation search technologies will become both more and less visible as they perform their increasingly sophisticated jobs. The visible role will be represented by more powerful tools that combine search functions with data-mining operations: specialized systems that look for trends or anomalies in databases without actually knowing the meaning of the data. The unseen role will involve developing myriad intelligent search operations as back-end services for diverse applications and platforms. Advances in both data-mining and user-interface technologies will make it possible for a single system to automatically provide a continuum of sophisticated search services, integrated seamlessly with interactive visual functions. Eventually it will be difficult for computer users to determine where searching starts and understanding begins.
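The data-mining role mentioned in the last point - flagging anomalies in a database without knowing what the data means - can be illustrated with the simplest possible technique, a z-score outlier check. This is a toy sketch of the general idea, not any particular product's method; the threshold value is an arbitrary assumption:

```python
from statistics import mean, stdev

def find_anomalies(values, threshold=2.5):
    """Flag values lying more than `threshold` standard deviations from the mean.

    The check is purely statistical: it needs no knowledge of what the
    numbers represent, which is what lets such systems stay domain-agnostic.
    """
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > threshold]

# A run of ordinary readings with one outlier that gets flagged.
print(find_anomalies([10, 11, 9, 10, 12, 10, 11, 9, 100]))
```

On a stream of, say, transaction amounts, a value far from the mean gets flagged for human review; production systems use more robust statistics (median and MAD, multivariate models) precisely because a large outlier inflates the mean and standard deviation themselves.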
The global service economy is in a state of upheaval and transformation. The author expects to see a wider - and more widely distributed - array of services available via networked technology in the coming years. That will prove to be a threat to some, as it was in the recent debate over outsourcing and offshoring. As Accenture's Anatole Gershman has written, the challenge for business leaders "is not tracking the technology or making sense of the standards behind Web services. It is understanding the opportunities that lie ahead. When Web services reach their full potential, they will change the way we do business." Web services have the potential to redefine and radically grow our modern service economy. These could be advanced services that virtually link processes - the dynamic, real-time matching of supply and demand for high-value services - much as Commodore Vanderbilt's railroads and long-distance communication made it possible to build national and global markets for a "product economy" of manufactured goods in the late 19th century. Many organisations, FedEx among them, are also trying to exploit the potential of web services in a fundamentally business-transforming manner.
This is significant, as the potential reach of the web services movement is mind-boggling: approximately two-thirds of the GDP of advanced countries is driven by services, and web services can play a dominant role there by unbundling and aggregating, liberating and extending the reach of today's services. As he puts it, "More and more, products will become a channel for service, and customer relationships will change because many newly possible services will be delivered dynamically…Existing suppliers will be able to deliver highly personalized services and maintain continuous customer interaction. Some may join the ranks of intermediaries who emerge to broker web services." There are limitless opportunities waiting to unfold - in the virtual world, eBay, Amazon, Google and Microsoft have shown how new businesses can be built using web services. Read our recent coverage in Giving Away The Store Through Amazon Web Services.
Any movement would require many interventions and accelerators. We recently covered IBM's announcement that anyone developing open source software will be allowed to make royalty-free use of 500 IBM patents. The scope of the patents is expansive, covering areas as diverse as Web services, e-mail, message queuing, Web browsers, parallel processing, database storage, encryption, voicemail systems, and even systems designed to prevent motorists from falling asleep. AMR Research writes that IBM's promise makes it harder for Microsoft and Sun to monetize Web services transactions. It is also a further signal that anyone wanting to handicap open source through litigation will have to get through IBM first. The patent pledge makes it harder to monetize Web services transactions: the 500 patents include several related to IBM's approach to Web services. Although many vendors are signing up to provide a Web services platform, they have been relatively quiet about how they would charge for it. Customers are concerned that vendors want to push demand-based pricing too far, actually charging for each business transaction that gets passed through Web services. By making these particular patents available for free use through open source, IBM is suggesting its answer: Web services transactions should be free. This aligns closely with the position of the primary Web standards body, the W3C, which has a preference for royalty-free use of technology. If the W3C finds favor with IBM's plan, it will be much harder for others to pick the other side.
My Take: While I certainly agree with the humongous potential of web services to create new business value, Nicholas Carr's point of view, which we recently covered in Do Web Services Matter - "Web services will play a key role in overcoming incompatibilities between information systems, helping create a more standardized and productive IT infrastructure for business. But they're not going to usher in an age of protean value chains. And they're not going to usurp the place of managers in negotiating, overseeing and modifying complex partnerships. Machines matter, but people matter more" - is equally valid. I therefore conclude that IT and web services shall remain enablers, and only together with powerful strategies, new business models and process transformation can web services play a significant role in creating a new service world for the future.