
Wednesday, November 30, 2005

Blu-Ray and HD-DVD Give Way: Enter Holographic Storage

While the battle over the importance of the Blu-Ray vs. HD-DVD format war goes on, companies including Turner have already begun to move towards holographic storage. DVD had a good 10+ year run, but the same timespan may not be available for the next upturn. Whichever format "wins" will only be the "best" for maybe 3-5 years as far as capacity and read/write speed go. Adoption seems to be happening faster than expected. Early versions of holographic disk storage store up to 300GB per disk: a disc about the size of a DVD that can hold 60 times more data, with the ability to read and write data at 10 times the speed of a normal DVD. By 2010, the disks are expected to cost $100 each and store 1.6 terabytes (roughly 1,600 gigabytes) each. Some expect the 1TB-per-disk barrier to be broken in the next two to three years. Support for HD-DVD or Blu-Ray may lose steam with rollouts like this being planned. I think that Blu-Ray and HD-DVD are both fighting a losing battle because holographic discs are pretty close. Blu-Ray and HD-DVD are looking to satisfy a couple of needs: more space (related to storing and retrieving movies), high definition (which most people cannot fully use), and tougher encryption (a pet theme for movie studios). Complications galore: for Blu-Ray and HD-DVD to be successful in the interim, DVD, the most popular format today, has to disappear. The movie studios may need to stop releasing DVDs to accelerate adoption of a different standard.
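A quick back-of-the-envelope check of those capacity claims (assuming a standard 4.7 GB single-layer DVD, a figure the post does not state):

```python
# Rough sanity check of the capacity claims, assuming a standard
# 4.7 GB single-layer DVD (an assumption; the post does not say).
dvd_gb = 4.7
holo_early_gb = 300          # early holographic disk capacity
holo_2010_gb = 1600          # projected 1.6 TB disk by 2010

capacity_ratio = holo_early_gb / dvd_gb
print(round(capacity_ratio))            # ~64, i.e. the "60 times" cited
print(holo_2010_gb / holo_early_gb)     # a further ~5.3x jump by 2010
```

So the "60 times more data" figure is consistent with early 300GB disks against a single-layer DVD.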


The World Is Flat As An Extensible Wiki

The World is Flat has won the first Financial Times and Goldman Sachs Business Book of the Year award. Thomas Friedman has a vision for the final edition of The World is Flat: anybody will be able to update it. He is looking at actually turning the book into an open-source product, putting it up on the web like Wikipedia and letting people manage it. That may be a concern for its publishers, besides the risk that opponents of the book or its message about the benefits of globalisation will try to hijack the wiki edition. But it is a vision that is perfectly in tune with the picture of a globalised and interconnected world that Mr Friedman outlines. The rapid evolution over the past year of what is known as "Web 2.0", a flowering of internet companies and strategies based on the growing availability of cheap or free software online, has made it possible for budding entrepreneurs and big companies to experiment with strategies without having to "bet the firm".

A version of the book that can be constantly updated may also be the only way to guarantee that it remains current. The book's premise is that, at the beginning of this century, the world entered a new phase of globalisation, based on disruptive social, political and technological events ("flatteners", as Mr Friedman calls them) during the latter part of the 20th century. In this flatter world, companies and individuals will be able to collaborate and compete more successfully, whatever their size and wherever they are. Those that fail to adapt will suffer, he says. Mr Friedman says the evolution towards an interconnected, flatter world has accelerated since the book was completed nearly a year ago. This month, for instance, the audio version of The World is Flat became the top-selling podcast album on Apple's iTunes audio downloading site, says the author. He notes that when he started this book in March 2004, podcasting didn't exist.



Enterprise Software & Innovation

Innovation is not happening as much as it should in the enterprise software industry. Even on-demand is seen in some quarters as an extension of what was getting done in the past. Most of us feel strongly about it; read Vinnie Mirchandani's perspective on this. Similar is the case in the open-source world as well. Most of the software sales that we get to see are incremental sales or minimal functionality extensions; we do not necessarily see new waves in adoption of enterprise software. I strongly believe that SOA & composites would force structural changes and unleash innovation, whereby software will be described as a portfolio of capabilities and possibilities instead of modularized applications. Data models will be standards-based and externalized to enable interworking between services, and data will be considered to be like any other form of "digital content", ever ready for exchange and transformation between systems. Grady Booch points out that in the enterprise space, a considerable amount of innovation is happening in two dimensions:
- on the edges of the enterprise and
- in the spaces between enterprises

He sees that on the edges, energy is being spent in frameworks that live above middleware (domain-independent ones, such as Ajax, as well as domain-specific ones, aligned around specific industries) and those that work in conjunction with devices (classical mobile devices and devices with RFID tags). In the spaces between, there is the formation of systems of systems (both legacy and new systems). In the context of such Web-centric systems, services have proven to be an essential mechanism in both dimensions. On the edges, services deliver behavior in a manner that transcends the underlying technology. In the spaces, services provide a powerful means of semantically rich communication between systems of various kinds.
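The "portfolio of capabilities" idea above can be caricatured in a few lines: services that consume and produce externalized "digital content", composed into larger capabilities without knowing each other's internals. Everything here (service names, data fields, prices) is hypothetical, purely for illustration:

```python
# Illustrative sketch of service composition over an externalized data
# model: each service consumes and produces plain "digital content"
# (dicts here), so a composite can wire capabilities together without
# knowing their internals. All names and numbers are hypothetical.

def pricing_service(order: dict) -> dict:
    """Capability: attach a price to an order."""
    priced = dict(order)
    priced["price"] = 10.0 * order["quantity"]
    return priced

def tax_service(order: dict) -> dict:
    """Capability: attach tax to a priced order."""
    taxed = dict(order)
    taxed["tax"] = round(order["price"] * 0.08, 2)
    return taxed

def compose(*services):
    """Build a composite capability from a pipeline of services."""
    def composite(payload: dict) -> dict:
        for service in services:
            payload = service(payload)
        return payload
    return composite

quote = compose(pricing_service, tax_service)
print(quote({"item": "widget", "quantity": 3}))
# {'item': 'widget', 'quantity': 3, 'price': 30.0, 'tax': 2.4}
```

The point of the sketch is the shape: because the data model is shared and external, the composite is just wiring, which is where the "portfolio of possibilities" flexibility comes from.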


Tuesday, November 29, 2005

Writely To Word Would Be Like Hotmail To Outlook

The innovation that is happening in the personal computing space is amazing. Brian Livingston points to the fact that the days when Web browsers made you wait just to update the screen are ending. Today a Web page can be just as responsive as any program you run on your PC. The biggest upheaval AJAX will cause will be new powers in word processing, especially for documents that different people must write, edit, and approve. Microsoft Word has long enabled co-workers to insert comments into draft documents. And collaborative environments such as Microsoft's SharePoint Services are making "shared workspaces" possible, so multiple individuals can simultaneously edit documents safely. None of these methods, however, is as simple to set up as a standard Web browser, which can quickly access a file from anywhere in the world. That's the promise that AJAX brings to the party. InetWord is a Web-based way to efficiently edit HTML files and office documents. One of InetWord's most serious competitors is another AJAX tool named Writely. In its current beta stage, Writely allows anyone to register for a free account and start uploading files. The document's creator can give editing privileges to any number of other people. The service is optimized for editing HTML files, but you can also upload Microsoft Word and OpenOffice files, which are converted on the fly into HTML format. Best of all, editing by multiple individuals is supported by Writely in real time. There's no risk that edits made by one person can wipe out the changes made by another. InetWord focuses on sharing documents with yourself: by hosting a file, it allows the user to edit from multiple locations, while Writely is more focused on dynamic collaboration. Writely will be to Word what Hotmail was to Outlook. If you work with other people to create, edit, and approve documents, Web-based applications like Writely now offer an alternative to in-house content-management systems.
Look at the comparison post on online word processors.


The End Of Copyright & Changing Role Of Intermediaries

Gamasutra thinks that we are witnessing the beginning of the end of a major era in world history. It may take fifty years, it may take a hundred, but the age of copyright is drawing to a close. This is not about the Google controversy. While I am not sure whether a world without copyrights is a good thing or a bad thing, it's inevitable. With Gutenberg's printing press, the concept of intellectual property was born. Over the next two centuries or so, copying books went from being high praise to being a crime. Then photocopiers became commonplace, and when enough people feel that it's OK to do a thing, that thing ceases to be wrong in their own cultural context. The Fair Use doctrine evolved with respect to copyrighted materials. The law changed. It's now OK to photocopy parts of books for educational, non-commercial use. In effect, the authors and book publishers had to give some ground in the face of the overwhelming tide of public opinion. There's no intrinsic reason why someone should continue to get paid for something long, long after the labor they expended on it is complete. Architects don't get paid every time someone steps into one of their buildings. They're paid to design the building, and that's that. This is the age of new techniques. Travesties like the Digital Millennium Copyright Act don't promote the progress of science; they actively discourage it. So do software and biotechnology patents. The patent system was intended to allow inventors to profit for a limited time on particular inventions, not to allow huge technology companies to put a stranglehold on innovation by patenting every tiny advance they make. The lawsuits, the spyware, the DMCA: these are the death struggles of an outdated business model. The urge to create is too strong in all of us, and consumers will always be willing to pay for novelty and for excellence. The old methods of monetising creative work may no longer matter. It may mean that nobody gets mega-wealthy any more.
What it does mean for sure is that the giant dinosaurs that currently dominate the distribution channels had better learn to adapt or die. Nicholas Carr highlights that far from experiencing disintermediation, business is undergoing precisely the opposite phenomenon - hypermediation. Transactions over the web, even very small ones, routinely involve all sorts of intermediaries, not just the familiar wholesalers and retailers, but content providers, affiliate sites, search engines, portals, Internet service providers, software makers, and many other entities that haven't even been named yet. These middlemen need to look at new models - It's no coincidence that the most profitable internet businesses - eBay, Google, Yahoo - play intermediary roles. They've realized that, when it comes to making money on the web, what matters is not controlling the ultimate exchange (of products or content or whatever) but controlling the clicks along the way. That's become even more true as advertising clickthroughs have become the main engine of online profits.


Apple Mac Mini As The Digital Hub

Think Secret speculates that Apple's Mac mini will be reborn as the digital hub centerpiece it was originally conceived to be. The new Mac mini project, code-named Kaleidoscope, will feature an Intel processor and include both Front Row 2.0 and TiVo-like DVR functionality. Following the expectation that the Apple iBook may host Intel chips comes the next one: the new Mac mini is also said to sport a built-in iPod dock, a feature that was scrapped from the Mac mini Apple first introduced one year ago. The Mac mini may be positioned as the living room command center. Apple's Front Row 2.0 and Apple's DVR application are likely to be a "TiVo-killer." Apple's media center intentions have become startlingly clear in the past year since Apple first delivered the Mac mini and customers first started connecting the system to home theaters and installing it in automobiles. With the hardware, software, and iPod sales behind it, Apple now seems poised to firmly plant its footprint in living rooms.
The Digital Home and digital convergence are hot topics. These are expected to change the lifestyle of people, at least in rich countries to start with, and more importantly open up new opportunities for business in allied areas. Intel envisions that PCs will be "all-in-one hi-fi devices", "entertainment PCs", and "vaults" for digital content. Intel's vision is that consumers will start to use their PCs at home to download, store and manage films, songs and games, in order to transmit all this fun stuff wirelessly to TV screens and stereo speakers throughout the house. Consumer take-up of digital technology in the home is at a 'tipping point' which could lead to a dramatic increase in sales for converged devices that integrate video, audio and computer technology. The industry has talked up the idea that computers will finally move from the home office to the living room for many years, but Ballmer said he thinks this theory may be about to become a market reality. After all, the critical mass has to come from the PC, or a next-generation video device. Positioned as the Mac for the masses, clearly a major change in direction is bound to happen. The Mac Mini clearly has a glorious future.


Software Services : Trust & Morality

Alex Bosworth writes that as the software industry changes from shrink-wrapped product development to a service model, not only does the model for developing and distributing software change, but also the model for how a company must conduct itself. He points out that every search, every click, the time spent lingering on a photo, the choice of wording and revision in an essay, the pattern of trips taken, the record of purchases made are all easily captured and stored forever by a web service. Unlike the user of a packaged software product, a user of a web service has very little control over this data collection. GMail may appear to delete email when 'trashed', but closer inspection reveals no firm guarantee the email ever disappears from their data centers, and certainly not at the time a user clicks delete forever. This is detailed personal data that lasts lifetimes; bits don't have a built-in expiration date. He points to TiVo users complaining about embarrassing profiles. Software services have increasing potential to profile people on a much grander scale than what television a person might want to watch. Rogue nations can profile and target people for imprisonment or reprisal based on web-service data mining, just as Google has the potential to deliver personalized advertisements. Social software raises a new degree of concern, as these services aim to graph your entire network of friends and acquaintances. Suddenly services don't simply know as much information about you as you release yourself; they also know what your friends think about you, and information about them as well. Social software is at such an early stage that it's hard to think of all the abuses this information could create; however, abuses of this information trust already exist. Sony slipping DRM rootkits onto their CDs can erase a lifetime of goodwill. The only way to restore or create trust is, over time and repetition, to create a pattern of ethical decisions.
Look at RFID privacy-related issues, online tracking facilities, spyware: we are indeed in a cocooned web world. James Governor points to the customer respect report and notes that IBM, SAP and Microsoft are on the list as "excellent" while not one of the Web 2.0 leviathans, Google, Yahoo, eBay, is, and wonders whether customer respect is an essential part of a successful online business model. Surprising, considering the fact that the list looked different a year back. The most important thing for services and users of services to realize is that trust is an extremely valuable commodity that is hard won and easily lost.


Sunday, November 27, 2005

Our Brave New World – The World Of Platforms & Process Supremacies

Charles and Louis-Vincent Gave, along with Anatole Kaletsky, have written a brilliant and easily read 150-page book called Our Brave New World.
John Mauldin highlights some important points made in the book: Until proven wrong, the trend of the tech-dominated NASDAQ is higher, while the trend of the GMs, Fords, Alcoas, US Steels and the like is downward in broad terms. The 'platform' companies sell high-margin design; the others sell low-margin manufacturing. In the post-industrial age, we need to own the former and to be short the latter. The book echoes the simple understanding that, in the words of Lord Keynes, 'The facts have changed, and when the facts change, we change.' The 'facts' of the world have indeed changed, and we want to own design firms, software firms, pharmaceutical firms, firms that sell high-end services, companies like IKEA (if it were public) that sell high-end goods but have all of the manufacturing hived off to the cheapest place of manufacture and can drive costs down while holding margins high. The facts have changed and we need to understand that.
"Platform companies" are the wave of the future. We are not talking here about these types of platform companies. The platform companies referred to in this post are companies which design and market, but let someone else manufacture. Thus they have lower capital costs and are less subject to economic downturns, as they have no factories to shut down or workers to lay off. There are no legacy costs and no unions to fight. If anything, given the nature of the global economy, there aren't going to be many products insulated from the global pricing structure, not even services. So platform companies are the new growth companies: not by virtue of what they sell, but by virtue of how they are organized. This means we shall see more & more product development in software getting outsourced. So it's the age of the Solectrons, and competitive advantage comes in terms of flexibility, speed of response and customer delight, all pointing to the supremacy of business models and the ubiquitous nature of processes.


Editors & Bloggers Control Part Of The Reader's Attention

We covered the development that internet reach has surpassed the reach of radio. Chris Anderson says he does not read mainstream publications; anything relevant to his interests therein will anyway be pointed to by his regular blog reads. He does not think that the mainstream media's value is lost; rather, external editors in the form of a network of bloggers do a good job. He sees his job as an editor to be one of a pre-filter, and the bloggers are post-filters. So mainstream editors just control part of the attention chain, but the wise crowd controls the rest.
The blogs that have earned their way to my feed list are adding value to commodity information in at least three ways:
1. Add value with a unique perspective or analysis.
2. Add value with unique information.
3. Add value by providing a unique filter/lens on content available elsewhere.
The magazines that have a place in this world are those that offer something one can't get elsewhere & do one or more of the following:
1. Add value with unique perspective or analysis (above and beyond what's already out there).
2. Add value with unique information (often obtained through the privileged access still afforded the mainstream media).
3. Add value with unique presentation, especially using immersive forms that don't work well on-screen, such as long-form narrative and lavish packaging (including photography, infographics and other design elements).
While discussing the future of news in the Convergence, RSS and iPod age & the future of newspapers, we emphasised the point that the future of newspapers is to change from a news organization into a news community. As we noted here, it is clearly the case that the Internet's distinctive role in politics/business has arisen because it can be used in multiple ways. Part deliberative town square, part raucous debating society, part research library, part instant news source, and part comedy club, the Internet connects voters to a wealth of content and commentary about politics/business. After all, the new forms of media are part of emerging trends.


Amazing Sense Of Scales

I dugg this: a visual sense of various scales, as published here.

- 1.616x10^-35 m : the Planck length (the smallest measurement of length that has meaning)

- 1x10^-15 m : one fermi

- 12,756.2 km (1.27562x10^7 m) : diameter of Earth at its equator

- 2.4x10^26 m (some 26,000,000 kly) : distance to farthest known object (quasar SDSS_1044_0125)

There are several more listed therein. Sometimes such data can give us so much to reflect upon: the amazing variety that we encounter in something as direct as scale. Notice that most of the bewildering scales talked about here are right in the realms of nature. Clearly nature has no match.
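To drive home how wide that span is, the ratio between the largest and smallest lengths in the list is itself an absurdly large number:

```python
# Ratio between the largest and smallest scales in the list above.
planck_m = 1.616e-35        # Planck length
farthest_m = 2.4e26         # distance to quasar SDSS_1044_0125

ratio = farthest_m / planck_m
print(f"{ratio:.1e}")       # about 1.5e+61
```

Sixty-one orders of magnitude between the two ends of the list, and nature fills in interesting structure at nearly every step in between.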


Digg & Engadget Attract More Traffic Than Slashdot

We recently covered Digg trying to catch up with Slashdot in terms of traffic.
Alexa now reports higher traffic for Engadget and Digg compared to Slashdot. Digg started with the notion of leveraging the collective mass of the Internet in various ways: applying it to content, using it to rank content, using it to make content more palatable to the masses. Automated systems take time to crawl the net. Editorial systems have the human factor; the editors may decide they're not interested that day, or that they'll do it tomorrow. In Digg's case, there's no barrier. The larger the critical mass of users and the collective wisdom applied to Digg, the better and more relevant the stories get. There are systems in place to prevent abuse of digging. Coupled with a unique ranking system and the power of collective wisdom, Digg gets more powerful. The founders note that Digg started with an application, and is using components of social networking to expand the value of the site. In the case of social networking, it serves one very distinct purpose: introduction. A truly amazing story.
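Digg has never published its ranking formula, so the following is only a guess at the general shape such collective ranking systems take: votes ("diggs") push a story up, age pulls it down, so fresh and heavily dugg stories float to the top.

```python
# Illustrative vote-plus-decay ranking in the spirit of Digg's front
# page. Digg's actual algorithm is not public; this only sketches the
# general shape: more diggs push a story up, age pulls it down.

def score(diggs: int, age_hours: float, gravity: float = 1.5) -> float:
    return diggs / (age_hours + 2) ** gravity

stories = [
    ("fresh story, few diggs",  {"diggs": 40,  "age_hours": 1}),
    ("old story, many diggs",   {"diggs": 400, "age_hours": 48}),
    ("mid-age story",           {"diggs": 120, "age_hours": 6}),
]

ranked = sorted(stories, key=lambda s: score(**s[1]), reverse=True)
for title, attrs in ranked:
    print(f"{score(**attrs):7.2f}  {title}")
```

With these invented numbers, the fresh story outranks the heavily dugg but stale one, which is the behavior that makes a front page feel alive.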


Saturday, November 26, 2005

P2P File Sharing, Long Tail & New Pricing Models

Chris Anderson points to a well-researched paper by David Blackburn on the economics of P2P file sharing, which amongst other things finds that it does indeed depress music sales overall. But the effect is not felt evenly. The hits at the top of the charts lose sales, but the niche artists further down the popularity curve actually benefit from file trading. Blackburn finds that it is unrealistic to believe that the effects of file sharing are constant across all artists, as the costs and benefits of file sharing differ with the ex ante popularity of the artist. This suggests that ex ante unknown artists are likely to see more positive overall effects of file sharing than ex ante popular artists are. By adopting an estimation procedure which allows the effect to vary according to measures of artist popularity, he finds that file sharing has had strong effects on the sales of music. In particular, new artists and ex ante relatively unknown artists are seen to benefit from the existence of their songs on file sharing networks, while ex ante popular artists suffer for it. And while the average effect across artists is essentially zero, the average effect on sales is not zero, as more popular artists not surprisingly tend to have higher sales. Thus, the paper finds that file sharing has had large, negative impacts on industry sales and that the RIAA's strategy of suing individual file sharing users has led to reduced file sharing activity and sizeable increases in sales. Furthermore, the differential effect of file sharing on the sales of artists of different levels of ex ante popularity has led to a dramatic shift in the distribution of sales among artists, as new and less popular artists are now selling more records while star artists have seen their sales shrink, compacting the distribution of outcomes. Chris points out that Blackburn does a little mathematical magic to simulate what would happen if file trading were reduced by 30%.
Artists who are unknown, and thus most helped by file sharing, are those artists who sell relatively few albums, whereas artists who are harmed by file sharing, and thus gain from its removal, the popular ones, are the artists whose sales are relatively high. The Long Tail implications of this are pretty clear. For the majority of artists further down the tail, free distribution is good marketing, with a net positive effect on sales. Which is yet another reminder that the rules are all too often made to protect the minority of artists at the top of the curve, not most artists overall. To me, it looks like a lot more financial sophistication is needed to get better returns on albums. We need to note that P2P file trading is not zero-cost, and the record companies ought to look for a different pricing model: charge a premium for big names and sell at higher prices in the peak season, i.e. the first few weeks upon launch (Blackburn notes that the sales of recorded music appear to follow decay patterns and seasonality patterns similar to those of motion pictures). And as a comment on Anderson's blog noted, there is a greater case for record companies to look into dramatically cutting the prices of tracks in the long tail. Does the same apply to Bollywood films? Not directly, as Bollywood films have different economic models, but the long tail patterns are distinctly felt.
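Blackburn's asymmetry, positive for unknown artists, negative for stars, yet negative in aggregate because stars dominate sales volume, can be caricatured in a toy simulation. All numbers here are invented; only the shape matters:

```python
# Toy simulation of Blackburn's asymmetry (all numbers invented).
# File sharing boosts sales of little-known artists (exposure) and
# cuts sales of stars (substitution); aggregate sales still fall
# because the stars account for most of the volume.

artists = [{"baseline_sales": 1000 // rank}   # crude power-law tail
           for rank in range(1, 101)]

def with_file_sharing(baseline: int) -> float:
    if baseline > 100:            # "star": substitution dominates
        return baseline * 0.80
    return baseline * 1.20        # niche artist: exposure dominates

before = sum(a["baseline_sales"] for a in artists)
after = sum(with_file_sharing(a["baseline_sales"]) for a in artists)
gainers = sum(1 for a in artists
              if with_file_sharing(a["baseline_sales"]) > a["baseline_sales"])

print(f"total sales change: {100 * (after / before - 1):+.1f}%")
print(f"artists who gain: {gainers} of {len(artists)}")
```

Most artists gain, the industry total still shrinks: exactly the "average effect across artists is zero, but the average effect on sales is not" result, in miniature.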


The PageRank Mechanism Revealed & The Birth Of Google

The PageRank model (the closest parallel that I can think of in terms of reach and power is Coke's original formula) is indeed a very powerful framework, the benefit of which is felt by all in the web world but little understood. Earlier I covered John Battelle writing about the 1996 crawler that Sergey Brin and Larry Page configured to work from the Stanford homepage outward. Inspired by citation analysis, Page realized that a raw count of links to a page would be a useful guide to that page's rank. He also saw that each link needed its own ranking, based on the link count of its originating page. But such an approach creates a difficult and recursive mathematical challenge: you not only have to count a particular page's links, you also have to count the links attached to the links. Together, Page and Brin created a ranking system that rewarded links that came from sources that were important and penalized those that did not. Page and Brin's breakthrough was to create an algorithm, dubbed PageRank after Page, that manages to take into account both the number of links into a particular site and the number of links into each of the linking sites.
I dugg around and stumbled upon the original Stanford presentation authored by Larry Page, a piece of history that Stanford is clearly proud of, which includes the basis of the Google PageRank algorithm. This is a fascinating read for anyone who has ever been interested in search engines & optimization. The slide show explains in simple terms valuable information about how PageRank is determined (do not miss the notion that a link transfers PageRank depending on how many links are outgoing from the original page). Also note the simplicity with which it is explained; this has become a Google trait for years to come. In practical terms, the relationships governing PageRank are best captured here @ prweaverblog. Some may find the PigeonRank mechanism interesting, but am sure that all those interested would find the detailed paper on PageRank computation, the authentic one, a lot more interesting and for some inspiring: how small ideas & brilliant execution can help grow a business worth several hundred billion dollars in market cap, or own private aircraft.
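The recursive definition sketched above (a page's rank is fed by the ranks of its in-linking pages, each divided by that page's out-degree) is typically solved by power iteration. A minimal sketch, using the 0.85 damping factor from the original paper:

```python
# Minimal PageRank by power iteration. Each page passes its rank to
# its out-links in equal shares; the 0.85 damping factor (the value
# used in the original paper) models the "random surfer".

def pagerank(links: dict, damping: float = 0.85, iters: int = 50) -> dict:
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            share = rank[page] / len(outs)
            for target in outs:
                new[target] += damping * share
        rank = new
    return rank

# Tiny web: B and C both link to A, so A should rank highest.
web = {"A": ["B"], "B": ["A"], "C": ["A"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))   # -> A
```

Note how the "link transfers rank divided by outgoing link count" idea from the slides appears directly as the `share` variable; this sketch ignores refinements like dangling-node handling that a production crawler would need.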


Friday, November 25, 2005

Bill Gates On Microsoft Research Direction & Edge

Bill Gates, in an interesting interview with InformationWeek, talks of various issues centered around Microsoft Research. Excerpts with some edits and comments:
Microsoft Research has always had a pretty broad set of activities, and these are growing (Microsoft has been filing some 60 patent applications a week until recently) with the addition of the fourth center in India. Gates claims Microsoft will actually have more than matched the kind of relevance that Google can deliver in a short time (I wonder what he is referring to and how it is measured, given Google's dominance) & notes that the role of Microsoft Research in that has been phenomenal. He notes that there's so much data in the sciences that without the kind of software management Microsoft has, both in its products and in its research, scientists won't be able to make the rapid advances that they should. Nowhere is that more true than in biology and the life sciences. The ability to connect these data sources together using very state-of-the-art web service and visualization approaches is pretty exciting. "Science" is not just about people designing cars; think about people designing planes, think about people thinking through the design of a Web site. It's not just new medicines, although that alone would justify all this work. It's not just modeling the environment, although that alone is a supercritical thing that we absolutely need to do, and advanced computer software will play a key role there. It's sort of the digitization of the world applied to science and business and commerce. If anything, you could actually say the fact that we need software understanding to advance the sciences means the shortage is all the more acute, because you need people sitting in these computer science classes that then go off and really focus on life sciences, and focus on environmental sciences. Software is becoming the key thing, in the way that math was historically. The world at large needs more scientific understanding. The fact that these understandings allow us to make advances in medicine, and in understanding economics and the environment, should make the field all the more attractive.
Gates makes some excellent observations: the very brightest people often study in multiple fields, & he notes that he studies physiological psychology and economics. He says Microsoft is always on the lookout for somebody who loves software but knows it so well that they're seeing how it can be applied in different ways. In an interview process, one of the best things is to ask somebody about some problem that they're working on and passionate about, to see their depth of understanding and how they go about it, rather than asking them some very specific questions. He also holds the view that multicore computing would enable focusing on more advanced areas like speech and very advanced search. He adds that when you use Live there will be some services up on the Internet, like storing your files on the Internet, being able to back those up over the Internet, being able to have documents translated for you with a service on the Internet. The fascinating thing, as he sees it, is that even though Microsoft is really just a software company, it influences the direction of hardware, & needs to constantly understand the direction of hardware.
Gates is definitely trying to position Microsoft as a great corporation, aka the IBM & Bell Labs types. Kevin once mentioned that every year, Microsoft spends more on R&D than it cost to send a man to the moon. No doubt Microsoft's bold moves to reorient Vista development, the impending Office 12 launch, its imprints at CES and a slew of other initiatives about to be made widely available are giving it a perceptibly better impression, but on innovation, and where it matters, it may need to do a lot better.


Jim Collins On Peter Drucker

Jim Collins has written a very small but sweet piece on Peter Drucker. Excerpts from the article:
Peter F. Drucker was driven not by the desire to say something but by the desire to learn something from every student he met - and that is why he became one of the most influential teachers most of us have ever known. Drucker never forgot his own teaching: ask not what you can achieve but what you can contribute. Jim thinks that if Drucker had been granted another 95 years, he would have continued to produce. At age 85, when asked which of his 26 books he was most proud of, he responded: "The next one." In the intervening years he published at least eight more volumes, and at age 95, shortly before his death, he released yet another. He had a remarkable ability not just to give the right answers but, more important, to ask the right questions - questions that would shift our entire frame of reference. Throughout his work runs a theme that highlights a fundamental shift away from achievement - jettisoning with a flick of his hand, as if he were waving away an irritating gnat, any consideration of what you can "get" in this world - to the question of contribution. Undoubtedly an inspiring personality!

Category :

Bihar : The Trendsetting Change

Shekhar Gupta sees the just-concluded Bihar election as a forerunner of a major change about to happen in the polity of India. He writes that politics in India is in grave danger of being trivialised by yet another factor - psephology. If every electoral verdict were reduced to simple arithmetic, it would become not only dull and predictable but also irrelevant. This Bihar election marks the arrival of an aspirational wave in the most backward part of Bharat, where no more than 12 per cent of infants are immunised at birth, where birth rates are higher than in the most backward countries of the world, and where per capita income is one-third of the national average and even one-sixth of that of some of the richer Indian states. A resurgence of hope and aspiration in the Hindi heartland is an event to cherish and celebrate. The voter in Bihar is defining for all of us a welcome new notion of empowerment in India's political heartland: social equality combined with religious tolerance, security, and economic upliftment and opportunity. Shouldn't it be reasonable to believe that this very welcome infection will inevitably spread to UP as well?

Category :

The Auto Web 2.0 Validator

Came across this site called the Web 2.0 Validator. Based on the thirty-second rule, given a URL, the site validates it and gives a score on Web 2.0 conformance. Popular sites like msn.com and google.com score 2 or 3 out of a maximum of 16 or 17. The interesting thing about the site is that all the rules of Web 2.0 are provided by its users. Some of the rules look downright funny:
• Uses the prefix "meta" or "micro"?
• Attempts to be XHTML Strict ?
• Has a Blogline blogroll ?
• Refers to mash-ups ?
• Appears to use AJAX ?
• Appears to be built using Ruby on Rails ?
• Refers to VCs ?

The definition of Web 2.0 changes on a daily basis, and the site checks randomly against the most recent rules decreed by its users. The Adaptive Path site scores 2/14. No better way to mock Web 2.0!
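Out of curiosity, the scoring mechanics of such a site are easy to sketch. Below is a toy, hypothetical reconstruction (not the validator's actual code): each user-contributed rule becomes a predicate over the fetched page's HTML, and the score is simply the number of rules that match.

```python
import re

# Toy rule set modeled on the site's user-contributed rules; each rule
# is just a predicate over the page's HTML.
RULES = {
    'Uses the prefix "meta" or "micro"': lambda html: bool(re.search(r"\b(meta|micro)\w+", html, re.I)),
    "Refers to mash-ups": lambda html: "mash-up" in html.lower() or "mashup" in html.lower(),
    "Appears to use AJAX": lambda html: "xmlhttprequest" in html.lower(),
    "Refers to VCs": lambda html: re.search(r"\bVCs?\b", html) is not None,
}

def web20_score(html):
    """Return (rules matched, total rules) for a page's HTML."""
    hits = sum(1 for test in RULES.values() if test(html))
    return hits, len(RULES)

page = "<p>Our microformats mash-up uses XMLHttpRequest, and the VCs love it.</p>"
print(web20_score(page))  # -> (4, 4)
```

Fetching the URL and letting users add new rules daily is where the real site's fun lies; the point here is only that the whole "validator" amounts to a list of string checks.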

Category :

Thursday, November 24, 2005

Globalization & China: The Disappointed Investors

I read during my flight out of Seoul Newsweek's very insightful article on how many global enterprises are being forced to hedge their bets on China after making massive investments and suffering extremely poor returns. A must-read for all talking about the rise of China - to understand the nature of the growth and investment happening there. I can confirm that I repeatedly hear such stories all around in my varied interactions about investments made in China (my initial surprise turned into realistic assessment, then into a certain tiredness at hearing such views again and again).
When it comes to discussing Asia, I find that among all western observers Stephen Roach is clearly ahead of the pack - others look at the blistering growth here and write rosy words; Stephen does several notches better. In his latest bulletin,
Stephen Roach writes that there's nothing flat about this unbalanced global economy. The image of a "flat world" is most appropriate for the endgame of globalization, and that ideal state is decades into the future - if that. While IT-enabled connectivity has shrunk the world in many new and important respects, the world is struggling mightily with what this connectivity has brought. China and India are reshaping the global economy as never before. The 40% of the world's population that lives in these two countries is only just getting a taste of economic prosperity. They are pushing ahead rapidly with very different development models: China has done it the manufacturing way, catering to external demand, whereas in India it has been more of a services and internal-consumption story. The theory of globalization teaches us that this is a "win-win" development. Globalization may well be win-win in the long run, but in the here and now it is profoundly asymmetrical. It has given rise to a multitude of new entrants on the supply side of the global equation but very few new consumers on the demand side. With the important exception of India, Asia remains very much an external-demand story. With the rest of Asia now increasingly integrated into a China-centric supply chain, the region remains far more skewed toward US-centric external demand than internal consumption. India's consumption-led growth dynamic is encouraging, but its per capita spending is far too low. The hyper-speed of an increasingly asymmetrical globalization is hardly the stuff of a flat world. China is now looking to services, whereas India is counting on manufacturing to fill the void. The question is: can they pull it off?
I still have my doubts about India's manufacturing strategy (he earlier wrote that the Indian manufacturing model continues to suffer from three major deficiencies - a lack of infrastructure, a low national saving rate of a little over 20%, and anemic inflows of foreign direct investment). At the same time, he argues that China is relatively clueless when it comes to understanding a services culture. He concludes that globalization at this point in time is far more about disparities between nations than the assimilation of a flat world. Definitely important points to ponder.

Category :

Presentations : Form & Substance

In the November edition of HBR, senior editor Gardiner Morse has an excellent piece - a brief Forethought on "information graphics". Morse takes on the circle-and-arrow-drawing brigade's influence in business circles, where such diagrams are used to demonstrate process and flow sequences. He writes, "Business communications are lousy with circle-and-arrow diagrams that range from the dumb to the deceptive". Morse urges presenters: the next time you find yourself preparing a circle for a presentation, ask yourself whether the process you are describing really works the way you say it does. And turning his attention to readers and listeners: the next time a presenter touts a circle to make a point, find the bogus link and put him on the spot.

Well, personally speaking, I am not a great fan of jazzy slides and presentations - I get to see hundreds of them - they live in a make-believe world, trying to outdo each other. In this world of digital interruption characterized by continuous partial attention, where user experience management is clearly a quality and not a discipline, we are all conditioned to think that jazzy graphics in slides are better than meat and depth. I have seen innumerable CEOs struggle with the linkage between content and presentation while slides are under preparation, yet still persist with jazz and animation to provide a rich experience to the audience. One recent presentation that I liked is this one from Steve Jobs. I personally find the Lessig method attractive, with emphasis centered on:
- Simplicity and sophistication characterizing the slide content
- Emphasis on readability as against background artistry
- Built-in punchlines to carry the communication, with emphasis on ideas

Category :

Feedworld 2.0 & SSE

Dick Costolo, CEO of FeedBurner, has an excellent post on how feeds will change the way content is distributed, valued, and consumed. Where once all feeds derived from blogs, today there are innumerable feeds unrelated to blogs. Commercial publishers have embraced feeds wholeheartedly; most web services and many search engines now provide subscribable results; and podcasts and videocasts are entirely feed-based while not necessarily tied to blogs. Feeds provide three critical benefits to any digital media:
1. A notification mechanism for updates to a specific channel of content
2. The ability to subscribe to content, creating a persistent link between publisher and subscriber
3. A semi-structured version of the content
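Benefit 3 - the feed as a semi-structured version of the content - is easy to see in code. A few lines of standard-library XML parsing recover title/link/date fields that would take real scraping work to extract from an HTML page; the two-item feed below is made up for illustration.

```python
import xml.etree.ElementTree as ET

# A made-up two-item RSS 2.0 feed.
RSS = """<rss version="2.0"><channel>
  <title>Example Feed</title>
  <item><title>Feeds change distribution</title>
    <link>http://example.com/a</link>
    <pubDate>Wed, 30 Nov 2005 10:00:00 GMT</pubDate></item>
  <item><title>SSE makes feeds bidirectional</title>
    <link>http://example.com/b</link>
    <pubDate>Thu, 01 Dec 2005 10:00:00 GMT</pubDate></item>
</channel></rss>"""

def items(feed_xml):
    """Return each <item> as a {tag: text} dict."""
    root = ET.fromstring(feed_xml)
    return [{child.tag: child.text for child in item}
            for item in root.iter("item")]

for entry in items(RSS):
    print(entry["title"], "->", entry["link"])
```

That regularity is exactly what aggregators, blog search, and (less happily) blog spammers all exploit.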
While some complain about feed overload, some see a revolutionary impact of feeds in general, and some see increasing business usage of feeds, the reality is that we can leverage the benefits of feed structure to let publishers provide a feedback loop to the web site; the feed can become input to the content on the site. Since feeds are now widely searched, shared, and passed around via web-based aggregators or OPML reading lists, it is probably wise today to distribute a more limited collection of broadly subscribed feeds. Items are already pulled from feeds today in a variety of both publisher-friendly and publisher-annoying ways. Blog search, of course, pulls items from disparate feeds in order to deliver specific search-engine results. Since the results themselves may be subscribed to, we quickly see items from one feed hopping into another mixed feed in a way nobody would find contentious. Blog spam, to visit the other end of the spectrum, also sees tools that take advantage of the simple structure of RSS/Atom feeds to let hucksters rip, mix, and ring the cash register by creating blogs seeded with content that will attract high cost-per-click ads to sit alongside it. The blog spammer thus monetizes the item without involving the publisher, and, perhaps more annoying, the original content is made to seem the property of another.
By following the atomic unit of content around as it is ripped, mixed, and republished, the content is afforded the widest variety of distribution paths to reach the largest possible audience, which in turn creates the greatest opportunity for monetization.
Read his full report here. Microsoft has published an excellent perspective on the evolving feed mechanisms. It sees feeds transforming from unidirectional to bidirectional. Simple Sharing Extensions (SSE) is a specification that extends RSS from unidirectional to bidirectional information flows. SSE defines the minimum extensions necessary to enable loosely cooperating applications to use RSS as the basis for item sharing - that is, the bidirectional, asynchronous replication of new and changed items among two or more cross-subscribed feeds. Just as RSS enables the aggregation of information from a variety of data sources, SSE enables the replication of information across a variety of data sources. Data sources that implement SSE will be able to exchange data with any other data source that also implements SSE. From the user's perspective, this means you will be able to share your data (such as calendar appointments, contact lists, and favorites) across all of your devices and with anyone else you choose, regardless of infrastructure or organization. Clearly we are onto a new world of feeds; it is amazing to see the advancements and transformations centered around feed mechanisms in the past two years, and clearly more changes are ahead of us.
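The replication idea behind SSE can be sketched in a few lines. This is an illustrative model only, not the SSE wire format (the real specification defines XML extension elements carried inside the RSS items): each replica holds items keyed by a stable id with a version counter, and a sync pass merges two replicas so that the higher version wins and each side's unique items propagate to the other.

```python
def merge_feeds(a, b):
    """Merge two item stores {item_id: (version, payload)}; higher version wins."""
    merged = dict(a)
    for item_id, (version, payload) in b.items():
        if item_id not in merged or version > merged[item_id][0]:
            merged[item_id] = (version, payload)
    return merged

# Two cross-subscribed calendar replicas that have drifted apart.
calendar_a = {"appt-1": (2, "Dentist, 3pm"), "appt-2": (1, "Board review")}
calendar_b = {"appt-1": (1, "Dentist, 2pm"), "appt-3": (1, "Flight to Seoul")}

synced = merge_feeds(calendar_a, calendar_b)
# appt-1 keeps the newer version (2, "Dentist, 3pm"); each side's
# unique items (appt-2, appt-3) propagate to the other.
```

Running the same merge on both ends, repeatedly and asynchronously, is what turns a one-way feed into a two-way sync.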

Category :

The US & The War For Talent

WSJ writes that industrialized countries other than the US recognize the importance of human capital for economic growth, and they have ratcheted up recruitment of the world's mobile talent. Meanwhile the US, the undisputed leader in attracting global talent, has erected barriers for skilled migrants and watches passively as they stay home or go elsewhere. America has seen the number of legal migrants, who tend to be more educated, fall by nearly a third over the past few years. Now is not the time to scale back foreign recruitment. The explosive growth of higher education in many developing countries, particularly in Asia, has caused a perceptible, if gradual, shift in the global talent pool. China and India are producing more engineers than all industrial countries combined. Larger developing countries have new opportunities to attract jobs for skilled workers and keep them at home. Today's skilled jobs are increasingly service jobs, and, unlike manufacturing jobs, service work is skill-intensive rather than capital-intensive. With rising educational attainment in many developing countries, and the low capital costs of outsourcing service labor, developing countries have an emerging competitive advantage. Foreign talent has helped make the US economy the world's most productive and innovative. Time spent in the US by foreign citizens has also been a crucial means by which American values and institutions have been transferred around the world. Raising barriers to talented foreign students and workers might yield short-term political gains, but the long-term economic consequences will be much less salubrious.
While the concerns are obvious and well thought out, elaborated so tantalisingly by Richard Florida recently, I do not know of any other country in the world that can absorb immigrants at the scale the US traditionally does, and there are not many societies outside the US that are so open-minded, welcoming immigrants and allowing them to seamlessly fuse into the local ethos.

Category :

SaaS, SMB & Enterprise Adoption

Amy Wohl sees that most traditional software vendors have positioned SaaS mainly for the SMB market, primarily for two very good reasons:
(1) For easy penetration.
(2) To avoid cannibalisation.

She argues that SaaS appeals to large enterprises as well, and points out that the ASP market (the precursor to the SaaS market) in its early days did not excite SMBs - they were not taken with the idea of buying software on-line and weren't ready to act on it. Lots of enterprise buyers, by contrast, readily understood the benefit of on-line software, and they were often the first buyers, especially for short-term projects, projects that included non-employees (contractors, suppliers, customers), or remote offices (and individual users) that did not themselves have substantial IT support. As more and more of the SaaS offerings became new, net-native applications architected to exploit the on-line environment (rather than repackaged traditional software), more SMBs found the offerings interesting too. The applications themselves are specified and scaled with enterprise buyers in mind. Some of the enterprise applications we see are very niche-oriented; others are more horizontal, with more customers.

I think that enterprise buyers are a little cautious about making more investments in traditional enterprise software, and a little wary of SaaS as well. The enterprise software industry is plagued by low innovation, a high cost structure, and exorbitant maintenance and rollout fees. Occasionally we see vendors looking at initiatives like time-based pricing - Tibco's recent move here is significant. As I wrote earlier, others are bound to imitate it, particularly in the SME segment. It remains to be seen how the pricing is structured, including the collection mechanism, and how Tibco's traditional partners react. Also worth watching is what happens to defaulters and to enterprises that jump in and out of such schemes. The lock-in and switching costs need to be understood in greater detail; in traditional economics, leasing may be cheaper than rental pay structures. The pressures of competition and pricing make companies try out new models, and these sometimes force companies to think for the customers as well - no more proof is needed to be convinced that massive changes are bound to happen in the enterprise landscape. The maintenance-revenue-led party may not last long. The straightforward calculation that existing customers will continue to pay very high maintenance fees year after year may prove wrong going forward; customers may instead choose to converge onto one homogeneous platform and save a lot more in costs. The real test of one's ability to hang onto customers will come not when maintenance contracts expire but when the major software companies transition to so-called "service oriented architectures," a fundamental change in the way applications are deployed, integrated and accessed. In this industry makeover, we are seeing some thrust towards hosted-model solutions.
In this era of consolidation, the industry needs to get rid of the problem of huge upfront fees and twenty-percent-plus maintenance tariffs; until it does, buyers will not be open to some of the most innovative solutions in the market today. I wrote along similar lines on licensing. Clearly the structure, style and measures of performance of the software industry - particularly enterprise software - are set to change dramatically.
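To make the maintenance-tariff point concrete, here is a back-of-the-envelope comparison with entirely hypothetical numbers: a perpetual license with a 20% annual maintenance tariff versus a flat subscription for the same capability.

```python
def perpetual_cost(license_fee, maint_rate, years):
    """Upfront license plus annual maintenance at maint_rate of the license."""
    return license_fee + license_fee * maint_rate * years

def subscription_cost(annual_fee, years):
    """Flat annual subscription, no upfront fee."""
    return annual_fee * years

LICENSE, MAINT, SUB = 100_000, 0.20, 30_000  # hypothetical figures

for years in (1, 3, 5, 7):
    print(years, perpetual_cost(LICENSE, MAINT, years), subscription_cost(SUB, years))
# Break-even: 100k + 20k*n = 30k*n  =>  n = 10 years. Before that the
# subscription is cheaper; beyond it, the lock-in and switching-cost
# questions raised above start to cut the other way.
```

The exact crossover depends entirely on the made-up numbers, of course; the point is that the twenty-percent tariff compounds quietly, year after year.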

Category :

Tuesday, November 22, 2005

Cisco : IP Vision In This Convergent World

With the Scientific-Atlanta acquisition, Cisco muscles into the set-top-box (STB) market and is gearing up to get into home networking as well. No competitor now has matching reach and skill. Cisco already has great storage networking technology in-house; modulating and re-aligning it may be a challenge, but the reality is that in this changing digital life, consumers will want to view, listen and communicate over varied mediums covering almost all digital gadgetry at home as well. The services can be extended to support many more features. The STB segment is a very important niche, with potential to grow manifold in the coming years, and Cisco has done a great job in identifying the space and is all set to become a significant player here. As News.com captured it so well, for a long time Cisco has been talking about network convergence - the idea that data, voice and video traffic will one day travel over a single network. The vision has already come to fruition within the carrier's network. Most cable operators and phone companies carry their internal traffic over an Internet protocol (IP) network that uses Cisco routing and switching equipment. Now the trend is finally making its way into the home, as cable companies and phone companies start offering customers a triple play of services that includes high-speed Internet access, telephony and, finally, video - all over an IP network. We covered Geoffrey Moore's view of Cisco's future, where he rightly laid out the options for Cisco as:

- integrate the network to transform it into a platform for web services
- enter systems management beginning with security
- annex enterprise storage
- use visionary consulting to influence the emerging architecture
Geoffrey Moore writes that Linksys One is an appliance-like offer that will be resold through telecommunication service providers (SPs), both traditional and non-traditional, as a hosted service. This does not require Cisco to develop deep SP expertise, and it gives it a leveraged entrée into the small business market through a channel that is already a trusted source of services. Moreover, it brings to market a next-generation platform from which one can readily envision launching a broad array of services. Meanwhile, the acquisition of Scientific-Atlanta gives Cisco a second SP play - by resurrecting the CLEC vs. ILEC play, only this time the C in CLEC stands for Cable-enabled. The ILECs are still deeply engaged with the traditional telecom switch providers - Lucent, Nortel, Alcatel, Siemens, Ericsson - and while the company will no doubt continue to knock on those doors, the exercise is a bit like pushing a rope uphill. The cable folks, on the other hand, are absolutely delighted to get help with their own version of the voice/data/video triple play. What Scientific-Atlanta brings to the table is the possibility of an end-to-end platform play with Cisco controlling both ends. As we noted earlier, much of Cisco's growth has come through acquisitions, almost 100 over its history. But this one stands apart, for its size and for the new terrain it leads Cisco into.

Category :

Wikipedia In Top Ten Sites

Steve Rubel points to the first-week-of-November Nielsen ranking of news sites and finds Wikipedia there - a first for any open, citizen-powered site - truly interesting. I was a little curious and went to Alexa to check the comparative traffic of other popular sites, and found that Cnet.com has caught up with About.com in traffic.

Category :

Open Source: Economics & Innovation

Closely following Larry Augustin's views on open source, Marc Fleury writes that the business model of software MUST include R&D. He sees that FOSS development models are economically sustainable, have lower expenses associated with them (specifically in the QA arena), and that for-pay, licensing-based software, while greatly profitable, can be undermined by cheaper models. Marc, who earlier wrote about VC investments in open source, now writes that JBoss sees FOSS as a better way to develop, distribute and support software. Today's software has tons of room to grow in technical maturity, and the market dynamics have tightened since the bubble. The dirty little secret of the enterprise software model in today's maturing marketplace is that, with the notable exception of a few players, the days of the hugely profitable software license are gone. With the traditional software development model, the cost of sales, marketing and distribution is so high that the model depends completely on the for-pay license. An optimally functioning FOSS business needs 20 cents of sales and marketing to acquire 1 dollar of maintenance revenue, where a traditional software company will have to spend around 2 1/2 dollars. Professional FOSS businesses can sustain sales and marketing costs out of the maintenance revenue stream. This model produces earnings (EBITDA) in line with the P&L of stable software business models, those in mature subscription-based phases. The P&L of such a business sustains R&D of 20%, which is where JBoss is today. Thus professional FOSS, in theory and practice, sustains the research and development expenses associated with the classic business model. On innovation as a proof point, JBoss and the FOSS community in Java have been pushing the frontier with EJB3, annotations, lightweight containers, IoC, SEAM and a lot more in the pipeline.
The days of fat profits in licenses may be gone, but software is moving ahead, as vibrant and innovative as ever. Look at where JBoss mostly operates - the enterprise layer - and the answer lies there.
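Fleury's sales-efficiency claim reduces to simple arithmetic. Using only the figures quoted above (20 cents versus roughly 2 1/2 dollars of sales and marketing per dollar of maintenance revenue), a quick sketch:

```python
def sm_cost_per_dollar(revenue, sm_spend):
    """Sales & marketing dollars spent per dollar of revenue acquired."""
    return sm_spend / revenue

foss_ratio = sm_cost_per_dollar(1.00, 0.20)         # professional FOSS: $0.20
traditional_ratio = sm_cost_per_dollar(1.00, 2.50)  # traditional vendor: $2.50

# The traditional vendor spends 12.5x more to acquire the same dollar,
# which is why the FOSS model can fund its 20% R&D line out of the
# maintenance stream alone.
print(traditional_ratio / foss_ratio)  # -> 12.5
```

Whether those input ratios hold across the industry is Fleury's claim, not arithmetic; but if they do, the efficiency gap is the whole argument.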

Category :

Paul Graham On Web 2.0

Paul Graham sees no deliberate plan to suggest there was a new version of the web in the naming of Web 2.0. Some views are better read without any inference thrown in - Paul's writings are always like that. Excerpts: The proponents just wanted to make the point that the web mattered again. It was a kind of semantic deficit spending: they knew new things were coming, and the "2.0" referred to whatever those might turn out to be. The 2005 Web 2.0 conference reminded him of Internet trade shows during the Bubble, full of prowling VCs looking for the next hot startup. He wouldn't quite call it "Bubble 2.0" just because VCs are eager to invest again, pointing out that the Internet is a genuinely big deal. The bust was as much an overreaction as the boom. It's to be expected that once we started to pull out of the bust, there would be a lot of growth in this area, just as there was in the industries that spiked the sharpest before the Depression. The reason this won't turn into a second Bubble is that the IPO market is gone, and venture investors are driven by exit strategies. The reason they were funding all those laughable startups during the late 90s was that they hoped to sell them to gullible retail investors; they hoped to be laughing all the way to the bank. Now that route is closed. Now the default exit strategy is to get bought, and acquirers are less prone to irrational exuberance than IPO investors. The startups are different this time around: they are to the startups of the Bubble what bloggers are to the print media. Graham concludes that since Google is a "Web 2.0" company, the term, while meaningful, is also rather bogus. It's like the word "allopathic": it just means doing things right, and it's a bad sign when you have a special word for that. Also read a related note here.

Category :


Too much travel these days - on a trip to North Asia starting from Seoul. Updates may not be regular for a few days.

Monday, November 21, 2005

The End of Process - No Way

Ross Mayfield asks: if a knowledge worker has the organization's information in a social context at their fingertips, and the organization is sufficiently connected to tap experts and form groups instantly to resolve exceptions, is there still a role for business process? Quoting Clay Shirky - "process is an embedded reaction to prior stupidity" - he adds that there was an exception to process, and an expert designed a way for people to work together in one context that is then supposed to fit all prior contexts. The problem is that the process becomes calcified and accepted as the rule. After all, it's a rule, and in corporations we follow rules even when they fail us or simply don't make sense. Because of constant change in our environment, processes are outdated immediately after they are designed. The 90s business process re-engineering model intended to introduce change, but was driven by experts who simply delivered another set of frozen processes. Claiming that some staid corporations are abandoning process altogether, he believes that at best a process should serve as a reference model - something that others can consult when completing a task, that can be leveraged for innovation, a boundary condition for experimentation at the margin - and concludes that the first organizations bringing process to an end will have a decided competitive advantage.
Jeff Nolan provides the best perspective (I fully agree with his views on this topic). He points out that the automation of processes is not perfect, certainly, but like democracy it is better than the alternative. There are a great many things in the modern enterprise that will benefit from ad-hoc and freeform collaboration, but in no way will these technologies displace the fundamental machinery that companies rely on day in and day out - and that's all process-based. Tom Davenport's view is that the speed with which some businesses have already adopted process standards suggests that many previously unscrutinized areas are ripe for change. For companies that don't have process standards in place, it makes sense to create standards by working with customers, competitors, software providers, businesses that processes may be outsourced to, and objective researchers and standard-setters. Setting standards is likely to lead to the improvement of both internal and outsourced processes. In a Computerworld interview Tom adds: when business processes become commodities, all the rules change in ways that can revolutionize business. For virtually anybody, a standard process can be a starting point - a point of departure from which to design a new process. When people are designing organizational charts, they often look at other organizational charts. It also simplifies the commodity work that people do. There are so many processes in an organization that don't confer any competitive advantage, and doing them in an innovative way wouldn't make much difference to revenues or profits, so you might as well do them in a standard way. And application packages all assume some sort of process by which they're used. Process activity and flow, process performance, process management - these are among the evolving standards that businesses need to relate to their functions and activities to better leverage process advantage.
Barry Briggs makes the sharpest assessment when he calls this the decade of process. He rightly sees that in many ways business process is by far the most important and valuable form of collaboration, since while e-mail, instant messaging, and shared workspaces facilitate communication, business process achieves business goals. When a customer buys something on a web site, a process is set in motion which at its conclusion results in the customer receiving goods and the enterprise, money. Where the eighties were known as the decade of productivity applications - spreadsheets, word processors, and so on - and the nineties as the decade of email and the Internet, this decade, starting now (isn't it interesting that software waves start around the middle of the chronological decade?), will be the Decade of Process.

Category :

Sunday, November 20, 2005

Congrats Jim On Winning The Stevens Award

Due to my intense travel, I missed noticing this important award announcement: Jim Highsmith has won the Stevens Award. Jim is Director of Cutter Consortium's Agile Software Development & Project Management practice and a Fellow of the Cutter Business Technology Council, which prepares the Opinions for Cutter's Business Technology Trends and Impacts service.

Jim is someone whose writings I have been following for more than a decade - he is made of terrific stuff. He fully deserves to win this, considering that the caliber of previous recipients of the award includes the likes of Tom DeMarco.

Category :

Youth, Technology & Connected Cocoons

In my visits to at least twenty different national capitals over the past year, it has been almost impossible not to notice that young people are in a race with each other to embrace technologies faster - not surprising in this iPod-generation, presence era; given that technology can only improve, the trend is irreversible. In-Stat recently reported that the youth market in the Asia-Pacific region is becoming a significant driver of growth in the region's mobile phone market. Around 10-15% of all youth disposable income in developed countries is spent on mobile products, displacing spending on traditional youth products like clothing, toys, and comic books, the high-tech market research firm says. In the Asia-Pacific region, spending by youth on mobile data is estimated to grow at 15.3% annually from 2004 to 2010.
The Guardian reports that today's teenagers use technology to stay in touch with friends at all times - turning their bedrooms into "connected cocoons". Through personal computers, mobile phones and gaming consoles, teenagers are spurning antisocial angst for a culture of "connected cocooning" - a phrase coined by MTV to describe how the current 16-to-24-year-old "MTV generation" is permanently plugged into a network of digital devices, bringing the world to their fingertips in a way no previous generation has ever experienced. Such limitless communication is having a revolutionary impact on the way young people interact, socialise, work and play. MTV's recently released Generations report on the lives of the MTV (ages 16 to 24) and VH1 (ages 25 to 44) generations defines how technology has driven differences between these age groups. Young, early adopters have become used to instant gratification, the report found. Globalisation and consumerism do not deter; instead, brands define and give a sense of belonging. Devices and their uses displace the real and the virtual, creating a world where you can be who you want to be. And joining the digital march isn't just a personal choice; to play a part in youth society, it is imperative to be switched on, charged up and always connected. For the older generation, the pace of change has been quite frightening. The VH1 generation grew up in a period when there were still a few certainties: the family was still together, with a sense of belonging. With most of these certainties gone, one has to mix and match identities to assert who one is. The MTV generation doesn't have fixed values, so they are more open to new technologies. Wi-Fi access for Nintendo gamers is an important related development. This phenomenon is creating pressures for a variety of industries: the toy industry is responding to age compression - kids getting older younger - which is creating a revolution in that business.
Realizing that today's kids are sophisticated and tech-savvy, toymakers are fighting fire with fire by building their own lines of "youth electronics". As we covered earlier, mobility has already created revolutionary changes – governments have fallen, youth subcultures have blossomed from Asia to Scandinavia, new industries have been born and older industries have launched furious counterattacks – and technology and mobility will increasingly act as the most dramatic change agents in society.

Category :,

Saturday, November 19, 2005

Microsoft Intensifies Ajax Efforts

Ajax, a conglomeration of technologies covering everything from presentation and object modelling to data interchange and retrieval, is attracting wide attention. Microsoft thinks Ajax apps are too hard to build, and the company's web platform team is trying to demystify Ajax with an easier-to-use Ajax-style programming technology code-named "Atlas", which it plans to bring to market during the first half of 2006. Look at Adam Bosworth talk about Ajax. A prototype of the technology is available here. Brian Goldfarb, Microsoft's product manager, refuting the claim that Microsoft is unready for Web 2.0, shares some inside perspectives on Ajax. He sees two important things that have happened recently to revitalise interest in Ajax.
- First, a wider number of browsers now support the technologies developers need for Ajax-style development, and
- Second, there has been a new focus and interest in delivering better user experiences for customers.
For the Web there are two challenges to this.
- The first is the limitation of application development within the browser,
- The second is development complexity.
Microsoft hopes that Atlas will make Ajax-style development easier and more approachable for a broader range of developers, and believes that Atlas will be as good as it gets on the browser, enabling the broad mass of developers to easily take their web applications to the next level. From a benefits perspective, Ajax enables better user experiences on the web, which can help businesses gain a competitive advantage: richer experiences provide the emotional connection users feel when they use an application, leading to brand loyalty and more. From a technical perspective, compared with traditional web apps, the primary advantages of Ajax-style web applications include more interactive user interfaces (by executing code on the client), automatic updates without requiring the user to refresh the page, and better performance from fewer round trips to the server, among many others. Ajax-style development lets developers optimise the user experience by gaining more control over what happens on the client versus the server. Brian makes the right observation that there is a larger focus on generally better experiences – Ajax is just a component of the story. It is about differentiating offerings, standing out from the crowd, and providing awesome experiences – and that need will drive the technology.
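To make the round-trip point concrete, here is a minimal sketch of the hand-rolled pattern Atlas aims to abstract away. The helper names (`createRequest`, `updateRegion`) and the `/latest-headlines` URL are illustrative, not part of Atlas or any Microsoft API: a request object is created, an asynchronous callback swaps out a single element's content, and the page as a whole never reloads.

```javascript
// Create a browser request object. Firefox, Safari and Opera expose
// XMLHttpRequest natively; IE6 of this era required the ActiveX equivalent.
function createRequest() {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    return new ActiveXObject("Microsoft.XMLHTTP");
  }
  return null; // no Ajax support in this environment
}

// Fetch an HTML fragment asynchronously and replace one page element,
// leaving the rest of the page (and its state) untouched.
function updateRegion(url, elementId) {
  var xhr = createRequest();
  if (!xhr) return;
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // Only the targeted element changes - no full-page refresh.
      document.getElementById(elementId).innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", url, true); // true = asynchronous
  xhr.send(null);
}

// e.g. called from a timer or a button's onclick handler:
// updateRegion("/latest-headlines", "newsPanel");
```

Every piece of this – browser detection, readyState bookkeeping, wiring callbacks to DOM updates – is boilerplate a developer must get right by hand, which is exactly the complexity a framework like Atlas is meant to hide.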

Category :,,

Google-Mart : Wal-Mart Style Domination In Web 2.0 World

Bob Cringely thinks that Google may not be interested in operating systems but could be playing to its strengths and could effectively take over the internet – all by seducing, he says. Dismissing Google's interest in becoming a super-ISP, Bob writes that its interest in fibre could be for building "Google-Marts" – shipping containers that could each house thousands of Opteron processors and many petabytes of data. The idea is to plant one of these puppies anywhere Google owns access to fibre, basically turning the entire internet into a giant processing and storage grid; they might be placed at around 300 locations that serve as the internet's peering points globally. They get Google closer to users, reducing latency. They offer inter-datacentre communication and load-balancing using that no-longer-dark fibre Google owns. But most especially, they offer super-high-bandwidth connections to all peering ISPs at little or no incremental cost to Google.
The upside is huge: internet TV will scale to the same level as broadcast and cable TV; the coming AJAX Office and other productivity apps will sit locally; backups are good; data never goes away unless deleted. This is more than another Akamai, or even an Akamai on steroids. This is a dynamically driven, intelligent, thermonuclear Akamai with a dedicated back-channel and application-specific hardware. Google has the reach and the resources to make this work. There will be the internet, and then there will be the Google internet, superimposed on top. Cringely thinks this is similar to the Wal-Mart strategy – just as Wal-Mart doesn't try to own the roads its goods are carried over, Google won't trample over ISPs. And the final result is that Web 2.0 IS Google.

My Take: It is very difficult to say whether Cringely is right – he may not be totally wrong. But I think it is time Google outlined its vision. As in any other industry, the responsibility of pioneers and leaders is a lot more than just looking after themselves – they are trendsetters and role models for the industry.
But Cringely's ideas warrant serious examination. I do not necessarily see this as a Google vs. Microsoft/Yahoo issue alone. If you extend things further, local processing intensity may improve, opening up a new wave of distributed computing; the network and infrastructure ecosystem would be forced to upgrade and transform massively, triggering and cascading huge opportunities for infrastructure and service players. Once the public infrastructure gets upgraded like this, several other industries, including the enterprise software sector, may look at leveraging the new infrastructure in many more new ways. This would lead to A NEW INTERNET ECOSYSTEM per se – if Cringely is right, we are in for A HUGE CHANGE AHEAD, WITH HUMONGOUS OPPORTUNITIES FOR ALL PLAYERS OPERATING IN THE ECOSYSTEM.

Category :,

Apple & iBook Laptops

We recently wrote that Apple's move to Intel marks a tectonic shift for Apple, which has used processors from IBM and Motorola throughout the life of the Mac. The company has changed architectures before, however, shifting in the 1990s from Motorola's 68000 family of chips to the PowerPC architecture jointly developed by IBM and Motorola. Think Secret speculates that Apple is planning to release its first entry-level iBook laptops with Intel processors next January. While not sure what processors or price points the new models will debut at, it thinks Apple will expand the iBook line with one additional model and will lower prices – to entice current Windows users and prove to the market that it can be more competitive with the likes of Dell, Gateway, HP and Sony. The initial release of products with the new processor is more likely to consist of consumer products only, and not professional, high-end lines such as PowerBooks and towers, as some websites have reported. Apple will almost certainly tap Intel's forthcoming Yonah processor, a successor to the company's Pentium M, for the iBooks. The dual-core Yonah chip could very likely deliver performance greater than Apple's current G4-based PowerBooks. Rolling out a consumer system like the iBook as the first Intel Mac also makes sense from a software development standpoint. Analyst Stephen Baker at NPD Intelect believes Apple needs a nicely configured $699 notebook to be competitive at the entry level: "They can play up the value of the bundled software and Mac OS X and really have a strong case for consumers to buy their product," he said, noting the intensifying competition in the consumer notebook market. While launching Tiger recently, I wrote that this move clearly adds to the Mac's general superiority over typical Windows computers as the best choice for average consumers doing the most common computing tasks. There's no doubt that Apple's move to Intel could usher in a new era of success.
While on this topic, do not forget to read about the misconceptions regarding the so-called Apple-Intel deal.

Category :, ,

Friday, November 18, 2005

Email : From Productivity Wonder To Maddening Time Waster

While covering the future of Microsoft, I wrote that wikis, collaboration tools and the blogosphere are all certainly getting good traction at all levels – across enterprise and consumer segments alike – with the impending power of broadband and hosted models creating a tapestry very different from what exists today. We also covered the definitive need for proper email etiquette to minimise email stress within organisations, and the perspective that email is destroying the mind faster than marijuana. BusinessWeek writes that email is stretching and morphing from a point-to-point medium into a broadcast medium. The article points out that Gartner predicts wikis will become mainstream collaboration tools in at least 50% of companies by 2009. Clay Shirky predicts that while email will remain the prime tool for notification and one-to-one communication, "a huge percentage of collaboration will occur outside of e-mail, with a continued rise in these other tools." JP Rangaswami of Dresdner feels there is enormous untapped value in getting collaboration right, and adds that while email is not on its way to floppy-disk-dom, it has certainly come under threat before: the Lotus Notes juggernaut of the early 1990s never displaced email, nor did attempts to build collaborative platforms during the boom. But email has hit a wall, creating an impenetrable scale of conversations people don't need to be a part of and shipping around mounds of information they can't possibly digest. In the long run, perhaps the biggest death knell for email is the anthropological shift occurring among tomorrow's captains of industry, the text-messaging Netgens (16-to-24-year-olds), for whom email is so "ovr," "dn," "w/e" (over, done, whatever).
BusinessWeek finds that some organisations are ditching email in favor of other software tools that function as real-time virtual workspaces. Among them: private workplace wikis (searchable, archivable sites that allow a dedicated group of people to comment on and edit one another's work in real time); blogs (chronicles of thoughts and interests); instant messaging (which lets users see who is online and chat with them immediately, rather than send an email and wait for a response); RSS; and more elaborate forms of groupware such as SharePoint, which allows workers to create websites for teams to use on projects. Despite the brawniest corporate filters, more than 60% of what swarms into corporate in-boxes is spam. Since so much of what is received involves scams about millions languishing in nonexistent bank accounts, interoffice status contests, and people plopping unwanted meetings onto Outlook calendars, the email blow-off factor is rising.

Category :,
Sadagopan's Weblog on Emerging Technologies, Trends,Thoughts, Ideas & Cyberworld
"All views expressed are my personal views are not related in any way to my employer"