Wednesday, March 11, 2009

If it ain't broke but obsolete, should we upgrade it?

My MBA class and I discussed the case of a successful retailer whose business model was heavily dependent on its IT infrastructure, and whose infrastructure was DOS-based. The class initially seemed unanimously in favor of upgrading the system to a more modern platform. However, a parallel analysis led us to conclude that the existing system supported the retailer's business model and information needs very well. There were no obvious immediate deficiencies, their IT was well aligned with their business strategy, their processes were carefully engineered, and their employees were trained and empowered. Besides, the key challenges to scaling the business were not IT-related; they had to do with operations and human resources.

The discussion ended unresolved. A good summary of what was left open might be: when a system is technologically "very old" but functions in perfect alignment with one's business model, does it make sense to rip it out and replace it in order to stay technologically current, to reduce vendor dependence, or to create future (but non-obvious) application development options? Or should IT investment strategy always be driven by foreseeable business needs, alignment with current business strategy, and the philosophy of "if it ain't broke and does what you need well, don't fix it"?

Monday, February 16, 2009

The Micropayments Mystery

It appears that my favorite print newspaper is on the verge of extinction, and its survival depends on resolving a long-standing e-commerce mystery. Time Magazine recently conjectured that future high-quality journalism will be viable only if we establish a micropayment system which allows newspapers to charge consumers per article, and if we alter consumer psychology to get them used to paying for online content.

This line of argument raises two interesting issues. First, consumers are willing to pay for "bundles" of content as long as their collective quality is vetted by an intermediary. We still buy newspapers, subscribe to magazines, and pay for cable television.  Maybe this is because the cognitive costs of assessing the quality of millions of slices of content are too high, and we'd rather let someone trusted do a first screening. 

More interestingly, we've been promised a micropayment system for over a decade now, but nobody seems to have succeeded in creating one with widespread adoption. Instead, we have trusted intermediaries like iTunes, which use a standard payment system (your credit card) to "microcharge" you for 99-cent songs and $1.99 iPhone apps. This seems to be working out rather well for Apple, which generated a couple of billion dollars' worth of transactions in 2008.
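
To make the aggregation idea concrete, here is a minimal Python sketch of how an intermediary might "microcharge" on top of a standard card network: tiny purchases accumulate on an account and are settled with a single card charge once a threshold is reached, so fixed per-transaction card fees are paid once per batch rather than once per item. The fee figures, threshold, and class names are illustrative assumptions, not a description of how Apple or any real payment network actually operates.

```python
# Hypothetical sketch of microcharge aggregation on top of a standard card network.
from dataclasses import dataclass, field

CARD_FIXED_FEE = 0.30        # assumed fixed fee per card transaction
CARD_PERCENT_FEE = 0.029     # assumed percentage fee per card transaction
SETTLE_THRESHOLD = 10.00     # accumulate micro-purchases until the balance hits this

@dataclass
class Account:
    pending: list = field(default_factory=list)

    def microcharge(self, item: str, price: float) -> None:
        """Record a tiny purchase (e.g. a $0.99 song) without touching the card."""
        self.pending.append((item, price))
        if sum(p for _, p in self.pending) >= SETTLE_THRESHOLD:
            self.settle()

    def settle(self) -> None:
        """Charge the card once for the whole batch of pending purchases."""
        total = sum(p for _, p in self.pending)
        fee = CARD_FIXED_FEE + CARD_PERCENT_FEE * total
        print(f"Charging card ${total:.2f} for {len(self.pending)} items "
              f"(one card fee of ${fee:.2f}, versus "
              f"${len(self.pending) * CARD_FIXED_FEE:.2f} in fixed fees alone "
              f"if each item were charged separately)")
        self.pending.clear()

acct = Account()
for _ in range(11):          # eleven 99-cent purchases trigger one settlement
    acct.microcharge("song", 0.99)
```

The point of the sketch is simply that the intermediary, not the card network, absorbs the bookkeeping of many tiny transactions, which is one plausible reason aggregation has worked where stand-alone micropayment systems have not.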

Perhaps this is the future of micropayments for content -- combining trust in payments with trust in taste or filtering. If we have enough faith in the intermediary, we'll let them narrow our content for us, and trust them to charge us right. It's also loosely consistent with why eBay owns PayPal. At first glance, this seems less appealing and efficient than a stand-alone micropayment system combined with technology-based filtering and an open market for content creators. But if you factor in cognitive costs and a growing body of evidence that consumers do pay more for online reputation, placing your "taste" and transaction risk in the hands of the same trusted intermediary might not be all that inefficient.

That still doesn't solve the micropayment mystery. Why hasn't a good micropayment system emerged? The technology exists, there's lots of money to be made from being the standard, and there's certainly plenty of microcommerce languishing in its absence. There's got to be something about what consumers value in a payment system that is missing from all the efforts we've seen so far...

Thursday, February 12, 2009

Genes, Science and Technology

As human beings, we are wired by our genes to reject actions that don't sustain our survival, and to adopt actions that enhance it. However, a substantial fraction of what governs our choices today isn't determined by our genes; rather, it is (for lack of better labels) constructed by social, economic and technological forces.

This disconnect between our genetic instincts and our social, economic and technological environment seems to have changed how we evolve. Successful scientists frequently and naturally reject important evidence because they are wedded to a paradigm, producing science that is considered "normal" and adds to a prior tradition. Kuhn documented this in The Structure of Scientific Revolutions, the account of human discovery whose thesis maps best to what I've seen. In the language of genetics, we naturally avoid mutation, choosing selection and recombination instead. We choose to enhance the present rather than thinking about the future.

Similarly, the "gene" of a firm is very different from the gene of a person. And this is what makes information technologies so disruptive. Successful firms are those that are wired to select and recombine, to gather and react to what their environment (customers) tell them is important. So when the development that requires radical mutation comes along, it is the most successful firms that don't react. After all, ignoring the fringes is what makes them succeed, and most new technologies develop on these fringes. This is a lot like the scientists who contribute to a paradigm while ignoring the evidence that contradicts their theories. 

The firms that recognize that their success imposes these blinkers will therefore make the extra effort to "mutate" and experiment aggressively, even if doing so doesn't add to their short-term profits. Google, Yahoo, Microsoft (yes) and IBM are great current examples, although in different ways. I'm not sure they are doing enough; still, it is good to see some convergence between our genes and our socially constructed instincts.