In a recently published paper, Andrew Odlyzko, a professor at the University of Minnesota, divines lessons from the history of transportation to explain the telecoms industry’s attraction to price discrimination, and what it may mean in future. Of course, in general, telecoms companies already exploit variations in what customers are willing to pay for digital bits, depending on whether they take the form of a cable television programme or an SMS text message. On the internet, however, charging according to content would mark a big change.
On the net, discrimination might mean one price for web and e-mail traffic, another for instant messaging and still others for telephone calls, music and films. Is it likely? Mr Odlyzko hopes not, although history strongly suggests that the temptation exists. He thinks that price discrimination might not be in telecoms companies’ interests after all. Unlike on canals, toll roads and so forth, internet capacity is abundant. Internet service is therefore a commodity. Simpler, flat-rate pricing, he argues, is likely to increase usage: discrimination would turn some users away.
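Odlyzko's trade-off can be made concrete with a toy model (every number below is a hypothetical illustration, not from the paper): discrimination extracts more revenue per user, but if the complexity drives enough users away, simple flat-rate pricing wins.

```python
# Toy revenue model comparing flat-rate pricing with per-traffic-type
# price discrimination. All segments and numbers are hypothetical.
users = {
    "web_email": {"count": 1000, "willing_to_pay": 20},
    "voip":      {"count": 300,  "willing_to_pay": 35},
    "video":     {"count": 200,  "willing_to_pay": 50},
}

def flat_rate_revenue(price):
    # Every segment whose willingness to pay meets the flat price subscribes.
    return sum(seg["count"] * price
               for seg in users.values()
               if seg["willing_to_pay"] >= price)

def discriminated_revenue(defection):
    # Charge each segment its full willingness to pay, but assume a
    # fraction of users walk away from the complexity (Odlyzko's worry).
    full = sum(seg["count"] * seg["willing_to_pay"] for seg in users.values())
    return full * (1 - defection)

print(flat_rate_revenue(20))        # a simple flat rate keeps everyone
print(discriminated_revenue(0.4))   # discrimination with 40% defection
```

With these made-up numbers, a 40% defection rate is enough for the flat rate to out-earn discrimination, which is the shape of the argument, not a prediction.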
Indeed, he says, distinguishing between different types of traffic would mean so much technical rejigging that the openness of the internet would be destroyed. Because the internet is decentralised and simply priced, it is cheap for many other networks (run by big companies, universities and telecoms firms) to connect to it. This in turn gives the internet a great capacity for innovation. Price discrimination could jeopardise all this. While content delivery does lend itself to a closed network, connectivity does not. Open networks are likely to win because they can attract more revenues from users, Mr Odlyzko says. Is this wishful thinking? History, as he shows, is full of examples of successful price discrimination. The telecoms companies may yet think it worth a try.
Malcolm Gladwell of “The Tipping Point” fame is one of those people who has to be read whenever he writes something. His new book “Blink” is due out shortly. Meanwhile, here’s his latest: “Mustard now comes in dozens of varieties. Why has ketchup stayed the same?”
There are five known fundamental tastes in the human palate: salty, sweet, sour, bitter, and umami. Umami is the proteiny, full-bodied taste of chicken soup, or cured meat, or fish stock, or aged cheese, or mother's milk, or soy sauce, or mushrooms, or seaweed, or cooked tomato. “Umami adds body,” Gary Beauchamp, who heads the Monell Chemical Senses Center, in Philadelphia, says. “If you add it to a soup, it makes the soup seem like it's thicker; it gives it sensory heft. It turns a soup from salt water into a food.” When Heinz moved to ripe tomatoes and increased the percentage of tomato solids, he made ketchup, first and foremost, a potent source of umami. Then he dramatically increased the concentration of vinegar, so that his ketchup had twice the acidity of most other ketchups; now ketchup was sour, another of the fundamental tastes. The post-benzoate ketchups also doubled the concentration of sugar, so now ketchup was also sweet, and all along ketchup had been salty and bitter. These are not trivial issues. Give a baby soup, and then soup with MSG (an amino-acid salt that is pure umami), and the baby will go back for the MSG soup every time, the same way a baby will always prefer water with sugar to water alone. Salt and sugar and umami are primal signals about the food we are eating: about how dense it is in calories, for example, or, in the case of umami, about the presence of proteins and amino acids. What Heinz had done was come up with a condiment that pushed all five of these primal buttons. The taste of Heinz's ketchup began at the tip of the tongue, where our receptors for sweet and salty first appear, moved along the sides, where sour notes seem the strongest, then hit the back of the tongue, for umami and bitter, in one long crescendo. How many things in the supermarket run the sensory spectrum like this?
This is quite hilarious!
Technology Review writes:
Gas turbines powered much of 20th-century technology, from commercial and military aircraft to the large gas-fired plants that helped supply U.S. electricity. But these days it isn't the hulking machines in the lab's museum that capture [Alan] Epstein's enthusiasm. Instead it's a jet engine shrunk to about the size of a coat button that sits on the corner of his desk. It's a Lilliputian version of the multiton jet engines that changed air travel, and, he believes, it could be key to powering 21st-century technology.
Though the turbine's blades span an area smaller than a dime, they spin at more than a million revolutions per minute and are designed to produce enough electricity to power handheld electronics. In the foreseeable future, Epstein expects, his tiny turbines will serve as a battery replacement, first for soldiers and then for consumers. But he has an even more ambitious vision: that small clusters of the engines could serve as home generating plants, freeing consumers from the power grid, with its occasional black- and brownouts. The technology could be especially useful in poor countries and remote areas that lack extensive and reliable grids for distributing electricity. A comparison to how the continuous shrinkage of the integrated circuit drove the microelectronic revolution is tempting. Just as PCs pushed the computing infrastructure out to users, microengines could push the energy infrastructure of society out to users, says Epstein.
Epstein's immediate goal, however, is to use these miniature engines as a cheap and efficient alternative to batteries for cell phones, digital cameras, PDAs, laptop computers, and other portable electronic devices. The motivation is simple: batteries are heavy and expensive and require frequent recharging. And they don't produce much electricity, for all their size and weight.
Newsweek has an interview with Evan Schwartz, the author of “Juice: The Creative Fuel That Drives World-Class Inventors.” Says Schwartz about inventors: “One quality that stands out: it’s the ability to find new problems that no one else even sees. The conventional view of inventors is, they’re good at solving problems. It’s really finding problems. Max Levchin was an expert in cryptography. Everyone thought there needed to be a secure way of paying people on the Internet. There were dozens of start-ups that failed because their solutions were so complicated. They required you to download a cryptography application onto your desktop and run the program every time you wanted to pay for something. People didn’t want to do that. So he came up with PayPal, where all the security features reside on a server, and the customer just sends an e-mail. He saw this problem in a completely different way. That’s what inventors do. They ask: how can I make this better?”
Dave Pollard writes: “[I] would hazard a guess that, both in business and in our personal lives, ‘finding people’ is our most inefficient process, the one we waste the most time doing ineffectively, and the one we do the worst job at. This is another instance of ‘the cost of not knowing’ — Who is the best supplier to repair your furnace, or advise you on managing your business, who are the best people to go into business with, what is the best community of people to live in and with, and, of course, who is Mr. or Ms. Right to spend the rest of your life, or at least the rest of the week, in the romantic company of.”
Dave has some interesting ideas on how to make the process more effective.
Contrary to popular opinion, there is not a law that says you must answer every e-mail as it is received. In fact, this is a sure-fire way to kill your productivity and end up becoming a slave to e-mail rather than using it as a tool to accomplish your work on your terms. One simple way to do this is to schedule specific times of day to work on e-mail.
The most unproductive thing you can do when it comes to e-mail is to read the same messages over and over again. This has the effect of doubling, tripling, or even quadrupling your workload. Instead, you should read each message once, then decide what to do with it. Read-decide. Read-decide. This is the pattern of effective e-mail processing. The goal is to end up with an empty inbox daily or, at the very least, every couple of days.
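The read-decide pattern is effectively a one-pass dispatch over the inbox: each message is read once and immediately routed to an action. A minimal sketch of the idea (the categories and rules below are invented for illustration, not from the article):

```python
# One-pass "read-decide" triage: each message is handled exactly once,
# never re-read. Actions and thresholds here are illustrative only.
def triage(message):
    if message.get("spam"):
        return "delete"
    if message.get("needs_reply") and message.get("minutes", 0) <= 2:
        return "reply_now"           # quick replies: do them immediately
    if message.get("needs_reply"):
        return "defer_to_task_list"  # longer work: schedule it explicitly
    return "archive"                 # read once, file, move on

inbox = [
    {"subject": "50% off!!!", "spam": True},
    {"subject": "Quick question", "needs_reply": True, "minutes": 1},
    {"subject": "Project proposal", "needs_reply": True, "minutes": 30},
    {"subject": "FYI: meeting notes"},
]

# Every message gets exactly one decision; the inbox ends empty.
for msg in inbox:
    print(msg["subject"], "->", triage(msg))
```

The point of the sketch is the shape of the loop: no message is ever left in the inbox to be re-read later, which is exactly the doubling-and-tripling of effort the pattern avoids.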
The Economist writes that “the next thing in technology is not just big but truly huge: the conquest of complexity.”
Steven Milunovich, an analyst at Merrill Lynch, another bank, offers a further reason why simplicity is only now becoming a big issue. He argues that the IT industry progresses in 15-year waves. In the first wave, during the 1970s and early 1980s, companies installed big mainframe computers; in the second wave, they put in PCs that were hooked up to server computers in the basement; and in the third wave, which is breaking now, they are beginning to connect every gadget that employees might use, from hand-held computers to mobile phones, to the internet.
The mainframe era, says Mr Milunovich, was dominated by proprietary technology (above all, IBM’s), used mostly to automate the back offices of companies, so the number of people actually working with it was small. In the PC era, de facto standards (ie, Microsoft’s) ruled, and technology was used for word processors and spreadsheets to make companies’ front offices more productive, so the number of people using technology multiplied tenfold. And in the internet era, Mr Milunovich says, de jure standards (those agreed on by industry consortia) are taking over, and every single employee will be expected to use technology, resulting in another tenfold increase in numbers.
Moreover, the boundaries between office, car and home will become increasingly blurred and will eventually disappear altogether. In rich countries, virtually the entire population will be expected to be permanently connected to the internet, both as employees and as consumers. This will at last make IT pervasive and ubiquitous, like electricity or telephones before it, so the emphasis will shift towards making gadgets and networks simple to use.
UBS’s Mr [Pip] Coburn adds a demographic observation. Today, he says, some 70% of the world’s population are analogues, who are terrified by technology, and for whom the pain of technology is not just the time it takes to figure out new gadgets but the pain of feeling stupid at each moment along the way. Another 15% are digital immigrants, typically thirty-somethings who adopted technology as young adults; and the other 15% are digital natives, teenagers and young adults who have never known and cannot imagine life without IM (instant messaging, in case you are an analogue). But a decade from now, Mr Coburn says, virtually the entire population will be digital natives or immigrants, as the ageing analogues convert to avoid social isolation. Once again, the needs of these converts point to a hugely increased demand for simplicity.
The question is whether this sort of technology can ever become simple, and if so, how. This survey will analyse the causes of technological complexity both for firms and for consumers, evaluate the main efforts toward simplification by IT and telecom vendors today, and consider what the growing demands for simplicity mean for these industries.
ACM Queue has a special issue devoted to RFIDs. “A technology called RFID (radio frequency identification), which is relatively new to the mass market, has exactly this characteristic and for many people seems a lot like magic. RFID is an electronic tagging technology that allows an object, place, or person to be automatically identified at a distance without a direct line-of-sight, using an electromagnetic challenge/response exchange. Typical applications include labeling products for rapid checkout at a point-of-sale terminal, inventory tracking, animal tagging, timing marathon runners, secure automobile keys, and access control for secure facilities.”
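The challenge/response exchange the article mentions can be sketched as follows. This is a generic HMAC-based illustration of the idea, not the protocol of any specific RFID standard (real tag standards are far more constrained):

```python
import hashlib
import hmac
import os

# Simplified RFID-style challenge/response. The tag proves it knows a
# shared secret without ever transmitting the secret itself.
SHARED_KEY = os.urandom(16)   # provisioned into both tag and reader

def tag_respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    # The tag keys an HMAC over the reader's challenge.
    return hmac.new(key, challenge, hashlib.sha256).digest()

def reader_verify(challenge: bytes, response: bytes,
                  key: bytes = SHARED_KEY) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(expected, response)

# The reader issues a fresh random challenge each time, so a recorded
# response cannot simply be replayed later.
challenge = os.urandom(8)
response = tag_respond(challenge)
print(reader_verify(challenge, response))  # True for a genuine tag
```

A counterfeit tag that does not hold the key cannot compute a valid response, which is what lets an object be identified at a distance with some assurance of authenticity.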
In the marketing research context, blogs are a disruptive technology. Instead of having to generate data by way of surveys or focus groups with whatever artifacts the process introduces, blogs provide direct visibility into customers. Instead of having to connect potentially artificial samples back to the actual market, now you have to filter real market behavior, interpret it, and make sense of it. That presents two challenges to market research functions. First, market research staff have to develop new skills. For that, they would do well to pay attention to Dina. Second, management of market research needs to spend some quality thinking time about what to do with access to this new kind of market data.
The opportunity that blogs introduce into the marketing research equation is the chance to identify and run multiple micro-experiments in the market. Those that succeed get the resources to scale; those that fail generate some useful data and are quickly shut down. There are challenges, of course, especially given how quickly ideas spread in a connected world, but these should be offset by the speed with which experiments can be identified and run. Worth thinking about.
The Feature has an article by Mark Frauenfelder, which states that “two mobile phone components — antennas and frequency oscillators — have stubbornly refused to join the rest of the circuitry that has moved onto wireless transceiver chips. But research at the University of Michigan could lead the way to a single-chip solution.”
The next step, says Michael Flynn, who heads the wireless research group at UM, is to put both the slot antenna and the RF MEMS oscillator onto a single wireless transceiver chip. Besides making traditional mobile phones more useful, a single-chip transceiver could foster the development of smaller smart dust motes and tinier non-traditional mobile devices as well. Obviously, the idea of a phone small enough to fit inside your ear canal is repugnant to most people, but a single-chip transceiver will certainly lead to innovations no one has thought of yet.
News.com writes about the Broadcast.com founder:
Cuban’s ideas, like others that have materialized on his Web log, center on the emerging industry for personal video recorders (PVRs), such as TiVo, and video on demand (VOD). VOD is not as widely available as PVRs are, but the idea has shown some recent signs of life with a movies-on-demand deal between TiVo and Netflix, and with Akimbo's VOD service, which offers mostly obscure programming.
Cuban’s first idea is a software program that takes advantage of the time TiVo subscribers spend fast-forwarding through commercials. Where subscribers now sit glued to the blur of fast-forwarding frames, Cuban suggests displaying a static advertisement.
A successful implementation of technology along those lines would come as a balm to broadcast advertisers frightened by the prospect that PVRs are eroding their audience. Already, TiVo has proposed interactive advertising features that would help compensate for the fast-forwarding phenomenon.
Cuban’s second idea is for software that would let people use the Internet and VOD services to piece together their own TV schedules and share them with friends.
The third idea, following the second, would let people emulate existing cable TV programming.
“Here is the one app that I think could really mess things up,” Cuban wrote. “It could really toss a wrench into things…Where the real trouble starts is as more TV shows and movies are available via VOD and the Net, then the programs will also be able to do a cost comparison. Is it cheaper to buy programs on your own and emulate your favorite network or buy the network?”
In his talk at Wharton, Brian Roberts made it clear that he believes his company’s future hinges on video-on-demand. That, he argues, is why the Disney bid made sense. On-demand allows customers to choose not only what they want to watch but also when. It also lets them control their viewing via functions such as pause and replay. Satellite TV, cable’s biggest competitor, offers abundant programming but not two-way communication.
For video-on-demand to beat satellite, it has to provide lots of programming, whether it’s Disney cartoons or, as Comcast has begun offering lately, the NFL Network. The NFL channel provides 10- to 15-minute replays of the highlights of all the prior weekend’s football games. Comcast also recently announced a partnership with Sony and MGM to offer their libraries of movies and TV shows via on-demand video.
Comcast has turned to trying to deliver the best services possible through that infrastructure. “We think that with this new platform, we have to reinvent television,” Roberts noted. “Television today is a one-way experience. It seems totally clear to me that the personalization of television is the future. Everybody wants to do what they want, when they want. And we happen to have a platform for that, where our competitor, satellite, doesn’t. So all of our energy is to give our customers, on demand, the ability to get as much content as possible.”
“With on-demand, we have servers with virtually unlimited content. We don’t care what you want to do, we just want you to do it a lot. And you can’t do any of that on satellite or broadcast.”
There is no doubting the need for computers in emerging markets. A digital infrastructure can help these nations better address the pain points that plague personal life and business interactions. For example, if India needs to ensure education for the 200 million youth of the country, computers can complement teachers to help students learn better. Computers can make businesses into real-time enterprises and thus make supply chains more efficient. Computing can help governments interact better with citizens. Even entertainment can be transformed with the availability of Massputers.
I don't think Massputers will be a reality in the quantities that are needed with either AMD's $249 device, Steve Ballmer's $100 PC talk, or Microsoft's cheaper (and limited) versions of Windows XP. To me, these solutions don't go far enough. All they do is protect the legacy businesses of these companies even as they experiment with ideas for the emerging markets. What they should really be doing is leveraging the lack of legacy in the emerging markets by reinventing the complete computing ecosystem and thinking not of computing but of CommPuting. Here is what emerging markets need to build their digital infrastructure:
Network Computers: Yes, it is the old Larry Ellison idea. Where the Oracle chief went wrong was that he focused on replacing desktops in the developed markets and not on new users in the emerging markets. He also ignored the fact that rich client applications exist in large numbers and these will not all work in a browser. The Network Computers that I am talking about [see my recent Tech Talk series] are multimedia-capable thin clients which display virtual desktops from a centralised computing platform. It is possible to make these computers for about $50-60. Add in a refurbished monitor and you get Steve Ballmer's $100 PC (or my Rs 5,000 PC). This can potentially run every application already written without modification and support client-side multimedia to do voice and video. Besides, it needs zero maintenance and so can be bundled by telcos and ISPs as part of a CommPuting bundle.
The Grid as Platform: Network computers need a centralised computing platform; this is where the Grid comes in. The Grid is a collection of commodity hardware running Windows or Linux in Terminal Services mode. Think of it as Google, but also offering complete computing and storage (and not just search and mail). It is possible to start thinking of the Grid for massively large public computing because the broadband networks exist: the world is awash in fibre, and wireless technologies are improving rapidly in terms of available bandwidth. The problem in most emerging markets is that the content is not available at the other end of the pipes. That doesn't need to change; what is needed at the other end is the computing and storage part of the computer that we have on our desktop. The Grid also becomes the platform for software vendors and content providers to make available their offerings.
Utility Pricing: What is needed is not computers on installments (people are smart enough to look at the total cost of ownership), but subscriptions which can be started and stopped on demand. In addition, the mix of small payments via pre-paid cards is what has dramatically boosted mobile phone usage in India and online gaming in China. A similar model needs to be thought of for computing. This is the equivalent of computing in sachets. To a certain extent, as Ballmer pointed out, it is already happening via Internet cafes.
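The "computing in sachets" model amounts to a prepaid balance topped up with small cards and decremented per session, much like mobile prepaid. A minimal sketch (all denominations and rates below are hypothetical, not from the post):

```python
# Prepaid "sachet" computing: top up with small cards, pay per hour used.
# Rate and card values are assumptions for illustration.
class PrepaidAccount:
    RATE_PER_HOUR = 10  # Rs per hour of grid/desktop time (assumed)

    def __init__(self):
        self.balance = 0  # Rs

    def top_up(self, card_value):
        # e.g. Rs 50 or Rs 100 scratch cards, as with mobile prepaid
        self.balance += card_value

    def use(self, hours):
        cost = hours * self.RATE_PER_HOUR
        if cost > self.balance:
            raise ValueError("insufficient balance; buy another sachet")
        self.balance -= cost
        return self.balance

acct = PrepaidAccount()
acct.top_up(50)
print(acct.use(3))   # 3 hours at Rs 10/hour leaves Rs 20
```

The key property is that the subscription starts and stops on demand: there is no monthly instalment to default on, only small payments tied directly to usage.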
Tech 7-11s: The utility pricing argument also offers a corollary: think access, not users (as Sam Pitroda did when he launched the public call offices, the STD/ISD booths). Just like the 7-11 grocery stores that dot neighbourhoods across many cities, the need is for neighbourhood community computing centres. These need to go beyond the Internet cafes of today, which only offer a computing device and Internet access. The Tech 7-11s would serve multiple purposes: offer a microGrid for computing, a distribution point for local broadband connections, and a training area for the next users so they can make the best use of computers.
Relevant Applications and Content: Even as the network computers and the grid address affordability, and the Tech 7-11s offer accessibility, there is a need to address desirability. This has to come from locally relevant content and software applications. Think of information marketplaces to bridge information asymmetries among users, think of a library of business processes for enterprises, think of broadband content offering edutainment, think of a school-in-a-box or library-in-a-box for educational institutions. Aggregating what is already out there and making it available on the Grid is a good starting point.
By putting all of these point solutions together, it should be possible to make CommPuting available for Rs 700 ($15) per user per month, including the computer (at home or work), grid usage with gigabytes of storage space, broadband connectivity, a basic set of software applications from the open-source world, and support. For educational institutions, the Rs 700 price point could be split across a few students, not only making it more affordable but also ensuring that each student gets plenty of computing time.
It is this holistic approach to reinventing computing which can transform emerging markets in the next five years, and bring in the next billion users. The competition here is non-consumption; the constraint is our own imagination. This is the big thing the industry has been waiting for: computing's next Kumbh Mela.
Postscript: What should Microsoft and AMD do?
Microsoft should consider making Windows XP available for $1 (Rs 45) per computer per month on the Grid. Over three years, it makes half of what it otherwise charges for Windows XP. This is $36 more than what it makes today from users in the emerging markets! This will help strengthen the developer base and give independent software vendors and content developers who already have Windows-based applications and content an opportunity to generate revenues and expand further. It can also neutralise the potential impact that open-source software can have among users.
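The arithmetic behind the $1-a-month suggestion, stated explicitly (assuming a roughly $72 one-time licence price, which the "half" comparison implies, and near-zero current revenue from these users; both figures are inferences, not from the post):

```python
# Revenue comparison for the $1/month-on-the-Grid idea.
# Assumptions: ~$72 one-time licence, ~$0 revenue today from the same
# emerging-market users, three-year horizon.
monthly_fee = 1           # $ per computer per month
months = 3 * 12           # three years
subscription_revenue = monthly_fee * months   # total over the horizon
one_time_licence = 72     # assumed retail licence price
current_revenue = 0       # assumed revenue from these users today

print(subscription_revenue)                        # 36
print(subscription_revenue == one_time_licence // 2)   # half the licence
print(subscription_revenue - current_revenue)      # 36: the gain claimed
```

The post's "half of what it otherwise charges" and "$36 more than what it makes today" are both consistent under these assumed figures.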
As for AMD, I think it will find a market for its PIC, but likely not among the masses in the emerging markets; rather, in selected niches in the developed markets (the second or third computer at home, libraries, internet kiosks, schools), where the simpler $249 device will finally fulfill Larry Ellison's network computing vision. Talk about unintended consequences!
NW Venture Voice has a post by Martin Tobias on his visit to India:
– The Indian economy is firing on all cylinders. Auto sales, pharma sales, real estate, outsourcing and IT jobs are up double digits year on year.
– India’s growth is accelerating even as China comes on line.
– Indian companies are moving up the value chain, especially in IT. There is a consolidation going on at the top of the IT outsourcing business by Wipro and Infosys, leaving the middle market and small players looking to product development for growth. Look for Indian companies to get into original branded software product development in a big way (look out America).
– The bio/pharma sector is probably hotter than the IT sector. A partner in a leading executive recruiting firm said he has triple the number of searches going on in bio/pharma versus IT.
– The VCs that only visit Bangalore and think the India story is only about BPO are missing the mark.
– The Indian government is serious about divesting state owned enterprises (the old bedrock). Twelve new ones are being offered in the next three months.
The most interesting one to me was the fact that every day the Economic Times of India carried another article on an Indian IT company moving up into product development, along with details of all the government programs to support this. While the culture of development is significantly different from the culture of a call center, it is probably just a matter of time. The recruiter I was talking to said most of his “C”-level hires are returnees from America and Europe. Salaries go MUCH farther in India. India produces more computer engineers than America. It is only a matter of time before this combination of American-trained management and inexpensive raw talent starts to deliver really cool products.
While I didn’t spend a lot of time looking for investments or talking with potential partner companies, it was clear to me that there is WAY more going on in India than most people are considering. I advise every start-up I work with to consider what their strategy to leverage India and China is. It can be as simple as outsourcing QA/testing or a call center. It can be as complex as outsourcing all development and selling into the local markets. Whatever the strategy, the CEO who doesn’t leverage the growth and market efficiencies going on in India/China does so at his peril.
The Register explodes some of the myths in light of “Microsoft’s determination to demonstrate that Linux is less secure than Windows”:
– Windows only gets attacked most because it’s such a big target, and if Linux use (or indeed OS X use) grew then so would the number of attacks.
– Open Source Software is inherently dangerous because its source code is widely available, whereas Windows ‘blueprints’ are carefully guarded by Microsoft.
– Statistics ‘prove’ that Windows has fewer, less serious security issues than Linux, that Windows issues are always fixed, and that they are fixed faster.
InfoWorld writes: “Four big technology challenges face IT managers who work at small to medium-size businesses. And guess what? The challenges look a lot like the ones confronting large organizations: VoIP, SANs, Gigabit Ethernet, and security. Few IT shops, big or small, would be crazy enough to tackle all four issues at once. Together, all this new technology may have reached the critical mass necessary for SMBs to think seriously about a network overhaul.”
David Emberton writes:
Most people in the industry still think of broadband as a class of connection, whether it be fast DSL, cable or a T1. But in reality, broadband as a generic term will soon become disconnected from the technical specifics and come to represent a particular kind of content: a space beyond the hypertext web that consumers refer to by name.
Allow me to paint a (somewhat inadequate) word picture:
It's off-Broadway, for TV.
In the same way that off-Broadway plays are the poor/weird cousins of premium theater, some things are appropriate for regular TV broadcast, and some aren't. Whether it be short, cheaply made, or interactive, there's just a certain class of content that lends itself to being browsed on a computer rather than watched on TV. The point is that broadband is definitely not just text websites delivered faster, or even text websites with a few bells and whistles added. It's TV-on-demand, but also on-a-budget.
Let's imagine it another way. If you've flown in the past few years, internationally at least, you will have encountered in-flight entertainment systems and movies on demand. These types of systems are typical of what I'm describing, except instead of full-length films, broadband (as a mental concept) will be associated with anything under 5 minutes or so. High-budget productions might last longer, especially commercials like the Volvo V50 example, but on the whole it will be quick and easy to digest, and tied together with a kind of interface that we really haven't seen yet.
So what can you do to take advantage of the emerging broadband market? Start studying short features: DVD extras, cartoon shorts, interactive CD-ROMs. Figure out how to make disparate pieces of DV content fit together and interlink in ways text pages can't. Find yourself writing scripts, counting words, and figuring out what TV and movie people have already learned about stringing pieces of video together.
Mostly though, pay attention not to what software companies and entrenched geeks think, but to what teenagers and children are actually saying and doing. Young consumers are key, and what they're saying is what'll end up on broadband.
Slashdot has a discussion on which VNC software is best. It is relevant for us since VNC is one of the ways to deliver a virtual desktop from the server grid to network computers.
News.com writes about JBoss’s release of the open-source workflow engine:
Baeyens said this release of jBPM does not have a GUI (graphical user interface), as the development team initially concentrated on creating a powerful work flow engine. A version of the product with a GUI will be released in the first quarter of 2005.
The market for work flow engines is fragmented, with no single vendor able to handle all process requirements, according to a report from research company Gartner called “Creating a BPM and Workflow Automation Vendor Checklist.” JBoss hopes to exploit this fragmentation, and its developers claim that jBPM will be cheaper to implement than other market offerings and has been designed to handle all requirements.
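A work flow engine of the kind jBPM provides is, at its core, a state machine that routes a process instance through tasks and transitions. A minimal sketch of that core idea (this is an illustration only; jBPM's actual API, with its jPDL process definitions, nodes and actions, is quite different):

```python
# Minimal work flow engine: a process definition (states + transitions)
# and instances that advance through it. Illustrative, not jBPM's API.
class Workflow:
    def __init__(self, transitions, start, end_states):
        # transitions: {state: {action: next_state}}
        self.transitions = transitions
        self.start = start
        self.end_states = set(end_states)

class ProcessInstance:
    def __init__(self, workflow):
        self.workflow = workflow
        self.state = workflow.start

    def signal(self, action):
        # Advance the instance along the named transition, if it exists.
        try:
            self.state = self.workflow.transitions[self.state][action]
        except KeyError:
            raise ValueError(f"no transition '{action}' from '{self.state}'")
        return self.state

    @property
    def ended(self):
        return self.state in self.workflow.end_states

# Example: a simple insurance-claim process, the classic work flow use case.
claim = Workflow(
    transitions={
        "submitted": {"review": "under_review"},
        "under_review": {"approve": "paid", "reject": "rejected"},
    },
    start="submitted",
    end_states=["paid", "rejected"],
)

p = ProcessInstance(claim)
p.signal("review")
p.signal("approve")
print(p.state, p.ended)   # paid True
```

Real engines layer task assignment, timers, persistence and integration hooks on top of this core, which is why, as van der Aalst notes below, they are hard to explain: the engine itself solves no single business problem.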
Wil van der Aalst, a professor at the Eindhoven University of Technology and the author of various books on work flow management, said work flow vendors have traditionally had a hard time making an impact in the market, both because of cost and a lack of understanding by management.
“It is very difficult to explain a work flow system to management, as it doesn’t solve just one problem,” van der Aalst said.
At present, work flow engines are mostly used by large organizations such as insurance companies and banks, according to van der Aalst. But he points out that work flow engines are often a component of other systems, such as those devoted to ERP (enterprise resource planning) and product data management, as well as call center software.
Van der Aalst said JBoss may also face competition from within the open-source community, as there are more than 20 open-source work flow projects, including YAWL, an engine on which he is collaborating.