Elections: The Andhra Pradesh Verdict

The end came quickly. The use of Electronic Voting Machines in the Indian elections meant that within a few hours of the start of counting, the decade-long regime of Chandrababu Naidu was over. Naidu went the way of Digvijay Singh in Madhya Pradesh in December last year – both were voted out after two terms (ten years) in power. A year ago, they were both seen as the two model Indian chief ministers. So, what went wrong?

I will discuss the national implications later in the week. In a day, we will know the national verdict in the Indian elections – all indications are for a hung Parliament. This has already spooked the stock markets, which fell 4% on Tuesday after a 2% fall on Monday, and are down more than 10% in the past few weeks.

Back to the question. What went wrong for Naidu, and before him Digvijay Singh? In two words, rising expectations. While there is an anti-incumbency element (people’s desire for a change), there’s much more to it. People want a lot more, a lot faster. They want a basic quality of life that most Indian governments have still not been able to deliver. Development has for the most part been uneven. In a democracy, there is “one person, one vote.” And sometimes, those of us living in the cities forget that there is another India that has barely changed. [Read my “Rajasthan Ruminations” written after a visit earlier in the year.] Education, Electricity, Water, Opportunities – we are still not able to provide these to the majority of Indians. And the elections are the only time they can have their say.

One positive outcome of the elections will, hopefully, be the realisation that India needs to develop faster, not slower. Politicians and those in power have five years to deliver. As a commentator on TV put it, “Bharat has sent a message to India” – meaning the rural population and the less privileged in urban areas are telling the elite that they do not want to be left behind.

Maybe, I am reading too much into the elections. But, I think the time has come for a transformation in rural India – that is the core which can also bring better quality of life in urban India by not just stopping but also reversing rural-to-urban migration. This is where Atanu’s RISC (Rural Infrastructure and Service Commons) model comes in. I hope some of the people in power read the paper and execute on it.

India needs change not between two generations, but between two elections. That is the only way to guarantee success at the polls!

Some other opinions on the Andhra Pradesh elections:

The Hindu: “After nine years in office, the Chandrababu Naidu Government has been emphatically voted out of power because it neglected basic issues relating to electricity, irrigation, unemployment, education, and inadequate social security for farmers and artisans. The character of the verdict makes it clear that much more than an incumbency disadvantage was involved. What Andhra Pradesh experienced was a powerful negative vote against the imbalance between World Bank-led models of economic reform and the imperatives of welfare in a society where deprivation is a non-shining mass reality. Four consecutive years of drought, mismanagement of the relief measures, and apologetic implementation of anti-poverty programmes such as ‘Vegulu’ only compounded the ruling party’s woes. The strident negativism of the TDP’s campaign backfired badly.”

Sanjaya Baru writes in The Indian Express:

Naidu’s vision will remain AP’s vision, but the Congress party will now have to re-work the political route to that vision.

Congress party spokesman and thinker Jairam Ramesh put it well when he told The Financial Express that in policy terms the Congress will offer Naidu Plus to AP, not Naidu minus. That is, the good work done in IT, in bio-tech, in institution building, in urban development, PLUS focus on areas he had neglected like irrigation, agricultural credit and rural infrastructure.

[The likely new chief minister] YSR Reddy will be Naidu plus Digvijay, says Ramesh, pointing to the fact that Madhya Pradesh Chief Minister Digvijay Singh had, in fact, focussed on rural development to the neglect of power and roads, while Naidu is accused of focussing on power and roads to the neglect of rural development.

This, however, is easier said than done. The state government needs financial resources to deliver and the coffers are empty. Reddy will have to be more imaginative and less populist than he has been so far in crafting a policy for sustaining the state’s development momentum.

Mega-Gates, Mega-Bytes and Mega-Bits

A friend suggested a presentation by Duane Northcutt (Vice President of System Architecture, Silicon Image) on how the three forces of cheap silicon, storage and bandwidth are changing computing. Duane discusses a server-centric architecture – a “virtualised computational environment” with stateless thin clients. Duane earlier worked at Sun on the Sun Ray. Some of his points:

Beginning of New Era
– New cost/performance points in storage and communications
– Allows/demands re-thinking of old tradeoffs
– Offers new cheap commodities to burn in order to optimize some other factors
– Retain the good parts of the current model, but eliminate (or reduce) the bad and ugly bits
– The next phase in the evolution of computing will be dominated by communications technology

Next-Generation Architecture
– Repartition the system: only user-accessed peripheral devices at end-points, no state/computing
– Consolidated computation and storage for greater efficiency and accessibility
– Global, fast, interactive access via ubiquitous display consoles (e.g., telephones)
– A sea of anonymous computers (including laptops)
– Simplified/automated system administration/ management

– Shifts underway in the forces that shape computing
– Computing defined by Moore’s Law giving way to new model driven by bandwidth
– Storage can make up for the lack of unlimited, ubiquitous bandwidth
– Enables new architectural model
— Consolidation of clients (through remote display technology)
— Transparent migration (through virtualized computing environments)
– Shift provides major opportunity for telecoms to provide service, not just connectivity

A New Military Theory

WSJ has a fascinating analysis on suggestions by Mr. Thomas Barnett to split US forces into two in keeping with the new world order:

Mr. Barnett’s military is a far cry from the shape of today’s armed forces. Instead of a single force to wage wars and rebuild nations, Mr. Barnett envisions two. The first, which he dubs “Leviathan,” would be hard-hitting, ready to take on conventional foes such as Saddam Hussein on a moment’s notice. The second, more unconventional force of “System Administrators” would focus on bringing dysfunctional states into the mainstream through the type of nation-building operations seen in Iraq, the Balkans and Eastern Africa. It wouldn’t only mop up after wars but would travel the world during peacetime building local security forces and infrastructure.

In Mr. Barnett’s world, countries are divided into two categories. His “core” countries are part of a global community linked by trade, migration and capital flows. Europe, the U.S., India and China fall into this group. Then there are “gap” countries that either refuse to join the global mainstream (such as Saudi Arabia and Iran), or are unable to because they have no central government or are struggling with debilitating crises (such as Iraq, Afghanistan, Somalia, and much of sub-Saharan Africa).

“The ‘gap’ is a petri dish of grief, repression, terrorism and disease,” says Adm. Cebrowski. “And 9/11 shows we can’t wall ourselves off from it.”

To join those worlds together, Mr. Barnett envisions two different military forces. The Leviathan force consists of stealthy submarines, long-range bombers and highly trained soldiers who are “young, unmarried and slightly p- off,” Mr. Barnett says.

The System Administrator force is named for the technology wonks who run corporate computer networks. This force is focused on training “gap state” security forces, stamping out insurgencies and rebuilding basic infrastructure such as legal systems and power grids.

That force would include lightly armored soldiers, the Marine Corps and officials from the State, Justice and Commerce departments along with the U.S. Agency for International Development. Its troops would be older and more specialized than the Leviathans. The purpose of the System Administrators would be to bring order to a country, but the force would also be strong enough to defend itself.

This concept relies on a key assumption: The power of the U.S.’s nuclear and conventional arms, plus increasing global economic interdependence, has made war between superpowers a thing of the past. It also assumes that wars with less-powerful states are less likely to occur.

Instead, the U.S. is more likely to find itself embroiled in dysfunctional parts of the world battling terrorists and rebuilding failed states, something it doesn’t do very well.

Mr. Barnett bets that advanced technologies will allow the U.S. to fight wars with smaller, high-tech formations. Some military analysts, such as retired Marine Corps Lt. Gen. Paul Van Riper, think that’s naive. Gen. Van Riper, who plays the enemy in Pentagon war games, says enemies could too easily hide from the Leviathan force’s sophisticated surveillance. He also thinks the System Administrator force wouldn’t be strong enough to defend itself in places such as Fallujah.

Would be interesting to get John Robb’s views on this.

A March 2003 article by Thomas Barnett on “The Pentagon’s New Map” explains why the US is going to war and why it will keep doing so. Barnett has just published a book by the same title.

Dutch Auctions

NYTimes writes about Dutch Auctions in the context of Google’s forthcoming IPO:

Google is going to use a variation on what is known as a Dutch auction, called that because it was created in the early flower markets of the Netherlands to sell multiple identical items. In the classic Dutch auction, a seller indicates how many items are available for sale and sets the minimum bid price.

Bidders indicate the number of items they want to buy and the price per item they are willing to pay. All winning bidders pay the same price per item – which is the lowest successful bid, called the clearing price. Those who bid above the clearing price, however, earn the right to buy the number of items they want, while those who bid at the clearing price have to divide the remainder.
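The allocation rule described above can be sketched in code. This is an illustrative sketch only – the function name, data shapes and the integer pro-rata rounding are my own, not from the article: bids at or above the minimum are filled from the highest price down, the clearing price is the lowest bid price needed to exhaust the supply, and bidders exactly at that price share whatever remains in proportion to the quantities they asked for.

```python
from collections import defaultdict

def run_dutch_auction(supply, min_price, bids):
    """Allocate `supply` identical items among bids of (bidder, price, qty).

    Returns (clearing_price, {bidder: quantity won}). All winners pay the
    clearing price: the lowest successful bid. Bids above it are filled in
    full; bids exactly at it divide the remainder pro rata.
    """
    # Group qualifying bids (at or above the seller's minimum) by price.
    by_price = defaultdict(list)
    for bidder, price, qty in bids:
        if price >= min_price:
            by_price[price].append((bidder, qty))

    remaining = supply
    clearing_price = None
    alloc = {}
    for price in sorted(by_price, reverse=True):  # highest price first
        if remaining <= 0:
            break
        group = by_price[price]
        demand = sum(q for _, q in group)
        clearing_price = price  # lowest price reached so far
        if demand <= remaining:
            # Everyone at this price level is filled in full.
            for bidder, q in group:
                alloc[bidder] = q
            remaining -= demand
        else:
            # Marginal price level: share the remainder pro rata
            # (integer division, so a few units may go unallocated).
            for bidder, q in group:
                alloc[bidder] = remaining * q // demand
            remaining = 0
    return clearing_price, alloc
```

For example, with 100 shares, a minimum of 10, and bids of 50 @ 20, 40 @ 15, 40 @ 12 and 30 @ 11, the clearing price is 12: the first two bidders are filled in full, the bidder at 12 gets the remaining 10 shares, and everyone pays 12.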

Google’s shares would be sold in a modified Dutch auction because it has reserved the right to set the final sale price, the allocation of shares and other auction terms. It said in its public offering statement that its goal was to eliminate the first day “pop” in prices that was built into most initial stock offerings.

[A] potential problem with Dutch auctions is that investors have an incentive to bid higher than the fair value of a stock so that they can be assured of getting shares to buy. If only a few people bid high, they would still only pay the market clearing price determined by the vast number of presumably more rational investors, since the price would be set by the lowest bid. But if lots of investors take up this strategy, the price would be driven above sustainable levels.

Another factor, called the “winner’s curse,” can potentially lead to depressed prices after the initial offering. The top bidders may realize that they bid more than what other bidders believe the shares are worth. If the winners start to worry about the share price paid, they may sell shares immediately after the auction, causing a drop in the price. For more thoughtful investors, the fear of the winner’s curse could lead them to moderate their bids in advance – a move that might lead to lower prices in an auction than in a traditional offering in which the price is set by investment bankers.

Adds the Economist:

Paul Klemperer, an economics professor at Oxford University and a designer of Britain’s 3G auction, explains in a new book (Auctions: Theory and Practice) that the details of auctions can make all the difference. In essence, auctions can fail in two main ways: by setting a price that is too high, or one that is too low. The latter failure has been more common recently. Collusion between bidders can reduce the price paid, as happened in one American auction of radio spectrum in the 1990s. Alternatively, the costs of entering an auction can be prohibitive, as with one British television franchise. The government had imposed such high costs by requiring detailed programming plans that only one bidder bothered.

Given the large number of expected bidders and the relatively low costs, Google’s IPO runs a bigger risk of setting a price that is unsustainably high. That would be the result of what economists call the winner’s curse: high bidding by naive punters that allows them to win an auction, but only by overpaying.

Mr Klemperer says that Google needs to do more to save its auction from this fate. Ensuring that small investors have the same information provided to big investors would help. So would simply explaining to unsophisticated bidders how the Google auction will work.

One concrete idea proposed by Mr Klemperer is to start by auctioning a small fraction, say 10%, of the shares to institutions. This would allow more sophisticated investors to give their view of the fair price, before unsophisticated individuals place their bets. Another option is to hold an English (ascending-bid) auction in which institutional shareholders’ bids can be observed. Again, inexperienced investors could keep a close eye on what the smart money is doing, and adjust their bets accordingly. This could diminish the risk that over-eager punters bid up the price too high, which would put a damper on the use of such auctions in the future.

Intel’s New Strategy

Dana Blankenhorn writes:

Intel’s decision to turn away from straight-ahead development of its Pentium IV and Xeon lines, in favor of putting all its eggs in low-power chips, is a big, big deal.

For starters it’s another illustration of Moore’s Second Law, which holds that as chips get more complex they get more expensive to make. Even Intel can’t do it all any more.

Intel has decided to go all-out on low power. The laptop metaphor, and all that follows it, means that Intel is leaving the electrical grid behind. Instead the grid will become back-up power. Batteries will now matter more.

This is also an important endorsement of Always-On. I’ve written here many times about how wireless LANs, as an application platform, need a battery-powered “black box” at their center.

This is where Intel is moving, toward a low-power platform that is still expandable. The “PC” at the heart of its future can’t be a desktop, because this announcement has closed that off, and it can’t be a laptop, because laptops aren’t expandable.

Some new central reference design is needed. More likely, several such designs will contend for the future. I think the interface-free black box, separating the interface from the workings of the device, is a contender in the home market, but we’ll see.

What Intel also needs to do is build $50 thin clients and sell them for $1-2 a month. Computing has to become a service, and Intel, along with Microsoft, needs to recognise this.

TECH TALK: Two Blog Years: The Wider View

I have always liked to write. During 1995-97, I wrote a fortnightly print column (about 1500 words) for Express Computers. I stopped when I realised that I was starting to repeat myself – the Internet wasn’t growing as rapidly, and I had already said much of what I wanted to say! Perhaps the format wasn’t interactive enough to give me feedback, which is very important for spurring new thought and breakthrough ideas. I started writing again in November 2000 with the Tech Talk series on Tech Samachar. That was a time when I found that my broader technology industry knowledge had atrophied over the years as I had become more and more focused on the portal business. The writing was a way for me to start reading again. The blog has just helped amplify this effort.

When I started, I was clear about one thing: that the blog would be updated daily. And except for two weekend days just after I started, that has been the case. It is a lesson I learnt from the IndiaWorld days: you have to become a daily part of the life of your readers. This is also what I tell every person to whom I recommend blogging: have something new every day. It is not difficult; it just requires discipline and determination. If daily blogging is a commitment we make, our readers will reciprocate by making a daily visit. Given that there is so much happening every day and our minds are constantly active and thinking, it is not a very difficult thing to do. Near-ubiquitous connectivity, even when one is travelling, makes posting a trivial exercise.

What has spurred the writing revolution for me is the ease of the blogging tool. Using MovableType is very easy. I am not dependent on any other person for posting to the blog. Once the blog has been set up, no technical expertise is needed. This simplicity of the blogging tools has laid the foundation for the two-way web that we are seeing emerge around us.

RSS and the Info Aggregator have been tremendous productivity enhancers. In the beginning, I used to go to various blogs via the browser. There was a finite limit to how many I could visit. Now, with the Aggregator, I am processing 10x the quantum of content in just about the same time. This is why I believe that RSS is where HTML was in 1994 at the cusp of a revolution, and why it holds great potential to become the foundation of the new Publish-Subscribe Web.

One of the by-products of blogging has been the new friends and connections that I’ve made. I have met a few of them personally during my rather infrequent travels. More often than not, the interactions have been via email. I do try and respond to every mail, comment and piece of feedback, often asking for some details about the person. This is why I believe that the weblog is a non-linear way to make connections – we can only meet so many people during our life, but via the weblog we can build an exponentially increasing network. For me, the blog and its readers are the social network.

Tomorrow: Some To-Dos
