TECH TALK: Server-based Computing: A Brief History of Computing (Part 2)

The commoditisation of the PC through the attack of the clones in the 1980s and 1990s ensured that prices kept falling and adoption kept rising. During the 1990s, PCs running Windows took over the desktops, and servers running Windows NT and Unix took over much of the server side. Unix workstations made their presence felt where there was a need for additional horsepower on the desktop. Sun's mantra for much of the 1990s was: the network is the computer.

Desktops continued to become more and more powerful following Moore's Law, and software got more and more bloated. Into this world of the mid-1990s, seemingly out of nowhere, emerged the Internet. Built on open standards (TCP/IP, HTTP and HTML), the web browser started becoming the de facto front-end, with web servers as the back-end for serving data and applications. This was client-server with a twist: the client only had to have a web browser, and didn't need to do much processing.

There were even efforts to create network computers: stripped-down machines that left the processing to the server. These efforts failed because they overlooked one important fact: Microsoft had by then taken control of the desktop, and more critically, of the file formats for document exchange (DOC, XLS and PPT). It was no longer possible to be part of the world community and not run Windows on the desktop.

At the same time, the world was getting awash in bandwidth. Telecom companies laid fibre all over the place, and multi-megabit connectivity between an organisation's locations became commonplace. The great wall between speeds available on the LAN and the WAN was falling. It didn't make a difference where information or applications resided. As long as it was out there on the network, it was possible to access it. The network had truly become the computer.

So, then, here's the anomaly. Even as computers got more and more powerful and networks became more and more pervasive, the processing power available on the desktop far exceeded what most users needed. Most applications came with a web-based front-end. The mix of what users needed on the desktop boiled down to email, instant messaging, a web browser and the desktop productivity applications. The first three were platform-independent, but Microsoft had a lock on the desktop through its MS-Office suite.

This has meant that computers (thick desktops) now come with the processing power of the mini-computers of the past and cost about USD 700, and need Microsoft Windows and Office, which cost another USD 500. At no stage does CPU utilisation on the desktop cross a small fraction of what's available. These economics have meant that while the developed countries of the world have embraced computing whole-heartedly to near-saturation levels, the emerging markets of the world have been hampered by dollar-denominated pricing. The result: low adoption rates and high software piracy in countries like India and China.

Until recently, it didn't look like there was any way to beat the system. Fortunately, a number of recent developments offer hope for a return to a variant of the earlier era of server-based computing. Therein lies the route to creating a computing mass market for the next 500 million users.

Tomorrow: Recent Developments

Published by

Rajesh Jain

An entrepreneur based in Mumbai, India.