The first era of server-based computing (mainframes and minicomputers) required very expensive servers, with dumb terminals as clients. In that period, connectivity between the terminals and the server ran over low-speed serial ports, which limited data transfer. The client-server era that followed required desktops that were expensive from the point of view of the emerging markets. The Internet computing paradigm offered a return to server-based computing, but still needed a Windows desktop to support applications like Outlook, Lotus Notes and MS-Office. Their proprietary file formats made the shift away from the Windows desktop impractical.
So, what is different that can drive server-based computing this time around? To answer this question, we need to examine some of the recent developments in the world of technology.
1. Linux and Open Source: In the past few years, Linux has made a big impact on the server side, especially in the web server segment with Apache. Today, Linux has a complete suite of server applications available: File Server, Print Server, Mail, Proxy, Firewall, Anti-Virus and Anti-Spam support, Web Server and even a J2EE-compatible Application Server (JBoss).
2. Open Office: Recent releases of Open Office can read and write MS-Office formats. This does not mean that every file converts perfectly; rather, most files can be read with limited loss of formatting, and files can be written in DOC, XLS or PPT formats for exchange with MS-Office users.
3. Desktop Applications: Applications like Mozilla (for web browsing) and Evolution (for email, calendar, contacts and to-do lists) provide the combined power to create a Linux-based Thin Client. Applications like Wine also provide the ability to run some specialised Windows applications, should the need arise.
4. Internet: The Internet provides a platform for distributing software updates easily to the Thick Servers on enterprise LANs. This matters because open-source software is updated frequently, and these updates need to be distributed to the servers. The Internet also provides the infrastructure for moving data across multiple locations in an enterprise.
5. Standards: First HTTP and HTML, and now XML and SOAP, are enabling the creation of software components, much like Lego blocks. While Web Services are primarily for application-to-application integration, they provide the platform for building enterprise software components that can be run off servers and distributed via the Internet.
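To make the "Lego blocks" point concrete, here is a minimal sketch of what a SOAP 1.1 request between two such components might look like. The service, operation and element names (GetStockLevel, ItemCode, the example.com namespace) are hypothetical, chosen purely for illustration; only the soap:Envelope structure and its namespace are part of the actual standard.

```xml
<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <!-- Hypothetical operation: one application asks another for a stock level -->
    <GetStockLevel xmlns="http://example.com/inventory">
      <ItemCode>A-1001</ItemCode>
    </GetStockLevel>
  </soap:Body>
</soap:Envelope>
```

Because the envelope is plain XML carried over HTTP, any component on any platform can produce or consume it, which is what makes server-side components distributable over the Internet.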
6. LAN Speeds: 100 Mbps LANs are the norm in enterprises today, with technology pushing towards 10 Gbps Ethernet. So, sending screens to the Thin Clients across the network will no longer clog up the LAN as it did in the early days of X Windows.
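A back-of-envelope calculation shows why 100 Mbps is enough. The per-client figure below (about 1 Mbps of screen traffic for a compressed display protocol) is an illustrative assumption, not a measurement:

```python
# Rough LAN load from thin-client screen traffic (illustrative numbers only).

def lan_utilization(clients: int, mbps_per_client: float, lan_mbps: float) -> float:
    """Fraction of LAN capacity consumed by thin-client screen updates."""
    return (clients * mbps_per_client) / lan_mbps

# Assumed: 40 thin clients, ~1 Mbps of screen traffic each, 100 Mbps LAN.
print(lan_utilization(40, 1.0, 100.0))  # 0.4, i.e. 40% of the LAN
```

Even with a whole office of thin clients, screen traffic consumes well under half of a 100 Mbps segment, and a fraction of a percent of 10 Gbps.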
7. Microsoft's Software Assurance program: Microsoft's July 31 deadline for companies to move to a subscription basis for its software, and the consequent increase in costs, has encouraged IT managers to experiment with alternatives. Of course, in the developed markets, the savings from alternatives are reduced because staff need re-training. This is not necessarily the case in emerging markets, where many users may actually be first-time users.
Tomorrow: Recent Developments (continued)