Video Games Technology

Writes IEEE Spectrum in an article on id Software's technology for video games, and how it has pushed the envelope on the desktop:

In the mid-1990s, Carmack felt that PC technology had advanced far enough for him to finally achieve two specific goals for his next game, Quake. He wanted to create an arbitrary 3-D world in which true 3-D objects could be viewed from any angle, unlike the flat sprites in Doom and Wolfenstein. The solution was to harness the power of the latest generation of PCs to use BSP to chop up the volume of a true 3-D space, rather than just areas of a 2-D plan view. He also wanted to make a game that could be played over the Internet.
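The BSP (binary space partitioning) idea mentioned above can be sketched as a toy in a few lines. This is an illustrative reconstruction of the general technique, not Quake's actual code; names like `Plane` and `BSPNode` are invented here, and polygons are reduced to single points for brevity:

```python
# Toy BSP sketch: recursively split 3-D space with planes so that any point
# can be located in a convex region by walking the tree. (Illustrative only;
# not id Software's implementation.)

from dataclasses import dataclass
from typing import Optional


@dataclass
class Plane:
    # Plane defined by normal (a, b, c) and offset d: ax + by + cz = d
    normal: tuple
    d: float

    def side(self, p):
        # Positive -> point lies in front of the plane, negative -> behind
        a, b, c = self.normal
        x, y, z = p
        return a * x + b * y + c * z - self.d


@dataclass
class BSPNode:
    plane: Plane
    front: Optional["BSPNode"] = None
    back: Optional["BSPNode"] = None


def locate(node, p):
    """Walk the tree to find the leaf region containing point p."""
    while True:
        child = node.front if node.plane.side(p) >= 0 else node.back
        if child is None:
            return node
        node = child


# Split space with the vertical plane x = 0, then split the front half again
# with the plane y = 0.
root = BSPNode(Plane((1, 0, 0), 0))
root.front = BSPNode(Plane((0, 1, 0), 0))
leaf = locate(root, (2.0, -3.0, 1.0))  # x > 0 -> front subtree, then y < 0
```

Because the splits are done once, ahead of time, the renderer can traverse the regions front-to-back from any viewpoint, which is what made arbitrary viewing angles affordable on mid-1990s PCs.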

For Internet play, a client-server architecture was used. The server, which could be run on any PC, would handle the game environment consisting of rooms, the physics of moving objects, player positions, and so on. Meanwhile, the client PC would be responsible for both the input, through the player's keyboard and mouse, and the output, in the form of graphics and sound. Being online, however, the game was liable to lags and lapses in network packet deliveries, just the thing to screw up a fast action game. To reduce the problem, Id limited the packet delivery method to only the most necessary information, such as a player's position.

“The key point was use of an unreliable transport for all communication,” Carmack says, “taking advantage of continuous packet communication and [relaxing] the normal requirements for reliable delivery,” such as handshaking and error correction. A variety of data compression methods were also used to reduce the bandwidth. The multiplayer friendliness of the game that emerged, Quake, was rewarded by a huge online community that formed when it was released in June 1996.
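The scheme Carmack describes can be sketched roughly as follows: send only essential state (here, a player's position and view angle) as small fixed-format datagrams, and let a lost packet simply be superseded by the next one rather than retransmitted. The packet layout below is invented for illustration; it is not Quake's actual wire protocol:

```python
import struct

# Hypothetical packet layout (invented for illustration): sequence number,
# player id, x/y/z position, yaw angle.
POSITION_FMT = "<IHffff"  # 4 + 2 + 4*4 = 22 bytes per update


def pack_update(seq, player_id, x, y, z, yaw):
    return struct.pack(POSITION_FMT, seq, player_id, x, y, z, yaw)


def apply_update(state, data):
    # Keep only the newest state per player: a lost or out-of-order packet is
    # simply superseded by a later one, so no handshake or retransmission is
    # needed -- the "unreliable transport" Carmack mentions.
    seq, pid, x, y, z, yaw = struct.unpack(POSITION_FMT, data)
    if seq > state.get(pid, (-1,))[0]:
        state[pid] = (seq, x, y, z, yaw)
    return state


packet = pack_update(7, 1, 10.0, -4.5, 0.0, 90.0)
state = apply_update({}, packet)

# These datagrams would travel over plain UDP (SOCK_DGRAM), with no
# connection, acknowledgement, or ordering guarantee:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(packet, (server_host, server_port))
```

At 22 bytes per update, even a dial-up link can carry many updates per second, which is why trimming the payload to "only the most necessary information" mattered so much.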

Sun’s LAMP Initiative

Sun switches on LAMP initiative – Computerworld:

Sun Microsystems Inc.’s announcement of a new entry-level Linux server yesterday is only the opening salvo in an aggressive new Sun effort to capture market share in the desktop and edge-of-the-network applications market.

Going forward, Sun hopes to take advantage of the wide popularity of open-source software such as Linux, Apache Webserver, MySQL database and PHP scripting language to drive new hardware, software and services sales, said Jonathan Schwartz, Sun’s recently appointed executive vice president of software.

In a teleconference with analysts today, Schwartz outlined a new Sun initiative called SunLAMP (for Linux, Apache, MySQL and PHP) that’s designed to pursue opportunities at the low end of the market.

“LAMP is already running on 100% of the Linux servers out there. All of these technologies are already well developed,” Schwartz said.

Sun hopes to tap this interest to position itself as a viable and low-cost option for edge server applications and as an alternative to Microsoft Corp.’s desktop and client technologies, Schwartz said.

TECH TALK: Tech’s 10X Tsunamis: PCs for USD 100 (Part 2)

The way to get USD 100 PCs is to recycle them. As users in the world's developed markets buy new computers, their old computers need to find their way to the world's emerging markets. Everything in a PC is built to last. Computers, in fact, have a lifetime well beyond the 3-4 years that we have been using them for. Look at cars. They are used, and used, and used for 10 years or more. The users may change, some parts of the car may change, but cars are not junked to the Recycle Bin after a few years.

The problem with using older computers lies not in the hardware, but in the software. Older software is withdrawn from the market every few years, leaving little choice but to upgrade, and the newer versions require more firepower: memory, processing power and storage. Think about it for a moment. To use a 3-year-old PC, you will need 4-year-old software to ensure adequate performance. But can you buy older versions in the market? We cannot get Microsoft's Windows 95 or 98, or Office 97. In fact, come next April, the only Windows version available will be XP. No prizes for guessing the kind of hardware one needs to run that.

What compounds the situation is the cost of the software. Microsoft needs to be paid more than USD 500 for its combination of Windows and Office. Thus, the base cost of the computer, for hardware and software, exceeds USD 1,000 (Rs 50,000).

This is the game which the Intel-Microsoft nexus has perfected over the past 15 years. This is the nexus which needs to be broken if the world's emerging markets want computers to penetrate as deeply as they have elsewhere.

The solution to the software problem lies in using old PCs as Thin Clients in the office, and at home. What this entails is a shift to server-based computing: the use of Thick Servers. What makes this possible now is the bandwidth revolution of the past few years.

The distance between the Thin Client and the Thick Server can now be measured in hundreds of metres, unlike the few centimetres of the past. In other words, one had to use Thick Desktops because LAN/WAN speeds were not great enough to support a large number of Thin Client users concurrently. With 100 Mbps LANs and the coming Gigabit Ethernet, this is no longer a concern in the enterprise. From the home user's point of view, cable modems can provide reasonable bandwidth to the home, as can wireless LANs. Both enable the server to be located remotely.
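A back-of-envelope calculation makes the bandwidth argument concrete. The per-session bandwidth and the utilisation headroom below are assumptions for illustration, not figures from the text:

```python
# How many thin-client sessions can a shared link carry, assuming roughly
# 1 Mbit/s per session for screen updates and input, and 60% usable link
# capacity to leave headroom? (Both figures are assumptions.)

LINK_MBPS = {"100 Mbps Ethernet": 100, "Gigabit Ethernet": 1000}
SESSION_MBPS = 1.0   # assumed average per thin-client session
UTILISATION = 0.6    # assumed usable fraction of link capacity

for name, capacity in LINK_MBPS.items():
    clients = int(capacity * UTILISATION / SESSION_MBPS)
    print(f"{name}: ~{clients} concurrent thin clients")
```

Even with these conservative assumptions, a single 100 Mbps LAN segment supports on the order of dozens of concurrent users, which is why the Thick Desktop stops being a necessity.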

By using Linux and applications like KDE (desktop), Evolution (email and calendaring), Mozilla (web browser) and OpenOffice (word processor, spreadsheet and presentation software), it is now possible to eliminate the need for proprietary and expensive applications on the desktop. For example, the most recent releases of OpenOffice have made it good enough to open most Microsoft Word, Excel and PowerPoint files, making it possible to not require Microsoft Office for reading and writing documents in the de facto standards. This is the turning point for creating a viable alternative to the Microsoft monopoly on the desktop.

Thus, recycling older computers and converting them into Linux Thin Clients can bring down the cost of the desktop to USD 100 or less. This is the 10X tsunami: the cost of computing, brought down by a factor of 10, will make the computer affordable to a whole segment of users who previously could never have imagined using one. This is at the heart of the revolution which needs to cut across the world's emerging markets.
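The arithmetic behind the 10X figure, spelled out with the numbers the column itself uses (a USD 1,000 new Windows PC versus a USD 100 recycled Linux Thin Client):

```python
# 10X cost reduction, using the column's own figures.
new_pc_usd = 1000      # hardware plus Windows/Office, as cited above
thin_client_usd = 100  # recycled PC running Linux as a Thin Client

factor = new_pc_usd // thin_client_usd
print(f"Cost reduction factor: {factor}x")
```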

Additional Reading:

  • Tech Talk: The Digital Divide: USD 100 Computer (April 11, 2002)
  • Tech Talk: Server-based Computing (July 1-12, 2002)
  • Tech Talk: India's Next Decade: A New Mass Market [1 2 3] (May 13-15, 2002)

    Tomorrow: Tech Utility