The idea of simplifying computing was echoed a couple of weeks ago by Jim Smith, a general partner with Mohr Davidow Ventures. In an article on the AlwaysOn Network, Jim wrote:
It may come as a surprise to some people that the cost of operating a computing infrastructure now dominates the cost of acquiring it. At the same time, while we’ve made huge leaps in all areas of computing, we struggle to make use of those advancements.
For example, we’ve proven that we can build 10GHz microprocessors, but we leave a massive portion of those cycles unusable. We can build petabyte-scale storage systems, but absorb a huge chunk of that storage with often inaccessible data. We can connect to one another at 10Gb per second, but don’t know what information to exchange.
The key to lowering operating costs, reducing complexity and making better use of technology advancements is application-specific computing.
The low cost of computer components today allows system vendors to target systems to specific business processes. By reducing the general-purpose nature of these systems, configuration demands can be minimized. By reducing the base of software that resides on a machine, there are fewer knobs with which malicious users can abuse infrastructure. By reducing the number of applications that share infrastructure, the likelihood that applications will corrupt one another’s data stores is greatly reduced. A byproduct of application-specific systems is the opportunity to more efficiently use computing resources and vastly improve the performance of those systems.
Consider how we use toasters and personal computers. When you unbox a new toaster, you can plug it in, pop some toast in, and push the button. Contrast that with setting up a new desktop: it takes a few hours to configure, and then you must constantly update the software. Mind you, the overwhelming majority of that software you will never use. Malicious users floating through the network may use it, but you likely won’t.
When the toaster doesn’t perform up to expectations, you throw it out and buy a new one. When the computer (loaded with all types and flavors of applications) reaches a breaking point, the specter of transitioning to a new machine is almost always (and rightly so) cause for hesitation.
A crucial aspect of application-specific computing is raising the level of abstraction at which enterprises operate computing machinery. Were a toaster to operate like a general-purpose machine, a user would be forced to specify the temperature at which to toast the bread and the amount of current to supply to the heating coils, and to time-share that energy with the DVD player. Instead, the single-purpose nature of the machine allows it to be a simple choice between light and dark.
While Don Norman comes at the issue of the complexity of computing from the consumer viewpoint, Jim Smith addresses it from the enterprise side. Their views may have come a few years apart, but they underline a common thread: that of making computing less complex and more manageable, and reducing the total cost of ownership. Add to this the growing demand for affordability from the next billion users in the world’s developing countries, the growing use of open-source software in the creation of sites like Google and Yahoo, and the rapid proliferation of broadband networks. This is the backdrop against which the resurgence of interest in network computers needs to be evaluated. So, the question that arises is: are the conditions any different now to ensure the rebirth and success of the network computer?
Tomorrow: The World Today