TECH TALK: The Network Computer: What Is It?

Wikipedia has this to say about the network computer:

A network computer is a lightweight computer system that operates exclusively via a network connection. As such, it does not have secondary storage such as a hard disk drive; it boots off the network, and it runs applications off the network, possibly acting as a client for an application server. During the mid to late 1990s, many commentators, and certain industry players such as Larry Ellison, predicted that the network computer would soon take over from desktop PCs, and everyone would use applications over the internet instead of having to own a local copy. So far, this has not happened, and it seems that the network computer “buzz” was either a fad or not ready to happen.

The idea actually goes back a long way, however: back to the text-only dumb terminal, and later to the GUI of the X terminal. The former needed no software to boot; everything was contained in ROM, and operation was simple. The latter needs a few files from the network in order to boot, usually fetched via TFTP after obtaining an IP address via DHCP and bootp. Modern implementations include not only the X terminal, but also Terminal Services in Microsoft Windows 2000 and XP, and others. The name has also evolved: from dumb terminal to network computer, and now to thin client.
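
To make that boot sequence concrete, here is a minimal sketch in Python of the TFTP read request a thin client would issue once DHCP has handed it an IP address and the name of a boot file. The server address and filename are placeholders, and a real client would add retransmission and fuller error handling; this is an illustration of the protocol, not production code.

```python
import socket
import struct

# Placeholders: in a real netboot these come from the DHCP/bootp reply.
TFTP_SERVER = "192.168.1.1"
BOOT_FILE = "boot-image"

def tftp_fetch(server, filename):
    """Fetch a file with a TFTP read request (RFC 1350), roughly as a
    netbooting thin client would. Returns the file contents as bytes."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(5.0)

    # RRQ packet: opcode 1, filename, NUL, transfer mode "octet", NUL.
    rrq = struct.pack("!H", 1) + filename.encode() + b"\0" + b"octet\0"
    sock.sendto(rrq, (server, 69))

    data = b""
    expected_block = 1
    while True:
        packet, addr = sock.recvfrom(4 + 512)        # server answers from an ephemeral port
        opcode, block = struct.unpack("!HH", packet[:4])
        if opcode == 5:                               # ERROR packet
            raise IOError(packet[4:].decode(errors="replace"))
        if opcode == 3 and block == expected_block:   # DATA packet
            data += packet[4:]
            sock.sendto(struct.pack("!HH", 4, block), addr)  # ACK this block
            expected_block += 1
            if len(packet) - 4 < 512:                 # a short block ends the transfer
                return data

if __name__ == "__main__":
    image = tftp_fetch(TFTP_SERVER, BOOT_FILE)
    print(f"fetched {len(image)} bytes of {BOOT_FILE}")
```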

Webopedia adds:

A computer with minimal memory, disk storage and processor power designed to connect to a network, especially the Internet. The idea behind network computers is that many users who are connected to a network don’t need all the computer power they get from a typical personal computer. Instead, they can rely on the power of the network servers.

This is really a variation on an old idea — diskless workstations — which are computers that contain memory and a processor but no disk storage. Instead, they rely on a server to store data. Network computers take this idea one step further by also minimizing the amount of memory and processor power required by the workstation. Network computers designed to connect to the Internet are sometimes called Internet boxes, Net PCs, and Internet appliances.

One of the strongest arguments behind network computers is that they reduce the total cost of ownership (TCO) — not only because the machines themselves are less expensive than PCs, but also because network computers can be administered and updated from a central network server.

Sun, too, has said since its inception that the network is the computer. There is something appealing about the idea of low-cost, simple computers connected to a centralised computing platform. The network computer has gone by many names: thin client, diskless workstation, information appliance. It is one of those enduring ideas in computing that refuses to die and keeps floating back every few years.

The world of today is very different from that of the mid-1990s, when Larry Ellison first proposed the idea of a network computer. To understand whether the network computer can succeed in today’s world, we first need to travel back and see what went wrong when it was first introduced.

Tomorrow: Ellison’s Ideas

Built-to-Flip

Business 2.0 writes on the new approach to creating companies: “Oddpost is part of an emerging breed of here-today, bought-tomorrow startups that are sprouting with minimal funding, flowering briefly, and being gobbled up by far bigger companies. In many instances, these built-to-flip outfits forgo — or sometimes can’t get — money from venture capitalists. They instead create shoestring operations focused on the rapid development of narrow technologies to plug gaps in existing product lines or add useful features to existing products. Then they look to a deep-pocketed patron to scoop them up.”

Azul’s Specialised Chips

WSJ writes about Azul, founded by Stephen DeWitt, who earlier headed Cobalt, the company acquired by Sun for $2 billion:

Departing from an industry trend toward standard chips, Azul Systems Inc. says it has packed the equivalent of 24 microprocessors on a single piece of silicon. Many companies are offering or developing such “multicore” chips, including International Business Machines Corp. and Intel Corp., but started out by squeezing two to four processors on each chip.

Azul’s chips are tailored for software programs that are written using a new generation of programming technologies, including Java from Sun Microsystems Inc. and Microsoft Corp.’s .NET. Azul plans to offer special-purpose server systems — in sizes ranging from 96 to 384 processors — that it believes will be much more efficient and powerful than existing machines for running such software.

The target audience for Azul’s new gear is corporate managers who are struggling to estimate how many servers to buy. Where each of those systems is typically assigned to run a single program or two, Azul’s machines, by contrast, are designed to handle changing workloads from many programs.

“We wanted to fundamentally eliminate the issues of capacity planning around computing,” Mr. DeWitt said.

Azul was largely inspired by the evolution of data-storage systems, Mr. DeWitt said. Where companies used to buy storage hardware from their computer vendor — which was mainly designed to work with its products — technology standards emerged in the 1990s that allowed storage systems to hold files of any type that come from nearly any kind of computer.

New programming technologies, such as Java and .NET, use a layer of translation software, called a virtual machine, that allows an application program to run on multiple kinds of computers and operating systems. Though programs based on such technologies are a fraction of the software companies use today, Mr. DeWitt cites estimates that 80% of new programs by 2008 will be based on virtual-machine approaches.
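
The virtual-machine idea the article refers to can be caricatured in a few lines. The toy Python sketch below is not Java or .NET, but it shows the essential split: a program expressed in a small, made-up portable instruction set, and an interpreter that executes it. Only the interpreter has to be ported to each processor and operating system; the program itself never changes.

```python
# A toy "bytecode" program: portable instructions, no machine code.
# PUSH, ADD and PRINT are invented opcodes, used purely for illustration.
PROGRAM = [
    ("PUSH", 2),
    ("PUSH", 3),
    ("ADD", None),
    ("PRINT", None),
]

def run(program):
    """A minimal stack-based virtual machine: the only piece that must be
    ported to each platform. The bytecode above runs on it unchanged."""
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "PRINT":
            print(stack.pop())

run(PROGRAM)  # prints 5 on any machine that can run the interpreter
```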

Tech Review 100

Technology Review presents its fourth class of 100 remarkable innovators under 35 who are transforming technology and the world.

It would be interesting to prepare a list of Indians working in India who are doing the same. I am sure there is a lot of innovation happening in India, but it hasn’t bubbled up yet. Any suggestions?

Transitive’s Emulation Sensation?

Technology Review writes: “A startup claims it has created software that lets programs run on any operating system, and any processor, without slowing down. Is the hype for real this time?”

Software emulators (software that allows another piece of software to run on hardware for which it was not originally intended) have been an elusive goal for the computing industry for almost 30 years. The ability to port software to multiple hardware configurations is something companies such as IBM, Intel, and Sun Microsystems are constantly working on. Software emulators do exist today, but most are narrowly focused, allowing one particular program to run on one other processor type. Sometimes, performance suffers with the use of an emulator.

It was with a shock, then, that I read the announcement by tiny Transitive Software of a new product, Quick Transit, that it claims allows software applications compiled for one processor and operating system to run on another processor and operating system without any source code or binary changes. My first thoughts went straight to the heart of the Linux/Microsoft battle. Could this software emulator be used to run Microsoft programs on Linux? And wouldn’t that be inviting the full wrath of the Microsoft legal team?

I called the Los Gatos, CA-based startup to learn more and ended up talking with CEO Bob Wiederhold, who spoke from Manchester, England, home of the company’s engineering offices. Wiederhold immediately dashed my grander ideas. “If we tried to run Windows programs on a Linux platform, Microsoft would be upset,” Wiederhold said. “That’s not what we’re trying to do.” Wiederhold’s initial goals are less incendiary, but could bring about big changes in the way companies manage their technology assets. What’s more, the technology could eventually drift down to the consumer level, where it could allow older video games to play on newer versions of game platforms (such as Microsoft’s Xbox or Sony’s PlayStation). The initial target market for the product, however, is large computer makers.

Wiederhold says Quick Transit has been in development for nine years, and that it’s the first software emulator that works with a broad array of processors with minimal performance degradation. Typically, software emulators, when they do work, suffer performance hits: a cursor arrow struggles to move across the screen, or there’s a two-second delay after clicking on a file menu before the dialogue box opens. Analysts who have seen Quick Transit report that it exhibits no such degradation.
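
Transitive has not said publicly how Quick Transit works internally, but the broad technique behind such emulators, dynamic binary translation, can be sketched. In the hypothetical Python example below, blocks of “guest” instructions (an instruction set invented purely for illustration) are translated once into host-side functions and cached, so repeated execution skips the translation cost; a real translator emits host machine code rather than Python and does far more optimisation.

```python
# A hypothetical "guest" program, expressed in an invented instruction set.
GUEST_CODE = {
    "loop": [("addi", "r0", 1), ("cmp_lt", "r0", 1000), ("bnz", "loop"), ("ret",)],
}

def translate(block):
    """Translate one guest block into a host-side function. A real dynamic
    binary translator would emit native machine code for the host instead."""
    def translated(regs):
        for instr in block:
            op = instr[0]
            if op == "addi":                    # add an immediate to a register
                regs[instr[1]] += instr[2]
            elif op == "cmp_lt":                # set flag if register < immediate
                regs["flag"] = regs[instr[1]] < instr[2]
            elif op == "bnz":                   # branch to a named block if flag is set
                if regs["flag"]:
                    return instr[1]
            elif op == "ret":                   # no next block: stop
                return None
        return None
    return translated

def run(entry):
    cache = {}                                  # translation cache: translate each block once
    regs = {"r0": 0, "flag": False}
    block = entry
    while block is not None:
        if block not in cache:
            cache[block] = translate(GUEST_CODE[block])
        block = cache[block](regs)              # later passes reuse the cached translation
    return regs

print(run("loop"))                              # {'r0': 1000, 'flag': False}
```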

Wired News also wrote about Transitive earlier.

Advantaged Supply Network

Strategy+Business writes:

A few companies, often market leaders in their industries, have moved away from single-transaction interactions with suppliers. These leading corporate buyers have built what we call an advantaged supply network. An advantaged supply network does not have pricing self-interest as the only basis for the buyer-supplier relationship; rather, it aims for participants in the network jointly to create competitive advantage from diverse sources for themselves and for others. Buyers strive to work closely with suppliers to attack inefficiencies and waste in the supply chain, to coordinate their business strategies, and to manage resources together for competitive advantage. Efficiency and innovation in manufacturing are gained through such cooperative buyer-supplier strategies as collaborative product and process planning, integrated engineering and design, and other forms of cooperation that lower total costs, decrease time to market, and improve the quality of the entire supply base’s output.

Whereas the price-driven transactional management model encourages transient relationships between buyers and suppliers, the advantaged supply network creates incentives for buyers to build deeper and longer-lasting relationships with suppliers, so that both sides can more effectively pursue, over time, many opportunities to bolster economic stability and competitive advantage. The network also encourages players to look for and eliminate waste.

Electronic Medical Records

WSJ has an article by Laura Landro on EMRs in the US context:

In New York’s Hudson Valley, more than 600,000 patients are blazing a trail with a new regional medical-information network that lets area hospitals, doctors, labs and pharmacies share medical records securely over the Internet.

The Taconic Health Information Network and Community project is one of the most ambitious efforts yet in a growing movement to establish large regional health-information networks around the country. While it may be a decade or more before Americans have a national system of electronic medical records — as promised this year by the Bush administration — more than 100 state and local groups are moving quickly to establish their own networks, securing seed money from federal agencies and nonprofit groups, and lining up local employers and health plans to offer financial incentives, including bonuses for doctors to participate.

The regional networks aim to get local providers to convert patients’ paper medical files to electronic records, and persuade doctors to exchange pertinent information with a patient’s other health-care providers. By using a single network, regional health groups say they can reduce medical mistakes, better track patients with chronic diseases such as diabetes, zip prescriptions electronically to pharmacies, and cut costs by eliminating duplicated lab tests and X-rays.

“The simple vision is that we want to see every American covered by one or more regional health-information organizations,” says David Brailer, who was appointed as the nation’s first health-information-technology coordinator this year. Regional networks are better suited to meet the needs of specific geographic populations, he says, and eventually, the regional networks can all be interconnected to form a national network that will enable officials to track health trends, report disease outbreaks and better identify public-health issues.

TECH TALK: The Network Computer: The Idea Returns

Recently, there was speculation that Google was building a network computer along with its own browser. The New York Post wrote:

The broader concept Google is pursuing is similar to the “network computer” envisioned by Oracle chief Larry Ellison during a speech in 1995.

The idea is that companies or consumers could buy a machine that costs only about $200, or less, but that has very little hard drive space and almost no software. Instead, users would access a network through a browser and access all their programs and data there.

The concept floundered, but programmers note that Google could easily pick up the ball. Already, its Gmail free e-mail system gives users 1,000 megabytes of storage space on a remote network, providing consumers a virtual hard drive.

“I think a similar thing [to the network computer] is developing in a more organic way now,” said Jason Kottke, a New York-based Web developer who follows Google’s moves. “People are ready for it. Instead of most of your interaction happening with Windows or Mac, you’re spending a lot of time with Google-built interfaces.”

News.com wrote: Google has also been rumored to be working on a thin-client operating system that would compete with Microsoft in areas beyond search. Techies have even discussed the idea of Google becoming a file storage system.

A commentary on ZDNet added:

What Google must do is get itself on the desktop. The obvious Google-shaped hole is local searching, where Microsoft has a history of conspicuous failure. A browser plug-in that amalgamated general file management with knowledge of Outlook, multimedia data types and online searching would be tempting indeed. Add extra features such as integrated email, instant messaging, automated backup to a remote storage facility and so on, and it gets very interesting. That would need considerable browser smarts, but would extend the Google brand right into the heart of the unconquered desktop where it would stick like glue.

By effectively combining local computing and the Web in this way Google would open up multiple revenue models. As well as advertising-supported and subscription services, it could start to offer very effective antivirus and other security functions–your data, safe in their hands–as well as any number of cleverly targeted sales opportunities based on what it knows about your personal file mix.

It would also remove one of the big barriers that stops people moving from Windows to open source. If all your important data has been painlessly stored on Google’s farm and there’s a neat, powerful Java browser-based management tool to retrieve it, you can skip from OS to OS without breaking into a sweat.

Google may not be the only one thinking about networked computers. A recent story in Business Week mentioned that AMD is planning to announce as early as October that it is teaming up with contract manufacturers to create an inexpensive, networked PC for sale in India or China. It’s part of [CEO] Ruiz’s ambitious plan to help connect 50% of the world’s population to the Internet by 2015.

So, is the network computer just a dream or will it become a reality? Given that we already have ever-cheaper computers, cellphones, TVs and gaming consoles, do we really need a fifth device? Will the network computer succeed in its second avatar? Is the network computer idea the harbinger of a deeper shift in computing?

As we seek to answer these questions, we need to first understand what a network computer is.

Tomorrow: What Is It?

Picking a Winning Product

HBS Working Knowledge has an article by Eric Mankin on the “four benchmarks for predicting the success of your product or service:”

A new product or service will be successful if it does a better job than existing products at satisfying the needs of a targeted customer group. But “doing a better job” actually has four dimensions. If a new product or service can exceed existing offerings across all four of these dimensions at once, then we can guarantee that the targeted customer group will purchase it.

The four dimensions fall into two categories, purchase motivators and purchase barriers. The new product has to excel at:

1. Providing high purchase motivators
A. It must be less expensive than existing products (lower price).
B. It must provide better features than existing products (greater benefits).

2. Eliminating purchase barriers
A. It must not have any switching or adoption costs (easy to use).
B. It must be readily available (easy to buy).

Customers for whom all four conditions apply will purchase the product or service because there are only benefits and no barriers. The closer any new product comes to succeeding in all four dimensions, the greater the chance that the product will be a winner. And, of course, the innovation will be a financial success if these conditions can be met at a profit.

Trust and Transactions in Media

Tim Oren writes that “from an investor’s perspective, there’s the possibility that one of the major value chains in modern society – media and advertising – will be rearranged, at least in part. That makes an economic analysis of the issue rather interesting.”

Google’s business model is provocative in partially reassembling the bundle from the advertisers’ point of view. Through search related ads, bundling around declared interests rather than demographics can be achieved. Adsense goes further in attempting juxtaposition of ads with actual content on the same basis. I’m awaiting with interest the form that advertising will finally take on Google News. Google is leveraging cheap cycles and a lot of algorithms research against the bundling needs of advertisers, but largely leaving the readers to fend for themselves. But, it has the advantage of a clear business proposition.

RSS aggregating software and services are a provocative attempt to let the readers build their own bundles. This is impossible in the legacy media, and creates a sharp differentiation from the old style of bundling. The juxtaposition of citizens’ media (blog posts) with legacy media content ripped from its home site goes one step further in exploding the apparent value of the old bundle. Reader side aggregation can thus destroy old value, but hasn’t so far shown an ability to extract serious revenue from readers.

Technorati is another cut. It’s not a bundling solution at all. Instead it seeks to reduce the ‘search costs’ associated with following threads of interesting discussion across the Web. If the transaction costs of retrieving individual information bits are reduced, the need for and attraction of bundling are reduced. But, there’s also the problem of a lacking business case. Perhaps that can be found from the advertisers’ side. If promotion to demographic or general interest bundles is giving way to selling by influence, then tracking the conversation becomes of value. Technorati appears to be a radical unbundling hypothesis on both the reader and advertiser sides.