TECH TALK: The Network Computer: What Is It?

Wikipedia has this to say about the network computer:

A network computer is a lightweight computer system that operates exclusively via a network connection. As such, it does not have secondary storage such as a hard disk drive; it boots off the network, and it runs applications off the network, possibly acting as a client for an application server. During the mid to late 1990s, many commentators, and certain industry players such as Larry Ellison, predicted that the network computer would soon take over from desktop PCs, and everyone would use applications over the internet instead of having to own a local copy. So far, this has not happened, and it seems that the network computer “buzz” was either a fad or not ready to happen.

The idea actually goes back a long way, however: back to the text-only dumb terminal, and later to the GUI of the X terminal. The former needed no software to be able to boot; everything was contained in ROM, and operation was simple. The latter requires some files to boot from the network, usually fetched via TFTP after obtaining an IP address via DHCP or bootp. Modern implementations include not only the X terminal, but also the Terminal Server in Microsoft Windows 2000 and XP, and others. The name has also evolved, from dumb terminal to network computer, and now to thin client.
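
To make that boot sequence concrete, here is a minimal sketch in Python of the TFTP fetch such a terminal performs once DHCP has handed it an address. The server address and boot filename are hypothetical, and real clients add timeouts, retransmits and error-packet handling.

```python
import socket
import struct

def tftp_fetch(server, filename):
    """Fetch a file over TFTP (RFC 1350), roughly as a diskless
    terminal pulls its boot image after DHCP assigns it an address.
    Minimal sketch: no timeouts, retries or error-packet handling."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # Read request: opcode 1, then the filename and transfer mode,
    # each terminated by a zero byte.
    sock.sendto(struct.pack("!H", 1) + filename.encode() + b"\0octet\0",
                (server, 69))
    data = b""
    while True:
        packet, addr = sock.recvfrom(4 + 512)
        opcode, block = struct.unpack("!HH", packet[:4])
        if opcode == 3:                                      # DATA packet
            data += packet[4:]
            sock.sendto(struct.pack("!HH", 4, block), addr)  # ACK it
            if len(packet) - 4 < 512:                        # short block: done
                return data

# Hypothetical server and boot file, for illustration:
# image = tftp_fetch("192.168.0.1", "pxelinux.0")
```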

Webopedia adds:

A computer with minimal memory, disk storage and processor power designed to connect to a network, especially the Internet. The idea behind network computers is that many users who are connected to a network don’t need all the computer power they get from a typical personal computer. Instead, they can rely on the power of the network servers.

This is really a variation on an old idea — diskless workstations — which are computers that contain memory and a processor but no disk storage. Instead, they rely on a server to store data. Network computers take this idea one step further by also minimizing the amount of memory and processor power required by the workstation. Network computers designed to connect to the Internet are sometimes called Internet boxes, Net PCs, and Internet appliances.

One of the strongest arguments behind network computers is that they reduce the total cost of ownership (TCO) — not only because the machines themselves are less expensive than PCs, but also because network computers can be administered and updated from a central network server.

Sun too has said since its inception that the network is the computer. There is something appealing about the idea of low-cost, simple computers connected to a centralised computing platform. The network computer has had many names: thin clients, diskless workstations, information appliances. It is one of those enduring ideas in computing that refuses to die and keeps floating back every few years.

The world of today is very different from the mid-1990s, when Larry Ellison first proposed the idea of a network computer. To understand whether the network computer can succeed in today's world, we first need to travel back and see what went wrong when it was first introduced.

Tomorrow: Ellison's Ideas

Built-to-Flip

Business 2.0 writes on the new approach to creating companies: “Oddpost is part of an emerging breed of here-today, bought-tomorrow startups that are sprouting with minimal funding, flowering briefly, and being gobbled up by far bigger companies. In many instances, these built-to-flip outfits forgo — or sometimes can’t get — money from venture capitalists. They instead create shoestring operations focused on the rapid development of narrow technologies to plug gaps in existing product lines or add useful features to existing products. Then they look to a deep-pocketed patron to scoop them up.”

Azul’s Specialised Chips

WSJ writes about Azul, founded by Stephen DeWitt, who earlier headed Cobalt, which was acquired by Sun for $2 billion:

Departing from an industry trend toward standard chips, Azul Systems Inc. says it has packed the equivalent of 24 microprocessors on a single piece of silicon. Many companies are offering or developing such “multicore” chips, including International Business Machines Corp. and Intel Corp., but started out by squeezing two to four processors on each chip.

Azul’s chips are tailored for software programs that are written using a new generation of programming technologies, including Java from Sun Microsystems Inc. and Microsoft Corp.’s .NET. Azul plans to offer special-purpose server systems — in sizes ranging from 96 to 384 processors — that it believes will be much more efficient and powerful than existing machines for running such software.

The target audience for Azul’s new gear is corporate managers who are struggling to estimate how many servers to buy. Where each of those systems is typically assigned to run a single program or two, Azul’s machines, by contrast, are designed to handle changing workloads from many programs.

“We wanted to fundamentally eliminate the issues of capacity planning around computing,” Mr. DeWitt said.

Azul was largely inspired by the evolution of data-storage systems, Mr. DeWitt said. Where companies used to buy storage hardware from their computer vendor — which was mainly designed to work with its products — technology standards emerged in the 1990s that allowed storage systems to hold files of any type that come from nearly any kind of computer.

New programming technologies, such as Java and .NET, use a layer of translation software, called a virtual machine, that allows an application program to run on multiple kinds of computers and operating systems. Though programs based on such technologies are a fraction of the software companies use today, Mr. DeWitt cites estimates that 80% of new programs by 2008 will be based on virtual-machine approaches.
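
A toy sketch helps make the virtual-machine idea concrete: the "program" below is processor-neutral bytecode, and only the interpreter needs porting to each processor and operating system. Real virtual machines (the JVM, .NET's CLR) add just-in-time compilation, garbage collection and far more; this Python sketch shows only the core concept.

```python
def run(bytecode):
    """A minimal stack-based virtual machine: the 'layer of
    translation software' reduced to its simplest possible form."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "PRINT":
            print(stack.pop())

# (2 + 3) * 4: the bytecode is portable; only the interpreter is not.
run([("PUSH", 2), ("PUSH", 3), ("ADD", None),
     ("PUSH", 4), ("MUL", None), ("PRINT", None)])
```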

Tech Review 100

Technology Review presents its fourth class of 100 remarkable innovators under 35 who are transforming technology and the world.

It would be interesting to prepare a list of Indians working in India who are doing the same. I am sure there is a lot of innovation happening in India, but it hasn’t bubbled up yet. Any suggestions?

Transitive’s Emulation Sensation?

Technology Review writes: “A startup claims it has created software that lets programs run on any operating system, and any processor, without slowing down. Is the hype for real this time?”

Software emulators (software that allows another piece of software to run on hardware for which it was not originally intended) have been an elusive goal for the computing industry for almost 30 years. The ability to port software to multiple hardware configurations is something companies such as IBM, Intel, and Sun Microsystems are constantly working on. Software emulators do exist today, but most are narrowly focused, allowing one particular program to run on one other processor type. Sometimes, performance suffers with the use of an emulator.

It was with a shock, then, that I read the announcement by tiny Transitive Software of a new product, Quick Transit, that it claims allows software applications compiled for one processor and operating system to run on another processor and operating system without any source code or binary changes. My first thoughts went straight to the heart of the Linux/Microsoft battle. Could this software emulator be used to run Microsoft programs on Linux? And wouldn't that be inviting the full wrath of the Microsoft legal team?

I called the Los Gatos, CA-based startup to learn more and ended up talking with CEO Bob Wiederhold, who spoke from Manchester, England, home of the company's engineering offices. Wiederhold immediately dashed my grander ideas. “If we tried to run Windows programs on a Linux platform, Microsoft would be upset,” Wiederhold said. “That's not what we're trying to do.” Wiederhold's initial goals are less incendiary, but could bring about big changes in the way companies manage their technology assets. What's more, the technology could eventually drift down to the consumer level, where it could allow older video games to play on newer versions of game platforms (such as Microsoft's Xbox, or Sony PlayStation). The initial target market for the product, however, is large computer makers.

Wiederhold says Quick Transit has been in development for nine years, and that it's the first software emulator that works with a broad array of processors with minimal performance degradation. Typically, software emulators, when they do work, suffer performance hits; a cursor arrow struggles to move across the screen, or there's a two-second delay after clicking on a file menu before the dialogue box opens. Analysts who have seen Quick Transit report that it exhibits no such degradation.
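
Transitive has not published how Quick Transit works internally, but the general technique the description points to, dynamic binary translation, can be sketched: translate each block of foreign instructions once, cache the result, and run the cached version on every later visit. The toy Python below uses an invented instruction set purely to illustrate why this avoids per-instruction interpretation overhead.

```python
# Toy dynamic binary translator: blocks are translated once and
# cached; later executions jump straight to the cached closure,
# which stands in here for generated native code.

translation_cache = {}

def translate_block(block):
    ops = []
    for instr in block:
        name, reg, val = instr.split()
        if name == "addi":                  # add-immediate, e.g. "addi r1 5"
            def op(regs, reg=reg, imm=int(val)):
                regs[reg] += imm
            ops.append(op)
    def native_code(regs):
        for op in ops:
            op(regs)
    return native_code

def execute(pc, program, regs):
    while pc in program:
        if pc not in translation_cache:     # translate a block only once
            translation_cache[pc] = translate_block(program[pc])
        translation_cache[pc](regs)         # later visits run the cached code
        pc += 1

regs = {"r1": 0}
execute(0, {0: ["addi r1 5"], 1: ["addi r1 37"]}, regs)
print(regs["r1"])                           # 42
```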

Wired News also wrote about Transitive earlier.

Advantaged Supply Network

Strategy+Business writes:

A few companies, often market leaders in their industries, have moved away from single-transaction interactions with suppliers. These leading corporate buyers have built what we call an advantaged supply network. An advantaged supply network does not have pricing self-interest as the only basis for the buyer-supplier relationship; rather, it aims for participants in the network jointly to create competitive advantage from diverse sources for themselves and for others. Buyers strive to work closely with suppliers to attack inefficiencies and waste in the supply chain, to coordinate their business strategies, and to manage resources together for competitive advantage. Efficiency and innovation in manufacturing are gained through such cooperative buyer-supplier strategies as collaborative product and process planning, integrated engineering and design, and other forms of cooperation that lower total costs, decrease time to market, and improve the quality of the entire supply base's output.

Whereas the price-driven transactional management model encourages transient relationships between buyers and suppliers, the advantaged supply network creates incentives for buyers to build deeper and longer-lasting relationships with suppliers, so that both sides can more effectively pursue, over time, many opportunities to bolster economic stability and competitive advantage. The network also encourages players to look for and eliminate waste.

Electronic Medical Records

WSJ has an article by Laura Landro on EMRs in the US context:

In New York’s Hudson Valley, more than 600,000 patients are blazing a trail with a new regional medical-information network that lets area hospitals, doctors, labs and pharmacies share medical records securely over the Internet.

The Taconic Health Information Network and Community project is one of the most ambitious efforts yet in a growing movement to establish large regional health-information networks around the country. While it may be a decade or more before Americans have a national system of electronic medical records — as promised this year by the Bush administration — more than 100 state and local groups are moving quickly to establish their own networks, securing seed money from federal agencies and nonprofit groups, and lining up local employers and health plans to offer financial incentives, including bonuses for doctors to participate.

The regional networks aim to get local providers to convert patients’ paper medical files to electronic records, and persuade doctors to exchange pertinent information with a patient’s other health-care providers. By using a single network, regional health groups say they can reduce medical mistakes, better track patients with chronic diseases such as diabetes, zip prescriptions electronically to pharmacies, and cut costs by eliminating duplicated lab tests and X-rays.

“The simple vision is that we want to see every American covered by one or more regional health-information organizations,” says David Brailer, who was appointed as the nation’s first health-information-technology coordinator this year. Regional networks are better suited to meet the needs of specific geographic populations, he says, and eventually, the regional networks can all be interconnected to form a national network that will enable officials to track health trends, report disease outbreaks and better identify public-health issues.

TECH TALK: The Network Computer: The Idea Returns

Recently, there was speculation that Google was building a network computer along with its own browser. The New York Post wrote:

The broader concept Google is pursuing is similar to the “network computer” envisioned by Oracle chief Larry Ellison during a speech in 1995.

The idea is that companies or consumers could buy a machine that costs only about $200, or less, but that has very little hard drive space and almost no software. Instead, users would access a network through a browser and access all their programs and data there.

The concept floundered, but programmers note that Google could easily pick up the ball. Already, its Gmail free e-mail system gives users 1,000 megabytes of storage space on a remote network, providing consumers a virtual hard drive.

“I think a similar thing [to the network computer] is developing in a more organic way now,” said Jason Kottke, a New York-based Web developer who follows Google’s moves. “People are ready for it. Instead of most of your interaction happening with Windows or Mac, you’re spending a lot of time with Google-built interfaces.”

News.com wrote: “Google has also been rumored to be working on a thin-client operating system that would compete with Microsoft in areas beyond search. Techies have even discussed the idea of Google becoming a file storage system.”

A commentary on ZDNet added:

What Google must do is get itself on the desktop. The obvious Google-shaped hole is local searching, where Microsoft has a history of conspicuous failure. A browser plug-in that amalgamated general file management with knowledge of Outlook, multimedia data types and online searching would be tempting indeed. Add extra features such as integrated email, instant messaging, automated backup to a remote storage facility and so on, and it gets very interesting. That would need considerable browser smarts, but would extend the Google brand right into the heart of the unconquered desktop where it would stick like glue.

By effectively combining local computing and the Web in this way Google would open up multiple revenue models. As well as advertising-supported and subscription services, it could start to offer very effective antivirus and other security functions (your data, safe in their hands) as well as any number of cleverly targeted sales opportunities based on what it knows about your personal file mix.

It would also remove one of the big barriers that stops people moving from Windows to open source. If all your important data has been painlessly stored on Google’s farm and there’s a neat, powerful Java browser-based management tool to retrieve it, you can skip from OS to OS without breaking into a sweat.

Google may not be the only one thinking about networked computers. A recent story in Business Week mentioned that AMD is planning to announce as early as October that it is teaming up with contract manufacturers to create an inexpensive, networked PC for sale in India or China. It’s part of [CEO] Ruiz’s ambitious plan to help connect 50% of the world’s population to the Internet by 2015.

So, is the network computer just a dream or will it become a reality? Given that we already have ever-cheaper computers, cellphones, TVs and gaming consoles, do we really need a fifth device? Will the network computer succeed in its second avatar? Is the network computer idea the harbinger of a deeper shift in computing?

As we seek to answer these questions, we need to first understand what a network computer is.

Tomorrow: What Is It?

Picking a Winning Product

HBS Working Knowledge has an article by Eric Mankin on the “four benchmarks for predicting the success of your product or service:”

A new product or service will be successful if it does a better job than existing products at satisfying the needs of a targeted customer group. But “doing a better job” actually has four dimensions. If a new product or service can exceed existing offerings across all four of these dimensions at once, then we can guarantee that the targeted customer group will purchase it.

The four dimensions fall into two categories, purchase motivators and purchase barriers. The new product has to excel at:

1. Providing high purchase motivators
A. It must be less expensive than existing products (lower price).
B. It must provide better features than existing products (greater benefits).

2. Eliminating purchase barriers
A. It must not have any switching or adoption costs (easy to use).
B. It must be readily available (easy to buy).

Customers for whom all four conditions apply will purchase the product or service because there are only benefits and no barriers. The closer any new product comes to succeeding in all four dimensions, the greater the chance that the product will be a winner. And, of course, the innovation will be a financial success if these conditions can be met at a profit.

Trust and Transactions in Media

Tim Oren writes that “from an investor’s perspective, there’s the possibility that one of the major value chains in modern society – media and advertising – will be rearranged, at least in part. That makes an economic analysis of the issue rather interesting.”

Google’s business model is provocative in partially reassembling the bundle from the advertisers’ point of view. Through search related ads, bundling around declared interests rather than demographics can be achieved. Adsense goes further in attempting juxtaposition of ads with actual content on the same basis. I’m awaiting with interest the form that advertising will finally take on Google News. Google is leveraging cheap cycles and a lot of algorithms research against the bundling needs of advertisers, but largely leaving the readers to fend for themselves. But, it has the advantage of a clear business proposition.

RSS aggregating software and services are a provocative attempt to let the readers build their own bundles. This is impossible in the legacy media, and creates a sharp differentiation from the old style of bundling. The juxtaposition of citizens’ media (blog posts) with legacy media content ripped from its home site goes one step further in exploding the apparent value of the old bundle. Reader side aggregation can thus destroy old value, but hasn’t so far shown an ability to extract serious revenue from readers.

Technorati is another cut. It’s not a bundling solution at all. Instead it seeks to reduce the ‘search costs’ associated with following threads of interesting discussion across the Web. If the transaction costs of retrieving individual information bits is reduced, the need and attraction of bundling is reduced. But, there’s also the problem of a lacking business case. Perhaps that can be found from the advertisers’ side. If promotion to demographic or general interest bundles is giving way to selling by influence, then tracking the conversation becomes of value. Technorati appears to be a radical unbundling hypothesis on both the reader and advertiser sides.

Ram Charan

[via Shrikant] Fast Company has a profile:

Ram Charan lives nowhere and goes everywhere, consulting for the largest and most powerful companies seven days a week, 365 days a year. Work is all he does, and all he wants to do. But even more than his dedication, it’s his insights that have won him the ear of hundreds of top managers.

Unlike many consultants, who, as the old joke goes, will borrow your watch to tell you what time it is, Charan doesn’t reinforce his clients’ preconceived notions. Rather, he submerges his own ego, asks questions, and ultimately tries to bring the executive to his or her own “aha” moment. Although he has the stocky build and intense gaze of a prizefighter, his voice is low and unthreatening. Says Bossidy, former CEO of AlliedSignal and Honeywell and Charan’s coauthor: “Most [consultants] tell you what you want to hear. He doesn’t, but he does it in a very positive way. He’s not a ranter or a raver, but nonetheless he’s objective and honest.”

He also speaks in the language of a real person, rather than the Harvard-trained academic he is. Indeed, Profitable Growth is written so plainly that the lessons sound almost simplistic: To make your company grow, go for singles and doubles, not home runs; focus on organic growth, not acquisitions; get your customer involved through what’s called “upstream marketing” early in the process. Charan would rather distill a concept to its pure soul and put it into action than coin a lot of useless jargon. “This is nothing earthshaking,” he says. “When I teach, I start with the idea that every person in this classroom takes one idea home to practice. I don’t want to hear people say, ‘That was a great speech. What did he say?’ Conversion of learning into practice is what counts.”

Tim Berners-Lee on the Semantic Web

Excerpts from an InternetNews.com interview:

We’re coming into phase two. It’s an exciting phase but we still have a long way to go. We have the foundation in place with the approval of RDF [Resource Description Framework] and OWL [Web Ontology Language]. In this phase, we can build up and out from those foundations.

In practical terms, it has reached a certain level of maturity. At SpeechTek here, there are a few people discussing the connection of speech to the Semantic Web, and that’s always exciting. There are some students independently at MIT doing some work and sparking a lot of discussion about the connection. There are a lot of programs coming out connecting a lot of data and a lot of ontologies.

I suppose it’s a lot like where we were in 1992 and 1993. Back then, the Web wasn’t stable, but we knew it was there and it held a lot of promise. We knew it would grow and mature, but there were a lot of things that we needed but didn’t have. This was pre-Google. Around 1991, you would go on the Web to look for something that wasn’t there. Today, that information is there and we can find it easily.

So, I think that’s where we are with the Semantic Web. We know it will mature, but we’re not quite there yet.

The excitement that it continues to generate is encouraging. The military needs it; the health sector needs it. There’s already an academic field around it. We have RDF and OWL as W3C recommendations, which are big pluses. To that extent, the Semantic Web has already reached a certain level of maturity.

Running an Efficient Board Meeting

Ed Sim has some advice: “Board meetings are like theater. Like any play, I expect the CEO to have a well thought out and scripted agenda for the meeting. The most efficient way to do so is to lay out an agenda and get feedback pre-meeting from the other board members to ensure that the board covers appropriate topics and allocates the right amount of time for each one. From an update and preparedness perspective, the CEO should always go into the meeting having a complete understanding of where the various board members stand in terms of any major decisions. There should be no surprises. This means that the CEO should have individual meetings and calls in advance of the board meeting to walk each director through any decisions that need to be made and the accompanying analyses behind them.”

Ramesh Jain on Search

[via John Battelle] ACM Ubiquity has an interview with Ramesh Jain. I had met Ramesh on my recent US visit, and he discussed some of his ideas. I think Ramesh is one of the key people who will define the next generation of search. His perspective combines multiple mental models – from multimedia to experiential computing.

JAIN: Current search engines like Google do not give me a “steering wheel” for searching the Internet (the term steering wheel was used by William Woods in one of his articles). The search engines get faster and faster, but they’re not giving me any control mechanism. The only control mechanism, which is also a stateless control mechanism, asks the searcher to put in keywords, and if I put in keywords I get this huge monstrous list. I have no idea how to refine this list. The only way is to come up with a completely new keyword list. I also don’t know what to do with the 8 million results that Google threw at me. So when I am trying to come up with those keywords, I don’t know really where I am. That means I cannot control that list very easily because I don’t have a holistic picture of that list. That’s very important. When I get these results, how do I get some kind of holistic representation of what these results are, how they are distributed among different dimensions.

UBIQUITY: What would that kind of holistic representation be like?

JAIN: Two common dimensions that I find very useful in many general applications are time and space. If I can be shown how the items are distributed in time and space, I can start controlling what I want to see over this time period or what I want to see in that space.
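
What Jain describes can be sketched as faceting: bucket the results along time and space, show the distribution, and let the user steer by narrowing rather than by retyping keywords. The records and fields in this Python sketch are invented for illustration; a real engine would extract them while indexing.

```python
from collections import Counter

# Invented result records; a real system would extract time and
# place from documents during indexing.
results = [
    {"title": "Diwali in Mumbai",   "year": 2003, "place": "Mumbai"},
    {"title": "Monsoon notes",      "year": 2004, "place": "Mumbai"},
    {"title": "Bangalore startups", "year": 2004, "place": "Bangalore"},
]

def facet(results, dimension):
    """The holistic view: how results are distributed along a dimension."""
    return Counter(r[dimension] for r in results)

def refine(results, dimension, value):
    """The steering wheel: narrow the set instead of starting over."""
    return [r for r in results if r[dimension] == value]

print(facet(results, "year"))              # Counter({2004: 2, 2003: 1})
print(refine(results, "place", "Mumbai"))  # the two Mumbai items
```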

Connecting Blog Categories

Carrick Mundell writes:

What if you could be reading someone’s blog, or even your own, and you see that the blogger has assigned a category to his post, and that category is a link to, presumably, all his other posts assigned to that category… what if when you click that link, you not only get related posts from the blog you’re reading but from all other blogs (MT or not) that use that category? The resulting related posts from other blogs would be ordered by most recent at the top and limited to, say, five or ten displayed at one time. Suddenly, you would see an immediate connection between the post you’re reading and all other posts in the entire blogosphere. I think that would be very cool.

Now, implementing this would be the real challenge. First of all, we know it’s difficult to index blogs. I still don’t find much utility in Technorati or Feedster. There’s so much noise and, it seems, latency. A simple, and in the end, non-scalable solution would be to use trackbacks to a central system of some kind. The idea of all MT blogs pinging a central system with category and summary data might work but could become hopelessly bogged down once you have thousands, if not millions, of browsers asking for related data to every blog post ever written. And that’s just MT blogs. Seems like a more distributed system would be in order, something P2P-ish.
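
As a thought experiment, the central system Mundell sketches fits in a few lines of Python: blogs ping in (category, post) data, and a category page asks for the most recent related posts across all blogs. This in-memory toy ignores exactly the scale and latency problems he raises.

```python
from collections import defaultdict

# category -> list of (timestamp, blog, title, url) tuples
index = defaultdict(list)

def ping(category, timestamp, blog, title, url):
    """What each blog would send the central system on publish."""
    index[category].append((timestamp, blog, title, url))

def related(category, limit=5):
    """Most recent posts in a category, across all pinging blogs."""
    return sorted(index[category], reverse=True)[:limit]

ping("recipes", "2004-09-20", "foodblog", "Dal tadka", "http://example.com/dal")
ping("recipes", "2004-09-21", "otherblog", "Quick rotis", "http://example.com/roti")
print(related("recipes"))   # newest first, capped at five
```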

Opening up TV, a new API

Kontra writes: “Just like Microsoft (and Google, Amazon and eBay), one day all TV networks will have to digitally restructure their content and publish their APIs so that anyone can reliably and reasonably plug into them. That’s the difference between dormant assets and a constantly evolving and extending platform.”

Suggestions for NBC: “NBC should transform itself into a content platform that provides not just finished TV shows but a wide spectrum of technical functions, including extensive metadata on its programs; pervasive indexing; advanced text, audio and video search on its stock library; speech-to-text conversion; statistical data for its news, sports, quiz and financial shows; media format conversion; automated graphical skinning and rebranding tools; e-learning templates; reviews; ratings and so on. In other words, it should not only provide the footage but all the tools necessary to enable third parties to easily plug into and remonetize what would otherwise be dormant assets.”
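
Purely as illustration, one small corner of such an API might look like the Python sketch below: a program-metadata endpoint answering in JSON. The route, fields and data are all hypothetical, not any network's actual interface.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical program metadata a network might expose.
PROGRAMS = {
    "ep-101": {"show": "Nightly News", "aired": "2004-09-20",
               "keywords": ["election", "markets"], "stock_footage": True},
}

class MetadataHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /programs/ep-101
        key = self.path.rsplit("/", 1)[-1]
        found = key in PROGRAMS
        body = json.dumps(PROGRAMS.get(key, {})).encode()
        self.send_response(200 if found else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), MetadataHandler).serve_forever()
```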

The Linux Enterprise

LinuxDevCenter has an article by Tom Adelstein: “Much discussion exists concerning the presence of GNU/Linux and open source software in the enterprise. Linux users often call into question decisions by major vendors who increase innovation on servers at the expense of the desktop. In this article, we define the market and discuss the business reasons Linux companies pursue the enterprise market while limiting their initiatives for consumers.”

Powered by Skype

Stuart Henshall looks at Skype’s first year:

Skype’s business model is dramatically different to the traditional telecoms. For the most part, all the hardware required to operate the network is owned by the individuals using the service. Thus, unlike traditional telecoms, there is no need to build out infrastructure; new users simply bring it to Skype. In that way it is similar to the SETI@home project.

Increasingly I think of Skype’s potential like a low-cost electrical utility. Compared with telecoms, which have had little innovation in handsets, the electrical grid enables thousands of different appliances. Electricity is also similar in that it is always-on and you use it in real-time. The switch and control remains with the user.

A year ago Skype was just another IM client with a voice-centric bias and tremendous audio quality. People said no-one makes money at IM or in free telephony. I have quotes in my Skype Journal. A small few a year ago (including me) said that Skype was disruptive based on its architecture and audio quality. I believe the hidden learning now is more about evolutionary changes that may not look like much in the short term but in the end are quite revolutionary. Skype is beginning to rewire the whole way in which we communicate. It will extend to business processes and social interactions. It’s also living proof that telephony is now just a software application.

Their platform capability means that when they release an effective API, just about anyone may be able to develop services that plug into Skype’s data and communications network. Early DOS is a good comparison in this regard. Although there the platform was hardware, Skype is directing their assault at platforms. These platforms have long life spans and the operating piece they are carving out is underdeveloped. While Microsoft looks at Dell, IBM, HP, Toshiba, etc., Skype is looking at the platform suppliers as manufacturers.

So their strategy poses a continuing threat to both telecoms and Microsoft Windows. Telecoms are threatened by the cost structure and the long-term challenge to their numbering system, while Windows will be challenged by Skype if the API is open enough, because the incentive will exist to develop an office platform for Linux that integrates presence, availability and communication capabilities with documents and files. While LCS (Live Communication Server) will offer this capability, it requires a central server and, my guess is, still a lower-quality audio engine. A successful Skype gives new utility to a Linux desktop at significantly lower cost vs. Windows. If the API enables easy cross-platform solutions then this market may explode.

TECH TALK: Thinking A New Food Portal: Business Model

So, plenty of ideas to build a new food portal leveraging existing expert and user-contributed content, with enhancements in the form of formatting for mobile phones, RSS feeds to provide alerts (a sketch of such a feed follows the list below), videos of the cooking process, personalisation, and more. But what is the business model? How does a site like this make money? Here are a few ideas:

Subscriptions: A part of the site could be available only to subscribers. I think the video content should be made available for the equivalent of tens of rupees per download. So, users can get the recipe details for free, but if they want to actually see the entire cooking process on-demand, then it is available for a fee.

Mobiles: Given the growth of cellphones in India, they can be tapped as a source for revenue. Food-related information could be a useful value-added service for cellphone users.

Advertising: The food industry is quite big and growing. The major Indian foods companies are still not advertising online in a significant way. But the more interesting opportunity could come from neighbourhood restaurants, who can even provide deals if they have space to fill. Non-intrusive contextual text ads could be a useful source of revenue.

Commerce: Selling ingredients and cooking-related appliances could be a potential source of revenue. Making books out of the content that exists on the site is another possible financial source.
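
As a small illustration of the RSS alerts mentioned above, here is a Python sketch that builds a feed of newly added recipes. The items and URLs are invented; a real site would generate them from its recipe database.

```python
from xml.sax.saxutils import escape

def recipe_feed(title, link, items):
    """Build a minimal RSS 2.0 feed of new recipes."""
    entries = "".join(
        f"<item><title>{escape(i['title'])}</title>"
        f"<link>{escape(i['link'])}</link></item>"
        for i in items
    )
    return (f'<?xml version="1.0"?><rss version="2.0"><channel>'
            f"<title>{escape(title)}</title><link>{escape(link)}</link>"
            f"{entries}</channel></rss>")

# Invented example items:
print(recipe_feed("New Recipes", "http://example.com/recipes",
                  [{"title": "Masala dosa", "link": "http://example.com/r/1"}]))
```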

Food is a very important part of our lives. A new food site done well could be financially lucrative. The time is right for leveraging a mix of content and technology to create richer user experiences. More importantly, it would also create a platform to build other vertical sites along similar lines.

Beyond Food

The idea of this thought experiment was to think about how new technologies can help build better and more interactive content experiences, and at the same time provide potential advertisers with targeted segments. Many times our legacy prevents us from thinking out of the box. Our habits change much more slowly than the relentless march of technology. And at the same time, it is technological innovations which create new opportunities. The first ten years of the Internet in our lives have been a source of great value. The next five will probably see even more innovation as content producers start seeing the power of the changes and extensions in the Internet's infrastructure.

In India, we have an opportunity to leapfrog and embrace much of this change. As the foundation of low-cost access devices (be it cheaper computers, thin clients, cellphones or TVs) leverages the communications base that is being built, the need will be for a new generation of services which go beyond the content that is prevalent today. This is the opportunity for Indian entrepreneurs: to envision new services and applications for an India with a hundred million Internet users.