InfoWorld has a special report: “HP, IBM, Sun, and others promise a glowing future in which enterprises tap into computing resources where and when they’re needed.”
There are three basic definitions.
Utility as an on-demand computing resource: Also called adaptive computing, depending on which analyst or vendor you talk to, on-demand computing allows companies to outsource significant portions of their datacenters, and even ratchet resource requirements up and down quickly and easily depending on need. For those of us with gray whiskers in our beards, it's easiest to think of it as very smart, flexible hosting.
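The "ratchet up and down" idea can be sketched as a simple threshold-based capacity policy. This is a hypothetical illustration, not any vendor's actual implementation; the function name, per-server capacity, and headroom figure are all assumptions made up for the example.

```python
import math

# Hypothetical sketch of on-demand scaling: decide how many servers
# to run for the current load, keeping some spare headroom. All
# names and numbers here are illustrative assumptions.

def servers_needed(current_load: float,
                   capacity_per_server: float = 100.0,
                   headroom: float = 0.2) -> int:
    """Return the server count needed for current_load, with
    `headroom` (20% by default) of spare capacity on top."""
    target = current_load * (1 + headroom)
    return max(1, math.ceil(target / capacity_per_server))

print(servers_needed(50))    # light load -> 1 server
print(servers_needed(450))   # heavier load -> 6 servers
```

As demand rises, the utility provisions more capacity; as demand falls, resources are released back into the shared pool — which is precisely what distinguishes this model from traditional fixed hosting.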
Utility as the organic datacenter: This is the pinnacle of utility computing and refers to a new architecture that employs a variety of technologies to enable datacenters to respond immediately to business needs, market changes, or customer requirements.
Utility as grid computing, virtualization, or smart clusters: This is just one example of a specific technology designed to enable the above definitions. Other technologies that will play here include utility storage, private high-speed WAN connections, local CPU interconnect technologies (such as InfiniBand), blade servers, and more.
These three descriptions are different enough to seem unrelated, but in fact they're dependent on each other for survival. Should utility computing ever live up to its name — a resource you plug in to, as you would the electric power grid — then that resource needs to be distributed, self-managing, and virtualized. Whether that grand vision will ever be realized is an open question, but at least some of the enabling technologies are already here or on the horizon.