As powerful as they are, search engines have huge weaknesses. For example, a recent Google search on the word Linux took just 0.4 seconds, but it had 95 million hits. Too bad if the one you need is No. 10,000 on the list.
But researchers are poised to revolutionize search technology over the next few years. The most common thrust is to personalize search engines so that they know, for example, that if you’re an IT professional and you search for mouse, you’re more likely to want information about PC devices than about animals.
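The mouse example hints at how personalization might work in practice: bias the ranking toward documents that overlap a user's interest profile. A toy sketch (not any real engine's algorithm; the scoring and data are invented for illustration):

```python
# Re-rank (score, terms) results: documents whose terms overlap the
# user's interest profile get their base relevance score boosted.
def personalize(results, profile, boost=2.0):
    rescored = []
    for score, terms in results:
        overlap = len(set(terms) & profile)
        rescored.append((score * (1 + boost * overlap), terms))
    return sorted(rescored, reverse=True)

# Query "mouse" from an IT professional's profile:
profile = {"computer", "hardware", "usb"}
results = [
    (1.0, ["mouse", "rodent", "animal"]),
    (0.8, ["mouse", "usb", "computer"]),
]
print(personalize(results, profile))
# The hardware page now outranks the zoology page.
```

Real systems would infer the profile from search history rather than ask for it, but the reweighting idea is the same.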
Wired has an article by Michael Malone:
The biggest impediment to our technological future isn’t extending Moore’s law. Thanks to recent breakthroughs at the semiconductor manufacturing level, by 2010 top-tier processors should be stuffed with a billion transistors and running at more than 20 gigahertz. No, the biggest challenge to progress is much more ordinary: It’s battery life. What good is a super-functional cell phone if it runs out of juice after 20 minutes? Or a laptop supercomputer that weighs 15 pounds and singes your thighs?
The problem can be stated in a single word: wireless. When Intel cofounder Gordon Moore made his famous proclamation in 1965, he may have anticipated the existence of untethered electronics. But in those days of core memory and wired logic, integrated circuits were seen as astounding breakthroughs in energy conservation. No one could have imagined that billions of chips would be in use, each packed with millions of transistors – and that so many of the chips would unplug themselves from the wall.
In Moore’s own prophetic words, “it may prove more economical to build large systems out of smaller functions, which are separately packaged and interconnected. The availability of large functions, combined with functional design and construction, should allow the manufacturer of large systems to design and construct a considerable variety of equipment both rapidly and economically.”
It may not have the pop of Moore’s legendary logarithmic memory-chip chart, but implicit in those words are the roots of a new rule regarding the efficiency of electronic devices. Like Moore’s first law, the second is actually a pact. The first, as explained by networking pioneer Robert Metcalfe, was a promise made by the chip industry that it would strive mightily, for as long as physically possible, to double net chip performance along the three axes – speed, miniaturization, and price – every 18 to 24 months. Increasing overall system efficiency requires much greater collusion.
What we need is a fourth axis of development – a systematic improvement of overall system efficiency, from the individual silicon gate, through motherboards and displays, all the way up to the Internet itself. How do we do it? Exhaustively.
What should be the second law’s equivalent to the first’s famous “double every 18 to 24 months” formulation? We need something sufficiently Herculean without being impossible.
Let’s try this. Moore’s second law: Overall net efficiency of any electronic system will double every 24 months.
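A doubling every 24 months compounds quickly. A quick check of what the proposed second law implies, relative efficiency after t months being 2^(t/24):

```python
# If overall system efficiency doubles every 24 months, relative
# efficiency after `months` months is 2 ** (months / 24).
def efficiency_gain(months, doubling_period=24):
    return 2 ** (months / doubling_period)

for years in (2, 4, 10):
    print(f"{years:>2} years -> {efficiency_gain(12 * years):.0f}x")
# A decade of 24-month doublings is a 32-fold gain -- all else equal,
# a cell phone battery that lasted 20 minutes would last over 10 hours.
```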
William Gurley writes:
What is most striking about the notion of a 45-megabit Internet Protocol connection is the overwhelming universality of such an incredibly high-speed packet-based conduit. Into it melt all forms of media and communications: voice, data, video and any other application or service you might imagine. There is no need to consider bringing multiple connections or service providers into your home, for this network can do everything you need and more. Early signs in Japan are consistent with this notion. Yahoo BB announced a stunning 80 percent attachment rate on its IP-based phone service. It is now promoting an IP-based set-top box for the ultimate in personalized television.
While an all-IP world may not happen immediately, over the next 10 years, our communications networks will very likely follow the lead of the aggressive rollouts in Korea and Japan. As IP engulfs everything else, many traditional industries and paradigms will be challenged. For the companies involved, the time to prepare for these challenges is today. Postponement will only increase the likelihood of failure.
India has an opportunity to do things right by building a nationwide broadband IP backbone.
HBS Working Knowledge has an excerpt from a new book “Heads Up: How to Anticipate Business Surprises and Seize Opportunities First” by Gartner’s Kenneth McGee:
In many ways, it is difficult to understand why real-time surveillance, like that aboard airplanes, of critical day-to-day business events is not more readily incorporated into the daily regimen of employees, managers, executives, and even board directors. After all, the notion of monitoring, capturing, analyzing, reporting, and responding to critical information is not an alien concept in our day-to-day activities.
* We depend on real-time information about the time of day to make it to meetings on time.
* We rely on thermostats in our homes and office buildings to respond instantly with more heat or air conditioning when the temperature rises or drops beyond a certain point.
* We expect the gauges in our cars to reflect real-time information on our speed (especially when we see a semi-concealed police car) and fuel status.
* We watch the meter at the self-service gas pump to make certain we stop the flow of gas when we reach the desired amount.
* We use smoke and fire detectors to warn us immediately of danger, especially while we are asleep.
* We even use temperature-sensitive pop-up buttons to tell us when to take a Thanksgiving turkey out of the oven.
We are surrounded by examples of real-time monitoring, capturing, analyzing, reporting, and responding to events. Despite the damage caused by business surprises attributable to an absence of real-time information and the prevalence of real-time information in our personal lives, little is being done to change the business culture and processes that tolerate surprises and to begin using real-time opportunity detection.
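The thermostat pattern McGee describes is a simple loop: monitor a value, compare it against a band, respond when it strays. The same loop underlies his "real-time opportunity detection" for business metrics. A minimal sketch (the metric and thresholds are invented for the example):

```python
# Monitor-compare-respond: return an action when a reading leaves its band.
def check_metric(value, low, high):
    if value < low:
        return "alert: below threshold"
    if value > high:
        return "alert: above threshold"
    return "ok"

# e.g. hourly order volume, with an acceptable band of 50-200 orders
for reading in (120, 35, 240):
    print(reading, "->", check_metric(reading, low=50, high=200))
```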
The Economist wrote recently about the Energy Internet:
Transforming today’s centralised, dumb power grid into something closer to a smart, distributed network will be necessary to provide a reliable power supply and to make possible innovative new energy services. Energy visionaries imagine a self-healing grid with real-time sensors and plug-and-play software that can allow scattered generators or energy-storage devices to attach to it. In other words, an energy internet.
The good news is that technologies are now being developed in four areas that point the way towards the smart grid of the future. First, utilities are experimenting with ways to measure the behaviour of the grid in real time. Second, they are looking for ways to use that information to control the flow of power fast enough to avoid blackouts. Third, they are upgrading their networks in order to pump more juice through the grid safely. Finally, they are looking for ways to produce and store power close to consumers, to reduce the need to send so much power down those ageing transmission lines in the first place.
In the long run, however, the solution surely does not lie in building ever fatter pipes to supply ever more power from central power plants to distant consumers. Amory Lovins, head of the Rocky Mountain Institute, an environmental think-tank, explains why: the more and bigger bulk power lines you build, the more and bigger blackouts are likely. A better answer is micropower: a large number of small power sources located near to end-users, rather than a small number of large sources located far away.
At first glance, this shift toward micropower may seem like a return to electricity’s roots over a century ago. Thomas Edison’s original vision was to place many small power plants close to consumers. However, a complete return to that model would be folly, for it would rob both the grid and micropower plants of the chance to sell power when the other is in distress. Rather, the grid will be transformed into a digital network capable of handling complex, multi-directional flows of power. Micropower and megapower will then work together.
ABB foresees the emergence of microgrids made up of all sorts of distributed generators, including fuel cells (which combine hydrogen and oxygen to produce electricity cleanly), wind and solar power.
Energy-storage devices will be increasingly important too. Electricity, almost uniquely among commodities, cannot be stored efficiently (except as water in hydro-electric dams). The most intriguing storage option involves hydrogen, which can be used as a medium to store energy from many different sources.
Hydrogen could radically alter the economics of intermittent sources of green power. At the moment, much wind power is wasted because the wind blows when the grid does not need, or cannot safely take, all that power. If that wasted energy were instead stored as hydrogen (produced by using the electrical power to extract hydrogen from water), it could later be converted back to electricity in a fuel cell, to be sold when needed. Geoffrey Ballard of Canada’s General Hydrogen, and the former head of Ballard, a leading fuel-cell-maker, sees hydrogen and electricity as so interchangeable on the power grid of the future that he calls them hydricity.
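The economics of the wind-to-hydrogen idea come down to round-trip efficiency. A back-of-envelope sketch, using illustrative assumptions rather than figures from the article (roughly 70% for electrolysis and 50% for a fuel cell):

```python
# kWh of electricity recovered after storing surplus power as hydrogen.
# Efficiency figures are illustrative assumptions, not article data.
def round_trip(surplus_kwh, electrolyzer_eff=0.70, fuel_cell_eff=0.50):
    return surplus_kwh * electrolyzer_eff * fuel_cell_eff

recovered = round_trip(100.0)  # 100 kWh of otherwise-wasted wind power
print(f"{recovered:.0f} kWh recovered")
# Even at ~35% round-trip efficiency, every kWh recovered from energy
# that would otherwise have been wasted is a net gain.
```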
India needs to think of a leapfrog energy platform. Incremental solutions will not get us anywhere. One possible option for the future energy needs of countries like India is hydrogen fuel-cell micropower. What are the challenges in making it a reality?
Tomorrow: Energy (continued)