Elections: Jivha’s 20 Lessons

Jivha has a series of posts on 20 themes that defined India’s elections: Foreign Origin, Rural India, EVM, Vote for Sonia? Or Vote against Vajpayee?, Youth, The Left, High Tech Campaigns, Dynasty Politics, Exit Polls, Others, India Shining, Celebrity Contestants, Alliances, SP/BSP Marginalization, Moronli, Contesting from Multiple Seats, Mud-slinging, CEO Chief Ministers, Ideology, The Atal Wave.

Eyes for Text

Ramana Rao writes about enterprise search:

Humans have Eyes for Text. Handed a document, a person can quickly scan the document and extract all kinds of useful information. The key word is useful. It's human magic that makes that assessment work in such a broad range of conditions.

Machines have generally lacked Eyes for Text. They see the same document as a sequence of bytes, when in fact to us it's really made of words organized in sentences organized in passages, not to mention all kinds of other structure related to conventions, forms, genres, and so on.

Humans are blind in the face of large numbers of documents, while machines are blind to what's in one document. Our human blindness limits our ability to, well, see patterns and anomalies across the whole and connections across elements. Machine blindness, meanwhile, means they can't help us much. It would be a case of the blind leading the blind.

Eyes for Text also suggests steps up toward understanding that aren't necessarily all the way up, say to Brains for Text. So before we solve all the scientific problems of artificial intelligence and cognitive science, we can master a useful set of primitives that certainly must lie on the path in any case.

Would we be able to understand what's in a kitchen, not to mention navigate around it, or make dinner there, if our eyes didn't pull out useful features like edges, surfaces, corners?
And the edges, corners, and surfaces within documents? They are the entities mentioned, the statements made about them, whether they state relationships, events, or facts. And the sequence of these statements tell us about the topics, authority, applicability, and so on of the text.
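The "edges and corners" Rao describes can be sketched in code. A toy illustration (my own sketch, not Rao's actual system): treat sentences as the passages and capitalized phrases as candidate entities, pulling out which entities each sentence mentions:

```python
import re

# Minimal "Eyes for Text" sketch: split a document into sentences and
# surface the candidate entities mentioned in each one. Real text-analysis
# engines use far richer linguistics; this only illustrates the idea.
SENTENCE = re.compile(r"[^.!?]+[.!?]")
ENTITY = re.compile(r"\b[A-Z][a-z]+(?:\s[A-Z][a-z]+)*\b")

def eyes_for_text(document: str):
    """Return (sentence, entities) pairs: the document's 'edges and corners'."""
    results = []
    for sentence in SENTENCE.findall(document):
        entities = ENTITY.findall(sentence)
        if entities:
            results.append((sentence.strip(), entities))
    return results

doc = "Intel announced multicore chips. Paul Otellini spoke in New York."
for sentence, entities in eyes_for_text(doc):
    print(entities)
```

Even this crude pass turns a flat byte sequence into a list of statements about named things, which is the raw material for everything Rao describes next.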

He adds in an article on Always-On:

In the next few years, many enterprises looking for new levels of organizational intelligence will deploy what I call ‘eyes for text’ engines. These text analysis engines will produce a new generation of search applications that more effectively leverage human skills. Not only that, but they will create an entirely new class of applications, in which collections or flows of information are analyzed in their entirety, instead of one user and one document at a time. These new applications can be called ‘discovery applications’ rather than search applications.

By now, everybody is quite familiar with search applications, which are also technically known as ‘information retrieval.’ Search can be characterized as users chasing documents, where a user, by finding and understanding documents, is trying to fulfill a need for information required for some broader task. The push version of retrieval, often called filtering or routing, is not essentially different; it is just a switch to documents chasing users. It’s still about individual users, documents, and information needs in the context of broader tasks.

But discovery applications based on text analysis are quite different. The crucial action leading to their use is not so much about documents, instead it is about statistics over statements. To clarify this, ‘statistics’ are about either patterns or anomalies: shapes that emerge from enough stuff, or things that blink in the night. The ‘statements’ part is about the specific content and value of individual human language expressions found in documents. And ‘over’ is the bridge between generalization and particulars, and between general applicability and specific applications.
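Rao's "statistics over statements" formula can be made concrete. A hedged sketch (illustrative data, not a real extraction pipeline): given statements already extracted from a document collection as (entity, relation, entity) triples, aggregation separates the patterns from the anomalies:

```python
from collections import Counter

# "Statistics over statements": count identical statements across a
# collection. Frequent ones are the "shapes that emerge from enough stuff";
# singletons are the "things that blink in the night".
statements = [
    ("Intel", "announced", "dual-core chips"),
    ("Intel", "announced", "dual-core chips"),
    ("IBM", "shipped", "multicore chips"),
    ("Intel", "announced", "dual-core chips"),
    ("AMD", "demonstrated", "Opteron"),
]

counts = Counter(statements)
patterns = [s for s, n in counts.items() if n > 1]    # recurring shapes
anomalies = [s for s, n in counts.items() if n == 1]  # one-off blips

print("patterns:", patterns)
print("anomalies:", anomalies)
```

The point of the "over" in Rao's phrase is visible here: no single document contains the pattern; it only exists across the whole flow.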

I believe that in seven years, purchases of discovery technologies will catch up with spending on traditional retrieval applications. Why? Because content mining can be applied directly to a number of organizational missions, whereas information retrieval is applied indirectly, by augmenting individuals and their knowledge.

By “augmenting individuals” I mean that search applications are mind-expanding applications, in the sense that users must do the understanding part, so their brain is used and thus expanded.

Using a ‘mind-expansion’ proposition to spread a technology works best by convincing end users one at a time that a technology makes them work more efficiently, until suddenly organizations have no choice except to view the technology as a cost of doing business. Many personal computing technologies started out like this, including personal computers, office software, browsers, Web servers, and e-mail systems. And on intranets, search engines are almost, but not quite, seen in this way, despite almost 10 years of mainstream Internet search.

Intel’s Multicore Strategy

WSJ writes:

Intel told analysts that it is changing its fundamental design strategy to begin adding multiple microprocessor “cores” — the calculating circuitry inside computers — onto each of its chips. Intel had discussed the concept for some time, and last Friday signaled that it would accelerate the timetable for introducing such multicore technology from 2006 to 2005.

But Paul Otellini, Intel’s president and chief operating officer, yesterday indicated an even broader commitment, predicting that the conventional microprocessors that Intel invented in 1971 will quickly become a rarity.

“All of our microprocessor development going forward is now multicore,” Mr. Otellini said. “The design paradigm has shifted at Intel.”

Multicore technology had been embraced earlier by companies such as International Business Machines Corp. Intel appeared to be in less of a hurry, in part because of its past success in boosting computing performance by increasing the operating frequency of its chips.

But higher frequencies increase heat and energy consumption, big problems as Intel tries to move its chips more aggressively into portable computers and consumer-electronics devices. Putting more microprocessors on each chip can minimize the problem.

Fully exploiting multicore chips requires new software, however. Mr. Otellini said an increasing number of programs are being enhanced to do multiple chores at once, and a new Microsoft Corp. operating system dubbed Longhorn is expected to take advantage of the new technology. Mr. Otellini, speaking at the company’s annual gathering for analysts in New York, said the dual-core technology will bring a range of benefits to consumers, such as the ability to process digital video while a user does other chores.

In another new strategy, Intel has recently built specialized circuitry into chips to carry out features that will be turned on later when software to exploit them is available.

Adds News.com:

Intel believes the dual-core approach will let it offer greater performance for desktops and notebooks while circumventing power-consumption problems. Dual-core chips offer more performance than single-cores by adding more parallelism, or the ability to do multiple jobs simultaneously. A dual-core processor could, for example, render a video on one core while running a PC’s operating system and other applications on its other core, Otellini said.
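The kind of parallelism Otellini describes, one job rendering video while another stays responsive, can be sketched with ordinary process-level concurrency. A minimal illustration (my own, obviously not Intel's code), where two independent jobs are dispatched to separate worker processes that the OS can schedule onto separate cores:

```python
from concurrent.futures import ProcessPoolExecutor

def render_frame(n: int) -> str:
    # stands in for the video-rendering workload
    return f"frame {n} rendered"

def run_app(name: str) -> str:
    # stands in for an interactive application the user keeps using
    return f"{name} responsive"

if __name__ == "__main__":
    # Two workers: on a dual-core chip the OS can place one on each core,
    # so the render job does not stall the interactive one.
    with ProcessPoolExecutor(max_workers=2) as pool:
        video = pool.submit(render_frame, 1)
        app = pool.submit(run_app, "browser")
        print(video.result(), "|", app.result())
```

The hardware supplies the cores; as the article notes below, software still has to be written to exploit them, which is exactly what this kind of task decomposition is.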

Those abilities would prove particularly useful inside the digital home, where consumers are expected to record television programs on PCs or use them to share video or other content with various electronic devices. The two processor cores on a dual-core chip would also run at lower clock speeds, reducing overall power consumption, even compared with a single-core chip of similar performance, Otellini said. Chip power consumption tends to rise with clock speed.

Otellini said, “We could not ship desktop and notebook processors at 150 watts. We just couldn’t do it.” Right now, Intel’s fastest desktop Pentium 4s, which run at 3.2GHz and 3.4GHz, consume about 90 to 100 watts of power.
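The power argument behind those numbers is worth a back-of-the-envelope check. Using the textbook CMOS approximation that dynamic power scales roughly as P ∝ C·V²·f, and the fact that lowering frequency usually permits lowering voltage too (the scaling factors below are illustrative, not Intel's figures):

```python
# Dynamic power relative to a baseline chip, assuming switched
# capacitance C is unchanged: P ∝ V^2 * f.
def relative_power(v_scale: float, f_scale: float) -> float:
    return v_scale ** 2 * f_scale

single_fast = relative_power(1.0, 1.0)      # one core at full clock
dual_slow = 2 * relative_power(0.85, 0.6)   # two cores, lower clock and voltage

print(f"single core: {single_fast:.2f}, dual core: {dual_slow:.2f}")
```

With these assumed factors, two slower cores burn about 0.87x the power of one fast core while potentially offering more total throughput, which is why clock speed stops being the headline metric.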

The dual-core strategy may also underscore a different way of thinking for the company, as outlined by senior executives at the analyst meeting.

Instead of talking about clock speeds, as they often have in past meetings, the executives stressed Intel’s approach to entire markets and outlined how they believe the company can sell more of its chips by taking advantage of occurrences such as the convergence of technologies. Going forward, Intel plans to boost its presence in the wireless computing and cellular communications markets, in addition to PCs, said Craig Barrett, Intel’s CEO, in his presentation at the meeting.

“As the technology converges, that means that our opportunities expand. This is what we’re putting all of our resources behind…(the) extension of our markets around the world,” Barrett said.