Google vs Teoma

WSJ’s Lee Gomes writes that Teoma is in the race to give Google a run for its money, thanks to its ability to provide links to “communities” associated with a search. Teoma is owned by Ask Jeeves. Some background:

In the Web’s early days, if you wanted to know about “mortgages,” the first generation of search engines would show you pages with references to the term. But, as porn sites quickly discovered, this approach is easily fooled, say by putting “mortgage” somewhere — or dozens of times — on your page.

Then, Jon Kleinberg, a Cornell University computer scientist, realized a better approach would be to forget about the contents of a page and concentrate instead on the people linking to it. It’s known as “link analysis,” as opposed to the earlier “text analysis.” Prof. Kleinberg ran an IBM research project that tried to write software that would find the Web “communities” around a particular topic, like mortgages. You’d then go to that community, and see what sites it thought were best. A good idea, but the IBM crew couldn’t figure out how to do it fast enough.

Enter Google, which in the late 1990s came up with its own variation of link analysis. Google’s soon-to-be-famous “Page Ranking” system listed Web sites by their popularity, on the assumption that the best sites were those with the most people linking to them. It was slightly different from what IBM was trying to do because with Google, everyone, in effect, had a shot at voting for the best page, rather than a presumed “community of experts.” It worked, and Google quickly became the No. 1 search engine. It holds that position today for many reasons besides its technology, like its clean design.
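To make the link-analysis idea concrete, here is a toy sketch in Python (the link graph, damping factor, and site names are invented, and this is a simplification rather than Google’s production PageRank): each page repeatedly splits its rank as “votes” among the pages it links to, so a keyword-stuffed page that nobody links to never rises above the baseline score, no matter what text it contains.

```python
# Toy illustration of link analysis: rank pages by who links to them, letting
# each page's "vote" count in proportion to its own rank. The link graph,
# damping factor, and site names below are invented for illustration.

def rank_by_links(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to (every page links somewhere)."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # a page splits its current rank as "votes" among the pages it links to
                new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Three mortgage sites that link to one another, plus a keyword-stuffed spam
# page that nobody links to: the spam page never rises above the baseline.
toy_web = {
    "lender.example": ["rates.example", "advice.example"],
    "rates.example":  ["lender.example"],
    "advice.example": ["lender.example", "rates.example"],
    "spam.example":   ["lender.example"],
}
for page, score in sorted(rank_by_links(toy_web).items(), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```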

Lately, the Google folks have been downplaying the page-ranking system in describing their advantages — if only because everyone else is now doing it. In fact, all search engines nowadays take many things into account when deciding how to list the pages in response to a query.

A second story on search engines in WSJ is about how some smart companies capitalised on the power outage and created ads quickly to show up when people did searches for words like “black out” and “power outage”, and started attracting traffic within minutes for a cost-per-clickthrough as low as five cents. The self-service and “instant-on” capabilities of search engines mean that one can now capitalise on any major news event.

Payments via Cellphone

WSJ writes about how “credit-card companies, cellphone operators and retailers alike are aiming to create a phone that can be used any time, anywhere, to charge anything — even the time on a parking meter.”

Some of the systems use phones that emit an infrared beam or radio wave directly to a cash register; others involve a special chip embedded in the phone that can be waved over a scanner; and one has no new technology at all, using just a simple text message from the customer’s bank to his phone to authorize a credit transaction.

The payoff for everyone from banks to telecom companies could be enormous. In Japan and China, people are more likely to have a cellphone than a credit card. Moreover, by pushing the credit card from the wallet to the phone, banks and card companies believe they can hook a new generation of consumers on the idea of charging things that today are covered by pocket change. For mobile-telephone operators, that would not only increase data traffic on their networks, but also place the cellphone even more squarely at the center of people’s daily lives.

Community Computing in India

Technology Review features an interview with MIT’s Kenneth Keniston, who has perhaps the best insights into India’s ICT projects. “One tactic that particularly interests Keniston has been the deployment of community information centers in India: kiosks where villagers can pay a few rupees to access land records, market prices, and other information. India is host to an extraordinary number of community information center experiments, including private-sector initiatives like Drishtee; government-to-citizen initiatives like the Bhoomi project, which has computerized 20 million land records; and the deployment of community information centers by Indian agriculture business giant ITC, an effort that improved the efficiency of the company’s supply chain.” Excerpts from what Ken has to say:

There are several models [for Community Information Centers] in India. In Madhya Pradesh, for example, the Gyandoot project had the backing of the deputy district collector but was designed to be largely self-sustaining. In Warana, the big impetus came from the Maharashtra government, the sugarcane cooperatives, and the National Informatics Centre. Then we have the ITC, which has set up a vast operation with 800 Community Information Centers operational and increasing to 2000 kiosks soon. Soybeans, shrimp, and coffee are transacted through these kiosks, and they have a very carefully thought out revenue model. By bypassing the middleman, ITC saves eight to 10 percent on the purchase of soy, which is very impressive. In Warana, I am told that enough savings are generated from the kiosks to sustain and maintain them. The interesting thing is that some of these setups are products of companies that are not philanthropic in nature.

From the point of view of sustainability, the Drishtee Community Information Centers and the Sustainable Access in Rural India projects are similar. They plan to offer a variety of services through the Community Information Centers to recover initial investments and operating expenses. We do not know the degree to which these projects are self-sustaining; it’s perhaps too early to say. Then there are the Bhoomi land records project in Karnataka, the government of [Indian state] Andhra Pradesh’s various e-government projects, the National Informatics Centre’s efforts to computerize the district collectors’ offices across India, and the efforts of Chhattisgarh chief minister Ajit Jogi to computerize the state’s functions. There’s the case of the SARI project’s collaboration with the Aravind Eye Hospital, where the retinas of people were photographed and the doctors identified cataract patients among them, but one doesn’t know how sustainable this is. India probably has more ICT4D projects than any other country in the world, but there are no studies on their impact on the common man.

There should be two aspects to such a study: impact and sustainability. To study the impact, one should not just ask questions but live in the villages, and talk to everyone from the outcasts to the Brahmins. We also need to take a very hard look at sustainability and understand what the expenses are in building, maintaining, and sustaining the infrastructure. What are the possible sources of revenue? We know that if you pour in enough money, you will be successful. But NGOs [nongovernmental organizations] get tired of pouring money, and they eventually pull out.

It would be good to meet with Ken sometime and get his feedback on the RISC project that we plan to start shortly.

RSS Aggregator as Receiver

Adam Curry makes some interesting points:

We’ve collectively created transmitters and tools for content creators but forgot to flood the market with receivers.

RSS provides a ‘static free’ format for content and is constantly being improved to do things like carry attachments (just like email) along with written words. No formatting, no fancy script thingies or blink tags, just plain content.

RSS is the content carrier wave of the future. And everyone with a weblog can or is already creating a compatible broadcast channel.

There’s a hidden gem in RSS. Each entry can be accompanied by a URL that points to an ‘enclosure’. The idea is that an enclosure is a large media object, like an MPEG or DivX file.

When an aggregator scans an RSS signal with enclosures, it can decide to download the file in the background to your desktop or any spot on your hard drive.

A good receiver will wait for you, and only notify you when the file has been completely downloaded. It will give you a link to the local location on your hard drive. Click, no wait.
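As a rough sketch of that receiver behaviour, the Python snippet below scans a feed for RSS 2.0 enclosure elements and fetches each one on a background thread, printing a notice only once a file has fully downloaded. The feed URL and download folder are placeholders; a real receiver would also poll on a schedule, remember which items it has already fetched, and resume interrupted downloads.

```python
# Sketch of a receiver: scan an RSS feed for <enclosure> elements and fetch
# them in the background, reporting only when a file has finished downloading.
# The feed URL and download directory are placeholders.
import os
import threading
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://example.com/index.xml"   # hypothetical feed with enclosures
DOWNLOAD_DIR = "enclosures"

def fetch_enclosure(url):
    """Download one enclosure and announce it only when it is complete."""
    os.makedirs(DOWNLOAD_DIR, exist_ok=True)
    local_path = os.path.join(DOWNLOAD_DIR, os.path.basename(url) or "enclosure.bin")
    urllib.request.urlretrieve(url, local_path)   # blocking download, off the main thread
    print(f"Ready: {local_path}")                 # click, no wait

def scan_feed(feed_url):
    """Find every RSS 2.0 <enclosure url="..."/> and start a background download."""
    with urllib.request.urlopen(feed_url) as response:
        tree = ET.parse(response)
    for enclosure in tree.iter("enclosure"):
        url = enclosure.get("url")
        if url:
            threading.Thread(target=fetch_enclosure, args=(url,)).start()

if __name__ == "__main__":
    scan_feed(FEED_URL)
```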

Adam has a request for an application:

I made this one up: Request For Application.

The application is described as an internet media receiver.

Technically speaking (in my own terminology), the application is a news aggregator that supports downloads.

It should be cross-platform (otherwise I can’t test it on my Mac 🙂) and run in the background. It needs to be lightweight.

The application uses a browser as interface, so it has a built-in web server. The interface minimally allows for a One Page Aggregator and subscription management that includes specifying a download folder for each feed’s enclosures.

No windows, buttons, panes, prefs or options. I want my receiver to do nothing more than “Collect & Serve”.
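Reading that as a spec, a bare-bones “Collect & Serve” skeleton could be as small as the sketch below: a built-in web server whose only interface is a browser page listing what has arrived. The item list is hard-coded here for brevity; a real receiver would populate it from the feed-scanning and enclosure-downloading logic sketched earlier and honour a per-feed download folder.

```python
# Bare-bones "Collect & Serve": the receiver's only interface is a browser
# page served by a small built-in web server. The item list is placeholder
# data; a real receiver would fill it from the downloading code sketched above.
from http.server import BaseHTTPRequestHandler, HTTPServer

collected_items = [
    {"title": "Example show", "local_file": "enclosures/example.mp3"},  # placeholder
]

class OnePageAggregator(BaseHTTPRequestHandler):
    def do_GET(self):
        rows = "".join(
            f"<li>{item['title']} - <a href='{item['local_file']}'>local copy</a></li>"
            for item in collected_items
        )
        body = f"<html><body><h1>Received</h1><ul>{rows}</ul></body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # http://localhost:8081/ is the whole user interface
    HTTPServer(("localhost", 8081), OnePageAggregator).serve_forever()
```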

I think we should write to Adam! The Info Aggregator we have can be the base for building what Adam wants.


Smarter Web Clients

The Guardian looks beyond the browser and sees, among other things, Macromedia Central.

Web applications have become increasingly complex. Not only are users looking for more interactivity, they also need to be able to work both online and offline. Web architectures also make it difficult for web applications to monitor changes and deliver alerts, and for applications to connect to each other.

Companies such as Macromedia and Microsoft are working on tools to solve these problems, tools that still use web technologies to work with servers, but behave much more like traditional desktop applications than a browser. Known as smart clients, or rich web clients, these build on familiar technologies such as Flash or Windows Forms, as well as using web services and XML.

Flash is still limited to the browser. In March, Macromedia announced it was working on a new technology that would turn Flash into a tool for delivering rich applications to end-user desktops. Central is due to be launched by the end of the autumn. It is a two-part system: a set of development tools and a simple desktop application you use to find and install (and purchase) new applications.

What sort of applications are people building? One thing Macromedia has discovered from the beta is that developers are using Central to provide front ends for web services, such as applications to give you weather information, or help you find restaurants. You don’t need to build a new back-end application, as long as you can deliver XML to Central.
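In other words, the existing back end only has to publish its data as XML over HTTP. A minimal sketch of that server side in Python (the endpoint, port, and weather figures are invented for illustration, and this is not Central’s actual API) might look like this:

```python
# A tiny existing back end exposing its data as XML for a rich client to render.
# The endpoint, port, and weather figures are invented; this is not Central's API.
import xml.etree.ElementTree as ET
from wsgiref.simple_server import make_server

def weather_app(environ, start_response):
    """Return a small XML document that a Flash/Central-style front end could display."""
    weather = ET.Element("weather", city="Mumbai")
    ET.SubElement(weather, "temp", units="C").text = "31"
    ET.SubElement(weather, "forecast").text = "light rain"
    body = ET.tostring(weather, encoding="utf-8")
    start_response("200 OK", [("Content-Type", "text/xml")])
    return [body]

if __name__ == "__main__":
    make_server("localhost", 8082, weather_app).serve_forever()
```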


TECH TALK: IT’s Future: USA Today and Andy Grove

The IBM-Dell Profits Mystery

An article in USA Today by Kevin Maney addresses the mystery of why IBM and Dell are making money on what is seemingly a commodity. He begins with an analogy to help elucidate Nicholas Carr’s argument.

To understand Carr’s argument, think of IT as cars and companies as teenagers. When I was in high school, hardly any boys had cars. So the ones who did own cars had a huge strategic girl-luring advantage over those who didn’t. Those boys were mobile. They could get to every party. They could make out in their cars. My friend Ed had a car that had gaping holes in the floor and belched smoke like an Iraqi oil well fire, and even that was a strategic advantage.

Today, in my neighborhood of the spoiled, every high school boy has a car. So having a car is no longer a strategic advantage. Having a Lexus might give you a bit of an edge over a classmate with a Hyundai, but it’s not even close to the gulf between a boy with a car and a boy with no car.

Bottom line: As a strategic advantage for teenage boys, cars no longer matter.

This is exactly what has happened with IT. Carr says that IT used to be a strategic advantage for companies because not every company had it. So Wal-Mart could jump ahead of Kmart, in part, by investing heavily in IT and making better and faster decisions.

But these days, great technology is cheap and plentiful, and every company has its share. So IT doesn’t matter because it’s no longer a strategic advantage. It’s essentially a cost of doing business.

And if that’s the case, who wants to spend a lot on IT? It’s like phone service or office stationery: you want quality stuff for a low price, in bulk. Who does that better than anybody? Right now it’s Dell. And Dell is hotter than just about any technology producer.

But, and this is a big ol’ BUT, IT is different from most other products in one big way: The technology keeps changing and improving, often in great leaps.

If a tech company can keep coming out with really high-end, super-cool new technology, it can go to customers and offer something that will give them a strategic advantage over all the other mopes buying the commodity bulk stuff from Dell.

That’s the road IBM has taken. It pumps billions of dollars a year into its massive scientific research labs and builds big honkin’ machines like T-Rex, which it unveiled earlier this month. T-Rex is three times more powerful than previous commercial mainframes, and it starts at $1 million apiece.

In the market, IBM is increasingly winning the customers willing to take a risk on technology that might bring a strategic advantage, and Dell gets all the rest, who are just trying to keep from getting toasted by competitors.

IT’s Potential

In an interview in Business Week, Andy Grove says that we can’t even glimpse the potential of IT.

In any field, you can find segments that are close to maturation and draw a conclusion that the field is homogeneous. Carr is saying commercial-transaction processing in the U.S. and some parts of Europe has reached the top parts of an S-curve. But instead of talking about that segment, he put a provocative spin on it — that information technology doesn’t matter — and suddenly the statement is grossly wrong. It couldn’t be further from the truth.

It’s like saying: “I have an old three-speed bike, and Lance Armstrong has a bike. So why should he have a competitive advantage?” Besides, it is outside of traditional commercial-transaction processing where info tech will have the greatest impact in the future.

[Carr is correct that commercial-transaction processing in the U.S. and parts of Europe is indeed mature.] Transaction-processing software or database software Revision 8 does not change from Revision 7 nearly as much as Revision 2 did from Revision 1. But that’s not [all] information technology. Ask yourself: Is digital distribution of music saturating? If it is saturating, what is all the hullabaloo about? Is digital electronics applied to warfare saturating? Then what is it that we witnessed a few months ago? You’re talking about different parts of information technology.

Also, not even commercial-transaction processing is uniformly used. Health care, an industrial segment that represents 15% of GDP, way underuses information processing. And a lot of the problems with health care would be improved if it used it to the same extent banks do. It doesn’t.

[Carr’s article struck such a nerve] because we are in the third year of a recession, and people are anguishing… The industry is in terrible turmoil. People are going in all different directions trying to find the magic answer. Somebody comes in and says: “There’s no magic answer because the whole industry’s dead.” But the industry is not dead.

The world is being turned into a digital representation. Distance means nothing if you have a digital infrastructure. Anything digital is borderless. You cannot put obstacles in the way of digital technology flowing everywhere. Everything that has an information element can be digital, increasingly inexpensively.

That leads to wholesale personalization of everything. MP3 players are personalization. Digital delivery allows you to make your own playlist. Same thing is happening in television, with personal video recorders like TiVo. The same thing is beginning in medicine, with diagnostics and personalization of treatment. This is information technology. And I submit to you, it is very, very early. We can’t even glimpse IT’s potential in changing the way people work and live.

Tomorrow: Phil Wainewright and Business Week