On a recommendation by Chetan Parikh, I am reading James Miller’s Game Theory at Work. It is an excellent book and a must-read for all. Through a wide variety of examples, Miller takes us through various situations that we face in personal life and business, and shows how game theory can help in analysing the possible options. Many times, we use our intuition to make decisions. Game theory can be a good addition to our arsenal to, as the book puts it, “outthink and outmaneuver your competition”. The Nash equilibrium, the Prisoner’s Dilemma and various other situations are important for us to understand.
The last couple of days, I have been meeting with Atanu Dey and listening to his RISC ideas. RISC stands for Rural Infrastructure and Services Commons. The more I listen to Atanu, the more I believe that these are the ideas which can truly transform rural India. Atanu’s background in engineering and economics makes for A Brilliant Mind. The challenge is for us to see how these ideas can be implemented to create a revolution across an India that has remained largely unchanged and untouched.
Tim Bray of Antarctica, an information-visualisation software company, has a two-part series on “the information landscape out there in the real world. Part 1 surveys the Business Intelligence landscape (it’s bad). In Part 2, the question is: how to get people to try new technology in tough times?” [via Rahul Dave]
Collecting vs Using: The bottom line: every company out there is collecting oceans of data on every aspect of their business. Nobody ever got fired for deciding to retain the records or generate a report…But investors aren’t handing out rewards for collecting a lot of data. It’s just meaningless mountains of vacuous bits unless you’re getting some good use out of it; and if what I’m seeing is the norm, a lot of people aren’t.
Sales Process: No matter which side of the fence you’re on, the software sales process is becoming increasingly dysfunctional. Furthermore, while we all acknowledge the hangover from Y2K and the bubble, we can’t all give up doing capital acquisitions, much and all as we’d like to…But I think that even when the good times return, we’ll never again see the days where people would write six-figure purchase orders for technology without having had their hands on it and really REALLY convinced themselves that it’s going to get the job done.
The second part has a discussion on the multi-part process Antarctica is using to sell. Some good ideas which we can apply. A point which Tim makes: “For us at Antarctica, we find out if the customer really has a problem they care about, because writing even a small cheque, these days, is a very effective filter.”
Barron’s writes about the shift from enterprise application integration (EAI) software vendors to application server vendors.
To some extent, the application integrators, such as Tibco, webMethods, SeeBeyond, Vitria Technology, and Mercator fulfilled their promise — some more than others — but the group has not performed over the long haul nearly to the degree that many analysts and investors envisioned. Much of the blame justifiably rests on the companies themselves. Their products were often expensive, difficult to install and labor-intensive. But more importantly, during what was supposed to be their moment of glory at the turn of the century, much of the integration action shifted toward connecting with systems outside the firewall via the Internet. The emergence of “application servers” — layers of software sitting on top of operating systems that connect to the Internet — by BEA Systems, IBM, Microsoft, and to much lesser degrees Sun Microsystems and Oracle, stole much of the EAI group’s thunder. The application server became the core of the enterprise universe, at least for a shining moment.
Outlines – or Personal Directories – are the missing link in the information milieu that we see today. Imagine if each of us bloggers could create a set of pages which put our writings in context like a directory. So, now, if I wanted to find out more about WiFi or the Digital Divide and if I know that there is an expert in this area, then I can go to that person’s blog, knowing that I will get a complete perspective through the outline and links, rather than just the new developments. The blogger already has a mental map – a taxonomy, a context – of the space. With transclusion (the ability to connect and show outlines in place), all these individual outlines could be independently linked together to create paths through the web in a way that a search engine or a directory never can.
What’s missing? The language – OPML – is already there. What’s missing is a mass-market outlining tool which can be integrated with blogging. Radio Userland has an outliner. But what’s needed is integration at the blog post level – so that when I am doing a post, besides categorising it, I can also place it appropriately in my directory. Into this ecosystem of personal directories should then come search, and the ability to narrow searches – in a way the RSS search engines are now doing to blogs. They still do not cover verticals or trusted blogs, but that can be expected soon enough.
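To make the idea concrete, here is a minimal sketch of what such a personal directory could look like in OPML, and how a tool might read it back. The topics, URLs and nesting here are invented for illustration, not a fixed standard:

```python
# A sketch of a blogger's personal directory expressed in OPML, and a
# small reader for it. Topics and URLs below are hypothetical.
import xml.etree.ElementTree as ET

OPML = """<opml version="1.0">
  <head><title>My Personal Directory</title></head>
  <body>
    <outline text="WiFi">
      <outline text="Community networks" url="http://example.org/wifi/community"/>
      <outline text="802.11 standards" url="http://example.org/wifi/standards"/>
    </outline>
    <outline text="Digital Divide">
      <outline text="Rural connectivity" url="http://example.org/divide/rural"/>
    </outline>
  </body>
</opml>"""

def topics(opml_text):
    """Return each top-level topic and the links filed under it."""
    root = ET.fromstring(opml_text)
    return {
        topic.get("text"): [child.get("url") for child in topic]
        for topic in root.find("body")
    }

print(topics(OPML))
```

A blogging tool that let each post be filed into such an outline – and that could transclude other people’s outlines at a node – would give exactly the directory-plus-context layer described above.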
What Personal Directories will do is provide a context for viewing information. Instead of just seeing news items as individual specks, we will start seeing the landscape as a whole – through the eyes of the experts. This will create a richer overlay on the world that already exists. The time for a million, linked directories has now come.
Let’s think about a world with personal directories. Imagine we were doing a paper on the Memex. The first step (as I did when I started thinking about this topic) is to go to Google and type “memex”. This is the result we would get. It is a good starting point, but considering that others have probably also explored this topic in great depth, wouldn’t it be useful to (a) be pointed to experts in this area, and (b) get connected to their outlines of the topic?
What is missing in the blogging world is a directory of experts. For now, we could perhaps use Google itself, though it is only a short step from where we are to build this. Imagine if I am searching for a specific topic, and the search could point me to people who have written extensively on that topic, and perhaps whom others consider experts. This information could be gleaned by doing a semantic indexing of blog posts, along with seeing what others turn to the blogger for (for example, which of a blogger’s posts have the most inward links).
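As a toy sketch of the inward-link signal: imagine a crawler has collected who-links-to-whom data per topic; ranking prospective experts is then just counting inbound links. The blog names and link data below are entirely made up:

```python
# Toy ranking of "experts" on a topic by inbound links to their posts.
# The (linking_blog, linked_blog, topic) triples are invented data of
# the kind a blog crawler might collect.
from collections import Counter

links = [
    ("a.example", "wifi-guru.example", "wifi"),
    ("b.example", "wifi-guru.example", "wifi"),
    ("c.example", "wifi-guru.example", "wifi"),
    ("a.example", "divide-watch.example", "wifi"),
    ("b.example", "divide-watch.example", "digital-divide"),
]

def experts(links, topic):
    """Rank blogs by how many inbound links their posts on `topic` attract."""
    counts = Counter(target for _, target, t in links if t == topic)
    return counts.most_common()  # most-linked blog first

print(experts(links, "wifi"))
```

A real system would of course weight links by the linker’s own standing (much as PageRank does) rather than count them flatly, but even the flat count captures the “whom do others turn to” idea.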
Basically, this creates a third alternative for finding information: Yahoo’s directory gives us information on websites, Google’s search gives us information on actual web pages, while our blog search gives us information on experts (who also maintain a blog). If bloggers started maintaining personal directories of the content space they have expertise in, it would provide a mapping of the blogosphere which is richer, more insightful and more up-to-date than anything we have seen before. By taking ideas from ants, brains, memes, and small worlds, the Memex can weave magic.
Tomorrow: Of Stigmergy and Memes
Something interesting is happening in the US telecom industry. As the NYTimes reports, phone companies are moving to flat-rate plans. Geography and distance are truly becoming irrelevant in the voice business, as much of it is now carried over data networks.
These unlimited-use plans offer callers the advantage of predictability and less time spent checking monthly bills. They commonly cost $50 to $60 a month with services like voice mail and caller ID bundled in, making the price only slightly higher than the $48 that American households typically spend on local and long-distance calling, according to the FCC.
Positive responses from customers are good news for an industry that faces a number of incipient threats, including the loss of market share to calls made over the Internet, cellphone-addicted young customers who spurn land lines, and families who swap their second telephone lines for high-speed Internet connections.
Most calls now travel most of their journey over fiber optic lines that connect the whole country. A company’s expense in routing a call depends very little on the distance the call travels, but largely on whether a call needs to travel across lines owned by other phone companies and the access fees charged for that use. In most cases, calling a friend across the country now costs your phone company about as much as calling your next-door neighbor.
Bundling is coming to the “connected world” with flat-rate plans of the future likely to encompass voice (wired and wireless), data and cable.
The Orphans of Invention is the title of a piece by Ellen Ullman in the New York Times, remembering a talk given by Doug Engelbart five years ago:
Back then, Silicon Valley companies were hiring people in great numbers, and a tide of youthful energy entered the field. And for the few years of the boom, the industry was large enough to employ several generations of programmers, from 22-year-old Web-page coders to 50-year-old experts in C++. There was, in that moment, a passing on of generational memory. The audience became aware that computers, though innovative, were not exactly new.
With the bust came the scattering of those generations. The 25-year-olds were fired first, then came the ones in their 30’s; soon came the layoffs of even more senior people. No one was immune. In 2001 and 2002, America lost 560,000 technology jobs. In Silicon Valley alone, 27,000 software positions disappeared between the spring of 2001 and the spring of 2002.
But more than jobs have been lost. To listen to Mr. Engelbart that day almost five years ago was to realize that the computer industry, when it started, was not simply about becoming a chief executive or retiring on stock options at 35. It was to remember that real innovation – the stuff that made computers so much more than “crummy factors of production” – comes from mysterious places, wild people, dreamers and tinkerers, and to remember all the skepticism they had to endure.
Kevin Werbach writes:
Linear progressions, such as the consistent improvement in processing power heralded by Moore’s Law, are fundamentally boring. They are like driving for hours on a straight, featureless highway: You know you’ll eventually get to where you want to go, but the trip itself becomes a blur. If instead the path forward involves stair-step transitions, through which the entire ecosystem reconfigures itself, life is far more exciting. Change is no longer measurable by one variable. It arrives in waves of interconnected developments whose relationship we only dimly discern.
That’s what’s happening today. The technologies and concepts generating buzz at industry gatherings like PC Forum, O’Reilly’s Emerging Technology Conference, and Supernova include social software, the semantic Web, Web logs, rich Internet applications, Web services, unlicensed wireless, grid computing, digital identity, broadband media. The more one looks at these developments, the more hidden connections appear. They are pieces of a larger whole, which we don’t yet have words to describe.
News.com writes about Jupiter, which “binds together at least three of its server applications into a single bundle. Jupiter includes BizTalk Server integration software; Content Management Server, for storing and presenting business documents; and Commerce Server, for building e-commerce Web sites.”
One of Microsoft’s biggest advantages over server software competitors is its dominance of the desktop software market, said analysts. The company is exploiting its desktop software stronghold to bolster its position in industrial-strength server software. For example, Microsoft’s Visio diagramming tool will produce BPEL-compliant code from the design of a multistep business process, according to executives.
Jupiter will also include so-called business activity monitoring tools, which will let someone use an Excel spreadsheet to view the progress of an ongoing business process. For example, a call center manager could query the workflow engine to spot where incoming calls are experiencing long delays.
Jupiter server software will work closely with the InfoPath forms-building capability that Microsoft is building into Office 2003, said Wascha. InfoPath lets a person create a form, defined as an XML document. By tying the form with the XML-based workflow engine in Jupiter, the person could establish the approval steps of a simple purchase-order business process, Wascha explained.
The close coupling of Microsoft’s desktop and server applications, combined with its traditional strengths in pricing and ease-of-use, give its renewed, albeit late, thrust into the server software market a distinctive look, according to analysts.
We need something similar for Linux also. Bundling is the key to making it proliferate. Linux has to succeed on the desktop, and thus needs to position itself as an alternative for new users in emerging markets. The desktop and server spaces need to be targeted in tandem.
Economist celebrates the 30th anniversary of Ethernet. It began life at 3 Mbps and is now touching speeds of 10 Gbps. What can we learn from Ethernet’s ubiquitous success?
The first reason is simplicity. Ethernet never presupposed what sort of medium the data would travel over, be it coaxial cable or radio waves (hence the term ether to describe some undefined path). That made it flexible, able to incorporate improvements without challenging its fundamental design.
Second, it rapidly became an open standard at a time when most data-networking protocols were proprietary. That openness has made for a better business model. It enabled a horde of engineers from around the world to improve the technology as they competed to build inter-operable products.
Third, Ethernet is based on decentralisation. It lets smart end-devices, such as PCs, do the work of plucking the data out of the ether, rather than relying on a central unit to control the way those data are routed. In this way, Ethernet evolved in tandem with improvements in computing power – a factor that was largely overlooked by both critics and proponents when Ethernet was being pooh-poohed in the 1980s and early 1990s.
The Economist writes about how Sun is planning to work with open-source software by giving customers a choice.
Sun unveiled two new low-priced servers based on Intel chips. It also revealed that Oracle had agreed to make its software work on these machines – adding to speculation that Oracle is about to buy Sun. But much more significant was a subtle but crucial shift in the firm’s Linux strategy: as well as Linux, Sun will now also push an Intel-compatible version of Solaris.
Sun’s Jonathan Schwartz may seem to want to have it both ways. But he is trying to capitalise on an important trend. Some software users have started to realise that even Linux is not as free as it appears: for instance, it has to be maintained and upgraded. “Linux is like a puppy – in the beginning it’s great, but you also have to take care of it,” says Mr Schwartz. He hopes that firms will opt for Solaris, because it requires less care.
Simply put, Mr Schwartz wants to give customers a choice. On the one hand, he will offer them an open-source solution, which lets them tinker and benefit from the collective brain power of volunteer developers. On the other, he will offer a proprietary option for customers worried about operational costs.
The problem of information overload has been with us for a long time, and is getting worse. Ray Ozzie puts the situation in context:
Just as the first generation of personal computers was mostly about personal productivity, the first generation of the Internet has largely been about centralized Web sites, used for publishing, transactions and e-mail. For the most part, all seems well and good. At a personal level, however, many of us are overwhelmed. We’re chained to e-mail and the Web, drowning in an information flood that leaves us feeling more and more like human message-processing machines.
Unfortunately, mainstay tools are falling behind our needs. Software was conceived in an era with substantially different requirements. For example, e-mail emerged 30 years ago, when computer viruses, spam and e-mail overload weren’t even on the radar screen. That era could not conceive of a future in which we’d deal daily with online documents and presentations, e-mail and instant messages, Web sites and blogs.
Each of us will soon face hundreds, thousands, or tens of thousands of “inputs” that we’ll need to continuously absorb and coordinate. A world with complex social, economic, organizational and personal interdependencies is inevitable. And as we near this linked future, systems and technologies must evolve or we will simply be unable to cope.
Ozzie believes that “personal productivity tools will become joint productivity tools designed for online use instead of a paper-only world. A rich cadre of collaborative online writing, media management, presentation and consumption tools will move to the forefront of our daily electronic lives.”
The problem the Memex solves is that of rapid retrieval of relevant content from a humongous pool of unstructured content on the Web. Esther Dyson puts this in perspective in the January issue of Release 1.0:
To start, let’s just consider how the Web’s unstructured information can be organized. The two leading approaches are exemplified by Yahoo! and Google. Yahoo! has created a single, very broad taxonomy; although it has not in fact organized everything (!), it offers a directory (taxonomy) structure that in theory should be able to classify any content that shows up. By contrast, Google organizes the Web dynamically: Tell us what you want, and we’ll put it at the center of the world and find you the surrounding information… There’s a trade-off between depth and breadth; the directory offers fine-grained, carefully vetted material, while the search engine offers access to everything…
Yahoo!’s Srinija Srinivasan says: “Directories make most sense when you are browsing, when you want to discover something. Whereas you use search when you know what you are looking for… We can’t possibly manage the entire range of what people might be looking for. The directory was never intended to cover every word of every page out there.”
Google arose from the perspective… that the Web is simply too vast for anyone to define or structure it properly: Best to let each query define its own neighborhood, and to start each search from the query outwards, rather than from some mythical top down, to where the answer lives.
As Yahoo!’s Srinivasan notes, users have turned from browsing directories to searching, from exploring to going after specific results.
Between the two extremes of the centralised approaches – Yahoo’s directory and Google’s search – lies the individual, ant-like, emergent Memex. Constructing the Memex needs the active participation of each of us. As we have seen, the tools to bring Vannevar Bush’s 1945 vision to life are only now becoming available. As writing and self-publishing become easier, individuals are starting to give shape and form to information on the Web, embellishing it with their thoughts and ideas. This is creating a richer, two-way web, built around the blogs, RSS and OPML ecosystems.
Next Week: Constructing the Memex (continued)
Jared Blank writes from his experience of marketing his own blog:
The [Google] keyword program worked well for me, but be patient with results and test many different combinations of words. It took me several weeks to understand broad terms worked better for me than specific keywords. You may have completely different results, depending on your industry. I found it was not worth bidding more than a nickel for higher placement in the results page. My best-performing keywords typically ranked fifth in the listings and performed better than other words that placed higher in the results. Contact other bloggers and ask them to place a link to your Weblog on their pages. This will be even more effective if the blogger is writing about a similar industry. Don’t give up on working your house list. Place a link to the Weblog in each of your newsletters and a link to the Weblog in your e-mail signature file.
A WSJ special report on Technology writes: “Digital technologies are upending the competitive balance across the corporate spectrum.”
It adds: “Safely profitable niches aren’t secure anymore. Longtime industry leaders are being forced to re-examine their basic ways of doing business. Upstarts are on the rise. And who will emerge victorious is anyone’s guess.”
A related story on Linux states: “As Linux grows, it not only stands to win lucrative parts of the server market before Microsoft, but it also threatens to lessen the value of the very software that Microsoft has built its empire on: Windows. It’s a challenge that finally has awakened the industry giant.”
InfoWorld: “UNeDocs, or United Nations extensions for aligned electronic trade documents, was started in 2002 by the UN’s Economic Commission for Europe. The aim is to use XML to create an electronic equivalent for paper trade documents based on existing EDI (Electronic Data Interchange) standards, according to the UNeDocs Web site.”
It is a project we should keep track of.
Boston Globe has an interview with Nicholas Negroponte:
I think WiFi is exactly like the Internet, it’s exactly the same…There’s not only a precedent, there’s a very strong economic model … flower boxes.
Think about it. If you put a flower box outside your house, you’re first of all using your own money to buy the flowers. You’re hanging it out there. You’re doing it for your self-esteem, for the beauty of looking out the window and seeing the flowers, of decorating your house and making it look well. But it also, if everyone on the street puts nice flower boxes out, makes the street look nicer. It happens a little bit on Beacon Hill, it happens a lot in European cities.
Now the theory of flower boxes, if there is such a thing, could be taken to WiFi. I put in a WiFi system in my home for my own use, but it radiates out into the street. There’s no incremental cost for me to let other people use it. There really isn’t. … If everybody does that, then the entire street has broadband. Every park bench has broadband, every convenience store has broadband, and so on.
So if you take that approach, it’s very much like the Internet. You make these resources available by connecting them. The sum of the parts is just much, much greater. And I think that’s what’s going to happen for a major piece of wireless.
Microsoft Research has a paper on Scope, a glanceable notification summarizer. Seems a bit like a Digital Dashboard.
We have designed this simple information visualization tool to help unify notifications and reduce distractions for the user, thus avoiding notification overload. The Scope allows users to remain aware of notifications from multiple sources of information, including e-mail, instant messaging, information alerts, and appointments. The design employs a circular radar-like screen divided into sectors that group different kinds of notifications. The more urgent a notification is, the more centrally it is placed. Visual emphasis and annotation is used to reveal important properties of notifications. Several natural gestures allow users to zoom in on particular regions and to selectively drill down on items.
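The placement rule described in the excerpt – one sector per notification kind, with more urgent items nearer the centre – could be sketched roughly as below. The sector list and the urgency-to-radius mapping are my own assumptions for illustration, not taken from the Microsoft Research paper:

```python
# Rough sketch of a Scope-like radar layout: notifications grouped into
# angular sectors by kind, urgent items placed closer to the centre.
# Sector order and the linear urgency scale are assumptions.
import math

SECTORS = ["email", "im", "alerts", "appointments"]  # one sector each

def place(kind, urgency):
    """Map a notification to polar coordinates (radius, angle in radians).

    urgency is in [0, 1]; 1 = most urgent, plotted nearest the centre.
    """
    radius = 1.0 - urgency               # urgent items sit near the middle
    sector = SECTORS.index(kind)
    width = 2 * math.pi / len(SECTORS)   # angular width of one sector
    angle = sector * width + width / 2   # centre of the sector
    return radius, angle

r, a = place("email", 0.9)
print(r, a)  # a very urgent email, near the middle of the first sector
```

The appeal of this kind of mapping is that a single glance conveys both what a notification is (its sector) and how much it matters (its distance from the centre).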
Linux Journal writes, quoting a report from Finland, that “Free and open-source software are not only a useful and significant tool for the developing countries, but clearly have the potential to help democratization and help find solutions to the most pressing problems faced by the populations of developing countries.”
According to Clay Shirky, grids are not the next big thing that everyone is making them out to be.
Supercomputing on tap won’t live up to this change-the-world billing, because computation isn’t a terribly important part of what people do with computers. This is a lesson we learned with PCs, and it looks like we will be relearning it with Grids.
If users needed Grid-like power, the Grid itself wouldn’t work, because the unused cycles the Grid is going to aggregate wouldn’t exist. Of all the patterns supported by decentralization, from file-sharing to real-time collaboration to supercomputing, supercomputing is the least general.
Networks are most important as ways of linking unevenly distributed resources — I know something you don’t know; you have something I don’t have — and Grid technology will achieve general importance to the degree that it supports those kinds of patterns. The network applications that let us communicate and share in heterogeneous environments, from email to Kazaa, are far more important uses of the network than making all the underlying computers behave as a single supercomputer.