Mirror Worlds

I read David Gelernter’s book “Mirror Worlds” nearly a decade ago. I was fascinated by it, but didn’t quite understand the full implications of what he was saying. Reading Steven Johnson’s column about the book makes me think I should go and re-read it.

In 1991, computer scientist David Gelernter of Yale University predicted in his book Mirror Worlds that advances in computing power and connectivity would lead to the creation of virtual cities: micro versions of the real world built out of data streams and algorithms instead of bricks and concrete…Fast-forward a decade, and evidence of Gelernter’s prescience abounds. Millions of people are active participants in virtual worlds that possess the economic and creative vitality of actual communities. The Net denizens who have built a homestead in massively multiplayer games like The Sims Online (www.thesimsonline.com) are the digital world’s equivalent of the postwar immigration to California. The worlds are so vivid that the players now take the virtual objects that they’ve accumulated in these games (swords, houses, entire characters) and sell them in online auctions for real-world currencies.

In a true mirror world, data would be mapped onto recognizable shapes from real life. For instance, to find information on a local hospital, you would locate the building on a computerized map and click on it with an “inspector” tool. Within seconds, the big-picture data about the facility would come into focus: number of patients and doctors, annual budget, how many patients died in operating rooms last year, and more. If you were looking for more specific information (say you were considering giving birth at the hospital), you could zoom in to the obstetrics department, where you would see data on such subjects as successful births, premature babies, and stillbirths. Information about how the hospital connects to the wider city, what Gelernter calls topsight, could be had by zooming out.
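To make the inspector-and-zoom idea concrete, here is a minimal sketch (my own, not Gelernter’s; all names and figures are hypothetical) of how a mirror world might expose a hospital’s data at different zoom levels:

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One inspectable entity in the mirror world (a hospital, a department)."""
    name: str
    stats: dict  # big-picture data shown when the inspector clicks this node
    children: dict = field(default_factory=dict)  # finer-grained units to zoom into

# Hypothetical hospital: the top level gives the big picture, children give detail.
hospital = Node(
    name="City General Hospital",
    stats={"patients": 12000, "doctors": 340, "annual_budget_usd": 85_000_000},
    children={
        "obstetrics": Node(
            name="Obstetrics",
            stats={"successful_births": 2100, "premature": 160, "stillbirths": 12},
        )
    },
)

def inspect(node: Node, *path: str) -> dict:
    """Zoom in along a path of department names and return that level's stats."""
    for step in path:
        node = node.children[step]
    return node.stats

print(inspect(hospital))                # big picture
print(inspect(hospital, "obstetrics"))  # zoomed in
```

Topsight would work the same way in reverse: walk up the tree and aggregate the children’s stats into a city-level view.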

Another key feature of Gelernter’s vision is what he calls narrative information systems. The data in a mirror world are time-based: The mortality rate at a hospital varies from month to month and from year to year, and a mirror world would record those changes. So with any variable (or combination of variables), you could reverse the data stream to see past conditions. This is a tool not only for making sense of the past but also for predicting the future: If you’re in the middle of an economic downturn and you’re thinking of moving to a new neighborhood, you might like to see how the real estate values fared during previous recessions. With a mirror world, you would select a neighborhood (or a city block, if you wanted that much detail) with the inspector tool and shuttle the data stream to 1990 or the mid-1970s or the late 1920s, as though you were rewinding a VHS tape.
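A sketch of the rewind idea, assuming the mirror world stores each variable as a time-stamped series (the dates and prices below are invented):

```python
import bisect
from datetime import date

# Hypothetical series: (date, median home price) for one neighborhood.
prices = [
    (date(1975, 6, 1), 41_000),
    (date(1990, 6, 1), 182_000),
    (date(2003, 6, 1), 310_000),
]

def rewind(series, when: date):
    """Return the most recent observation at or before `when`,
    like shuttling the mirror world's data stream back in time."""
    dates = [d for d, _ in series]
    i = bisect.bisect_right(dates, when) - 1
    if i < 0:
        raise ValueError("no data that far back")
    return series[i]

print(rewind(prices, date(1990, 12, 31)))  # -> (date(1990, 6, 1), 182000)
```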

OPML Directory Browser

Dave Winer has a detailed explanation; his goal is “to get a set of work-alike directory browsers and outliner authoring tools in a variety of different environments, open source and commercial, much as there are many different implementations of XML-RPC, SOAP and RSS. I want to demo these at OSCOM on May 28.”
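For flavor, a minimal sketch of the browsing half: reading an OPML outline and rendering it as a directory tree. The sample directory is invented; a real browser would fetch OPML over HTTP and make the link nodes clickable.

```python
import xml.etree.ElementTree as ET

# A tiny OPML directory, the format a work-alike browser would read.
OPML = """<opml version="1.1">
  <head><title>Sample directory</title></head>
  <body>
    <outline text="Weblogs">
      <outline text="Scripting News" type="link" url="http://www.scripting.com/"/>
    </outline>
  </body>
</opml>"""

def browse(outline, depth=0):
    """Print the outline tree the way a directory browser would render it."""
    for node in outline.findall("outline"):
        label = node.get("text", "")
        url = node.get("url")
        print("  " * depth + (f"{label} -> {url}" if url else label))
        browse(node, depth + 1)

root = ET.fromstring(OPML)
browse(root.find("body"))
```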


Content Pipelines

Clemens Vasters has an interesting idea:

The weblog infrastructure that I am (still, due to little free time) building has its own aggregation system that flows aggregated content through a pipeline until it’s pushed into the storage system. So, what the system does is to pull content from RSS feeds, from Exchange public folders, websites and other sources (the “feed readers” are pluggable), map everything into a common representation and flow articles through the pipeline. The stages in the pipeline can look at the content and make adjustments (fix up HTML), do filtering (assign categories) and, most importantly, can enrich the content with metadata. So, when the system is pulling information from an RSS source, it may invoke a stage that runs all the words in the article against a dictionary and adds links to a site like dictionary.com, it may invoke a stage that finds relevant books on amazon.com, or a stage to get Google links, or even a stage that translates the original Russian text into German for me, adding all that additional information to the “extended metadata” of the article. Everything is pluggable.
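A minimal sketch of that architecture, assuming in-process stages (Vasters’s system is pluggable and distributed; every reader and stage name here is hypothetical):

```python
def rss_reader():
    """Pluggable 'feed reader': a stand-in for a real RSS fetch."""
    yield {"title": "Content Pipelines",
           "body": " flows articles through stages ",
           "meta": {}}

def fix_html(item):
    item["body"] = item["body"].strip()  # e.g. tidy up the markup
    return item

def categorize(item):
    item["meta"]["category"] = "weblogs" if "articles" in item["body"] else "misc"
    return item

def add_dictionary_links(item):
    # A real stage might call dictionary.com; here we just record the words.
    item["meta"]["lookup"] = sorted(set(item["body"].split()))
    return item

PIPELINE = [fix_html, categorize, add_dictionary_links]  # pluggable stage list

def run(reader, stages):
    """Flow every item from the reader through each stage in order."""
    for item in reader():
        for stage in stages:
            item = stage(item)
        yield item  # the next step would push the enriched item into storage

for enriched in run(rss_reader, PIPELINE):
    print(enriched["meta"])
```

Because the common representation is just a dictionary with a metadata slot, adding an Amazon or Babelfish stage means writing one more function and appending it to the list.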

Here’s the idea: I really don’t want to write the Amazon, Google, Dictionary and Babelfish stages myself. What I’d rather do is enlist those sites as web services in my pipeline. Using one-way messaging and WS-Routing, I could say “here’s an article, add your metadata to it and send it back to me, or to the next pipeline stage here at my system or elsewhere, when you’re done”. Or I could just walk up to an RSS provider and say, “don’t reply to me directly, please route back to me through these stages”.

So, if such a distributed infrastructure existed, and you aggregated this entry “backrouted” through a pipeline of filters provided by Weather.com, Google.com, Dictionary.com and Amazon.com, you’d have the weather for Athens and Madrid, all relevant Google links, books on “content” and/or “pipelines” and WS-Routing, and links to explanations of all non-trivial words in this text. How’s that?
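A toy sketch of the backrouting idea: the message itself carries its remaining forward path, roughly the way a WS-Routing header carries via entries, and each hop consumes one entry, enriches the article, and forwards it onward. The registry stands in for real network endpoints, and all names are invented:

```python
SERVICES = {}  # registry standing in for real network endpoints

def service(name):
    """Register a function under a hypothetical endpoint name."""
    def register(fn):
        SERVICES[name] = fn
        return fn
    return register

@service("weather.example")
def weather(msg):
    msg["meta"]["weather"] = {"Athens": "sunny", "Madrid": "22C"}

@service("books.example")
def books(msg):
    msg["meta"]["books"] = ["titles on 'content pipelines' (hypothetical lookup)"]

def send(msg):
    """Forward the message along its remaining path, one hop at a time."""
    while msg["path"]:
        hop = msg["path"].pop(0)  # like consuming the next via entry
        SERVICES[hop](msg)
    return msg  # path exhausted: the enriched entry is back home

entry = {"body": "this entry", "meta": {},
         "path": ["weather.example", "books.example"]}
print(send(entry)["meta"])
```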

Semantic Blog

Writes Jon Udell:

As we consume more of our information by way of RSS feeds, the inability to store, index, and precisely search those feeds becomes more painful. I’d like to be able to work with my RSS data locally, even while offline, in much more powerful ways.

I’ve long dreamed of using RSS to produce and consume XML content. We’re so close. RSS content is HTML, which is almost XHTML, a gap that HTML Tidy can close. In current practice, the meat of an RSS item appears in the “description” tag, either as an HTML-escaped (aka entity-encoded) string or as a CDATA element. As has been often observed, it’d be really cool to have the option to use XHTML as well. Then I could write blog items in which the “pre” tag, or perhaps a class="codeFragment" attribute, marks regions for precise search. You or I could aggregate those items into personal XPath-aware databases in order to do those searches locally (perhaps even offline), and public aggregators could offer the same capability over the Web.
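A sketch of what that local, XPath-aware search could look like, assuming the item’s description carries well-formed XHTML; the sample item and the lxml-based query are mine, not Udell’s:

```python
from lxml import etree  # third-party: pip install lxml

# A hypothetical RSS item whose description is real XHTML, not an
# escaped string, so code fragments survive as addressable elements.
ITEM = """<item xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <title>Semantic Blog</title>
  <description>
    <xhtml:div>
      <xhtml:p>Here is the query:</xhtml:p>
      <xhtml:pre class="codeFragment">//pre[@class='codeFragment']</xhtml:pre>
    </xhtml:div>
  </description>
</item>"""

doc = etree.fromstring(ITEM.encode())
ns = {"x": "http://www.w3.org/1999/xhtml"}

# Precise search: only the regions the author marked as code, nothing else.
for frag in doc.xpath("//x:pre[@class='codeFragment']", namespaces=ns):
    print(frag.text)
```

With entity-encoded descriptions, that pre element is just characters inside a string; with XHTML it is a node an aggregator can index and query directly.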

TECH TALK: Transforming Rural India: Village InfoGrid (Part 2)

One set of institutions that needs to be part of the Village InfoGrid is engineering colleges, which can play an important role both in developing software applications relevant to the rural segment and in providing technical support to nearby villages. By stimulating the creativity of young minds, we can create a win-win situation: students get interesting and practical projects to do in their final year of college, and villages get the technology talent they need to create content and software for the TeleInfoCentres and the InfoGrid.

An interesting idea for making villages attractive by clustering them together is outlined by India’s President APJ Kalam. It is a scheme called PURA (Providing Urban Amenities in Rural Areas), and it aims to make rural areas as attractive to investors as cities are. Then, rural areas too will generate urban-style employment to halt (if not reverse) rural-urban migration. The scheme envisages:

  • Linking a loop of villages by a ring road about 30 km in circumference, with frequent bus services. That will integrate the population of all connected villages into one market. Those villages then become a virtual city with the potential to expand and accommodate a population of 3-5 lakhs.
  • Compensating farmers for the land acquired from them NOT with a lump sum but with an annual fee equal to twice the price of the produce they grow. That gives farmers a perpetual, inflation-protected income.
  • Sub-leasing the land to employers both for business and for employee residences within walking distance of each other. That will virtually eliminate daily commuting to work, an unavoidable evil in city living.
  • The President wrote recently in India Today on PURA: “The model envisages a habitat designed to improve the quality of life in rural places and makes special suggestions to improve urban congestion too. As against a conventional city, say rectangular in shape and measuring approximately 10 km by 6 km, this model considers a ring-shaped town integrating a minimum of 8 to 10 villages in the same area. This model provides easy access to villages, saves transportation time, cuts costs substantially and is more convenient for people in general. Such a circular-connectivity model of rural village complexes will accelerate the rural development process through empowerment.” (A quick check of the “same area” comparison follows this list.)
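As a rough sanity check of the “same area” claim (my arithmetic, not the President’s): a 30 km ring road does enclose roughly the area of the 10 km by 6 km rectangle.

```latex
% Area enclosed by a 30 km ring road vs. a 10 km x 6 km rectangular city
C = 2\pi r \;\Rightarrow\; r = \frac{30}{2\pi} \approx 4.77\ \text{km}
\qquad
A_{\text{ring}} = \pi r^2 \approx 71.6\ \text{km}^2
\qquad
A_{\text{rect}} = 10 \times 6 = 60\ \text{km}^2
```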

On this idea, overlay technology with TeleInfoCentres connected as part of the Village InfoGrid, and we have an architecture that now fully integrates the village into the networked world, both physically and virtually.

By building technology centres in the villages and connecting them together, we are leapfrogging a whole set of people from an era in which they could interact with only a handful of others to one in which they can peer with many more like them, irrespective of distance. It is much like how the Internet connected diverse and isolated networks in its early days. The Village InfoGrid is the first step towards making the global village a reality.

Tomorrow: Intelligent, Real-Time Governance
