Digital Delusions

Gaurab Raj Upadhaya writes in Himal (August issue) on the irrelevance of digital technology in its current corporate form for the mass market. [Thanks to Mohan Narendran for the link.]

Gaurab echoes a lot of what we have been thinking and doing as part of Emergic. His three-pronged approach focuses on reused hardware, open-source software and content in local languages. An excerpt (it is long, but definitely worth reading):

What is required is the judicious application of digital technology, without any emphasis on its communication aspect. The routine computing process, using minimal hardware and relevant software, is more than adequate for the present. Hardware, for all its industry-driven problems, does not present obstacles in the way of minimising investment costs. In fact, the advantages of minimal depreciation and high obsolescence can be used to the advantage of poor societies. Upgradation of hardware creates wasted capacity in the form of computing equipment that is phased out of networked organisations. These machines are obsolete only because they are connected and not because their inherent utility has been made redundant. As independent machines, reoriented to appropriate ends, they constitute cost-effective and durable resources. Given the steady flow of such equipment that is made available by constant obsolescence, there is little point in developing countries investing in cutting-edge hardware. So long as connectivity is not an issue, reusing discarded equipment is a feasible option. The first step towards the appropriate use of future-proof digital technology is, therefore, to exit from the connectivity loop.

This option also dispenses with the need for incurring huge expenditures in research and development that are necessitated by the search for indigenously developed, specialised hardware resources, which in any case might lead nowhere.

The main constraint to the extended use of information technology in the developing world is therefore appropriate software. This is the key area of concern since no second-hand solutions are available. Developing software is particularly difficult. The irony of South Asia is that it provides software professionals at all levels to the global industry, but does not have a single major application in any of the local languages, let alone a programme geared for local needs. If development funds for ICT are to be invested anywhere, it is in software creation and let the donor agencies hear this loud and clear.

This involves two related issues, namely the principles of using programming code and investment in training, research and development. The only way towards the creation of less expensive software technologies that are not based on the assumptions of the networked world is to participate in the arena of non-proprietary code. Globally, programming runs on two different principles. There is the proprietary system generally adopted by large corporates, wherein the original code of the programme is not available in the public domain. Against this, there is the principle of open-source development, according to which, whoever develops a programme releases its code in the public domain, for use in whatever form by other software developers.

Open-source is about the free development of software that is based on design ideas that have been collaboratively developed. This considerably reduces the time and effort spent in developing programmes and is the key to generating locally adapted solutions. All this makes for a great deal of flexibility in creating software options, including the incorporation of local language computing into the system, which otherwise is difficult to do.
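A toy sketch of what that flexibility looks like in practice: with the source open, anyone can add a message catalogue for a new language without the vendor's permission. The dictionary below stands in for a real gettext .po file, and the Hindi strings are illustrative:

```python
# A minimal sketch of local-language computing on open code.
# The catalogue is a plain dict standing in for a gettext .po file;
# the Hindi strings are illustrative translations.

MESSAGES = {
    "hi": {  # Hindi (Devanagari script, Unicode)
        "File": "फ़ाइल",
        "Open": "खोलें",
        "Save": "सहेजें",
    },
}

def translate(text: str, lang: str) -> str:
    """Return the localized string, falling back to English."""
    return MESSAGES.get(lang, {}).get(text, text)

for label in ("File", "Open", "Save", "Quit"):
    print(translate(label, "hi"))   # "Quit" falls back to English
```

Adding a new language is a matter of adding one more catalogue; no proprietary vendor sits in the loop.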

Need for a Linux Ecosystem

Writes ZDNet: “Open source tools are becoming more common in large businesses, but analysts say that a perceived lack of service and support options could hold back growth.”

I think what the Linux community needs to do is to build an ecosystem of companies to rival that which Microsoft has created. This needs to go right from companies (or individuals) that can install and support the software, to independent software vendors who can build applications, to the availability of mainstream applications on Linux, to training for end-users and system admins.

The game is now no longer about technology – yes, Linux cannot run every Windows application written. It is about first getting Linux in the door at companies, and that can only happen when there is awareness that Linux is fully supported by multiple vendors – not just the IBMs of the world, but also the smaller companies.

Linux needs champions in every geographical zone and every vertical around whom others can coalesce. Linux needs to build its set of complementors, the way Intel and Microsoft did in the 1980s and 1990s.

Indian Languages and Linux

A Linux Journal story on Indian Language Solutions for GNU/Linux mentions two of my Netcore team members – Prakash Advani and G. Karunakar.

Here’s why it is important to support the various languages, according to Frederick Noronha: “Some Indian regional languages are larger than those spoken by whole countries elsewhere. Hindi, with 366 million speakers, is second only to Mandarin Chinese. Telugu has 69 million; Marathi, 68 million; and Tamil, 66 million. Sixteen of the top 70 global languages are Indian languages with more than 10 million speakers. Other languages spoken in India are also spoken elsewhere. Bengali has 207 million speakers in India and Bangladesh, and Urdu has 60 million in Pakistan and India.”

The Emerging SuperPortlet

From PortalsMag.com:

First-generation “portalized” views of enterprise applications, as presented by portlets, gadgets or other branded tools, give users a scaled-down, customized look at the larger software systems they represent in a single desktop window.
That’s certainly useful; however, these mini-applications could soon give way to a new troupe of so-called composite applications, which the Delphi Group has recently identified as InfoServices.

According to John Kunze, chief executive officer of Plumtree Software, composite applications are created by grabbing functions from multiple front- and back-office applications, including CRM, ERP, and others, and stitching them together. The technological threads here are Web services, particularly XML (extensible markup language) and SOAP (simple object access protocol). “The portal is no longer just about creating an entry point into disparate applications and content; it has more and more to do with building completely new applications,” Kunze says. “These composite applications, which are part of a layer we [Plumtree] call foundation services, is also an indication of how the portal is shifting towards the Enterprise Web.”

“Composite applications are the driving market trend here,” says Haridas Nair of Sybase. “The ability to build a customized application from multiple applications–and then to make it available through the portal–is extremely powerful for the enterprise.”

This emerging breed of applications also represents one of the true promises of Web services: application reuse. As opposed to continually purchasing applications or upgrading existing ones, Web services technologies potentially let you repurpose and combine pieces of existing applications, and then expose the results to the appropriate users through the portal.
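In code, a composite application reduces to exactly this stitching step. A minimal sketch, with the CRM and ERP payloads invented for illustration and the actual SOAP calls elided:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML payloads of the kind a CRM and an ERP system might
# return over SOAP; in practice these would come from live web-service
# calls rather than inline strings.
CRM_RESPONSE = '<customer id="C42"><name>Acme Corp</name></customer>'
ERP_RESPONSE = """<orders customer="C42">
  <order id="O1" total="1200"/>
  <order id="O2" total="800"/>
</orders>"""

def composite_view(crm_xml: str, erp_xml: str) -> dict:
    """Stitch customer data and order data into one composite record."""
    customer = ET.fromstring(crm_xml)
    orders = ET.fromstring(erp_xml)
    return {
        "customer": customer.findtext("name"),
        "orders": len(orders.findall("order")),
        "total": sum(int(o.get("total")) for o in orders.findall("order")),
    }

print(composite_view(CRM_RESPONSE, ERP_RESPONSE))
# prints {'customer': 'Acme Corp', 'orders': 2, 'total': 2000}
```

Because both systems speak XML, the "stitching" is just parsing and joining; neither application needs to know about the other.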

A related discussion on portals is in a story by Line56, based on a report by Illuminata, which favours the integration-centric portal over the information-aggregating portal.

This discussion is very relevant for our Digital Dashboard.

Microsoft’s new Web Services Strategy

The focus is on business processes and collaboration, writes Information Week: “Microsoft’s web-services strategy is shifting to the more strategic, and tougher, set of problems involving business processes. Products in development, described for the first time last week, aim not only to connect employees and companywide operations but also to improve collaboration and maybe even reinvent processes.”

Here’s a glimpse of the future:

In Microsoft’s future architecture, companies will use Web services to simplify workflows, let employees sift through business data using familiar desktop apps, and establish business-to-business hookups. Solutia Inc., a $2.8 billion-a-year manufacturer of carpet fiber and specialty chemicals, is testing XDocs as a way of extending the benefits of XML to small suppliers that aren’t investing in it themselves. “Not everyone we deal with is a huge company,” says Art Huggard, director of digital strategy.

With XDocs, Solutia can create order forms that store each field’s entries in XML format. “You have an interface that looks a lot like what they’re used to,” Huggard says. Suppliers complete the forms and send them back via E-mail, where a BizTalk server grabs the data and sends it to SAP.
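The point of storing each field's entry in XML is that downstream systems can read the data directly, with no screen-scraping. A minimal sketch of such an order form and its server-side extraction (the element names are invented for illustration, not the actual XDocs schema):

```python
import xml.etree.ElementTree as ET

# An illustrative order form of the kind the article describes: each
# field the supplier fills in is stored as a tagged XML element.
form_xml = """<orderForm>
  <supplier>Smallco Fibers</supplier>
  <item sku="CF-100" quantity="250"/>
  <deliveryDate>2002-11-15</deliveryDate>
</orderForm>"""

form = ET.fromstring(form_xml)
# A server (BizTalk, in Solutia's setup) can pull the fields out
# directly, because each entry is already tagged.
record = {
    "supplier": form.findtext("supplier"),
    "sku": form.find("item").get("sku"),
    "quantity": int(form.find("item").get("quantity")),
    "delivery": form.findtext("deliveryDate"),
}
print(record)
```

The supplier sees an ordinary form; the receiving server sees structured data ready to hand to SAP.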

OSAF, Mitch Kapor and PIM

Open Source Applications Foundation and Mitch Kapor’s Weblog. What I liked is the project Mitch and his team are working on: “A new take on the Personal Information Manager. It will handle email, appointments, contacts and tasks, as well as be used to exchange information with other people, and do it all in the spirit of Lotus Agenda. Agenda, for those who aren’t familiar with it, was a DOS product I designed (along with Jerry Kaplan) in the late 1980’s which introduced a new kind of database optimized for entering small items of information in a free-form manner, and then adding organizational categories on-the-fly. It was much beloved by a few, despite (or perhaps because) being abandoned by Lotus.”

Here’s more:

We are trying to make a PIM which is substantive enough and enticing enough to make people want to move to it from whatever they are currently using, which statistically is probably Microsoft Outlook. I’m not going to bash Outlook here. Suffice it to say that while feature-rich, it is very complex, which renders most of its functionality moot. Its information sharing features require use of Microsoft Exchange, a server-based product, which is both expensive and complex to administer. Exchange is overkill for small-to-medium organizations, which we think creates an opportunity we intend to pursue (as well, of course, as serving individual users).

Have I mentioned it’s going to run on Macintosh, Linux, and Windows and will not require a server? This is an ambitious goal, but we are convinced it is possible to achieve using a cross-platform tool kit. (We are working with wxWindows/wxPython).

Also, everything is going to be fully open sourced.
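The Agenda model Kapor describes (free-form items first, organizational categories attached on the fly) can be sketched in a few lines. The class and method names here are illustrative, not Chandler's actual API:

```python
# A sketch of the Agenda idea: items are entered free-form, and
# categories are invented and attached afterwards, on the fly.

class ItemStore:
    def __init__(self):
        self.items = []  # list of (text, set-of-categories) pairs

    def add(self, text, *categories):
        """Enter a free-form item, with or without categories."""
        self.items.append((text, set(categories)))

    def categorize(self, fragment, category):
        """Attach a (possibly brand-new) category to matching items."""
        for text, cats in self.items:
            if fragment.lower() in text.lower():
                cats.add(category)

    def view(self, category):
        """An Agenda-style 'view': all items under one category."""
        return [text for text, cats in self.items if category in cats]

store = ItemStore()
store.add("Call Jerry about the demo", "phone")
store.add("Lunch with Jerry on Friday")
store.categorize("Jerry", "people/jerry")   # category created on the fly
print(store.view("people/jerry"))
# prints ['Call Jerry about the demo', 'Lunch with Jerry on Friday']
```

The key inversion is that structure follows data: no schema has to exist before the items are entered.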

A more detailed report comes from Dan Gillmor:

Kapor and his small team have been working on what they’re calling an open-source “Interpersonal Information Manager.” The software is being designed to securely handle personal e-mail, calendars, contacts and other such data in new ways, and to make it simple to collaborate and share information with others without having to run powerful, expensive server computers.

As with other open-source software, the source code (programming instructions) will be freely available along with the working program. An early version of the calendar part of the software should be posted on the Web by the end of this year, and version 1.0 of the whole thing is slated for the end of 2003 or early 2004.

If the software lives up to the developers’ plans, it will have wide appeal. It should be highly adaptable to personal tastes, with robust collaborative features. I’m especially hopeful about a feature to build in strong encryption in a way that lets users protect their privacy without having to think about it.

The Chandler architecture builds on other open-source projects. These include Python, a development language and environment that’s gaining more and more fans among programmers, and Jabber, a communications infrastructure that started life as an instant-messaging alternative but has evolved into a robust platform of its own.

We should look at the OSAF’s Technology page for the technologies that they find most promising and the reasons: wxWindows / wxPython, Python, Zope Object Database (ZODB), Jabber, RDF, Mozilla.

TECH TALK: Technology’s Next Markets: The Building Blocks

Here then is the challenge before our Deviant Entrepreneur: put together a set of technology solutions for the next 500 million users (comprising consumers and employees in enterprises) in the world’s emerging markets:

  • Computers for USD 100 (Rs 5,000), so that there can be one in every home and office
  • A better, more intuitive desktop on the computer, making it easier to navigate
  • Ubiquitous, cheap, high-speed wireless communications
  • Software as a service for USD 5-10 (Rs 250-500) per month, so that it is affordable
  • Zero-latency, real-time presentation of information, because this is what their partners, customers and peers in the developed markets are likely to have
  • Seamless integration of information across the extended enterprise
  • A single, unified database, such that information is entered only once and does not reside in silos
  • Leveraging the tacit knowledge that lies within people, because even though one person may not know everything, as a collective, they can know it all

    [We’ll present the rest of the series as an imaginary conversation between the Deviant Entrepreneur (DE) and Tech Talk (TT).]

    TT: So, DE, what is your motivation in targeting the next users?

    DE: As was discussed in the past few columns, technology faces a schism: on the one hand, there is a maturing set of users in the current markets (the developed nations of the world), while on the other hand, there is the rest of the world, comprising over 4 billion people who’ve yet to taste computing. The set of technologies that has been created has overshot the needs of the current set of users. Yet, these technologies are too expensive for the next users. This has created an opportunity for disruptive innovations, which are low-cost, simpler and leverage the state of the art. This is exactly what I propose to do: create affordable technology solutions for consumers and enterprises in the world’s emerging markets.

    TT: What is your starting point?

    DE: The two building blocks are computers and connectivity. Computers need to be made available to all: in every home and on every desk. This is exactly what Bill Gates set out to do, and has done very well, for the first 500 million users. Networked computers have been at the heart of the technological revolution in the developed nations, which adopted computers for automation and productivity in the 1980s, and then connected them to each other in the 1990s.

    Ironically, at present, most of the world’s computer and telecom industries are in a state of flux: PC sellers are wondering how the industry is going to start growing again, while telecom companies find themselves submerged under debt and competition. Their current users are not adopting their solutions at historic rates, and the price-points are too high for the next users to adopt computing.

    The starting point for the revolution is to make networked computing a reality for the next users at low price-points. For this, we need to first bring together a triad of ideas: computers for USD 100, software for USD 5-10 per month, and broadband connectivity for less than USD 10 per month.
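A quick back-of-the-envelope check that the triad lands in an affordable range. The 36-month hardware amortisation period is my assumption for illustration, not a figure from the column:

```python
# Back-of-the-envelope affordability check for the triad, using the
# price points above. The 36-month amortisation is an assumption.
computer_usd = 100
software_usd_per_month = 10     # top of the USD 5-10 range
broadband_usd_per_month = 10
months = 36                     # assumed hardware amortisation period

monthly = computer_usd / months + software_usd_per_month + broadband_usd_per_month
print(f"~${monthly:.2f} per month")   # prints ~$22.78 per month
```

Even at the top of each range, the total is on the order of USD 20-25 a month, an order of magnitude below what a developed-market setup costs.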

    The challenge is to do all this with minimal R&D budgets and minimal loss of time. The components to put this together already exist. What is needed is innovative, value-added aggregation.

    Tomorrow: Recycled Computers