IBM’s On-Demand World

IBM’s Sam Palmisano outlined his view of the coming era in computing. Writes WSJ:

In IBM’s view, the business and technology industries are entering a new era called “on-demand.” In this era, companies will have to respond rapidly to customers’ demands, market opportunities and external threats. To do that, Mr. Palmisano said it will require technology that is based on open standards, can easily be integrated, and can identify and fix problems itself. According to the executive, this new model will save companies money and will reduce the complexity of systems.

On IBM’s USD 10 billion investment in on-demand computing, the Journal adds: “[Palmisano] described a future in which huge computer networks are made up of powerful, self-repairing machines. Ultimately, IBM foresees that the combination of these networks and other advances such as grid computing will allow businesses to buy computing power on demand, similar to the way electricity is purchased…. In IBM’s vision, on-demand networks will rapidly adjust to spikes in usage or to disasters such as fires or floods to keep businesses up and running. They will use standards to maintain interoperability between different varieties of hardware and software, and they will offer customers more flexibility in payment and in choosing services rendered. Companies in such an environment, Palmisano said, could contract with an outside partner and, say, pay by the month for a financial accounting system rather than maintaining it in-house. IBM says that by adopting the on-demand concept, customers could save money and gain competitive advantage by gaining access to more responsive computing capabilities.”

TECH TALK: Technology’s Next Markets: The Software Edge (Part 2)

Tech Talk: Let’s talk next about Weblogs and RSS.

Deviant Entrepreneur: Weblogs are personal journals. They are a relatively new concept, but really should have come first on the Web! In the early days of the web, everyone wanted to create home pages. The problem was that these pages were static and no one really visited them. Weblogs are an extension of oneself and, more importantly, of one’s thinking. They are updated regularly with whatever the blogger finds interesting. This facet makes them valuable to others because they mine the blogger’s brain. In doing so, they capture the tacit knowledge that lies within each of us.

Bloggers do not exist alone. A community of bloggers is needed to create the flow of ideas. Because everyone is writing in their own space and not on a shared, public bulletin board, blogging brings out the best in individuals and teams.

RSS (Rich Site Summary) feeds from blogs and news sites play an enabling role in amplifying the information processing ability of people. RSS feeds are XML documents and can be processed by news aggregators. By leveraging publish-subscribe events, they bring updated information to users, rather than making them go from site to site seeking out what’s new. RSS feeds can also be used to distribute enterprise events, which can be blogged with comments and then re-distributed as updated items via the RSS feed.
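
Because an RSS feed is plain XML, extracting what’s new takes only a few lines. Here is a minimal sketch of what an aggregator does, using Python’s standard library; the feed content and URLs are made up for illustration:

```python
import xml.etree.ElementTree as ET

# A tiny RSS 2.0 feed, as an aggregator might have just fetched it
# (titles and links are hypothetical).
RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Emergic</title>
    <item><title>On-Demand Computing</title><link>http://example.com/1</link></item>
    <item><title>The Software Edge</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def latest_items(feed_xml):
    """Return (title, link) pairs, the way an aggregator surfaces updates."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in latest_items(RSS):
    print(title, "->", link)
```

A real aggregator simply polls each subscribed feed on a schedule and shows the items it has not seen before, so the user never has to visit the sites themselves.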

Thus, knowledge blogs, or k-logs as they are called, and RSS are instrumental in building out the two-way web, both on the Internet and within enterprises.

TT: What about the Digital Dashboard?

DE: The Digital Dashboard brings all the information and applications together on a single screen. It is like the My Yahoo page that we are so familiar with. The difference is that the Dashboard aggregates not just personal but also enterprise information. For many of the new users, the Dashboard will become the de facto desktop. They will not have to worry about all the applications; the Dashboard will become the gateway to them. It will also take care of sending out alerts when events take place, thus ensuring the last-mile connectivity between information and the individual.
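
The core of such a dashboard is just an aggregator with an alert channel. A minimal sketch, with entirely hypothetical source names and items:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a Digital Dashboard: one screen aggregating
# personal and enterprise sources, plus alerts for important events.
@dataclass
class Dashboard:
    panes: dict = field(default_factory=dict)   # source name -> list of items
    alerts: list = field(default_factory=list)

    def publish(self, source, item, alert=False):
        self.panes.setdefault(source, []).append(item)
        if alert:
            self.alerts.append((source, item))

    def render(self):
        # The single-screen view: every source, newest item first.
        return {src: items[::-1] for src, items in self.panes.items()}

dash = Dashboard()
dash.publish("email", "Quote approved by Acme")
dash.publish("erp", "Inventory below reorder level", alert=True)
print(dash.render())
print(dash.alerts)
```

The point of the design is that the user looks at one `render()` view instead of opening each application, and the alert list delivers the last-mile notification the moment an event occurs.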

TT: How does Information Visualisation fit in?

DE: Rather than limiting users to the file-folder-directory metaphor for navigation, we can use ideas from information visualisation (especially the world of video games) to create richer and more immersive, experience-oriented worlds. This is where a lot of the plentiful computing power that is becoming available can be leveraged.

TT: You also mentioned the need for support in local languages.

DE: Yes. The next users are not necessarily going to have English as their primary language. We will need to support all the different languages: from Indian languages like Marathi, Bengali and Telugu, to Asian languages like Chinese and Thai, to others like Spanish and Portuguese. Multi-lingual support needs to be ingrained into the content and software for tomorrow’s users.

TT: What are your thoughts on pricing?

DE: Software has to become a service. It has to be offered on subscription for the equivalent of USD 2-10 (Rs 100-500) per month for each user. Users should be able to decide what modules they want and pay for them on a need basis. What this does is bring down the up-front investments needed by the users. It is also a recognition of the incremental development that is needed. In this case, software is more akin to a TV soap opera than a film, with new installments being put out regularly and frequently, rather than the big-bang, all-or-nothing approach of a film.
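
A back-of-the-envelope comparison shows why subscriptions lower the barrier. All figures here are hypothetical, with the monthly fee taken from the USD 2-10 range above:

```python
# Hypothetical first-year cost comparison for a 50-user enterprise:
# subscription vs. a one-time, up-front per-user licence.
users = 50
monthly_fee = 5          # USD per user per month, within the 2-10 range
upfront_licence = 300    # USD per user, paid once (illustrative)

subscription_year1 = users * monthly_fee * 12   # pay as you go
upfront_year1 = users * upfront_licence         # pay all at once
print(subscription_year1, upfront_year1)
```

The subscriber pays a fraction of the up-front figure in year one, and can drop modules (or the service itself) if it is not delivering value, which is exactly the soap-opera model: pay per installment, not per film.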

Software is where the real value in the solution will come from. It is the blades, as compared to the computing and WiFi razors. Software development also plays to the unique strengths of the emerging markets: unlike hardware, it does not require large investments.

Tomorrow: Making It Happen


Business Week writes about The New Push for E-Government in the US:

Technological innovation in government has long been an oxymoron. Bureaucrats hate change — and the very concept of streamlining. But what happens when the immovable object meets an irresistible force called the Internet? The cheap computing, fast connectivity, and easy-to-use interfaces that characterize the Web are just too powerful to ignore as tools for making government more efficient.

Among other things, they allow citizens to take over tasks — such as deciding what types of benefits or grants they should apply for — that once were the domain of clerks. Give government agencies the ability to easily share data and communicate, moreover, and they’ll enjoy the same productivity gains that the Net has produced for businesses.

Over the past couple of years, the push for e-government has taken on a new sense of urgency. A ballooning federal budget deficit and the looming retirement from the taxpaying ranks of the massive baby-boom generation mean that over the next 20 years, governments at all levels will probably have to accomplish far more with relatively stagnant revenues and resources (except for the military, of course).

I recently gave a presentation (PPT file, 50 KB) in India at an e-governance conference. It was interesting to see that in thinking, we are right up there with the best. Many Indian states are pushing forth on citizen-centric services. There have been many good projects done in different states. What is lacking is sharing and co-ordination: if only they could pool their ideas together.

Governments also need to focus on the front-end: the access to these services. That was the thrust of my presentation: open-source software and low-cost diskless terminals to build computer centres everywhere.

Blade Servers

Says Marc Andreessen of OpsWare (in InfoWorld):

Blade servers are a subset of a broader trend we see happening that we call “disposable servers.” Servers have gotten a lot cheaper since the mainframe and since the 80s. A server two years ago was likely to be a box costing anywhere between $30,000 and $3 million, and it was a big important box that runs big important applications and if it breaks or something goes wrong there’re a lot of people that are going to get involved in fixing it. If people shift to the distributed application architecture with the Web, application servers and redundancy, and horizontal scaling where they’ll tend to run a much larger number of servers but those servers will tend to be much cheaper, you head into a world where the servers themselves become disposable, and I literally mean disposable. Which is, a server that cost $1,000 or $2,000 is not worth fixing. It’s so cheap it’s not worth the effort to ever try to open it up and fix it. If it breaks, if the hard drive crashes or CPU crashes, you throw the server away. That can apply to Intel 1-U rack-mounted servers or the same concept applies to blades. If a blade failed, you’re going to throw the blade away and pop a new one in. And that’s a big, big change and it’s going to really drive a lot of cost reduction, especially if people migrate off of proprietary Unix onto Linux or onto Microsoft and migrate from old application architectures onto application servers, like BEA and WebSphere. The consequence of that is you’re going to have a much larger number of servers that are individually going to be much cheaper and disposable, and applications will be written to be redundant across many servers so you can fail over and can throw those things away.

Intel’s Big Bet

From Fortune on Intel’s USD 10 billion investment in its newest fabs:

Intel is gambling that by pushing the state of the art in chipmaking faster than rivals are able to, it will reach a point where it can use sheer manufacturing prowess and capacity to undercut any competitor in price, performance, and variety. That means not just fending off would-be archrival Advanced Micro Devices and continuing to dominate the business of making chips for PCs, but also challenging Texas Instruments, IBM, Motorola, and a spate of smaller competitors in chips found in everything from cellphones to cars.

“Capacity is strategy,” says Andy Grove, Intel’s chairman and former CEO. “Henry Ford used it to revolutionize the automobile industry; the Japanese used it to push us out of the memory-chip business 25 years ago; we used it a decade ago to ignite the explosion of the PC industry. Now we’re using it again so we can broaden our business beyond the PC.”

Intel thinks its manufacturing capabilities will speed the introduction of incredibly powerful chips that take the Internet to the next level, enabling hundreds of millions of computers, phones, and other devices to be always tied to wireless networks. “We’re talking about a half-billion transistors on a chip, and perhaps even a billion,” says Paul Otellini, Intel’s president, COO, and likely the next CEO. “Suddenly there will be very little limit to what you can design into a single integrated circuit. If you want to talk about a golden age for semiconductors, that’s when it will be, and the IT and telecom and consumer electronics industries will be the biggest beneficiaries.”

Fortune writes that Intel’s big bet is on communication chips. Its strategy gives an idea of the future we can expect.

The chips allow notebooks to speak wirelessly to networks, enable cellphones to make calls, and help route web pages, e-mail, and streaming media around the Internet. Intel thinks it can win business by finding a way to marry computing and communication, quite literally on the silicon chips themselves.

Chief technology officer Pat Gelsinger dubs the strategy Radio Free Intel. Simply put, he wants Intel to incorporate, right into many of its processors, radio transceivers that can automatically detect and connect to hot new Wi-Fi wireless networks and even cellphone networks. “How can we beat Texas Instruments or Motorola, companies that have decades more experience than we do in communications technology?” Gelsinger asks. “By changing the rules and defining a new architecture for integrating communications into smart devices. We want to make a radio transceiver something that you expect to be just another feature of just about any device with a microprocessor.”

The most accessible market for Intel’s radio-enhanced processors is mobile PCs. By the end of the year Intel will begin shipping samples of specially designed chip sets for notebooks that include ultra-low-power Pentium processors, graphics chips, and other support circuits, and a built-in ability to attach to a Wi-Fi network. These chip sets will enable a notebook computer to sense and connect with wireless networks as its owner moves around, and even switch from one network to another on the fly. “In mobile computing, to focus on the processor performance as we have in the past would be missing the point,” says Anand Chandrasekher, the vice president in charge of the product line. “The trick is to make all the extra performance that wireless requires invisible, so it just works, and the user can count on it.”

A second big target for the Radio Free Intel initiative involves cellphones and PDAs–markets Intel competes in but doesn’t dominate. This year 400 million cellphones will be sold, and many of them will contain Intel’s flash memory chips. But phones are also getting smarter and beginning to resemble PDAs in their ability to handle address books, calendars, and the like. Meanwhile Intel’s XScale processor is the brains for most PDAs that use Microsoft’s Pocket PC software, and it recently won the support of Palm. It has a shot at becoming an industry standard, much as the Pentium is the standard processor in the PC.

Intel’s grand plan is to couple its XScale chip with flash memory as a way to get more of its chips into cellphones. It also plans to use the same part, attached to a new Wi-Fi chip, to make PDAs more versatile communicators. Ultimately Intel wants to put everything–the communications transceiver for both Wi-Fi and voice cellphone service, the XScale processor, and loads of flash memory–into a single part that would function equally well as the heart and soul of a PDA or a cellphone. Creating that can be achieved only if Intel can make chips with much smaller transistors, and if it can learn how to place radios, logic circuits, and memory in the same chip package without having their electrical signals interfere.

Andy Grove on the future that will be: “Just wait five years. Hundreds of billions of dollars we now spend on voice telecommunications will become a freebie–just like [Cisco CEO] John Chambers has said. That’s Moore’s Law at work. The entire entertainment industry will be digitally distributed over broadband networks. [Media companies are] going to tip over, because one of them, with its back to the wall, will make the transition, and the others will have to follow. That’s Moore’s Law at work. Houses will be wireless, broadband will be delivered wirelessly, and home and portable computers and consumer electronics are going to be built to facilitate all of the above. Okay, it hasn’t happened in the first five years; it’s going to take ten. And there will be a lot of pain for some. But it will happen, and we’ll all benefit.”

Continue reading

On Time

Writes John Robb: “Time is much different when you are managing a company. It’s less important how you spend an hour here or there. What is more important is making sure that the days, weeks, and months spent working are going in the right direction. That is the wasted time I worry about. I guess I am still ruled by time but without the direct tyranny of the watch but more of the calendar.” I agree with John. The minutes don’t matter as much as the direction one is headed.

On a personal basis, I stopped wearing a watch more than 20 years ago. I lost it on a beach during a school picnic, felt bad about it, and decided not to wear one. I have never really felt the need for one since. Over the years, I have developed a reasonably accurate sense of the time and can tell what time it is to within a few minutes (unless, of course, I am woken up in the middle of the night!)

In a way, however much one escapes the tyranny of the clock, it is there everywhere around us: on other people’s wrists, the cellphone, the computer, the car. In my room, there are four different clocks, each with its own opinion on the current time.

I have found myself to be much happier without having to worry about the watch on the wrist. That is not to say that I am late for appointments. In general, I am always on or before time. In fact, I am a stickler for punctuality. But not having to worry about each minute passing by is good.

I like the timelessness of travel in an airplane, especially on a long flight. Strapped into the seat, there is little one can do except think and read, and let time slowly pass. Only then does one realise how long an hour is, and how little we value it on the ground.

TECH TALK: Technology’s Next Markets: The Software Edge

Tech Talk: Let us now move on to software. What are your thoughts on the software required for the next set of users in the world’s emerging markets and enterprises?

Deviant Entrepreneur: So far, we’ve talked of leveraging technologies and platforms which already exist: terminal-server computing, open-source software and WiFi. Of course, we are using them differently from their original purpose: for example, using recycled PCs as desktops, using desktops as servers, using Linux not just on the server but on the desktop, and using WiFi not just for wireless LANs but also for bridging the last-mile connectivity gap.

Software is going to be a critical component of the offerings: it is the interface between the computing infrastructure and the users. It is also where we can think differently, because the users we are planning to target have little or no legacy. There have also been some very interesting developments in the software world in recent times which can be leveraged.

The key ideas are as follows:

  • Web Services to build re-usable, standards-based software components
  • An integrated eBusiness suite to provide the platform for an event-driven, real-time enterprise
  • Weblogs to harness tacit knowledge among people
  • RSS to create two-way flow of information
  • Digital Dashboard to aggregate personal and enterprise information on a single screen
  • Information Visualisation techniques to create a better and more intuitive user interface
  • Support for Local Languages

By itself, each idea is not new or revolutionary. But once again, taken together, these ideas can form an exciting leapfrog platform.

TT: Let’s start with Web Services and the eBusiness software suite.

DE: The objective is to create intelligent, real-time, event-driven enterprises. These enterprises need to have a unified database at the backend, with OHIO (only handle information once) as the guiding principle.

Small and medium enterprises are the weak links in today’s supply chains. They have less technology penetration than the big companies, and yet are critical to the information and document flow. They need cost-effective software solutions. This is where web services and the software suite come in.

John Hagel, writing in his new book, Out of the Box: Strategies for Achieving Profits Today and Growth Tomorrow through Web Services, provides the rationale behind using web services:

Automating the flow of information between a company and its business partners has always been difficult and expensive. Many interactions thus require human intervention (for instance, employees who key into corporate systems the data retrieved from business partners through faxes, telephone calls, or even lists printed out from the systems of other companies), a practice that leads to human error. Furthermore, many companies maintain larger stocks of inventory than they really need, because the flow of information among partners in the value chains of most sectors just isn’t efficient enough. Since activities near the edge of businesses abound in inefficiency, the opportunities for creating near-term value from Web services are substantial there, which makes it likely that companies will apply them in this way before using them to knit together core internal systems.

Until now, though, integrating the systems of one company with those of its partners has been less feasible than integrating internal systems. Web services promise to change that. Better connections among trading partners are going to mean that companies will be able not only to streamline their edge activities but also to collaborate on improving internal processes, such as product development.

But the real long-term prize of business collaboration lies in mobilizing the assets of partners to deliver more value to their customers. When cooperation among different businesses resembles the activity of a network, they can increasingly focus on innovation in their core activities, and the network becomes more efficient and flexible in what it can offer.

Such process networks are powerful tools for unleashing the potential of specialization. Emerging Web services technologies will play a crucial role in facilitating them.

Web Services provide the ability to build out the Software Lego building blocks, which can be used to architect the eBusiness software suite. The foundation is built on an open-source database like SAP-DB or PostgreSQL. JBoss, which is also open-source, can be used as the EJB/J2EE application server.

On top of this come the business logic components. These have the specific logic for accounting, payroll management, HR, CRM, sales force management, and other business functions. By using XML and SOAP, they can be integrated together, and can also be used by independent software vendors to create vertical solutions. Added value can come from using business process standards like ebXML and RosettaNet.
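
The integration idea can be sketched in a few lines: one component emits a SOAP-style XML message, another parses it and acts on it. The element names (`CreateInvoice` and so on) are illustrative, not any real SOAP schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of two business-logic components exchanging a
# SOAP-style XML message: the "Software Lego" integration idea.

def make_request(customer, amount):
    """A calling component (say, CRM) wraps the request in an envelope."""
    envelope = ET.Element("Envelope")
    body = ET.SubElement(envelope, "Body")
    call = ET.SubElement(body, "CreateInvoice")
    ET.SubElement(call, "customer").text = customer
    ET.SubElement(call, "amount").text = str(amount)
    return ET.tostring(envelope, encoding="unicode")

def accounting_service(request_xml):
    """The accounting component parses the message and acts on it."""
    call = ET.fromstring(request_xml).find("Body/CreateInvoice")
    return {"customer": call.findtext("customer"),
            "amount": float(call.findtext("amount")),
            "status": "invoiced"}

print(accounting_service(make_request("Acme", 4999.0)))
```

Because the contract between the two sides is just the XML message, either component can be replaced, or reused by an independent software vendor in a vertical solution, without touching the other: that is the Lego property.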

Tomorrow: The Software Edge (continued)

Linux Desktops

Amy Wohl writes about the market for Linux desktops: “We’d say the desktop Linux race is just getting started, and whether it’s about devices or mainly about delivering function to existing desktops is yet to be determined. As to whether existing vendors (the Dells, Gateways, and HPs) will be the main desktop providers in a Linux desktop era, or whether a new set of providers might arise, it’s simply too soon to tell. In fact, it’s too soon to tell how important the Linux desktop is going to be, although many vendors and analysts are already predicting that outside of the highly penetrated North American and western European market, it’s likely to be an important player.”

The newsletter also has a letter by John McCreesh: “The most effective use of computing power is to put absolutely minimal hardware on a diskless terminal on the desktop, and do all the ‘real’ computing on shared central servers. Virtually all the PCs currently being thrown out by corporates could be used indefinitely as diskless terminals. Comparatively small servers can support surprisingly large numbers of users, simply because the available horsepower is being used efficiently. Holding all the user data and software centrally provides for an extremely cost-effective support model.”

This is exactly what Emergic Freedom does.

Adds McCreesh: “As all the software required to try out Linux terminal/server is available under open-source licenses, it’s very easy to try out. Pretty well any networked PC can be converted instantly into a dumb terminal simply by booting it from a suitable floppy. Load Linux onto a spare server – or even a good desktop PC – add the package of ‘glue’ from the LTSP (Linux Terminal Server Project), and you can be up and running on a sustainable computing proof of concept without spending a cent. It’s an interesting exercise to try and see what the users think of the sustainable computing alternative. Reverting to Microsoft Windows is as easy as removing the boot floppies – but the chances are, you won’t want to do it.”

Inherent in what John says is an interesting, incremental idea for targeting existing Windows users. Here’s a staged approach to get them first to try out Linux on the desktop and then even to shift:

  • First, target email. Get email on Linux through Evolution, by booting through the floppy. No Windows, no Outlook, no viruses. And that’s a big deal: viruses are the biggest bane of a corporate network, most viruses come in via email, and using Linux eliminates that risk.

  • Next, put OpenOffice and Mozilla under Windows. Both (the Windows versions) can run from the Linux Thick Server. Users can get familiar with them. This way, three key applications (email, office and browser) are no longer dependent on Windows.

  • Finally, switch from Windows to Linux, using the thin client-thick server architecture. There is no need for a local hard disk or CDROM drive, and no need to ever upgrade the hardware on the desktop.

Wireless Chips

Writes SJ Mercury News on the various startups working to get new chips ready for the wireless future:

In the not-too-distant future, a single box in your home will be able to send different cable channels and Web sites to multiple screens over a wireless network. Your car will eventually be able to download maps, MP3 music files and detailed traffic reports as you drive by specified “info-fueling” stations.

These advanced wireless applications, powered by a communications standard known as 802.11a, are in development now, with many chip makers working on cramming combinations of radio technology and digital signal processing onto pieces of silicon small enough and cheap enough to be used in all kinds of devices.

US Tech Lagging Behind?

Writes CNN: “What has top executives of arguably the world’s two most important tech companies saying that the U.S. may soon cede its tech leadership? Three fundamental concerns: what both see as a disastrous diminution in national commitment to IT research and development, a dearth of engineering graduates, and the low penetration of broadband compared with other countries.” Look East.

TECH TALK: Technology’s Next Markets: Why WiFi

Tech Talk: What are the alternatives for connectivity? Why are you recommending WiFi?

Deviant Entrepreneur: Of course, there are many options for the last-mile connectivity: cable modems, ADSL, leased lines, even dial-up. The first three options require wires or cables to be laid, and that may not always be easy. In today’s world, wireless is the way to go: that is where innovation is happening at a rapid pace, and we want to ride the wave. The backbones, connecting the various wireless hubs across neighbourhoods, will have optical fibre. But WiFi remains the best possible option keeping the future in mind.

There are two other reasons for using WiFi. Firstly, it uses open spectrum, so there are no licence fees applicable. In India, the government has still not permitted the use of 2.4 GHz outside of a local setting, but hopefully that will change soon. The 3G-type wireless alternatives require a huge infrastructure and expensive spectrum, which will lead to costly solutions. 3G can be useful for mobility (when the end points are moving), but in our case, that is not a requirement.

This brings us to the second advantage. WiFi enables the build-out of grassroots, bottom-up networks. That was the way the Internet was constructed. Individual entrepreneurs can set up wireless hubs in their neighbourhoods. That is the way a whole country can be connected up rapidly.

In a white paper on Open Spectrum, Kevin Werbach explodes a number of myths:

Wireless spectrum is scarce: If multiple users were allowed to dynamically share frequency bands, and to employ cooperative techniques to improve efficiency, spectrum could be as abundant as the air in the sky or the water in the ocean.

Massive capital investment is needed to exploit the spectrum: Licensed service providers such as cellular telephone operators and television broadcasters must build out expensive distribution networks before they can deliver services to customers. Often, they must also pay to obtain the spectrum itself in auctions. These huge capital expenditures must be recovered through service fees. In an unlicensed environment, by contrast, access to the airwaves is free, and the most significant expense (the intelligent radios) are purchased directly by end-users.

The future of wireless lies in third-generation (3G) systems: 3G represents a useful advance in cellular technology, but it is hardly a panacea. Spectrum and build-out costs for 3G will be enormous. Many of the wireless data services identified with 3G could be more efficiently delivered through short-range and meshed unlicensed technologies, with wide-area 3G service reserved for situations where those alternatives aren’t available.

Wireless technologies are not viable solutions to the last-mile bottleneck: The last mile does pose special challenges for wireless systems. However, these challenges may be overcome through unlicensed systems that use long-range communications, wideband underlay or meshed architectures. With cable and telephone wires into the home controlled by dominant incumbents, and enormous capital required to extend fiber to every home, open spectrum represents the best hope for a facilities-based broadband alternative.

Emerging markets are years behind on cost-effective, last-mile connectivity and need to catch up fast; WiFi bridges the gap very well, positioning these markets as technology leaders. In fact, it helps them leapfrog to build out a ubiquitous, always-on, broadband wireless network.

Next: The Software Edge

Grokker for Information Visualisation

NYT writes about Groxis’ Grokker software “which is intended to allow personal-computer users to visually make sense of collections of thousands or hundreds of thousands of text documents.”

It uses information visualisation techniques. According to NYT, “Grokker builds a visual map of the general categories into which documents fall by using what computer software designers call metadata, which describes each Web page or document. The program currently works with the Northern Light search engine, the Amazon online catalog and as a tool for scanning a user’s own PC file collection.”

Mitch Kapor on Chandler

Writes Mitch Kapor on his “interpersonal information manager” codenamed Chandler:

We are trying to level the playing field by giving small & medium organizations collaborative tools which are as good as what large companies have had. We think we can do this in a way which doesn’t have the administrative burden of Notes or Exchange. We’re trying to be faithful to the original spirit of the personal computer — empowerment through decentralization.

If Chandler gets initial traction, then perhaps with another turn of the wheel it will grow up, much as Linux did over the course of quite a few years to become an enterprise-class product. So, in this sense, it’s a potential long-term threat, just as Linux emerged as competition for Microsoft in the server market. If I were Microsoft, I’d be worried about open source in general, not about losing Outlook/Exchange market share any time soon. With or without OSAF, I believe all of the applications in Office will be commoditized with equivalent free versions. I can see it happening. It’s not quite there yet but I bet it will be. I’m imagining there are teams of programmers around the world working on this at this very moment. In a few years generic PCs will come with a free, competent office suite bundled. That will challenge Microsoft’s hegemony in desktop applications.

A design note:

    Chandler will represent chunks of information as items, much as Agenda did. An item may consist of an email, an appointment, a contact. It can also be a document. An item can be thought of us having a body and a set of attributes (or meta-data).

    Views are formed (logically) by specifying a query and running that query against the repository of all items. As in Agenda, an item can appear in more than one view. This is the underlying mechanism by which we will do the equivalent of “virtual folders”.

    Views can be of a single item type, e.g., email, or they can be of mixed types, e.g., all items relating to a single subject, regardless of whether they are emails, attachments, contacts, or appointments.

    Every item in the system will have a unique URI, so it is referenceable, both from the user’s own machine and remotely.

    Items can be linked in arbitrary ways as well.

    Whereas Agenda was limited to a single hierarchy of categories (equivalent to attributes), in Chandler we are using an RDF-compliant schema as the backbone. It will come with a basic schema for PIMs and it will be extensible, although we are still thinking about how extensible it will be, e.g., in terms of interoperability between different schemas.
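
    The item-and-view model Kapor describes – typed items with attributes in a single repository, and views formed by running queries against it – can be sketched roughly as follows. This is a hypothetical illustration, not Chandler’s actual code; the `Item`, `Repository` and `view` names are invented for the example.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Item:
        """A chunk of information: an email, appointment, contact or document."""
        uri: str        # every item has a unique, referenceable URI
        kind: str       # item type, e.g. "email", "appointment"
        body: str = ""
        attributes: dict = field(default_factory=dict)  # meta-data

    class Repository:
        """Single store of all items; views are queries, not copies."""
        def __init__(self):
            self.items = []

        def add(self, item):
            self.items.append(item)

        def view(self, predicate):
            """A view is formed by running a query against the repository.
            The same item can appear in any number of views."""
            return [i for i in self.items if predicate(i)]

    repo = Repository()
    repo.add(Item("item://1", "email", "Budget draft", {"subject": "budget"}))
    repo.add(Item("item://2", "appointment", "Budget review", {"subject": "budget"}))
    repo.add(Item("item://3", "email", "Lunch?", {"subject": "social"}))

    # Single-type view (all emails) and mixed-type view (everything about "budget")
    emails = repo.view(lambda i: i.kind == "email")
    budget = repo.view(lambda i: i.attributes.get("subject") == "budget")
    ```

    Note that the first item appears in both views – the same mechanism behind Agenda-style “virtual folders”, where an item lives in the repository once but shows up wherever a query matches it.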

    Adds Nick Denton: “Outlook is the one piece of software, apart from an internet browser, I can’t do without. But it’s a painful dependence: the data file, full of email and contact details, is so huge that I can’t even back it up onto CD; when I last had a hard drive problem, the entire file was unrecoverable; and searching within Outlook is ludicrously slow. Microsoft is highly vulnerable in personal information management. The lock it has on word processing and spreadsheets is to do with users’ need to exchange files; compatibility is everything; and users won’t go out on a limb by trying new software. That protection does not apply to Microsoft Outlook.”

    Slashdot: Yet Another Exchange Killer?

    Publish-Subscribe Internet

    From Jon Udell:

    The Internet isn’t one giant LAN. It works remarkably well much of the time, but variable latency and sporadic failures are no longer exceptions, they are the rule. HTTP’s statelessness was the first major adjustment to this new reality. Hit the InfoWorld home page, and your browser will make a dozen separate requests of our server. If a few of those fail, it’ll keep trying until it finally assembles the whole page.

    Of course, why were you hitting our site in the first place? Presumably to find out about topics that interest you. Since you’ve got better things to do than poll the site to find out what’s new, we publish feeds that you can subscribe to in order to be notified when updates occur. If you’re reading this column, you’ve subscribed to one of those feeds, in the form of an e-mail newsletter. (There are others[1] as well.) You might also be using an RSS (Rich Site Summary) newsreader to follow one of our syndicated XML newsfeeds[2,3,4,5]. These are all simple examples of a publish/subscribe services architecture. As pub/sub and asynchronous messaging get baked into the Web services stack, things are going to get a whole lot more interesting.
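
    The shift Udell describes – from polling a site to subscribing to a feed that notifies you when updates occur – is the essence of publish/subscribe. A minimal in-memory sketch (the `Feed` class and its callback-based API are invented for illustration, not any particular messaging product):

    ```python
    class Feed:
        """Minimal publish/subscribe channel: subscribers are notified
        when an entry is published, instead of polling for changes."""
        def __init__(self, name):
            self.name = name
            self.subscribers = []

        def subscribe(self, callback):
            self.subscribers.append(callback)

        def publish(self, entry):
            # Push the update to every subscriber as it happens
            for callback in self.subscribers:
                callback(entry)

    inbox = []
    news = Feed("infoworld/webservices")
    news.subscribe(inbox.append)   # e.g. an RSS reader or mail gateway
    news.publish("New column: pub/sub and the Web services stack")
    ```

    The real versions – e-mail newsletters, RSS feeds, asynchronous messaging in the Web services stack – differ in transport and reliability, but the inversion is the same: the publisher pushes, so subscribers stop polling.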

    IDC on Web Services

    Web services a decade away:

    Tight IT budgets mean that Web services are being used merely as integration tools, said IDC, noting that “most of the Web services vision is just pure speculation.”

    IDC argues that delivering software as a service will require a lot of components and applications that don’t yet exist. In addition, “the sharing of components and data required by the Web services vision will raise a number of difficult business, legal and contractual issues,” said IDC.

    For Web services to work as imagined, IDC said, technology hurdles must be the first challenges overcome, but businesses also will have to change the way they view software and intellectual property rights. Proponents of the Web services vision also face work in the areas of security, standards and privacy.

    The early adopters of web services can be companies in emerging markets, with little or no legacy software. This helps them “leapfrog” – it is a theme I am writing about in this week’s Tech Talk series.

    Werbach on Decentralisation


    Centralized systems are failing for two simple reasons: They can’t scale, and they don’t reflect the real world of people.

    The world is becoming increasingly complex. Companies manage supply chains in real time, while hundreds of thousands of gamers gather in shared virtual worlds. Networks must carry vast and growing amounts of traffic, with no end in sight. Centralized systems eventually crumble under the strain of that complexity.

    Decentralized approaches often seem impractical, but they work in practice. The Internet itself is a prime example–it works because the content, the domain name system and the routers are radically distributed.

    But it’s the human element that is really driving the pressure for decentralized solutions. This shouldn’t be too surprising. Biological phenomena like the human body and the global biosphere have had billions of years to evolve, and they are the most complex decentralized systems we encounter.

    More concretely, people are seeking ways to communicate and collaborate across the artificial boundaries of organizations and geography. They want their music, on their terms, just as they want high-speed connectivity anywhere, any time.

    Werbach ties in decentralisation with the next WWW – Web Services, Weblogs and WiFi.

    TECH TALK: Technology’s Next Markets: Last-Mile Connectivity via WiFi

    Tech Talk: You’ve talked about using recycled computers as diskless thin clients connected to a thick server, and using Linux and other open-source software on the thick server to build out a low-cost computing infrastructure. Let us move now to the next challenge facing the emerging markets: that of connectivity. How do we connect the computers to the Internet?

    Deviant Entrepreneur: The disruptive innovation we need to use is WiFi (802.11). WiFi stands for Wireless Fidelity. It uses unlicensed spectrum in the 2.4 GHz and 5 GHz bands to provide connectivity at speeds ranging from 11 to 54 Mbps. The current distance limitation is about 100 metres. The technology is developing very rapidly and prices are falling fast. In the developed markets, WiFi is being used as a Wireless LAN solution. The wireless access points cost about USD 140 (Rs 7,000) while the 802.11b cards cost less than USD 70 (Rs 3,500).

    An article in the McKinsey Quarterly elaborates on WiFi:

    Wi-Fi is an alternative means of Internet access: Simply hook up an inexpensive Wi-Fi base station (a chip plus a transceiver) to a high-speed Internet connection such as DSL, a cable modem, or a T1 line and place this base station within a couple of hundred feet of a house. All people in the vicinity who have a very inexpensive Wi-Fi device in their PCs or PDAs can then share low-cost, high-speed access to the Internet without having to pay individually for more expensive dedicated DSL or cable modem service.

    Even better, with exciting new technologies such as mesh and ad hoc networks, improved Wi-Fi devices could create overlapping Wi-Fi networks in hotels, airports, office buildings and malls. Strings of linked Wi-Fi networks can stretch through apartment buildings, campuses and neighborhoods. Forget about digging up streets for fiber to every building or about erecting forests of towers. Wi-Fi can stretch the fabric of Internet connectivity, cheaply and painlessly, over any community to points where traffic is aggregated onto high-speed fiber backbone networks.

    Wi-Fi exploits the spectrum used by gadgets such as cordless telephones and microwave ovens–airwaves that haven’t been auctioned or allocated to an exclusive user. This is the proverbial free lunch of spectrum. At last, Internet access can be easy, cheap, always on, everywhere. And Wi-Fi access is fast: indeed, with a fiber rather than a DSL or cable modem connection from the backbone network to the Wi-Fi base station, the transfer speed of Wi-Fi can be faster than the typical speeds of those technologies.

    In the emerging markets, we need to use the 802.11 technologies to solve the last-mile problem and build out wireless community networks. The thick servers in buildings, schools and corporates can be connected to wireless hubs in a neighbourhood. With directional antennae, it is possible to have the WiFi range go beyond 100 metres. [In fact, a recent announcement by Proxim says that they have created a solution that can be used over a range of 12 miles.] The hubs can be at community places like post offices, banks, telephone booths or the tech 7-11s that we talked of earlier.

    While the WiFi solution can be used for LANs in places where it is difficult to do the network wiring, Ethernet cabling still remains the cheaper alternative. In due course, as costs fall even further, it will become possible to use WiFi for setting up the LAN as well. But I see the initial value coming in its use for building neighbourhood area networks, or NANs as they are called.

    Tomorrow: Why WiFi

    Digital Security

    From The Economist Survey: “One of the many prerequisites for computing to become a utility is adequate security. It is dangerous to entrust your company, your personal information or indeed your life to a system that is full of security holes. As a result, the problem of securing computers and networks, which used to matter only to a handful of system administrators, has become of far more widespread concern.”

    Hagel on Web Services

    From The McKinsey Quarterly: Edging into Web services:

    The advent of Web services promises to let a company connect its applications to any number of trading partners relatively inexpensively and easily. Of course, Web services could also be used to link applications inside the company… The best use of Web services currently lies in edge applications, where connectivity problems are more complex, efficiency gains are greater in the near term, and alternative ways of connecting companies are limited.

    These evolving technologies are essentially a number of Web-based standards and protocols that enable companies to connect applications and data directly to one another. The standards can be incorporated in a layer of software (an interface) that companies put atop an existing application, thereby allowing any other application with a similar interface to link up with it and communicate data. Writing this layer of middleware is far less expensive than customized code: about $30,000 for a modest connection between two applications, according to one financial-services company, compared with $800,000 for the customized version. Moreover, code rooted in a feature of an application makes for a rigid connection: if the underlying application is changed, the customized connection must also be changed or even rewritten. Web services let companies tinker with the application while avoiding changes to the interface.
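
    The “layer of software atop an existing application” works like an adapter: trading partners code against a stable interface, while the application behind it can be changed or even replaced without disturbing them. A hypothetical sketch (class and method names invented for illustration):

    ```python
    class OrderSystem:
        """Internal application; its innards may change at any time."""
        def lookup(self, sku):
            # In a real system this would hit an internal database
            return {"sku": sku, "qty_on_hand": 120}

    class OrderService:
        """Web-service-style interface layer: partners depend only on
        this stable contract, not on the application behind it."""
        def __init__(self, app):
            self.app = app

        def get_stock(self, sku):
            record = self.app.lookup(sku)   # underlying call can be rewired freely
            # Translate the internal record into the agreed external shape
            return {"sku": record["sku"], "available": record["qty_on_hand"]}

    service = OrderService(OrderSystem())
    result = service.get_stock("SHOE-42")
    ```

    Swapping in a new `OrderSystem` only requires updating `get_stock`’s internals; every partner calling the interface is untouched – which is the flexibility the $30,000-versus-$800,000 comparison above is pointing at.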

    What this means for business is that a company like Nike, with many product iterations and a broad range of partners, will be able to connect its own technology to that of its suppliers more efficiently, reducing the need for employees to send, receive, and reenter transaction data manually. Such a company could expand the amount and kind of data it exchanges with trading partners, thus not only improving the way both sides interact and collaborate but also transforming the way they develop, make, and distribute products. By using Web services to enhance collaboration in business alliances, some companies could even expand the value of the goods and services they deliver to customers. In addition, Nike would enjoy greater flexibility, so that when fashions changed the company could add new suppliers and drop others quickly and inexpensively. Similarly, it would be able to connect more readily to the large and fragmented retailer network that sells its shoes.

    There is also an article by Hagel in the October issue of the Harvard Business Review.

    For Richer – Paul Krugman

    A long article in the New York Times magazine, but worth reading: “Over the past 30 years most people have seen only modest salary increases: the average annual salary in America, expressed in 1998 dollars (that is, adjusted for inflation), rose from $32,522 in 1970 to $35,864 in 1999. That’s about a 10 percent increase over 29 years — progress, but not much. Over the same period, however, according to Fortune magazine, the average real annual compensation of the top 100 C.E.O.’s went from $1.3 million — 39 times the pay of an average worker — to $37.5 million, more than 1,000 times the pay of ordinary workers. The explosion in C.E.O. pay over the past 30 years is an amazing story in its own right, and an important one. But it is only the most spectacular indicator of a broader story, the reconcentration of income and wealth in the U.S. The rich have always been different from you and me, but they are far more different now than they were not long ago.”