Web Services Adoption Timeline

From a white paper by IDC on “IBM and the Strategic Potential of Web Services”:

2002: Within the Firewall
– simplified application integration
– increased developer productivity

2004: Contained External Users
– simplified business partner connectivity
– richer application functionality
– subscription-based services

2006: Fully Dynamic Search and Use
– casual ad-hoc use of services
– new business models possible
– commoditisation of software
– pervasive use in non-traditional devices

This seems too slow a deployment; perhaps it applies to the larger enterprises. Increasingly, I feel that web services are a great opportunity for the have-nots in the world of software. By combining software standards and business process standards, they can do cost-effectively what the bigger companies will take years to do. The smaller enterprises have little legacy. If they have a framework which can be implemented using shareable business process components (so that each business does not have to go through the same learning), they can be closer to the dream of a real-time enterprise.

Setting up a Blog

It's a question I get asked often. NYTimes provides an answer:

For those who want a simple Weblog and do not have a file server to store it on, several sites offer templates and serve as free hosts, for example, LiveJournal (www.livejournal.com) and Blogger (www.blogger.com). For more complex blogs with more customizable features and fewer advertisements, both sites also offer services for $50 a year or less.

You can also buy desktop software programs to create a Weblog. Radio UserLand sells for $39.95 and works with Windows and Macintosh systems. Users receive 40 megabytes of space to store their blogs on the software company’s servers. A trial version can be downloaded at radio.userland.com, and UserLand Software has a Weblog information site at www.weblogs.org, which itself is a Weblog about Weblogs.

For diarists with their own Web server accounts, Movable Type has free software for creating a Weblog at www.movabletype.org, and Fog Creek Software has a free version of its CityDesk program for Windows at www.fogcreek.com/citydesk.

Wireless Mesh Networks

Writes Wired:

The beauty of meshes? They’re bottom-up networks that capitalize on the rise of Wi-Fi and other open wireless technologies. They shimmer into existence on their own, forming ad hoc out of whatever’s in range – phones, PCs, laptops, tablet computers, PDAs. Each device donates a little processing muscle and some memory. Packets jump from one user to the next – finding the best path for the conditions at any given moment – and finally skip to a high-bandwidth base station, which taps into the Internet.

The result: big boosts to the range and speed of wireless signals. With the help of, say, 50 meshed PCs, PDAs, and phones, a typical Wi-Fi network with a 500-foot range can be transformed into one that extends 5 miles.

An interesting point was made in an email to Mohan Narendran sometime ago:

Now that India is loosening up on indoor 802.11b regs, you may want to blog about “used Wi-Fi cards” which I reckon are going to be quite plentiful next year when upgrades happen to 802.11a/g. It could be that your thin-clients are in a Wi-Fi mesh network with the access-point at the thick-server. You had blogged about the Economist article: Mesh Networks is releasing their all-software product MeshLAN on 1st September (I don’t have the pricing), while Jon Anderson at Locust is working out open-source mesh solutions, www.locustworld.com. Used card plus software could add USD30 to your thin-client (I have not seen cabling costs in your SME budgeting …)

We expect this to work well, and we are evaluating the headless Linux-based Toshiba SG20 as the “mobility server” for our internal deployments: The Toshiba SG20 may also work well for you – a portable thick server!

Will also be writing about Mesh Networks in tomorrow’s Tech Talk.

Intel’s MP3 Player for Video

Writes WSJ:

Intel said it has developed underlying technology for a new class of pocket-size gadgets that will store as many as 70 hours of video programming on a hard disk.

The gadgets, which Intel is calling portable videoplayers, are the latest in a growing category of devices that use storage capabilities normally found in computers to store digital entertainment.

To load up the player with video, users will need to connect the devices to a computer or a digital video recorder such as TiVo, a popular device for storing television programming onto a hard disk in the home, made by TiVo Inc., Alviso, Calif. Intel said it will take about three minutes to download roughly four hours of video, assuming a fast network connection and a new image-compression plan Intel is pushing. The devices, which also will store digital pictures and audio, will sport small color screens and enough battery life to play four hours of video.

Adds News.com: “Intel won’t manufacture the device, but the design is meant for a player to be the size of a paperback book that downloads content from a PC via a USB 2.0 port or through wireless 802.11b networking technology. The device would include an Intel XScale processor, hard drive and liquid crystal display. The devices are expected to be available from manufacturers next year for about $400.”

T-commerce

News.com writes on the battle for the future of digital entertainment [thanks to Jayen Mehta for the link]:

“The interactive programming guide is going to be the first thing you see when you turn on the television, and that’s what everybody’s fighting about,” said Richard Sherrill, senior vice president of the ITV group at Kitro Media, who has a long history in the cable TV industry. “They all want to control it: The cable guys say they own it, TV Guide says it owns that real estate, even TV manufacturers want a piece of it.”

For years, on-screen programming guides have been little more than electronic versions of paper TV listings, reflecting the limited designs of the boxes that housed them. But today, Gemstar and its competitors are experimenting with set-top devices that combine the functions of a DVD player, video recorder, digital jukebox and game arcade, as well as deliver hundreds of channels.

By directing consumers to a multitude of content, interactive guide companies could become the make-it-or-break-it marketing vehicle for new TV network shows, subscription services or pay-per-view programming. The guides are also seen as vehicles for “t-commerce”–television commerce–with the potential to lead to lucrative advertising and retail revenue-sharing agreements with cable or satellite operators licensing their technology.

“The more choices you have in entertainment programming, the more time consumers will spend in front of a guide,” said Josh Bernoff, analyst at Forrester Research. “Any guide company is in a central, powerful position. The same way companies like Yahoo and AOL are crucially important to guide people on the Internet.”

Latin American Challenges

Writes WSJ:

The global market upheaval portends even more trouble for Latin America, as investors grow wary about putting their money in unstable countries. Foreign investment to Brazil slowed abruptly this month compared with its pace in the first half of the year. The Brazilian real hit an all-time low Wednesday.

This year has brought one debacle after another: an economic collapse in Argentina, a failed coup in Venezuela, street protests in Peru and Paraguay, and a financial shock in Brazil. A four-decade conflict between Colombia’s government and Marxist guerrillas is growing hotter. Even the bright spots are hardly glittering. Mexico’s future looks secure under the umbrella of the North American Free Trade Agreement, though its economy is vulnerable to ups and downs in the U.S. business cycle. Chile remains a small island of prosperity, but its economy has been slowed by the chaos elsewhere in South America.

Blogs for Business

An excerpt from a forthcoming book on blogs – “We Blog: Publishing Online with Weblogs”. This chapter is on the use of blogs in the enterprise:

Business weblogging’s potential is the vast expanse hidden from view beneath the surface. Just because you can’t see intranet weblogs in action isn’t reason to believe they don’t exist, or to overlook the tremendous potential of intranet blogging. From meeting notes to project or task-based weblogs, the ease of use of weblogging software and the chronological format of the weblog can transform an intranet site into a vibrant, engaging, and useful online resource for your company or organization. One potential application that’s received quite a bit of discussion is the weblog as an informal, or stealth, knowledge management tool.

Information Refinery

Much of what we do as information/knowledge workers is process information. We get it from multiple resources, we assimilate it, we route it to others, we translate it into different formats. In fact, in the enterprise, we can think of business processes as rules for routing information, as tracing the flow of events through various filters and actions. In essence, we have information ores which are refined by the enterprise. What we have been thinking about in BlogStreet and Digital Dashboard can now be encapsulated into the concept we will call an Information Refinery.

Whether it is news items or blog posts or enterprise events, the information refinery should be able to handle all of them. The refinery consists of the following entities:

Miners

Just as ores need to be extracted from mines, RSS miners collect the RSS feeds from different sources. Miners listen to sources which send the raw, unstructured information to them, or extract information through bots. They then take the information and send it to specialized adaptors for RSS extraction.
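As a rough illustration, here is a minimal polling miner in Python. The fetch loop and the adaptor-callback interface are assumptions for this sketch; a real miner would also listen for pushed events and use bots, as described above.

```python
# Minimal polling miner (sketch): fetch raw content from each source URL
# and hand the payload to that source's adaptor for RSS extraction.
import time
import urllib.request
from typing import Callable

def mine(sources: dict[str, Callable[[bytes], None]], interval: int = 900) -> None:
    """Poll each source URL and pass the raw bytes to its adaptor."""
    while True:
        for url, adaptor in sources.items():
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    adaptor(resp.read())      # adaptor turns raw bytes into RSS items
            except OSError as err:
                print(f"miner: could not fetch {url}: {err}")
        time.sleep(interval)                  # simple fixed polling interval
```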

Adaptors

These are the interfaces to the outside world. A 1-way Input Adaptor takes events and RSS-ifies them for use by the refinery. A 1-way Output Adaptor takes an RSS feed and converts it into events suitable for use by an external application. A 2-way adaptor does both. In other words, from the point of view of the refinery, an Input Adaptor is read-only and an Output Adaptor is write-only. The 2-way adaptor is read-write.
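A small sketch of these three roles, with a hypothetical RSSItem record standing in for a feed entry (real feeds carry more fields than this):

```python
# Sketch of the three adaptor roles described above.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class RSSItem:
    title: str
    link: str
    description: str

class InputAdaptor(ABC):          # read-only from the refinery's point of view
    @abstractmethod
    def read(self) -> list[RSSItem]:
        """RSS-ify events from an external source."""

class OutputAdaptor(ABC):         # write-only from the refinery's point of view
    @abstractmethod
    def write(self, items: list[RSSItem]) -> None:
        """Convert RSS items into events for an external application."""

class TwoWayAdaptor(InputAdaptor, OutputAdaptor):   # read-write
    pass
```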

For example, there are Blog 1-way adaptors which take RSS feeds from weblogs (or create one if it does not exist). We can think of News Adaptors which take the URL of a site and make an RSS feed comprising the headlines. A Mail Adaptor can read an email message and convert it to RSS. A File Adaptor can take information about a file and create an RSS feed about it (extracting, perhaps, the first few sentences from the file as the description). We can think of a Tally (or Quicken) 1-way adaptor which extracts information from the application and creates RSS events (using financial reporting standards) which can be fed into the information refinery.
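As one illustration, here is a hypothetical Mail Adaptor sketched in Python: it reads a local mbox file and turns each message into an RSS-style item, using the subject as the title and the first few lines of the body as the description. The mbox path and the plain-dict item shape are assumptions for the sketch, not part of any existing tool.

```python
# Hypothetical Mail Adaptor (sketch): mbox messages -> RSS-style items.
import mailbox

def mail_to_items(mbox_path: str, max_lines: int = 3) -> list[dict]:
    items = []
    for msg in mailbox.mbox(mbox_path):
        body = msg.get_payload(decode=False)
        if isinstance(body, list):             # multipart: take the first part only
            body = body[0].get_payload(decode=False) or ""
        lines = str(body).strip().splitlines()[:max_lines]
        items.append({
            "title": msg.get("Subject", "(no subject)"),
            "link": f"mid:{msg.get('Message-ID', '')}",
            "description": " ".join(lines),    # first few lines as the description
        })
    return items
```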

As of now, some news and blog sites put out RSS feeds. But most enterprise applications are not publishing RSS. This is what we will need to develop. Thus, the Adaptors will need to know (a) the I/O formats of the proprietary applications, and (b) the vocabularies that the segment uses. For starters, we should think of 1-way (read-only) adaptors which enable us to aggregate the raw material (information ores) from multiple disparate sources.

RSS Spoolers

They take the RSS feeds from the adaptors and accept subscription requests for the feeds through Agents. An agent is instantiated for every subscription request. Agents have rules to decide what to do. For example, Ram's agent can decide that he wants the News.com and WSJ RSS feeds in full, while he wants all other entries with the keyword XML in them. The agents look at each of the incoming feeds and take the appropriate action, routing the RSS feeds/entries to other RSS Spoolers or RSS Aggregators.
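A sketch of such an agent's rules, assuming entries are plain dicts with source, title and description fields (an assumption for illustration): sources marked "take in full" pass through untouched, while everything else is filtered on a keyword.

```python
# Sketch of a subscription agent's filtering rules.
def agent_filter(items, full_sources=frozenset({"News.com", "WSJ"}), keyword="XML"):
    selected = []
    for item in items:
        if item["source"] in full_sources:
            selected.append(item)              # take these feeds in full
        elif keyword.lower() in (item["title"] + item["description"]).lower():
            selected.append(item)              # keyword match from any other source
    return selected                            # routed on to spoolers/aggregators
```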

RSS Aggregators

There is one RSS Aggregator per subscriber/entity which is the end consumer of RSS feeds. Think of these as comprising one set of end points (the other being the information sources). RSS Aggregators have an RSS Store (database) for archiving older entries. They can publish the aggregate feed to an RSS Viewer, or they can send the RSS entries to Processors.
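A minimal sketch of a per-subscriber aggregator backed by a SQLite RSS Store; the table layout and column names here are illustrative, not a prescribed schema.

```python
# Per-subscriber RSS Aggregator (sketch) with a SQLite-backed RSS Store.
import sqlite3

class Aggregator:
    def __init__(self, db_path: str = "rss_store.db"):
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS entries "
            "(link TEXT PRIMARY KEY, source TEXT, title TEXT, received TEXT)"
        )

    def add(self, items):
        """Archive incoming entries; duplicates (same link) are ignored."""
        self.db.executemany(
            "INSERT OR IGNORE INTO entries VALUES (?, ?, ?, datetime('now'))",
            [(i["link"], i["source"], i["title"]) for i in items],
        )
        self.db.commit()

    def latest(self, limit: int = 50):
        """Return the newest entries for the viewer, most recent first."""
        return self.db.execute(
            "SELECT source, title, link FROM entries "
            "ORDER BY received DESC LIMIT ?", (limit,)
        ).fetchall()
```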

Processors

They embed the business logic/rules as to what action needs to be taken with RSS feeds. For example, they can send events to an Alerts engine to notify the user via email/IM/SMS, or create other RSS events. Processors can thus also be considered information sources which put out RSS feeds for re-distribution. Processors can also work on the RSS Store to create specialised applications (e.g. Talent Search in an organization, or Business Process Analytics to see how information flows and where the potential bottlenecks could be).
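A sketch of a processor as a rule plus an action, with alert delivery left as a placeholder callback (email/IM/SMS gateways are outside the scope of the sketch):

```python
# Processor (sketch): apply a rule to incoming entries, hand matches to an
# alert callback, and return them as new RSS events for re-distribution.
def process(items, rule, alert):
    matched = [item for item in items if rule(item)]
    for item in matched:
        alert(item)                # e.g. hand off to an Alerts engine
    return matched                 # re-distributed as new RSS events

# Example rule: flag any entry mentioning a (hypothetical) client name.
flag_client = lambda item: "Acme" in item.get("title", "")
```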

RSS Viewer

This shows all the RSS entries, organized by source or time, to a user, who can then decide whether to delete an entry or publish it to a blog. An RSS Viewer should be capable of having multiple pages to support categorization of the incoming entries.

Blog Publisher

This uses the Blogger API to post events to a blog.
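A minimal sketch of such a publisher using the Blogger XML-RPC API's blogger.newPost call via Python's standard xmlrpc.client. The endpoint URL, blog id and credentials are placeholders; weblog tools such as Movable Type expose this interface at their own XML-RPC URL.

```python
# Blog Publisher (sketch): post an entry via the Blogger XML-RPC API.
import xmlrpc.client

def publish(content: str, endpoint: str, blog_id: str,
            user: str, password: str, publish_now: bool = True):
    server = xmlrpc.client.ServerProxy(endpoint)
    # First argument is the application key; many servers ignore its value.
    return server.blogger.newPost("0", blog_id, user, password,
                                  content, publish_now)

# Hypothetical usage:
# publish("New client meeting notes...", "http://example.com/mt-xmlrpc.cgi",
#         "1", "user", "password")
```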

Blog Platform

This is a weblog tool like MovableType or Radio which enables management of the blog. A blog should be capable of having multiple categories (with sub-levels). The user can decide on the access rights for the categories (public, private, group). It should also be able to publish RSS feeds for specific categories.

Digital Dashboard

This is a browser with 3 tabs: one for viewing the RSS Aggregator feed, one for writing and one for viewing a user's own blog. It is the unified read-write interface.

Putting It All Together

What the information refinery creates is a peer-to-peer architecture of information sources and RSS routers, filters and processors. Initially, we should look at starting at the edges: let the enterprise applications do their own thing. What we want to extract from them is an RSS feed which can be aggregated with other feeds and put before the user. We should not focus initially on trying to create the information. As an analogy, Samachar.com aggregated news content using templates and a single page as the viewer; it did not try to create the content. Similarly, we need to create adaptors and miners for the various information sources that are there (news sites, blogs, mail, enterprise applications) to aggregate them together.

One of the first applications we can consider within the enterprise is the following: create a Client Information System, with a weblog which has one page per client to aggregate financial, technical and marketing interactions, support and external news together (a sketch of the first step follows the list below).

– Extract accounting events from Tally (cheque deposited, payment made, etc.)
– As marketing and support people interact with a client, they write the conversations or forward the mails to the client blog page. If they need to write, they should write in their own space, and then publish to the client blog.
– Invoices sent should be linked as files from the blog, with a brief summary of the contents. Invoices are sent by the marketing department.
– Scan various newspapers and magazine RSS feeds to search for news relevant to the client.
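As a sketch of the first step, here is how accounting events might be turned into RSS-style items for a client's page. Tally does not publish RSS, so this assumes a hypothetical CSV export with date, client, type and amount columns; a real adaptor would read Tally's own export format. Each item could then be handed to the Blog Publisher sketched earlier.

```python
# Hypothetical Tally adaptor (sketch): accounting events -> RSS-style items
# for one client's blog page. The CSV layout is an assumption.
import csv

def tally_events_to_items(csv_path: str, client: str) -> list[dict]:
    items = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["client"] != client:
                continue
            items.append({
                "title": f'{row["type"]} of Rs {row["amount"]} ({row["date"]})',
                "link": "",
                "description": f'Accounting event for {client}: {row["type"]}',
            })
    return items
```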

This way, there is a (reverse) chronological history of all interactions with the client on a single page, across the diverse groups in the company. Today, this resides in different people's mail folders, and in different applications and spreadsheets. This is one example of how the Information Refinery can refine the raw data ores to create an integrated, emergent system, where the whole is much greater than the sum of the parts.

There are, perhaps, many knowledge management systems and corporate portal applications which could do something similar. The difference in the approach taken here is that it can be built in a very general-purpose manner (and customized easily at interface points like the Processors); it encourages users to continue what they are doing, with the only addition being the writing, which they will do because they derive significant value from the system, since it creates a positive feedback loop with the information flow; it can be put together quite quickly; and it leverages the existing enterprise applications.

The key is to first start at the edges and create the unified viewing interface (the digital dashboard, as it were): a read-only interface, but with information aggregated from multiple sources. In general, in organizations there are 10x more readers than writers (i.e., one person may update the accounts information, but there are likely to be 10 people relying on that information for analytics and decision-support). The next step is to enable two-way communication into the applications, so that the Digital Dashboard can also become a writable area which interfaces to applications. This is a precursor to then replacing the expensive, non-integrated enterprise applications with integrated, web services-compatible components.

What's missing here is a more detailed discussion of how business processes can be reflected and managed through this information refinery, and how we can use business process standards such as those embedded in RosettaNet.

This approach is obviously not going to work for every organization. The main targets are the SMEs, who today use very few enterprise applications. They can now, over time, be given a full suite of applications through a common interface. Just as the browser created the front-end which became a window to the web, with HTML and HTTP taking care of the back-end, we want the RSS-driven Digital Dashboard to become the enterprise information worker's front-end, with the back-end composed initially of adaptors to existing enterprise applications and later of newly minted web services-compliant software components which embed the appropriate business logic and work together like Lego blocks.

Taken together with the Thin Client-Thick Server project, which creates the IT infrastructure providing connected computers for all, the Information Refinery and its constituents provide the framework to build the Real-Time Enterprise for SMEs. The price point is what I've talked about in the past: all the hardware, software, support and training that's needed for no more than USD 20 per person per month. This, according to me, is the next real opportunity, and the vision that we want to work towards in Emergic.

TECH TALK: Tech’s 10X Tsunamis: Wireless (Part 2)

The communications industry is also being transformed, just as the PC industry was more than a decade ago when it moved from a vertical to a horizontal structure. Writes Kevin Werbach:

The hardware elements of communications have already started to go horizontal. Handset vendors are outsourcing production to the same contract manufacturers that are prominent in the PC industry. Switching equipment increasingly uses general-purpose semiconductors. Once connectivity can be integrated into devices at marginal cost, though, the possibility of an entirely different communications industry arises.

Imagine that every laptop, every PDA, every home media server is also an agile communications device, able to connect to any available network. In such a world, paying a carrier for access to a single network, with a limited choice of services and hardware, will seem archaic. There will still be services businesses linking together these devices and, more important, the user data that flows across them. But they won’t look much like the integrated communications carriers of today.

We saw exactly such a 10X force shake up the computer industry in the 1980s. The communications industry is ready for dis-integration.

WLANs

While wireless has revolutionised voice communications over the past few years (there are now estimated to be a billion cellphone users worldwide), the world of data is also starting to be impacted by a protocol that goes by the unlikely name of WiFi, or 802.11. WiFi protocols offer connectivity within a 100-metre radius of the access point at speeds of 11 to 54 Mbps.

Wireless local area networks (WLANs) can, in the words of Douglas Hayward (Financial Times, July 17, 2002), cut cabling costs, pack more users into offices, and dramatically cut the cost of moving employees around buildings. Users can even wander around offices while staying connected to corporate networks, including the telephone network if it runs over IP. They are valuable for workers in non-office environments where fixed-wire networks are impossible, such as maintenance engineers in factories and doctors and nurses in hospitals.

WLAN hotspots are sprouting up all across the globe in a grassroots phenomenon, reminiscent of how Internet access itself was popularised in the mid-1990s. Most recently, a group of companies in the US, led by Intel, IBM and AT&T Wireless, have come together to discuss the formation of a nationwide 802.11 data network. WiFi, which has been seen primarily as a LAN technology, is now being seen as the platform on which to build high-speed public data networks. This is happening because the technology has become standardised, popular, powerful and cheap.

Writes H. Asher Bolande in the Asian Wall Street Journal (June 18, 2002):

[WLAN] has proven irresistible to the telecom carriers who provide the bulk of Asia's Internet access. Fixed-line providers in Japan, Korea, Taiwan and Hong Kong are suddenly racing to dot cities with public hot spots, designated zones of wireless coverage like restaurants or hotels, where customers with specially equipped laptops can conveniently flip up an antenna and surf at broadband speed.

WLANs' move from private computing to public networks puts fixed and mobile service providers on a collision course, analysts say. By rolling out hot spots, the fixed-line operators are venturing into high-speed wireless data, territory mobile providers have staked out for themselves as their future basket. After all, the promise of Internet anywhere, anytime was the selling point that drove them to invest billions of dollars in 3G.

Tomorrow: Wireless (continued)