Groupware

ServerWatch has a nice round-up of the state of Groupware and Collaboration. “Groupware and collaboration is an extremely elastic category. The terms, in essence, refer to anything electronic that helps people work together efficiently. The space can include e-mail, calendaring, instant messaging, audio and video conferencing, document repositories of different types, content management, bulletin boards, and voice services.”

Microsoft and IBM control 60% of the market. Microsoft will be launching Titanium later this year, while IBM has its Notes/Domino platform.

A summary of the trends as mentioned in the article:

Early adopters in the small and midsize business space are starting to move away from Exchange server to products such as VirtualTek’s Joydesk and Bynari.

Software vendors generally associated with e-mail servers are entering the space more fully. Generally, the idea is that there will be a market for products “thinner” than Exchange and Domino that focus on the two key applications: e-mail and calendaring.

Since e-mail is the single dominant application, anything that impacts e-mail has a profound effect on groupware and collaboration. The ongoing struggle against viruses and the explosion of spam will have a big impact on how servers will be configured and protected. For example, servers must be able to accommodate rapidly changing third-party antispam software.

The usage patterns of groupware and collaboration are changing in three important ways. 1) Users will be more mobile, 2) real-time audio and video will become more prevalent, and 3) instant messaging will continue to grow as a corporate tool. Corporate IM may, in fact, prove the big story in 2003 because its deployment demands significant infrastructure adjustments. “IM … at a consumer, teenage level is kind of ephemeral,” said David Marshak, a senior vice president at the Patricia Seybold Group. “But doing it in a business context means it has to be stored. You have to prove you made the offer, gave this advice to this client, etc. It has a big impact on storage, and on search and retrieval capabilities.”

There is an increasing demarcation between users served by an enterprise that need full-featured collaborative applications and those that need only e-mail and calendaring. This is best illustrated by the example of a college campus that has a mix of permanent staff and faculty — some of whom are heavily involved in research that cries out for collaboration — and students who are typically there for four years. The students may be more aptly served by lightweight servers supporting Web-mail-based e-mail, while the permanent employees would ideally have an entirely different and deeper system. This type of bifurcated environment is not uncommon.

Messaging is a space we are in, too – with our MailServ solution. It does not yet have groupware and collaboration features, but it's something we are working on.

WiFi and Cellular on New Mobile Phones

Writes WSJ: “New Wi-Fi phones, expected to be available world-wide within the next 12 months, would allow users to connect to Wi-Fi networks, which can be 100 times as fast as even relatively speedy cellular-data connections. The phones could be used for everything from e-mailing to surfing the Web and downloading music, video or big files such as PowerPoint presentations, and making voice calls to boot. When the devices aren’t in the vicinity of Wi-Fi links, which are increasingly found in homes, offices, airports and cafes, users could switch over to traditional cellular services.”

These WiFi cellphones are expected to cost USD 500 and more. Samsung’s Nexio (USD 1,125) device connects to both WiFi and cellular networks.

Why in the world would anyone need 3G?


CPU Performance

Tom’s Hardware has a report discussing CPU performance over the years. This is how Intel’s CPUs have progressed:

486DX to the 486DX4/100: 1989 to March 1994
Pentium 60 and 66: From March 1993
Pentium 75 to 200: March 1994 to June 1996
Pentium 150 to 233 MMX: October 1996 to June 1997
Celeron 233 to 533: April 1998 to January 2000
Pentium II/233 to 450: May 1997 to August 1998
Pentium III/450 to 1000: March 1999 to March 2000
Pentium III/500 to 1133: October 1999 to July 2001
Celeron II/533 to 1100: January 2000 to July 2001
Celeron/Pentium III/1000 to 1400: January 2000 to July 2001
Pentium 4/1300 to 3066 MHz: November 2000 to date

Looking at this from our “thin client” viewpoint, if we are to use old PCs, then, in theory, we could use CPUs dating back to March 1994 (Pentium 75 and higher). So, in effect, even 9-year-old PCs can be used as the low-cost desktops in the thin client-thick server computing environment.

Emergent Democracy

Joi Ito connects many ideas together (blogs, emergence, strength of weak ties, social networks, trust). His conclusions:

The world needs emergent democracy more than ever. The issues are too complex for representative governments to understand. Representatives of sovereign nations negotiating with each other in global dialog are also very limited in their ability to solve global issues. The monolithic media and their increasingly simplistic representation of the world can not provide the competition of ideas necessary to reach consensus. Emergent democracy has the potential to solve many of the problems we face in the exceedingly complex world at both the national and global scale. The community of toolmakers will build the tools necessary for an emergent democracy if the people support the effort and resist those who try to stifle this effort and destroy the commons.

We must make spectrum open and available to the people, resist increasing control of intellectual property, and resist the implementation of architectures that are not inclusive and open. We must encourage everyone to think for themselves, question authority and participate actively in the emerging weblog culture as a builder, a writer, a voter and a human being with a point of view, active in their local community and concerned about the world.

Linux Desktop: Win4Lin

Jim Curtin (CEO of NeTraverse) discusses the Linux Desktop and his company’s product (Win4Lin), which runs Windows on Linux. Some comments:

The desktop status quo has become increasingly expensive — not only the support and change-management model, but the increasing frequency of upgrades that require new hardware, applications and training. The desktop model has been begging for a disruptive technology to come along and displace it.

There are three approaches to running Windows on Linux: emulation, integration and virtualization – in other words, WINE, Win4Lin and VMware, respectively. Win4Lin offers the best of all worlds in that it gives pure compatibility, broad application coverage and no loss of performance. It also accomplishes this with a minimal resource footprint.

As for [the future of] desktops, in five years we should be well on the way to more prevalent use of thinner devices with user “state” hosted in the network. The majority of desktops, to the extent they exist, will not be stateful. People, especially home users, will have personal servers (either hosted at home or at a third party hoster) that broadcast apps and data to a range of personal devices, but these devices should be more about ergonomic elegance than operating systems. Corporate users will have stations that are big on display and light on local disk access.

An alternate idea to Win4Lin is “Lin-on-Win”. What I have been thinking is that the world is full of Windows desktops (running either legally purchased or pirated software). Trying to get people to switch completely to Linux – at least in the case of existing users – is quite disruptive, and is naturally opposed by users.

Instead, think of a solution where the Linux desktop becomes an application on the Windows desktop (through VNC or other alternatives). Wean the user away slowly with some applications. For example, start by promising virus-free email on the Linux desktop – this will mean moving mail and files to a Linux server. Next step, get OpenOffice on the Linux desktop.

This way, the users still have Windows as their primary desktop, but key applications (mail, files, office suite) are moved on to Linux. Once they get more comfortable with the Linux environment, Windows can be replaced completely and the users won’t notice as much (as long as the Windows apps they need can be supported or migrated).

RSS Subscriptions

John Robb gives his view (one which I agree with) on the value of RSS feeds, after finding that he has subscribed to 115 feeds:

RSS subscriptions are much more than just automated bookmarks that take the pain and hassle out of browsing for relevant content. They also allow me to quickly repurpose the content as content on my weblog — all I have to do is hit the post button next to any item I want to comment on. RSS can also be a fantastic delivery system for large content via enclosures. One important point to remember is that unlike bookmarks, RSS subscriptions don’t atrophy — they live until they are actively deleted.

On a related note, this is the comment made by Jeremy Allaire in relation to Jon Udell’s Team Blog comment:

All of a sudden the world of weblogs is colliding with other established team collaboration and work productivity spaces such as content management, knowledge management and enterprise portals.

To what degree will Blog Readers become a natural client software category; will they be part of browsers; of communications/messaging apps (Outlook/Notes); standalone? As RSS 2.0 gains traction and the content moves from being simple text content to richly tagged meta-data and more or less structured content (like ‘system status reports’, ‘bi data’, as Jon suggests), what’s the proper productivity interface for digesting and regurgitating all that data?

I believe that a “new desktop” can be designed around the microcontent client, built around a digital dashboard and news readers, with the backend integrated with enterprise events, available via RSS. Will take this discussion up in a forthcoming Tech Talk series.

TECH TALK: The Rs 5,000 PC Ecosystem: Homes

The question that has puzzled me for some time is as follows: Why aren't more people in India buying computers for their homes? I am sure they recognise the importance of computers, especially for their children. And yet, less than a million PCs are being bought for homes. (The number is probably lower: a total of 2 million PCs will be sold in India this year.) Availability of cheap consumer financing means that one can take home a PC for Rs 1,000 per month (payable over 36-42 months). And yet, home sales are not skyrocketing.

Why is that? Is it that people find the Rs 1,000 monthly installment high? Or is it that there are other factors: electricity issues (lack of reliable power across much of India), space issues (where to keep it in small homes), software issues (where does one get relevant, often pirated, software), and Internet access costs (which can run upwards of Rs 500 per month for as little as 30 minutes of daily dial-up access)?

My belief is that one of the key issues in this conundrum is the total cost of ownership. Computers are believed to become obsolete in 3-4 years. Which means that, from the home user's point of view, the monthly outflow for a connected computer is at least Rs 1,500. This is not a small figure for most Indian families. The price point for mass-market adoption is, in my view, no more than Rs 500-700 per month. Or, put another way, the upfront cost should be no more than Rs 5,000 and the monthly outflow should be about Rs 250-500. At those price points, it is comparable to the cost of two other networked devices: the TV and the cellphone. So, why not apply the same model to the computer to drive mass-market adoption?

The Rs 5,000 PC (5KPC) can do just that. It needs a network to come alive and be useful, just as the TV needs the cable network and the cellphone needs the GSM/CDMA network. So, to make the 5KPC an economic reality, the service operator has a three-year revenue base of a minimum of Rs 9,000 (36 months at Rs 250 each) to cover the loaded thick-server costs, software and wireless connectivity. The thick-server loading is about Rs 2,000 per 5KPC in enterprises; this will probably be the same for homes, since even though usage is not as high, it is likely to be bursty, especially early in the morning and late in the evening.

Wireless connectivity using WiFi can probably be provided for Rs 3,000 (cost of equipment cards and access point) since there are no spectrum costs. These costs will probably halve in the next year. That leaves a budget of Rs 4,000 for software and operations. Considering that all the software used is going to be open-source and based on Linux, it should be possible to put a solution together within these limits.
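The cost model above can be sketched as a quick back-of-the-envelope calculation. The figures below are the estimates from the text, not hard costs:

```python
# Budget check for the Rs 5,000 PC (5KPC) service model.
# All figures in rupees, taken from the article's estimates.
months = 36
monthly_fee = 250              # lower end of the Rs 250-500 range

revenue = months * monthly_fee  # three-year revenue base per 5KPC

server_cost = 2000             # loaded thick-server cost per 5KPC
wifi_cost = 3000               # WiFi card plus share of access point

# What remains for software and operations over three years:
software_ops_budget = revenue - server_cost - wifi_cost

print(f"Revenue per 5KPC: Rs {revenue}")              # Rs 9000
print(f"Software/ops budget: Rs {software_ops_budget}")  # Rs 4000
```

If WiFi equipment costs halve as expected, the software and operations budget rises to Rs 5,500 per user, giving the open-source stack even more headroom.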

What this does not take into account is the upside. And that is where the 5KPC scores. The 5KPC desktop can be remotely controlled by the operator, who can sell icon space to banks and other service companies for a small fee per user. Every time the home user starts the 5KPC, these icons (and therefore the brands) will be visible. It is akin to how cellphone companies provide links to value-added services that are one click away on the cellphone.

In addition, as desktops proliferate at home, it becomes possible for neighbourhood stores to set up relationships with consumers in the neighbourhood through RSS (Rich Site Summary) feeds which provide information on what's new. RSS feeds are based on XML and can be automatically picked up by special programs (RSS aggregators and news readers), based on the user's subscriptions.
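A minimal sketch of how such a feed could be consumed, using only Python's standard library. The store name and items below are invented for illustration; a real aggregator would fetch the feed over HTTP on a schedule:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed of the kind a neighbourhood store might
# publish (hypothetical content for illustration).
feed_xml = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Corner Store - What's New</title>
    <item><title>Fresh mangoes in stock</title></item>
    <item><title>Weekend discount on rice</title></item>
  </channel>
</rss>"""

# Parse the feed and list the new items, as an aggregator would.
root = ET.fromstring(feed_xml)
channel = root.find("channel")
print(channel.findtext("title"))
for item in channel.findall("item"):
    print("-", item.findtext("title"))
```

The same few lines of parsing work for any RSS 2.0 source, which is the point: one aggregator on the 5KPC can follow the bank, the school and the corner store alike.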

For years, people have talked of interactive TV with set-top boxes. That hasn't happened. The computer is the ideal interactive device. In effect, the 5KPC will create a whole new ecosystem of interactive services targeted at home users. The 5KPC is what can make the vision of a connected computer accessible to every family a reality.

Tomorrow: Moreover
