OpenOffice Future

From an interview with Sam Hiser of the OpenOffice.org Marketing Project (from NewsForge) on what’s missing: “…a few key features – which we are presently working on – like calendar, scheduling and mail handling, or full database integration, to which office suite users have become accustomed; we already have the killer file format.”

When asked about some of the major goals for the software going forward, Sam says:

– All languages supportable. Not 35, but 135! (23 available today)
– Great integration with all supportable OS platforms
– Full integration with Mac’s OS X Aqua!
– Groupware across all platforms
– Database (some integration is available today to experts)
– a great implementation of XML in the file format (done)
– DVD integration with Impress (OOo’s presentation module)

My wishlist:
– better display of Microsoft files: every once in a while some formatting issues crop up
– use of OO as a web service (essentially, as a programmable component; see the sketch after this list)
– blog support: add integration with Radio / MovableType / Traction
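
On the programmable-component wish: OpenOffice.org already exposes its functionality through the UNO API, which another process can reach over a socket. A minimal sketch in Python using the PyUNO bridge follows; it assumes an office instance has been started to listen on port 2002 (an arbitrary choice), e.g. with soffice "-accept=socket,host=localhost,port=2002;urp;".

    import uno

    # Connect to a running OpenOffice.org instance that was started with
    # a socket listener (see above); port 2002 is an arbitrary choice.
    local_ctx = uno.getComponentContext()
    resolver = local_ctx.ServiceManager.createInstanceWithContext(
        "com.sun.star.bridge.UnoUrlResolver", local_ctx)
    ctx = resolver.resolve(
        "uno:socket,host=localhost,port=2002;urp;StarOffice.ComponentContext")

    # Ask the Desktop service for a new, empty Writer document...
    desktop = ctx.ServiceManager.createInstanceWithContext(
        "com.sun.star.frame.Desktop", ctx)
    doc = desktop.loadComponentFromURL("private:factory/swriter", "_blank", 0, ())

    # ...and put some text into it. The same calls, made from inside a web
    # application, would let a server generate or convert documents.
    doc.getText().setString("Hello from another process")

Wrapped behind an HTTP interface, calls like these are what would make OO usable as a web service for document generation and format conversion.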

Comments Amy Wohl on the same interview with Sam Hiser, using it to talk about innovation:

I don’t expect OpenOffice to push Microsoft out of the market any time soon, but it (1) presents an increasingly appealing alternative, particularly for customers who are looking for something lighter weight, and (2) being free or low-cost (Sun’s StarOffice is not free) can be a big advantage in developing markets.

Nearly every day — certainly every week — I get to look at great software, some of it very innovative, being built by clever people. Some of it will never get to market and some of it will, of course, not succeed. It’s a rule of innovation that if one in ten projects succeeds, we’re doing well.

Microsoft may have an advantage — fair or unfair — based on its enormous success in operating systems and office applications, but it’s far from being the only game in town.

Open Spectrum

Kevin Werbach has written a white paper [PDF HTML] on “Open Spectrum: The New Wireless Paradigm”. His conclusion (more from a US point of view):

We are living under a faulty set of assumptions about spectrum. Licensing may have been the only viable approach in the 1920s, but it certainly isn’t in the first years of the 21st century. We take it for granted that companies must pay for exclusive rights to spectrum, and that once they do, they must invest in significant infrastructure buildout to deliver services. We also take for granted a pervasive level of regulation on how spectrum is used, which would be intolerable for any other medium so connected to speech. We assume that market forces, if introduced into the wireless world at all, must be applied to choices among monopolists rather than free competition. We make these assumptions because we can’t imagine the world being otherwise.

Open spectrum technologies force us to rethink all of our assumptions about wireless communication. By making more efficient use of the spectrum we have, open spectrum can effectively remove the capacity constraints that limit current wireless voice and data services. By opening up space for innovation, it could lead to the development of new applications and services. It could provide an alternative pipe into the home for broadband connectivity. And it could allow many more speakers access to the public resource of the airwaves.

Using technologies like WiFi is critical for emerging markets: they can leapfrog wired networks and create a high-speed wireless data infrastructure. Kevin has an extensive discussion on WiFi and its applications.

A Better Browser

Writes InfoWorld: “Rich Internet applications aim to smarten up browser-based interaction by making better use of client-side processing capacity. By leveraging tools such as Java applets and client-side databases to greater processing advantage, rich Internet applications improve functionality while reducing the number of calls to the server for information.”

What is needed is to embed real-time capabilities in the browser, so that a Web server can push updates and notifications out to the client. I think KnowNow has done this by running some software in the client browser which keeps the server connection open. It would be nice to do this for our digital dashboard; a rough sketch of the idea follows.
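
I do not know KnowNow’s internals, but the general trick (sometimes called long polling) can be sketched: the server simply refuses to answer a request until it has an update, and the browser issues the next request as soon as an answer arrives. A minimal, hypothetical sketch of the server side in Python:

    import http.server
    import queue

    # One shared queue of updates; a real dashboard would track one per client.
    updates = queue.Queue()

    class PushHandler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            # Hold the connection open until an update is available;
            # from the browser's side this looks like server push.
            body = updates.get().encode("utf-8")  # blocks until data arrives
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        updates.put("news: markets open higher")  # a hypothetical notification
        http.server.HTTPServer(("localhost", 8000), PushHandler).serve_forever()

On the client, a small script (or a hidden frame that keeps reloading) consumes each response and immediately issues the next request, so updates appear as they happen.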

TECH TALK: Technology’s Next Markets: Recycled Computers

Tech Talk: Let’s talk about the first idea: computers for USD 100. How do you propose to make that happen?

Deviant Entrepreneur: Recycle computers! Why do we discard computers after just 3-4 years of use? Consider this: in the US, even after the slowdown, business desktops are being upgraded after 41 months, which is long by their standards. We keep using cars, homes, even TVs. But computer users want the latest and greatest. In the context of what most people use computers for – sending and receiving email, accessing the Internet, writing letters, doing spreadsheets, making presentations and instant messaging – it simply does not make sense to have multi-gigahertz desktops.

But the problem is with the software that runs on these machines. As Aaron Goldberg of Ziff-Davis says in Fortune, “Windows XP doesn’t run noticeably faster on your 2.4 GHz Pentium 4 than it did on your 700 MHz Celeron.” Computers need to be upgraded simply to keep up with the newest software which runs on them.

Let us rethink this architecture. Instead of thinking about fat desktops, what if we used fat servers and thin clients? There is nothing new about this idea – Sun has been advocating it for more than a decade, and Novell had the same concept with its NetWare. But at that time, applications were client-centric, which drove the demand for increased processing power and memory, and LANs weren’t fast enough. Now the situation is different: applications are increasingly server-centric, and LANs run at speeds of 100 Mbps or more, allowing for server-centric computing.

In this situation, think about the following: what if the world’s emerging markets took the computers discarded by the developed markets and made them into thin clients? These clients don’t need a hard disk or CD-ROM drive; they need just the bare minimum of processing power and memory to run a windowing server (like the X server). Essentially, the recycled PCs become graphical terminals, connected to thick servers. All computing and storage happens on these servers.
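
As a rough illustration (not something from the interview itself), the entire client-side software of such a terminal can reduce to launching a local X server that asks the thick server’s display manager for a login screen via XDMCP. A sketch in Python, where “appserver” is a hypothetical hostname for the thick server:

    import subprocess

    # Boot-time job on the recycled PC: start the local X server and have it
    # query the thick server's display manager (XDM/KDM/GDM) over XDMCP.
    # "appserver" is a hypothetical hostname; XDMCP must be enabled there.
    # Every application the user then sees actually runs on appserver --
    # the old PC only draws pixels and forwards keystrokes and mouse clicks.
    subprocess.run(["X", "-query", "appserver"])

Because all the state lives on the server, a dead terminal can be swapped for another recycled PC with no data loss.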

Of course, it is possible to consider new, stripped-down computers which cost USD 200-300 (in most cases without a monitor, which will cost an additional USD 80-100). One could use these also, but it is definitely possible to get the older PCs at prices of USD 100 or so, shipped from the developed markets – where they clutter up landfills and are an environmental hazard – to the emerging markets, where they can continue their life. Motherboards don’t die – they keep running and running and running! So, let’s use them.

At the heart of this thin client-thick server solution is the need for software which can convert these inanimate thin clients into livewire desktops. That is the real challenge. And that is where Linux and open source come in.

TT: Will people use old computers?

DE: The analogy is Amul’s Rs 20 (40 cents) pizza. It does not take away pizza eaters from the likes of Domino’s or Pizza Hut, but it opens up a whole new market for whom Amul makes the taste of pizza a reality.

So, the answer depends on the target market. People like you and me who are very computer-savvy (so-called power users) will not. But there is an entire segment at the bottom of the pyramid for whom there is no alternative. As Clay Christensen would put it, the crummy old PC can actually delight these users – for them, this will be their first (and possibly only) taste of computing. After all, who cares what’s under the hood? As long as the solution works and the computing experience is nearly as good, people will use them.

Tomorrow: Thin Client-Thick Server