Eventually the battle for Wi-Fi users will likely be fought between cellular carriers and wired network access providers (dial-up, DSL, and cable). From a marketing perspective, cellular is synonymous with wireless access, and upcoming dual-mode (GPRS/Wi-Fi) hardware might give subscribers automatic access to the best available data network wherever they are at any moment — something only a cellular data operator can currently offer. But as a group, the cellcos are inexperienced at billing for data.
When it comes to the all-important question of charging users, the landline access companies are better positioned: They have the billing relationships with data customers and are already supporting hardwired connections to the Internet. Adding Wi-Fi access points is not terribly expensive, and the acquisition of a few WISPs would make it easier.
Writes Robert Cringley of InfoWorld: “Although Microsoft’s forthcoming CRM solution is positioned as a down-market offering, when it ships in December it will come with two connectors to SAP via bundled BizTalk technology. It will also coincide with an initiative rebranding all of Microsoft’s business applications and will be followed by an ASP-based version of the CRM offering targeted directly against Salesforce.com. Also on tap are third-party offerings that bundle a low-cost computer telephony feature and a partner relationship management app.”
He adds: “With Office and Sharepoint built into a fresh application that has no code from either Great Plains or Navision, two relatively recent Microsoft acquisitions, there is nothing ‘down market’ about this application push into the enterprise.”
Anurag Phadke from our company has contributed to LTSP (Linux Terminal Server Project): he has “documented the process of building a custom kernel for an LTSP workstation, and also included information for using the LPP (Linux Progress Patch) to display a nice graphical screen during the bootup.”
Business Week (July 8, 2002) had an article entitled “The End of the Road for Bar Codes”, which discussed how RFID could revolutionise retailing:
In stores, RFID enables salespeople to locate and read tags at a distance–a big advantage over bar code systems, which require line-of-sight between the reader and the colored stripes. In warehouse settings, RFID systems can inventory an entire building full of goods with minimal human supervision.
In March, Marks & Spencer, one of Britain’s largest retailers, began replacing bar codes with an RFID system in its $4.4 billion fresh-food business. The company is putting radio tags on all of the 3.5 million plastic containers it uses to tote food from suppliers to stores. Before, suppliers had to print bar-coded labels for each of the 7 million bins M&S handles every week. Now, each time one of M&S’s 340 suppliers packs one of these bins, the system encodes shipment data–including product codes, quantities, and expiration dates–onto RFID tags embedded in the carton. Thanks to increased handling efficiencies, less spoiled food, and fewer lost shipments, M&S expects to recoup its $3 million smart tag investment in three years.
To bring the cost [of the smart tags] down, Massachusetts Institute of Technology is heading an ambitious project called the Auto-ID Center, backed by 52 companies including Procter & Gamble, Coca-Cola, and Wal-Mart. The plan is to develop a common language of electronic product codes to identify billions of items, and to slash the cost of RFID tags to 5 cents by 2005–cheap enough to embed a tag in each can of Coke.
In a recent article, the Economist (August 15, 2002) wrote about the future direction of tracking technologies: The new generation of tracking devices combines two existing technologies. One is a global-positioning-system (GPS) chip, which uses radio signals from a network of satellites to work out where it is on the earth’s surface to within a few metres. The other is a mobile-telephone chip, which broadcasts that location to whoever needs to know it. The result is a pocket-sized, or even wrist-sized, personal locator.
It gives the example of Applied Digital Solutions (ADS), of Palm Beach, Florida, which calls its version of the technology a digital angel:
The angel comes in two versions. People get a pager-like device that clips on to their clothing. Animals get a collar. The digital angel can also issue an alert when its wearer has fallen down, or when there has been an unexpected change in local temperature of the sort that might be caused, say, by someone falling into a pond. For that to happen, the wearer needs to sport a specially modified wristwatch which has suitable sensors and a wireless link to the pager.
ADS’s device is a type of radio-frequency identification (RFID) chip. When an RFID chip is interrogated by a reading machine operating at the right radio frequency, its antenna picks up a small amount of energy from the signal. This is used to power the chip. The device then broadcasts data in the chip back to the reader.
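The read cycle described here (the reader’s signal powers the chip, which then broadcasts its data) can be sketched as a toy simulation; the class, threshold and field names below are illustrative, not any vendor’s actual interface.

```python
class PassiveTag:
    """Toy model of a passive RFID tag: it holds data but has no
    power source of its own."""

    def __init__(self, data, wake_threshold_mw=1.0):
        self.data = data
        # Minimum harvested energy needed to power the chip (hypothetical units).
        self.wake_threshold_mw = wake_threshold_mw

    def interrogate(self, signal_power_mw):
        # The antenna picks up a small amount of energy from the reader's
        # signal; only if it is enough does the chip wake and reply.
        if signal_power_mw >= self.wake_threshold_mw:
            return self.data
        return None  # tag stays silent: out of range or too weak a signal


tag = PassiveTag({"id": "A123", "species": "dog"})
print(tag.interrogate(signal_power_mw=5.0))  # in range: tag broadcasts its data
print(tag.interrogate(signal_power_mw=0.2))  # too far away: no reply
```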
Glover, in his HBR article, provides a glimpse of the future of object-to-object communications: Already, sensors and networks of readers are making possible sophisticated “four walls” applications that can be carried out within the confines of a facility, an organization or (if the company is large, like Wal-Mart) a corporate ecosystem. Someday we may well have a universally accepted standard for communication and a globe-spanning infrastructure of readers. At that point, objects will have wide-ranging and deep conversations with other objects, and their silent form of commerce will be the rule. Glover gives examples of the possibilities: A-to-Z Product Tracking, Proactive Products, Variable Pricing and Continuous Selling.
Think of RFIDs as the next wave in communications: we first had people talking to people (through language and publications), then we had people interacting with computers (through the Internet, HTML and HTTP), now we are seeing applications talk to other applications (through Web Services). The next leap will be objects talking to other objects. When that happens, life will, truly, never be the same again.
Writes Business 2.0: “Blogs are also powerful knowledge management tools…Blogging is attractive as a vehicle for personal expression because it’s an easy way to capture, comment on, and keep abreast of interesting tidbits of information. The same characteristic makes blogging well-suited to businesses that want to track information about products and markets, or distribute information to employees and customers. You see something interesting on the Web, and within seconds you can put a link to it on your weblog, add some comments, and be on to something else. Naturally, other bloggers are doing the same thing. Over time, your own blog and the other blogs you spend time reading develop into a big, interconnected web of information. It’s like a quick-and-dirty, easy-to-use knowledge management system. ”
This echoes our line of thinking…we have in fact set up blogs within our (small) organisation.
Slashdot has an interesting discussion as a reader asks: Slashdot | Can We Finally Ditch Exchange?: “With all the innovations going into open source software these days, why do I still need to run Exchange to meet my clients’ needs? Even when demonstrating technology like LTSP mixed with any combination of OpenOffice, Star Office, even Codeweaver’s Crossover Office running the latest Microsoft suite, the clincher is always over Exchange functionality. I’m aware of Bynari’s InsightServer (coincidentally, I noticed on that page that their code is for sale) and have started using that as a possible closer, and the cost is much less prohibitive than Exchange W2K server CALs; but why isn’t there an open source solution to this problem yet?”
Glenn Fleishman reviews the new Mac OS (v 10.2), codenamed Jaguar. [via Scripting.com]
The UI is what will make a big difference for our Thin Client, and we should see if we can be inspired by Apple’s OS X rather than Microsoft’s Windows.
News.com asks the question, and adds: “Web services can be implemented in basically two ways–either by exposing an application interface as a Web service or as a “document-based” Web service. One is not as good as the other….The alternate approach of document-based Web services, also known as “messaging style” Web services, provides several capabilities that make it more viable.”
While I don’t fully understand the article, I think it’s important to read and remember.
We’ve all been through the experience. After a long flight, we wait at baggage claim, waiting it seems forever and wondering if the bags will arrive. We check different bags (because they all seem so alike). On especially bad days, the bags don’t arrive (for a long time) and we go through the horror of imagining their loss! Wouldn’t it be nice if bags could talk with our boarding pass, and send out alerts (a) if they weren’t on the same flight as us, and (b) when they are close to us as they make their way along the baggage conveyor belt? That day may be coming closer, thanks to a technology called radio frequency identification (RFID).
RFID systems consist of smart tags and reader devices. The tags send out radio frequency signals, which can be picked up over a short range by readers. Unlike bar codes, which can carry very limited information, smart tags can store and broadcast object-specific information, giving each item its own unique identity and history. This aspect of RFID systems is creating applications which may today seem like science fiction, but will quickly become reality.
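A minimal sketch of the difference, with hypothetical fields: a bar code names only the product class, while a smart tag gives each physical item a unique identity plus an accumulating history.

```python
from dataclasses import dataclass, field

@dataclass
class BarCode:
    # A bar code identifies only the product class: every can of the
    # same drink carries the same number.
    product_code: str

@dataclass
class SmartTag:
    # An RFID smart tag gives each individual item its own identity
    # and can accumulate a history as it moves through the supply chain.
    product_code: str
    serial_number: str
    history: list = field(default_factory=list)

    def record(self, event):
        self.history.append(event)


can_a = SmartTag("COKE-330ML", "SN-0001")
can_b = SmartTag("COKE-330ML", "SN-0002")
can_a.record("packed 2002-08-01")
can_a.record("shipped 2002-08-03")

# Same product class, but each item is individually distinguishable:
print(can_a.product_code == can_b.product_code)        # True
print(can_a.serial_number != can_b.serial_number)      # True
print(can_a.history)
```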
Glover Ferguson, writing in Harvard Business Review (June 2002), provides some examples:
Another example comes from the New York Times (July 7, 2002): Millions of motorists in the Northeast have discovered the convenience of E-ZPass, which lets them move quickly through toll stations as electronic readers automatically deduct their fees. The system has become so popular that the consortium of states that operates the technology has increased its projections for its use to 53 percent of vehicles, from 35 percent. The paper also talks about SpeedPass, which lets customers pay for gasoline and convenience-store products at Exxon and Mobil service stations.
The article says that RFID’s convenience is now opening it up to new uses in mobile commerce: RFID systems are much faster than other types of payment. There is no fumbling through a wallet, no punching in personal identification numbers, no signatures and, most certainly, no Web browsing. All that is needed is a tiny device called a transponder that might hang on a customer’s key chain and is waved in front of an electronic reader like a magic wand.
Tomorrow: RFIDs (continued)
For many years, India has been the poster child of the offshore software development industry…But a serious rival now is emerging, one with the resources and determination to take on India — Russia.
The technology sector in Russia achieved $3 billion in revenue last year, up 19% from the previous year. Offshore software development now is a large slice of that total, growing at an estimated rate of 50% a year.
“Our research shows that Russian development resources have stronger math skills and are often used to develop algorithms and complex formulas,” said analyst Laura Carrillo of Boston-based AMR Research.
Carrillo pinpoints the Russian education system as offering high-tech workers there a competitive advantage. In Indian universities, students learn generic development and mass-produced coding for Java and C++.
“Russia takes a higher-level approach, picking individuals more carefully in a similar manner to MIT,” said Carrillo. “As a result, Russian programmers and developers are more schooled on advanced math and computing techniques than their Indian counterparts.”
Must look at visiting Russia sometime soon!
An interesting article in the Economist concludes that in the wired world, physical presence is becoming even more important:
Tony Venables, an economist at the London School of Economics, believes that businesses that thrive on face-to-face communications — or what some call F2F — now account for a growing share of economic activity.
Michael Storper of the University of California, Los Angeles, has written a paper with Mr Venables on Buzz: The Economic Force of the City. They argue that cities are where information and ideas are developed and swapped. But not all information is equal. Some (a bank statement, say, or a booking) is easily codified and electronically swapped; while some (I have a deal for you, why don’t we do it this way?) requires context and trust to be meaningful. It is the second kind of information that requires F2F.
Writes Economist: “Companies are not only spending much less on IT now, but they are also spending differently. Software vendors, in particular, can no longer depend on quick multi-million-dollar deals, but must work harder to win contracts that tend to be much smaller. More importantly, customer priorities have changed. Rather than buying e-business software or new computers, companies want gear that helps to cut costs, improve security and integrate existing software applications.”
The article goes on:
Being forced to do more with less, IT managers are coming to like Linux, the free operating system. Linux and the universe of open-source businesses that surround it are one of the few areas of the technology business that is actually growing. Almost a fifth of server computers sold by Dell now have Linux installed rather than Windows. Sun Microsystems has begun offering Linux servers, and might soon add a Linux PC to its product line.
If this trend towards openness continues, the IT industry will probably have to live with lower profit margins than in the past. Either way, argues Steven Milunovich, an analyst with Merrill Lynch, profits will shift away from hardware, which is becoming more and more commoditised, to software, services and consulting. IBM forecasts that these last three will account for 58% of industry profits in 2005, up from 42% in 2000.
The buzz is growing around Web conferencing, but you’ve got to listen closely to hear it.
Since the futuristic dreams of the “picture phone” first appeared in the 1960s, engineers have aspired to blend voice, data and video into a single service.
It’s arrived, but the emerging reality is that businesses use conferencing for limited purposes, not as a broad communications medium. Consumer use is still years away.
A conferencing application that might be ideal for a training meeting is often inadequate for an interactive sales presentation or a product design summit. Success comes from exploiting niches, not serving waves of demand. And any notion that conferencing would spur Web usage like instant messaging did for e-mail is misplaced.
Have been meaning to do a detailed write-up for quite some time. I finally got some time recently, and put this longish note together. It consolidates a lot of my thinking in recent months. (It needs to be linked with some of my earlier posts to add depth – will get around to doing that sometime later.)
Emergic is about creating a software platform which brings down the cost of technology by a factor of 10, thus making it affordable for consumers and enterprises in the world’s emerging markets.
Emergic is about realising Bill Gates’s vision of a computer on every desktop and in every home, a vision which has not yet gone beyond the world’s 10,000 large companies and 500 million consumers, most of whom are in the world’s developed markets.
Emergic is going to become the computing platform for the next 500 million consumers and the world’s 25 million SMEs who have not been able to adopt technology because of its dollar-denominated pricing.
Emergic is targeted at the world’s emerging markets, because they are where technology has not yet penetrated deeply, and yet, for whom, technology offers perhaps the last opportunity to better integrate into the world’s value chain and improve the standard of living for their people.
The Emergic architecture provides 4 key benefits, which are unmatched by any existing solution in today’s marketplace:
1. Brings down the cost of hardware on the desktop to USD 125-150
2. Brings down the cost of software on the desktop to USD 5-10 per person-month
3. Creates a solution that is easy to manage and scale through server-based computing
4. Integrates with the Windows world, by supporting MS Office file formats
Emergic is a solution which can thus bring down the total cost of ownership of technology (hardware, software, training, support) to no more than USD 15-20 per person-month. More importantly, from an emerging market perspective, it ensures that most of the IT spend is recycled among local companies, thus providing a fillip to the domestic IT industry, especially the independent software vendors (ISVs).
The key components that comprise Emergic:
Thin Client-Thick Server
Tentatively christened Emergic Freedom, this is the Linux-based platform which can bring down computing costs dramatically by leveraging older computers (anything from 486 machines with 16 MB RAM upwards will work) and combining them with a Linux-based desktop (KDE) and a set of open source applications: Evolution for email and calendaring, Mozilla for browsing, OpenOffice for the productivity applications (word processing, spreadsheet and presentation), and GAIM for instant messaging, providing a single-window login to Yahoo, MSN, AOL, ICQ and Jabber. Think of TC-TS as providing new wine in an old bottle.
What is different about Thin Clients this time around? After all, they’ve been talked about ever since computing began.
The major difference is the re-use of older hardware. We use older cars, older manufacturing plants and older homes, but we don’t tend to use older computers. The problem is that on the desktop, we do not get access to older software from Microsoft! Only the latest versions of Windows and Office are available, all of which require the latest processors and plenty of memory. It is a vicious circle which forces upgrades of desktop computers every 3-4 years. It also maintains Microsoft’s monopoly on the desktop: there is no alternative, because the world uses the DOC, XLS and PPT file formats.
The price points of USD 700-900 for a new desktop and USD 500 for MS Windows and Office may be fine in countries like the US, UK, Germany and Japan, but they pinch a lot (and are almost unaffordable for the masses) in countries like India, China, Brazil and Mexico. For these low-income markets, hardware and software need to cost a tenth as much, in line with what their people earn, for technology to go mass market and become a utility. Older PCs can now be leveraged as Thin Clients, without sacrificing performance, the desktop look or the applications.
The world’s developed markets have been saturated with technology. New PC sales now imply upgrades, creating a huge supply of older computers. The PCs that are being retired still have a lot of life left in them; after all, they are no more than a few years old. These PCs, which incidentally have become an environmental hazard in the developed markets, can now be shipped to the emerging markets, where the Emergic platform brings them to life once again on the desktops and in the homes of people who may never have tasted computing. These PCs can be available for USD 100 or so in large numbers, or USD 125-150 in smaller quantities. Add USD 50 if one uses only the old motherboard and goes in for a new keyboard, mouse and monitor.
This happens because of the use of Linux in a server-computing mode. All the applications run off the server, with only the display happening on the client side. In that sense, the desktop becomes a Terminal. The difference this time around is that the Linux-based Thin Client has all the key applications that the majority of people need (email, browser, instant messaging, word processing, spreadsheet, presentation), and there is no performance penalty even though the applications run off the server. The most recent releases of the open source versions of these applications are now more than good enough, both from the point of view of stability and of Windows compatibility.
Server-based computing using Linux (and built on the X protocol) is now possible because LAN speeds have gone up to 100 Mbps enabling the transfer of a lot more data over the same network. The result is that a Thick Server (which is actually a new desktop with 1 GB RAM and 2 hard disks in a software RAID configuration) can easily support 30-40 users. Such a Server would cost about USD 1,500-2,000, implying a per client cost of no more than USD 50.
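The per-seat arithmetic above is worth making explicit, using the figures quoted in the text:

```python
# Per-client cost of the Thick Server, with the figures from the text.
server_cost_usd = 2000   # upper end of the USD 1,500-2,000 range
users_supported = 40     # upper end of the 30-40 users per server range

per_client_cost = server_cost_usd / users_supported
print(per_client_cost)  # 50.0 -> "no more than USD 50" per client
```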
Taken together, the Thin Client and Thick Server combination not only brings down the cost of both hardware and software by 90%, but also provides the IT manager complete control of the client desktop from the server. What every user sees on their Thin Client can be standardized and controlled from the Thick Server itself. In addition, the use of Linux does away with all the virus-related problems. Server-based computing also centralises management of data (backup and restore, for example).
The Thin Client-Thick Server solution is ideal for three environments:
1. Where there is software piracy (illegal copies of Microsoft Windows and Office). At one stroke, all users can now be given legal compatible software for a small incremental cost (10% or less of what MS Windows and Office would cost). Considering that Microsoft and the Business Software Alliance have been targeting businesses in emerging markets, the Emergic architecture provides an alternative which is not disruptive and can be implemented very rapidly.
2. Where there is an abundance of older PCs and the upgrade costs are proving to be prohibitive. This is where the same older PCs can now be converted into Thin Clients, thus protecting the investment that has already been made.
3. Where computing is not present. This means providing a computer on the desktop of every person in the organisation. Consider the economics: the Thin Client-Thick Server solution costs about USD 15-20 per person-month. This means that if a person earning USD 150-200 per month can become 10% more productive using a computer, the investment pays for itself from day one! The computer is perhaps the most transformative invention of the past century and the defining device of today’s Information Age. It has significantly improved productivity wherever it has been utilised. The emerging markets of the world and the bottom of the enterprise pyramid are where computers have not yet penetrated because of their costs. The Thin Client-Thick Server solution now makes it possible to level the technology playing field for the SMEs in the emerging markets.
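The payback claim in point 3 reduces to simple arithmetic, again with the figures from the text:

```python
# Does the computer pay for itself? Figures as quoted in the text.
salary_per_month = 200        # USD per month (the 150-200 range, upper end)
productivity_gain = 0.10      # the assumed 10% improvement
solution_cost_per_month = 20  # USD per person-month (the 15-20 range, upper end)

monthly_value = salary_per_month * productivity_gain  # 20.0
# The extra output alone covers the cost of the solution:
print(monthly_value >= solution_cost_per_month)  # True
```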
The limitation of course is that this solution will not support native Windows applications. There are some alternatives:
1. Try out Wine and other solutions on Linux (Win4Lin, Crossover Office) to see if they will support Windows applications.
2. See if the client application can work in a browser; the back-end can be Windows 2000 or anything else.
3. Use vncviewer to provide a Windows desktop (on a Linux Thin Client) grabbed from a Windows machine (only one user at a time).
4. Use the Windows Terminal Server to provide multiple users access to Windows on the Thin Client.
The way I look at it is that our primary focus should be on users who are either non-users or need just the base set of applications. We are not trying to position this as a Windows-clone. This solution will co-exist with Windows. Our approach is to look at whom we can delight rather than whom we may disappoint.
The Thin Client-Thick Server as envisioned here leverages the R&D that has already been done around the world. No solution for the bottom of the pyramid can be cost-effective if it involves significant R&D efforts (and the time associated with them), especially in technology, where the research costs are huge. By lagging hardware technology by a few years but creating software using the latest modules and standards, we create a good and potent mix. In fact, the operative phrase for much of what Emergic is about is value-added aggregation. The innovation needs to lie less in new technology and more in how one can best leverage the technologies which already exist.
If we think of Emergic as a 40-cent (Rs 20) pizza, then the Thin Client-Thick Server solution creates the pizza base, and a 20-cent (Rs 10) one at that! What we need next: the Cheese and the Toppings. That is where the Digital Dashboard and the eBusiness Applications come in.
Digital Dashboard is to applications on the desktop and in the enterprise what Samachar.com is to India-related news sites: both bring together everything that users want on a single page. Samachar brought together news headlines from dozens of Indian newspapers on a single page, along with categorized news links. It became the de facto home page for Indians worldwide. What Samachar did was value-added aggregation, leveraging existing standards (HTML and HTTP) to automatically put its page together. What the users saw was that it increased the information that they could process on India by a factor of 10: that is, in the same amount of time that they spent on India every day (3-5 minutes), they could now get many more headlines and even go to their local city news (perhaps in their native language) by following one of the news links. Samachar also worked because it was one page; my belief is that every link a user has to click on reduces by half or more the probability that the user will actually click (especially if one is expected to do so every day). Scrolling up and down for reading is still much easier.
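As an illustration of the value-added aggregation idea, here is a sketch of a Samachar-style headline aggregator. The site URLs and the HTML pattern are hypothetical: a real scraper needs a pattern tuned to each paper’s markup, since every site’s HTML differs.

```python
import re
from urllib.request import urlopen

# Hypothetical source list; each entry would be a real paper's front page.
SOURCES = {
    "times": "http://example.com/times/front.html",
    "express": "http://example.com/express/front.html",
}

# Hypothetical headline markup: <a href="..." class="headline">Title</a>
HEADLINE_RE = re.compile(r'<a href="([^"]+)"[^>]*class="headline"[^>]*>([^<]+)</a>')

def scrape(html):
    """Pull (url, title) pairs out of one front page's HTML."""
    return HEADLINE_RE.findall(html)

def build_page(sources):
    """Value-added aggregation: one page with every paper's headlines.
    (Fetches over the network, so it needs live source URLs.)"""
    sections = []
    for name, url in sources.items():
        html = urlopen(url).read().decode("utf-8", "replace")
        links = "".join(f'<li><a href="{u}">{t}</a></li>' for u, t in scrape(html))
        sections.append(f"<h2>{name}</h2><ul>{links}</ul>")
    return "<html><body>" + "".join(sections) + "</body></html>"

# The scraping step, demonstrated on a literal snippet:
sample = '<li><a href="/city/mumbai" class="headline">Monsoon update</a></li>'
print(scrape(sample))  # [('/city/mumbai', 'Monsoon update')]
```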
The Digital Dashboard works at three levels:
1. It aggregates all the personal information that I need from my Inbox (mail), Calendar (appointments), Contacts, Documents (my recent/frequently accessed files), and more. This is what Outlook does, and what Evolution does to a limited extent. We want to retain Evolution as the write environment, but make the Dashboard the value-added read environment. For example, when I do a search, it should do so across my Inbox, IM logs, my contact database, files, etc. This is a super-PIM (personal information management) system. It may even integrate with SMS to provide me updates on a cellphone, thus ensuring real-time alerts.
2. It navigates the space from a PIM to a PKM (personal knowledge management) system. It does so by providing the infrastructure for blogging and RSS feeds. I can now build my third memory (after my own brain and Google). The blog aggregates news, links, quotes and comments. It can also be extended to the enterprise in the form of k-logs (knowledge blogs). So now I have a writing space (which could even be OpenOffice rather than just a text editor, or even my Mail application), with me creating public, group or private posts. I can even set up filters to take specific actions on posts that come into my weblog. This is more relevant for the third level at which the Digital Dashboard operates.
3. This is where the blog (the RSS aggregator) also subscribes to enterprise events. Thus, the different enterprise applications can be enabled to become RSS publishers which can be fed to the individual dashboards of various users, after appropriate authentication. This is where the vision of the Information Refinery (or Router) comes in.
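A minimal sketch of the aggregator at the heart of the Dashboard: it parses RSS 2.0 items from several feeds (personal or enterprise) into one list. The feed contents below are invented for illustration; a real enterprise feed would be generated by the application adaptors.

```python
import xml.etree.ElementTree as ET

def parse_rss(xml_text, source):
    """Extract (source, title, link) items from one RSS 2.0 feed."""
    root = ET.fromstring(xml_text)
    return [
        (source, item.findtext("title"), item.findtext("link"))
        for item in root.iter("item")
    ]

def aggregate(feeds):
    """Merge many feeds (source name -> RSS text) into one dashboard list."""
    items = []
    for source, xml_text in feeds.items():
        items.extend(parse_rss(xml_text, source))
    return items

# A hypothetical enterprise event feed and a personal blog feed, side by side:
crm_feed = """<rss><channel>
  <item><title>Order #1042 shipped</title><link>http://crm/1042</link></item>
</channel></rss>"""
blog_feed = """<rss><channel>
  <item><title>Notes on RFID</title><link>http://blog/rfid</link></item>
</channel></rss>"""

print(aggregate({"crm": crm_feed, "blog": blog_feed}))
```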
Let’s discuss this in more detail. The Digital Dashboard is the framework which integrates two worlds: the desktop applications world (Evolution, Mozilla, OpenOffice) and the enterprise applications world (Accounting, CRM, Databases, Spreadsheets, etc.). Today, these two worlds are not glued together. This is where the Digital Dashboard and its collection of glue tools (blogs, RSS and outliners) comes in.
This is an edge strategy. We let the applications be, and do what they do best. What we are doing is aggregating all the information from the various silos in a single place. This will initially entail writing adaptors, first for the desktop apps and then for the enterprise apps. We want to use the desktop apps for what they do best and not re-invent the wheel, because we already have open source components for mail, browsing and the PIM. However, such a situation does not exist with respect to the enterprise applications. Over time, we will need to think of re-writing the enterprise apps bottom-up in an integrated manner (this is the next topic). The Digital Dashboard works at the edges of the applications but will, in due course of time, become the core: the first screen that users see every day, and multiple times during the day.
In that sense, this strategy is very similar to Samachar’s: it worked at the edges by screen-scraping the existing news sites for the headlines, but in due course of time it became the core, the gateway which funneled traffic on to the various news sites, in effect becoming a news portal. Similarly, the Digital Dashboard will, in due course of time, become the Corporate/Enterprise portal.
I want the Digital Dashboard to become the killer app which will make even Windows users switch to the Thin Client, because it will let them process 10X the information in the same amount of time and, in due course, make them much more productive. The Dashboard software is key to our plans: it’s the connector between the Thin Client-Thick Server world (which anyone else could theoretically replicate, since much of it is built out of open source components) and the Enterprise Applications world (which needs solutions specific to businesses and countries, and in which all that we can become are the component manufacturers).
That is why I think of the Digital Dashboard as the Cheese on the Pizza Base of the Thin Client-Thick Server solution. The Toppings will vary by segment, but the cheese should taste the same.
Enterprise Software Applications
Enterprise Software is what every business needs, and yet few can afford it. The penetration of ERP, CRM and SCM applications has so far been limited to perhaps the top 10,000 enterprises in the world. Of course, that’s not bad, because these enterprises have the millions that the enterprise software vendors and the consultants charge for putting a solution together. The enterprise software applications capture the business processes and embed the information. The problem is that different enterprise apps have their own formats and models, effectively creating silos of information.
In the world that we are targeting, most enterprises will have just 1-2 applications, with accounting being the most common. Other information may be on the computer, but is likely to be in Foxpro/Microsoft-SQL Server databases or even in MS-Excel.
What we want to do here is to make available the business software applications that the enterprises need for no more than USD 5-10 or so per person-month. Compare this to the hundreds or thousands of dollars that are paid per seat for enterprise applications today. Our price point will make the apps available to the bottom of the enterprise pyramid and if we are smart (and lucky), we may be able to move up the ladder (but that is not as important).
We will begin by writing adaptors to standard applications that exist, and by defining a common business process reference model which encompasses the key processes, flows and numbers for the different types of businesses (manufacturing, trading, services, etc.). This is again an edge strategy: we are not going in and saying we want to replace the application right away. What we are saying is that we can combine data from different applications via the Digital Dashboard to provide reports and views of the business you may not have seen before. In defining these APIs, we need to leverage two sets of standards: web services (XML, SOAP, UDDI and WSDL) and business processes (RosettaNet and others).
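A toy sketch of the adaptor idea: two hypothetical export formats (an accounting package and a CRM) are mapped into one common reference model, from which a cross-application report can be produced. All field names here are assumptions for illustration, not any real product's schema.

```python
def adapt_accounting(row):
    """Adaptor for a hypothetical accounting package's export format."""
    return {"type": "invoice", "party": row["CUSTNAME"], "amount": row["AMT"]}

def adapt_crm(record):
    """Adaptor for a hypothetical CRM export format."""
    return {"type": "order", "party": record["customer"], "amount": record["value"]}

def combined_view(acc_rows, crm_records):
    """The edge strategy: leave each application alone, but map its data
    into one common reference model for cross-application reports."""
    events = [adapt_accounting(r) for r in acc_rows]
    events += [adapt_crm(r) for r in crm_records]
    return sorted(events, key=lambda e: e["amount"], reverse=True)

view = combined_view(
    [{"CUSTNAME": "Acme", "AMT": 1200}],          # accounting export
    [{"customer": "Bharat Traders", "value": 450}],  # CRM export
)
print(view)  # one report spanning both silos, largest amounts first
```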
The next step is to start putting together the actual application building blocks, evolving from the enterprise models that we have created and tailored to fit the business process standards. The Enterprise Model we need to build must be event-driven, on the lines of what Tibco’s Vivek Ranadive speaks of in his book The Power of Now. We need to think in terms of a (near) real-time enterprise. We should look at the messaging frameworks and application integration being done by companies like KnowNow, Kenamea, Bang and Juice.
Our enterprise objects would perhaps have 70-80% of the functionality. These are the enterprise Lego blocks which could be assembled by Independent Software Vendors (ISVs) into a customised solution. ISVs will play an important role because they have the last-mile enterprise/industry knowledge and the customer relationship. We should publish the APIs of all our objects, so if others build a better object, so be it. This is what Microsoft has done well: leverage the developer community. Our success here will to a great extent depend on whether we can get ISVs to extend (and replace) our enterprise objects.
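The Lego-block idea amounts to a published interface plus a registry that lets an ISV swap in a replacement block. A hedged sketch (all class names and the levy logic are invented for illustration):

```python
class InvoiceObject:
    """Default invoicing block, covering the common 70-80% case."""

    def total(self, lines):
        # lines is a list of (quantity, unit_price) pairs
        return sum(qty * price for qty, price in lines)

class ISVInvoiceObject(InvoiceObject):
    """An ISV extension adding industry-specific logic (here, a 2% levy)."""

    def total(self, lines):
        return round(super().total(lines) * 1.02, 2)

# The published registry: the rest of the system looks blocks up by
# name, so an ISV can replace one without touching anything else.
registry = {"invoice": InvoiceObject}

def register(name, cls):
    registry[name] = cls

register("invoice", ISVInvoiceObject)  # ISV swaps in a better block
amount = registry["invoice"]().total([(2, 100.0), (1, 50.0)])
```

Because callers go through the registry rather than a hard-coded class, extending or replacing a block is a one-line change for the ISV.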
Alternatively, the enterprise itself could choose to take an existing set of processes from an online library and then customise it. The latter needs the equivalent of a Visual Biz-ic: a scripting-based workflow programming environment which does for business processes what Visual Basic has done for software. We need to make this business-rules driven (like what Versata claims to have done).
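The "Visual Biz-ic" intuition is that the process lives as editable rules, not hard-wired code. A minimal sketch, assuming a hypothetical order-approval process (rule names, thresholds and fields are all illustrative):

```python
# The process is data: a non-programmer could edit these rules in a
# visual tool without recompiling anything.
rules = [
    {"when": lambda order: order["value"] > 10000, "then": "needs_approval"},
    {"when": lambda order: order["value"] <= 10000, "then": "auto_approved"},
]

def run_rules(order):
    # Evaluate rules in order; the first match decides the outcome.
    for rule in rules:
        if rule["when"](order):
            return rule["then"]
    return "manual_review"  # fall-through when no rule fires

status = run_rules({"value": 12500})
```

Changing the business process then means editing the rule list, which is exactly the kind of thing a scripting front-end could expose to the enterprise itself.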
Either way, the enterprise events and information flow to the Digital Dashboard. In parallel, we need to use inter-enterprise standards like RosettaNet (Basics) to streamline document flow between enterprises.
Our approach here should be like that of a TV serial maker rather than a film-maker: we do not want to wait for three years in the hope of a blockbuster. Rather, we want to come out with new objects every week, so we can do course correction at any time, if required.
The integrated set of enterprise applications will leverage a common database, with storage in XML. They will also follow what Ray Ozzie calls the OHIO principle: Only Handle Information Once. Data should not be replicated for storage. It is possible for us to think like this because we are building the base set of objects from scratch. We will use open source to make this happen, with EJB/J2EE as the development environment, JBoss as the application server and PostgreSQL as the backend database. We should also see how to integrate OpenOffice Calc: it is much more than a spreadsheet. We can think of it as a computational engine, with excellent reporting and graphing/charting capabilities.
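The OHIO principle can be made concrete: each business document is stored once, as XML, and every application renders its own view from that single copy. A sketch, using Python's sqlite3 as a stand-in for the PostgreSQL backend (the schema and document shape are illustrative assumptions):

```python
import sqlite3
import xml.etree.ElementTree as ET

# One table holds the single authoritative copy of each document.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE documents (id INTEGER PRIMARY KEY, body TEXT)")

order_xml = "<order><customer>Acme</customer><value>950</value></order>"
conn.execute("INSERT INTO documents (body) VALUES (?)", (order_xml,))

# Any application derives its view from the one stored copy,
# instead of keeping a replicated version of the data.
(body,) = conn.execute("SELECT body FROM documents WHERE id = 1").fetchone()
value = int(ET.fromstring(body).findtext("value"))
```

Because every consumer parses the same stored XML, there is no second copy to drift out of sync, which is the whole point of handling information only once.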
The Enterprise Applications thus become like the toppings on a cheese-pizza base. Different segments may want different toppings. Either we can provide the toppings (or the recipes), or others can do their own.
What Emergic is Missing
There are many things which we will need to look at over time to complete the picture. I have just outlined some of them here, so we do not ignore them down the line:
– Communications / Connectivity: look at WiFi
– Software for Project Management (what we all do), Content Management
– Website Publishing with eCommerce support for small businesses (like what Trellix offers)
– Content and Community: an enterprise "Reader's Digest meets Slashdot" to create clusters of SMEs
– Customisation for different verticals (e.g. schools, homes): what will need to be done differently
– SME Marketplace like what eBay has created
– MicroFinancing so that SMEs or even individuals can invest in these technologies
– SME 7-11s: tech shops in neighbourhoods which can also serve as cybercafes, demo centres, meeting points, payment collection centres and wireless hubs
– Custom Thin Clients, which use cheaper chips to create Thin Clients for USD 50 or so. My hesitation here is that I am very reluctant to get into the hardware R&D business.
– Optimising the Thin Client-Thick Server protocols in two steps: first, to bring down the LAN speed requirement to 10 Mbps so it will work over 802.11 (wireless) networks, and then bring it down to 1 Mbps so it can work on cable, thus enabling us to target the Home segment.
– Collaborative Workspaces across firewalls (like what Groove offers)
Perhaps the biggest thing missing is how we will go to market with Emergic: how we will brand, price, distribute, sell and support this, not just in India but in other markets. I don't have many answers right now, but hope to soon. This is where we will need to experiment with different ideas in the next few months.
Netcore's MailServ: A Starting Point
Our Linux-based Messaging Server includes support for Instant Messaging (via Jabber), a Proxy (via Squid), a basic Firewall, a Global Address Book (supporting LDAP) and integrated anti-virus screening. We have now begun work on the next version (4.0), which we hope to have ready by the end of the year. This product is our bread-and-butter business right now, with an installed base of over 100 customers and 300 locations all over India. This gives us the first set of prospects to talk to for Emergic, which is what we have just started to do.
The Thin Client-Thick Server, Digital Dashboard and Enterprise Applications strategies create the foundation for a new IT infrastructure based on Linux and Open Source for SMEs. Our solution cost will be a fraction of what the big players charge, but in functionality we must not be far behind. It is the strategy used by the Indian pharma companies over the last two decades as they made low-cost drugs available to the local populace (helped, no doubt, by the absence of product patent protection; in our case, we will build on open source technologies). A similar strategy has been used by Huawei in China to take on telecom majors like Cisco, Lucent and Nortel: it provides perhaps half the functionality at a fifth of the price.
There's a lot that we have to do. The vision is quite audacious and large in its scope. But I sincerely believe that many things in technology are now coming together to dramatically change computing in the next decade. The Internet was one such 10X Tsunami, which brought the world's computers closer. And yet, a large part of the world is still relatively untouched by technology, largely because it is driven by US companies and dollar-based pricing. This other world is the one that now needs to be impacted by technology. I think of Emergic as the next Tsunami, one that will bring this world's people and enterprises closer.
An interesting story in the Far Eastern Economic Review on how Manila Broadcasting Corp. is revolutionizing village radio in the Philippines by delivering the latest music and news from Manila plus local-language information about the town or neighbourhood:
Under the name Radyo Natin, or Our Radio, MBC has launched more than 400 low-power FM stations since late last year, with another 400 in the pipeline. This network of tiny radio stations represents an effort by MBC to convince national advertisers that they can reach virtually every consumer in the country at the local level. For the stations’ operators, the money they make will depend on convincing small local businesses to advertise, while also selling blocks of the stations’ airtime to local politicians, religious institutions and others with a desire to reach the community.
Using a combination of low-cost transmitters and satellite-programming muscle developed in Manila as part of its traditional radio network, MBC, the oldest broadcaster in the Philippines, stands poised to revolutionize local radio. For the first time, tiny local stations are able to deliver the latest music and news from Manila in tandem with local-language news and information about the village or the neighbourhood.
eWeek’s article on the lack of corporate acceptance of a Linux desktop quotes Andrew Care, CIO for Air New Zealand: “What is needed before we consider moving is an office productivity suite that has functionality and applications comparable to Microsoft Office. But, even more importantly, any Linux desktop will have to be completely compatible with Office and be able to translate and read all documents, templates and spreadsheets 100 percent.”
Also from the same article:
Nat Friedman, co-founder and vice president of Linux desktop developer Ximian Inc., of Boston, agreed that interoperability with Office is the biggest issue in corporate adoption of the Linux desktop.
“For a long time, usability was the big issue, but that is no longer the case. Microsoft protocols and file formats are. It takes us two years to write compatibility with any Microsoft product into ours,” Friedman said.
I don’t see anyone worrying about the next set of computer users. Windows and Office have won the battle for the first 500 million computer users in the first two decades of computing. In the next decade, we are going to see another 500 million users adopt computing. They are the ones at whom the Linux desktop should be aimed, because they cannot pay USD 700 for a PC and USD 500 for Windows and Office; they can afford only a fraction of that. They are in the world’s developing countries.
Writes News.com on the launch of BEA’s new product, “which glues together all types of business information stored in various locations, allowing people to search and view that data as if it were stored in one location”:
The problem is as old as the computer industry itself. Despite years of development in database and infrastructure software, it’s still difficult–and sometimes impossible–to search across a corporate network for all e-mails, documents and spreadsheets related to a specific project, for instance. Searching through video, audio and image files is kludgy at best.
“So why is everyone trying to do this? It’s an age-old problem: Old data never goes away. It hangs around forever. So as you keep adding new applications and new databases, it gets more complex to try to make information integrate,” said Mike Gilpin, an analyst with Giga Information group. “Customers today have a concern with the difficulty in doing data integration. Its complexity is always an obstacle. It’s in the interest of IBM, BEA and others to remove those obstacles.”
This seems to be more of what a Digital Dashboard and Information Refinery should be doing. It would be interesting to see what BEA is doing, because we are trying to solve similar problems at a much smaller scale, for the smaller companies. Blogs are going to play a key part in this future.
Ubiquity (ACM) has an interview with Annabelle Gawer, the co-author of “Platform Leadership”. She talks about innovation and Linux:
[Openness] increases tremendously innovation that complements what was previously written. The old school of thought about innovation was that you had to protect everything in order to prevent substitute innovation. If you protect your innovation by a patent, it becomes unlawful to come up with a substitute during the time of the patent. The idea is a fundamental insight of economics, which is that you should protect the incentives of the innovator, because it’s very hard work to innovate. There’s a lot of trial and error, but once you’re done it’s easy for someone else to imitate you. If there were no protection then the innovators would stop trying to innovate because they would not have economic benefit. This philosophy underlies our whole patent system. Now, what the Linux story uncovers in a blatant way is that there are other kinds of incentives to innovate. It shows that you can open things up and not stop innovation. It might stop competitive innovation, but it doesn’t stop collaborative innovation. Those are the pluses of the Linux story.
The minuses are: How do you maintain such a product over the long run? Who is going to ensure the maintenance, which is less exciting than being a wonderful hacker inventing genius code? You need to have sound commercial organization behind it, and for that kind of thing to happen you need to have the traditional business incentives and therefore you need to protect from imitation. How do you reconcile openness and closeness? How to reconcile creativity and maintenance over the long run hasn’t been resolved yet. But I think the Linux story has expanded our understanding of the phenomenon of innovation.
Also see: my earlier post on this topic.
Writes Bloomberg: “MySQL and others are starting to eat into the $8.8 billion market for database software dominated by Oracle, International Business Machines Corp. and Microsoft Corp., users said. Yahoo, which uses MySQL to run the Yahoo! Finance Web site, may replace some Oracle databases with MySQL, said Jeremy Zawodny, a computer engineer at the Internet company.”
Among the other free databases is PostgreSQL. MySQL is under the GPL, so one cannot bundle it in a commercial product for free, while PostgreSQL's BSD-style licence permits that.
While businesses are by definition collaborative (in the sense that every transaction has a buyer and a seller), the way businesses have been run has been very individualistic. Every enterprise, small or big, has its own ways and means of managing its people, products, accounts, customers, partners and suppliers. Like fingerprints and snowflakes, no two businesses are alike.
What this means is that a lot of time is spent in defining and implementing business processes. While the big businesses may be justified in using their unique business mechanisms as a source of competitive advantage, for many small and medium enterprises (SMEs), figuring out the right business processes (concomitant with the right information flows and activity sequences) can be a source of pain: a kind of trial and error until one gets it reasonably right.
This is what is about to change. For the past few years, many industry consortia have been working together to define business process standards. One of the factors which has made this easier has been the increasing adoption of XML for data and document interchange between enterprises.
Among the various business process standards initiatives, the one which is perhaps the most significant is RosettaNet. By its own definition, RosettaNet is a non-profit consortium of more than 400 of the world’s leading Information Technology (IT), Electronic Components (EC), Semiconductor Manufacturing (SM) and Solution Provider (SP) companies working to create, implement and promote open e-business process standards. By establishing a common language — or standard processes for the electronic sharing of business information — RosettaNet opens the lines of communication and a world of opportunities for everyone involved in the supplying and buying of today’s technologies. Businesses that offer the tools and services to help implement RosettaNet processes gain exposure and business relationships. Companies that adopt RosettaNet standards engage in dynamic, flexible trading-partner relationships, reduce costs and raise productivity. End users enjoy speed and uniformity in purchasing practices.
While RosettaNet may seem to have a focus on just a few industries, the standards it is developing can work across industries, especially for SMEs. RosettaNet defines PIPs (Partner Interface Processes), which are specialized system-to-system XML-based dialogs that define business processes between trading partners. Each PIP specification includes a business document with the vocabulary, and a business process with the choreography of the message dialog. PIPs apply to the following core processes: Administration; Partner, Product and Service Review; Product Introduction; Order Management; Inventory Management; Marketing Information Management; Service and Support; and Manufacturing. Of special interest is the set of standards covered by RosettaNet Basics, a set of core PIPs that help in simplifying implementation for the smaller enterprises.
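The flavour of a PIP's system-to-system XML dialog can be sketched as a request/confirmation exchange. This is a simplified stand-in built for illustration, not the actual RosettaNet PIP schema; the element names and the two-step choreography are assumptions:

```python
import xml.etree.ElementTree as ET

def purchase_order_request(product, qty):
    # The buyer's system emits a business document in the agreed vocabulary
    root = ET.Element("PurchaseOrderRequest")
    ET.SubElement(root, "ProductIdentifier").text = product
    ET.SubElement(root, "Quantity").text = str(qty)
    return ET.tostring(root, encoding="unicode")

def seller_respond(request_xml):
    # The seller's system replies with the next message in the choreography
    req = ET.fromstring(request_xml)
    root = ET.Element("PurchaseOrderConfirmation")
    ET.SubElement(root, "ProductIdentifier").text = req.findtext(
        "ProductIdentifier")
    ET.SubElement(root, "Status").text = "Accepted"
    return ET.tostring(root, encoding="unicode")

reply = seller_respond(purchase_order_request("SKU-1001", 25))
status = ET.fromstring(reply).findtext("Status")
```

A real PIP adds the pieces this sketch omits: a validated vocabulary for each document and a specified choreography (timeouts, acknowledgements, retries) for the dialog; but because both sides agree on these in advance, no partner-specific integration work is needed.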
A recent development is RosettaNet's merger with the Uniform Code Council Inc., best known for introducing the bar-coding system to the retail world. Together, they hope to hasten the adoption of B2B integration standards by organisations worldwide.
Taken together with Web Services which enable the creation of software components, the activity in the standardisation of business processes will streamline intra- and inter-enterprise interactions dramatically in the coming years, laying the foundation for real-time enterprises. The opportunity is greatest for the SMEs who can now use existing libraries of processes to connect into the supply chains of their larger customers.
Next Week: Tech's 10X Tsunamis (continued)