The Top 50 Agenda Setters list is led by Steve Jobs and Bill Gates; Linus Torvalds is at No. 5. Guess the Indian at No. 8. Hint:

His entrance straight into the top 10 reflects the success of the whole Indian offshore IT services market in the past year. Globally, offshoring is forecast to grow 40 per cent this year alone, with three-quarters of businesses considering it – with most of the work tipped to go to the Indian sub-continent. India’s boom – largely engineered by [him] – means some analysts are predicting the country could face its own IT skills crisis over the next five years.

[His] role over the next few years will be crucial. India faces stiff competition from South Africa, Eastern Europe and China as a base for low-cost, low-risk IT services, while the domestic political and union issues confronting Western companies looking to move large numbers of jobs overseas are also on the horizon.

NextGen Web Directories

Dave Winer writes about how OPML could be used as the base for creating the next generation of directories. First, he explains the rationale:

We accept that there are two main directories (Yahoo and DMOZ), but why do we accept that?

Suppose someone said there were two home pages for the Web. How good could they be?

What if they said that in order to write for one of these home pages you either had to be an employee of a relatively small company, or be part of an open source project. What if the subject you’re an expert in already has an editor in DMOZ? What if you already have a job you like and don’t want to go work for Yahoo? And how many people can Yahoo afford to employ to work on their directory? Wouldn’t it make more sense to open the process up, as the Web is open, and let the cream rise to the top?

First, we need to break the monopolies, to open up the idea of web directories to competition, so there can be as many sub-directories for a category as there are people who have an interest and have expertise.

It’s all just a bootstrap away. People have to start creating and maintaining sub-directories, and then link to each other through their directories, and away we go. This is something that is entirely in the user’s hands. The technology is already fully invented.
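Since the technology is, as Winer says, already fully invented, the bootstrap can be sketched in plain OPML. The snippet below is a minimal illustration, assuming the common convention of `outline` nodes with `type="link"` whose `url` points either at a page or at another person's OPML file; all names and URLs here are invented.

```python
import xml.etree.ElementTree as ET

def make_directory(title, entries, sub_directories):
    """Build a minimal OPML document for a personal sub-directory.

    entries: (text, url) pairs for pages in this category.
    sub_directories: (text, opml_url) pairs pointing at other people's
    OPML files, so directories can link to each other and grow.
    """
    opml = ET.Element("opml", version="1.0")
    head = ET.SubElement(opml, "head")
    ET.SubElement(head, "title").text = title
    body = ET.SubElement(opml, "body")
    for text, url in entries:
        ET.SubElement(body, "outline", text=text, type="link", url=url)
    for text, opml_url in sub_directories:
        # An outline whose url points at another OPML file lets a
        # directory browser expand someone else's sub-directory in place.
        ET.SubElement(body, "outline", text=text, type="link", url=opml_url)
    return ET.tostring(opml, encoding="unicode")

xml = make_directory(
    "Scripting Tools",
    [("Python", "https://www.python.org/")],
    [("A friend's editors directory", "http://example.com/editors.opml")],
)
print(xml)
```

Each expert maintains one small file like this; the cross-links between files are what assemble the files into a directory with no central editor.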

Smarter Client Architecture for Mobile Apps

Considering the growing importance of cellphones in our lives and the need to access enterprise data via these devices, this article by Martyn Mallick makes interesting reading. The focus is on a “mobile database component” and synchronisation.

There are three characteristics of wireless Internet applications that are not present in smart client architectures:

1) Immediate deployment of applications. Wireless Internet applications reside on a server and, as long as the user knows the URL of the application, he can access it immediately. The drawback, of course, is that network connectivity is required to access the server-based content. The introduction of mobile application deployment and management software has made deploying smart client applications much easier, so companies rarely decide to create Internet applications for this reason alone.

2) Simplified enterprise integration. If you have existing Web applications that need to be taken mobile, moving to a smart client architecture can be more challenging than using a Web model. A wireless Web application would be able to reuse the majority of the existing application, including Web servers, business logic and enterprise integration, whereas a smart client application may require components to be rewritten since a Web component is not required. For this reason, the ideal fit is often an offline Web application. This solution provides the best of both worlds in that it still takes advantage of existing Web technology while providing offline access to your Web content. In this type of solution, the Web content is still deployed on the Web infrastructure, but is then synchronized to the local device where it can be viewed even after the network is disconnected.

3) Real-time data access. There are situations where the data used by a mobile worker has to be 100 percent up to date at all times. These applications are few and far between, but when they do occur, a smart client solution will not offer the ‘data freshness’ that you require. Areas where real-time data access may be required include securities trading, information services such as sports scores or weather information, and m-commerce applications.
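The offline Web application pattern described in point 2 can be sketched as a small fetch-and-cache layer. The `fetch` callable and file-based store below are illustrative stand-ins for a real synchronization component, not any particular product's API.

```python
import os
import tempfile

class OfflineWebCache:
    """Toy sketch of the offline Web application pattern: content is
    synchronized to local storage while connected, then served from
    that local copy when the network is gone."""

    def __init__(self, cache_dir, fetch):
        # `fetch` is injected (e.g. a urllib wrapper) so the cache can
        # be exercised without a live network.
        self.cache_dir = cache_dir
        self.fetch = fetch

    def _path(self, name):
        return os.path.join(self.cache_dir, name)

    def synchronize(self, name):
        # While online: pull the current content and store it locally.
        with open(self._path(name), "w") as f:
            f.write(self.fetch(name))

    def view(self, name, online):
        # Prefer fresh content when connected; fall back to the local
        # copy when disconnected.
        if online:
            self.synchronize(name)
        with open(self._path(name)) as f:
            return f.read()

cache = OfflineWebCache(tempfile.mkdtemp(), fetch=lambda name: f"<h1>{name}</h1>")
cache.view("report.html", online=True)          # synchronizes, then displays
print(cache.view("report.html", online=False))  # served from the local copy
```

The Web infrastructure still owns the content; the device merely holds a synchronized copy it can render after the network is disconnected.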

One way to mitigate this for corporate applications is by using server-initiated synchronization. Essentially, this means that any change to specified data in your enterprise database will initiate a synchronization of the local database on all mobile clients. With poorly designed database schema and synchronization logic, this may result in an uncontrollable number of synchronizations. Take time to properly partition your data to avoid such issues.
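A minimal sketch of server-initiated synchronization with partitioned data follows; the partition and client names are invented, and a real deployment would hook into the enterprise database's triggers and a push channel to devices.

```python
from collections import defaultdict

class SyncServer:
    """Toy sketch of server-initiated synchronization.

    Clients subscribe to named data partitions; a change to a row in a
    partition triggers a sync only for the clients subscribed to it.
    This is how partitioning keeps the number of syncs controllable."""

    def __init__(self):
        self.subscribers = defaultdict(set)   # partition -> client ids
        self.pending = defaultdict(list)      # client id -> queued changes

    def subscribe(self, client_id, partition):
        self.subscribers[partition].add(client_id)

    def on_change(self, partition, change):
        # Fan the change out only to clients that hold this partition.
        for client_id in self.subscribers[partition]:
            self.pending[client_id].append(change)

    def sync(self, client_id):
        # The client receives its queued changes and applies them to
        # its local database; the queue is then drained.
        changes, self.pending[client_id] = self.pending[client_id], []
        return changes

server = SyncServer()
server.subscribe("field-rep-1", "orders:west")
server.subscribe("field-rep-2", "orders:east")
server.on_change("orders:west", {"order_id": 42, "status": "shipped"})
print(server.sync("field-rep-1"))  # the western rep gets the update
print(server.sync("field-rep-2"))  # the eastern rep gets nothing
```

Without the partition key, every change would fan out to every client, which is exactly the uncontrollable-synchronization problem the article warns about.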

VoIP + WiFi

Red Herring (yes, it’s back) writes that “combining two hot technologies makes telcos seem so 1998.”

When the University of Arkansas wanted to trim the cost of intra-campus calls, it bypassed its local carrier by combining two technologies: voice-over-IP (VoIP) and 802.11 wireless. By using its existing TCP/IP networks and spending $4 million for a Cisco Call Manager, the university circumvented its local carrier and reduced monthly service fees from $530,000 to a mere $6,000. Meanwhile, the City of Dallas eliminated circuits altogether. It standardized on an IP network and expects to cut the costs of internal calls by $21 million over the next decade.

Individually, VoIP and 802.11 are hot technologies with promising futures. Now they are gaining attention for their potential as a combined force. Convergence, or the melding of voice calls over an IP network together with wireless 802.11 technology, is becoming increasingly popular. VoIP reduces the need for local carrier origination and termination. Wireless local-area networks (WLANs) offer cheap installation costs and wireless mobility.

Text Mining

NYTimes writes:

[Text-mining is] a technique that academics have been experimenting with for years but for which tools have only recently become commercially available. The prospect of rapidly scanning through reams of documents is stirring interest among researchers and analysts faced with more material than they can handle.

To the uninitiated, it may seem that Google and other Web search engines do something similar, since they also pore through reams of documents in split-second intervals. But, as experts note, search engines are merely retrieving information, displaying lists of documents that contain certain keywords.

Text-mining programs go further, categorizing information, making links between otherwise unconnected documents and providing visual maps (some look like tree branches or spokes on a wheel) to lead users down new pathways that they might not have been aware of.

In most cases, text-mining software is built upon the foundations of data mining, which uses statistical analysis to pull information out of structured databases like product inventories and customer demographics. But text mining starts with information that doesn’t come in neat rows and columns. It works on unstructured data – e-mail messages, news articles, internal reports, transcripts of phone calls and the like.

To make sense of what it is reading, the software uses algorithms to examine the context behind words. If someone is doing research on computer modeling, for example, it not only knows to discard documents about fashion models but can also extract important phrases, terms, names and locations. It can then categorize them and draw connections among the categories.
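The "computer modeling vs. fashion models" step can be sketched as simple context-based disambiguation. This is only an illustration of the idea; the category keyword sets below are invented, and real text-mining products use far richer statistical models.

```python
# Invented context vocabularies for the two senses of "model".
CONTEXT = {
    "computing": {"simulation", "software", "data", "algorithm", "cpu"},
    "fashion": {"runway", "designer", "photo", "agency", "catwalk"},
}

def classify_mentions(documents, ambiguous_term="model"):
    """Assign each document containing the ambiguous term to the
    category whose context words co-occur with it most often."""
    results = {}
    for doc_id, text in documents.items():
        words = set(text.lower().split())
        if ambiguous_term not in words:
            continue
        scores = {cat: len(words & ctx) for cat, ctx in CONTEXT.items()}
        results[doc_id] = max(scores, key=scores.get)
    return results

docs = {
    "a": "we built a climate model using simulation software and data",
    "b": "the model walked the runway for a famous designer",
}
print(classify_mentions(docs))  # {'a': 'computing', 'b': 'fashion'}
```

Once documents are sorted into categories this way, the software can extract names and phrases within each category and draw the cross-document links the article describes.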

It would be nice to apply some of these ideas to blog posts.

India’s Telecom Boom

The Economist writes about how the Indian market for phones is growing rapidly. “Mobile telephony is finally taking off in India. Some 5m new users have signed up since March; there are now over 17m subscribers. Add to this around 3.5m subscribers to a ‘limited’ mobile service provided by fixed-line operators that works within a restricted area, usually a large city, and the total is even more impressive. What is driving this spectacular growth? Affordability. Limited mobile providers, authorised to begin operations early this year, cross-subsidise mobile from their fixed-line services. This has led to such fierce price competition that Indian mobile telephony is now the cheapest in the world.”

Yesterday, in a sign of things to come perhaps, Reliance Infocomm acquired Flag Telecom for USD 207 million. This is the largest acquisition abroad by any Indian company in the services sector. “Flag Telecom is a leading provider of international wholesale network transport and communications services to carriers, internet service providers, content providers and other broadband operators. Its unique network spans four continents, connecting key markets in the Middle East, Asia, the USA and Europe. It owns and operates an underwater cable system, the longest man-made structure ever built, stretching almost 28,000 km from the UK to Japan, with 16 operational landing points in 13 countries.”

Sun’s Eclipse

WSJ writes about the challenges facing Sun, pointing out some of the fundamental changes taking place in the computer industry:

Standard chips made by Intel Corp. have caught up to Sun’s specialized models in performance, turning low-priced computer makers such as Dell Inc. into Sun rivals. Likewise, relatively cheap software from Microsoft Corp. and free Linux software allow corporate users to perform tasks that once required Sun’s pricier programs.

Sun appears to be the latest casualty of the rising tide of tech standardization, led by Intel and Microsoft. Many companies in the history of high-tech — Digital Equipment Corp. and Apple Computer Inc., among others — believed they could resist standard designs and thus ultimately charge a premium for their products. In the end, a lot of these companies were either acquired or hang on in the industry as smaller players.

In the end, Sun faces a changed tech landscape that no amount of remaking may be able to fix.


TECH TALK: SMEs and Technology: The Need for Reference Architectures

Consider a recent marketing leaflet by Intel and its partners targeted at small enterprises. It exhorts small- and medium-sized enterprises (SMEs) to buy a PC to grow their business. What are the specifications of this business-performance PC? Here is what the ad says: Intel Pentium 4 Processor 2.4 GHz, Intel D845 GVAD2 Motherboard, 128 MB DDR RAM, 40 GB HDD, 15-inch colour monitor, ATX Cabinet, Mouse, Keyboard, 52X CD-ROM Drive, LAN Card, Windows XP with CD.

Read the ad again and consider how an SME is expected to (a) understand the flurry of acronyms used and (b) use the PC for business. On the first count, SMEs are expected to master a whole new vocabulary (GVAD2, DDR, ATX). On the second, the PC carries no applications by itself; Windows XP can do a few things, but not enough to make the machine usable for business out of the box. The result: SMEs get caught up in needless technical decisions about the innards of a computer that are not germane to its end use.

The focus instead needs to be on what SMEs can do with technology, not on confusing them with three- and four-letter acronyms they will not even find in a dictionary! This is one of the motivations for putting together a reference architecture for SMEs. It can clearly specify what SMEs need in terms of hardware, the software stack, connectivity solutions and so on. It should bring the discussion back to what SMEs want to accomplish, rather than saddling them with expensive solutions that they may not need (or know what to do with). In this way, the reference architecture can help simplify the decision-making process for SMEs.
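To make the idea concrete, a reference architecture could be expressed as a machine-checkable spec rather than a list of acronyms. The layers and requirements below are purely hypothetical, not any vendor's actual architecture.

```python
# A hypothetical SME reference architecture: each layer names the
# capabilities an SME needs, in end-use terms rather than part numbers.
REFERENCE_ARCHITECTURE = {
    "hardware": {"thin client or entry-level PC", "LAN connectivity"},
    "software": {"operating system", "office suite", "email client",
                 "accounting application"},
    "connectivity": {"internet access", "local network"},
    "services": {"remote management", "backup"},
}

def gaps(offered):
    """Compare a vendor's proposed bundle against the reference
    architecture and report what is missing, layer by layer."""
    return {
        layer: sorted(required - offered.get(layer, set()))
        for layer, required in REFERENCE_ARCHITECTURE.items()
        if required - offered.get(layer, set())
    }

bundle = {
    "hardware": {"thin client or entry-level PC", "LAN connectivity"},
    "software": {"operating system", "office suite"},
    "connectivity": {"internet access", "local network"},
    "services": {"remote management", "backup"},
}
print(gaps(bundle))  # only the software layer has missing pieces
```

An SME (or a vendor) can check any proposed bundle against the spec and see instantly what is missing, which is exactly the decision-making simplification the architecture is meant to provide.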

In addition, it can also ease the selling process to SMEs. Vendors can now focus on how they can provide a whole solution, rather than just selling the parts and having the SMEs assemble their own infrastructure. After all, SMEs want to buy the IT-equivalent of cars, and not check off a laundry list of steering wheels, gear boxes, piston rings, tyres, headlights and the like. Vendors should have the responsibility of putting together the complete solution for their SME customers. Having a reference architecture for them to work with makes it easier for them to do the integration.

The reference architecture also benefits component developers. They can decide which layer they want to focus on, and be the best in that particular segment. In software, for example, we see developers trying to build everything themselves, rather than using existing components and adding value on or around them. This will make for more comprehensive SME solutions, broadening and deepening the market. Online directories can help assemblers aggregate the right solutions from different markets.

In many ways, the computer industry needs to start becoming like the auto industry in terms of standardisation. Couple this with the commoditisation of technology (which now makes affordable solutions possible), business models that offer utility-like pay-per-use payment options, and modular, expandable platforms that can be remotely managed, and we will have the fundamental building blocks for a complete ecosystem of solution providers for SMEs. This is what is needed to open up the global market of over 75 million SMEs, and to create a wide variety of entrepreneurial opportunities.

Next Week: SME IT Reference Architectures
