2 Years of Tech Talk

Recently (specifically, on November 13), Tech Talk completed 2 years. Tech Talk started as a daily (Mon-Fri) column on Tech Samachar, where it is still published. [Archives Page] My first series (from November 13-24, 2000) was on the “Mass-Market Internet”. Fittingly, this week sees the start of a series with a similar theme – bridging the digital divide, and taking computing and communications to the next 500 million users. It is a topic I’ve talked about often, using each series to fine-tune and incrementally build upon the ideas.

When I started Tech Talk 2 years ago, I was quite depressed about my pathetic knowledge of technology. I have always been a voracious reader, but the few years before that had kept me very preoccupied with managing IndiaWorld and had narrowed my reading quite dramatically to portals and dotcoms. I had to get out of that and build a much wider perspective. So, inspired by Red Herring’s “Catch of the Day” column (which has since been discontinued), I ambitiously decided on a daily Tech column, with each column being in the range of 400-500 words. I wrote out the first series of 10 columns quite excitedly, since that was what I had been thinking about over the previous months. That was enough for 2 weeks.

At that time, I thought I’d probably run out of topics in a few months, but decided: let’s write anyway, and we’ll see what happens later. I knew I had enough topics for a few more weeks. And so Tech Talk was launched.

Two years on and 500+ columns later, the Tech Talk tradition continues. I have not yet run out of topics! I may repeat myself a bit at times, but I think each column brings with it at least some fresh thinking. I try and follow a simple principle: write what I am thinking. Some of these ideas are also what I am trying to apply in my business (Emergic), so they are not just academic, but also incorporate feedback from the marketplace.

Also, to whatever extent possible, I do try and cover a diverse set of topics – entrepreneurship, books, the New India, and so on. But it’s still a somewhat narrow sliver. I am an entrepreneur first, and a writer second!

My writing habits for Tech Talk have changed somewhat over the years. Earlier, I used to write the daily column, well, daily. That was too much pressure. Wake up in the morning with the deadline looming! That did not help thinking. So, over time, I decided to write a set of columns together. For the past year or so, I have written a week’s columns together on a Sunday. It takes me about 2.5-3 hours. This has helped ensure a certain continuity and has made me actually enjoy the writing.

I also quote (quite liberally sometimes) from others, but make it relevant to the point that I am making. I have found that others often write better than I do, and so, if they make the point, or provide a take-off for what I want to say, I quote them. This is perhaps why Tech Talk will never become a book. (Of course, the other reason is that I would not have the patience to read and edit what I have written!) Online is the best place for Tech Talk to be.

My writing has increased since I started my blog in May this year. At that time, I wondered if I should just discontinue Tech Talk and stick to the blog. But I realised that the value of Tech Talk lies in its length and deep-thinking style, which I would find hard to replicate in the blog’s microcontent format. I am glad I left Tech Talk untouched.

On a personal level, I love writing and sharing. It helps me clarify my ideas, and forces a discipline of reading and thinking. The feedback from readers over the years has also helped a lot – introducing me to new people and new ideas.

So, on to year 3, with a promise to continue the spirit of Tech Talk.

Open-Source Numbers

David Wheeler answers the question “Why Open Source Software / Free Software (OSS/FS)?”. From his introduction: “This paper provides quantitative data that, in many cases, using open source software / free software is a reasonable or even superior approach to using their proprietary competition according to various measures. This paper examines market share, reliability, performance, scalability, security, and total cost of ownership. It also has sections on non-quantitative issues, unnecessary fears, usage reports, other sites providing related information, and ends with some conclusions.”

His conclusions:

OSS/FS has significant market share in many markets, is often the most reliable software, and in many cases has the best performance. OSS/FS scales, both in problem size and project size. OSS/FS software often has far better security, perhaps due to the possibility of worldwide review. Total cost of ownership for OSS/FS is often far less than proprietary software, particularly as the number of platforms increases. These statements are not merely opinions; these effects can be shown quantitatively, using a wide variety of measures.

This doesn’t even consider other issues that are hard to measure, such as freedom from control by a single source, freedom from licensing management (with its accompanying risk of audit and litigation), and increased flexibility.
Realizing these potential OSS/FS benefits may require approaching problems in a different way. This might include using thin clients, deploying a solution by adding a feature to an OSS/FS product, and understanding the differences between the proprietary and OSS/FS models… OSS/FS products are not the best technical choice in absolutely all cases, of course; even organizations which strongly prefer OSS/FS generally have some sort of waiver process for proprietary programs. However, it’s clear that considering OSS/FS alternatives can be beneficial.

OSS/FS options should be carefully considered any time software or computer hardware is needed. Organizations should ensure that their policies encourage, and not discourage, examining OSS/FS approaches when they need software.

Throttling Viruses

An interesting story from the Economist on how viruses can be throttled early in their life to prevent an epidemic from breaking out:

Dr Matthew Williamson’s approach is based on the observation that computers infected by a virus behave differently in one key respect from uninfected computers. Once a virus has infected a machine, it will generally try to connect that machine to as many new computers as possible, as fast as possible, so as to spread itself further. A virus called Nimda, for example, gets its hosts to make new connections at a rate of up to 400 a second. Uninfected machines normally make connections at a far less frantic rate. Those connections are also more likely to be to machines that are both familiar and in big demand, such as mail servers or the hosts of favourite websites.

The idea, then, is to limit the rate at which a computer can connect to new computers, where “new” means those that are not on a recent history list. Dr Williamson’s “throttle” (so called because it is both a kind of valve and a way of strangling viruses at birth) restricts such connections to one a second. This might not sound like much to a human, but to a computer virus it is an age.
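To make the mechanism concrete, here is a minimal sketch (in Python) of how such a throttle might work. The class name, history size, and backlog threshold are illustrative assumptions, not details of Dr Williamson’s actual implementation, which operates at the network layer:

import time
from collections import deque

class ConnectionThrottle:
    # Recent-history list: connections to hosts seen lately pass straight
    # through; connections to "new" hosts are queued and released at most
    # one per second. A long queue hints that a virus is trying to spread.
    def __init__(self, history_size=5, rate_per_sec=1.0, alert_backlog=20):
        self.history = deque(maxlen=history_size)  # recently contacted hosts
        self.pending = deque()                     # delayed "new" connections
        self.interval = 1.0 / rate_per_sec
        self.alert_backlog = alert_backlog
        self.last_release = 0.0

    def request(self, host):
        # Called for each outbound connection attempt.
        if host in self.history:
            return "allow"              # familiar host: no delay
        self.pending.append(host)
        if len(self.pending) >= self.alert_backlog:
            return "alert"              # backlog built up: possible infection
        return "queued"

    def tick(self, now=None):
        # Called periodically; releases at most one queued host per second.
        now = time.monotonic() if now is None else now
        if self.pending and now - self.last_release >= self.interval:
            host = self.pending.popleft()
            self.history.append(host)   # host becomes "familiar"
            self.last_release = now
            return host                 # caller now completes this connection
        return None

For a person browsing, almost every connection is to a host already on the history list and passes straight through; a scanning virus, by contrast, floods the queue, which both slows its spread to a one-per-second trickle and, as described below, makes the infection easy to detect.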

And it seems to work. Recently, the throttle was tested on a group of 16 machines connected in an isolated network. When one of these machines was exposed to Nimda without the throttle being installed, all but one of the group were infected within 12 minutes. However, in one test when the throttle was applied, it took 13 minutes for a second machine to be infected, and half an hour for a third.

But the particular benefit of throttling is that it alerts people to an attack. When a virus infects a computer with a throttle, a huge backlog of requests develops within a few seconds. This is easy to detect, and once detected, human intervention becomes possible. In addition, though throttling has a big impact on the spread of a virus, it makes little difference to ordinary activities such as web browsing. Dr Williamson has been testing the system on his colleagues over the past three months. Some 98% of connections were made with no extra delay. The maximum delay, which was experienced in one connection in 80,000, was only five seconds.

Why did no one think of this before? The Economist has an interesting point: “According to Dr Williamson, part of the reason is that most people think of computer security in a binary (i.e., on or off) fashion. Throttling merely slows things down, making a system resilient rather than completely resistant. People also, not unnaturally, think mainly about protecting themselves from attack. Yet, like vaccinating children, much of the benefit of throttling accrues to others, i.e., those to whom the virus is not transmitted, even if those others have not taken the trouble to protect themselves. In fact, it is in some ways worse than vaccination, since at least a vaccinated individual is also protected (albeit at the small risk of an adverse reaction to the vaccine). With throttling, all the benefit accrues to others.”

The story is a fascinating example of how thinking a little differently and creatively can make a big difference.

Out-of-the-Box Java

Press Release: “EJB Solutions, Inc. announced the immediate availability of Out-of-the-Box 1.0, a fully integrated, documented, and tested distribution of over 50 Open Source projects targeted at Java developers. As companies tighten their belts in the current economy, more IT departments look to Open Source solutions as a way to cut costs. Many times, they find the tools to be useful once integrated, but that the general state of documentation leaves a lot to be desired. Installation time and difficulties frequently lead them back to proprietary commercial solutions. EJB Solutions has reduced this barrier to entry practically to zero with its Out-of-the-Box offering. With fully automated installation, configuration, deployment testing, documentation generation, and more, any developer can be up and running with tools such as the Apache Web Server, Tomcat, Struts, JBoss, Ant, MySQL, CVS, and dozens more in a matter of minutes.”

Something we should definitely look at.

IBM’s On-Demand Applications

Writes Line56:

IBM has announced the first e-business on demand applications for companies with between 100 and 1,000 employees. Partnering with independent software vendors (ISVs), the solution set covers CRM, HR and accounting applications, and pricing begins as low as $50 per month per user.

Mid-market customers will see the greatest value from on demand, according to Mike Riegel, global marketing executive for e-business hosting at IBM. “They don’t have large IT staffs and yet they still need the IT and business process applications. They told us they’d like to start with some of the back-office things, something for HR, CRM, accounting.”

Onyx is providing on demand CRM marketing and sales force automation; Intacct is providing on demand accounting applications for general ledger, AR and special applications; and Employease and HRsmart are delivering human resources and benefits solutions that cover core HR functionality and also recruitment and workforce management.

Billing models vary by provider. While HRsmart is a fee per seat per month, Employease pricing is based on how many people are in the company… The business model is that as the ISV builds customers, IBM will gain more hosting revenue.

This seems somewhat similar to what Jamcracker had tried a few years ago: integrated ASP services, aggregated from different vendors (best-of-breed).

I think that to make the ASP concept work, especially in emerging markets, the applications should run off a server on the LAN, rather than over the Internet. Connectivity and bandwidth are big challenges. Also, what customers will want to see is an integrated backend, rather than silos of information.

Wireless Sensors

WSJ writes about ABB’s use of wireless technologies, naming the company as the winner of its European Technology Innovation award:

Swiss engineering company ABB Ltd. is aiming to usher in a new era for assembly-line robots, which have typically been weighed down by the cables used to transmit information about their performance to factory computers. Instead, Robbie (a robot) is fitted with wireless sensors that can send the performance data through the air using radio waves.

These sensors are just one example of how wireless technology is moving beyond mobile phones and into hidden corners of industry, transforming the way factories and distribution chains operate. Radio tags are already being used to track the movement of beer kegs, and experts believe wireless chips will be attached to billions of machines and objects over the next decade, allowing computers to continually monitor their condition and location.

The bigger story: “As well as helping industry to better manage assets, Lars Godell, an analyst with Forrester Research in Amsterdam, says this invisible world of wireless will open new business opportunities for semiconductor suppliers and companies selling computers to handle all the data the wireless chips will generate.”

The future: a world of machine-to-machine communications. Writes WSJ:

Deloitte Consulting, pointing out that machines outnumber humans by 4 to 1, believes that machine-to-machine communication could be a vast new market for mobile-phone operators. But that might depend on operators offering machines cut-price tariffs. If not, much of the data generated by wireless sensors is likely to travel just a few yards through the air using unlicensed spectrum, rather than commercial mobile-phone networks, before being piped through fixed networks to corporate databases.

Still, there are companies building machines designed specifically for use with mobile-phone networks. Nokia Corp., the world’s largest handset maker, plans to launch an “observation camera” with a built-in mobile network connection in the second quarter of 2003. Nokia envisages that users will position the camera in a room they wish to monitor, such as a kitchen with a pet dog in it. They will then be able to use their mobile phone to send a text message to the camera requesting it take a picture. The camera will then send the photo back to the user’s handset.

Nokia says the observation camera can also be configured to send a picture when it detects motion or at regular intervals. It is also designed to record sounds or take the temperature of its surroundings. “You could, for example, check the weather at the golf course before leaving home to play,” says Janne Jormalainen, vice president of Nokia’s mobile enhancements business unit.

Amazon’s Secret

Amazon Prospers on the Web By Following Wal-Mart’s Lead (WSJ):

Retailers face two choices, Amazon founder and Chief Executive Jeffrey Bezos said in a conference call earlier this year: Work hard to raise prices or to lower them. Amazon, he says, has “decided to relentlessly follow the second model.”

Amazon’s surprising formula is exactly what helped ruin many other online businesses and once added to the gush of red ink at Amazon itself: aggressive discounting and free shipping. To pay for those goodies, Amazon is behaving like every other successful mass retailer and slashing its costs wherever it can. A smarter system of processing orders means fewer errors. It has cooked up imaginative ways to cut shipping fees by consolidating orders. And it has a lucrative new business selling new and used goods online on behalf of other merchants. Amazon collects commissions on those third-party sales without the risks and costs of owning the inventory.

Discount retailers such as Wal-Mart Stores Inc. continually lower prices by squeezing inefficiencies from their operations, sacrificing fat profit margins on products in favor of selling in high volumes. By adopting this strategy, Amazon appears finally to be doing what industry officials have long said the Internet would allow retailers to do — drive down prices aggressively for consumers.

TECH TALK: Disruptive Bridges: The Coming Computing Shift

There’s some interesting activity afoot in the world of technology. Even as the industry reels from one of its worst slumps in recent times, some very interesting shifts in strategy can be seen.

Business Week has a story on IBM’s decision to put $1 billion into services R&D, asking whether it marks the end of the hardware era and the emergence of software and services as the primary technology drivers.

Simply put, the age of hardware supremacy is on the wane. Chips — and the computers and gadgets they power — will continue to get faster. But speed itself will matter less and less, thanks to a host of confluent trends.

On the other hand, services will increasingly be where both the value and the interesting activity are. “The whole [info-tech] industry is becoming more services-based. The growth of services is outstripping the growth of hardware and software,” says Paul Horn, an IBM vice-president who heads its R&D operations.

The handwriting for hardware is on the wall. For starters, the network truly is becoming the computer, as Sun CEO Scott McNealy has long proclaimed. The Internet now reaches over 150 million Americans. Their connections are increasingly fast, with close to 20 million users surfing the Net on broadband connections and dial-up users now achieving regular download speeds approaching 56 kilobits per second. As a result, more and more computing is being offloaded to central servers.

Those servers, however, are more likely to be cheap boxes that can be easily stacked and configured in ways to handle the majority of computing tasks. So, faster and faster chips and computers have a marginally smaller return when IT managers can simply throw another box in the rack and add 10% more power and processing speed, as needed.

Another key factor in ending the hardware era is simple satiation. Hardly anybody outside a corporate IT shop fills a 100-gigabyte hard drive. And for the average consumer, the marginal difference between a 2-gigahertz Pentium and a 2.4-gigahertz processor is virtually zero. That might change somewhat as video and other more processor-intensive applications take off. But even those new uses for computers seem unlikely to push hardware capabilities to now-undreamed-of levels.

The reasons why Big Blue is putting more emphasis on software and services research will only become more compelling as computing becomes increasingly commoditized.

What’s Microsoft thinking about this?

Tomorrow: The Coming Computing Shift (continued)