Searching for Customer No. 1

It’s the challenge every entrepreneur faces – a product is ready for the marketplace, awaiting the first customer. Until the first customer signs up, the product is merely a research project, an interesting idea. The first (paying) customer lends legitimacy to the entire effort. It also gives the entrepreneur the much-needed reassurance that if there is one customer out there willing to pay, there will be others. Finding the all-elusive first customer is always a big challenge.

We are in exactly the same situation with our Thin Client-Thick Server project. We have a few trials ongoing, but no paying customer yet. Maybe I was overly optimistic in thinking that by September-end we’d have our Customer No. 1. It still looks to be some distance away. I’ve been through this many times before. It is never an easy feeling. One has to battle all kinds of self-doubts. This is the time when one has to keep faith in the vision – because everyone else in the organisation is watching.

I know we have a winner on our hands with TC-TS (we have christened it Emergic Freedom: Thin Client Desktop-Thick Server OS). But we need to position it right and take it to the right customers. We are struggling a little here, but I am hoping to set that right soon. Initial feedback from those who have seen it has been positive, but we haven’t yet made the conversion. I am getting the brochures ready, so that we ourselves are clear about the product.

This is the fun part of being an entrepreneur. One has chosen this road, with all its ups and downs. Each day brings greater hope, greater optimism. There is a confidence that we are headed in the right direction. It is a long road ahead. I am prepared. Disruptive innovations and revolutions aren’t for the faint-hearted. And for me, Emergic is even bigger – it is about taking computing to the other 90%. One does not get such opportunities every day.

Proxim takes WiFi to 12 miles

From News.com: “The Proxim product can achieve long distances because the company boosted the power inside its access points–the radios that create the network. It also added additional antennas to the access points so signals could be beamed directly to a home, rather than creating a cloud of access. Proxim’s product, priced from about $2,000 to $6,000, will include all the equipment necessary to become a small-scale network provider. The price differs depending on the quality of equipment and add-ons that a buyer may want. Each kit can serve about 250 customers.”

What began as wireless LAN technology is on its way to bridging the last mile (or miles). The 3G cellcos had better watch out.


PC’s Next Innovations

John Robb writes about what is needed to spur PC sales (ref: this NYT article and my comments):

In my view, the personal computer is all about personal leverage (as is most technology: my car, microwave oven, telephone, etc.). If you want to keep the cycle alive, increase the leverage. What is the biggest opportunity available for increasing personal leverage? Suck the Web down to the PC. Reinvent it on the PC.

I want the entire value of the best of the Web on my PC (Watson). I want to be able to publish a complex website (Radio). I want to add rich content (video and audio files) via my personal website and distribute it in a way that doesn’t break my piggy bank (P2P multi-cast). I want to recombine data available via the Web in new ways that make it more meaningful to me. Reinvent the Web and it will drive PC sales.

An interesting comment on what John has to say comes from Seth Russell: “Indeed! That way one browses and sees the web according to and within one’s own context. This doesn’t mean that each PC needs the power of a Google on the desktop, just that each PC contains the ability to remember the context of the PC’s owner. See also CoherentExperience mentograph. The node labeled ‘Your Memory’ is what is lacking in a PC at the moment and is the reason that humans’ experience of the semantic web is not quite working yet.”

For emerging markets, this implies bringing the web down to the LAN server (because the desktop may actually be a Thin Client). Bandwidth is a huge problem, but one can be creative by using RSS streams and replication. Yes, at some point bandwidth will improve, but the LAN-WAN disconnect in emerging markets is massive. In India, for example, most WAN connections to the Internet are 64-128 Kbps – for entire organisations. This is also why locally generated content becomes important, and that is where k-logs can come in.

The next PC innovation that can make a big difference is something along the lines of Google News or Samachar: take RSS feeds, add a Scratchpad as a universal writing tool and an Events Horizon as a unified reading tool, and lace them with application links. One screen to show them all.
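As a rough illustration of the idea (not our actual Digital Dashboard code), here is a minimal Python sketch. It assumes the third-party feedparser package and a couple of hypothetical feed URLs: the thick server fetches a few feeds on a schedule so thin clients never touch the WAN, and all entries are merged into a single reverse-chronological “Events Horizon” view.

```python
# Minimal sketch of an "Events Horizon"-style unified reading view.
# Assumes the third-party "feedparser" package; feed URLs are hypothetical.
import time
import feedparser

FEEDS = [
    "http://example.com/news.rss",
    "http://example.org/weblog.rss",
]

def fetch_all(feed_urls):
    """Fetch each feed once. On a Thick Server this would run on a schedule,
    so Thin Clients read from the local cache instead of the WAN."""
    entries = []
    for url in feed_urls:
        feed = feedparser.parse(url)
        for e in feed.entries:
            entries.append({
                "source": feed.feed.get("title", url),
                "title": e.get("title", "(untitled)"),
                "link": e.get("link", ""),
                "published": e.get("published_parsed"),  # a time.struct_time, if present
            })
    return entries

def events_horizon(entries):
    """Merge every entry into one list, newest first."""
    def sort_key(e):
        return time.mktime(e["published"]) if e["published"] else 0.0
    return sorted(entries, key=sort_key, reverse=True)

if __name__ == "__main__":
    for item in events_horizon(fetch_all(FEEDS))[:20]:
        print(f'{item["source"]}: {item["title"]} -> {item["link"]}')
```

The same cached entries could feed the Scratchpad and the application links; the point is that one fetch on the server serves every client on the LAN.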

Predictions from Tech’s Thinkers

From the WSJ comes an analysis of some of the past predictions (along with some new ones) made by technology’s big thinkers. Here’s what one can expect in the future:

Nicholas Negroponte: Grass-roots wireless services could in some ways overtake traditional telecom operators.

Lester Thurow: Biotechnology advances will radically transform our world and our bodies.

Glover Ferguson: “Insight” is the next key to competitive advantage. [Mr. Ferguson says services will become commodities, too. Insight is essentially a breakthrough analysis of a situation that you keep secret and use to beat rivals. So, a bank might see a baby-crib purchase on a customer’s credit-card bill and know from previous analysis that the majority of people buying cribs are having their first babies. There’s a predictable progression of needs and purchases, including a bigger home, life insurance and so on. The bank can exploit that knowledge to cater to the customer. But it needs to keep the insight to itself.]

Alan Nugent: Web services will come of age.

Peter Cochrane: Customers will take over local telecommunications.

Michael Earl: The Internet will ultimately be more about information than transactions.

Laos project for Internet access

Look, ma, no electricity needed! This is exactly what is happening in Laos. Writes the Economist: “The Jhai Foundation devised a machine that has no moving, and few delicate, parts. Instead of a hard disk, the Jhai PC relies on flash-memory chips to store its data. Its screen is a liquid-crystal display, rather than an energy-guzzling glass cathode-ray tube – an exception to the rule that the components used are old-fashioned, and therefore cheap. (No Pentiums, for example, just a 486-type processor.) Lee Thorn, the head of the Jhai Foundation, an American-Lao organisation, estimates that, built in quantity, each Jhai PC would cost around $400. Furthermore, because of its simplicity, a Jhai PC can be powered by a car battery charged with bicycle cranks – thus removing the need for a connection to the grid. Wireless Internet cards connect each Jhai PC to a solar-powered hilltop relay station, which then passes the signals on to a computer in town that is connected to both the Lao phone system (for local calls) and to the Internet.”

Next Markets for PCs

Intel’s and Microsoft’s problems are encapsulated in the following statements in an NYT article: “Computers have reached a point where for the most common home purposes – Web surfing, e-mail and word processing – they are already more than fast enough to suit a typical home user’s needs…No new computer generally means no new copy of Microsoft Windows sold, no upgrades to word processing or spreadsheet programs.” Adds the article:

Mr. Paul Otellini [Intel’s president and COO] acknowledged that most of the incremental growth in the personal computer market since 2000 is already coming from what he calls “emerging markets” – developing countries where there are now few computers.

“We believe that 50 percent of all the incremental units sold in the next five years will come from these markets,” he said. There are now about 500 million personal computers in the world, he said, and with the help of the emerging markets the industry, over a long period, could still expect to see double-digit growth outside the industrial world.

The next set of users in the world’s emerging markets need computing at much lower price points, and so far Intel and Microsoft are doing nothing to cater to that. The cost of computing needs to fall by 50-70% to spur the next wave of buying. This is unlikely to be led by Intel and Microsoft.

Microsoft’s Lack of New Products

Writes Robert Scoble (via Rahul Dave):

Tell me again why companies with one, two, or maybe a handful of employees can come out with products like Blogger, Radio UserLand, Movable Type, TopStyle but the 45,000 employees of Microsoft can’t figure out how to upgrade Office or Windows or many of its other products with many new features that anyone is willing to pay money for?

I think Microsoft has a leadership problem. What’s the problem? They’ve forgotten to ship new things once in a while. Tell me again, what’s the thing that Microsoft has shipped in the past year that’s really new and has radical new features?

Windows XP? It’s more than a year old now and it really didn’t have a radical new feature set over Windows 2000. Xbox? Oh yeah. Anything else? PocketPC? Come on.

The problem isn’t with the evangelists. We’re out here. The problem is we don’t have anything new to talk about. So, we’re going elsewhere. RedHat is shipping a new version. Mozilla is shipping a new browser. Macromedia is shipping a FrontPage-killer. ActiveWords has a better way to interface with your computer. Radio and MoveableType and all the other blog tools are giving us a better way to build a Web site.

Interesting points. Maybe the big companies like Microsoft think that unless they create something which generates hundreds of millions of dollars in revenue (or unless they spend millions), it is just not worth it. Perhaps there is also the fear of failure: what will people think of us if we do something this small? On the other hand, the tiny companies and entrepreneurs have little to lose. In fact, they have to think differently; they have no legacy of the past.

TECH TALK: The Years That Were: 1996 (Part 2)

Mark Halper wrote about A World of Servers Great and Small in Forbes ASAP (June 3, 1996):

In [Oracle’s Larry] Ellison and [Sun’s Scott] McNealy’s view – call it the “nothing but net” vision – servers will hold all the applications and data that any $500 user would need. In the same fabric, servers will serve the 150 million-odd PCs entrenched in the business world and now getting hooked into corporate Intranet networks… Already a corporate fixture dispensing and managing databases, applications, messages, print commands and other daily occurrences, the server will take on even larger dimensions in a world wired to dumb boxes. The more powerful the server, the more powerful the network, and the richer the network, the richer the network computer, says Ellison.

Forbes ASAP’s year-end 296-page issue (December 2, 1996) had articles by many luminaries on the techno-future. Here are some excerpts:

Bill Gates: To make intelligent bets, you have to understand what will be going on in the next ten years. Most people overestimate what is going to happen in the next two or three years and underestimate what is going to happen in the next decade… In ten years, it gets wild. One is the predictable result Moore’s Law has on computer capability. PC power in absolute terms gets so large that your ability to do rich things – like keeping your entire personal photo collection on your PC – will be a piece of cake… There are breakthrough things that are certain to come between now and 2006. Extremely cheap flat-panel displays… On the surface of my desk and a lot of my walls, I’ll have displays with project status, sales data – all there. Input will be done with pointing devices or by talking to the computer. The computer will be talking to us, and it will see. It will see when we walk into a room.

Nicholas Negroponte: So, what is the next Big Thing? What atoms will be turned into bits and really change the world? For me, the answer is simple: cash. What we know today as coins and paper currency will become bits. I don’t mean credit or debit cards or accounting systems of that kind. I mean stored value, bits on your hard disk or in your electronic wallet.

Ann Winblad: Object-oriented programming – software that can be re-used and interchanged among programs – has finally hit its stride. For years, software developers have talked of object-oriented software: smart components that can be assembled into a manageable environment. Programmers dared to imagine marts of such components, where they could buy and sell each other’s components instead of recoding already invented parts – the equivalent of a bill of materials for the software factory… Small developers have become specialty parts suppliers to the growing population of software assembly-line workers… The true realization of a working software assembly process, readily available tools, and a rich supplier base has just begun to materialize as we approach the turn of the century.

Scott McNealy: Go webtop. Publish all the information on your internal web. Don’t send it to employees on paper or even email, because by the time they can print it out and get through it, it’s already old. Give the user safe and instant access to the network from a personal web page, from any machine, using any operating system, at any time, with dial-tone reliability. That’s a utility model of computing… The digital industries are converging on this utility model. Data tone will become as commonplace as dial tone… This is where I think the puck is heading. It’s a new wave of network computing, based on the fat-server, big-pipeline, thin-client model. It’s applications written once to run anywhere, safely. It’s web-centric, web-toned and irresistibly open.

Tomorrow: 1997


Sun, Linux and Emergic

From InfoWorld comes a fascinating and revealing interview with Jonathan Schwartz, who heads Sun’s software business. Much of the focus of the interview is on Sun’s Linux strategy and its recently announced Linux desktops. Steve Gillmor then dissects Schwartz’s comments. (Thanks to Rahul Dave for the pointer.)

Writes Gillmor, summarising the opportunity and the solution: “The dynamics: Microsoft’s Software Assurance program, the phaseout of Windows NT4, and the post-Sept. 11 economic and security landscape. The target market: call centers, cost- and security-sensitive environments, government agencies, and Third World nations. The deal: a free desktop software stack for the first 100 users. The platform: Linux.”

I am beginning to understand Sun’s strategy [1 2] better. There are two elements to it. One, take away the money going to Microsoft for desktop software and redistribute it between itself (on the server side) and the customer. Two, make money selling servers and storage.

Here is Sun’s (new) view of the world: Linux with open-source software on a secure desktop, using JavaCard for authentication and the browser as the desktop (Schwartz makes the point that the PC is the only unauthenticated network access point), with Java on the server side and applications glued together using web services. In short, there is no need to ever write to Windows (since there is J2ME for mobile/device applications). The server runs the portal, messaging, directory and identity-management software that make up Sun’s ONE platform.
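The “glued together using web services” idea is easy to picture with a toy example. The sketch below is not Sun ONE’s actual interfaces; it just shows the pattern, using Python’s standard-library XML-RPC modules: a piece of server-side logic exposed over HTTP that any client (a browser-based desktop, a thin-client script, a J2ME device) could call.

```python
# Toy sketch of the "glued together using web services" pattern.
# Not Sun ONE's interfaces -- just server-side logic exposed over HTTP.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def lookup_user(user_id):
    """Stand-in for a server-side directory/identity service."""
    directory = {1: "alice", 2: "bob"}   # hypothetical data
    return directory.get(user_id, "unknown")

server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(lookup_user, "lookup_user")
threading.Thread(target=server.serve_forever, daemon=True).start()

# Any machine on the network could make this same call; here we call locally.
proxy = ServerProxy("http://localhost:8000/")
print(proxy.lookup_user(1))   # -> "alice"
```

The client never cares what operating system or language sits behind the URL, which is exactly the decoupling both Sun’s model and ours depend on.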

Says Schwartz: “Linux is an operating system, it’s not a developer platform. Linux is a tactic. Java is the strategy. The developer platform that we’re encouraging is for line of business applications, content-based applications, distributed applications. Java is the architecture. It runs on the highest end carrier-grade servers and it runs on the military-grade, most secure smart card microprocessor platform on the planet…We will integrate Java Card into the J2SE platform. The one bug in the system right now is that for the most part, the Java platform and the Web content worlds have diverged. It’s incumbent upon us in a Web services way to cause them to converge.”

A key point Schwartz makes is that developers are not loyal to a single platform – they are loyal to volumes. That is what Sun intends to give them through its Linux boxes.

Another interesting comment from Schwartz is on the three issues CIOs want answers to: “Save me money, increase my level of security, and please help me consolidate away all of this ridiculous complexity.”

Emergic Comparison

As it turns out, our ideas in Emergic are not very different. The Thin Client runs Linux, KDE (instead of Gnome), Evolution, OpenOffice (instead of StarOffice), Mozilla (or perhaps a lighter browser based on Mozilla) and GAIM. The Thick Server does all the processing and storage. Down the line, we want to add business applications built around J2EE on Apache (web server), JBoss (application server) and PostgreSQL (database). There is one additional component which we have: the Digital Dashboard, to create a unified events-processing centre, which can be especially useful for first-time users.

What Sun has in its model is the JavaCard, a smart card for security on the desktop. I like the idea – perhaps we could accomplish the same via the floppy which is needed for the client to boot up. (In today’s world, floppies are a bad way to do anything, but they already exist in the old PCs, while card readers would cost additional money.)

Where we differ is in the business model: our aim is to make money off the software and leave the hardware to the channel partners. Sun wants to sell the desktop and server hardware along with the software, as a solution – the way they have always done. Sun’s target audience is also quite different: they want to go primarily after the cost- and security-conscious entities in the world’s developed markets, while our focus is largely the world’s developing countries. Sun’s solution will be 70% cheaper than Microsoft’s (as per their claims). Our solution will be 70% cheaper than Sun’s.


Intel’s Itanium – NYT

Writes the NYTimes on Intel’s Itanium (its 64-bit processor) in Intel’s Huge Bet Turns Iffy:

Increasingly, Intel is facing the risk that it has chosen the wrong path to high-performance computing. It may have looked backward as it developed the microchip equivalent of the behemoth computers of the past.

Eric Schmidt, the computer scientist who is chief executive of Google, told a gathering of chip designers at Stanford last month that the computer world might now be headed in a new direction. In his vision of the future, small and inexpensive processors will act as Lego-style building blocks for a new class of vast data centers, which will increasingly displace the old-style mainframe and server computing of the 1980’s and 90’s.

It turns out, Dr. Schmidt told the audience, that what matters most to the computer designers at Google is not speed but power – low power, because data centers can consume as much electricity as a city.

Thick-and-Cheap Linux Desktop

Writes Gary Krakow (MSNBC): “Wal-Mart.com is now selling a computer made by Microtel which retails for $199.86. As you might expect, the computer is on the bare-bones side. Microtel also uses a new, free version of the Linux OS instead of Microsoft Windows to save money. That’s where the story gets interesting. The operating system is called Lindows. Add to the Via 800 MHz C3 chip: 128 MB of RAM (expandable to 1 GB), 10 GB hard drive, 52x CD-ROM, 10/100 Ethernet connection (a modem is $30 additional), keyboard, two-button wheel mouse, and a small pair of powered speakers and you get the new $199 Microtel SYSMAR 710.”

Where it gets interesting is the addition of the AOL programs: “Think of Lindows 2.0 as AOL’s new Netscape OS. Actually, if you dig hard enough on the Lindows Web site, you’ll be able to find a preview version of an AOL 7.0 client for Lindows. No Windows needed, no Macintosh needed – just AOL. That’s something AOL has been trying to do for years. This time they might succeed.” The USD 200 computer can be thought of as an AOL PC, according to Gary.

The other interesting element here is Lindows’ ability to run (some) Windows applications: “Lindows was originally touted as being able to run Microsoft Windows programs. Guess what? IT CAN. I was able to take my old, now unused Office 2000 disk, insert it into the Lindows computer and watch in amazement as it installed easily. They’re now working on getting Office XP to install. The Lindows OS always included reader software for Word, Excel and PowerPoint, but now the full programs can be installed.” This is because of WINE, a Windows compatibility layer (which is open source and contains no Windows code).

Slashdot thread

I have been thinking about how these USD 200 PCs can make a difference to our Thin Client-Thick Server project. Some thoughts:

– The computer cost of USD 200 (excluding monitor) would translate to about Rs 16,000 in India (USD 320) once duties and local costs are added. This is still at least twice as expensive as what an old PC would cost. Getting the PC cost to less than Rs 6-7,000 in India is critical for mass-market adoption (even by corporates).

– By doing all the processing locally on the desktop, one now needs to worry about administration. The TC-TS architecture simplifies this by centralising storage and processing.

– The stand-alone Linux desktop has its market. The TC-TS solution needs about 7-10 TCs to justify the increased cost of the server (a rough cost sketch follows this list). It also needs a 100 Mbps LAN between the clients and the server. This rules it out of various markets (especially the home segment, kiosks, small branches, etc.), and that is where the stand-alone Linux PC can be used.

– In other words, the bigger picture for us needs to address two markets: the “thin client” (which needs a thick server) and the “thick-and-cheap” client (which can be stand-alone). Together, they provide the artillery for an assault on the computer market which can dramatically bring down costs and increase penetration in emerging markets.
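To make the comparison concrete, here is a rough back-of-the-envelope sketch. Every figure in it is an assumption drawn from, or implied by, the numbers above (an exchange rate of roughly Rs 48 to the dollar, duties and margins that turn the $200 box into about Rs 16,000 landed, and illustrative prices for an old PC and a thick server); none of them are quoted prices.

```python
# Back-of-the-envelope cost comparison; all figures are assumptions for illustration.
RS_PER_USD = 48.0         # approximate 2002 exchange rate (assumption)
LANDED_MULTIPLIER = 1.65  # duties/margins that turn USD 200 into roughly Rs 16,000

def landed_cost_rs(usd_price):
    """Rough cost of a dollar-priced box once it lands in India."""
    return usd_price * RS_PER_USD * LANDED_MULTIPLIER

def tcts_cost_per_seat(num_clients, old_pc_rs=6000, server_rs=50000):
    """Thin Client-Thick Server: a recycled PC per seat plus one shared
    server amortised across the clients (both prices are assumptions)."""
    return old_pc_rs + server_rs / num_clients

print("Lindows box, landed:        Rs", round(landed_cost_rs(200)))
print("TC-TS per seat, 4 clients:  Rs", round(tcts_cost_per_seat(4)))
print("TC-TS per seat, 10 clients: Rs", round(tcts_cost_per_seat(10)))  # server overhead per seat shrinks
```

The point is the shape of the curve: below about 7-10 clients the shared server dominates the per-seat cost, which is why the stand-alone thick-and-cheap client is the better fit for homes, kiosks and small branches.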

For now, we will focus on the first of these markets (TC-TS), and build on software like the Digital Dashboard which can add value when we target the second set of users. We are reasonably agnostic to the hardware – our focus is on the software and the value-added services that can be provided on top of that.

There are some very interesting opportunities for the cheap, stand-alone Linux desktop, and I’ll talk about this in a more elaborate post later.

Microsoft’s Vision for Future Office

From a Seattle Post-Intelligencer report comes a view of how the (physical) office of tomorrow may look:

Located on Microsoft’s Redmond campus, the center sports a main room with several desks, each with a different configuration of monitors. It’s dark and hushed — except for the Dolby Digital 5.1 surround sound booming out from concealed speakers and whooshing, “Star Wars” noises emitted whenever new information comes onscreen or is e-mailed away.

“Surround sound is going to be increasingly important in future offices,” says group marketing manager Tom Gruver in leading a tour of the new facility.

At one desk, users can move a wireless mouse’s pointer from the screen of one computer to the screen of a laptop, with no wire or wireless connection between the computers themselves. That allows copying or moving material between the computers, a task that would otherwise be more difficult.

In an adjoining, ultramodern meeting room, visitors — expected to number about 1,000 corporate executives per month — can role-play workers in a hard-charging widget company, striving to nail down production of a new model before the company’s chief executive goes on television for an interview.

In the article, there is a nice picture of a largish concave screen called Broadbench, which is presumably the type of display we can all expect.

As the article notes at the beginning, most of these technologies are 5+ years away.

Smarter Devices

A WSJ story on DEMOmobile says: “As wireless data networks spread the Internet, previously dumb machines are being connected and endowed with intelligence while portable communications devices are getting smarter.”

The products mentioned include the “Vocera Communications System, which bundles the functionality of a walkie-talkie, phone and pager into a 1.6-ounce badge that users wear around their necks and operate hands-free with voice commands.”

Linux Interest Rising – Economist

From the Economist (thanks to John Robb):

One reason why alternatives to Microsoft’s products are catching on at the moment is the company’s new pricing policy. Instead of paying once for new products or upgrades, Microsoft is now charging many corporate customers and government offices a form of subscription. Many complain that this has increased the cost of their software. So far, computers with Linux-based operating systems are most popular with retailers and companies that run call centres dealing directly with customers. But businesses offering financial services have also begun switching to Linux.

Yet companies offering Linux still have a long way to go before they begin to reel in Microsoft. Even with its current, more-humble posture – the company expects soon to wrap up its long-running antitrust dispute with the US government – Microsoft remains a formidable foe. International Data Corporation, a market-research firm, reckons that Linux has just 2.7% of the market for desktop operating systems. So even if its sales grow like topsy, it could still be years before Linux makes an impression on Microsoft’s 94% share.

The interesting things to look at would be (a) new users and (b) emerging markets. These are the markets that Linux should be focusing on, and this is where it can make headway (especially with new users in emerging markets). There is increasing momentum for Linux on the desktop with initiatives from Sun, Red Hat and Lindows. They may not make much of a dent in the existing Windows user base, but the next set of users may not see the same levels of penetration for Microsoft as in the past.


TECH TALK: The Years That Were: 1996

Forbes ASAP’s February 26, 1996 issue celebrated the 25th anniversary of the microchip. Discussing the road ahead, it asked a number of experts: What products will be changed radically by embedded processors in the next five years? The answers, with the experts’ then designations:

Ron Bernal, president, MIPS Technologies: Consumer products; more applications to help people with physical impairments.

Gordon Campbell, chairman and president, 3Dfx: Consumer products, particularly convergence of TV and PC.

Henry Fung, VP engineering, Vadem: Something to do with neural nets or AI.

Tommy George, GM, semiconductor products, Motorola: Credit cards will migrate rapidly from plastic to smart cards.

Mike Hackworth, CEO, Cirrus Logic: Phone, entertainment, security systems, Web appliances and pocket products such as organizers.

Bill Joy, software legend, Sun Microsystems: Anything with batteries.

Dan Klesken, oft-quoted analyst, Robertson, Stephens & Co.: Cellular phones, electronic games, portable electronics and car electronics.

Dan Lynch, cofounder and chairman, CyberCash: Household energy devices: heating, cooling, lighting. Plus, GPS applications everywhere.

EE Times Editors (Ron Wilson and Richard Wallace): All products involving motion and sensors. Expect a new generation of voice-operated stuff.

Wes Patterson, CEO, Chromatic Research: High-speed media processors will make consumer audio and video products hugely better.

Mark Stevens, venture capitalist, Sequoia Capital: Autos, household consumer goods, portable communications.

In the same issue, Federico Faggin, the builder of the first microprocessor and then CEO of Synaptics, said:

We tend to predict the future of microprocessors as heading in a single direction: Pentium, Pentium Pro, P7, P8 and so on into the future. But I think there is a second path that’s now emerging as well.

Call it point-of-need hardware, PON. It arises from the fact that computers are coming down in price and being used by people who don’t know anything about computers. We need computers that are more like us, that can talk to us, adapt to our needs, learn from us… We want computers to be intuitive and have common sense; most of all, we want them to be able to interact with us naturally, through sight, sound and touch.

Today’s microprocessor architectures just can’t do that. But there are other ways. We have a whole body of programmable chips – plastic hardware, you might call them – like field-programmable gate arrays and field-programmable interconnect components that could be linked together into powerful systems that could be rewired and reprogrammed as you need it. This brings us into a world of appropriate hardware.

How far off is this? Twenty years, probably. Fifty years for biological computers. In the meantime, a convergence of computers and communications, wireless, Internet. That’s what will impact you tomorrow.

Wrote George Gilder in Forbes ASAP: Over the next five years, @Home will increase the bandwidth to home- and small-business computers by a factor of thousands. While Moore’s Law doubles computer power every 18 months, the law of the telecosm, by the most conservative possible measure, doubles total bandwidth every 12 months. This adds up. Over the next decade, computers will improve a hundred-fold while bandwidth will improve a thousand-fold… Combined with a broadband network, the $500 teleputer (Internet PC) will be more flexible and powerful than existing PCs. Rolling out both the network and the teleputer will be the central activity in the industry over the next two years.
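Gilder’s hundred-fold and thousand-fold figures follow directly from the two doubling periods he cites; compounding them over a decade is a two-line check:

```python
# Compounding the two doubling periods over a decade (120 months).
months = 120
compute_gain = 2 ** (months / 18)     # Moore's Law: doubling every 18 months
bandwidth_gain = 2 ** (months / 12)   # law of the telecosm: doubling every 12 months
print(round(compute_gain), round(bandwidth_gain))   # -> 102 1024
```

Roughly a hundred-fold for computing against a thousand-fold for bandwidth, which is the gap Gilder was pointing to.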

Next Week: The Years That Were (continued)


Engineering’s Future

Writes Robert Lucky in IEEE Spectrum:

Engineering today feels like that window seat on the airplane. Those can’t be real transistors and wires down there, can they? Watching the simulations on my computer monitor is like watching the movie on the airplane – an unreality wrapped in another unreality. I feel that I have lost touch with Edison’s world of electricity – a world of black Bakelite meters, whirring motors, acrid chemical smells, and heated conductors. I miss Heathkits and the smell of molten solder and burning insulation – the sensual aspects of engineering that have been replaced for many of us by the antiseptic, ubiquitous, and impersonal CRTs.

I have a deeper worry that math itself is slipping away into the wispy clouds of software that surround us. I walk down the aisles of laboratories, and I see engineers staring vacantly into monitors, their desks piled high with anachronistic paper detritus. Is anyone doing math by hand any longer, I wonder? Do they miss the cerebral nourishment of solving equations? Perhaps math in the future will be the exclusive province of a cult of priests that embeds its capability in shrink-wrapped, encrypted software.

I can’t believe that 20 years from now engineers will still stare into displays, run CAD tools, and archive their results in PowerPoint. But what will they do? My deepest fear is that the reality gap becomes so great that the best-selling software will be called Engineer-in-a-Box.

Slashdot thread

Perhaps the best thing I learnt in my engineering education is the ability to think differently – there are multiple ways to approach the same problem, and if one does not work, one can try another. Engineering instills discipline and logical thinking laced with practicality, because at the end of the day one has to solve real-world problems.

I may not remember much of what I learnt in the classroom during my BTech and MS days, but I do know that I would not have been what I am had it not been for my learnings in the schools of Electrical Engineering at IIT and Columbia.

Browser Options

Three articles:

– O’Reilly: Let One Hundred Browsers Bloom surveys many of the alternate Mozilla browsers currently available including Chimera, Galeon, Phoenix and Aphrodite.

– O’Reilly: Roll Your Own Browser (using Mozilla) [Slashdot thread]

– News.com: Mozilla browser gets some bite writes about Phoenix 0.1, which is based on much of the Mozilla code, includes a customizable toolbar, new design, improved bookmark manager and loads in nearly half the time of Mozilla 1.1.


Fallacy behind Fibre Glut

Writes WSJ:

Of all of the myths that drove the 1990s technology boom — dot-coms made good investments, the New Economy would never experience a recession, small telecom companies could beat the mighty Bells — the most damaging may have been the fallacy that Internet traffic was doubling every three months.

The belief that Internet traffic could grow so quickly — if true, it would have meant annual growth of more than 1,000% — led more than a dozen companies to build expensive networks as they rushed to claim a piece of the next gold rush. The statistic sprouted up in reports by industry analysts, journalists and even government agencies, which repeated it as if it were the gospel truth. “Internet traffic,” the Commerce Department said in a 1998 report, “doubles every 100 days.”

Except that it didn’t. Analysts now believe that Internet traffic actually grew at closer to 100% a year, a solid growth rate by most standards but one that was not nearly fast enough to use all of the millions of miles of fiber-optic lines that were buried beneath streets and oceans in the late-1990s frenzy. Nationwide, only 2.7% of the installed fiber is actually being used, according to Telegeography Inc. Much of the remaining fiber — called “dark fiber” in industry parlance — may remain dormant forever.
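The arithmetic shows just how far apart the myth and the reality were: doubling every 100 days compounds to more than a tenfold increase in a year, against the roughly 100% annual growth analysts now believe actually occurred.

```python
# "Doubling every 100 days" versus ~100% a year (the analysts' revised estimate).
claimed_annual_growth = 2 ** (365 / 100) - 1   # compounding the 100-day doubling over a year
actual_annual_growth = 1.0                     # about 100% a year
print(f"claimed: {claimed_annual_growth:.0%} a year, actual: {actual_annual_growth:.0%}")
# -> claimed: 1155% a year, actual: 100%
```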

It is quite amazing in retrospect how much money was invested (and has been lost) because of such mistaken predictions. Reading some of the older magazines for the latest Tech Talk series, it is evident that many among us were taken in by such optimistic views of the future. The more outrageous the prediction, the more realistic it seemed!

Another story in the Journal quotes a rueful Dr. Brinkman, a distinguished researcher who spent 35 years at Bell Laboratories, as saying, “Maybe we should have been less smart.” The story adds:

Never before has the efficiency of an industry’s technology gotten so far ahead of demand, creating a glut of capacity that will take years to work off — and crippling dozens of companies in the process.

Scientists perfected once-exotic methods for cheaply sending vast amounts of voice and data, such as Internet traffic, over fiber-optic lines. These advances far exceeded the pace of telephone-industry innovation in the 100 years before it. Prior to 1995, telecom carriers could send the equivalent of 25,000 one-page e-mails per second over one fiber-optic line. Today, they can send 25 million such e-mails over the same fiber strand, a 1,000-fold increase. Yet the cost of making that upgrade rose by just a few times over the 1995 price, and in some instances actually declined.

A similar thing may be happening in the computer industry, where microprocessor speeds are moving ahead far more rapidly than we can use them. The challenge here is to expand the market (target the underserved people in the emerging markets) rather than overserve the same existing set of users, who have little incentive to upgrade.

Mapping Ideas

Writes Walter Mossberg (WSJ), reviewing a piece of software called MindManager from Mindjet:

Mind mapping, or brainstorming, software has been around for years, but hasn’t caught on except for a small cadre of devoted users, mainly from big business and academia.

To see what mind mapping is all about, I tested MindManager for a week or so and found it to be a fascinating way to organize one’s thoughts about a subject or project. It’s not for everyone, and it has some downsides. But I suspect it could be effective for many people in many walks of life.

MindManager’s so-called mind maps are special documents that look like spider webs. The central idea or project title is at the center, and a series of branches radiate outward to represent subtopics.

As you build the map, fine-tuning your ideas, you add branches, and all the branches, in turn, sprout subnetworks of lesser branches. The branches are labeled with text and graphics, and can be linked or related to one another. Mindjet calls this visual thinking.
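The “spider web” Mossberg describes is, structurally, just a tree: a central idea with branches that sprout sub-branches. Here is a tiny sketch of how such a map could be represented and rendered as an outline; the class and the example labels are hypothetical and have nothing to do with MindManager’s own format.

```python
# A mind map is essentially a tree: a central idea whose branches sprout sub-branches.
from dataclasses import dataclass, field

@dataclass
class Branch:
    label: str
    children: list = field(default_factory=list)

    def add(self, label):
        child = Branch(label)
        self.children.append(child)
        return child

def render(node, depth=0):
    """Print the map as an indented outline (the 'radiating branches')."""
    print("  " * depth + node.label)
    for child in node.children:
        render(child, depth + 1)

root = Branch("Emergic Freedom launch")     # the central idea (example only)
markets = root.add("Target markets")
markets.add("Call centres")
markets.add("Schools")
root.add("Pricing").add("Per-server licence")
render(root)
```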

The software costs USD 99-269. Mossberg’s conclusion: “If you want to improve your writing or planning process, MindManager is worth a try. It just might make you look smarter.”

I find a blank sheet of paper the most effective for thinking. A mix of doodling, writing and redrawing works well. Of course, the pre-requisite is thinking and generating the ideas in the first place.


Recycling Computers for Social Change

Salon writes about how anti-globalization activists in Oakland, Calif., are recycling old machines, loading them with free software and shipping them off to Ecuador:

The software part of the project is less hit-and-miss than the hardware. The activists are using Mandrake Linux with installation scripts provided by Free Geek, which makes the whole thing rather foolproof — it’s the kind of pop-in-a-CD, point-and-click thing a 10-year-old could do. Or a 60-year-old, for that matter. The average volunteer can build about 10 computers in a day, Henshaw-Plath says; people with lots of experience and some luck can build as many as 25.

If you just look at their specifications, the systems the activists are building here seem almost worthless, Pentium 100-class machines with about a gigabyte of hard drive space and 80 megs of RAM. The sort of computer that went for thousands in 1996, but that wouldn’t fetch $50 on eBay today.

But if you wipe Windows off these systems and replace it with a Linux-based operating system, and if you just plan to use them for the Web and e-mail, they can be quite useful, says Henshaw-Plath.

In the remote villages of South America, “all they need computers for is communication,” says Henshaw-Plath. “They’ll use it mostly for e-mail — and it’s not e-mailing someone far off, it’s just someone in the next village. They only need some way to communicate between the two of them that will allow them to coordinate and articulate strategies for social change.”

The recycling of old computers from the developed world to the developing world will create some of the next big opportunities for technology.
