# Career Calculus

Eric Sink has an excellent article on the importance of learning in shaping our careers.

In basic calculus we learned that the first derivative of a function is the “rate of change” of the value of that function with respect to another variable. In the case of your career, the other variable is time. The basic equation for a developer career looks like this:

C = G + LT

C is Cluefulness. It is defined as an overall measure of your capabilities, expertise, wisdom and knowledge in the field of software development. It is the measure of how valuable you are to an employer. It is the measure of how successful your career is. When you graph your career, C is on the vertical axis.

G is Gifting. It is defined as the amount of natural cluefulness you were given “at the factory”. For each individual, G is a constant, but it definitely varies from person to person.

L is Learning. It is defined as the rate at which you gain (or lose) cluefulness over time.

T is Time. It is on the horizontal axis of your career graph.

As you can see above, your career success is determined by three variables, only one of which you can control:

• You obviously can’t control T. Time marches forward mercilessly at the same rate for everyone.

• You also can’t control G. The truth is that some people are just naturally smarter than you are, and that’s the way it is. But G is not the sole determiner of your success. I have known some truly gifted programmers with lame careers, and I have also known some less-gifted folks who have become extremely successful.

• You can make choices which affect the value of L. In fact, you do make choices which affect the value of L, every day, whether you know it or not.

• To which I would add that a weblog can play a key role in furthering learning: when you read, think and blog, you add your comments and create a personal knowledge base. My formula: Learning = Reading + Thinking + Blogging.
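The linear model above is easy to play with. Here is a toy sketch (all numbers invented for illustration) of why L eventually dominates G:

```python
def cluefulness(G, L, T):
    """C = G + L*T: innate gifting plus learning rate times years."""
    return G + L * T

# A gifted coaster (high G, no learning) vs. a steady learner, 10 years in.
coaster = cluefulness(G=100, L=0, T=10)   # stays at 100
learner = cluefulness(G=60, L=8, T=10)    # reaches 140
```

Since dC/dT = L, the learner's rising line eventually crosses any flat one, no matter how high a G the flat line starts from.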

# Emergent Power Grids

Reuben points to Steven Strogatz writing in the NYTimes on what the US needs to do to fix its electrical grid, taking a leaf from biological systems:

What is needed is a more subtle, coordinated mode of response. When our own immune systems are performing at their best, they orchestrate their defenses through countless chemical conversations among T-cells and antibodies, enabling these defenders to calibrate their response to pathogens. In the same way, the thousands of power plants and substations in the grid need to be able to communicate with one another when any part of the system is breached, so they can collectively decide which circuit breakers should be tripped and which can safely remain intact.

The technology necessary to achieve this has existed for about a decade. It relies on computers, sensors and protective devices tied together by optical fiber so that all parts of the grid would be able to talk to one another at the speed of light, fast enough to get ahead of an onrushing blackout and confine it.

The sensors would continuously monitor the voltage, frequency and other important characteristics of the electricity coursing through the transmission lines. When a line appeared at risk of being overloaded, a computer would decide whether to switch on a protective device. At present, such decisions are made purely parochially. Power plants defend themselves first, and don’t worry about the consequences for neighboring plants on the grid. Nor do they consider any potentially helpful or harmful actions that those neighbors might be taking at the same time.

In the new approach, each plant would have nearly instantaneous information about all the other plants and power lines in its extended neighborhood. Everyone would know what everyone else was doing and thinking. As threats arose (either from random failures or malicious attacks), the sensors would fire a flurry of warning signals down the optical fibers, and the networked computers would decide which protective devices to activate to contain the threat most effectively. The grid would then be responding as an integrated entity, not as a ragtag collection of selfish units. It would look a lot like an organism defending itself.
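The contrast between parochial and coordinated tripping can be sketched in a few lines. This is a toy model, not a power-systems simulation; the loads, capacities, and neighbor lists are invented for illustration:

```python
def parochial_trips(loads, capacity):
    """Each line defends itself: trip whenever its own load exceeds
    its own capacity, ignoring the rest of the grid."""
    return {line for line, load in loads.items() if load > capacity[line]}

def coordinated_trips(loads, capacity, neighbors):
    """Coordinated response: an overloaded line first asks whether its
    neighbors have enough spare capacity to absorb the excess load,
    and trips only when they do not."""
    trips = set()
    for line, load in loads.items():
        excess = load - capacity[line]
        if excess <= 0:
            continue
        spare = sum(max(capacity[n] - loads[n], 0) for n in neighbors[line])
        if spare < excess:
            trips.add(line)
    return trips

loads = {"A": 120, "B": 50, "C": 90}
capacity = {"A": 100, "B": 100, "C": 100}
neighbors = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B"]}
```

With these numbers the parochial rule trips line A, while the coordinated rule sees that B and C jointly have 60 units of spare capacity and leaves A intact: the selfish unit shuts down, the integrated grid rides through.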

# Nature and Science

NYTimes has a fascinating story with two elements to it: how ideas can lead to innovation, and how nature is far ahead of our science.

While in San Francisco for a scientific conference last year, Dr. Joanna Aizenberg, a research scientist at Bell Labs and senior author of the Nature paper, wandered through shops looking at shells and other collectibles from the sea.

In one shop she spotted a Venus’ flower basket sponge, a creature with an intricate hollow latticework skeleton that lives thousands of feet deep in the western Pacific Ocean. What caught her eye was a ring of glassy filaments at the sponge’s base that once tied it to the ocean floor. She bought the sponge.

Back at the laboratory, Dr. Aizenberg and colleagues from Bell Labs, Tel Aviv University and OFS, a Lucent spinoff, discovered that the filaments, about the length and thickness of a hair, also carry light. While other researchers discovered light-carrying fibers in a sponge off Antarctica a few years ago, the fibers of the Venus’ flower basket sponge are exceptional because their structure is “strikingly similar” to those of commercial optical fibers that ferry pulses of light in telecommunication systems, Dr. Aizenberg said.

“Nature came up with exactly the same design millions of years ago,” she said. “I would be surprised if it’s accidental.”

The fibers gather the light from luminescent organisms in the depths and give the sponge a slight glow. “It will act as a fiber optical lamp,” Dr. Aizenberg said.

# IT Matters

HBS Working Knowledge continues the debate on Nicholas Carr’s HBR article “IT Doesn’t Matter”. [I had written a series on this recently.] HBS professors F. Warren McFarlan and Richard L. Nolan write:

The most important thing that the CEO and senior management should understand about IT is its associated economics. Driven by Moore’s Law, those evolving economics have enabled every industry’s transaction costs to decrease continually, resulting in new economics for the firm and creating the feasibility of products and services not possible in the past. The economics of financial transactions have continually dropped from dollars to cents. New entrants have joined many industries and have focused on taking strategic advantage of IT’s associated economics. Company boundaries have become permeable, organic, and global in scope through IT networks and the Internet.

As the pace of doing business increases, the CEO and senior management team must be aware of how IT can change rules and assumptions about competition. The economics of conducting business will likewise continue to improve, providing opportunities for businesses to expand the customer value proposition by providing more intangible information-based services. For example, the automobile value proposition continues to expand with technology that continuously senses road conditions and applies the appropriate wheel traction and suspension system pressures.

New technologies will continue to give companies the chance to differentiate themselves by service, product feature, and cost structure for some time to come. The first mover takes a risk and gains a temporary advantage (longer if there are follow-on possibilities). The fast follower is up against less risk but also has to recover lost ground. Charles Schwab versus Merrill Lynch and Walgreens versus CVS are examples of this playing out over the past decade. Our advice to the CEO is to look at IT use through several different lenses. One lens should be focused on improving cost savings and efficiencies. Another should be focused on the incremental improvement of organizational structure, products, and services. Still another should be focused on the creation of strategic advantage through extending competitive scope, partnerships (customers and other parties), the changing of the rules of competition, and the provision of new IT-based services to extend the customer value proposition.

Shrikant has mentioned what is happening in South Korea in the context of small businesses on more than one occasion. So it was good to read this story in Business Week; there are lessons in it for what we need to do in India.

After years of digitizing everything from stock trading to gaming to education, [South Korean] officials realized that one group — the country’s nearly 3 million small businesses — remained thoroughly low tech.

Those companies employ two-thirds of the workforce and generate 30% of gross domestic product. So Seoul is spending \$75 million over three years to give small businesses online access to the same kind of planning, management, and accounting tools that big companies use. Officials hope the program will not only make small companies more efficient but will also let them more easily hook into bigger companies’ supply chains, which are largely powered by the Internet. “On a national scale, the synergies will be enormous,” says Thomas Yoon, senior vice-president of the state-run National Computerization Agency.

To carry out this ambitious program without breaking the bank, Yoon’s agency has recruited the nation’s telecom companies. They are making room on their computer servers for small-business owners. Software and technology consulting companies then load the telcos’ computers with programs catering to the needs of small companies. The government subsidy pays for development of the programs and for training the small-business owners in using the software. The small businesses have to buy their own PCs and broadband connections to reach the programs on the telecom companies’ servers. “Unless small suppliers become part of the networked systems, you can’t expect industrywide electronic transactions or efficient supply chains,” says Paek Ki Hun, director in charge of Internet policies at the Ministry of Information & Communication. The goal: making 30% of all business transactions electronic by 2005, up from 12% now.

[Small businesses] can buy access to the computer network and basic business-management programs for an average of \$15 to \$25 per month. More robust software for bigger companies costs \$75. The computerization agency has put together customized packages of software for 22 business lines, including real estate brokers, eyeglass shops, beauty parlors, sports clubs, and restaurants. Programs for an additional 36 business types are being developed.

Around 150,000 South Korean businesses are connected, and the aim is to reach 500,000 by the end of 2004.

In the context of SMEs, the three big technology impacts they are seeing (all pretty much at the same time) are mobiles/wireless, broadband and affordable business applications. A potent combination, indeed.

# TECH TALK: The Death and Rebirth of Email: Solution Ideas (Part 2)

For countering viruses, we have desktop and server-side anti-virus software. While this has mitigated the virus bane for many, there are a couple of limitations: the software needs to be updated regularly (many computers in countries like India and China run pirated software on the desktop, which may not have the latest updates to keep out the newest viruses), and there is a time lag between a virus being detected and the cure being made available. This delay means that even a single infected computer can cause significant damage to a corporate network.

Spam is different. It is incessant. There is a continuous battle of wits as spam marketing companies try to outwit the spam filters that users are setting up on the desktop and server-side. InfoWorld uses SpamAssassin: [It] is easy to install and customize, with a basic interface for adding domains and e-mail addresses to blacklists and white lists. To do its work, SpamAssassin uses word and phrase matching, real-time blocking lists, and on-the-fly spam-reporting features. Each e-mail is assigned a score depending on the detected level of spam probability. By default, SpamAssassin flags a message as spam if the score is above five. We actually use a score of six for the [SPAM] flag to be added to a message subject line and a score over 10 for it to be automatically moved to a [separate] folder.
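SpamAssassin’s scoring scheme is easy to mimic. In this sketch the rule names and weights are invented; only the thresholds (tag the subject above 6, move to a folder above 10) come from InfoWorld’s setup:

```python
RULES = {  # hypothetical rule weights, in the spirit of SpamAssassin's tests
    "mentions_free_money": 3.5,
    "all_caps_subject": 2.1,
    "blacklisted_sender": 4.0,
    "html_only_body": 1.2,
}

def score(hits):
    """Sum the weights of every rule the message matched."""
    return sum(RULES[rule] for rule in hits)

def route(spam_score, tag_at=6.0, move_at=10.0):
    """Apply InfoWorld's thresholds: tag the subject line above 6,
    move the message to a separate spam folder above 10."""
    if spam_score > move_at:
        return "move-to-spam-folder"
    if spam_score > tag_at:
        return "tag-subject"
    return "deliver"
```

Additive scoring is what makes this approach tunable: no single rule condemns a message, and raising the thresholds (as InfoWorld did, from the default 5 to 6 and 10) trades a few missed spams for fewer false positives.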

For Outlook users, Jon Udell recommends SpamBayes:

The commercial e-mail that I want to receive (or reject) will differ from the ones you want (and don’t want) according to our interests and tastes. A filter that works on behalf of a large group, such as SpamAssassin, which checks and often rewrites my infoworld.com mail, or CloudMark’s SpamNet (formerly Vipul’s Razor), which collaboratively builds a database of spam signatures, will typically agree with SpamBayes on what I call the Supreme Court definition of spam: You know it when you see it. What sets SpamBayes apart is its ability to learn, by observing your behavior, which messages you do want to see, and the ones you don’t.

SpamBayes appears as a toolbar item called Anti-Spam. To use the add-in effectively, you’ll need to point it to a pile of ham. These messages may simply be the contents of your inbox if you keep it squeaky clean. But they can also live in other folders. That’s great news, because I use Outlook’s filters aggressively to route messages from known correspondents to folders.

You’ll also need to point SpamBayes to a big pile of spam. In my case, that folder was called NotToMe, where an Outlook filter has long been accumulating messages that are neither To: nor CC: my primary e-mail addresses. This simple rule is so effective at filtering spam that it was my sole defense until InfoWorld installed SpamAssassin a few months ago. But lately, as I’m sure you’ve noticed, the volume of spam has exploded. Even with SpamAssassin, the hassle of plucking the few wanted messages from my NotToMe folder, plus the growing amount of spam sent to my primary e-mail addresses (and not caught by SpamAssassin), spurred me to take the next step.
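The learning that sets SpamBayes apart is essentially Bayesian word statistics: count the words in your ham and spam piles, then score new mail by which pile its words resemble. A minimal sketch, with invented training corpora far smaller than anything useful in practice:

```python
import math
from collections import Counter

def train(messages):
    """Count word occurrences across a folder of messages."""
    counts = Counter()
    for msg in messages:
        counts.update(msg.lower().split())
    return counts

def spam_probability(msg, ham_counts, spam_counts):
    """Compare each word's likelihood under the spam and ham corpora
    (with +1 smoothing), then squash the summed log-odds to [0, 1]."""
    ham_total = sum(ham_counts.values()) + 1
    spam_total = sum(spam_counts.values()) + 1
    log_odds = 0.0
    for word in msg.lower().split():
        p_spam = (spam_counts[word] + 1) / spam_total
        p_ham = (ham_counts[word] + 1) / ham_total
        log_odds += math.log(p_spam / p_ham)
    return 1 / (1 + math.exp(-log_odds))

ham = train(["lunch meeting tomorrow", "project deadline update"])
spam = train(["free money offer", "free prize winner now"])
```

Because the counts come from your own folders, the classifier converges on your personal definition of spam rather than a group-wide one, which is exactly the distinction Udell draws between SpamBayes and filters like SpamAssassin or SpamNet.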

Tomorrow: Solution Ideas (continued)