Chess and Business

[via Anish Sankhalia] Fast Company has an article by Garry Kasparov:

Ultimately, what separates a winner from a loser at the grand-master level is the willingness to do the unthinkable. A brilliant strategy is, certainly, a matter of intelligence, but intelligence without audaciousness is not enough. Given the opportunity, I must have the guts to explode the game, to upend my opponent’s thinking and, in so doing, unnerve him.

So it is in business: One does not succeed by sticking to convention. When your opponent can easily anticipate every move you make, your strategy deteriorates and becomes commoditized. So, yes, a sort of courage is paramount. But that courage must be tempered by other less-glamorous qualities.

For one thing, the game requires the discipline to think beyond the present — and beyond yourself. You must consider not just your side of the board but also your opponent’s. For every move you ponder, you must mentally calculate your opponent’s response — not just the immediate one, but those 10 or 15 moves ahead.

At the highest levels of chess, before you touch a piece, you are playing out an entire game of moves and countermoves in your head. In effect, you are thinking for two people. In business, too, successful strategists think not just about their own new products, pricing, and marketing but also about how their rivals will respond — and how to respond to them. Can you imagine not doing so?

Smart executives, correspondingly, must understand that their competitors are at least as smart as they are. Only the most arrogant fail to acknowledge that they do not have a monopoly on brainpower, ideas, or will. In chess, I know that my rival sees everything I see. Even if I do the unthinkable — a bold, unprecedented move calculated to leave him gasping — I must assume he has anticipated it and will have an equally daring answer. Call it the courage to accept humility.

Tim Berners-Lee on the Semantic Web

Excerpts from an interview in Technology Review:

The common thread to the Semantic Web is that there's lots of information out there—financial information, weather information, corporate information—on databases, spreadsheets, and websites that you can read but you can't manipulate. The key thing is that this data exists, but the computers don't know what it is and how it interrelates. You can't write programs to use it.

But when there's a web of interesting global semantic data, then you'll be able to combine the data you know about with other data that you don't know about. Our lives will be enriched by this data, which we didn't have access to before, and we'll be able to write programs that will actually help because they'll be able to understand the data out there rather than just presenting it to us on the screen.

Suppose you're browsing the Web and you find a seminar advertised, and you decide to go. Now, there is all sorts of information on that page, which is accessible to you as a human being, but your computer doesn't know what it means. So you must open a new calendar entry and paste the information in there. Then get your address book and add new entries for the people involved in the seminar. And then, if you wanted to be complete, find the latitude and the longitude of the seminar, and program that into your GPS [Global Positioning System] device so you could find it.

It's very laborious to do all this by hand. What you would like to be able to do is just tell the computer, "I'm going to this seminar." If there were a Semantic Web version of the page, it would have labeled information on it that would tell the computer this is an event, and what time and date it is. And it would automatically add your travel to your event book. It would add the people to your address book, and it would program your GPS to give you directions. It would have the relationships between the event and the various people chairing it. And those people would have Semantic Web personal pages, which contained information about how you could contact them.

Your address book can now grow from a closed repository of private data to a view on the people-related data in the world.
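To make the seminar example concrete, here is a minimal sketch of what that "labeled information" might look like as RDF triples, using Python's rdflib package. The URIs, the event details, and the choice of the RDF calendar and WGS84 geo vocabularies are illustrative assumptions, not anything Berners-Lee specifies in the interview.

```python
# A minimal sketch of the seminar example as machine-readable RDF triples,
# using rdflib (pip install rdflib). All URIs and values are illustrative.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

ICAL = Namespace("http://www.w3.org/2002/12/cal/ical#")          # RDF calendar vocabulary
GEO = Namespace("http://www.w3.org/2003/01/geo/wgs84_pos#")      # lat/long vocabulary

g = Graph()
event = URIRef("http://example.org/seminars/42")  # hypothetical seminar page

# The same facts a human reads off the page, now labeled for machines:
g.add((event, RDF.type, ICAL.Vevent))
g.add((event, ICAL.summary, Literal("Semantic Web Seminar")))
g.add((event, ICAL.dtstart, Literal("2004-11-18T14:00:00")))
g.add((event, GEO.lat, Literal("51.4768")))
g.add((event, GEO.long, Literal("-0.0005")))

# A calendar program no longer scrapes the page; it queries the data:
for ev in g.subjects(RDF.type, ICAL.Vevent):
    print(g.value(ev, ICAL.summary), "at", g.value(ev, ICAL.dtstart))
    print("GPS target:", g.value(ev, GEO.lat), g.value(ev, GEO.long))
```

The point of the sketch is the division of labour: the page carries typed facts, and the calendar, address book, and GPS each pick out the predicates they understand.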

VoIP Insurrection

Om Malik has a guest column by Daniel Berninger, senior analyst for Tier1 Research:

As of 2004, every project at the post-divestiture AT&T Labs and Lucent Technologies' Bell Labs reflects the reality of voice over Internet Protocol. Every major incumbent carrier in the United States, along with the largest cable television providers, has announced a VoIP program. And even as some upstart carriers have used VoIP to lower telephony prices dramatically, even more radical innovators threaten to lower the cost of a phone call to zero—to make it free.

The VoIP insurrection over the last decade marks a milestone in communication history no less dramatic than the arrival of the telephone in 1876. We know data networks and packetized voice will displace the long-standing pre-1995 world rooted in Alexander Graham Bell's invention. It remains uncertain whether telecom's incumbent carriers and equipment makers will continue to dominate or even survive as the information technology industry absorbs voice as a simple application of the Internet.

VoIP turns telecom into a simple extension of the consumer electronics business, because Internet applications exist without metering for time and location. Users of VoIP need not worry about the destination or duration of their calls any more than someone sending an email or browsing the web. People do not pay each time they play a CD, and communications seems headed in the same direction. Microsoft's Xbox already offers VoIP for participants in multi-player games. Metering and billing calls can easily cost more than delivering the service itself, and the flat-rate access billing model eliminates the need to solve inter-carrier compensation.

The decoupling that produces rapid improvements in connectivity and processing platforms also facilitates software development. People working on VoIP applications don't need to change the nature of the Internet with each new application, and everyone with a computer becomes a potential member of the Internet development team. Applications of the Internet, from email to the web to instant messaging and VoIP, have without exception come from the tinkering of entrepreneurs rather than from an industrial research center backed by market research.

Economic Gardening

[via Atanu Dey] An essay on the City of Littleton website:

In 1987, the City of Littleton, Colorado pioneered an entrepreneurial alternative to the traditional economic development practice of recruiting industries. This demonstration program, developed in conjunction with the Center for the New West, was called “economic gardening.”

We kicked off the project in 1989 with the idea that “economic gardening” was a better approach for Littleton (and perhaps many other communities) than “economic hunting.” By this, we meant that we intended to grow our own jobs through entrepreneurial activity instead of recruiting them. The idea was based on research by David Birch at MIT that indicated the great majority of all new jobs in any local economy were produced by the small, local businesses of the community. The recruiting coups drew major newspaper headlines but they were a minor part (often less than five percent) of job creation in most local economies.

What I love about economic gardening is the intellectual stage on which we get to explore. Its very essence requires that we understand not only the complex mechanism of economies but also the never-ending kaleidoscope of human activity as it relates to the building, maintenance and survival of communities. I doubt if we will ever completely understand it, but if we come to an appreciation of how complex a task we have undertaken, that will be a major step forward.

Marrying Fixed and Mobile Phones

The Economist writes that “new technology will abolish the difference between fixed and mobile phones.”

Hence the current enthusiasm throughout the telecoms industry for the idea of fixed-mobile convergence, which uses clever technology to provide the best of both worlds: the freedom of mobile and the reliability and low cost of fixed lines. Subscribers use the same handset to make calls via fixed lines at home and mobile networks when out and about: they have one number and one voicemail box, and receive one bill.

Behind the scenes, this involves some clever tricks. Calls are handled within the home by a small base-station plugged into a fixed-line broadband-internet connection. This base-station communicates with nearby handsets using radio technology that operates in unlicensed spectrum, such as Bluetooth or Wi-Fi (so you will need a new handset). The base-station pretends, in effect, to be an ordinary mobile-phone base-station. As you enter your house, your phone roams on to it. When you make a call, it is routed over the broadband link, which has enough capacity to handle several calls at once by different members of the household. Calls made in this way are billed as fixed-line calls. If you leave the house while making a call, you roam seamlessly back on to the ordinary mobile network. And when a friend comes to visit, her phone roams on to your base-station, but the charges for any calls made appear on her bill.
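The routing and billing rule The Economist describes is simple to state, so a toy sketch may help make it concrete. Everything below is invented for illustration (the class, the field names, the base-station label); it models only the decision logic, not any real FMC stack.

```python
# Toy model of the fixed-mobile convergence rule described above:
# in range of a home base-station, route over broadband and bill as
# fixed-line; otherwise fall back to the ordinary mobile network.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Handset:
    number: str
    # Set when the phone has roamed onto a base-station; None when out
    # and about. Both fields are illustrative assumptions.
    attached_base_station: Optional[str] = None

def route_call(handset: Handset) -> str:
    if handset.attached_base_station is not None:
        return (f"via broadband ({handset.attached_base_station}); "
                f"billed as fixed-line to {handset.number}")
    return f"via cellular network; billed as mobile to {handset.number}"

# A visiting friend's phone roams onto your base-station, but the charge
# still follows her number, not your line:
guest = Handset(number="+44 7700 900123", attached_base_station="home-bs")
print(route_call(guest))
```

Note that billing follows the handset's identity, not the base-station's owner, which is exactly the "friend comes to visit" case in the excerpt.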

China, India and the World

WSJ writes that “competition from China and India is changing the way businesses operate everywhere.”

China and India — two of the world’s hottest economic powerhouses — are rattling businesses around the globe, in very different ways. The boom in China’s world-wide exports — up 125% in four years — has left few sectors unscathed, be they garlic growers in California, jeans makers in Mexico or plastic-mold manufacturers in South Korea. India’s punch has been far softer, but the impact has still altered how hundreds of service companies from Texas to Ireland compete for billions of dollars in contracts.

The causes and consequences of each nation’s surge are somewhat different. China’s exports have boomed largely thanks to foreign investment: Lured by low labor costs, big manufacturers have surged into China to expand their production base and push down prices globally. Now manufacturers of all sizes, making everything from windshield wipers to washing machines to clothing, are scrambling either to reduce costs at home or to source more of what they make in cheaper locales. Some of the braver small fry are even setting up factories in China despite huge cultural and logistical challenges.

India, too, is prompting a massive rush east by many U.S. and European service providers. But, unlike the manufacturers that headed into China, service companies didn’t go to India until cheaper and increasingly sophisticated Indian enterprises invaded their territory. Bangalore-based consulting and information-services firm Infosys Technologies Ltd., for example, nearly tripled its overall revenue from 2000 to 2002, in large part thanks to surging sales in North America.

U.S. service companies say they have little alternative other than to confront Indian competitors on their home turf: For many of these companies, the price of manpower is king. Consulting and tech-services company Accenture Ltd. plans to have as many as 10,000 people in India by the end of this year, or about one-eighth of its entire work force.

TECH TALK: The Network Computer: What Went Wrong

One of the most detailed analyses of the reasons behind the failure of Ellison's vision of the network computer (NC) comes from Bhaskar Chakravorti in his book The Slow Pace of Fast Change. This is what Bhaskar writes:

For an answer, we must recreate the qualifying conditions for an NC-favorable endgame. Consider four crucial parties:

  • Buyers: demand-side players who decide to purchase PCs or NCs for the department
  • Users: demand-side players who work in the department
  • The NC coalition: supply-side players from Oracle, Sun and IBM
  • OEMs: supply-side players who manufacture PCs

Choice Factors for Buyers
Being responsible for buying the equipment used by the employees in various departments, the buyers were motivated by the various applications that the computers would facilitate. The other major motivator, as well as constraint, was the budget outlay necessary to meet the department's needs for computing. Buyers would also be motivated by the need to keep systems service calls under control.

Expectations about the choices of buyers in other units or firms with which this department interacts would also play a role. It is important to maintain compatibility to smoothly communicate or exchange information. The expectations about what others are buying also drive expectations about the software and support that would be available from the providers.

In larger corporations, the computing architecture used a client-server model configured with the PC as the client. Any change in the client device would have required upgrading the capabilities of these other components of the system. This would be a distinct constraint to adopting an alternative to the entrenched status quo; the costs in the existing network and servers were already sunk. Change would require the activation of a new buying process for these other parts of the information technology infrastructure.

Choice Factors for Users
Users are, in general, motivated by the desire to do their job without having to relearn how to use a device or get used to new software or interfaces. They would usually prefer the attributes of the PC over those of the NC since they do not bear the direct costs of purchase. The PC gives them the control and flexibility to utilize a vast amount of computing power independently. With a PC, the user can run programs with minimal reliance on connection to a wider network.

Choice Factors for the NC Coalition
The coalition was motivated by the desire to supplant the PC with the NC. However, for each coalition member, the degree to which it would be willing to invest in selling NCs was constrained by several other factors. The NC applications and operating system had not sufficiently matured. There was insufficient market impetus for their development at optimal scale. With the Internet and e-business initiatives emerging as the single biggest attention-grabber for executives at Oracle, IBM and Sun, as well as their most demanding customers, the coalition's marketing and sales resources were feeling constrained.

Choice Factors for the PC OEMs
A critical constraint governing the PC OEMs' choices was the PC industry structure. When the NC was being launched, the PC had become more of a commodity, with relatively low entry barriers into the PC manufacturing and assembly business. Among the so-called tier-one OEMs, there was intense competition for the high-end PCs. A similar pattern existed among lower-end PCs as well, which were continuing to take potential customers away from tier one. This dynamic was reinforced by a highly competitive component-manufacturing industry serving the OEMs.

The combination of easy entry into PC assembly, increased competitiveness, and standardization resulted in a diminished potential for product differentiation across different brands of PCs. Much of the motivation among PC OEMs was becoming focused on taking costs out of the system. The OEMs were being pushed further in this direction by the competitiveness among component makers, by the continued streamlining of production and supply-chain processes, and by the simplification of the distribution model.

Bhaskar summarises: The NC's primary point of value had been focused on the notion that it was a less-expensive alternative to the PC. The nature of the choice factors driving the highly competitive PC industry had effectively resulted in a closing of the price gap. The PC industry had de facto neutralized the NC's differential value proposition through its own internal competitiveness across PC brands. Buying behaviors were structurally incapable of changing over to the NC in the way it was positioned. The lower-cost-positioned NC was not on course toward its intended endgame.

Tomorrow: Information Appliances

Craig Barrett Interview

Excerpts from an SF Gate interview:

You didn't have the right specifications initially for the product. You didn't staff the product properly, either in numbers or with employees with the right expertise. Or you weren't carefully managing the project. In the 20 years I've looked at projects at Intel that have not met their schedule, the cause usually falls into one of those three categories.

With this convergence of computing and communications, wireless technologies are popping up everywhere. And there's lots of excitement around the concept of a digital home, where you have digital entertainment, digital computing, and the ability to move and manipulate personal content and professional content around the home.

There are four things you can do in the United States to be competitive, and none of them is easy. The education system is first and foremost. You need to fix the K-12 education system and have a higher influx of kids into college in the technical areas.

The second one is research and development, because R&D is the seed corn for products and services of the future. How much does the U.S. invest annually in agricultural subsidies, the industry of the 19th century? If you put food stamps in, you can get to a figure of $30 billion or $35 billion. If you keep food stamps out, you get $20 billion to $25 billion. How much does the United States invest annually in basic R&D in physical sciences? About $5 billion.

Depending on how you count it, you spend four to six times more on agricultural subsidies, the industry of the 19th century, than you invest in producing the ideas for the industries of the 21st century. So, R&D spending is critical. It's also infrastructure. It's not bridges or roads. It's communications infrastructure, information technology infrastructure. You know that the United States is a laggard in broadband. We're kind of a third-world country from a wireless standpoint.

And the last thing you can worry about is the Hippocratic oath of "do no harm," applied not to doctors but to governments. California (is) a wonderful example of where government rules, (and) regulations and policies are not only restrictive but detrimental, driving business away. Other countries are aggressively pursuing investment, much more than the United States.

Mobile Web

Richard MacManus writes:

One of the key factors in uptake of the Mobile Internet is data speed. And although, subscriber- and developer-wise, we're getting closer to Mobile Internet Nirvana, the fact is a lot of us are still on pre-3G mobile networks. Roland Tanglao recently called it "the GPRS version of the mobile internet" and we in New Zealand are in the same boat. NZ has GPRS and CDMA mobile networks, but we've been promised 3G for years. Our neighbour Australia is a bit ahead of us in the mobile world, as Hutchison already has a 3G network – using the brand name 3.

Apart from speed, the user-friendliness of the mobile internet and its applications is another hurdle. As of this date, it's still a pain for people to use a pokey little keypad and screen for the mobile internet. The mobile jigsaw (fitting all the pieces together) I wrote about earlier is also an issue.

On the positive side, the handsets available these days are much easier to use and have more functionality than even a couple of years ago – and they will get even better before the year is out. Plus with people like Russell developing new services and apps, there's a lot of developer enthusiasm around (don't forget it wasn't that long ago that WAP in particular was ridiculed by developers). So mobile apps and services are getting increasingly user-friendly.

As Russell expanded upon, it's a new form of media. Just as eBooks shouldn't just duplicate paper books, the Mobile Web shouldn't be about replicating PC Websites and apps onto a mobile platform. And as Sir Tim says, it's all about extending the Web so the Mobile Web complements and interoperates with the PC Web.

Distributed Directories

Dave Winer points to a GeekCentral post. Dave adds: "The right way to do it is decentralized, using a convenient XML format for representing and editing directories. We happen to have one; it's quite popular and has a lot more power than is being used. Please see the Googlish way to do directories, which could easily be the MSN way to do directories, or what Yahoo can do when they're ready to give up the centralized model. It's really simple. Teach your search engine to look inside OPML files, and index them using the same page-ranking method you use for HTML. When the searcher wants to go into a directory, display it like Yahoo or DMOZ. Voila, let a thousand directories bloom. If someone tries to 'own' a category, route around them. The Web doesn't have a single home page; why should directories? Competition is the way of the Web."

Distributed directories are the way to construct the Memex.
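Winer's "look inside OPML files" step is easy to picture in code. Here is a minimal sketch in Python using only the standard library; the file name and the breadcrumb format are placeholders, and real OPML in the wild varies in which attributes it carries.

```python
# Minimal sketch of indexing an OPML directory file, in the spirit of
# Winer's suggestion. "directory.opml" is a placeholder file name.
import xml.etree.ElementTree as ET

def walk_outline(node, trail):
    """Yield (title, url, breadcrumb) for every linked node in an OPML
    outline, so a crawler can index it alongside ordinary HTML pages."""
    for outline in node.findall("outline"):
        title = outline.get("text", "")
        # OPML link nodes typically carry "url"; feed nodes carry "xmlUrl".
        url = outline.get("url") or outline.get("xmlUrl")
        here = trail + [title]
        if url:
            yield title, url, " / ".join(here)
        yield from walk_outline(outline, here)  # categories nest freely

def index_opml(path):
    body = ET.parse(path).find("body")
    yield from walk_outline(body, [])

for title, url, category in index_opml("directory.opml"):
    print(f"{category} -> {url}")
```

Because each OPML file is just a document at a URL, anyone can publish a competing directory and a crawler can rank them all the same way it ranks pages, which is the "thousand directories" point.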

Let Users Control Data and Processes

Phil Wainewright writes:

One of the myths about ASPs has always been that they'll fail because people won't want to entrust their data to a third party. This has always been an absurd myth — by the same logic, businesses should keep all their cash on-site rather than having banks manage it, which of course would be ridiculous. But the focus on data has always been missing the point anyway. It's not the data itself, it's what you do with it that matters. Process is the thing that businesses don't want to have third parties in control of. And the irony of course is that traditional software is suffering a backlash precisely because it forces companies to yield up control of their process automation to software vendors and their systems integrator collaborators.

What Jon pointed out was that the latest generation of online web services providers are leaving users in control of both data and process. We're talking about software providers that don't even need you to give them your data. They simply add process to it by interacting with it, and if users decide to discontinue those processes, they just withdraw their interaction.
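A small sketch may help make "adding process without taking the data" concrete. The snippet below computes a view over a feed the user continues to host; the URL is hypothetical, and this is only one way such a provider might work, not anything Wainewright or Udell specifies.

```python
# Sketch of a service that adds process to data the user still hosts and
# controls. Nothing is copied or stored server-side; stop pointing the
# service at the feed and the relationship simply ends.
import urllib.request
import xml.etree.ElementTree as ET

def summarize_feed(feed_url: str) -> dict:
    """Fetch a user-hosted RSS feed and compute a derived view over it."""
    with urllib.request.urlopen(feed_url) as resp:
        root = ET.parse(resp).getroot()
    titles = [item.findtext("title", "") for item in root.iter("item")]
    return {"item_count": len(titles), "latest": titles[:3]}

# Hypothetical user-controlled data source:
print(summarize_feed("http://example.org/mydata/feed.xml"))
```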

This is a great example of how far out of the box people are going to have to think to really take advantage of service-oriented architectures. As Jon points out, even a leading light of the online services revolution like Amazon hasn't fully got it, because it still tries to own user reviews rather than simply linking to them in some kind of value-added aggregation or syndication model.

Greg Gianforte, the CEO of CRM provider RightNow Technologies, likes to say that we're just at the beginning of several decades of exploitation of the software services model. Jon Udell's examples of next-generation infoware are a great illustration of just how far we still have to travel.

Intel's WiMax View

Intel Technology Journal has an issue dedicated to WiMax. Here is how it sees the likely deployment scenario:

Service providers will operate WiMAX on licensed and unlicensed frequencies. The technology enables long-distance wireless connections with speeds up to 75 megabits per second. (However, network planning assumes a WiMAX base station installation will cover the same area as cellular base stations do today.) Wireless WANs based on WiMAX technology cover a much greater distance than Wireless Local Area Networks (WLANs), connecting buildings to one another over a broad geographic area. WiMAX can be used for a number of applications, including "last mile" broadband connections, hotspot and cellular backhaul, and high-speed enterprise connectivity for businesses.

Intel sees WiMAX deploying in three phases. The first phase (based on IEEE 802.16-2004) will provide fixed wireless connections via outdoor antennas in the first half of 2005. Outdoor fixed wireless can be used for high-throughput enterprise connections (T1/E1 class services), hotspot and cellular network backhaul, and premium residential services.

In the second half of 2005, WiMAX will be available for indoor installation, with smaller antennas similar to 802.11-based WLAN access points today. In this fixed indoor model, WiMAX will be available for use in wide consumer residential broadband deployments, as these devices become "user installable," lowering installation costs for carriers.

By 2006, technology based on the IEEE 802.16e standard will be integrated into portable computers to support movement between WiMAX service areas. This allows for portable and mobile applications and services. In the future, WiMAX capabilities will even be integrated into mobile handsets.

Broadband Gaming

[via Rafat Ali] Communications Engineering & Design writes:

When it comes to broadband gaming, it looks like the cable industry will be playing for keeps.

And why not? It's pretty much a consensus among industry analysts that the online gaming market will blow up (in a good way) over the coming years.

Accounting for $353 million in subscriptions and sales revenue in 2003, the market will triple to more than $1 billion by 2008, forecasts the Yankee Group in a recent study. Throw in advertising revenue and the figure could approach $4 billion, says research firm InStat/MDR, a sister company to CED.

That's hefty growth for a sector that's quickly shedding its label as a niche market, and it's no surprise that cable operators are positioning themselves to grab a piece of that pie. Armed with high-speed pipes, and with a gaming-friendly PacketCable Multimedia (PCMM) architecture looming on the horizon, it's fair to say that cable definitely has gaming on the agenda.

TECH TALK: The Network Computer: Ellison's Ideas

Oracle CEO Larry Ellison first touted the idea of a network computer as early as 1995: a small, inexpensive device that makes it easy to run applications that access information via the Internet.

Wally Bock takes up the story:

The reasons were obvious, at least to [Ellison]. PCs had gotten too complicated, he said. And besides, every fifteen years or so there's a new revolutionary product in the computing business that replaces what went before.

Network Computers were supposed to be slimmed-down versions of Personal Computers. They might have a screen, a microprocessor, some memory chips, a keyboard and a mouse. The most important component would be the network connection.

That magic part would connect the Network Computer to the Net. There would only be rudimentary software and memory on the Network Computer. Most software and serious memory would be out there on the Net where it could be easily maintained. The system would run on Java and use Oracle databases. Microsoft software would be nowhere in sight.

The idea caught on with industry insiders, journalists, venture capitalists and other trumpeters of the great Internet Bubble. In mid-1996, Business Week devoted a Special Report to the Network Computer.

By then there was a price target, $500, and all kinds of companies were lining up to make products that would capitalize on this powerful Network Computer trend. Bandai, a Japanese company, announced that it would soon be selling a $600 Network Computer in the US. Apple had designed one called the Pippin.

There were only two figures in the computer business who didn't share the enthusiasm for the Network Computer concept. Andy Grove of Intel hedged his bets by having teams work on projects that would set Intel up to be a player if Ellison was right. Bill Gates set Microsoft on a path of developing its own solutions.

Larry Ellison said that Network Computers would be widely available in 1996, but by mid-1997 that hadn't happened. Gateway had released a Network Computer in May, but that was about it. Still, no less an authority than the Economist was saying that "there is broad agreement that NCs are indeed the future."

Not much more happened in 1997, but in 1998 the Economist was still predicting that the PC would be "entering its twilight years by the beginning of the millennium." The idea of the Network Computer as an "information appliance" had caught on, and there were predictions that the folks who would really shift the Network Computer revolution into high gear would be companies like Sony.

Well, as it turned out, network computers didn't really happen. But the idea refuses to die, and Ellison resurrected it. In November 1999, this is what News.com wrote:

Though hotly debated in computer industry circles in 1996 and 1997, the network computing concept failed to gain a market foothold, in part because PC prices suddenly fell to historic lows–lessening the need for new, low-cost systems. When Ellison and others first began touting the network computer, traditional "standalone" desktops typically cost well over $1,500. Today prices begin around $400.

But as prices dropped, people seemed to conclude that controlling sophisticated software applications is less important than using the Internet. Home consumers in particular often rely on so-called Web-based applications such as Hotmail, and Internet-based corporate networks are commonplace.

PC makers and electronics companies have accordingly turned their sights toward manufacturing easy-to-use "information appliances" that deliver email and Web access. Thus the network computer, one of the first devices to contemplate doing away with personal hard drives and relying on network storage instead, could be positioned for a comeback–even if questions about the server end of the equation remain.

Ellison, the flamboyant head of the world's leading database maker, said Wall Street's reaction to Liberate [which was earlier called Network Computer] endorses his vision of "thin client" devices such as network computers, telephones and palm-size devices that work with applications from central computers.

"The personal computer is a ridiculous device," Ellison said, arguing that while information appliances won't obviate the need for PCs, the latter have hidden costs, create more labor for corporate information technology departments and don't make sense for many users with scaled-down PC needs.

So, what exactly went wrong with the Network Computer?

Tomorrow: What Went Wrong

Future of Search Marketing

Gord Hotchkiss looks ahead to the future of search and search marketing. Among the search trends: localisation, personalisation, integration with the desktop, unwiring, and expansion of search indexes.

In the midst of writing this article, I was called by a fund manager for a major mutual fund that has some investments in the industry. She asked me if I believed search advertising revenues would flat-line and maybe even start dropping in the near future.

I said that it's possible that revenue produced by existing business models could slow from its previous meteoric rise, but I couldn't see it ever decreasing. As keyword inventories get tapped out and bid prices find their natural ceilings, we are bound to see revenue growth slow.

But then I started talking about some of the potential of search that I've laid out in this article. We have to understand that this channel will evolve into an integrated and fundamental function of being online. It will be at the base of all we do. And the opportunities to deliver relevant, targeted marketing messages to highly motivated consumers will grow exponentially.

Will search marketing be the same as it is today? No. Will it be as straightforward? No. Will it cross over into other channels to a greater extent? Yes. Will there be money in it? Yes—billions and billions. Do search marketers have a challenge ahead of them? You have no idea how big a challenge!

Change will be the imperative for the industry. The search marketers who survive and prosper will be the ones who anticipate, pursue and embrace change. The pace of change in search will accelerate in direct relation to the amount of money invested. Microsoft's entry into search is only the beginning. As search moves to the center of the online experience through the convergence of new search functionality, it will create a white-hot tornado of demand. New technology will appear, be assimilated and become the new standard at a dizzying rate. The marketing potential of search will also move at a breakneck pace.

The largest search technology players will be investing huge amounts in monetizing this potential. They will be joined by a long line of partners waiting to jump on the rapidly moving bandwagon. Finally, we'll see large portions of traditional marketing budgets being directed to the new online chimera that has partially evolved from the search we once knew.

As with any situation that involves accelerated change, uncertainty and discontinuous innovation, there will be a huge demand for visionary practitioners to help navigate through this change. The brightest and best search marketers will accept this role and work to help advertisers plug into the new possibilities. It will take time for seamless solutions to catch up with the innovations, and until then it will be up to search marketing professionals to bridge the gaps. To do this, the search technology providers will finally, once and for all, bury the hatchet and embrace search marketing vendors as their partners in the industry. They'll have no choice: they won't have enough feet on the street to introduce the channel to all the potential advertisers and explain its intricacies. The potential for search marketing companies is huge, but so is the challenge.

Esther Dyson invests in Flickr

Esther Dyson writes:

As many people are discovering, writing blogs is hard work – sometimes almost as hard as reading them! Photos are much quicker, and almost anyone can take a decent photo – at least of/for friends. And now that it's easy to post by mail, and everyone has a digital camera or camera phone, these services are becoming great ways to sell value-added storage.

My own experience is that it's quite addictive, and Flickr has a rich but natural-feeling set of social protocols. Each time you go to your home page you see a new set of photos (unless your friends are really slacking on the job and haven't produced any new photos). You can comment on your friends' photos and they on yours, send messages, set up groups for friends and group albums for events and so on, but there's less of the me-me-me feel you get on, say, Friendster, and more of the I-see feel you get from blogs (but without so many words, lucid or otherwise).

Microsoft's New Set-Top Box

News.com provides an overview:

On the outside, it's slick, with new video-playback and photo-viewing programs, and a custom version of Internet Explorer 6 designed to make Web browsing on the television a far less painful process. On the inside, it's a Windows CE-based product with a 733MHz Celeron–slow by PC standards but downright zippy in the world of set-top boxes.

Microsoft will sell the $199 device in two ways–as a dial-up product for technology newbies with $21.95 monthly service; and as an additional way for broadband homes to view the Web for $9.95 using the existing Internet connection. Newbies, who have historically been the bulk of MSN TV subscribers, are likely to be the majority of initial customers, said MSN TV General Manager Sam Klepper.

"We think over time, broadband (subscribers) will be half or more," Klepper said in an interview at Microsoft's Silicon Valley campus here.

Many of the new features are aimed at those customers, including the ability to play music or movies stored on a PC in another room. The device can connect via wired or 802.11b wireless networks, though Microsoft plans to add support for faster 802.11g wireless networking in mid-November. Customers will get 2GB of e-mail space for their primary account and 250MB for up to 11 additional accounts.

The new box, which is being made by Thomson and sold under the RCA brand, will be shipped to stores starting next week. The product has no hard drive, but it has enough flash memory to store some data, including 100 compressed photos that can be used as part of a slide show.

Web Services and Network Computing

Headshift's Lee discusses the "number of interesting observations made lately about the ways in which web services are starting to move us towards a world of accessible distributed computing" and writes:

Microsoft, arguably the biggest barrier to progress in this direction, is facing big threats on several fronts. On one hand, Apple PCs and music devices are setting new standards in terms of usability and design, and OSX "Tiger" promises much of the functionality that Microsoft has announced and then retracted from the development of its next generation Longhorn system. At the same time, the Apple threat is linked with the more general Linux and Open source threat that Microsoft has faced for some time, because OSX is Unix-based. But while the consumer end of their market is vulnerable, the concept of a Google OS is a more fundamental danger to the Windows cash cow that Microsoft is based upon. A distributed system powered by Google's computing and search power, but which is run through a browser and Web services, could simply render Windows obsolete.

Web services are often referred to as the "plumbing" that joins together distributed applications, and without it, we can't do much of course; but we also need to think long and hard about how we implement these applications.

Comcast CEO Interview

[via Rafat Ali] WSJ has an interview with Brian Roberts, the CEO of Comcast:

Our desire is to reach out to content companies whether we are an owner or a partner or a purchaser. Right now, we are the world's largest purchaser of programming content, about $4 billion a year. We think a lot of customers will want to store that content on a box in their house — their digital-video recorder. But most customers will want much more content than the 100 to 200 hours you can store on a box. If you could access 10,000 hours, maybe someday you'll be able to access 30,000 hours or virtually unlimited content. That would be nirvana.

We've been saying for the last five years we wanted to turn Comcast into a new-products company. In the next couple of months and years we're going to have digital-video recorders and voice-over-IP phones. We already have high-definition television and video on-demand. We're also already planning on the next suite of products: videophones, video chat, interactive television, Internet search capabilities and interactive advertising.

We're approaching seven million users on Comcast's high-speed Internet service… We also have video on-demand and a very exciting arrangement with Sony and MGM to get lots of movies, and a deal with the NFL. In fact, we think eventually 10,000 hours will be available on demand. And if you then overlay that with access to the Internet, there is virtually unlimited content that consumers will be able to access on a television, a PC and perhaps on a mobile device. There is constantly going to be a need to make it easy for consumers to access what they want when they want it. Call it a search engine. Call it a portal. Call it an on-screen guide or navigation device.

Cellphones in India

WSJ writes:

Companies are jostling to capture a bigger piece of one of the world's fastest-growing markets — just this month, many of India's cellphone companies slashed rates by more than 50%. With lower rates and new handsets priced at less than $50, a cellphone is within reach of millions of middle-class Indians.

Cellphone-service providers in India, including Hutchison Essar Telecom Ltd., Reliance Infocomm Ltd. and Tata Teleservices Ltd., will be adding two million subscribers a month for the foreseeable future, analysts say. That likely will lift the number of cellphones in India past the number of fixed-line phones before year end.

The growth and increasing competition show that cellphone companies — as well as international handset makers such as Nokia Corp. and equipment makers such as Nortel Networks Corp. — recognize India is at last a market worth fighting for. As more Indians go mobile, total cellphone revenue will double this year to almost $6 billion and double again over the next two years, analysts say.

With lower rates, the average revenue per subscriber per month has fallen more than 15% this year to less than $10. The secret to survival in this market, analysts say, is offsetting the decline by finding ways to get Indians to spend more money on data services. Indians can use their phones to download pictures or songs from their favorite films or get video clips of cricket matches. In the Punjab, India's bread basket, wheat farmers can look up local and international prices, while fishermen in southern Kerala can check the different prices for their catch at each port before deciding where to dock. While data services account for about 5% of cellphone revenue, they will make up more than 20% by 2008, Ms. Desai of Gartner says.