Craig Barrett Interview

Excerpts from an SF Gate interview:

You didn’t have the right specifications initially for the product. You didn’t staff the project properly, either in numbers or with the right expertise. Or you weren’t carefully managing the project. In the 20 years I’ve looked at projects at Intel that have not met their schedules, the cause usually falls into one of those three categories.

With this convergence of computing and communications, wireless technologies are popping up everywhere. There’s a lot of excitement around the concept of a digital home, where you have digital entertainment, digital computing, digital capability, and the ability to move and manipulate personal and professional content around the home.

There are four things you can do in the United States to be competitive, and none of them is easy. The education system is first and foremost. You need to fix the K-12 education system and have a higher influx of kids into college in the technical areas.

The second one is research and development, because R&D is the seed corn for products and services of the future. How much does the U.S. invest annually in agricultural subsidies, the industry of the 19th century? If you put food stamps in, you can get to a figure of $30 billion or $35 billion. If you keep food stamps out, you get $20 billion to $25 billion. How much does the United States invest annually in basic R&D in physical sciences? About $5 billion.

Depending on how you count it, you spend four to six times more on agricultural subsidies, the industry of the 19th century, than you invest in producing the ideas for the industries of the 21st century. So R&D spending is critical. The third thing is infrastructure: not bridges or roads, but communications infrastructure, information technology infrastructure. You know that the United States is a laggard in broadband. We’re kind of a third-world country from a wireless standpoint.

And the last thing is the Hippocratic oath of “do no harm,” applied not to doctors but to governments. California is a wonderful example of where government rules, regulations and policies are not only restrictive but detrimental, driving business away. Other countries are pursuing investment much more aggressively than the United States.

Mobile Web

Richard MacManus writes:

One of the key factors in uptake of the Mobile Internet is data speed. And although in terms of subscribers and developers we’re getting closer to Mobile Internet Nirvana, the fact is a lot of us are still on pre-3G mobile networks. Roland Tanglao recently called it “the GPRS version of the mobile internet”, and we in New Zealand are in the same boat. NZ has GPRS and CDMA mobile networks, but we’ve been promised 3G for years. Our neighbour Australia is a bit ahead of us in the mobile world, as Hutchison already has a 3G network under the brand name 3.

Apart from speed, the user-friendliness of the mobile internet and its applications is another hurdle. As of this date, it’s still a pain for people to use a pokey little keypad and screen for mobile internet. The mobile jigsaw (fitting all the pieces together) I wrote about earlier is also an issue.

On the positive side, the handsets available these days are much easier to use and have more functionality than even a couple of years ago – and they will get even better before the year is out. Plus with people like Russell developing new services and apps, there’s a lot of developer enthusiasm around (don’t forget it wasn’t that long ago that WAP in particular was ridiculed by developers). So mobile apps and services are getting increasingly user-friendly.

As Russell has explained, it’s a new form of media. Just as eBooks shouldn’t just duplicate paper books, the Mobile Web shouldn’t be about replicating PC Websites and apps on a mobile platform. And as Sir Tim says, it’s all about extending the Web, so that the Mobile Web complements and interoperates with the PC Web.

Distributed Directories

Dave Winer points to a GeekCentral post. Dave adds: “The right way to do it is decentralized, using a convenient XML format for representing and editing directories. We happen to have one, it’s quite popular and has a lot more power than is being used. Please see the Googlish way to do directories, which could easily be the MSN-way to do directories, or what Yahoo can do when they’re ready to give up the centralized model. It’s really simple. Teach your search engine to look inside OPML files, and index them using the same page-ranking method you use for HTML. When the searcher wants to go into a directory, display it like Yahoo or DMOZ. Voila, let a thousand directories bloom. If someone tries to ‘own’ a category, route around them. The Web doesn’t have a single home page, why should directories? Competition is the way of the Web.”
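
To make the mechanics concrete, here is a minimal sketch in Python of what “teaching a search engine to look inside OPML files” might involve. The directory URL and the index structure are hypothetical illustrations, not anything Winer specifies:

```python
# Minimal sketch of indexing a decentralized OPML directory.
# The URL and the index structure are hypothetical illustrations.
import urllib.request
import xml.etree.ElementTree as ET

def index_opml(url, index):
    """Fetch an OPML file and add its outline entries to a simple index."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)

    # OPML nests <outline> elements; each may carry text, a link to a page,
    # or a link to another OPML file (a sub-directory).
    for node in tree.iter("outline"):
        text = node.get("text", "")
        link = node.get("url") or node.get("xmlUrl") or node.get("htmlUrl")
        if link and link.endswith(".opml"):
            # Recurse into sub-directories; a real crawler would also track
            # visited files to avoid cycles.
            index_opml(link, index)
        elif link:
            # A real engine would feed this into its page-ranking pipeline;
            # here we just record which directories mention which links.
            index.setdefault(link, []).append((url, text))

index = {}
index_opml("http://example.com/directory.opml", index)  # hypothetical URL
for link, mentions in index.items():
    print(link, "listed in", len(mentions), "directories")
```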

Distributed directories are the way to construct the Memex.

Let Users Control Data and Processes

Phil Wainewright writes:

One of the myths about ASPs has always been that they’ll fail because people won’t want to entrust their data to a third party. This has always been an absurd myth — by the same logic, businesses should keep all their cash on-site rather than having banks manage it, which of course would be ridiculous. But the focus on data has always been missing the point anyway. It’s not the data itself, it’s what you do with it that matters. Process is the thing that businesses don’t want to have third parties in control of. And the irony of course is that traditional software is suffering a backlash precisely because it forces companies to yield up control of their process automation to software vendors and their systems integrator collaborators.

What Jon pointed out was that the latest generation of online web services providers are leaving users in control of both data and process. We’re talking about software providers that don’t even need you to give them your data. They simply add process to it by interacting with it, and if users decide to discontinue those processes, they simply withdraw their interaction.
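
One way to picture what this means in practice: a service that fetches data the user hosts, adds process by computing over it, and retains nothing. A minimal sketch, where the CSV format, the “amount” column, and the user-controlled URL are all hypothetical assumptions:

```python
# A minimal sketch of "adding process without taking the data", assuming
# the user keeps their data at a URL they control (names are hypothetical).
import csv
import io
import urllib.request

def summarize_expenses(user_data_url):
    """Fetch the user's CSV, compute a summary, keep nothing afterwards."""
    with urllib.request.urlopen(user_data_url) as response:
        text = response.read().decode("utf-8")

    total = 0.0
    rows = 0
    for row in csv.DictReader(io.StringIO(text)):
        total += float(row["amount"])   # assumes an 'amount' column
        rows += 1

    # The service returns derived value only; the data of record never
    # leaves the user's control, so withdrawing access ends the relationship.
    return {"rows": rows, "total": total}

print(summarize_expenses("http://example.com/my-expenses.csv"))  # hypothetical
```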

This is a great example of how far out of the box people are going to have to think to really take advantage of service-oriented architectures. As Jon points out, even a leading light of the online services revolution like Amazon hasn’t fully got it, because it still tries to own user reviews rather than simply linking to them in some kind of value-added aggregation or syndication model.

Greg Gianforte, the CEO of CRM provider RightNow Technologies, likes to say that we’re just at the beginning of several decades of exploitation of the software services model. Jon Udell’s examples of next-generation infoware are a great illustration of just how far we still have to travel.

Intel’s WiMax View

Intel Technology Journal has an issue dedicated to WiMax. Here is how it sees the likely deployment scenario:

Service providers will operate WiMAX on licensed and unlicensed frequencies. The technology enables long distance wireless connections with speeds up to 75 megabits per second. (However, network planning assumes a WiMAX base station installation will cover the same area as cellular base stations do today.) Wireless WANs based on WiMAX technology cover a much greater distance than Wireless Local Area Networks (WLAN), connecting buildings to one another over a broad geographic area. WiMAX can be used for a number of applications, including “last mile” broadband connections, hotspot and cellular backhaul, and high-speed enterprise connectivity for businesses.
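
As a rough back-of-the-envelope illustration, the 75 Mbps is shared capacity per base station, so per-user throughput depends on how many active subscribers it serves. The subscriber counts below are assumptions for illustration, not figures from the journal:

```python
# Back-of-the-envelope: 75 Mbps per base station is shared capacity.
# The subscriber counts are illustrative assumptions, not Intel figures.
sector_capacity_mbps = 75.0

for active_subscribers in (10, 50, 200):
    per_user = sector_capacity_mbps / active_subscribers
    print(f"{active_subscribers:>3} active users -> ~{per_user:.2f} Mbps each")
```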

Intel sees WiMAX deploying in three phases: the first phase of WiMAX technology (based on IEEE 802.16-2004) will provide fixed wireless connections via outdoor antennas in the first half of 2005. Outdoor fixed wireless can be used for high-throughput enterprise connections (T1/E1 class services), hotspot and cellular network backhaul, and premium residential services.

In the second half of 2005, WiMAX will be available for indoor installation, with smaller antennas similar to 802.11-based WLAN access points today. In this fixed indoor model, WiMAX will be available for use in wide consumer residential broadband deployments, as these devices become “user installable,” lowering installation costs for carriers.

By 2006, technology based on the IEEE 802.16e standards will be integrated into portable computers to support movement between WiMAX service areas. This allows for portable and mobile applications and services. In the future, WiMAX capabilities will even be integrated into mobile handsets.

Broadband Gaming

[via Rafat Ali] Communications Engineering & Design writes:

When it comes to broadband gaming, it looks like the cable industry will be playing for keeps.

And why not? It’s pretty much a consensus among industry analysts that the online gaming market will blow up (in a good way) over the coming years.

Accounting for $353 million in subscriptions and sales revenue in 2003, the market will triple to more than $1 billion by 2008, forecasts the Yankee Group, in a recent study. Throw in advertising revenue and the figure could approach $4 billion, says research firm InStat/MDR, a sister company to CED.

That’s hefty growth for a sector that’s quickly shedding its label as a niche market, and it’s no surprise that cable operators are positioning themselves to grab a piece of that pie. Armed with high-speed pipes and a gaming-friendly PacketCable Multimedia (PCMM) architecture looming on the horizon, it’s fair to say that cable definitely has gaming on the agenda.

TECH TALK: The Network Computer: Ellison’s Ideas

Oracle CEO Larry Ellison first touted the idea of a network computer as early as 1995, introducing his vision of a small, inexpensive device that makes it easy to run applications that access information via the Internet.

Wally Bock takes up the story:

The reasons were obvious, at least to [Ellison]. PCs had gotten too complicated, he said. And besides, every fifteen years or so there’s a new revolutionary product in the computing business that replaces what went before.

Network Computers were supposed to be slimmed-down versions of Personal Computers. They might have a screen, a microprocessor, some memory chips, a keyboard and a mouse. The most important component would be the network connection.

That magic part would connect the Network Computer to the Net. There would only be rudimentary software and memory on the Network Computer. Most software and serious memory would be out there on the Net where it could be easily maintained. The system would run on Java and use Oracle databases. Microsoft software would be nowhere in sight.

The idea caught on with industry insiders, journalists, venture capitalists and other trumpeters of the great Internet Bubble. In mid-1996, Business Week devoted a Special Report to the Network Computer.

By then there was a price target of $500, and all kinds of companies were lining up to make products that would capitalize on this powerful Network Computer trend. Bandai, a Japanese company, announced that it would soon be selling a $600 Network Computer in the US. Apple had designed one called the Pippin.

There were only two figures in the computer business who didn’t share the enthusiasm for the Network Computer concept. Andy Grove of Intel hedged his bets by having teams work on projects that would set Intel up to be a player if Ellison was right. Bill Gates set Microsoft on a path of developing its own solutions.

Larry Ellison said that Network Computers would be widely available in 1996, but by mid-1997 that hadn’t happened. Gateway had released a Network Computer in May, but that was about it. Still, no less an authority than The Economist was saying that “there is broad agreement that NCs are indeed the future.”

Not much more happened in 1997, but in 1998 The Economist was still predicting that the PC would be “entering its twilight years by the beginning of the millennium.” The idea of the Network Computer as an “information appliance” had caught on, and there were predictions that the folks who would really shift the Network Computer revolution into high gear would be companies like Sony.

Well, as it turned out, network computers didn’t really happen. But the idea refuses to die. Ellison resurrected it. In November 1999, this is what News.com wrote:

Though hotly debated in computer industry circles in 1996 and 1997, the network computing concept failed to gain a market foothold, in part because PC prices suddenly fell to historic lows, lessening the need for new, low-cost systems. When Ellison and others first began touting the network computer, traditional “standalone” desktops typically cost well over $1,500. Today prices begin around $400.

But as prices dropped, people seemed to conclude that controlling sophisticated software applications is less important than using the Internet. Home consumers in particular often rely on so-called Web-based applications such as Hotmail, and Internet-based corporate networks are commonplace.

PC makers and electronics companies have accordingly turned their sights toward manufacturing easy-to-use “information appliances” that deliver email and Web access. Thus the network computer, one of the first devices to contemplate doing away with personal hard drives and relying on network storage instead, could be positioned for a comeback, even if questions about the server end of the equation remain.

Ellison, the flamboyant head of the world’s leading database maker, said Wall Street’s reaction to Liberate [which was earlier called Network Computer] endorses his vision of “thin client” devices such as network computers, telephones and palm-size devices that work with applications from central computers.

“The personal computer is a ridiculous device,” Ellison said, arguing that while information appliances won’t obviate the need for PCs, the latter have hidden costs, create more labor for corporate information technology departments and don’t make sense for many users with scaled-down PC needs.

So, what exactly went wrong with the Network Computer?

Tomorrow: What Went Wrong
