Business 2.0 writes on the new approach to creating companies: “Oddpost is part of an emerging breed of here-today, bought-tomorrow startups that are sprouting with minimal funding, flowering briefly, and being gobbled up by far bigger companies. In many instances, these built-to-flip outfits forgo — or sometimes can’t get — money from venture capitalists. They instead create shoestring operations focused on the rapid development of narrow technologies to plug gaps in existing product lines or add useful features to existing products. Then they look to a deep-pocketed patron to scoop them up.”
WSJ writes about Azul, which has been founded by Stephen DeWitt who earlier headed Cobalt, which was acquired by Sun for $2 billion:
Departing from an industry trend toward standard chips, Azul Systems Inc. says it has packed the equivalent of 24 microprocessors on a single piece of silicon. Many companies are offering or developing such “multicore” chips, including International Business Machines Corp. and Intel Corp., but started out by squeezing two to four processors on each chip.
Azul’s chips are tailored for software programs that are written using a new generation of programming technologies, including Java from Sun Microsystems Inc. and Microsoft Corp.’s .NET. Azul plans to offer special-purpose server systems — in sizes ranging from 96 to 384 processors — that it believes will be much more efficient and powerful than existing machines for running such software.
The target audience for Azul’s new gear is corporate managers who are struggling to estimate how many servers to buy. Where each of those systems is typically assigned to run a single program or two, Azul’s machines, by contrast, are designed to handle changing workloads from many programs.
“We wanted to fundamentally eliminate the issues of capacity planning around computing,” Mr. DeWitt said.
Azul was largely inspired by the evolution of data-storage systems, Mr. DeWitt said. Where companies used to buy storage hardware from their computer vendor — which was mainly designed to work with its products — technology standards emerged in the 1990s that allowed storage systems to hold files of any type that come from nearly any kind of computer.
New programming technologies, such as Java and .NET, use a layer of translation software, called a virtual machine, that allows an application program to run on multiple kinds of computers and operating systems. Though programs based on such technologies are a fraction of the software companies use today, Mr. DeWitt cites estimates that 80% of new programs by 2008 will be based on virtual-machine approaches.
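The “layer of translation software” described above can be made concrete with a toy sketch. This is a hypothetical illustration invented for this post, not Java’s or .NET’s actual implementation: the program is expressed as portable bytecode, and a small interpreter (the “virtual machine”), ported once per platform, executes it, so the same compiled program runs unchanged anywhere the interpreter exists.

```python
def run(bytecode):
    """Interpret a tiny stack-based bytecode: a list of (op, arg) pairs."""
    stack = []
    for op, arg in bytecode:
        if op == "PUSH":            # push a constant onto the stack
            stack.append(arg)
        elif op == "ADD":           # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":           # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# The "compiled" program is platform-neutral data, not native machine code;
# only run() needs to be ported to each processor and operating system.
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PUSH", 4), ("MUL", None)]
print(run(program))  # (2 + 3) * 4 = 20
```

Real virtual machines add just-in-time compilation on top of this interpretive loop, which is what makes Azul’s bet on VM-based workloads plausible: the hardware only has to run the VM well.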
Technology Review presents its fourth class of 100 remarkable innovators under 35 who are transforming technology and the world.
It would be interesting to prepare a list of Indians working in India who are doing the same. I am sure there is a lot of innovation happening in India, but it hasn’t bubbled up yet. Any suggestions?
Technology Review writes: “A startup claims it has created software that lets programs run on any operating system and any processor without slowing down. Is the hype for real this time?”
Software emulators (software that allows another piece of software to run on hardware for which it was not originally intended) have been an elusive goal for the computing industry for almost 30 years. The ability to port software to multiple hardware configurations is something companies such as IBM, Intel, and Sun Microsystems are constantly working on. Software emulators do exist today, but most are narrowly focused, allowing one particular program to run on one other processor type. Sometimes, performance suffers with the use of an emulator.
It was with a shock, then, that I read the announcement by tiny Transitive Software of a new product, Quick Transit, which it claims allows software applications compiled for one processor and operating system to run on another processor and operating system without any source code or binary changes. My first thoughts went straight to the heart of the Linux/Microsoft battle. Could this software emulator be used to run Microsoft programs on Linux? And wouldn’t that be inviting the full wrath of the Microsoft legal team?
I called the Los Gatos, CA-based startup to learn more and ended up talking with CEO Bob Wiederhold, who spoke from Manchester, England, home of the company’s engineering offices. Wiederhold immediately dashed my grander ideas. “If we tried to run Windows programs on a Linux platform, Microsoft would be upset,” Wiederhold said. “That’s not what we’re trying to do.” Wiederhold’s initial goals are less incendiary, but could bring about big changes in the way companies manage their technology assets. What’s more, the technology could eventually drift down to the consumer level, where it could allow older video games to play on newer versions of game platforms (such as Microsoft’s Xbox or Sony’s PlayStation). The initial target market for the product, however, is large computer makers.
Wiederhold says Quick Transit has been in development for nine years, and that it’s the first software emulator that works with a broad array of processors with minimal performance degradation. Typically, software emulators, when they do work, suffer performance hits; a cursor arrow struggles to move across the screen, or there’s a two-second delay after clicking on a file menu before the dialog box opens. Analysts who have seen Quick Transit report that it exhibits no such degradation.
Wired News also wrote about Transitive earlier.
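One reason a translator can avoid the classic emulator slowdown is worth sketching. The “source ISA” and translation scheme below are invented for illustration and are not Transitive’s actual design: rather than re-decoding every foreign instruction on every pass, a dynamic binary translator converts each block of foreign code once, caches the result as a host-native routine, and reuses it on subsequent executions, so the decoding cost is paid only on first sight.

```python
translation_cache = {}  # source block -> callable standing in for host machine code

def translate(block):
    """Translate a block of (op, arg) source instructions into one Python
    function (a stand-in for generated native code)."""
    def host_routine(x):
        for op, arg in block:
            if op == "ADD":     # add a constant
                x += arg
            elif op == "SHL":   # shift left: multiply by 2**arg
                x <<= arg
        return x
    return host_routine

def execute(block, x):
    key = tuple(block)
    if key not in translation_cache:      # translate only the first time
        translation_cache[key] = translate(block)
    return translation_cache[key](x)      # cached routine reused thereafter

block = [("ADD", 3), ("SHL", 1)]
print(execute(block, 5))   # (5 + 3) << 1 = 16; block translated here
print(execute(block, 10))  # (10 + 3) << 1 = 26; cached translation reused
```

Since most programs spend their time in hot loops, nearly all executions hit the cache, which is the intuition behind claims of near-native performance.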
A few companies, often market leaders in their industries, have moved away from single-transaction interactions with suppliers. These leading corporate buyers have built what we call an advantaged supply network. An advantaged supply network does not have pricing self-interest as the only basis for the buyer-supplier relationship; rather, it aims for participants in the network jointly to create competitive advantage from diverse sources for themselves and for others. Buyers strive to work closely with suppliers to attack inefficiencies and waste in the supply chain, to coordinate their business strategies, and to manage resources together for competitive advantage. Efficiency and innovation in manufacturing are gained through such cooperative buyer-supplier strategies as collaborative product and process planning, integrated engineering and design, and other forms of cooperation that lower total costs, decrease time to market, and improve the quality of the entire supply base’s output.
Whereas the price-driven transactional management model encourages transient relationships between buyers and suppliers, the advantaged supply network creates incentives for buyers to build deeper and longer-lasting relationships with suppliers, so that both sides can more effectively pursue, over time, many opportunities to bolster economic stability and competitive advantage. The network also encourages players to look for and eliminate waste.
WSJ has an article by Laura Landro on EMRs in the US context:
In New York’s Hudson Valley, more than 600,000 patients are blazing a trail with a new regional medical-information network that lets area hospitals, doctors, labs and pharmacies share medical records securely over the Internet.
The Taconic Health Information Network and Community project is one of the most ambitious efforts yet in a growing movement to establish large regional health-information networks around the country. While it may be a decade or more before Americans have a national system of electronic medical records — as promised this year by the Bush administration — more than 100 state and local groups are moving quickly to establish their own networks, securing seed money from federal agencies and nonprofit groups, and lining up local employers and health plans to offer financial incentives, including bonuses for doctors to participate.
The regional networks aim to get local providers to convert patients’ paper medical files to electronic records, and persuade doctors to exchange pertinent information with a patient’s other health-care providers. By using a single network, regional health groups say they can reduce medical mistakes, better track patients with chronic diseases such as diabetes, zip prescriptions electronically to pharmacies, and cut costs by eliminating duplicated lab tests and X-rays.
“The simple vision is that we want to see every American covered by one or more regional health-information organizations,” says David Brailer, who was appointed as the nation’s first health-information-technology coordinator this year. Regional networks are better suited to meet the needs of specific geographic populations, he says, and eventually, the regional networks can all be interconnected to form a national network that will enable officials to track health trends, report disease outbreaks and better identify public-health issues.
Recently, there was speculation that Google was building a network computer along with its own browser. The New York Post wrote:
The broader concept Google is pursuing is similar to the “network computer” envisioned by Oracle chief Larry Ellison during a speech in 1995.
The idea is that companies or consumers could buy a machine that costs only about $200, or less, but that has very little hard drive space and almost no software. Instead, users would access a network through a browser and access all their programs and data there.
The concept floundered, but programmers note that Google could easily pick up the ball. Already, its Gmail free e-mail system gives users 1,000 megabytes of storage space on a remote network, providing consumers a virtual hard drive.
“I think a similar thing [to the network computer] is developing in a more organic way now,” said Jason Kottke, a New York-based Web developer who follows Google’s moves. “People are ready for it. Instead of most of your interaction happening with Windows or Mac, you’re spending a lot of time with Google-built interfaces.”
News.com wrote: Google has also been rumored to be working on a thin-client operating system that would compete with Microsoft in areas beyond search. Techies have even discussed the idea of Google becoming a file storage system.
A commentary on ZDNet added:
What Google must do is get itself on the desktop. The obvious Google-shaped hole is local searching, where Microsoft has a history of conspicuous failure. A browser plug-in that amalgamated general file management with knowledge of Outlook, multimedia data types and online searching would be tempting indeed. Add extra features such as integrated email, instant messaging, automated backup to a remote storage facility and so on, and it gets very interesting. That would need considerable browser smarts, but would extend the Google brand right into the heart of the unconquered desktop where it would stick like glue.
By effectively combining local computing and the Web in this way Google would open up multiple revenue models. As well as advertising-supported and subscription services, it could start to offer very effective antivirus and other security functions (your data, safe in their hands) as well as any number of cleverly targeted sales opportunities based on what it knows about your personal file mix.
It would also remove one of the big barriers that stops people moving from Windows to open source. If all your important data has been painlessly stored on Google’s farm and there’s a neat, powerful Java browser-based management tool to retrieve it, you can skip from OS to OS without breaking into a sweat.
Google may not be the only one thinking about networked computers. A recent story in Business Week mentioned that AMD is planning to announce as early as October that it is teaming up with contract manufacturers to create an inexpensive, networked PC for sale in India or China. It’s part of [CEO] Ruiz’s ambitious plan to help connect 50% of the world’s population to the Internet by 2015.
So, is the network computer just a dream or will it become a reality? Given that we already have ever-cheaper computers, cellphones, TVs and gaming consoles, do we really need a fifth device? Will the network computer succeed in its second avatar? Is the network computer idea the harbinger of a deeper shift in computing?
As we seek to answer these questions, we need to first understand what a network computer is.
Tomorrow: What Is It?