Virtual Utility

Red Herring writes:

Today, utility computing is the latest attempt at selling hosted applications. Recent technological advances like server virtualization (which makes unused parts of many computers function as one computer) make it easier for customers to share a data center’s computing resources in ever-changing configurations to suit each customer’s needs. In theory, that flexibility helps outsourcers charge for the use of processors and disks much like an energy company charges for electricity by the kilowatt-hour. In a prediction reminiscent of the rosiest forecasts of the ASP era, market research firm IDC says worldwide spending on utility computing will grow from $1 billion last year to $4.6 billion in 2007. But VCs still smarting from the ASP bust aren’t chasing that anticipated windfall by investing in utility-computing outsourcers. Instead, they are placing bets on companies that make hardware and software for utility-style data centers.

Virtualization software presents an administrator with a view of all the available computing, storage, and networking gear in a data center. By selecting an unused processor on one machine and some disks and memory on another, the administrator can fashion a virtual server to run a new application or supply extra computing power on demand without installing additional hardware.
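
To make the composition idea concrete, here is a toy sketch of a pool that assembles a virtual server from spare capacity spread across physical hosts. It is purely illustrative: the class and function names are made up for this example and do not correspond to any real virtualization API.

```python
# Conceptual sketch only: a toy resource pool that composes a "virtual server"
# from spare capacity across physical hosts. Names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    free_cpus: int
    free_mem_gb: int
    free_disk_gb: int

@dataclass
class VirtualServer:
    cpus: list = field(default_factory=list)      # (host, count) pairs
    mem_gb: list = field(default_factory=list)    # (host, GB) pairs
    disk_gb: list = field(default_factory=list)   # (host, GB) pairs

def provision(hosts, need_cpus, need_mem_gb, need_disk_gb):
    """Greedily draw CPU, memory, and disk from whichever hosts have slack."""
    vs = VirtualServer()
    for h in hosts:
        if need_cpus and h.free_cpus:
            take = min(need_cpus, h.free_cpus)
            h.free_cpus -= take; need_cpus -= take
            vs.cpus.append((h.name, take))
        if need_mem_gb and h.free_mem_gb:
            take = min(need_mem_gb, h.free_mem_gb)
            h.free_mem_gb -= take; need_mem_gb -= take
            vs.mem_gb.append((h.name, take))
        if need_disk_gb and h.free_disk_gb:
            take = min(need_disk_gb, h.free_disk_gb)
            h.free_disk_gb -= take; need_disk_gb -= take
            vs.disk_gb.append((h.name, take))
    if need_cpus or need_mem_gb or need_disk_gb:
        raise RuntimeError("not enough spare capacity in the pool")
    return vs

# Example: CPU comes from one machine, the disk from another.
pool = [Host("rack1-a", free_cpus=2, free_mem_gb=4, free_disk_gb=0),
        Host("rack1-b", free_cpus=0, free_mem_gb=8, free_disk_gb=200)]
print(provision(pool, need_cpus=1, need_mem_gb=6, need_disk_gb=100))
```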

A close cousin of virtualization is server blade technology, another hot area for utility-computing startups. Blades are removable boards containing processors and other server components. Administrators can swap them in and out of racks to provide computing power as needed.

Bottom-up Semantic Web

Kevin Werbach writes:

The latest and greatest example of the bottom-up semantic Web in action is tags. Tags are user-created labels for objects on the Web, such as pages and photos. Using a tool such as Del.icio.us (for bookmark links) or Flickr (for photos), anyone can assign tags. Once objects are tagged, users can search on those tags and retrieve human-categorized results. Technorati recently introduced tag search across blog posts, del.icio.us bookmarks, and Flickr photos, with the ability to tag other types of objects as well.
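
At bottom, a tag system is just a many-to-many mapping from user-chosen labels to objects. A minimal sketch of such an index (illustrative names only, nothing tied to Del.icio.us or Flickr’s actual internals):

```python
# Minimal sketch of a tag index: user-supplied labels mapped to Web objects
# (URLs). Illustrative only; real services keep this per-user and at far
# larger scale.
from collections import defaultdict

tag_index = defaultdict(set)   # tag -> set of URLs

def tag(url, *labels):
    for label in labels:
        tag_index[label.lower()].add(url)

def search(label):
    return sorted(tag_index.get(label.lower(), set()))

tag("http://example.com/photo/123", "sunset", "beach")
tag("http://example.com/post/politics-today", "politics", "blogs")

print(search("politics"))   # ['http://example.com/post/politics-today']
```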

What’s cool about this is that, in true Web spirit, it simply ignores the biggest problems with a decentralized system. I might think something belongs under a “politics” tag that you categorize differently. Or, different users will tag the same item in inconsistent ways. Not to mention that, to take a trivial example, “blogs,” “weblogs,” and “Web logs” might all refer to the same thing, but be treated as distinct tags. So what. Tags work well enough to be useful, despite not being perfect. Just like the Web vs. SGML, just like Ethernet vs. token ring networking, the lightweight, decentralized solution wins.

And it gets better.

The exciting part of tags is that they fit together with mechanisms to build open programmatic interfaces to Web resources. A tag category, for example, can easily become an RSS syndication feed. And more. Lots of smart people, and many startups, are coming up with intriguing applications of these new capabilities.
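
Once a tag category is exposed as a feed, consuming it programmatically takes only a few lines. A rough sketch using the feedparser library; the tag-feed URL pattern below is purely illustrative, since each service exposes (or exposed) its own endpoints:

```python
# Sketch: treating a tag category as a syndication feed. The feed URL is an
# illustrative pattern only, not a real del.icio.us/Flickr/Technorati endpoint.
import feedparser   # pip install feedparser

TAG = "politics"
feed_url = f"http://example.com/rss/tag/{TAG}"   # hypothetical tag feed

feed = feedparser.parse(feed_url)
for entry in feed.entries[:10]:
    print(entry.get("title", "(untitled)"), "->", entry.get("link", ""))
```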

The semantic Web is dead. Long live the semantic web.

Mobile Wars

Silicon.com writes:

For several reasons, the mobile phone is set to become the most influential portable electronic device. Technology is one. While the constant improvement of every part of the modern computer seems now to have relatively little impact on the desktop, it is making a huge difference for the phone. You can now fit substantial processing power and a good deal of memory into your pocket, along with decent battery life.

With half-gigabyte memory cards now readily available for well under £50, some pundits have suggested we will soon carry round all our important data. When we find a computer, it will just be a device to manage the data we already have in a phone.

Maybe – but the phone itself will soon be powerful enough to do the job, with perhaps some optional add-ons. Moreover, carrying the whole of your computer software in your pocket may be technically feasible, but the complexities imposed by the intertwining of hardware are liable to make this solution slow to progress.

Another factor is the desirability of connectivity. Wi-Fi hotspots are proving popular. But if you can remember it at all, the history of the Rabbit phone strongly suggests the ubiquitous network always wins out over the hotspot. 3G will improve bandwidth greatly and is likely to enable the operators to compete strongly against commercial Wi-Fi providers.

Microsoft seems certain to play a substantial role in the stationary systems, although Linux will also be important. Despite recent setbacks, Nokia has an immensely strong position in mobile handsets. Some handset makers are keen to work with Microsoft to create smart phones. Others will be chary, noticing the fate of many of the PC makers, including IBM.

Nokia has so far stuck firmly with software maker Symbian, while implementing links to the Microsoft desktop. Neither party has made much headway with providing tools to manage a large population of powerful computing devices that are constantly on the move. Innovation is needed and looks most likely to come from third parties that grab the opportunity.

If Microsoft wins, it will be the dominant force in a greatly expanded computing and communications environment. Nokia will be marginalised as a handset maker for the consumer who has only weak links with large organisations. If Nokia wins, the whole computing environment will be changed.

BitTorrent, eXeem, Meta-Torrent, Podcasting

Marc Eisenstadt asks: “The index that facilitates the sharing of files on a large scale is also the Achilles heel of peer-to-peer file-sharing, because it is vulnerable to litigation and closure. So what happens if the index is itself distributed? I try to get my head around the latest in peer-to-peer file sharing, and explain a bit about what I’ve learned, including the fact that BitTorrent’s power rests in its ‘swarm’ distribution model, but not necessarily in your end-user download speed. What has this got to do with podcasting?”

Consider podcasting as a time-shifted radio distribution model. In fact, podcasting generalises to RSS Media feeds, but let’s just stick with podcasting, because it is simpler to understand. I summarised the ‘so what?’ of podcasting in an earlier Get Real posting, to the effect that it completes the ‘last mile’ of the connections from the user’s point of view: you subscribe to an RSS feed that embeds within it (not unlike an email attachment) an MP3 file of interest to you, e.g. a regularly-scheduled technology review or talk radio interview, audio book, rock concert, etc., and presto-mundo, it appears on your iPod or other portable gadget whereupon you can listen while on the train, jogging, etc. All the pieces have been there for a long time, but podcasting makes it a hands-free seamless end-user experience (once you’ve done the one-time setup, at least), and that is extremely nifty. But there’s still one piece missing.
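
The mechanics of that ‘last mile’ are easy to sketch: poll the subscribed feed, look for audio enclosures, and pull down anything new. A minimal illustration (the feed URL is a placeholder; a real podcatcher such as iPodder adds scheduling and device sync on top of this):

```python
# Minimal sketch of the podcast "last mile": poll a subscribed feed, find MP3
# enclosures, and download any that aren't already on disk.
import os
import urllib.request
import feedparser   # pip install feedparser

FEED_URL = "http://example.com/techshow/rss.xml"   # hypothetical subscription
DOWNLOAD_DIR = "podcasts"
os.makedirs(DOWNLOAD_DIR, exist_ok=True)

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    for enc in entry.get("enclosures", []):
        if enc.get("type") == "audio/mpeg":
            filename = os.path.join(DOWNLOAD_DIR, os.path.basename(enc["href"]))
            if not os.path.exists(filename):
                print("fetching", enc["href"])
                urllib.request.urlretrieve(enc["href"], filename)
```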

There has been some concern expressed that RSS feeds (certainly full-text feeds) are themselves bringing the internet to its knees. This is probably something of an over-statement, but ‘enclosures’ could compound the problem. Consider this scenario: you have created a wildly successful weekly talk show, and the zillions of hits and downloads, whether directly or via RSS feeds, are killing your server, or forcing you to invest in mirror sites and similar server-centric distribution models. You are now ‘a victim of your own success’: large scale has proven self-defeating. But wait! The P2P visionaries rebel against this very thought, remember? As I wrote above, “Big scale is an asset, rather than a liability”. And in the BitTorrent world, massive scale improves throughput rather than thwarting it.

Sure enough, the guys behind podcasting are already way ahead on this one. iPodder, for example, caters to podcasters who make their MP3 RSS enclosures available as torrents. Setup is a little fiddly at this stage, but there are articles that provide how-to guides, such as “Battle the Podcast Bandwidth Beast with Bittorrent”. Wahoo!! The loop is closed! There is end-to-end content creation and delivery for the masses, with no ‘victim of its own success’ bottlenecks. The more popular a file is, the more easily it can be distributed. Awesome.
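
To see how the loop closes in code, here is a rough sketch (placeholder URLs; the hand-off step depends on whichever BitTorrent client you run) that spots torrent enclosures in a feed and fetches just the small .torrent metadata file, leaving the heavy MP3 transfer to the swarm:

```python
# Sketch: if a feed entry's enclosure is a .torrent rather than the MP3 itself,
# fetch only the small metadata file; an external BitTorrent client then pulls
# the audio from the swarm, so popularity adds capacity instead of killing it.
import urllib.request
import feedparser   # pip install feedparser

FEED_URL = "http://example.com/techshow/rss.xml"   # hypothetical torrent-enabled feed

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    for enc in entry.get("enclosures", []):
        href = enc.get("href", "")
        if enc.get("type") == "application/x-bittorrent" or href.endswith(".torrent"):
            torrent_file = href.rsplit("/", 1)[-1]
            urllib.request.urlretrieve(href, torrent_file)
            print(f"saved {torrent_file}; open it with your BitTorrent client "
                  "to join the swarm for the actual MP3")
```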

That’s the way the net was meant to be.

Apple vs Google

PC Magazine writes that “Apple and Google will each be trying to act as the spigot and control point of choice of nontechnical humans everywhere for handling the flood of digits coming onto home screens. Google will support its thrust through profits on advertising. Apple will support its thrust through profits on hardware. But they will meet in the middle.”

[Google’s] approach is to get all this stuff onto big honking hard drives and then let you search the drives any way you choose with any key words that come to mind. However, lest you forget, Google also is trying to figure out how to do what iLife does: keep track of important stuff on your personal computer hard drive and let you find it easily.

Google’s results have worked best with text. Google has yet to show its hand on how it will work with more kinds of visual imagery than still photos and illustrations. But you know a new, big thought is coming there.

Conversely, Apple comes at the same problem of harnessing huge amounts of digital stuff by figuring out the end point first: how to best display and present stuff you contribute. Then it backs up to work through how it can help get you there. Next up for iLife will be a way to display the best stuff that comes in from the Web, probably tailored to settings you easily manage. Call it, maybe, the iNet portion of iLife.

In any case, the two companies will be competing to be in control of the next generation of digital media life, when entertainment and information from in-home and remote hard drives, as well as broadcast and cable signals, are blended onto the same screen.

Stay tuned. These are the companies that are the best at reducing the complexity of our digital lives into screen displays that are simple and inviting to use. They are the two companies most devoted to looking at the digital universe from the consumer’s standpoint and delivering products and services that play to that effectively.

TECH TALK: Microsoft, Bandwidth and Centralised Computing: Comments (Part 4)

wyoung76: The main problem I have with the author’s point of view is that it assumes a Modern World perspective. As evidence that this future is still many a generation away from becoming reality, we need only look at the Third World countries and witness the total lack of infrastructure in supporting such a society of high bandwidth and low local maintenance computing. The local computer is a fast, simple, and easy way of getting the required (or needed/desired) computing power to the people in poorer nations without worrying about the HUGE commitment in upgrading or installing the infrastructure that we modern nations are beginning to take for granted.

Doc Ruby: The real trend is mobile devices, DRM, and cheap bandwidth to home servers at local centers of always-on P2P networks. The huge mass market of less sophisticated/tolerant users, and the peripheral attention offered by personal mobile devices mean the devices will be multimedia terminals with wireless networking. The media industry orientation towards DRM means they’ll give away mobiles at a loss to sell their more scalable/profitable media products, while ensuring the terminals can’t copy the media objects. While the whole network will become much more complex under the hood, the market will demand that it all “just works”, like TV (IOW, when it doesn’t work, there’s nothing you can do about it but wait). That’s why Microsoft is evolving into a media company (games, interactive “TV”), enforcing the consumption of their lower quality products by perpetuating the applications that they prefer/require to “play”. So we’re going in the direction predicted by this story, but along the way the changes will be much different.

neurocutie: Despite increasing bandwidth out to the Internet as a compelling force, equally powerful trends suggest the continued importance and popularity of the home PC. Most of these trends can be summed up as needing even higher bandwidth locally, as well as needing specific interfacing of other devices, both of which aren’t likely to be reasonably handled by some form of thin client. One example is all the reasons to burn personalized CDs or DVDs: it is not likely that burning CDs or DVDs would happen straight over the Internet without some kind of fast local store (i.e. hard disk). Another is interfacing digital and video cameras and editing those results. Again, it doesn’t seem reasonable to build a thin client to interface these devices just to ship the many gigs of data (particularly video) out over the Internet to a remote fileserver and, worse, to perform editing against the remote fileserver. These applications, popular on the home front, pretty much dictate a home PC-like architecture with a fast, large local file store.

Craig Maloney: I think the argument for a more service-based PC has some major issues to get around: First, there needs to be some receiver machine at the home end. A reasonable computer can be had for around $500 nowadays. Unless this subscriber machine can be had for less than $200, there is no incentive to move to this model. Second, nothing is free. This service will be a subscription-based service. I think it would have had some bearing had people not been burned by subscriptions from other companies. Witness the cable companies and TiVo and how they’ve handled their subscriptions. Witness the cellphone subscriptions. Paying outrageous rates for using a computer won’t succeed if there is no economic reason to do so. People will sooner purchase Macintoshes. Thirdly, there is the issue of control. You’re dealing with people’s data, and their private information. I will never relinquish control of my checkbook, nor my family pictures, nor anything else like that. Some people may be amenable to this, but many will not. The computer is a multimedia device now, and people have scads of personal data on their computers. It’ll take a very convincing argument, and a company with a reputation for integrity, to wrest away that desire for control. The PC as we know it will change, but I see that change moving more to a home entertainment/personal network than a service-based machine.

Dutky: The solution to the increasing administrative burden on computer users is not to hire someone to do the administration: instead, we need computers that actually reduce the amount of administration required or make the task of administration markedly easier. This is what personal computers did 40 years ago, and it can be done again.

In the next two columns, we will summarise the arguments for and against centralised computing. The final two columns in this series will then discuss the notion of centralised computing in the context of emerging markets and what Microsoft should do.

Tomorrow: The Arguments For Centralised Computing
