Virtual Utility

Red Herring writes:

Today, utility computing is the latest attempt at selling hosted applications. Recent technological advances like server virtualization (which makes unused parts of many computers function as one computer) make it easier for customers to share a data center's computing resources in ever-changing configurations to suit each customer's needs. In theory, that flexibility helps outsourcers charge for the use of processors and disks much like an energy company charges for electricity by the kilowatt-hour. In a prediction reminiscent of the rosiest forecasts of the ASP era, market research firm IDC says worldwide spending on utility computing will grow from $1 billion last year to $4.6 billion in 2007. But VCs still smarting from the ASP bust aren't chasing that anticipated windfall by investing in utility-computing outsourcers. Instead, they are placing bets on companies that make hardware and software for utility-style data centers.

Virtualization software presents an administrator with a view of all the available computing, storage, and networking gear in a data center. By selecting an unused processor on one machine and some disks and memory on another, the administrator can fashion a virtual server to run a new application or supply extra computing power on demand without installing additional hardware.
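
To make the idea concrete, here is a minimal sketch of composing a virtual server from spare capacity across several physical machines. The host names, resource model, and greedy allocation are invented for illustration; this is not any vendor's actual API.

```python
# Hypothetical illustration of composing a virtual server from unused capacity
# spread across a data center; not any vendor's real API.
from dataclasses import dataclass

@dataclass
class PhysicalHost:
    name: str
    free_cpus: int
    free_memory_gb: int
    free_disk_gb: int

def provision_virtual_server(hosts, cpus, memory_gb, disk_gb):
    """Greedily claim unused CPU, memory, and disk from whichever hosts have it."""
    allocation = {}
    need = {"cpu": cpus, "mem": memory_gb, "disk": disk_gb}
    for host in hosts:
        take_cpu = min(host.free_cpus, need["cpu"])
        take_mem = min(host.free_memory_gb, need["mem"])
        take_disk = min(host.free_disk_gb, need["disk"])
        if take_cpu or take_mem or take_disk:
            allocation[host.name] = (take_cpu, take_mem, take_disk)
            host.free_cpus -= take_cpu
            host.free_memory_gb -= take_mem
            host.free_disk_gb -= take_disk
            need["cpu"] -= take_cpu
            need["mem"] -= take_mem
            need["disk"] -= take_disk
    if any(v > 0 for v in need.values()):
        raise RuntimeError("Not enough spare capacity in the data center")
    return allocation

hosts = [PhysicalHost("rack1-a", free_cpus=1, free_memory_gb=2, free_disk_gb=0),
         PhysicalHost("rack2-b", free_cpus=0, free_memory_gb=2, free_disk_gb=80)]
print(provision_virtual_server(hosts, cpus=1, memory_gb=4, disk_gb=80))
```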

A close cousin of virtualization is server blade technology, another hot area for utility-computing startups. Blades are removable boards containing processors and other server components. Administrators can swap them in and out of racks to provide computing power as needed.

Bottom-up Semantic Web

Kevin Werbach writes:

The latest and greatest example of the bottom-up semantic Web in action is tags. Tags are user-created labels for objects on the Web, such as pages and photos. Using a tool such as Del.icio.us (for bookmark links) or Flickr (for photos), anyone can assign tags. Once objects are tagged, users can search on those tags and retrieve human-categorized results. Technorati recently introduced tag search across blog posts, del.icio.us bookmarks, and Flickr photos, with the ability to tag other types of objects as well.

What’s cool about this is that, in true Web spirit, it simply ignores the biggest problems with a decentralized system. I might think something belongs under a “politics” tag that you categorize differently. Or, different users will tag the same item in inconsistent ways. Not to mention that, to take a trivial example, “blogs,” “weblogs,” and “Web logs” might all refer to the same thing, but be treated as distinct tags. So what. Tags work well enough to be useful, despite not being perfect. Just like the Web vs. SGML, just like Ethernet vs. token ring networking, the lightweight, decentralized solution wins.

And it gets better.

The exciting part of tags is that they fit together with mechanisms to build open programmatic interfaces to Web resources. A tag category, for example, can easily become an RSS syndication feed. And more. Lots of smart people, and many startups, are coming up with intriguing applications of these new capabilities.
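
As a rough sketch of how thin that glue is, turning a tag's bookmarks into an RSS feed is little more than a template. The bookmark data and URLs below are invented for illustration.

```python
# Hypothetical sketch: render the items carrying a given tag as a minimal RSS 2.0 feed.
from xml.sax.saxutils import escape

bookmarks = [
    {"title": "Utility computing primer", "url": "http://example.com/utility", "tags": ["computing", "utility"]},
    {"title": "Tagging and folksonomies", "url": "http://example.com/tags", "tags": ["tags", "web"]},
]

def feed_for_tag(tag):
    items = "".join(
        "<item><title>%s</title><link>%s</link></item>"
        % (escape(b["title"]), escape(b["url"]))
        for b in bookmarks if tag in b["tags"]
    )
    return ('<?xml version="1.0"?><rss version="2.0"><channel>'
            "<title>Bookmarks tagged '%s'</title>%s</channel></rss>" % (tag, items))

print(feed_for_tag("tags"))
```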

The semantic Web is dead. Long live the semantic web.

Mobile Wars

Silicon.com writes:

For several reasons, the mobile phone is set to become the most influential portable electronic device. Technology is one. While the constant improvement of every part of the modern computer seems now to have relatively little impact on the desktop, it is making a huge difference for the phone. You can now fit substantial processing power and a good deal of memory into your pocket, along with decent battery life.

With half-gigabyte memory cards now readily available for well under 50, some pundits have suggested we will soon carry round all our important data. When we find a computer, it will just be a device to manage the data we already have in a phone.

Maybe – but the phone will soon be powerful enough to do the job itself, perhaps with some optional add-ons. Moreover, carrying the whole of your computer software in your pocket may be technically feasible, but the complexities imposed by the intertwining of hardware and software are liable to make this solution slow to progress.

Another factor is the desirability of connectivity. Wi-Fi hotspots are proving popular. But if you can remember it at all, the history of the Rabbit phone strongly suggests the ubiquitous network always wins out over the hotspot. 3G will improve bandwidth greatly and is likely to enable the operators to compete strongly against commercial Wi-Fi providers.

Microsoft seems certain to play a substantial role in the stationary systems, although Linux will also be important. Despite recent setbacks, Nokia has an immensely strong position in mobile handsets. Some handset makers are keen to work with Microsoft to create smart phones. Others will be chary, noticing the fate of many of the PC makers, including IBM.

Nokia has so far stuck firmly with software maker Symbian, while implementing links to the Microsoft desktop. Neither party has made much headway with providing tools to manage a large population of powerful computing devices that are constantly on the move. Innovation is needed and looks most likely to come from third parties that grab the opportunity.

If Microsoft wins, it will be the dominant force in a greatly expanded computing and communications environment. Nokia will be marginalised as a handset maker for the consumer who has only weak links with large organisations. If Nokia wins, the whole computing environment will be changed.

BitTorrent, eXeem, Meta-Torrent, Podcasting

Marc Eisenstadt asks: “The index that facilitates the sharing of files on a large scale is also the Achilles heel of peer-to-peer file-sharing, because it is vulnerable to litigation and closure. So what happens if the index is itself distributed? I try to get my head around the latest in peer-to-peer file sharing, and explain a bit about what I’ve learned, including the fact that BitTorrent’s power rests in its ‘swarm’ distribution model, but not necessarily in your end-user download speed. What has this got to do with podcasting?”

Consider podcasting as a time-shifted radio distribution model. In fact, podcasting generalises to RSS Media feeds, but let's just stick with podcasting, because it is simpler to understand. I summarised the ‘so what?’ of podcasting in an earlier Get Real posting, to the effect that it completes the ‘last mile’ of the connections from the user’s point of view: you subscribe to an RSS feed that embeds within it (not unlike an email attachment) an MP3 file of interest to you, e.g. a regularly-scheduled technology review or talk radio interview, audio book, rock concert, etc., and presto-mundo, it appears on your iPod or other portable gadget, whereupon you can listen while on the train, jogging, etc. All the pieces have been there for a long time, but podcasting makes it a hands-free, seamless end-user experience (once you’ve done the one-time setup, at least), and that is extremely nifty. But there’s still one piece missing.
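
A bare-bones sketch of that ‘last mile’ looks like this: read the feed, find the enclosure URLs, and fetch any audio files not yet downloaded. The feed URL is a placeholder, and real podcatchers add scheduling, state, and device sync on top.

```python
# Minimal podcatcher sketch: parse an RSS feed, find <enclosure> URLs,
# and download any files we have not seen before. The feed URL is a placeholder.
import os
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://example.com/show/rss"   # placeholder feed address
DOWNLOAD_DIR = "podcasts"

def fetch_new_episodes():
    os.makedirs(DOWNLOAD_DIR, exist_ok=True)
    with urllib.request.urlopen(FEED_URL) as response:
        tree = ET.parse(response)
    for enclosure in tree.iter("enclosure"):
        url = enclosure.get("url")
        filename = os.path.join(DOWNLOAD_DIR, url.rsplit("/", 1)[-1])
        if not os.path.exists(filename):       # skip episodes we already have
            urllib.request.urlretrieve(url, filename)
            print("fetched", filename)

if __name__ == "__main__":
    fetch_new_episodes()
```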

There has been some concern expressed that RSS feeds (certainly full-text feeds) are themselves bringing the internet to its knees. This is probably something of an over-statement, but ‘enclosures’ could compound the problem. Consider this scenario: you have created a wildly successful weekly talk show, and the zillions of hits and downloads, whether directly or via RSS feeds, are killing your server, or forcing you to invest in mirror sites and similar server-centric distribution models. You are now ‘a victim of your own success’: large scale has proven self-defeating. But wait! The P2P visionaries rebel against this very thought, remember? As I wrote above, “Big scale is an asset, rather than a liability”. And in the BitTorrent world, massive scale improves throughput rather than thwarting it.

Sure enough, the guys behind podcasting are already way ahead on this one. iPodder, for example, caters to podcasters who make their MP3 RSS enclosures available as torrents. Setup is a little fiddly at this stage, but there are articles that provide how-to guides, such as “Battle the Podcast Bandwidth Beast with Bittorrent”. Wahoo!! The loop is closed! There is end-to-end content creation and delivery for the masses, with no ‘victim of its own success’ bottlenecks. The more popular a file is, the more easily it can be distributed. Awesome.
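
The BitTorrent twist is, again, mostly glue. A sketch of the dispatch an iPodder-style client might do when an enclosure is a .torrent rather than a plain MP3; the URL and the client command name are invented for illustration.

```python
# Sketch: hand torrent enclosures to a BitTorrent client so the swarm does the
# heavy lifting; plain files are fetched directly. Names here are illustrative.
import subprocess
import urllib.request

def handle_enclosure(url):
    if url.endswith(".torrent"):
        # Join the swarm; the more popular the file, the better the throughput.
        subprocess.run(["bittorrent-client", url])   # placeholder command name
    else:
        urllib.request.urlretrieve(url, url.rsplit("/", 1)[-1])

handle_enclosure("http://example.com/show/episode42.torrent")
```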

That’s the way the net was meant to be.

Apple vs Google

PC Magazine writes that “Apple and Google will each be trying to act as the spigot and control point of choice of nontechnical humans everywhere for handling the flood of digits coming onto home screens. Google will support its thrust through profits on advertising. Apple will support its thrust through profits on hardware. But they will meet in the middle.”

[Google’s] approach is to get all this stuff onto big honking hard drives and then let you search the drives any way you choose with any key words that come to mind. However, lest you forget, Google also is trying to figure out how to do what iLife does: keep track of important stuff on your personal computer hard drive and let you find it easily.

Google’s results have worked best with text. Google has yet to show its hand on how it will work with more kinds of visual imagery than still photos and illustrations. But you know a new, big thought is coming there.

Conversely, Apple comes at the same problem of harnessing huge amounts of digital stuff by figuring out the end point first: how to best display and present stuff you contribute. Then it backs up to work through how it can help get you there. Next up for iLife will be a way to display the best stuff that comes in from the Web, probably tailored to settings you easily manage. Call it, maybe, the iNet portion of iLife.

In any case, the two companies will be competing to be in control of the next generation of digital media life, when entertainment and information from in-home and remote hard drives, as well as broadcast and cable signals, are blended onto the same screen.

Stay tuned. These are the companies that are the best at reducing the complexity of our digital lives into screen displays that are simple and inviting to use. They are the two companies most devoted to looking at the digital universe from the consumer’s standpoint and delivering products and services that play to that, effectively.

TECH TALK: Microsoft, Bandwidth and Centralised Computing: Comments (Part 4)

wyoung76: The main problem I have with the author's point of view is that of a Modern World perspective. As evidence that this future is still many a generation away from becoming reality, we need only look at the Third World countries and witness the total lack of infrastructure in supporting such a society of high bandwidth and low local maintenance computing. The local computer is a fast, simple, and easy way of getting the required (or needed/desired) computing power to the people in poorer nations without worrying about the HUGE commitment in upgrading or installing the infrastructure that we modern nations are beginning to take for granted.

Doc Ruby: The real trend is mobile devices, DRM, and cheap bandwidth to home servers at local centers of always-on P2P networks. The huge mass market of less sophisticated/tolerant users, and the peripheral attention offered by personal mobile devices mean the devices will be multimedia terminals with wireless networking. The media industry orientation towards DRM means they’ll give away mobiles at a loss to sell their more scalable/profitable media products, while ensuring the terminals can’t copy the media objects. While the whole network will become much more complex under the hood, the market will demand that it all “just works”, like TV (IOW, when it doesn’t work, there’s nothing you can do about it but wait). That’s why Microsoft is evolving into a media company (games, interactive “TV”), enforcing the consumption of their lower quality products by perpetuating the applications that they prefer/require to “play”. So we’re going in the direction predicted by this story, but along the way the changes will be much different.

neurocutie: Despite increasing bandwidth out to the Internet as a compelling force, equally powerful trends suggest the continued importance and popularity of the home PC. Most of these trends can be summed up as needing even higher bandwidth locally, as well as needing specific interfacing of other devices, both of which aren't likely to be reasonably handled by some form of thin client. For example, all the reasons to burn personalized CDs or DVDs. It is not likely that burning CDs or DVDs would happen straight over the Internet without some kind of fast local store (i.e. hard disk). Another is interfacing digital and video cameras and editing those results. Again, it doesn't seem reasonable to build a thin client to interface these devices just to ship the many gigs of data (particularly video) out over the Internet to a remote fileserver and, worse, to perform editing against the remote fileserver — these applications, popular on the home front, pretty much dictate a home PC-like architecture with a fast, large local file store.

Craig Maloney: I think the argument for a more service-based PC has some major issues to get around. First, there needs to be some receiver machine at the home end. A reasonable computer can be had for around $500 nowadays. Unless this subscriber machine can be had for less than $200, there is no incentive to move to this model. Second, nothing is free. This service will be a subscription-based service. I think it would have had some bearing had people not been burned by subscriptions from other companies. Witness the cable companies and TiVo and how they've handled their subscriptions. Witness the cellphone subscriptions. Paying outrageous rates for using a computer won't succeed if there is no economic reason to do so. People will sooner purchase Macintoshes. Third, there is the issue of control. You're dealing with people's data and their private information. I will never relinquish control of my checkbook, nor my family pictures, nor anything else like that. Some people may be amenable to this, but many will not. The computer is a multimedia device now, and people have scads of personal data on their computers. It'll take a very convincing argument, and a company with a reputation for integrity, to wrest away that desire for control. The PC as we know it will change, but I see that change moving more toward a home entertainment/personal network than a service-based machine.

Dutky: The solution to the increasing administrative burden on computer users is not to hire someone to do the administration: instead, we need computers that actually reduce the amount of administration required or make the task of administration markedly easier. This is what personal computers did 40 years ago, and it can be done again.

In the next two columns, we will summarise the arguments for and against centralised computing. The final two columns in this series will then discuss the notion of centralised computing in the context of emerging markets and what Microsoft should do.

Tomorrow: The Arguments For Centralised Computing

Music Industry Future

[via Om Malik] Umair Haque writes:

Connected music players will totally reshape the future of music distribution. Record stores haven’t vanished because, let’s face it, shopping for music is fun – part of our utility in consuming media is sampling different goods.

When you combine connected music players with RFID, you get a whole new ecosystem of possibilities for music distribution. Ponder this for a second. RFID opens up whole new kinds of network possibilities for media goods in retail locations. The most obvious is record stores, which can beam tracks directly into your iPod without a massive infrastructure investment while you walk around different listening stations (or some similar scenario).

But my money is on clubs becoming music distributors/retailers – when you go to a club, you can get the DJ set or selected tracks beamed into your player. This is a natural evolution for clubs, the most iconic of which (Tresor, Ministry) have evolved naturally into labels with dedicated shops. There are huge synergies here – we go to clubs to hear the tracks DJs have selected – that's the value they add. But we don't get to consume them later without incurring significant additional cost (i.e., tracking down the right tracks on the right CDs at the right record stores). Eliminating this additional cost creates huge gains for consumers.

Intuitive Design

[via Amy Wohl] Jared Spool asks: “What does it mean, from a design standpoint, when someone desires a design to be intuitive?”

In our research, we've discovered that there are two conditions under which users will tell you an interface seems intuitive to them. It only takes meeting one of the two conditions to get the user to tell you the design is intuitive. When neither condition is met, the same user will likely complain that the interface feels unintuitive.

Condition #1:
The current knowledge point and the target knowledge point are identical. When the user walks up to the design, they know everything they need to operate it and complete their objective.

Condition #2:
The current knowledge point and the target knowledge point are separate, but the user is completely unaware the design is helping them bridge the gap. The user is being trained, but in a way that seems natural.

The biggest challenge in making a design seem intuitive to users is learning where the current and target knowledge points are. What do users already know and what do they need to know? To build intuitive interfaces, answering these two questions is critical.

For identifying users' current knowledge, we favor field studies. Watching potential users in their own environments, working with their normal set of tools, and facing their daily challenges gives us tremendous insight into what knowledge they will have and where the upper bounds are. Teams receive a wealth of valuable information with every site visit.

For identifying necessary target knowledge for important tasks, usability testing is a favorite technique of ours. When we sit users in front of a design, the knowledge gap becomes instantly visible.

Messaging Server Trends

ServerWatch writes:

Along with the massive surge in e-mail usage around the globe, an even more astounding and troublesome trend continues to emerge. A rapidly increasing percentage of e-mail messages are unsolicited and sometimes malicious, from conventional spam to virus-laden carriers to so-called phishing scams. In early 2003, the oft-cited estimate of e-mail considered spam was approximately 40 percent; by early 2004 this figure had jumped to 60 percent. At the start of 2005, some sources estimate spam comprises as much as 70 percent to 80 percent of global e-mail traffic.

Not surprisingly, anti-spam and anti-virus features have been the most common upgrades to mail server packages in the past year. A strong anti-spam feature set will offer several defenses, which work in combination to trap the majority of incoming spam.
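
As a toy sketch of “defenses working in combination”: each check contributes to a score, and mail over a threshold is held as spam. The rules, weights, and threshold below are invented; real filters use DNS blocklists, Bayesian scoring, and the like.

```python
# Toy illustration of layered spam defenses: each check adds to a score and
# mail above a threshold is quarantined. Rules and weights are invented.
SUSPECT_PHRASES = ("act now", "free money", "verify your account")

def spam_score(message):
    score = 0.0
    if message["sender_ip"] in {"203.0.113.7"}:          # stand-in for a blocklist hit
        score += 3.0
    if any(p in message["body"].lower() for p in SUSPECT_PHRASES):
        score += 2.0                                       # crude content rule
    if message["subject"].isupper():
        score += 1.0                                       # ALL-CAPS subject heuristic
    return score

message = {"sender_ip": "203.0.113.7",
           "subject": "VERIFY YOUR ACCOUNT",
           "body": "Please verify your account to claim free money."}
print("quarantine" if spam_score(message) >= 4.0 else "deliver")
```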

Security will probably remain the e-mail server focus for 2005, as servers grow more sophisticated in their ability to minimize the crushing weight of unsolicited messages. Expect to see a wider implementation of tools to enforce e-mail authenticity and challenge messages with questionable origins.

Disposing Old PCs

The Economist writes about eBay’s initiative in this area:

eBay, the world’s leading online auction business, has come up with an innovative way to encourage people to sell, donate or recycle their old machines over the internet. A web-based program reads the redundant computer’s components and gives its specifications (like its memory and processor speed). Owners can then ascertain the value of their old PC, put it up for sale and get a special mailing kit to simplify shipping. The site also makes it easy to donate a PC to charity or get it to a nearby recycler.
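
The spec-reading step is straightforward in spirit. A rough sketch, using Python's standard platform module plus the third-party psutil package, in the vein of what such a tool might gather; this is not eBay's actual program.

```python
# Rough sketch of reading a machine's basic specs, in the spirit of a
# valuation tool. psutil is a third-party package (pip install psutil).
import platform
import psutil

def machine_specs():
    return {
        "processor": platform.processor() or platform.machine(),
        "architecture": platform.machine(),
        "os": f"{platform.system()} {platform.release()}",
        "memory_gb": round(psutil.virtual_memory().total / 2**30, 1),
        "disk_gb": round(psutil.disk_usage("/").total / 2**30, 1),
    }

print(machine_specs())
```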

Google-Gazing

John Dvorak writes about Google’s possible future plans:

If Google does a modified Firefox browser, you can be certain that it will be optimized for Google searching and may incorporate shortcuts to make things easier for the user, although you can expect much of its orientation will be aimed at promoting Gmail and Blogger, among other Google properties. And it will probably be designed to be the front-end or client screen for perhaps a more secret project: the development of a so-called Internet OS to replace Windows.

If you follow the Google strategy, their incursions are leading directly down a path often discussed during the late 1990s — a browser-centric Internet OS. Netscape hinted about this possibility and Microsoft got freaked about it, since it would marginalize its Windows OS.

These concepts are not lost on Google. Think of the potential advertising revenue you can generate when you own the entire desktop environment.

And what’s to stop them at the operating system level? What about a Googlebox? An actual machine.

Since all the x86 computers are essentially generic machines made in China, why wouldn't Google leverage its brand name and roll out the Google X1 — the “computer for the X-Generation!” It could probably get an Apple-like premium for such a machine and load it up with proprietary software too.

Young Asian Inventors

WSJ writes about the Asian Wall Street Journal’s Young Inventors Awards, which “aim to recognize and reward the ingenious thinking, effort and experimentation that lead to the discovery and creation of new ideas.”

The Gold Award goes to Wang Qijie of Nanyang Technological University, Singapore, for creating all-fiber optical interleavers and deleavers — a component of optical networks.

The Silver Award goes to Randall Law of the National University of Singapore for inventing an ultrafast laser nanopatterning device.

The Bronze Award goes to Liang Xiaojun, Sun Yi and Zhang Xuming, also of Nanyang Technological University, for developing a chip-based cancer-diagnostic kit.

Newspaper and Google-Yahoo

Poynter Online writes about the threats newspapers face from the leading online media companies:

[Google and Yahoo] have boatloads of capital to invest in new ventures and acquisitions. They have strong existing news aggregation products, increasingly able to become the personalized “Daily Me,” so long a staple of thinking about the-newspaper-of-the-future. Their principal revenue base is advertising, the lifeblood of newspapers’ income.

Chew on this. As the year turns, the two hot technology companies have a market capitalization (shares of stock multiplied by share price) of roughly $100 billion, about $50 billion each. By contrast, the publicly traded newspaper companies, about 65 percent of the industry, are valued at about $80 billion.

TECH TALK: Microsoft, Bandwidth and Centralised Computing: Comments (Part 3)

Honky: The Core should reside in your pocket and speak to a variety of Shell types wirelessly. The Core would carry the bare essentials for mobility, allowing the user to access key data without necessarily having network access. The Core would simultaneously function as a user's Access Card for the higher ASP functions. A user would typically walk up to a wall-mounted touch-screen terminal and have their familiar work environment immediately appear – the Core would tell the Shell who you are and what state your user profile is in.

Drambeg: Think of public transportation run by the likes of Verizon, your cable company, etc. The reason for the success of the PC was/is that it's like a car: you are in control of where and when you go and how long it takes to get there. Personally, I hate public transportation and I'm absolutely certain I would hate the unpredictability of an ASP doling out “compute bandwidth” according to its rules and regs.

Raph: People need to develop web-based interfaces and friendly FTP servers, WebDAV-based clients, PHP interfaces – simplicity plus open source, now. People need to learn about their responsibility in the online world and what they can do to prevent the web from becoming shattered (Microsoft's dream). I'm thinking about a user-friendly Jabber client, a PHP photo album, the Mozilla calendar project (someone please come up with an open-source address book application). Soon our cell phones will have enough processing power to host an FTP server 24/7 – isn't that privacy? And what about personal home servers – what if, for $300, you could have your own internet gateway at home, your own .com, your own VoIP gateway, everything user-friendly with a PHP interface? Things like that are already on the market.

RicktheBrick: I believe that some company will give away the hardware so people will sign up for their service, just like cell phones today. The people who do will not have to worry about the hardware, as it will be like a cable box today: if it breaks, it will be replaced for free. They will have access to billions of dollars of software and video for a monthly fee. The computer will have zero maintenance and zero worry, so it will attract a huge number of people.

ivan256: The fact of the matter is that companies will never trust their business-critical processes to an application service provider. That's why the major ASPs failed in the '90s, even while corporations *did* have the bandwidth to use their services. This means that it's never going to take off in the consumer market, because the business market is where the money is. Consumer software is the drippings of the business computing market with some eye candy added. If the base technology can't catch on in the corporate world, it will never end up on the home desktop.

metamatic: People don’t want to pay subscription fees for software. If they did, we’d see a ton of software being sold month-by-month, with remote activation via Internet. There’s no technical block to doing so, and there hasn’t been in over a decade. The problem is that whenever someone tries it, nobody outside of the business world is interested. People don’t want to be at the mercy of the cable company or the phone company. We’re talking about the two companies the average person probably hates most, and now you’re offering them a way to make their entire computer system totally dependent on the whims of the corporate behemoths they hate? People don’t want ever-increasing prices. Look at how the cable company jacks up subscription rates several times a year. Who wants that for all the software they run?

Monday: Comments (continued)

Bus. Std: New Markets for Future Technologies

My latest column in Business Standard (ICE World):

As we look ahead to 2005, the rapidly converging areas of computing, communications and consumer electronics are creating an unprecedented set of opportunities and threats. My belief, which has been reinforced over the past year, is that emerging markets like India will define future technologies. While the top 10% of these markets are just like their counterparts in developed markets (the top of the pyramid), there is a big chasm which separates the top from the middle.

It is this chasm which presents an opportunity for entrepreneurs and established companies. This middle of the pyramid needs homegrown solutions which are not just priced differently but also may need different business models. This market segment is not just about making things faster, better and cheaper (not all of which are necessarily possible simultaneously) but also about focusing on the utility and value that the device or service provides and building specific solutions to address those needs.

Think about the planned Rs 1 lakh car from the Tatas. It is not just about taking the Indica and trying to cut costs dramatically. To build the car, the Tatas will have to fundamentally rethink every aspect of the car and the corresponding value chains. They did a similar exercise when they came up with the Tata IndiOne hotel in Bangalore to offer a room for business travellers at less than a thousand rupees. Disruptive thinking is the need of the hour.

Entrepreneurs in India have a great opportunity. As India's consumer class burgeons, there is an opportunity not just to provide solutions to them but also to propagate these solutions to other emerging markets globally. India serves as a laboratory to try out innovations and as a large first market.

For the bottom of the pyramid thinkers, the middle is what comes first. The way to the bottom is via the middle. Just as the top globally is almost similar, the middle across emerging markets is very similar. And that is the market that needs to be addressed first. India may have 700 million people in rural areas, but it also has 300 million in urban and semi-urban areas. These potential customers comprise a huge target market of families across 45 million households, 40 million employees across 3 million small- and medium-sized enterprises, and 100 million students across schools and colleges. They are the ones on the edge. The right solutions can help provide new windows of opportunities for them. This is the first market for Indian entrepreneurs.

Besides thinking about the markets outside the top 10%, there are three other guiding principles which I apply to my thinking and writing as we seek out opportunities across these markets: services, subscriptions and ecosystems.

For the next markets, it is important to think of the services that the solutions provide. The target customers have limited resources. So they need to be convinced of the value that the solution provides. For example, instead of talking about computer hardware specifications, this market needs to know what they can do with a computer. That old marketing adage of customers needing a quarter-inch hole rather than a quarter-inch drill is perhaps most apt to describe the marketing approach that is needed for this segment.

The middle segment is also more likely to adopt a monthly subscription-based model than one which requires a large upfront investment. Reliance Infocomm recognised this fact when they launched their mobile service and converted the handset capital expenditure into operating expenditure. This is partly about EMI (equated monthly installments) and partly about offering flexibility of upgrades in a technology world that is rapidly evolving.
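
For readers unfamiliar with EMI, the arithmetic behind converting an upfront price into a monthly payment is the standard amortisation formula. The handset price and interest rate below are purely illustrative, not Reliance's actual terms.

```python
# Standard EMI (equated monthly instalment) formula; the figures used are
# illustrative, not any operator's actual terms.
def emi(principal, annual_rate_pct, months):
    r = annual_rate_pct / 12 / 100          # monthly interest rate
    return principal * r * (1 + r) ** months / ((1 + r) ** months - 1)

# e.g. a Rs 10,000 handset financed at 12% a year over 24 months
print(round(emi(10_000, 12, 24), 2))
```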

Finally, the solution provided needs to address the entire backend ecosystem, rather than just the silo that it is operating in. For example, to target computers to this segment, it is necessary to think about the connectivity and services (applications and content) that will be provided because that is the value chain the computing device is a part of. At times, it will become necessary to reinvent all of the elements of the ecosystem to provide a whole solution that is not just cheaper but also more desirable and manageable than the current offering.

We are at a fascinating point in time. Even as new technologies converge (and diverge), providing us with an amazing array of options and opportunities, we are also part of one of the fastest growing economies in the world. We can build not just the India of our dreams but also create the next Intel, Microsoft, Cisco, Nokia or Google for the middle of the pyramid across emerging markets out of India. As Alan Kay said, “The best way to predict the future is to invent it.” And that is what Future Tech is about.

Future Tech's first column was published on December 17, 2003. There have been 27 columns so far. (All columns are available at http://www.emergic.org/futuretech.) The goal of Future Tech during its first year has been to provide insights into future directions in technology, especially from the perspective of emerging markets like India.

An anniversary is always a good time to look back at what has been and introspect about the future. In the next three columns, I have compiled my best ideas over the past columns. After that, it will be back to predicting the future by working towards inventing it!

The Next Platform

Rafe Needleman writes:

For years I have been hearing startups pitch the idea of the “information furnace” – the computing appliance that consumers would install in their home the way other utility appliances are installed – in the basement, out of sight and mind.

Until recently there was really no need for such an appliance in the vast majority of households; desktop and laptop computers supplied all the processing, communication, and storage that most people needed. But with the growth in digital content that consumers are now storing (photos, music, and video files, not to mention e-mail archives), and the growth in broadband-connected, multi-PC homes, the era of the home server “appliance” may finally be dawning. In fact, we may soon start to see people “place-shift” their content using small servers and the Internet, the same way “time-shifting” was enabled by VCRs and videotapes.

Here are my thoughts on what I think of as the next platform:

It will be a multimedia-enabled thin client with server-based computing (via LAN-Grids and Operator-Grids) over Broadband, and available to users as a service (say, $15 per user per month – for device, server platform, broadband connectivity, remote management, and support).

It has a DSP in the thin client to do video and VoIP. The desktop thin client will also be complemented with a mobile thin client (a cellphone). All data is stored on the server, so users don’t have to think of “my computer” because they have ubiquitous access to “my data.”

The view on the client is adjustable depending on the device — big display at home/office, and small display with the mobile phone. In addition, the thin clients will have some local memory and processing power to support “occasionally connected computing.”
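
A small sketch of what “occasionally connected computing” implies for the client: keep a local queue of recent work and push it to the server whenever a connection is available, so “my data” lives on the server while the device stays usable offline. The sync endpoint, wire format, and connectivity check here are all hypothetical.

```python
# Hypothetical sketch of an occasionally-connected thin client: edits are queued
# in a small local cache and flushed to the server when the network is reachable.
# The server URL and wire format are invented for illustration.
import json
import socket
import urllib.request

SERVER = "http://example.com/sync"          # placeholder sync endpoint
LOCAL_QUEUE = "pending_changes.json"

def queue_change(change, path=LOCAL_QUEUE):
    try:
        pending = json.load(open(path))
    except FileNotFoundError:
        pending = []
    pending.append(change)
    json.dump(pending, open(path, "w"))

def online(host="example.com", port=80, timeout=2):
    try:
        socket.create_connection((host, port), timeout=timeout).close()
        return True
    except OSError:
        return False

def sync(path=LOCAL_QUEUE):
    if not online():
        return                               # stay usable offline; try again later
    try:
        pending = json.load(open(path))
    except FileNotFoundError:
        return
    req = urllib.request.Request(SERVER, data=json.dumps(pending).encode(),
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)              # push queued changes to the server
    json.dump([], open(path, "w"))           # clear the local queue once synced

queue_change({"doc": "notes.txt", "text": "draft edited on the train"})
sync()
```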