SMEs and Local Search

Greg Sterling writes:

Data from a recent telephone survey of 500 small- and medium-sized businesses (SMEs) with 100 or fewer employees revealed that approximately 5 percent had adopted search engine marketing. Incidentally, that was the same figure as those who reported advertising in Internet yellow pages.

At a high level, there were two principal barriers to SME adoption of geotargeted search engine marketing: the perceived lack of local search usage and the complexity of campaign setup and management. Both of those issues have now been addressed, at least in part, and the way is cleared for growth in both consumer and local advertiser adoption.

In one broad sense, the growth associated with local search is simply a reflection of the growth and usage of search engines and, beyond that, the rise of the Internet as a daily utility in people’s lives, at least for the broadband set. SMEs are themselves Internet users and are very much aware, if only anecdotally, of the fact that more and more of their customers are online and searching as a way to find products and services, often in the local area.

However, there’s a gap between this recognition (and even what might be considered pent-up demand for access to online marketing channels) and SME behavior. As mentioned, they haven’t boosted or shifted their ad dollars commensurate with their desire to be in front of online consumers.

Kelsey Group-ConStat SME advertising data from June of this year reflects a 10 percent jump from the same time last year in the perception of the Internet as an important marketing medium. Simultaneously, however, the percentage of SMEs using the Internet as an alternative to traditional media was flat versus last year.

This brings me to the second barrier to small business adoption of online marketing and search in particular: complexity and confusion. Setting up an effective search campaign takes time. There’s obviously a learning curve. And that doesn’t even get into provisioning a campaign across multiple paid search networks. Even if you’re committed to figuring it out, there’s potential for confusion and frustration.

Thin Clients to Reduce Power

Tom Sullivan of InfoWorld writes:

Through the typical barrage of vendor meetings I had this week, one surprising thread emerged: IT is increasingly looking for ways to reduce expenses by using less power.

I know, I know. It sounds rather obvious. Trite, even. All companies, the well-run ones at least, try to spend as little on power as they can possibly get away with. It only makes sense.

A lot of the attention that gets paid to power savings, however, relates to the datacenter and to lower-voltage processors, particularly for notebook computers. Reducing power expenses has certainly been an issue within the datacenter, but when it comes to notebooks, the benefits of using less power, more often than not, relate not to electricity bills but to size, performance and battery life.

This week I met with secure remote access provider Tarantella and the thin client folks at Sun.

Frank Wilde said that in his discussions with customers since taking the CEO post at Tarantella about a year ago, he has heard myriad CIOs say that one of their biggest problems was managing power, space, lighting, heating and air-conditioning.

These problems are not typically associated with the bits and bytes of IT, though IT may consume more of those resources than any other business unit, depending on the type of company.

Unknowingly backing up Wilde’s remarks, Mason Uyeda, the product line manager of Sun Ray clients, said that Sun Rays use an average of 13 watts, whereas a typical desktop PC consumes a “barebones minimum of 80 watts.”

Those wattage numbers are estimates, not exact figures, based on several variables, such as which optional components are run with the system.
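For a rough sense of what that gap means in dollar terms, here is a minimal back-of-the-envelope sketch in Python. The 13-watt and 80-watt figures are Uyeda's estimates quoted above; the hours of operation, electricity rate and fleet size are illustrative assumptions of my own, not numbers from Sun or Tarantella.

```python
# Back-of-the-envelope sketch of the power-cost gap described above.
# The wattages come from the article; the hours of use, electricity
# rate and fleet size are illustrative assumptions, not vendor data.

THIN_CLIENT_WATTS = 13      # average Sun Ray draw cited by Uyeda
DESKTOP_WATTS = 80          # "barebones minimum" desktop PC draw cited by Uyeda

HOURS_PER_YEAR = 24 * 365   # assumes machines are left on around the clock
RATE_PER_KWH = 0.10         # assumed electricity price, dollars per kWh
FLEET_SIZE = 1000           # assumed number of seats

def annual_cost(watts: float) -> float:
    """Yearly electricity cost for one machine at the given draw."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000.0
    return kwh_per_year * RATE_PER_KWH

savings_per_seat = annual_cost(DESKTOP_WATTS) - annual_cost(THIN_CLIENT_WATTS)
print(f"Savings per seat:  ${savings_per_seat:,.2f} per year")
print(f"Savings for fleet: ${savings_per_seat * FLEET_SIZE:,.2f} per year")
```

Under those assumptions the difference works out to roughly $59 per seat per year before cooling is even counted, which is the kind of arithmetic that makes the purchasing behavior below less surprising.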

What got my attention, though, was that Uyeda said Sun has customers who buy Sun Ray machines primarily to reduce spending on power.

Where Value Will Come From

Adam Bosworth writes about services:

The value is coming from the community and the reputation and the content, not the tool used to author the post, which is largely irrelevant.

Services will need to provide value to justify themselves, not data lock-in. The value may be in the form of additional information, community discussion, ratings, search, data management, publishing features, relieving you of the tedium/cost of operations, or monetization, but they will need to deliver value in a way where, if the value isn’t there, you can walk.

Even iTunes, which is lauded as a success in the comments, hasn’t evolved much for me and still, for example, has no community features, not even peer-to-peer ones. I can’t chat with my son and say, listen to what I’m listening to. But as long as they evolve rapidly, great.

But I will still argue that in general, the value comes from the information, the community, the collaboration. The radio hit a certain point and then was good enough. It was a commodity. The content played over the radio, on the other hand, has continued to be of huge value. Sure, new radios come out all the time with new bells and whistles. But compared to a new talk station (pleasing or infuriating) or a great new song, how much do we really care? Sure, the iPod has currently made a business by tying itself to a service, but at some point, I predict, it will be like the radio and have gotten good enough.

Importance of Standards

ACM Queue has an article by Gordon Bell:

Over the next decade, we will encounter at least three major opportunities where success will hinge largely on our ability to define appropriate standards. That’s because intelligently crafted standards that surface at just the right time can do much to nurture nascent industries and encourage product development simply by creating a trusted and reliable basis for interoperability. From where I stand, the three specific areas I see as particularly promising are: (1) all telecommunications and computing capabilities that work together to facilitate collaborative work; (2) hybrid computing/home entertainment products providing for the online distribution of audio and/or video content; and (3) wireless sensor and network platforms (the sort that some hope the 802.15.4 and ZigBee Alliance standards will ultimately enable). No doubt there will be others, but for the purposes of this discussion, these should suffice.

The point here is that, in each of these areas, the right standards adopted at the right time can make an important contribution to technical evolution by applying critical design constraints. That is, they can do much to conserve vital design time and effort simply by providing a stable foundation of already defined compute capabilities and processes. Thus, instead of starting each new system from silicon, developers can be liberated to turn their attention to the design of higher-level, value-added functionality.

Google and Internet OS

Nick Bradbury writes that “the more I look at what Google is doing, the more convinced I am that we’re witnessing the birth of the next Microsoft.”

The big problem for Google – and the big advantage for Microsoft – is that the vast majority of computer users have all of their data on their Windows-powered desktop computers. So what does Google do? Try to get people to move their data to the web (through Google, of course). Google has already identified email and digital photos as two of the primary uses of desktop computers, and they’ve responded with Gmail and (to a much lesser degree, so far) Picasa. Then they release the Google Desktop, which further blurs the line between the web and your desktop by enabling you to search your hard drive using the familiar, simple Google interface. What will we see next? GMusic? GDocuments? Others have speculated whether a GBrowser is in the works, although that certainly remains to be seen. Regardless, Windows is being marginalized piece by piece, and Microsoft can’t stop it. The internet is the next OS, and Google is becoming a primary force behind it.

I believe that open standards are far more important than open source, since open standards mean you can share your data regardless of whether you access it through commercial or open source software (and regardless of which OS you’re using). The software doesn’t matter: the data does.

My assumption is that the internet OS of the future will replace your desktop OS, your television, your newspaper, etc., to become (by far) the primary source of information. Today, TV is widely used by the government and industry to provide only the information that will help sell their ideas and products. So what happens when the TV knows who you are? If you look at this 10 years down the road, it doesn’t seem too scary. But 50 years, 100 years, from now, what will we have built? I just want to make sure we know what we’re building.

TECH TALK: CommPuting Grid: Grid Computing

IBM’s Thomas Meyer offers an introduction:

First came the mainframes: huge hulking computational devices that lived in the rarefied atmospheres of big corporate and university labs, attended to by a secluded priesthood of engineers. Later came the desktop machines, mini- and microcomputers that gave computing power to an ever-expanding group of people at work and home.

Then came the client-server and networking technologies and protocols to hook all these machines together and allow them to communicate. Fast on the heels of all that came the Internet, which expanded our ability to communicate and share files and data with any networked machine on the planet.

Now we’re turning the corner on the next big thing: Grid computing, and it has as much potential for changing the way we do business as the Internet did. You’re probably already familiar with technologies such as Web services, XML, and object-oriented programming. Grid computing is a lot like these, if only conceptually.

IBM’s website offered the following definition:

Grid computing enables the virtualization of distributed computing and data resources such as processing, network bandwidth and storage capacity to create a single system image, granting users and applications seamless access to vast IT capabilities. Just as an Internet user views a unified instance of content via the Web, a grid user essentially sees a single, large virtual computer.

At its core, grid computing is based on an open set of standards and protocols (e.g., the Open Grid Services Architecture, or OGSA) that enable communication across heterogeneous, geographically dispersed environments. With grid computing, organizations can optimize computing and data resources, pool them for large capacity workloads, share them across networks and enable collaboration.

In fact, grid can be seen as the latest and most complete evolution of more familiar developments such as distributed computing, the Web, peer-to-peer computing and virtualization technologies.

Like the Web, grid computing keeps complexity hidden: multiple users enjoy a single, unified experience.

Unlike the Web, which mainly enables communication, grid computing enables full collaboration toward common business goals.

Like peer-to-peer, grid computing allows users to share files.

Unlike peer-to-peer, grid computing allows many-to-many sharing, not only of files but of other resources as well.

Like clusters and distributed computing, grids bring computing resources together.

Unlike clusters and distributed computing, which need physical proximity and operating homogeneity, grids can be geographically distributed and heterogeneous.

Like virtualization technologies, grid computing enables the virtualization of IT resources.

Unlike virtualization technologies, which virtualize a single system, grid computing enables the virtualization of vast and disparate IT resources.
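To make the “single virtual computer” idea and the comparisons above a little more concrete, here is a toy sketch in Python. It is purely illustrative, my own invention rather than OGSA or any real grid middleware, and every machine name and number in it is made up: dispersed, dissimilar resources register with a grid layer, and a user submits work without knowing or caring which machine runs it.

```python
# Toy sketch of the "single virtual computer" idea: heterogeneous,
# geographically dispersed resources register with a grid layer, and
# callers submit work without seeing which machine runs it.
# Conceptual illustration only; not OGSA or any real grid API, and all
# names and capacities below are invented for the example.

from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    location: str
    os: str
    free_cpus: int

class Grid:
    """Presents a pool of dispersed, dissimilar machines as one system."""

    def __init__(self):
        self.resources: list[Resource] = []

    def register(self, resource: Resource) -> None:
        self.resources.append(resource)

    def submit(self, job: str, cpus_needed: int) -> str:
        # Pick any registered machine with enough spare capacity; the
        # caller never sees which one, where it is, or what OS it runs.
        for r in self.resources:
            if r.free_cpus >= cpus_needed:
                r.free_cpus -= cpus_needed
                return f"{job} dispatched to {r.name} ({r.os}, {r.location})"
        return f"{job} queued: no capacity available"

grid = Grid()
grid.register(Resource("hpc-cluster-01", "Bangalore", "Linux", free_cpus=64))
grid.register(Resource("render-farm-02", "London", "Solaris", free_cpus=16))
grid.register(Resource("idle-desktops", "New York", "Windows", free_cpus=200))

print(grid.submit("risk-simulation", cpus_needed=32))
print(grid.submit("video-transcode", cpus_needed=8))
```

A real grid would add security, scheduling policy, data movement and standard service interfaces on top of this; the point of the sketch is only the abstraction of many heterogeneous machines behind one submission interface, which is what distinguishes it from a homogeneous cluster.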

In the November 2002 issue of Release 1.0, Kevin Werbach wrote: Grid computing draws on an analogy to the electricity distribution system. Power from any generating station can serve any customer who needs it, across a huge geographic area. Businesses and residential users need not worry about the intricacies of transformers and peak load management; that all happens magically through the power grid. They simply plug into a wall jack and pay a bill. Grids bring some of that fluidity to computing. Kevin offered the following taxonomy:

P2P: networking devices, applications and data together horizontally through direct connections, rather than up and down through central servers. It’s an architecture, not a business segment or a development model.
Web services: a set of standard XML protocols for component-based applications. The resulting business models and methodologies emphasize services with many constituent parts, rather than atomic applications, ultimately operating across corporate boundaries.
Clustering: operating a pool of similar computers as a single machine to handle a defined task.
Grid computing: treating heterogeneous collections of computing resources as a single multi-tasking computer. Grid computing is broader than clustering because of its heterogeneity.

Tomorrow: Grid Computing (continued)
