China’s High-Tech Standards

Ninad points to a WSJ article which discusses how China wants to set its own standards for 3G mobile, digital TV and DVD technologies.

Ninad’s analysis of the reasons:

(1) It places China on an equal footing with some of the Western nations (the US, the UK and Scandinavia) and Japan in developing new technology, rather than copying it by paying high licensing fees.
(2) It lowers the cost of indigenously made Chinese electronics for local consumption and increases the rate of mass-market adoption, since China controls the standard.
(3) Potential revenue opportunity through licensing the standard outside China.

The WSJ adds: “China’s drive to create new standards in high technology is part of its broader desire to claim equal footing with the world’s top economic powers…By creating homegrown technical standards, China is trying to increase the use of Chinese innovations world-wide. And it is using its own large domestic market to help speed up their adoption. By requiring these standards to be used on technical products in China, international companies that want access to that market are forced to make products that use them.”

CDMA Overview

Russell Beattie spent time learning about CDMA and shares what he learned:

First, here’s the deal about CDMA vs. GSM. The way that GSM works is really an extension of the older TDMA approach: GSM uses digital technology to divide each frequency allotment into eight time slots, which calls then share in turn. CDMA, on the other hand, uses the same piece of spectrum and separates the calls by encoding each one uniquely, allowing your phone to disregard other transmissions on the same frequency.
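To make the “unique encoding” idea concrete, here is a minimal, purely illustrative Python sketch of code division using orthogonal (Walsh-style) spreading codes. The codes, bit values and user names are all made up for illustration; real networks use much longer codes plus the power-control machinery mentioned below.

```python
# Illustrative sketch of code division: two "calls" share the same spectrum
# at the same time, separated only by orthogonal spreading codes.
import numpy as np

walsh = {                       # orthogonal Walsh-style codes, length 4 (made up)
    "alice": np.array([1,  1,  1,  1]),
    "bob":   np.array([1, -1,  1, -1]),
}

def spread(bits, code):
    # Each data bit (+1/-1) is multiplied by the user's spreading code.
    return np.concatenate([b * code for b in bits])

alice_bits = np.array([1, -1, 1])
bob_bits   = np.array([-1, -1, 1])

# Both signals occupy the same frequency at the same time: just add them.
channel = spread(alice_bits, walsh["alice"]) + spread(bob_bits, walsh["bob"])

def despread(signal, code):
    # Correlate each code-length chunk with the code; the other user's
    # contribution cancels out because the codes are orthogonal.
    chunks = signal.reshape(-1, len(code))
    return np.sign(chunks @ code)

print(despread(channel, walsh["alice"]))  # -> [ 1 -1  1]
print(despread(channel, walsh["bob"]))    # -> [-1 -1  1]
```

The point is simply that each receiver recovers its own bits because the other user’s code correlates to zero, which is why CDMA can put everyone on one frequency.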

Just on the face of it, you can see that CDMA seems more scalable and easier to manage because the system uses just one frequency. However, enabling this requires a Rube Goldberg-esque combination of adjusting power levels, GPS-based time settings and other complex stuff that I don’t have the faintest idea about. I can see why the GSM camp turned its nose up at it in the beginning. It’s amazing it works at all.

Okay, so the basic CDMA that most people use right now is called “cdmaOne”. The next generation is the move to “CDMA2000 1x RTT.” This is what Verizon is spending all its money buying. The 1x stands for “single channel” and the RTT (which Qualcomm doesn’t like to use any more, though it was written that way in the article) stands for “radio transmission technology.” Even though the speeds of this new standard are really what has been considered 2.5G, the technology is the base for higher speeds and has been deemed 3G by marketing higher-ups, so you’ll see it referred to that way.

The 1x is the important part: CDMA2000 uses from one to three 1.25 MHz carriers. This first rev of CDMA2000 uses just one of those three. cdmaOne already uses this frequency, which is why CDMA2000 is considered “backwards compatible”; what the CDMA2000 standard adds, I guess, is more efficient use of that spectrum. The next steps in CDMA are CDMA2000 1xEv, which uses a second channel (phase one, “1xEv-DO”, uses the second channel for data only; phase two, “2x”, uses both channels together), and 3x, which uses all three channels as a single 3.75 MHz carrier. You can see how adding channels and infrastructure will naturally cause data bandwidth to go up. It’s important to note, though, that unlike the GSM route, this allocation seems backwards compatible and isn’t just for data but for voice calls as well.

The GSM path goes to GPRS next, which can dedicate one or more of the channels in the GSM spectrum to packet data only. It works, but has lots of provisioning problems and bandwidth constraints. I’m not sure about this, but it seems to me that if you’re enabling GPRS, you’re cutting off at least one channel, and this must affect the GSM voice service. After GPRS is EDGE, which works in a similar way but uses a newer “modulation scheme” that allows higher data rates. I have no idea what a “modulation scheme” is, actually, but it’s easy to get the idea: same general functionality, but with faster-moving bits.

After this, however, the GSM guys have to scrap all that equipment and move to WCDMA, which is a version of CDMA technology and divides up calls by uniquely encoding them. Unlike the “multi-channel” CDMA2000, “Wideband” CDMA uses one big-ass 5 MHz carrier, which allows for much greater capacity and data speeds (up to 2 Mbps). How exactly this compares to CDMA2000 3x’s 3.75 MHz, I don’t know.
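As a quick way to keep the carrier widths straight, here is a small Python summary built only from the figures in the write-up above (1.25 MHz carriers for the CDMA2000 family, one 5 MHz carrier for WCDMA); treat it as a rough summary, not a spec.

```python
# Rough summary of the carrier widths mentioned above (MHz); not a spec.
carriers = {
    "cdmaOne":       1 * 1.25,   # single 1.25 MHz carrier
    "CDMA2000 1x":   1 * 1.25,   # same carrier, used more efficiently
    "CDMA2000 1xEv": 2 * 1.25,   # second carrier added (data-only in phase one)
    "CDMA2000 3x":   3 * 1.25,   # three carriers bonded into 3.75 MHz
    "WCDMA":         5.0,        # one wide 5 MHz carrier
}
for name, mhz in carriers.items():
    print(f"{name:14s} {mhz:.2f} MHz")
```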

So that’s the general idea. The pairs are roughly cdmaOne/GSM, CDMA2000 1x/GPRS, CDMA2000 1xEv/EDGE, CDMA2000 3x/WCDMA. It’s important to note that whereas the GSM guys are adding just data capabilities until WCDMA, the CDMA guys are adding more voice capacity as well.

In India, Reliance and Tata are offering CDMA services (if I am not mistaken).

Experimentation Matters

Inc. writes about Stefan Thomke’s new book: “there is a vast store of potential innovation in new technologies. To help companies unlock that potential, he writes that they must tap the power of experimentation and new technologies while changing their processes, organization and management of innovation. He explains that computer modeling and simulation have made experimentation less expensive than ever before, and research and development (R&D) teams now have tools at their disposal that can be used to create new value for customers.” The book suggests six principles for managing experimentation and explains how they can be used to drive innovative product development:

1. Anticipate and exploit early information through “front-loaded” innovation processes.

2. Experiment frequently but do not overload your organization.

3. Integrate new and traditional technologies to unlock performance.

4. Organize for rapid experimentation.

5. Fail early and often but avoid “mistakes.”

6. Manage projects as experiments.

Personally, experimentation is how I have seen one can make progress with ideas. Try out a few things; some will work, others won’t. It is how many of the things that we are doing today have emerged: more bottom-up than through a top-down approach.

IT and Productivity

The Economist has two articles [1 2] on how American productivity has grown rapidly, and on the role technology has played.

A puzzle [in the American economy] is why productivity accelerated over the past three years at the same time as IT investment fell. After all, a host of studies have concluded that most of the revival in productivity growth is linked to the production or the use of computers and software.

One explanation is that the productivity gains from IT investment do not materialise on the day that a computer is bought. Work by Paul David, an economist at Oxford University, has shown that productivity growth did not accelerate until years after the introduction of electric power in the late 19th century. It took time for firms to figure out how to reorganise their factories around the use of electricity and to reap the full efficiency gains.

Something similar seems to be happening with IT. Investing in computers does not automatically boost productivity growth; firms need to reorganise their business practices as well. Just as the steam age gradually moved production from households to factories, and electricity eventually made possible the assembly line, so computers and the internet are triggering a sweeping reorganisation of business, from the online buying of inputs to the outsourcing of operations. Yet again, though, the benefits are arriving years after the money has been spent.

IT’s impact is likely to continue for the foreseeable future:

Pundits who reckon that 3-4% productivity growth is sustainable for another 5-10 years are, in effect, making the bold claim that IT will have a far bigger economic impact than any previous technological revolution. During the prime years of the world’s first industrial revolution (the steam age in the 19th century), labour productivity growth in Britain averaged barely 1% a year. At the peak of the electricity revolution, during the 1920s, America’s productivity growth averaged 2.3%.

Yet there are still good reasons to believe that IT will have at least as big an economic impact as electricity, with average annual productivity growth of perhaps 2.5% over the coming years. One is that the cost of computers and communications has plummeted far more steeply than that of any previous technology, allowing it to be used more widely throughout the economy. Over the past three decades, the real price of computer-processing power has fallen by 35% a year; during 1890-1920, electricity prices fell by only 6% a year in real terms.
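As a back-of-the-envelope check on those figures, compound declines of 35% and 6% a year diverge enormously over a 30-year stretch. The 30-year horizon below is my own assumption, chosen to match the 1890-1920 electricity period and the “past three decades” cited for computing.

```python
# Back-of-the-envelope: how far prices fall under compound annual declines.
# The 30-year horizon is an assumption made for comparison, not a figure
# from the article.
def remaining_price(annual_decline, years):
    return (1 - annual_decline) ** years

for name, rate in [("computer processing power", 0.35), ("electricity", 0.06)]:
    left = remaining_price(rate, 30)
    print(f"{name}: {left:.6f} of the original price, roughly {1/left:,.0f}x cheaper")
```

Under those rates, computing ends up hundreds of thousands of times cheaper over the period, versus only around six times cheaper for electricity, which is the article’s point about how much more widely IT can spread.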

IT is also more pervasive than previous technologies: it can boost efficiency in almost everything that a firm does, from design to accounting, and in every sector of the economy. The gains from electricity were mainly concentrated in the manufacture and distribution of goods. This is the first technology that could significantly boost productivity in services.

So, IT does matter, but only if companies are willing to change the way they do business. “‘The most dramatic gains happen when companies use technology to understand better what they do in order to change how they do it,’ says Navi Radjou, an analyst at Forrester, a technology-research firm. The main issue slowing productivity gains down, he adds, is ‘grandma syndrome’: a reluctance to ditch tried and tested processes.”

This is what SMEs need to do – adopt technology and revamp the way they think and do their business. This is the next frontier for tech companies.

Governments and Open-Source

The Economist writes:

Across the globe, governments are turning to open-source software which, unlike proprietary software, allows users to inspect, modify and freely redistribute its underlying programming instructions. Scores of national and state governments have drafted legislation calling for open-source software to be given preferential treatment in procurement. Brazil, for instance, is preparing to recommend that all its government agencies and state enterprises buy open source.

Other countries are funding open-source software initiatives outright. China has been working on a local version of Linux for years, on the grounds of national self-sufficiency, security and to avoid being too dependent on a single foreign supplier. Politicians in India have called on its vast army of programmers to develop open-source products for the same reasons. This month, Japan said it would collaborate with China and South Korea to develop open-source alternatives to Microsoft’s software. Japan has already allocated ¥1 billion ($9m) to the project.

Policymakers like open source for many reasons. In theory, the software’s transparency increases security because backdoors used by hackers can be exposed and programmers can root out bugs from the code. The software can also be tailored to the user’s specific needs, and upgrades happen at a pace chosen by the user, not the vendor. The open-source model of openness and collaboration has produced some excellent software that is every bit the equal of commercial, closed-source products. And, of course, there is no risk of being locked in to a single vendor.

Economics is a big driver for governments to use and encourage open-source software. Governments cannot pirate software (they purchase through tenders), and so their total cost of ownership can be quite high – especially in emerging markets.

In India, most state governments and the Central government have been incredibly slow to recognise the power and potential of open-source. India should have been leading the world in the use of open-source, but we aren’t even following. Yes, the President has made some positive statements, but it hasn’t gone much beyond that.

India can define a new architecture for computing for the rest of the world. This can create a much wider use of computers and also make its people and companies more efficient. A little push from the government can go a long way in shaping a domestic software products industry, which can, over the years, become as big as the services industry.

Linux and ISVs

VARBusiness has an article on why independent software vendors (ISVs) should turn to Linux, offering five reasons:

1. It offers a firm foundation for building Web services and Internet-based applications.

2. More competition means lower costs for customers.

3. It’s easy to leverage J2EE and other open-systems standards.

4. Applications are all Web-based, so apart from a browser, no client software is required, and they can be used across Windows and Unix servers.

5. Open-system toolsets mean freedom of choice for developers, as well as less costly tools.

The short advice is: “Be open. Be cheap. Be nimble.”

TECH TALK: The Next Billion: Distribution

One of the important challenges the industry will face is reaching its users: the individuals in the homes, and the SMEs everywhere. They are two different segments, and we will address them separately.

Homes

For the home segment, connectivity is a critical component of the solution because the desktop (virtual PC) needs the server to be useful, just as the TV needs the cable operator’s satellite dishes and head-ends to be able to show anything on the screen. There are various possible providers who can become technology operators: the telco, the cable operator or the gas company. Each of them has a pipe going into the customer’s home, and has a billing relationship. Alternatively, independent internet service providers can also provide the service.

The connectivity between the home and the operator will be a high-speed network over Ethernet, fibre or wireless. The servers will be at the operator’s premises, connected over a multi-megabit connection to the various homes in the neighbourhood. The end-device (our virtual PC) is very much like the phone: maintenance-free. It either works or it doesn’t. If it does not work, it needs to be replaced. There is nothing the user can do that will require a customer visit (which is expensive) to fix problems. A centralised call centre can handle application-related and service-level queries.
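As a purely illustrative sizing sketch of this server-centric model (every number below is an assumption made for illustration, not a figure from this piece), one multi-megabit link could in principle serve a sizeable neighbourhood of thin-client homes:

```python
# Purely illustrative sizing sketch; every number here is an assumption.
link_mbps = 100          # assumed operator-to-neighbourhood link capacity
per_screen_kbps = 250    # assumed average remote-display traffic per active home
concurrency = 0.3        # assumed fraction of homes active at peak

homes_supported = (link_mbps * 1000) / (per_screen_kbps * concurrency)
print(f"~{homes_supported:.0f} homes per {link_mbps} Mbps link under these assumptions")
```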

SMEs

Much of the distribution chain for reaching small and medium enterprises already exists in the form of the assemblers (the white-box sellers) who aggregate the various components that go into making a thick desktop today. The assemblers already know the SME customers well, acting as their de facto IT managers and advisors. However, the business of the assemblers has been squeezed in recent times, as technology has become more of a commodity. While prices have not changed much, margins have come down.

The new server-centric computing architecture promises to re-invigorate their business by making IT a critical and affordable part of the enterprise DNA. By advocating the concept of a computer on every desktop and elaborating on the advantages of using computers, the assemblers can now address today’s non-consumers: the 80-90% of SMEs that do not use computers.

The assembler can ally with local training institutions to not only provide end-user training on how to maximise the business benefits from computing and the Internet, but also create demonstration centres (showcases) where prospective buyers can see, touch and feel the technology prior to purchase. Training institutions are present in every neighbourhood across emerging markets, and their business has been affected by the slowdown in technology. This new approach creates additional business opportunities by leveraging their existing infrastructure to service the new markets that are being created.

Summary

We have seen this week how an affordable computing solution can create a price point that is a third of today’s. This will help open up technology’s invisible market: the billion buyers across SMEs and homes in the world’s emerging markets. Next week, we will delve deeper into the innovations that are required as part of creating the next-generation computing and information architecture.

Next Week: The Next Billion (continued)
