Agriculture Lessons from Ethiopia

WSJ writes about how good intentions can result in bad results:

In the 1990s, Ethiopia went through a decade of global initiatives that sought to boost agricultural production but at the same time withdrew state support for the farming sector. The government, under pressure from international lenders and aid donors, was pulling out of the grain markets in favor of an underfunded and inexperienced private sector. However, little provision was made to support this fledgling free market with storage facilities, transport and financing. When a bumper harvest came in 2001, the markets were overwhelmed. Prices collapsed.

The problem was that the government, at the same time it was pushing to boost production, was also dismantling its system of state aid to farmers and intervention in the agriculture sector. In its place came a private-sector system that was inexperienced and woefully underfunded. It couldn’t absorb or distribute the bountiful harvests that came. Storage facilities were inadequate. Traders still relied on donkeys for transport. Export markets were nonexistent. There was no money to support prices or help farmers get through losses.

The warning on prices, though, triggered no great alarm. “The market side hadn’t been thought about at all. The government was saying, ‘That’s a second-generation problem,’ ” says Eleni Gabre-Madhin, an Ethiopian who explores market dynamics at the International Food Policy Research Institute in Washington and regularly meets with Ethiopian officials in the capital, Addis Ababa. “The emphasis was on, ‘Let’s just produce.’ ”

Could there be some lessons from India?


Ellison on Databases

Line56 reports on a talk given by Ellison:

“We think the database business and the apps business are very closely related,” Ellison said, adding that the ultimate value is more in the information delivery capability of the database and transactional systems in general. “Modern systems will deliver information so people can log on and see how well they are doing.”

Creating such dashboards is something that can be done even if a company is working with several different vendors, with business intelligence (BI), data warehousing, and other software helping to pull in transactional information from systems of record.

This seems quite aligned with our philosophy – events routed through an information refinery, delivered to subscribers via info aggregators/email clients or digital dashboards, which can then be re-published through RSS/blogs.

Software for Mid-Market

The Register looks at what constitutes the mid-market for software:

When evaluating IT products to buy, customers should look very carefully at how a vendor is defining the mid-market in order to ascertain whether or not the products are suitable for the needs of their particular size of organisation. For firms at the higher range of the market, vendors are producing scaled-down versions of their products – essentially lighter versions of the products designed for less complex requirements. Customers should also look at the delivery options since hosted solutions have fewer implementation requirements and hence less up-front cost. Many of the former ERP vendors, as well as some pure-play technology vendors, are taking this route.

At the lower end of the market, many small companies have found recently that they are being forced to hook into whatever technology system their business partners are using – and this can multiply fast if those business partners are using a range of divergent technologies.

Smaller companies should be looking for technology solutions that provide them with a unified platform that seamlessly handles a range of communication protocols on their behalf. This is the route that companies like Microsoft are taking with the development of a platform for smaller companies that allows them to perform simpler business tasks, such as order communication and processing, without needing to train users to handle the variety of business technologies that their partners are using.

The New Platforms

VentureBlog (Naval) writes about the emerging platforms, a world where Linux is likely to play a key role: “Most killer apps will emerge first via web-based GUIs (client side) unless they involve 3D graphics or heavy filesharing, in which case they’re Win32 apps. Server-side killer apps will more easily emerge on Linux than on Windows. Some of the more interesting consumer-facing server apps are emerging just as quickly on Linux as on Windows (PVRs, online photo albums, music jukeboxes).”

Software for SMEs

WSJ writes about the efforts of the big software companies to target small and medium-sized businesses (SMBs):

France is a good example of both the potential for these software giants – there are around 2.9 million companies with fewer than 500 employees in the country – and the pitfalls.

One indication of the potential is a segment like the accounting profession: more than half of French SMBs buy their software from providers whose market share is less than 1%, typically small companies that themselves lack the resources and the domestic, let alone international, reach to gain a competitive edge.

But there’s plenty of competition too. Take a company like the U.K.’s Sage, which is hoping to replicate its success in tapping the middle market at home and in the U.S. through local acquisitions, and now stands as France’s leading supplier of software to small accounting firms.

These second-string software companies focused on the French middle market – like Sage, privately held CCMX, or Cegid (F.CEG) – won’t necessarily be quaking in their boots at the looming competition from their bigger brethren.

The reason: local knowledge, expertise in particular sectors and proximity to the customer are critically important, and they allow the many second-tier companies to provide tailor-made products to SMBs.



Low-Power FM Radio

BoingBoing writes about an idea that could be interesting for rural areas: “A group of underground music enthusiasts have come up with an ingenious hack to the FCC’s rules on low-power FM radio. It’s legal to broadcast very low-power FM radio signals that can be received in a 200-foot radius. By encouraging Internet users spaced at 200′ intervals around your hometown to download the same Internet radio station and rebroadcast it over cheap-ass low-power FM emitters, you can create a micropower city-wide radio station.”

TECH TALK: An Affordable Alternative Technology Architecture for India's BFSI Industry: Part 3

Thin clients need thick servers to do the processing and storage. The thick server we refer to here can be of two types: a single new desktop computer with enhanced memory and two hard disks with real-time mirroring of data (software RAID), or a collection of clustered desktop machines – think of these as inexpensive blade servers with network-attached storage. The second option circumvents the single-point-of-failure problem inherent in the first, thus offering greater scalability and reliability. The investment in the server would be about Rs 1,500-3,000 per client attached to the system.
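As a rough illustration of the server economics quoted above, here is a minimal sketch; the per-client figures (Rs 1,500-3,000) are from the text, while the branch size of 20 clients is an assumption for illustration:

```python
# Rough server-sizing sketch for a branch thick server.
COST_PER_CLIENT_LOW = 1500   # Rs, lower bound quoted in the text
COST_PER_CLIENT_HIGH = 3000  # Rs, upper bound quoted in the text

def server_investment(clients):
    """Return the (low, high) estimated server investment in Rs."""
    return clients * COST_PER_CLIENT_LOW, clients * COST_PER_CLIENT_HIGH

low, high = server_investment(20)  # a hypothetical 20-client branch
print("20-client branch: Rs %d - Rs %d" % (low, high))
```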

The third element of the solution is the software. The base for the client and the server is Linux and other open-source applications. The basic set of applications on the desktop includes an email client (Ximian's Evolution), a desktop productivity suite (OpenOffice, which can read and write files in DOC, XLS and PPT file formats), a web browser (Mozilla or its lightweight variants), an instant messaging client (GAIM, which provides interoperability with existing IM networks: AOL, ICQ, MSN and Yahoo), and a PDF reader (Adobe's Acrobat). All these applications are available for free on Linux.

Applications run on the server and are displayed on the 5KPC using either a terminal-server application like LTSP (Linux Terminal Server Project, which runs an X server on the client) or VNC (Virtual Network Computing). VNC, created by AT&T Labs, is a remote display system which allows you to view a computing ‘desktop’ environment not only on the machine where it is running, but from anywhere on the Internet and from a wide variety of machine architectures.

The idea of running the processing on the server – sending keystrokes and mouse clicks from the user, and getting the updated screen back – is not new: running applications on the server over low-speed connections is already being done, and Citrix has a solution which works in the Windows world. What is new here is using a Linux desktop to cut the costs of not just desktop and server hardware but also software.
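The round trip described above can be modelled in a few lines. This is a toy sketch of the idea only, not the actual X/LTSP or VNC (RFB) protocol; the event format and screen representation are invented for illustration:

```python
# Toy model of the thin-client round trip: the client sends input
# events to the server, which runs the application and returns only
# the updated part of the screen for the client to redraw.

def server_handle_event(event):
    """Server side: apply one input event, return the screen update."""
    kind, value = event
    if kind == "keypress":
        # The application logic lives on the server; here a keypress
        # simply updates the cell under the cursor.
        return {"cursor_cell": value}
    return {}  # e.g. a mouse move that changes nothing visible

def client_session(events):
    """Client side: keep a local framebuffer, patched by server updates."""
    framebuffer = {"cursor_cell": ""}
    for event in events:
        update = server_handle_event(event)  # a network hop in reality
        framebuffer.update(update)           # redraw only what changed
    return framebuffer

print(client_session([("keypress", "a"), ("keypress", "b")]))
```

The point of the sketch is that only input events and screen updates cross the wire, which is why a low-powered client and a low-speed connection suffice.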

The big opportunity for the ATIC is at the branch level. Each branch can have a 5KPC for every employee, connected to a thick server. Users get the performance of a new thick desktop, the look-and-feel of a Windows-like interface, and the full complement of applications (email, browser, IM, Office suite), without the attendant problem of having to upgrade every few years. In addition, support is simplified dramatically because the client computers don't need any support and the thick servers at the branches can be managed centrally.

What ATIC does is bring down the single biggest impediment to computerisation: the high cost of hardware and proprietary software (MS-Windows and MS-Office).

By using the ATIC architecture, the estimated cost savings are Rs 40,000 per user. Multiply these savings by a few thousand users, add the other benefits of lower administrative costs, fewer virus worries and simpler application upgrades, and the benefits of an ATIC architecture become apparent.
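The savings arithmetic is straightforward; the Rs 40,000 per-user figure is from the estimate above, while the 2,000-user deployment is a hypothetical example:

```python
SAVINGS_PER_USER = 40000  # Rs, estimated saving per user from the text

def total_savings(users):
    """Total estimated hardware and software savings in Rs."""
    return users * SAVINGS_PER_USER

# A hypothetical bank rolling the ATIC out to 2,000 users:
print("Rs %d" % total_savings(2000))  # Rs 80,000,000, i.e. Rs 8 crore
```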

Tomorrow: Part 4
