Google + Blog = Personal Knowledge Management System

David Reed wrote on SATN.org:

It happened again. I told a friend about a new program. He wants a URL. I say “Did you try Google?” and he says “oh … yeah.” He doesn’t need a URL.

Maybe it’s just that we’re used to having difficulty finding information about things. So few people have absorbed that Google creates a shared context that is bigger than all of our brains, so we humans don’t need specific pointers most of the time anymore. We’re slow learners.

But now when I sit in a meeting where I have an Internet connection, or conferencing on the phone in my office, I’m Googling all the time. The context it creates is immense and useful. Somebody might make an allusion to some literary idea – and I’m no longer in the dark. Somebody might mention a product or service – and I can order it immediately, or bookmark it.

When someone can’t remember a fact or a name, I can usually get it quickly enough to be useful.

Google is my other memory. If it isn’t yours, it probably will be eventually.

As I thought about the comments, it struck me that we have three memories. The first is our brain, which, unfortunately, ages with time: we forget, or some things just slide into the background. The second memory now is Google, which never forgets and almost unfailingly gets us to the right document wherever it is on the Web. The weblog we write becomes our third memory.

I have realised that I now use the blog as an extension of my own memory: articles I like, ideas that interest me, excerpts and comments are all being posted on my blog. With Categories and Search, it strengthens my memory: I can look up things much faster and review recent ideas or thoughts in much more detail. I used to make notes in my notebook when I read articles, but now I find myself doing so on the blog. The attractive features are the ability to excerpt the part of the article that interested me, to comment on it, and to search it all later.

What I would like is a private blog, a superset of the public blog with a part that I keep only to myself. This way, I can post all my notes, meeting summaries, etc. on this blog, knowing full well that I can find them again (and get the context). Searching paper notes can be quite hard — they become a black hole, difficult to get anything out of. So, now, I am using my notebook (the paper one) for doodling and thinking. When I am somewhat ready, I post on my blog (like I am doing now).

I think there’s a much bigger idea here: using the blog and the other Digital Dashboard components (RSS Aggregator, outlines, directories and filters) as a “personal knowledge management system”. This does away with the weakness of the primary memory (our brain) — aging. The private blog becomes our tertiary memory, since Google won’t be able to index it. So now, I can note down all kinds of things on my private blog, knowing it will never age or forget. (Well, sometimes it may be better to forget some things, and then we hit the delete button!)
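To make this a little more concrete, here is a minimal sketch (in Python, with purely hypothetical names; this is not an existing tool) of what such a private “third memory” could look like: notes carrying categories, posted once and searchable forever, much like the Categories and Search features on the public blog.

```python
# A minimal sketch of the "private blog as third memory" idea: notes with
# categories, stored locally, retrievable by keyword search. All names here
# (Note, NoteStore) are hypothetical illustrations, not an existing tool.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class Note:
    title: str
    body: str
    categories: list = field(default_factory=list)
    posted_on: date = field(default_factory=date.today)


class NoteStore:
    def __init__(self):
        self.notes = []

    def post(self, note: Note):
        """Add a note -- the 'never forgets' part of the third memory."""
        self.notes.append(note)

    def search(self, keyword: str):
        """Simple keyword search across titles and bodies."""
        keyword = keyword.lower()
        return [n for n in self.notes
                if keyword in n.title.lower() or keyword in n.body.lower()]

    def by_category(self, category: str):
        """Browse notes the way blog categories let you browse posts."""
        return [n for n in self.notes if category in n.categories]


store = NoteStore()
store.post(Note("Meeting summary", "Discussed thin clients...", ["Enterprise"]))
print([n.title for n in store.search("thin")])
```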

What this does is build a stronger foundation for new ideas and innovation. I have seen that ideas take time to mature, and many ideas come but the timing may not be right. So, what it means is that we need to store our thinking, and periodically, go back to our old ideas and perhaps, build upon them. This is because we get new experiences, new insights, new knowledge every day.

I guess the right word is “Assimilation”. As we assimilate new inputs, some of our older ideas may strike a chord. I have seen this happen many times, especially while working on futuristic, innovative concepts like we are doing now. In fact, many of the ideas for Emergic and what we are doing now have evolved over time, and I still learn a lot reading my notes (and Tech Talks) from the past.

This may be hard to imagine because we have not been exposed to anything like this before — everything we have known ages or fades away. For the first time, between Google and our blogs (public and private), we can build a unified system for managing our own knowledge. What is needed is to add in Directories, Outliners, Search, Categories, Filters and many of the other features we are seeing now with blogs. Out of such personal systems (excluding the private blog, of course) can be built the enterprise knowledge management system. All that would be expected of each of us is to write what we think. More than anything else, it would be an enriching personal system — a “Forget-Me-Not” kind of Emergent Knowledge Base.

Karun Philip on Software Components

Karun wrote in recently, commenting on some points in my series on “Rethinking Enterprise Software”. His comments pertain to Visual Biz-ic and Software Components (1 2 3):

In developing a “Visual Biz-ic”, it is useful to think of it as developing the DTD for an XML language. In my business, we started with a DTD for structured finance and then developed the interpreter for it. The DTD is now accepted by W3C and is available at xml.org.

Any business language has two dimensions — horizontal components common to all industries, and vertical components specific to particular industries. For instance, a balance sheet is common to all industries, while a construct like Average Mileage may only be relevant to the tire industry. Since my industry is banking, I have built the basics of balance sheet constructs in DoubleHelix — our XML language. An asset is simply a schedule table of date, principal due, interest due (and some related fields). A liability is simply a schedule table of date, principal to be paid, interest to be paid. All businesses boil down to this. On top of this substructure, we have very specific things such as calculating risk-based present value, etc., which are only really needed by banks. But the basic structure of asset table and liability table is a useful generalization.

What Karun says is absolutely right. We need to think of the components as Horizontal Components and Vertical Components — the former are general-purpose (working across multiple or all industries) while the latter are much more specific to certain industries. Interestingly, there are XML standards being created for multiple industries which could serve as the base for building these components.
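As a rough illustration (and only an illustration: this is not the actual DoubleHelix DTD, and all names here are hypothetical), the horizontal substructure Karun describes could be sketched as follows, with a banking-specific vertical calculation layered on top.

```python
# An illustrative sketch (not the actual DoubleHelix DTD) of the horizontal
# substructure Karun describes: an asset or liability is just a schedule
# table of dates and amounts, on top of which industry-specific (vertical)
# calculations can be layered.
from dataclasses import dataclass
from datetime import date


@dataclass
class ScheduleRow:
    due_date: date
    principal: float
    interest: float


@dataclass
class Schedule:
    kind: str     # "asset" or "liability" -- the generic, horizontal part
    rows: list    # list of ScheduleRow

    def total_principal(self) -> float:
        return sum(r.principal for r in self.rows)


# A vertical, banking-specific construct layered on the generic substructure:
def present_value(schedule: Schedule, annual_rate: float) -> float:
    """Discount each scheduled cash flow back to today at a flat rate."""
    today = date.today()
    pv = 0.0
    for r in schedule.rows:
        years = (r.due_date - today).days / 365.0
        pv += (r.principal + r.interest) / ((1 + annual_rate) ** years)
    return pv


loan = Schedule("asset", [ScheduleRow(date(2026, 12, 31), 1000.0, 80.0)])
print(loan.total_principal(), round(present_value(loan, 0.10), 2))
```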

Karun once again:

You wrote: “I could visually define the business processes, rules and information flows in my enterprise. Still better, the system could come with a library of business processes from which I could use. While most SMEs do follow some processes, they may not necessarily be the most efficient. Being able to identify other companies in similar segments whose processes and flows could be used can help me define the processes for use in my company. The software would then automatically generate the necessary business objects and logic based on my choices. If required, I could then use English to define special business rules for my company.”

In my company we implemented TQM, but the people we learnt it from made sure that it was people-centric rather than process-centric. There exist process definition documents, but they are ancillary to the results that each employee is required to deliver. In every task, the aspects that we measure are some or all of PQDCM — productivity, quality, delivery (per schedule), cost, and morale. We set numeric targets and measure the actual results, and insist on continuous improvement. If we realize something is not being achieved despite the metrics showing improvement, we look at whether the metrics need to be redesigned or enhanced.

Your tool could begin with the business process design, but each step should have the person and the metric(s) by which it is measured, including two dimensions: targets and actuals. That would then feed into each person’s areas (metrics) of accountability. In my company we have a software system where anyone can log in and see their metrics and the metrics of those who report to them. You can click on a particular item to see the supporting documents (like the business process, quality measurement procedures, etc.). This people-focused approach has worked great. Once you specify only the result (as a metric) that must be achieved by a person, it is up to him or her to use creativity and talent to achieve it. If he or she does, the data will record it, and they will eventually be recognized and rewarded. I have also found that poor performers leave of their own accord, without any sense of indignation, because it is clear what the performance was. TQM is something every small business should learn — it works wonders.

Maybe you can incorporate some of this in your product if you think it makes sense.

Excellent thoughts, and something we should definitely keep in mind when we get to the execution of the enterprise software ideas.
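As a sketch of what that could look like in the tool (hypothetical names throughout; this is not Karun's actual system), each process step could carry an owner and a set of PQDCM metrics, each with a target and a recorded actual.

```python
# A rough sketch of the people-centric structure Karun describes: each
# process step carries an owner and PQDCM metrics, each metric a target
# and a recorded actual. Names are hypothetical, not his actual system.
from dataclasses import dataclass, field

PQDCM = ("productivity", "quality", "delivery", "cost", "morale")


@dataclass
class Metric:
    dimension: str            # one of PQDCM
    target: float
    actual: float = 0.0

    def achieved(self) -> bool:
        # Simplified: assumes "higher is better" for every dimension.
        return self.actual >= self.target


@dataclass
class ProcessStep:
    name: str
    owner: str                # the person accountable for the result
    metrics: list = field(default_factory=list)

    def scorecard(self):
        """What the owner (and their manager) would see when they log in."""
        return {m.dimension: (m.target, m.actual, m.achieved())
                for m in self.metrics}


step = ProcessStep("Order fulfilment", "A. Sharma",
                   [Metric("delivery", target=0.95, actual=0.97),
                    Metric("quality", target=0.99, actual=0.98)])
print(step.scorecard())
```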

RFIDs to supersede Bar Codes?

Writes Business Week:

Although radio-frequency identification, or RFID, has been used for years to track high-value goods such as car subassemblies in a factory or electronic components in a warehouse, only recently has the technology become cheap enough to deploy in high-volume retail environments. As costs come down, “RFID technology is opening up to new applications like [mobile] commerce, baggage tracking, and in-store uses,” says Deepak Shetty, an analyst at Frost & Sullivan.

What makes RFID so hot in an otherwise tepid technology market? Compared with bar codes–the 30-year-old industry standard for product identification–RFID systems require less manipulation by humans even as they hold more useful data. Whether powered by a battery or an external signal, all RFID devices contain a computer chip able to store hundreds of characters of data (vs. just 12 or 15 characters for a bar code) plus a tiny antenna that communicates with a receiver.

The oft-quoted application example for RFIDs is in supermarkets where shampoo bottles or cereal boxes will have built-in RFID chips (costing no more than a few pennies) which would broadcast to the “smart shelf” whenever they are picked up, thus providing a real-time update to inventory management systems.
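As a toy sketch of that scenario (field names and tag formats below are purely illustrative, not any RFID standard), the shelf reader would simply turn each tag read into an inventory update:

```python
# A toy sketch of the "smart shelf" scenario: each RFID read event carries a
# tag ID and product code, and the inventory count is updated in real time.
from collections import defaultdict

inventory = defaultdict(int)      # product code -> units on the shelf


def on_tag_read(tag_id: str, product_code: str, placed: bool):
    """Called by the shelf reader whenever a tagged item is placed or removed."""
    if placed:
        inventory[product_code] += 1
    else:
        inventory[product_code] -= 1
        if inventory[product_code] <= 2:
            print(f"Reorder alert: {product_code} is running low")


on_tag_read("E200-0001", "shampoo-500ml", placed=True)
on_tag_read("E200-0001", "shampoo-500ml", placed=False)
```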

Three applications I’d like to see:
– in libraries, to prevent mis-filing of books
– embed the chips in luggage, so one knows exactly when the bags are coming after arriving at an airport (or even whether they got on the flight at all!)
– finding the TV remote and the cellphone

Separate the Screen

Interesting comments by David Coursey in “What do you think future TV will look like?” on ZDNet’s AnchorDesk:

The component model, with tomorrow’s TV set reduced to the role of a monitor (just like a PC’s), is the concept that wins.

We’d be much better off, I think, separating the screen, which should have a fairly long service life, from the electronics, which may be subject to regular upgrades and replacement, like a PC.

For the most part, I expect the digital hub to more closely follow the model of a computer and peripherals (or component stereo system) than that of a single box that does everything and includes the screen, too.

How many components will be connected to create this system? I don’t know, though it’s clear customers want more features, with fewer cables and remote controls to connect and operate them.

And, yes, all this stuff will be clustered around the big screen in your den. So if you want to say the TV will be the center of the hub, that’s OK with me–so long as we remember that the TV screen is merely an output device and not the center of the digital universe.

Reading this, I reflected on the Thin Client-Thick Server philosophy in the enterprise context. It is nearly the same! The monitor on the desktop is merely a “remoted screen”, with the server being the digital hub (processing and storage). The software on the server can be thought of as the components (or perhaps “blades”).

Ghana – TechReview

Ghana’s Digital Dilemma:

In the West African country of Ghana, one of the world’s poorest places, the busy signal is a reminder of the unfulfilled promise of the Information Age. Making a telephone call here requires persistence. Roughly half don’t go through because of system failures, but that’s only the start of Ghana’s telephone woes. The country has a mere 240,000 phone lines – for a population of 20 million spread across an area the size of Britain. Moreover, telephone bills are inaccurate, overcharges common, and the installation of a new line can cost a business more than $1,000, the rough equivalent of the annual office rent. Lines are frequently stolen, sometimes with the connivance of employees of Ghana Telecom, the national carrier. Phones go dead, and remain unrepaired, for months. Some businesses hire staff for the chief purpose of dialing numbers until calls go through.

How can we make a difference to countries like Ghana via technology?

UDDI’s Slow Progress

News.com writes on UDDI in “Directory assistance for Web services?”:

Born during the business-to-business e-commerce craze, the directory project was touted as a “Yellow Pages” for e-commerce applications and services. The basis of the directory is a Web services specification called Universal Description, Discovery and Integration (UDDI), which identifies and catalogs Web services so they can be easily found online.

The specification is finding a home, albeit slowly, in big companies as a way to build directories for internal Web services projects–allowing the companies to better catalog and communicate services across departments. Yet the bigger dream to build the UDDI-based public Web services directory, known as the UDDI Business Registry, is just that for now–a dream, experts say.

“The idea of a public registry was ahead of its time because you have to have other parts, like service-quality guarantees and trust,” Gartner analyst Daryl Plummer said. “People are using it now, but it’s just experimentation. We are a good four to five years away.”

Also came across a story in InternetWeek on whether UDDI and LDAP would be a perfect fit: ” Couple UDDI’s strengths — widely accepted and standards-based — with LDAP’s advantages — broad enterprise deployment and proven security and scalability features — and Web services may have the repository architecture required to make a major impact on the enterprise.”
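For a sense of what using the registry involves, here is a minimal sketch of a UDDI v2 inquiry: a find_business request wrapped in a SOAP envelope and POSTed to a registry's inquiry endpoint. The endpoint URL and business name below are placeholders, not a real registry.

```python
# A minimal sketch of a UDDI v2 inquiry: a find_business request wrapped in a
# SOAP envelope and POSTed to a registry's inquiry endpoint. The endpoint URL
# below is a placeholder; real registries publish their own inquiry URLs.
import urllib.request

INQUIRY_URL = "http://uddi.example.com/inquire"   # hypothetical endpoint

soap_body = """<?xml version="1.0" encoding="UTF-8"?>
<Envelope xmlns="http://schemas.xmlsoap.org/soap/envelope/">
  <Body>
    <find_business generic="2.0" maxRows="10" xmlns="urn:uddi-org:api_v2">
      <name>Acme Web Services</name>
    </find_business>
  </Body>
</Envelope>"""

request = urllib.request.Request(
    INQUIRY_URL,
    data=soap_body.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": '""'},
)
# Uncomment to actually send the inquiry against a live registry:
# with urllib.request.urlopen(request) as response:
#     print(response.read().decode("utf-8"))
```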

TECH TALK: Server-based Computing: A Brief History of Computing (Part 2)

The commoditisation of the PC through the attack of the clones in the 1980s and 1990s ensured that prices kept falling and adoption kept rising. During the 1990s, PCs running Windows took over the desktops, and servers running Windows NT and Unix took over much of the server side. Unix workstations made their presence felt where there was a need for additional horsepower on the desktop. Sun’s mantra for much of the 1990s was: the network is the computer.

Desktops continued to become more and more powerful following Moore’s Law, and software got more and more bloated. Into this world of the mid-1990s, seemingly out of nowhere, emerged the Internet. Built on open standards (TCP/IP, HTTP and HTML), the Web browser started becoming the de facto front-end, with web servers as the back-end for serving data and applications. This was client-server with a twist: the client only had to have a web browser, and didn’t need to do much processing.

There were even efforts to create network computers, which were stripped-down computers. These efforts failed because they forgot one important fact: Microsoft had by then taken control of the desktop and, more critically, the file formats for document exchange (DOC, XLS and PPT). It was no longer possible to be part of the world community and not run Windows on the desktop.

At the same time, the world was getting awash in bandwidth. Telecom companies laid fibre all over the place, and multi-megabit connectivity became commonplace between locations in organisations. The great wall between speeds available on the LAN and the WAN was falling. It didn’t make a difference where information or applications resided; as long as it was out there on the network, it was possible to access it. The network had truly become the computer.

So, then, here’s the anomaly. Even as computers got more and more powerful and networks became more and more pervasive, the processing power available on the desktop far exceeded what most users needed. Most applications came with a Web-based front-end. The mix of what users needed on the desktop boiled down to email, instant messaging, a Web browser and the desktop productivity applications. The first three were platform-independent, but Microsoft had a lock on the desktop through its MS-Office suite.

This has meant that computers (thick desktops) now come with the processing power of the mini-computers of the past and cost about USD 700, and need Microsoft Windows and Office, which cost another USD 500. At no stage does the CPU utilisation on the desktop cross a small fraction of what’s available. These economics have meant that while the developed countries of the world have embraced computing whole-heartedly to near-saturation levels, the emerging markets have been hampered by the dollar-denominated pricing. The result: low adoption rates and high software piracy in countries like India and China.

Until recently, it didn’t look like there was any way to beat the system. Fortunately, a number of recent developments offer hope for a return to a variant of the earlier era of server-based computing. Therein lies the route to creating a computing mass market for the next 500 million users.

Tomorrow: Recent Developments

MailStation for Emails

Walter Mossberg writes about the MailStation in the WSJ: “The MailStation works on a simple principle. It uses its own built-in Internet service to send and fetch e-mail automatically twice a day, at times the user selects. It will also send and receive e-mail whenever you command it to do so. It is purely an e-mail device — it can’t surf the Web, or hook up to AOL, or to any Internet service other than its own. And, except for the high-end model, it is limited to text e-mail. It cannot deliver or display attachments. However, attachments sent to you can be fetched from a special Web site available from any PC.”

The device, from EarthLink in the US, comes in two models, with the cheaper one at USD 100 plus USD 10/month for service.

This got me thinking: what will it take for the Thin Client to become a MailStation and more? The TC by itself is useless — it needs a Thick Server. But suppose we could put a basic version of Linux on the device (as in an embedded device) so that it could connect to the Internet; then it could talk to the TS and get other apps. It would be slow over a phone line, though. But if we focus on just a few apps (e.g. Mail and Browser), it may be possible to build enough of the stuff into the TC itself. The question is: which markets would this be of interest in? Perhaps the home segment (first-time users).