AlwaysOn Network has a column by Dr. Deborah McGuinness, associate director and senior research scientist of the Knowledge Systems Laboratory at Stanford University:
We’re moving to a web where I know what was meant instead of just what was input, where the web can understand the meaning of the terms on a page instead of just the text size and color it should present those words in. It knows a little bit about the user’s background. We’re facilitating interoperability so you don’t have to have all these stand-alone applications that don’t understand each other. The web is moving toward being programmable by normal people instead of just geeks like me.
This is becoming important for outward-facing applications: if you’re going to trust the answers from something, you’ve got to be able to understand why you should trust them. The web is also moving toward being explainable, more capable of filtering, and more capable of executing services.
While RDF Schema and RDF/XML just reached recommendation status at the W3C, the basics were already there; ontology and beyond is where we’re focusing these days. Potentially the most important dimension is the level of sophistication: from the very simple notion of a catalog, a unique identifier for each term, up to things like a glossary, where those terms have natural-language descriptions. A glossary might be understandable by humans, but it isn’t very operational for agents unless they’re capable of full natural-language understanding.
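To make that catalog-versus-glossary distinction concrete, here is a minimal sketch using Python’s rdflib library. The column itself contains no code; the example.org namespace and the wine terms below are invented for illustration.

```python
# Illustrative sketch only: the catalog-to-glossary step in rdflib.
# The example.org namespace and wine terms are invented.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/terms#")

# Catalog: each term is nothing more than a unique identifier (a URI).
catalog = [EX.Merlot, EX.Chardonnay, EX.Zinfandel]
print("catalog:", catalog)

# Glossary: the same identifiers gain natural-language descriptions.
# A person can read these; an agent sees only opaque string literals.
g = Graph()
g.bind("ex", EX)
for term, gloss in [
    (EX.Merlot, "A dry red wine made from a dark-skinned grape."),
    (EX.Chardonnay, "A dry white wine made from a green-skinned grape."),
]:
    g.add((term, RDFS.comment, Literal(gloss, lang="en")))

print(g.serialize(format="turtle"))
```

The rdfs:comment strings are exactly the limitation she describes: readable by people, but inert to an agent without full natural-language understanding.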
We then move to a more structured realm where we see a thesaurus and the notion of a narrower or more general term. There you start to get various structured relationships between terms, things that agents can start to use. I’ve captured this in the notion of what I call an informal isa [“is a”].
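Again as an illustrative sketch rather than anything from the column, the same rdflib setup can express thesaurus-style broader/narrower links and a subclass-style “isa” that an agent can actually traverse; the choice of the SKOS vocabulary and rdfs:subClassOf here is mine, as is the example vocabulary.

```python
# Illustrative sketch only: thesaurus links and an informal "isa"
# in rdflib, with an invented example.org vocabulary.
from rdflib import Graph, Namespace
from rdflib.namespace import RDFS

EX = Namespace("http://example.org/terms#")
SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")  # W3C SKOS vocabulary

g = Graph()

# Thesaurus: structured broader/narrower links between terms.
g.add((EX.Merlot, SKOS.broader, EX.RedWine))
g.add((EX.RedWine, SKOS.broader, EX.Wine))

# Informal isa: subclass-like links an agent can follow.
g.add((EX.RedWine, RDFS.subClassOf, EX.Wine))
g.add((EX.Wine, RDFS.subClassOf, EX.Beverage))

# Because the relationships are triples rather than free text, an
# agent can walk the chain transitively. This prints RedWine itself,
# then Wine, then Beverage.
for term in g.transitive_objects(EX.RedWine, RDFS.subClassOf):
    print(term)
```

The point of the step up from a glossary is visible in the last loop: the chain from RedWine to Beverage falls out of the structured relationships with no natural-language understanding required.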