Somewhere around 2002, while I was working on the concept of a Context Value Broker, a colleague asked for my opinion about the future of ICT systems and services. Not being an acknowledged futurist, I found that a hard question to answer. Many who did try were proven wrong, as this prediction of the future of mail delivery in the year 2000 may illustrate.
My answer was that I could imagine a trend over the next decades in which the IT landscape of mainly one-dimensional, monolithic solutions would evolve into multi-dimensional ecosystems of pluggable knowledge and function kernels running on agile platforms. These ecosystems and platforms would provide a collaborative base for new services and solutions that make sense of complex situations and serve users on a 1:1 basis, according to their individual needs. The actors in these services would be both humans and machines, with the interpretative capabilities of machines growing over time.
The concept of knowledge and function kernels seemed especially viable for knowledge-intensive situations, in which reasoning and assigning meaning are required for decision making. In the original concept, knowledge kernels consisted of sets of triples connected into knowledge graphs. These graphs could be linked into larger knowledge systems over which inferences could be drawn. The content from which the graphs were constructed preferably had to be atomized and well structured. Regulatory content would offer an excellent base to start with, because of its structure and reuse potential. Function kernels would offer a wide variety of functions, such as calculating, composing decision documents, and retrieving and presenting information. Platforms would allow these building blocks to be assembled in multiple ways to create a 'solution' for 'any' emergent need, inspiration or aspiration.
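To make the idea of triples, linked graphs and inference concrete, here is a minimal sketch in Python. Plain tuples stand in for triples, and the vocabulary ("subClassOf") and the two regulatory-style kernels are purely illustrative assumptions, not part of the original concept:

```python
# A knowledge kernel sketched as a set of (subject, predicate, object)
# triples, with a tiny forward-chaining inference step. The predicate
# name "subClassOf" and the example kernels are hypothetical.

def transitive_closure(triples, predicate):
    """Infer new triples by chaining the given predicate transitively."""
    inferred = set(triples)
    changed = True
    while changed:
        changed = False
        for (s1, p1, o1) in list(inferred):
            for (s2, p2, o2) in list(inferred):
                if p1 == p2 == predicate and o1 == s2:
                    new = (s1, predicate, o2)
                    if new not in inferred:
                        inferred.add(new)
                        changed = True
    return inferred

# Two kernels: regulatory-style fragments that link into one graph.
kernel_a = {("TaxResident", "subClassOf", "Resident")}
kernel_b = {("Resident", "subClassOf", "Person")}

graph = transitive_closure(kernel_a | kernel_b, "subClassOf")
print(("TaxResident", "subClassOf", "Person") in graph)  # True
```

The point of the sketch is the composition: neither kernel alone implies that a TaxResident is a Person; only the linked graph does, which is what makes independently maintained kernels interesting.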
The beauty of the concept was that a whole new movement of collaborating contributors could arise: topic-based knowledge creators, reviewers and auditors, solution architects, function creators and so on. All with their own special core capabilities, offering harmonized services attuned to the user.
(Image source: Kunsthistorisches Museum Wien)
The result would be an ecosystem of services that could evolve with the needs and context of its environment. Maintenance of the kernels could be done in a distributed way by commercial, not-for-profit and/or individual (groups of) contributors. In my view, there were similarities with trends like the open source movement, evidence-based collaboration initiatives and innovation concepts in the content industry. Such an IT concept and ecosystem could be regarded as fundamentally disruptive.
Later, in 2006, this paper about the networked knowledge economy claimed that knowledge 'architecture' was the most forgotten discipline within enterprise architecture. The emphasis should be placed on 'being able to use knowledge' instead of 'having knowledge'. After all, however sophisticated the heuristics for finding information may become, the question remains how actionable the found results are in the specific context of a user, and to what extent they can adapt instantly to a new situation. To achieve this, one needs to apply actionable ontologies. Meanwhile, the company Be Informed had developed the technique and tools to turn this vision into reality: a modern platform that uses a declarative ontology model that is goal driven and instantly actionable, which can be found here.
I didn't expect the development described above to happen for all services, bearing in mind the expanding volume and decreasing half-life of information, as well as the longevity of dominant IT mastodons. Even so, collaborative sense making by humans and machines is still in its infancy.
So did reality prove me wrong, like that prediction of mail delivery above? Perhaps not entirely, since the concept of contextual intelligence is gradually gaining ground (see e.g. this Accenture blog). Furthermore, the big data and artificial intelligence movement seems to focus mainly on pattern recognition. This leaves plenty of room for human sense making and decision making, supported by solutions based on such semantic and functional knowledge kernels. They can, for instance, be used as a glue layer that connects artificial intelligence results and puts them in the right perspective.
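The glue-layer idea can be sketched very roughly in code. Everything here is a hypothetical illustration: the raw score standing in for a pattern-recognition result, the context keys, and the rules themselves are invented for this example only:

```python
# Illustrative sketch only: a miniature rule set (a knowledge-kernel-style
# glue layer) that puts a raw model score into context before deciding.
# All names (risk_score, first_offence, thresholds) are hypothetical.

def interpret(risk_score, context):
    """Turn a pattern-recognition score into a contextual decision."""
    # Declarative (condition, conclusion) pairs, checked in order.
    rules = [
        (lambda s, c: s > 0.8 and c.get("first_offence"), "manual review"),
        (lambda s, c: s > 0.8, "reject"),
        (lambda s, c: s <= 0.8, "accept"),
    ]
    for condition, conclusion in rules:
        if condition(risk_score, context):
            return conclusion

print(interpret(0.9, {"first_offence": True}))  # manual review
print(interpret(0.9, {}))                       # reject
print(interpret(0.5, {}))                       # accept
```

The machine-learned score never decides on its own; the declarative layer interprets it in the light of context, which is the "right perspective" role described above.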
In recent times we have moved from a manufacturing era, in which transformations prevailed, into an era in which interactions prevail. A society based upon interactions is service oriented and deals predominantly with transactions and interpretations. The rules of the game are simple: we have to keep moving, climbing ever higher up the ladder of interpretation and sense making, evolving from applying simple, straightforward logic towards interpretation and decision making in complex, sensitive and ambiguous situations. In the end it is all about meaning. Especially, but not solely, in services for knowledge-intensive situations, meaning always makes the real difference.
The concept of disruptive innovation has been both applauded and criticized in recent years. Without reopening that discussion, it remains to be seen whether such a type of service and supporting infrastructure will become the real disruptor of this century. Yet one thing is for sure: the plethora of different, unrelated apps that we use now is not the answer, nor is the massive computing power used to discover similarities and hidden connections in big data. Ultimately, it is all about sense making and creating services on a human scale that support 'the process of me'.
Sooner or later, we have to become entrepreneurs of meaning to keep up with the challenges of the interaction economy and the megatrends that we will face in this century. Therefore, I still expect that semantic knowledge constructs will be - one way or another - part of the equation.
This article is republished on LinkedIn, here.