Marc Fawzi blogged recently that Wikipedia 3.0 means the end of Google. Perhaps, but I wouldn’t hold my breath. While I’ve always said that the Internet giveth and taketh away – both with blinding speed – it’s gonna need a lot of ‘taketh’ to topple the Googlezilla.
In any event, Marc did make an interesting comment:
“The emergence of a Wikipedia 3.0 (as in Web 3.0, aka Semantic Web) that is built on the Semantic Web model will herald the end of Google as the Ultimate Answer Machine. It will be replaced with “WikiMind” which will not be a mere search engine like Google is but a true Global Brain: a powerful pan-domain inference engine, with a vast set of ontologies (a la Wikipedia 3.0) covering all domains of human knowledge, that can reason and deduce answers and not just throw some information at you using the rudimentary concept of the ‘search engine’.”
Having spent some quality time with Doug Engelbart recently, I hear within Marc’s comment rumblings of Doug’s Collective IQ vision. This also ties in with the influence upon Doug, and many others, of Vannevar Bush’s seminal paper on the Memex.
I love how John Battelle ties this all together in The Search:
“But what may well become possible in the world of perfect search is the ability to take the clickstream of that journey and turn it into an object – a narrative thread of sorts, something I can hold and keep and refer to, a prop to aid in the telling and retelling of how I came to my answer. Tracks in the dust, so to speak, that others can follow, or question to discover how I came to my conclusions. And these tracks are not just potential narratives for others to read; they can also be objects that can be spidered by a search engine, providing them with an entirely new order of intelligence about how people learn. In the aggregate, these clickstreams can provide a level of intelligence about how people use the Web that will be an order of magnitude more nuanced than mere links, which formed the basis for Google’s PageRank revolution.
“As We May Think”, Vannevar Bush’s famous 1945 essay in The Atlantic, posited the memex, a computational machine that created the equivalent of clickstreams in the field of scholarly research. In the essay, Bush outlined a looming problem for humankind – that knowledge and learning have become so complicated, so layered, so inefficient, that it is nearly impossible for anyone to be a generalist, in the sense that Aristotle was in his day. In short, there is simply too much knowledge – we can’t depend on any one person to be a philosopher to the kings.
As Bush outlined it, the memex gained its potency by capturing the traces of a researcher’s discovery through a corpus of knowledge, then storing those traces as intelligence so the next researcher can learn from and build upon them.
Clickstreams are the seeds that will grow into our culture’s own memex – a new ecology of potential knowledge – and search will be the spade that turns the Internet’s soil. Engines that leverage clickstreams will make link analysis-based search (nearly all of the commercial search today) look like something out of the Precambrian era. The first fish with feet are all around us – nearly every search engine now supports search history, and dozens of interesting tools have recently come to market that attempt to make sense of the patterns we searchers are leaving upon the Internet’s corpus. We have yet to aggregate the critical mass of clickstreams upon which a next generation engine might be built, but we are already pouring its foundations.”
Perhaps the most interesting point Marc makes is this:
“Wikipedia makes ‘us’ count. Google doesn’t. Wikipedia derives its power from ‘us’. Google derives its power from its technology…Who would you count on to change the world?”
This reminds me of the comparison John Battelle makes between Google and Yahoo:
“…key distinction between Google and Yahoo. Yahoo is far more willing to have overt editorial and commercial agendas, and to let humans intervene in search results so as to create media that supports those agendas. Google, on the other hand, is repelled by the idea of becoming a content- or editorially driven company. While both companies can ostensibly lay claim to the mission of ‘organising the world’s information and making it accessible’, they approach the task with vastly different stances.
Google sees the problem as one that can be solved mainly through technology – clever algorithms and sheer computational horsepower will prevail. Humans enter the search picture only when algorithms fail – and then only grudgingly.
But Yahoo has always viewed the problem as one where human beings, with all their biases and brilliance, are integral to the solution. It’s humans, backed by technology, who drive the ‘also try’ results at the top of the page (the process has been automated, but it’s classic architecture of participation stuff: “Here’s what other humans find useful related to your search”). It’s humans who push Yahoo’s internal content and commerce sites to the fore in the shortcut results. DNA has much to do with it – Yahoo started as an entirely subjective collection of links – humans first, technology second.”
It’s this ‘us’ – the people factor – that defines much of the Web 2.0 movement, and it sets the scene for a 3.0 inflection point in the near future.