It's all in the details

Reading material about new companies and startups, I see the same question asked implicitly again and again: "what's the next big idea?" Individuals in the technology startup arena are always looking for the next big technology breakthrough. I don't know if it's the media portrayal of computing startups, but the notion of a "breakthrough", a technology that "suddenly appears" and significantly alters our world, is popular. Linked to this is the idea of the sole "inventor" whose thinking is so different that it precipitates this revolution.

After working in the computer industry for a while, most designers eventually come to realize a universal truth: almost nothing in the field of computers is revolutionary. Most computing "breakthroughs" are old ideas + timing + details. Take most of the big transformative computing technologies and you can usually trace the core ideas back at least 10-20 years. That's not to say these inventors are plagiarizing old ideas; it's just that the really good ideas are usually ones someone has already come up with. Ideas in this case are like convergent evolution: dolphins and sharks ended up roughly the same shape despite being unrelated. Sometimes it's important to understand the old ideas as a foundation, so you can properly "tweak" them and get the details right.

What about the World Wide Web? That was an amazing revolution, wasn't it? Well, let's discount the fact that the Internet, on which the web was based, was originally Arpanet, brought online in a primordial form in 1969. The idea of hypertext, links, and a way of interconnecting knowledge was new, right? A quick investigation led me to this entry from the book World Wide Web - Beyond the Basics:

The history of hypertext begins in July of 1945. President Roosevelt's science advisor during World War II, Dr. Vannevar Bush, proposes Memex in an article titled As We May Think published in The Atlantic Monthly. In the article, Bush outlines the ideas for a machine that would have the capacity to store textual and graphical information in such a way that any piece of information could be arbitrarily linked to any other piece.
Read more of Dr. Bush's insights at this web version of the book; it's pretty fascinating stuff. Obviously he was limited to the technology of his day, imagining microfilm data and a desk with a "translucent screen", a keyboard, and mechanical "levers" for data retrieval, but the basic idea is there. Reading the first section of that book, you can see how many times this basic idea was rehashed. Xanadu (1981) and Apple's HyperCard (1987) are great examples.

I imagine true product and technology revolutions happen when all the details click into place. When the details are right, the support structure is right, and people are ready for it, all the tumblers align and we have a "revolution". The core ideas in these cases are rarely new. It's unfortunately not quite as sexy to ask "what's the next convergence of factors that will make the good ideas ready?"

Truly great "revolutionary" computing startups are the ones that understand the ideas but, more importantly, the bigger picture: how the ideas fit into the context of the world as it exists in the inventor's time. I think there's too much emphasis on big ideas alone. Ideas are important, but not as important as implementation. Innovation in the implementation of the big ideas, I would argue, is the most critical factor. It's all in the details.