🔬 The Distribution of Scientific Discoveries →
I really enjoyed this post from Jerry Neumann exploring the structure of how technological and scientific progress happens.
Referencing the well-known work of Karl Popper and Thomas Kuhn, he demonstrates how technological change follows a power-law distribution in the relationship between frequency and impact. Kuhn’s argument was that progress happens either through small, incremental improvements or through massive, revolutionary leaps:
Kuhn looked at the history of scientific progress and saw that Popper’s heroic scientific machinery was rarely how science happened in the real world. Kuhn’s theory was descriptive: it explained why science seems to have two different processes at work, one of the gradual accumulation of knowledge through normal science and the other of jarring change through revolution. These two processes are not versions of one another, they are truly different, in Kuhn’s view. He says the proponents of normal science fiercely resist revolutionary science and so revolutions can only occur when normal science hits an almost existential dead end.
Kuhn was a proponent of the idea that large, tectonic movements in scientific progress were the result of new theories overwhelming the inertia of the status quo: the old guard ages, shrinks in influence, and eventually dies out. “Science advances one funeral at a time.”
But Neumann peels apart what a “technology tree” looks like in reality, and how changes to its modular components produce technological output at the “leaf” level. Using microprocessors as an example, he shows how they’re the result of combined sets of discoveries connected in a trunk-and-branch configuration:
In this model, an incremental improvement to a fundamental technology (like transistor technology or lithography) has a cascading impact up the tree.
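To make the cascade concrete, here’s a toy sketch of my own (not Neumann’s actual model): a balanced tree where improvements land on random components, and an improvement’s “impact” is simply the number of end products downstream of it. The branching factor and depth are arbitrary assumptions.

```python
import random
from collections import Counter

# Toy model (my sketch, not Neumann's tree): a balanced "technology tree"
# whose trunk is a fundamental technology, whose branches are intermediate
# components, and whose leaves are end products. An improvement to any node
# cascades to every leaf beneath it, so its impact here is just the number
# of leaves it reaches.

BRANCHING = 4  # children per node (assumed)
DEPTH = 6      # levels below the trunk (assumed)

def impact(depth: int) -> int:
    """Leaves downstream of a node sitting at `depth` in a balanced tree."""
    return BRANCHING ** (DEPTH - depth)

# Improvements land uniformly at random across all nodes. There are
# BRANCHING**d nodes at depth d, so most improvements hit the outer branches
# and leaves, and only rarely does one land near the trunk.
node_counts = [BRANCHING ** d for d in range(DEPTH + 1)]
depths = random.choices(range(DEPTH + 1), weights=node_counts, k=100_000)
impacts = Counter(impact(d) for d in depths)

# Small impacts are overwhelmingly common; trunk-level impacts are rare and
# enormous. One process, a heavy-tailed outcome, no separate "revolutionary"
# mechanism required.
for size in sorted(impacts):
    print(f"impact {size:>5}: {impacts[size]:>6} improvements")
```

The exact shape depends on the assumed tree, but the qualitative point survives: a single mechanism produces both a flood of small wins and the occasional trunk-level shift.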
An interesting insight is how he uses this explanation to refute not only Kuhn’s theory, but also Clayton Christensen’s theory of sustaining versus disruptive innovation, which is widely accepted as truth in the tech community.
If innovation outcomes are power-law distributed then there aren’t really two processes at all, it just seems that way. Kuhn, not to mention Clay Christensen, might have been seriously misreading the situation. It may seem like change faces resistance until it is big enough that the resistance can be swept away, but the truth may be that every change faces resistance and every change must sweep it aside, no matter if the change is tiny, medium-sized, or large. We just tend to see the high frequency of small changes and the large impact of the unusual big changes.
When framed this way, it makes a lot of intuitive sense. Impact and frequency are often the two qualities we index on when reacting to discoveries. For those in the middle of the distribution — discoveries that are less frequent than the small incremental ones, and less impactful than the big sea changes — the response tends to be: “meh.” Perhaps the distribution doesn’t follow the strictly bimodal pattern we thought it did; maybe it’s just our attention being drawn to the far left and right of the curve.
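A quick way to see this (again a sketch of mine, with an assumed tail exponent, not anything from the post): sample impacts from a single power law and histogram them on log-spaced bins. There’s no second bump for “revolutions,” just one smoothly falling curve; any apparent two-mode pattern has to come from where our attention lands.

```python
import random

# Sketch: draw "discovery impacts" from one power law (Pareto with an assumed
# tail exponent) and bin them by order of magnitude. A genuinely bimodal
# process would show a second bump; a single power law just falls smoothly.
random.seed(0)
ALPHA = 1.5  # assumed tail exponent, purely for illustration
samples = [random.paretovariate(ALPHA) for _ in range(100_000)]

bins = [0] * 16  # bin b holds impacts in [2**b, 2**(b+1))
for x in samples:
    b = 0
    while x >= 2 and b < len(bins) - 1:
        x /= 2
        b += 1
    bins[b] += 1

for b, count in enumerate(bins):
    bar = "#" * (count // 1000)
    print(f"impact {2**b:>6} to {2**(b + 1):<6}: {count:>6} {bar}")
```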
To me, this is an important insight, considering that our acceptance of the Kuhn / Christensen theories causes us to design our organizations and processes around this model. Organizations create innovation labs and bring in McKinsey consultants to help them “do innovation.” These groups are then incentivized to discount or ignore ideas that aren’t massive in scope — a selection bias against patently good ideas with potential, because they’re seeking the next world-shifting discovery.
There’s a strong case to be made for orienting research and development in a more linear fashion: don’t overindex on the Big Ideas, but also don’t fall into the trap of favoring small, incremental steps over exploratory, free-form research.