Nick Schulz points me to an essay from a few years ago by Paul Allen and Mark Greaves.
As we go deeper and deeper in our understanding of natural systems, we typically find that we require more and more specialized knowledge to characterize them, and we are forced to continuously expand our scientific theories in more and more complex ways. Understanding the detailed mechanisms of human cognition is a task that is subject to this complexity brake. Just think about what is required to thoroughly understand the human brain at a micro level. The complexity of the brain is simply awesome. Every structure has been precisely shaped by millions of years of evolution to do a particular thing, whatever it might be. It is not like a computer, with billions of identical transistors in regular memory arrays that are controlled by a CPU with a few different elements. In the brain every individual structure and neural circuit has been individually refined by evolution and environmental factors. The closer we look at the brain, the greater the degree of neural variation we find. Understanding the neural structure of the human brain is getting harder as we learn more.
The same appears to be true with cancer, and indeed with all diseases that combine genetic and environmental factors. I would argue that the same is true for macroeconomics. With some problems, you make a lot of progress until the complexity brake kicks in.
While this is a long-term trend, there are occasionally advances that reverse it. The world was hopelessly complex before Newton simplified it. Simplicity is often how we refer to the inexplicable.
Almost the opposite is true. What Newton did revealed complexity that no one else even knew was there, and later physicists demonstrated that Newton had only scratched the surface of it all. That is what Allen and Greaves are writing about: new knowledge may well be good, but what you know you don’t know seems dauntingly greater as a result. Science all around seems to be the same: every question answered seems to raise even more unanswered ones.
Ignorance is simple in explanation, complex in effect and prediction.
And the complexity brake applies to both computer hardware and computer software – both of which depend on various mechanisms to manage complexity.
Nature, of course, is not subject to such management mechanisms.
Another example, from Wikipedia, on what was previously termed “noncoding” DNA:
“For example, it was originally suggested that over 98% of the human genome is noncoding,[2] while 20% of a typical prokaryote genome is noncoding.[3] Work by ENCODE (a part of the Human Genome Project) has shown this is not the case. Where only a small percentage of the genome is responsible for coding proteins, the percentage of the genome performing regulatory functions is growing.”
I would characterize it more as a “coordination” brake. Making any one thing more efficient or effective is fine, but most of what we want to tackle now has a long string of steps or dependencies, and we have to solve the whole string before we get the benefit.