This post was co-authored by Pete Membrey, chief engineer – VPN technologies, and Brendan Horan, senior principal system engineer at ExpressVPN
Intel and the Gordon and Betty Moore Foundation announced yesterday that Gordon Moore had passed away peacefully in Hawaii, surrounded by friends and family, at the age of 94.
A pioneer in the field of microprocessor design and production, Moore was highly regarded and widely respected, having been presented with a number of awards by some of the most prestigious professional societies in the world. He also received both the National Medal of Technology and Innovation and the Presidential Medal of Freedom, in 1990 and 2002, respectively.
However, despite such acclaim and despite having co-founded Intel (yes, that Intel), it might be surprising to learn that much of his recognition didn’t actually stem from any of those accomplishments.
Rather, it came from an idea he posited in 1965—an idea that captivated the field for decades, an idea so pervasive that it became less a prediction and more a fact, an idea that ultimately became known as a law.
Moore’s Law was responsible, at least in part, for setting the trend that ultimately gave rise to the vast processing power that we enjoy today and that drives so much of our modern civilization.
What is Moore’s Law?
It was a simple idea, really, and at the time he shared it, it wasn’t considered especially groundbreaking. In fact, it was just an observation based on his experience: every two years or so, the number of transistors on a microchip would double.
It might not sound like much, but wrapped up in that one short sentence are a lot of implications.
First, to pull off the feat described in Moore’s Law, transistors would have to get smaller—effectively halving in size, with that leap repeated every two years. That alone sounds like an incredible achievement, but the implications go well beyond shrinking things down. You can’t simply make transistors smaller; smaller transistors require changes to the supporting technologies around them as well. The push to advance so many technologies at once, every two years, led to a cascade of innovations.
Nor is it simply a case of evolving existing technologies. Sooner or later you start hitting fundamental laws of physics. When that happens, entirely new fields of science must be imagined and studied, new materials created, and new processing technologies engineered to support these advancements.
In short, the implications of Moore’s Law are staggering: It didn’t just map the course of microprocessor technology but much of human scientific development in the many decades that followed.
Although Moore revised his prediction in 1975 (his original 1965 forecast was a doubling every year, which he later relaxed to every two years), Moore’s Law continued to hold true well into the new millennium. And even though, in terms of pure transistor count, Moore’s Law isn’t the king it once was, its legacy is still felt across the industry today.
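To get a feel for just how dramatic a doubling every two years is, here is a quick back-of-the-envelope sketch in Python. It assumes the commonly cited figure of roughly 2,300 transistors for the 1971 Intel 4004 as a starting point; the function name and parameters are ours, purely for illustration.

```python
def moores_law(start_count: float, start_year: int, target_year: int,
               period: float = 2.0) -> float:
    """Project a transistor count forward, assuming a doubling every `period` years."""
    doublings = (target_year - start_year) / period
    return start_count * 2 ** doublings

# Assumed starting point: the 1971 Intel 4004, commonly cited at ~2,300 transistors.
projected_2021 = moores_law(2300, 1971, 2021)
print(f"Projected 2021 transistor count: {projected_2021:,.0f}")  # roughly 77 billion
```

Real flagship chips of 2021 (Apple’s M1 Max, for instance, reportedly around 57 billion transistors) land in the same order of magnitude—part of why the “law” held up for as long as it did.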
Pushing the limits to make a prediction come true
Moore’s Law remains interesting not so much for what it predicted as for the precedent it set. While it began as an observation of how technology was evolving, it ended up being a benchmark that people and companies alike aspired to surpass. In effect, Moore’s Law stopped being about predicting the future and became a set of targets for the industry to hit.
It wasn’t just business, though. The law also spurred human creativity, inspiring unique ways to keep it alive even as reducing transistor size became harder and harder. Humans found ways to double performance every two years regardless. This very human desire to achieve ever greater things is part of what makes Moore’s Law such an integral part of our lives.
In the end, Moore’s legacy is not just about shrinking transistors, but about the human desire to do and be better.