Friday, April 17, 2015

It Keeps Going And Going And...

This week was the fiftieth anniversary of Moore's Law. If you're not familiar with it, Gordon Moore is a co-founder of Intel who was involved in the creation of the first microchips. Early on, he noticed that progress in making newer and better chips was remarkably steady: the number of transistors they could fit on a chip kept doubling at a regular clip, roughly every two years. The concept has become notable for a couple of reasons:
  • It's continued at a similar rate ever since. That's odd, given how much microchip design and manufacturing have changed. They've gone from a small number of researchers working on a little-known technology to a huge business that the world depends on, pushed by a duopoly of multinational tech companies.
  • People have been predicting its demise for years, for a variety of reasons, both technical and economic. And yet, like a microchip-powered Timex, it just keeps ticking. Even Moore himself predicted that it would eventually stop being economically feasible to keep upgrading manufacturing equipment to make chips smaller and faster. That was almost twenty years ago.
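
Fifty years of that compounding adds up fast. Here's a rough back-of-the-envelope sketch in Python, assuming a clean doubling every two years over the law's fifty-year run; the real cadence has wobbled, so treat the result as order-of-magnitude only:

    # Rough sketch of how transistor counts compound under Moore's Law.
    # The doubling period and time span are illustrative assumptions,
    # not a fit to any real product line.
    doubling_period_years = 2
    span_years = 50

    doublings = span_years / doubling_period_years   # 25 doublings
    growth_factor = 2 ** doublings                   # about 33.5 million

    print(f"{doublings:.0f} doublings -> roughly {growth_factor:,.0f}x more transistors")

Twenty-five doublings works out to a factor of about 33 million.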
But the weird thing about Moore's Law is that for all the false predictions of doom, we know it has to end eventually. If we keep making the circuitry on chips smaller and smaller, we'll eventually reach the point where a wire is just a line of individual atoms, and we can shrink it no further. And thanks to the weirdness of quantum mechanics, it would stop behaving like a normal wire long before it got that small.

So claims of the end of Moore's Law are like the predictions of a bad psychic: if we make enough of them, eventually one will be right.

And that's led to the strange spectacle of people celebrating the anniversary of Moore's Law by discussing what happens when it ends.  It may not seem like a big change, but for many of us, constant, rapid growth in computing power is all we've ever known.  Our devices have always been getting faster and more powerful.  But what if our kids end up using the same apps we do now?  One article posited that we'll have heirloom computers handed down from one generation to the next.  I don't see that happening; if new computers are no better than the old ones, we'll just start emphasizing their design and style.

But a lack of computer improvement could be a positive: without constant power increases to cover up bad programming and design, there'd be more pressure to get software right.  And we the users would probably refine the way we use computers into something more effective and intelligent, instead of just bouncing from one start-up service to another.  Gaming would surely improve too, since developers couldn't just rely on ever-greater realism to convince players to replace old games.

But seriously, computer engineers, keep the new chips coming.
