- It has continued at a similar rate since then. That's odd, given how much microchip design and manufacturing have changed. The field has gone from a small number of researchers working on a little-known technology to a huge business the world depends on, driven by a duopoly of multinational tech companies.
- People have been predicting its demise for years, for a variety of reasons - both technical and economic. And yet, like a microchip-powered Timex, it just keeps ticking. Even Moore himself predicted that it would eventually no longer be economically feasible to keep upgrading manufacturing equipment to make chips smaller and faster. That was almost twenty years ago.
So claims of the end of Moore's Law are like the predictions of a bad psychic: if we make enough of them, we'll eventually be right.
And that's led to the strangeness of people celebrating the anniversary of Moore's Law by discussing what happens when it ends. It may not seem like a big change, but for many of us, constant and rapid growth in computing power is all we've ever known. Our devices have always been getting faster and more powerful. But what if our kids just used the same apps we do now? One article posited that we'll have heirloom computers handed down from one generation to the next. I don't see that happening; if new computers are no better than the old ones, we'll just start emphasizing their design and style.
But a lack of improvement in computers could be positive: without constant power increases to cover up bad programming and design, there'd be more pressure to get software right. And we the users would probably start to refine the way we use computers into something more effective and intelligent, instead of just bouncing from one start-up service to another. Gaming would surely improve too, since developers couldn't just rely on greater realism to convince audiences to replace old games.
But seriously, computer engineers, keep the new chips coming.