The end of Moore's Law?
Ever smarter, faster and cheaper silicon chips have driven the computing revolution, but many believe this rapid pace of technological change is about to grind to a halt.
We take it for granted that mobile phones today do as much as, or more than, the cumbersome personal computers we bought just a decade ago. But many industry insiders believe silicon chip manufacturers are about to hit the buffers. And without the hardware to support them, computers may not continue to evolve at the astonishing rate to which we have grown accustomed.
Arranging transistors a thousand times smaller than a human hair on a silicon chip isn't easy, but the ability to manipulate such minuscule entities is just one of the challenges chip manufacturers are facing, and it's probably not the most serious.
When transistors are smaller than this, silicon starts to lose the very properties that make it so useful for building logic circuits. "It's like having light switches that are made from soggy pieces of pasta. They just don't work," says Rich Howard.
In 1965, Gordon Moore, who went on to co-found Intel, predicted that the number of transistors manufacturers could fit on a silicon chip would double every two years. What seemed at the time a casual, even glib remark has since become known as Moore's Law, and it has driven the rapid pace of technological change we've witnessed over the last four decades. But now even Moore himself is clear: this dramatic rate of progress must soon come to an end.
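The scale of that prediction is easy to underestimate. A minimal sketch (not from the programme; the function name and starting figure are illustrative) shows what doubling every two years compounds to:

```python
def transistors(initial: int, years: int, doubling_period: float = 2.0) -> int:
    """Projected transistor count after `years`, assuming a doubling
    every `doubling_period` years (the common statement of Moore's Law)."""
    return round(initial * 2 ** (years / doubling_period))

# Starting from 2,300 transistors (the Intel 4004 of 1971), forty years
# of doubling every two years compounds to roughly 2.4 billion:
print(transistors(2300, 40))  # -> 2411724800
```

The point of the exercise is simply that exponential growth, sustained over four decades, turns thousands of transistors into billions, which is why any slowdown matters so much.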
Roland Pease asks whether this really is the end for Moore's Law, or whether something new is around the corner to fuel the next technological step towards smaller and faster devices.