Moore’s Law Could Be No More

Moore’s law is already slowing down and will collapse within a decade, according to theoretical physicist Michio Kaku. He believes physical limitations are already beginning to come into play.

Exactly what “Moore’s law” means is somewhat vague, and the definition has varied over the years since Gordon E. Moore wrote in a 1965 issue of Electronics Magazine: “The complexity for minimum component costs has increased at a rate of roughly a factor of two per year… Certainly over the short term this rate can be expected to continue, if not to increase.”

In other versions, including later tweaking by Moore himself, the general theme is that computing power will double roughly every two years, based on the idea of fitting twice as many transistors on a same-sized circuit. Intel has tweaked that further to an 18-month period, on the basis that the transistors are also getting faster.

Whatever the details, it’s been more of a principle than a law, though certainly one that seems to have held up over the past 50 years or so. (See above, via Wikipedia)
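As an illustrative sketch (not from the article), the doubling rule is easy to put in numbers: with a two-year doubling period, transistor counts grow 32-fold in a decade, while the 18-month variant compounds to roughly 100-fold over the same span. The function name and starting values below are hypothetical, chosen only for illustration.

```python
def projected_transistors(initial: int, years: float, doubling_period_years: float) -> int:
    """Transistor count after `years`, doubling every `doubling_period_years`."""
    return round(initial * 2 ** (years / doubling_period_years))

# Two-year doubling: 2**(10/2) = 32x growth over a decade.
decade_growth_2yr = projected_transistors(1, 10, 2)
# 18-month doubling compounds faster: 2**(10/1.5), roughly 100x in a decade.
decade_growth_18mo = projected_transistors(1, 10, 1.5)
```

The gap between the two variants widens quickly, which is why the exact doubling period matters so much when people argue about whether the law is "holding".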

Now Kaku says that as things stand, Moore’s law will “flatten out completely” over the next 10 years. That’s based on the idea that we are beginning to hit the physical limits of silicon. Specifically, if transistors get any smaller, the inherent problems of overheating and leakage will eventually outweigh any gains.

Kaku believes we’ll be able to stave off the decline with workarounds for a while yet, including parallel processing and even three-dimensional chips. Ultimately he believes we’ll have to switch to completely different concepts of computer processing, such as molecular computing (effectively using molecules to create a valve that replicates silicon transistor switches at a smaller scale) and even quantum computing. However, the latter option likely won’t be viable until near the end of the century, according to Kaku.


7 Responses to Moore’s Law Could Be No More

  1. Moore's law may only cover transistor count, but it doesn't take into account differences in software and the computing power of those physical chips. The memristor will change everything. Computers will begin to have permanent memories which they can recall like humans, instead of redoing the calculation from start to finish. Silicon might reach density and speed limitations, but changes in the way we calculate things and the prohibitive cost of retooling manufacturing might extend the life of this old friend for much longer than we anticipate or want.

  2. If you read his book, Physics of the Future: the Year 2100, he discusses this in detail, including looking at quantum computing and the current ones in operation. I think the one he discusses was at MIT. I forget; I read it a while ago. But he also discusses energy, nanotechnology, space travel (including sci-fi’s fave of space elevators), and transportation. I do definitely recommend that you get it. It was a great book.

  3. A workaround will be developed. There was a similar issue a few years back where manufacturers ran into a problem with creating the die with which to stamp the procs. The tech they had at the time was reaching a point where it couldn't carve anything finer than a certain scale. The solution ended up being a new type of laser for carving the die that could cut much finer, and thus progress continued. I understand that this is referring to the physical limitation of silicon itself, but they'll find a way around it. If nothing else, we can just start building multi-proc systems (essentially like servers already use) until a new, more efficient proc is finally devised.

  4. Eventually, there MUST be an end to Moore's Law… That's just common sense. But to say that's it – we're there because of what I saw in the last 5 years – that's myopic. In its generic form, Moore's Law applies to the price-performance of computation technology, and we're not even close to exhausting the computational limits of materials we have available to us at rock-bottom prices. Parallel computing where each core costs less doesn't necessarily change the tech, but it does change price-performance. And it buys us enough time to figure things out. Just because he doesn't know what's next doesn't mean it's not there. Usually the front-runner solutions end up being the wrong ones. It's the same as saying "vacuum tubes are the cutting edge, but eventually you'll run out of room for the tubes in the box and the heat will burn everything out". Cloud computing that makes an arbitrary amount of computing power available to individuals on demand for a fraction of today's cost of owning a computer is another solution. We may reel in an asteroid so rich in precious metals and quartz that the cost of manufacturing electronics plummets overnight. We've only recently tapped the quantum physics field and we're already talking about quantum computing. Why not string computing? Why not fabric-of-space computing? It may end up being gibberish or it may end up being the de facto computing method in 50 years' time. The point is, while Moore's Law must eventually end, our understanding of physics at the level of ultra-small computation is still too incomplete to rule out the potential for HUGE breakthroughs in price-performance.

  5. I liked all of the written posts because I had a bit too much to drink (silence in space for half an hour…)
    But seriously, I think everyone saw this coming: working with technology that is relevant today will not work for the needs of tomorrow or next year. When you think "oh, it's impossible to fly to the moon in one-point-two-eight-two seconds," you are thinking in today's technology, what can be possible by today's standards. You will laugh, and scoff at the mere concept of it being probable… but not knowing of tomorrow's discoveries, that is human.

  6. All this talk about molecular and quantum computers and most people are still using magnetic disks for their mass storage (myself included).
