by Mike Masnick

Filed Under:
moore's law

Moore's Law: Dead Again, According To The Press

from the haven't-we-heard-this-before dept

Every couple of years or so, the press goes nuts after some renowned chiphead says something along the lines of "Moore's Law is dying." But then you look at the details and it's all rather meaningless. So... with reports coming out, yet again, that someone is claiming Moore's Law is reaching its end, there's still little to worry about. First of all, Moore's Law was always more of a rule of thumb (and has been defined in multiple different ways by Moore himself). The definition that many people attribute to Moore has never really been accurate anyway. But even the guy quoted in this article notes that we'll simply move on to somewhat different technologies to continue the inevitable march toward more and more powerful computer chips. So, once again, it seems that the death of Moore's Law isn't nearly as big a deal as it's been made out to be.

Reader Comments

  1. Lawrence D'Oliveiro, Apr 13th, 2009 @ 10:45pm

    But ...

    ... don’t forget Rock’s Law.

  2. Barry, Apr 13th, 2009 @ 11:05pm

    then again...

    ...newspapers might be dead long before Moore's Law.

  4. Dan, Apr 13th, 2009 @ 11:13pm

    Keen insight?

    This wit comes from an industry that can't figure out how to sell its product in an evolving technology environment. Moore's Law was an offhand comment that, in retrospect, proved amazingly accurate. The question is: was it a lucky guess or a truly insightful prediction?

  5. Moore's Law, Apr 13th, 2009 @ 11:30pm


    Rumours of my death have been greatly exaggerated.

  6. Relonar, Apr 14th, 2009 @ 2:33am

    Yeah, but so much effort is being poured into research to shrink feature sizes (down even to the level of manipulating individual atoms and building gates out of them - some absolutely amazing processes). Optimized tech will come out as the chips continue to shrink. The fab processes used on "leading-edge" chips will trickle down to the manufacturers of "everyday application" chips as they become cheap enough for those uses.

    As far as Moore's Law goes...it really doesn't matter. The engineers, physicists, and material scientists will continue to push the envelope. It's just the nature of things.

  7. Andrew, Apr 14th, 2009 @ 3:06am

    Moore's Law will keep on going ..

    One reason why I'm fairly confident we haven't yet reached a limit for Moore's Law is that we already have a massively parallel processing computer capable of ~100 teraflops, using 20 watts of power and taking up only about 1,600 cm². It fits nicely into a human skull ..

    Maybe the pace of advances in computing power will slow - but I don't think we've reached any kind of fundamental limit ..

    Lausanne, Switzerland

  8. Smeg, Apr 14th, 2009 @ 4:43am

    Re: Moore's Law will keep on going ..

    Multiprocessors are the proof of the limit. Try running an enterprise TCP stack on a multiprocessor: you'll soon be pinning a process to each CPU.

  9. SteveD, Apr 14th, 2009 @ 4:45am

    A friend of mine is convinced the very existence of Moore's law is conclusive proof that the Singularity is coming.

    I for one welcome our new silicon masters...

  10. Jonathan Strickland, Apr 14th, 2009 @ 4:46am

    Factors in Moore's Law

    Moore's Law isn't just about innovation -- it's about economics. As the number of transistors on a semiconductor chip (or the power of a microprocessor) has doubled, the cost of designing new chips has increased. At this point, a research and development facility can cost billions of dollars. In the middle of a recession, it's hard to justify that kind of expense. Add to that the fact that consumers are starting to move away from buying the latest and greatest computers toward things like netbooks, and you've removed a lot of the incentive companies had to push innovation. It remains to be seen whether the average customer will be satisfied with what we'd traditionally call underpowered computers that connect to the cloud. If they are, I'd imagine Moore's Law would effectively come to an end. We'd still make advancements, but not at the exponential rate we've seen over the last several decades. But then, I'm one of those wacky journalists referred to in the original post.
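For scale, the doubling this comment describes compounds very quickly. A back-of-the-envelope Python sketch, assuming the commonly cited two-year doubling period and the Intel 4004's 2,300 transistors (1971) as the baseline:

```python
def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Project a chip's transistor count under an idealized Moore's Law.

    Assumes a clean exponential: the count doubles every `doubling_years`,
    starting from `base_count` transistors in `base_year` (the Intel 4004).
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# The idealized curve lands in the billion-transistor range by 2009,
# roughly where the largest chips of that era actually were.
print(f"{projected_transistors(2009):,.0f}")  # prints 1,205,862,400
```

Real chips only loosely track this curve, but it shows why the R&D cost question matters: each step on the curve is as large as all the previous steps combined.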

  11. Jerry Leichter, Apr 14th, 2009 @ 6:35am

    Moore's Law has *already* come to an end

    At least in the sense it was usually taken. Moore spoke about the number of transistors on a chip. At the time he originally wrote, the number of transistors you could fit on a chip limited basic functionality - you needed multiple chips to create a full-function CPU, and indirectly that (among other things) limited the speed of the chips. It also directly limited memory sizes.

    Memory sizes are still limited by the number of transistors on a chip, and always will be. We're getting close to fundamental limits here - with all available technologies you need at least one electron per bit! - though perhaps new approaches (spintronics?) will help.

    But for CPUs, things have changed repeatedly. We long, long ago passed the point where we could put any CPU functionality we wanted onto a chip. We then spent a couple of decades using additional transistors to make that CPU faster - with pipelining, super-pipelining, caching, and so on. And we used the corresponding decrease in feature sizes to increase clock speed. Both of those approaches have been pretty much played out for a number of years, mainly because we can't cool the damn things. So we've instead used the extra transistors to support multiple threads per CPU, then multiple CPUs per chip. That can keep going for a long time from a hardware point of view, but to make good use of it, we need advances in software - and those have been proceeding at nothing like Moore's Law rates. We know how to use tens of CPUs for graphics - but graphics engines feed human eyes, and those have limits. For some specialized algorithms, we can use hundreds, sometimes thousands, of CPUs for non-graphics computation - but most multi-thread/multi-core uses are for running many instances of HTTP responders and such things. The "Moore's Law" advances we used to talk about - in which the processor you could buy next year would noticeably improve whatever you were running today - are a thing of the past.
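One standard way to quantify the software-scaling limit described above is Amdahl's Law. A minimal Python sketch (the `parallel_fraction` value is illustrative, not a measurement):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Amdahl's Law: the overall speedup from n_cores cores when only
    parallel_fraction of the work can actually run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# Even a program that is 90% parallel gains little from piling on cores:
for cores in (2, 4, 16, 1024):
    print(cores, round(amdahl_speedup(0.9, cores), 2))
# prints 1.82, 3.08, 6.4, 9.91 - the ceiling is 10x no matter how many cores.
```

The serial 10% caps the speedup at 10x forever, which is why extra transistors spent on more cores don't translate into "everything just gets faster" the way extra clock speed once did.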

    Has improvement halted? Hardly - but it's moved to other places. "Computes per watt" is the new measure we look at. Intel is building new chips with more transistors - but it's also building huge numbers of chips - the Atom - that use many fewer transistors, but much less power (and provide many fewer "computes"). There's tons of room for advancement here - the Atom is too slow and *still* uses too much power, relative to the demands we can identify even today.

    Now, you can batch all semiconductor improvements under the name "Moore's Law" and then safely say "it's not dead yet." But that's not really helpful in understanding what will be produced and how it will be used. The direction of evolution of chips has fundamentally changed, because the previous direction has led, not quite to a dead end, but certainly to an area where advances are much more expensive and slower in coming. New directions have appeared, and they are enabling entirely new classes of products.

    Moore's Law is dead - and that's a good thing! Just look at your smartphone to see why.....

  12. Anonymous Coward, Apr 14th, 2009 @ 7:59am

    Re: then again...

    Moore's Law is only dead when used to describe the newspaper industry.

  13. Sean, Apr 14th, 2009 @ 8:25am

    Re: Factors in Moore's Law

    If Internet2 ever comes to town, that might end Moore's Law, but only because it will allow a user to borrow CPU cycles from idle computers.

    I feel the other thing that would slow the "Law" is manufacturers moving from increasing speed and adding cores (once the gains become minimal) to improving efficiency. Or focusing on being able to reprogram registers on the fly, so data will not have to make as many passes to complete the task at hand.

  14. swag, Apr 14th, 2009 @ 12:27pm

    Roger Moore was never a good Bond anyway.
