Commentary

No More Moore? Don't Be So Sure

“The chips are down for Moore’s Law,” read the headline in last month’s Nature. Finally, the semiconductor industry was going to admit “what has become increasingly obvious to everyone involved,” and the party would be over. After 50 years of riding an exponential wave, we’re maxing out on our ability to double the number of transistors on a chip.

From the Nature article: “The doubling has already started to falter, thanks to the heat that is unavoidably generated when more and more silicon circuitry is jammed into the same small area… Top-of-the-line microprocessors currently have circuit features that are around 14 nanometres across, smaller than most viruses. But by the early 2020s… ‘we'll get to the 2–3-nanometre limit, where features are just 10 atoms across…’ [A]t that scale, electron behaviour will be governed by quantum uncertainties that will make transistors hopelessly unreliable.”

So that’s it. No more buying a new computer every other year, twice as good for the same price. No more going from a megabyte to a gigabyte to a terabyte in what seems like weeks. No more transformative shifts in technology at warp speed.

Except, of course, that that’s not what’s happening, and that’s not it, not by a long shot. I mean, that may be it for Moore, but not for the rest of us. All we need is a simple shift in perspective.

Instead of looking specifically at transistors on a chip, as Moore did, we should be looking at how much computing power we can generate for how much money. Ray Kurzweil called it the “price-performance of computing,” and it refers to the number of instructions per second you can buy for $1,000.

Turns out this price-performance malarkey was following a doubling curve long before Gordon Moore was even a glint in his mother’s eye. Sure, it may have taken different forms -- in the early 1900s it was electromechanical punch cards, in the 1950s vacuum tubes -- but every time we hit the limits of a particular technology, a new one comes along, and the result is the same: You buy a new computer every other year, twice as good for the same price.
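
As a rough illustration of that arithmetic, here is a minimal sketch in Python. The machine specs and the two-year doubling period are assumptions chosen for the example, not figures from the article:

    # Price-performance: instructions per second you can buy for $1,000.
    # All numbers below are hypothetical and for illustration only.
    def price_performance(instructions_per_second, price_dollars):
        """Instructions per second per $1,000 of hardware cost."""
        return instructions_per_second * (1000 / price_dollars)

    # A hypothetical machine: 100 billion instructions/second for $500.
    today = price_performance(100e9, 500)

    # If price-performance doubles every two years, ten years out it is 2**5 times larger.
    in_ten_years = today * 2 ** (10 / 2)

    print(f"{today:.2e} instructions/sec per $1,000 today")
    print(f"{in_ten_years:.2e} instructions/sec per $1,000 in ten years")

The point of the metric is that it doesn't care where the doubling comes from: whether it's smaller transistors or an entirely new substrate, the number of instructions per second per $1,000 keeps climbing.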

“But Kaila,” I hear you protesting, “We’ve never reached this kind of limit before! We’ve never had to come up with solutions beyond the scale of the nanometer!”

No, we haven’t. But remember, we’re not talking about scale anymore. We’re talking about what we can get and how much it will cost. Which likely means we’re talking about an entirely different computing paradigm.

What will that paradigm be? Nobody knows yet.

In November, a team of researchers from the University of Sheffield and the University of Leeds published a study showing that certain kinds of sound waves can move large amounts of data with very little power.

Last month, Motherboard reported on a startup called Koniku, “integrating lab-grown neurons onto computer chips in an effort to make them much more powerful than their standard silicon forebears.”

And earlier this week, nanotechnologists at Lund University in Sweden announced a way to use cellular organelles for parallel computing, which could “shrink a supercomputer to the size of a laptop.”

Koniku founder Oshiorenoya Agabi summed up the phenomenon perfectly. “The fact that we’re constantly getting increases in computational power, that law of computing holds for the last 2,000 to 5,000 years. Moore’s Law is a little patch, it’s only a little piece of that law… One of these two laws is going to have to give in, and I suspect that Moore’s Law is the one that’s going to have to give in. But our power to calculate faster and faster -- that law is here to stay.”

2 comments about "No More Moore? Don't Be So Sure".
  1. Nicholas Fiekowsky from (personal opinion), March 4, 2016 at 1:05 p.m.

    Cloud computing is delivering lots more compute capacity at lower price. Two factors at work:

    - Cost-effectiveness of a low-friction shared resource. Price out a home generator to provide power during an outage. It's clear that utility generation, where you flip a switch to get the power you need when you need it, has lower unit costs and capital costs. Same thing when you pay Amazon Web Services for the computation and storage you need, when you need it. Even Netflix finds it a compelling bargain. BTW - this radically changes economics for web start-ups. No need to dump millions into building, filling and running a data center, then growing it ahead of demand.

    - Efficiency gains from cloud-scale data centers. Even small efficiency and performance gains mean a LOT when you run millions of servers in dozens of data centers. Amazon, Google and Microsoft are partnering with, or replacing, technology vendors to deliver more service for lower cost.

    We may not be buying new devices as often, but we're getting a lot more computing done at very low cost.

  2. Paula Lynn from Who Else Unlimited, March 4, 2016 at 8:21 p.m.

    The faster and more efficient they are, the more dangerous they become and the faster they can destroy what has been built and grown. Cool without caveats. 
