Moore’s Law Can’t Last Forever—But Two Small Changes Might Mean Your Phone Battery Will



In 1965, when computer technology was in its infancy, a pioneering semiconductor engineer named Gordon Moore wrote a paper that shocked technologists at the time. Moore observed that the number of components that could be squeezed onto a chip was doubling roughly every 12 months, effectively doubling computing power while the cost of that power fell by about half over the same period. And so, for some 40 years, what became known as Moore’s Law remained pretty rock-solid.

But these are hard days for Moore’s Law. Last year Intel, the computer-chip maker Moore cofounded, said the time it takes to double processing power had stretched to about 30 months. In May 2017, the MIT Technology Review ran the headline “Moore’s Law Is Dead.”
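To get a feel for why that slower cadence matters, here’s a back-of-the-envelope sketch in Python. The 12- and 30-month figures come from the paragraphs above; the 10-year horizon is purely illustrative.

```python
# Illustrative only: how much a slower doubling cadence matters once it
# compounds. The 12- and 30-month figures come from the text above; the
# 10-year horizon is an arbitrary example.
def growth_factor(years: float, months_per_doubling: float) -> float:
    """How many times performance multiplies over `years` at a given cadence."""
    doublings = years * 12 / months_per_doubling
    return 2 ** doublings

decade = 10
fast = growth_factor(decade, 12)   # Moore's original 12-month cadence
slow = growth_factor(decade, 30)   # Intel's reported 30-month cadence

print(f"12-month doubling over {decade} years: ~{fast:,.0f}x")
print(f"30-month doubling over {decade} years: ~{slow:,.0f}x")
# Prints ~1,024x versus ~16x -- the gap widens dramatically as time goes on.
```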

It’s true the pace of accelerating computer power is slowing. It’s also true this slowdown is a problem: Many of the next-generation products we’ve been promised depend on faster, more powerful, less expensive chips, and their roadmaps have been modeled on the assumption that Moore’s Law will hold. Advances in virtual reality, artificial intelligence, self-driving cars, medical and genetic engineering, and even the newest smartphones will be delayed significantly if that exponential pace continues to erode, or stops altogether.

But to misquote Mark Twain, the report of its death may be greatly exaggerated.

How much smaller can a microchip get?

Moore’s Law isn’t dead. But it isn’t looking good, either. And if it’s going to be resuscitated, engineers and product designers have to adjust where they look for new breakthroughs.

“Have to” isn’t a suggestion—it’s physics. Computer engineers have squeezed better performance from chips for years by shrinking their size, but this strategy has run its course. In chip design, we’re smacking our heads against the walls of physics and geometry: It is incredibly difficult, as a practical matter, to get smaller.

Contemporary chip design has cut the spacing between a chip’s component parts down to a dozen or so nanometers. For nonengineers: a single sheet of paper is about 0.1 millimeters, or 100,000 nanometers, thick, so the spaces inside chips are now roughly 1/8,000 the thickness of a sheet of paper.
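For anyone who wants the unit conversion spelled out, here is the same arithmetic as a tiny sketch (the figures are the ones quoted above):

```python
# Sanity check on the paper comparison, using the figures from the paragraph
# above: a sheet of paper is ~0.1 mm thick, chip spacing is ~12.5 nm.
paper_thickness_mm = 0.1
paper_thickness_nm = paper_thickness_mm * 1_000_000   # 1 mm = 1,000,000 nm
chip_spacing_nm = 12.5                                 # "a dozen or so" nanometers

ratio = paper_thickness_nm / chip_spacing_nm
print(f"Sheet of paper: {paper_thickness_nm:,.0f} nm thick")
print(f"Chip spacing is roughly 1/{ratio:,.0f} the thickness of a sheet of paper")
# -> 100,000 nm and 1/8,000, matching the figures above
```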

And while it’s possible to shrink those sizes further, down to about seven nanometers, the industry estimates that just developing a prototype of a 7nm chip would cost $100 million, and there are only three companies on the planet capable of even attempting it: Taiwan Semiconductor Manufacturing Company, Samsung, and Moore’s Intel. The latter just announced it was putting $9 billion into the 7nm processor, which will take at least four years to develop.

At 7nm, we’re done. There just isn’t more juice to squeeze from smaller spaces. So after that, getting better performance from our computing technology will come down to how well we can innovate in two other areas: heat management and power density.

Heat and power issues are design and device killers. They are also fatal to innovation. Locked in by size limits and handcuffed by heat and power issues, we’re at a virtual standstill.

Step 1: Not getting a hot head

To have any chance of regaining the pace of advancing computing power, we must push the boundaries of heat management. Think of it this way: To get faster cars, we need more powerful engines and better tires. But right now, nearly everything we do to make the engine better blows out the tires.

Heat issues have already stalled some computer-engineering advances such as stacking, a design approach in which parts of a computer system such as the processor, the memory and the power source are stacked atop one another. This shortens the distance commands and power must travel within a machine, saving energy and increasing processing speeds.

But while stacked components are faster, they generate more heat together than they do apart. Their proximity severely limits engineers’ ability to maintain workable, safe temperatures. As a result, chip makers Qualcomm and Intel have already ditched the stacking idea. Babak Sabi, director of assembly and test development technology at Intel, told the trade journal EETimes, “No one has true stacking of memory on logic and unless someone comes up with a thermal solution … I don’t think anyone’s going to use it.”

Old heat dissipation technologies relied on copper and aluminum tubes and plates to conduct and spread out heat. But those tubes and plates are heavy, which makes them inefficient in products such as laptops, cellphones and cars. They are also rigid and inflexible, which makes them design nightmares—you try designing a sleek, sexy smartphone around a copper plate.

The good news is that because heat technology is blocking progress on overall computer performance, it’s evolving quickly. The heat solutions of tomorrow will likely include gels, pastes and newly designed flexible fibers instead of heavy, rigid materials. For example, NASA is currently testing a new, light, flexible heat-dissipation material that looks and feels like velvet.

Step 2: Getting more bang from your power buck

If heat issues have hobbled Moore’s Law, power density issues have downright crippled it.

Power density is the amount of power that can be drawn from a set amount of space; its close cousin, energy density, is how much total energy can be stored in that space. Greater density means more power, for longer, from a battery of the same size. To return to the race car analogy, if computer processing is the engine and heat management is the tires, power density is the fuel.
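To make the idea concrete, here’s a minimal sketch of the relationship; every number in it is hypothetical and chosen only to show how power, energy and volume trade off:

```python
# A minimal sketch of the relationship between power, energy and space.
# Every number here is hypothetical -- none of them come from the article.
battery_volume_cm3 = 20.0    # hypothetical phone-battery volume
stored_energy_wh = 12.0      # hypothetical stored energy, in watt-hours
peak_draw_w = 6.0            # hypothetical peak power the device draws

energy_density = stored_energy_wh / battery_volume_cm3   # Wh per cm^3
power_density = peak_draw_w / battery_volume_cm3          # W per cm^3
runtime_at_peak_h = stored_energy_wh / peak_draw_w        # hours at peak draw

print(f"Energy density: {energy_density:.2f} Wh/cm^3")
print(f"Power density:  {power_density:.2f} W/cm^3")
print(f"Runtime at peak draw: {runtime_at_peak_h:.1f} hours")
# Shrink the device and the battery volume shrinks with it; unless density
# improves, either peak power or runtime has to give.
```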

Our computers and other electronics have been getting faster and stronger, requiring more and more power in less and less space—but our battery technology is only inching along. As the Samsung Galaxy Note 7 can tell you, even the slightest mistakes in balancing greater power demands against tighter design specifications can be catastrophic.

The energy density issue has been a giant stop sign for any next-generation computer products that move, such as robotics, drones, space exploration devices and electric vehicles. For those realms, power density is everything. For more casual consumers, the lack of power-density improvement is why it feels like your cellphone battery drains so quickly—because it does.

To complicate matters even further, energy density and heat management are related problems. Storing energy, charging batteries and drawing power all generate heat. So every time engineers push one boundary, something on the other side gets a tad more complicated.

The future of chip technology

It’s not all doom and gloom, though. I’m highly optimistic that scientists and engineers will make Moore’s-Law-type strides in heat management and power density very soon. One reason I have faith is that the incentives to overcome these technical, engineering, and design challenges are consumer-driven: Customers want batteries that last longer and laptops that don’t get too hot, and they prioritize thinner, lighter products over processing power. So when it comes to risk/reward business decisions, there are considerable economic rewards to be had in getting heat and power right.

Another cause for optimism is that the innovation slowdown has created slack in the capacity chain, which means that for every step forward we take in thermal or energy technology, we may unlock a corresponding advancement elsewhere.

When that happens, the rush of new products and technologies will be fast and furious—reinstating and destroying Moore’s Law at the same time. Technological progress may not follow the steady, predictable curve Moore described, but it just might end up being even more exciting.