Low-power chip will last decades on a battery

New microprocessors now in testing will aid the Internet of Things (IoT) through frugal power consumption. Soon, we may never change batteries again.


For years we've been obsessed with increasing chip processing power. Intel's i386, launched in 1985, followed by the i486 in 1989, introduced economical multitasking and number crunching to the enterprise.

In the following years, the chips got more powerful still, culminating with today's hundred-dollar smartphone threatening the PC.

It could be argued that we've reached an acceptable level of multitasking and personal computing power for cost. We've found it in small-form-factor smartphones, and it may be all we really need now.

Just as well, then, because processing power as the chip's holy grail is about to be displaced by another requirement: battery life.

Internet of Things

The Internet of Things (IoT) is about to reverse a lot of what we've wanted in a chip.

Soon, we won't need vast numbers of calculations per second. How many instructions does it take for your fridge to send an order to your supermarket? Not many, compared to the complex workloads chip design has been chasing, such as rendering a 3D computer-aided design (CAD) drawing.

Battery life

Size is important. However, the real big issue, when it comes to a ubiquitous IoT where everything is connected, will be battery life.

The reason is that we won't want to change the batteries in the base of a dozen water bottles we may have sitting around just to learn whether we've drunk their contents. Even if your fridge orders fresh stock, it wouldn't be worth it.

Same with a dozen or so planter pots sitting in the yard. Great, they talk to the sprinkler system. Very eco-friendly, and maybe they score a 10 out of 10 on the carbon footprint elimination scale, but it's not so great if you've got to change dozens of batteries, even annually.

That battery has to last the life of the connected object in the IoT. And that could be 10 years away, possibly longer.

Decade-lasting chips

Chip-maker Atmel reckons it has a solution. It says its new 32-bit ARM-based chips will last decades. Note the plural.

Atmel says its new chips combine battery-saving low power with flash and SRAM that is big enough to run both the application and the IoT-needed wireless stacks.

In its marketing, Atmel proffers IoT use scenarios such as "fire alarms, healthcare, medical, wearable, and devices placed in rural, agriculture, offshore and other remote areas."

Wearable

Along with IoT, wearable is a key word here. Atmel says that its chips, called SAM L21, are so low-power that they can be run off energy captured from the body.

Sean Gallagher, writing about the SAM L21 chip for Ars Technica, says the manufacturer demonstrated human energy-sourcing with this chip at CES in January.

In the article, Gallagher also says that an Atmel marketing person told him that the SAM L21 chips were 50% more efficient than other low-power microcontrollers when comparing microamps per MHz.

Consumption

In fact, power consumption in these chips is 35µA (microamperes) per MHz in active mode and 200nA (nanoamperes) in deep sleep mode. For perspective, there are a million microamperes in an amp, and a billion nanoamperes in an amp.

A traditional laptop charger, for comparison, supplies a little over three amps.
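Those figures make a back-of-the-envelope battery-life estimate possible. The sketch below uses the 35µA/MHz and 200nA numbers above, plus assumed illustration values not taken from Atmel: a 4 MHz clock, a 0.1% active duty cycle, and a 230 mAh coin cell.

```python
# Rough battery-life estimate from the quoted power figures.
# The clock speed, duty cycle, and battery capacity are assumptions
# for illustration, not Atmel specifications.

ACTIVE_UA_PER_MHZ = 35.0   # active-mode draw: 35 microamps per MHz
SLEEP_UA = 0.2             # deep-sleep draw: 200 nA = 0.2 microamps
CLOCK_MHZ = 4.0            # assumed operating frequency
DUTY_ACTIVE = 0.001        # assumed fraction of time spent active (0.1%)
BATTERY_UAH = 230_000.0    # assumed 230 mAh coin cell, in microamp-hours

def battery_life_years(clock_mhz=CLOCK_MHZ, duty=DUTY_ACTIVE,
                       capacity_uah=BATTERY_UAH):
    """Estimated battery life in years, ignoring battery self-discharge."""
    active_ua = ACTIVE_UA_PER_MHZ * clock_mhz          # 140 uA while awake
    avg_ua = duty * active_ua + (1 - duty) * SLEEP_UA  # time-weighted average
    hours = capacity_uah / avg_ua
    return hours / (24 * 365)

print(f"~{battery_life_years():.0f} years")
```

On these assumptions the average draw is a fraction of a microamp, and the coin cell's self-discharge, not the chip, becomes the limiting factor — which is how "decades" claims become plausible on paper.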

Sleep modes

How is Atmel doing it? Mainly by creating efficiencies in its sleep modes, cutting the power that leaks away when the chip isn't doing useful work. That is this chip's main power-saving mechanism, and the source of the battery savings.

This article is published as part of the IDG Contributor Network.
