
Run long and prosper: New research forecasts more bang from batteries


These days, almost every smartphone user has felt the pain of their battery giving up the ghost — often at the worst possible moment. Sure, maybe you didn’t have to play Angry Birds or Words with Friends when the battery was at an 89 percent charge, but now that it’s at 1 percent (with no USB or wall power in sight), you’re desperate to receive that critical text or email message.

We’ve all been there.

New research out of the University of Michigan and Stanford University may help stave off these battery woes, at least eventually. University of Michigan researchers have developed a new “subconscious mode” for smartphones and other devices that could let them continuously monitor Wi-Fi networks while consuming only a tiny sip of power.

Plus, if history is any indicator, battery power ought to take future devices much farther than it does now. For the first time, researchers have established that not only does the processing power of computers double roughly every 18 months (Moore’s Law), but the energy efficiency of computers doubles at the same pace. In other words, in a year and a half, devices will be able to do the same work they’re doing today on only half the battery power.

Both of these developments could have tremendous near-term and long-term implications for the future of mobile devices.

Subconscious mode for smartphones

One battery-draining bugaboo of smartphones and other battery-powered Wi-Fi devices (like media players, cameras, and gaming devices) is that even in power-saving idle and sleep modes, the devices are still monitoring nearby Wi-Fi traffic. This can be a surprisingly active process: in addition to examining essentially every packet to see whether they need to act on it, Wi-Fi devices are often renegotiating their Wi-Fi connections and looking for clear channels as the local data and noise environment changes, and that can be near-constant on networks with lots of traffic or lots of interference.

University of Michigan computer science and engineering professor Kang Shin and doctoral student Xinyu Zhang decided to take a look at devices’ Wi-Fi behavior on real-world networks, and discovered devices can spend 60 to 80 percent of their time in power-saving modes performing “idle-listening” functions. Moreover, their previous work had established that phones in low-power idle modes consume roughly the same amount of power they do when they’re fully awake. Bottom line: Keeping an eye on local Wi-Fi traffic substantially impacts battery life.

So, the researchers looked at ways devices could monitor nearby Wi-Fi traffic in a more energy-efficient manner, and came up with a new technology called Energy-Minimizing Idle Listening, or E-MiLi. E-MiLi takes a two-step approach to reducing power needs when monitoring Wi-Fi data. First, it slows the clock on a device’s Wi-Fi processor to just 1/16th of its normal speed, which means the processor consumes far less power (the reverse of over-clocking a processor in a gaming rig to get more performance). Then, when the system notices data intended for the device, E-MiLi returns the Wi-Fi processor to full speed so it can capture and process that data.
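To make that two-step loop concrete, here is a minimal sketch in Python of the general idea. It is a toy simulation with invented names (Radio, preamble_for_me, and so on), not the actual E-MiLi implementation, and it simply assumes that listening power scales with the receiver’s clock rate:

```python
import random

# Hypothetical illustration of E-MiLi-style idle listening. All names and the
# power model are invented for this sketch; they are not a real driver API.

FULL_CLOCK = 1.0          # relative clock rate when fully awake
IDLE_CLOCK = 1.0 / 16     # E-MiLi's downclocked rate during idle listening

class Radio:
    """Toy Wi-Fi receiver whose listening energy scales with clock rate."""
    def __init__(self):
        self.clock = FULL_CLOCK
        self.energy_used = 0.0

    def listen(self, duration):
        # Assumption: energy spent listening is proportional to clock rate.
        self.energy_used += self.clock * duration

def preamble_for_me(packet, my_addr):
    # Stand-in for E-MiLi's cheap preamble check, which can run while
    # downclocked without decoding the whole packet.
    return packet.get("dst") == my_addr

def idle_listen(radio, traffic, my_addr):
    """Monitor traffic at 1/16th clock; wake to full speed only for our packets."""
    received = []
    for packet in traffic:
        radio.clock = IDLE_CLOCK
        radio.listen(duration=1.0)              # idle-listening slot
        if preamble_for_me(packet, my_addr):
            radio.clock = FULL_CLOCK            # ramp up to capture the payload
            radio.listen(duration=1.0)
            received.append(packet)
    return received

if __name__ == "__main__":
    traffic = [{"dst": random.choice(["me", "other"])} for _ in range(1000)]

    emili = Radio()
    idle_listen(emili, traffic, my_addr="me")

    always_on = Radio()
    always_on.listen(duration=len(traffic) * 1.0)

    print(f"E-MiLi-style energy: {emili.energy_used:.0f}")
    print(f"Always-awake energy: {always_on.energy_used:.0f}")
```

In this toy model, the savings come almost entirely from how much of the time the device spends merely listening rather than receiving its own packets, which mirrors the researchers’ observation that idle listening dominates.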

The trick turned out to be noticing that inbound traffic while running at the reduced clock speed. “We came up with a clever idea,” said Shin in a statement. “Usually, messages come with a header, and we thought the phone could be enabled to detect this, as you can recognize that someone is calling your name even if you’re 90 percent asleep.”


In addition to slowing down the Wi-Fi processor, E-MiLi uses a new header (or, in Wi-Fi terms, preamble) that enables the Wi-Fi chip to detect data likely addressed to it by exploiting self-correlation in packets. In basic terms, the technique saves the Wi-Fi receiver the work of looking inside every packet for an address: the receiver can tell with just a glance whether a packet is meant for it, and when it spots one, the system cycles up to full speed to deal with it.
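For the curious, here is a simplified sketch of what detection by self-correlation can look like. The preamble is modeled as a short pattern repeated back-to-back; this is not the actual E-MiLi preamble format, just the general principle, and the function names are invented for the example:

```python
import numpy as np

# Rough sketch of preamble detection via self-correlation: a repeated pattern
# correlates strongly with a copy of itself delayed by one repetition period.

def self_correlation(samples, period):
    """Normalized correlation between the signal and its delayed copy."""
    a = samples[:-period]
    b = samples[period:]
    num = np.abs(np.sum(a * np.conj(b)))
    den = np.sum(np.abs(b) ** 2) + 1e-12
    return num / den

def looks_like_preamble(samples, period, threshold=0.8):
    # A high metric suggests the repeated preamble pattern is present,
    # without decoding any of the packet's contents.
    return self_correlation(samples, period) > threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    period = 16
    pattern = rng.standard_normal(period) + 1j * rng.standard_normal(period)

    preamble = np.tile(pattern, 4)                        # repeated pattern
    noise = 0.1 * (rng.standard_normal(preamble.shape)
                   + 1j * rng.standard_normal(preamble.shape))

    print(looks_like_preamble(preamble + noise, period))          # expect True
    print(looks_like_preamble(rng.standard_normal(64) + 0j, period))  # expect False
```

The appeal of this kind of check is that it is cheap enough to run on a downclocked receiver, which is exactly what E-MiLi needs during idle listening.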

In testing, Shin and Zhang found that E-MiLi can detect packets with near 100 percent accuracy, even when running Wi-Fi receivers at 1/16th their normal speed. They say the technology is compatible with 92 percent of mobile devices on real-world mobile networks, and results in an average energy saving of about 44 percent. The same technology can be applied to other wireless protocols like ZigBee that have similar idle listening needs. At a glance, it doesn’t appear E-MiLi has any significant security implications: Its packet preambles don’t appear to be disclosing any information that can’t be easily decoded by a Wi-Fi sniffer anyway.

Unfortunately, there is a downside: since E-MiLi puts a new preamble on 802.11 packets, Wi-Fi devices would need new firmware to support the technology; E-MiLi can’t be rolled out with an operating system update. However, devices with E-MiLi would be able to co-exist on networks with devices that don’t support it.

Shin and Zhang will be presenting E-MiLi next week at the ACM International Conference on Mobile Computing and Networking; their conference paper is available online (warning: contains math!). Of course, the University of Michigan is already seeking patent protection for E-MiLi, but says it is looking for commercial partners to bring the technology to market.

Koomey’s Law

One long-standing trend in the history of computing is Moore’s Law, which holds that the number of transistors in an integrated circuit will double roughly every two years. The “law” is named for Intel co-founder Gordon Moore, but was probably first articulated way back in 1959 by Douglas Engelbart, who, incidentally, invented the computer mouse. Intel’s David House later restated Moore’s Law as a doubling in overall computing performance every 18 months, owing to advances beyond the raw transistor count. Moore’s Law has been in effect since the 1950s, and technologists expect it to hold for perhaps another decade.

Moore’s Law might be all well and good for general computing, but it ignores a key limiting factor in mobile computing: power. At a basic level, more transistors and higher clock speeds mean processors consume more power, although in practice advances in manufacturing techniques (and downscaled integrated circuit designs) have prevented power consumption from doubling every 18 months along with computing performance.

Now, a new historical analysis by Stanford professor Jonathan Koomey establishes for the first time that the amount of power required to handle a particular computing load is cut in half roughly every 18 months. Dubbed “Koomey’s Law,” the finding is a near-perfect counterpart to Moore’s Law, and it’s backed by research into the power consumption of more than six decades of computing hardware, going all the way back to the U.S. Army’s vacuum-tube monstrosity ENIAC in 1946.

Interestingly, ENIAC pre-dates transistors — which means Koomey’s Law doesn’t just apply to microprocessors. “This is a fundamental characteristic of information technology that uses electrons for switching,” Koomey told the MIT Technology Review. “It’s not just a function of the components on a chip.”

The new research was conducted in collaboration with Intel and Microsoft, and is published in the IEEE Annals of the History of Computing. Koomey is the lead author, and co-authors are Microsoft’s Stephen Berard, Carnegie Mellon University’s Marla Sanchez, and Intel’s Henry Wong.

Koomey’s Law has led to some interesting speculation and comparisons. If it holds true, it could mean that a device like an iPhone that can run for (say) 24 hours today performing some task would be able to run for nearly three weeks on the same task a decade from now. Similarly, as Alexis Madrigal points out in The Atlantic, a MacBook Air that gets 7 hours of battery life on a computing task today would have run for a mere 2.5 seconds on the same 50 watt-hour battery back in 1991.
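The arithmetic behind that MacBook Air comparison is straightforward. Here is a quick sketch assuming the full Koomey’s Law doubling rate and nothing else changing (the 20-year gap and the 7-hour, 50 watt-hour figures come from the comparison above):

```python
# Back-of-the-envelope arithmetic, assuming energy per fixed computing task
# halves every 18 months and battery capacity and workload stay constant.

DOUBLING_PERIOD_YEARS = 1.5

def efficiency_gain(years):
    """Factor by which the energy needed for a fixed task falls over `years`."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

gain_since_1991 = efficiency_gain(2011 - 1991)   # roughly 10,000x over 20 years
seconds_in_1991 = 7 * 3600 / gain_since_1991     # 7 hours of 2011-era work

print(f"Efficiency gain since 1991: {gain_since_1991:,.0f}x")
print(f"Same task on 1991 hardware: {seconds_in_1991:.1f} seconds")  # ~2.4 s
```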

Of course, these and all other battery-powered devices do more than plain-and-simple computing: They also expend power accessing memory and storage, driving their displays, and (as University of Michigan researchers have pointed out) keeping up with Wi-Fi networks. But Koomey’s Law does imply that, in not-too-many-years, charging a phone, tablet, or even notebook computer might change from once-a-day or once-every-other-day drudgery to a once-a-month reminder.

Top photo credit: Norebbo / Shutterstock

Geoff Duncan
Former Digital Trends Contributor