Over the last few years, numerous new battery technologies have been proposed by research institutes, from flexible Li-ion batteries to batteries made from graphene. However, no one is sure how long it will take for these technologies to reach commercial form, so there is little point in waiting for them.
If you cannot increase the amount of charge available to your device, the next best thing is to increase the amount of work it can do with the same charge. That is why we asked top engineers in the industry how they have been enhancing battery life in their devices.
How to save power
There are umpteen areas that you can focus on to save power in a system. The major ones are discussed below.
Select the right component. In a project that must operate at very low power consumption, it is of paramount importance to choose components that improve power-system performance within the constraints of the project’s current budget. Power-saving techniques can then be layered on top of these low-power components.
Brian Chu, senior application engineer, Microchip Technology, says, “In addition to selecting low-supply-current devices, it is recommended that designers take advantage of various power-saving techniques such as load-based multi-mode power-conversion design, input-to-output bypass design, standby and the like while maintaining the performance level.”
Fortunately, the latest components all play a part in helping engineers design more efficient circuits. The latest power switches offer very low on-resistance and low standby and shutdown currents, and also protect the circuit against current-overload and short-circuit conditions. The latest low-dropout regulators (LDOs), meanwhile, generate less noise and offer high ripple rejection and low standby current.
“Moreover, current microprocessor supervisory circuits offer voltage-monitoring, power-up reset and watchdog functions. A combination of these solutions will enable engineers to build more efficient circuits that would consume much less power. For embedded designs, engineers can utilise microprocessor supervisory circuits, power switches and LDOs,” explains Kay Annamalai, director of marketing, Pericom Semiconductor.
Another important aspect is the quiescent current of the components used in the system. Quiescent current is the current that flows through a system or component when it is not driving any load.
Abhishek Kumar, business development manager-power management, Texas Instruments (India), shares, “Quiescent current is important because battery life is determined by the total current drain composed of quiescent current (mainly in standby conditions) and load current. Quiescent current consumption should be as low as possible in order to prolong the battery’s life.”
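As a rough illustration of why quiescent current matters, battery life can be estimated from the time-weighted average current drain. The battery capacity, currents and duty cycle below are hypothetical numbers chosen only to show the calculation:

```python
def battery_life_hours(capacity_mah, active_ma, quiescent_ma, active_fraction):
    """Estimate run time from the average current drain.

    capacity_mah    : usable battery capacity in mAh
    active_ma       : load current while the device is active, in mA
    quiescent_ma    : quiescent (standby) current, in mA
    active_fraction : fraction of time spent active (0..1)
    """
    # Average drain = time-weighted sum of active and standby currents.
    avg_ma = active_fraction * active_ma + (1 - active_fraction) * quiescent_ma
    return capacity_mah / avg_ma

# Hypothetical sensor node: 1000mAh cell, 20mA when active, active 1% of the time.
low_iq = battery_life_hours(1000, 20.0, 0.05, 0.01)   # 50uA standby: ~4008 hours
high_iq = battery_life_hours(1000, 20.0, 0.50, 0.01)  # 500uA standby: ~1439 hours
```

Because the device spends most of its life in standby, cutting the quiescent current from 500µA to 50µA nearly triples the run time even though the active current is unchanged.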
Improve your power system performance. Faster, lower-loss switching topologies such as zero-current switching, zero-voltage switching and sine-amplitude conversion are enabling significant improvements in power-system performance.
“Reductions in the overall size of power components and increases in conversion efficiency are opening up opportunities for integrating power management solutions adjacent to their demanding loads, microcontrollers and FPGAs, right on the main PCB,” informs Andy Gales, vice-president, international sales, Vicor Corporation.
Tuning the code
Mobile devices typically have their operating systems tuned for better battery life by making several special features available to designers. Android uses a mechanism called ‘wakelocks’: a set of patches to the Linux kernel that allows a caller to prevent the system from entering a low-power state.
“If a novice programmer writes buggy code that acquires a wakelock and then gets stuck in some loop, the device holds the wakelock and never goes into standby mode,” explains Darshak Vasavada, CEO, Stamp Computers.
It is a myth that only hardware dissipates power and software plays no role. In embedded systems it is the software that controls the hardware, so software plays an important part in optimising the power of the system as a whole. Nitin Gupta, lead engineer, µEnergy Applications, CSR India, explains where and how to optimise: “Optimise the memory accesses in the software; the more frequent the accesses, the more time is spent switching the high-capacity bus system and the memory system. You can also use parallel processing in the memory system to fetch more data in a single access. Another tip is to keep all the IP blocks unclocked, or at a lower clock, whenever the application does not need them, so that power dissipation is minimal—though this is a trade-off with performance.”
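Gupta’s point about wider, less frequent memory accesses can be shown with a toy energy model. The per-access setup cost and per-byte cost below are made-up numbers; only the trend matters—moving the same data in fewer, wider accesses amortises the fixed switching overhead of the bus:

```python
import math

def transfer_energy(n_bytes, access_width, e_setup=10.0, e_per_byte=1.0):
    """Toy model: each bus access pays a fixed setup cost (bus switching)
    plus a cost per byte actually moved. Units are arbitrary."""
    accesses = math.ceil(n_bytes / access_width)
    return accesses * e_setup + n_bytes * e_per_byte

byte_wise = transfer_energy(1024, 1)  # 1024 accesses: 1024*10 + 1024 = 11264
word_wise = transfer_energy(1024, 4)  #  256 accesses:  256*10 + 1024 =  3584
```

In this model, fetching 1KB four bytes at a time costs less than a third of the energy of byte-at-a-time access, because the fixed per-access overhead dominates.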
Power system designers are demanding even higher input voltage ranges from their component power suppliers. Vicor recently introduced a wide-input-range zero-voltage-switching buck regulator that accepts inputs up to 36V, yet provides a 1.8V, 15A output at efficiencies typically exceeding 90 per cent—all in a 14×10×2.6mm³ SIP. The technology to achieve this performance was developed in-house and utilises a clamp-switch to minimise losses in switching the main FET.
Today, battery-system designs typically have at least one of the following integrated circuits (ICs), in addition to the system’s main chipset (if applicable):
1. Power management unit
2. Microcontroller unit
3. Battery management unit
The power management unit offers regulated voltage or current in a system.
“Some voltage regulation is also integrated into the main chipset. However, a monolithic converter is still required due to layout complexity, EMI concerns and performance shortfalls such as an insufficient number of power-output channels or limited load capability. Monolithic voltage converters are available with single and multiple outputs,” adds Brian.
To learn how to select low-power MCUs for your project, refer to the article “How To Select A Microcontroller” published in the December 2012 issue of EFY.
Computing elements and scaling clock rates. Power consumption by processor cores can be reduced significantly by utilising the various sleep modes available. Advanced Micro Devices’ Larne reference platform is projected to measure APU power at 1.2W during idle, 1.4W during Web browsing, 2.35W during video playback and 0.02W in the system’s sleep state.
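Per-mode figures like those for the Larne platform only translate into battery life once they are weighted by how the device is actually used. The per-mode power numbers below are the ones quoted above; the time fractions are hypothetical, chosen only to illustrate the averaging:

```python
def average_power(profile):
    """profile: list of (power_in_watts, fraction_of_time) pairs.
    The fractions must sum to 1."""
    assert abs(sum(f for _, f in profile) - 1.0) < 1e-9
    return sum(p * f for p, f in profile)

# Per-mode power from the Larne figures; the time split is an assumption.
profile = [
    (1.20, 0.30),  # idle
    (1.40, 0.10),  # web browsing
    (2.35, 0.05),  # video playback
    (0.02, 0.55),  # system sleep
]
avg_w = average_power(profile)  # 0.36 + 0.14 + 0.1175 + 0.011 = 0.6285 W
```

With this usage mix the average draw is well under the idle figure, because more than half the time is spent in the 0.02W sleep state—which is exactly why aggressive use of sleep modes pays off.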
Abhishek advises design engineers to keep in mind many factors in their system architecture and design during development.
“More specifically, engineers must take well-informed system-architecture decisions, choose optimal components and program microcontrollers to optimise functionality, resulting in longer battery life. This is definitely much more complex than simply estimating how much current the CPU consumes when active. Depending on the application being developed, standby current, peripheral current or data-logging current may have a more significant impact on battery life than CPU power,” he shares.
The second aspect that increases battery life is dynamic voltage scaling. “It is a framework to change the frequency and/or operating voltage of a processor based on system performance requirements at a given point in time. Power devices with dynamic voltage scaling as an inbuilt feature will help lower the power consumption of the CPU in mobile devices,” adds Abhishek.
“An increase in clock speed or bus width increases power consumption in direct proportion. Therefore, design the system to run as slowly as practical in every operating mode you have defined,” advises T. Anand, MD, Knewron.
Next, check whether the processor is static or dynamic. “Dynamic core processors are not usually made for low-power applications as they don’t respond well to slow or stopped clocks. If the datasheet says ƒosc = 0, you can slow down as much as you want,” Anand explains.
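The effect Anand and Abhishek describe follows from the usual first-order model of dynamic (switching) power, P ≈ C·V²·f: power scales linearly with clock frequency and quadratically with supply voltage, which is why scaling both together pays off. A sketch, with made-up capacitance, voltage and frequency values:

```python
def dynamic_power(c_eff, v_dd, freq):
    """First-order switching-power model: P = C_eff * Vdd^2 * f."""
    return c_eff * v_dd**2 * freq

C = 1e-9  # effective switched capacitance in farads (hypothetical)

full = dynamic_power(C, 1.2, 1.0e9)  # 1.2V core running at 1GHz
slow = dynamic_power(C, 1.2, 20e6)   # frequency alone scaled down to 20MHz
dvs  = dynamic_power(C, 0.9, 20e6)   # supply voltage also lowered to 0.9V

# Frequency scaling alone gives a 50x reduction in dynamic power;
# lowering the voltage as well multiplies that by (1.2/0.9)^2, about 1.78x.
```

Note this model covers only switching power; at very low frequencies, leakage and quiescent current start to dominate, which is why racing to sleep can sometimes beat running slowly.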
Idle and sleep modes ensure that the system merely sips power when not in use, while at the same time providing sufficient power for intensive number-crunching sessions. The silicon itself is built with a technology that ensures ultra-low power consumption.
“Engineers have now implemented an effective clock circuit strategy to ensure that unused processes can be shut down and consumption kept to a minimum. Moreover, in our pursuit of better performance, we have moved away from proprietary cores towards ARM cores. This has enabled us to have different products, each targeted towards high performance, standard low power and even ultra-low power,” explains Affa Au, director-marketing, South Asia, Fujitsu Semiconductor Asia (Pte) Ltd.
“While a slow-down strategy will save power, it will certainly affect computing capability. On-the-fly clock-speed control helps a lot in such cases: the device remains slow or static in power-saving mode and speeds up only when necessary,” adds Anand.
Energy efficiency requires more of an evolutionary than revolutionary approach. “In a portable environment such as a handheld industrial test gadget, it is highly desirable for the device to last a complete shift on a single charge and so enabling low-power ‘drowse’ modes when not in use is sensible. If it takes the device 0.5 second to come back up to operational speed, that is likely acceptable. Other industrial devices that have real-time control functions like a robot are more safety-oriented. The controller part probably consumes a small percentage of the energy use compared to the motor drives and so controller power saving is less desirable,” explains Brian Brown, vice president of marketing, embedded computing, Emerson Network Power.
At times it also becomes necessary to remove certain popular features or at least replace them with good alternatives.
“As people tried to make devices cheaper and lower-power, especially on the consumer side, traditional Adobe Flash only added a heavier stack for embedded processors to implement. The Flash player runs to about 2.5 million lines of code, which takes quite a bit of computation power, and the resulting impact on battery life is also a concern. This was one of the main reasons for Adobe to discontinue Flash support on the embedded side in favour of HTML 5, which is gaining wide acceptance today,” explain Mrinmoy Purkayastha, associate vice president-marketing, and Somenath Nag, director-business development and marketing, ALTEN Calsoft Labs.
Since the Internet of Things is already here, designers also need to consider the penalties of having an always-connected device.
Pratul Schroff, MD of eInfoChips, advises, “In these cases, one approach is to scale down the processing or clock speed so that the device needs only low power. The operating frequency in a modern mobile device is typically 1GHz, but we scale it down to 20MHz whenever that is enough to handle the data, so power consumption remains controlled. The device then runs at 1/50th of its standard speed and reaps the benefit of lower power consumption.”
Often, the connected peripherals are also to blame for inadequate battery life. Milind Gandhe, associate vice president, semiconductor business line, explains how Sasken Communication Technologies ensures low power consumption in its designs: “Power consumption of the device is directly proportional to the leakage current, so a lot of our designs centre on minimising leakage current. A key challenge in reducing power consumption in these systems is to ensure that the power drawn by the peripherals is low.”
You can influence your system’s uptime
It is time designers stopped pointing to battery technology as the reason for awfully low uptime in their devices. As Stephen R. Covey once said, “Proactive people focus their efforts on their circle of influence.” Designing an efficient circuit is within your influence. So go make it count.
The author is a tech correspondent at EFY Bengaluru