Learn about old and new methods and how innovations may change old habits.
Many batteries are equipped with a state-of-charge (SoC) gauge that shows the remaining charge. While this is helpful, the readout is incomplete without also tracking capacity as the battery fades. The user may be accustomed to a battery that delivers full capacity, but this condition is temporary and cannot be maintained. Capacity is the primary indicator of battery state-of-health (SoH) and should be part of the battery management system (BMS). Knowing SoC and SoH provides state-of-function (SoF), the ultimate confidence of readiness, but the technology to provide this information effectively is still maturing.
Building a better BMS is a challenge considering that we still lack a dependable method to read state-of-charge, the most basic measure of a battery. (See BU-903: How to Measure State-of-charge.) Reading the remaining energy in a battery is more complex than dispensing liquid fuel. A fuel tank has fixed dimensions and delivers fuel that can be metered with great accuracy; an electrochemical storage system, by contrast, shrinks in capacity as it ages, and the in- and out-flowing coulombs cannot be tracked with the same precision.
The BMS also provides protection when charging and discharging; it disconnects the battery if set limits are exceeded or if a failure occurs. Established BMS standards are the SMBus (System Management Bus) used for mostly portable applications, as well as the CAN Bus (Controller Area Network) and the simpler LIN Bus (Local Interconnect Network) for automotive use.
Stationary batteries were among the first to include supervisory systems, and the most basic is voltage monitoring of individual cells. Some systems also include cell temperature and current measurement. Recording a slight difference in cell temperature hints at a problem, and measuring the voltage drop of each cell at a given load reveals cell resistance. Dry-out, corrosion, plate separation and other malfunctions can thus be identified.
Although the BMS is effective in detecting anomalies, capacity fade, the most predictable health indicator, is difficult to estimate because voltage and internal resistance are often unaffected by it. The ability to read capacity fade from 100 down to 70 percent would be valuable, but most BMS cannot do this effectively, and a battery might be given a clean bill of health even after its capacity has dropped to 50 percent. Most BMS respond only to anomalies that lie outside capacity estimation, such as voltage differences among cells caused by imbalance and a change in internal resistance.
Some industrial and medical device manufacturers use a date stamp to determine the end of battery life, others observe the cycle count. While counting cycles may be simplistic, no convention exists that defines a cycle and some systems simply call it a cycle when the battery is charged. (See BU-501: Basics About Discharging.) Date-stamping has similar shortcomings in that it promotes premature replacement of batteries that are seldom used, while the heavy hitters may stay in service too long. (See BU-803: Can Batteries be Restored?) To reduce risk of failure, authorities mandate early replacement, and a two-year service life is common. Prolonged storage will give the batteries a very short working life.
Biomedical engineers are aware that most batteries are replaced too soon. iPhone owners have complained that their smartphones show 100 percent charge when the battery is only 90 percent charged. Even military leaders say that their battery arsenal for combat is so poor that many soldiers carry rocks instead of batteries. Effective battery management is either missing or is inadequate. Over-expectations with BMS are common and the user is stunned when stranded without battery power.
Let’s look at how a BMS works, note the shortcomings and examine up-and-coming technologies that could change the way batteries are monitored.
A BMS takes the imprint of the “chemical battery” during charging and discharge and establishes the “digital battery” that communicates with the user. Figure 1 illustrates the battery components consisting of stored energy, the empty portion that can be refilled and the inactive part that is permanently lost. Rated capacity refers to the manufacturer’s specified capacity in Ah (ampere-hours) that is only valid when the battery is new; available capacity designates the true energy storage capability derived by deducting the inactive part. State-of-charge (SoC) refers to the stored energy, which also includes the inactive part.
Figure 1: Three parts of a battery
A battery consists of stored energy, the empty portion that can be recharged and the inactive portion that is permanently lost due to aging.
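The three-part model of Figure 1 can be expressed as a small sketch. The class and field names are illustrative, and the numbers in the example are made up to show the arithmetic, not taken from the article:

```python
from dataclasses import dataclass

@dataclass
class Battery:
    rated_ah: float      # manufacturer's rated capacity in Ah (valid when new)
    inactive_ah: float   # capacity permanently lost to aging
    stored_ah: float     # charge currently held

    @property
    def available_ah(self) -> float:
        # available capacity = rated capacity minus the inactive part
        return self.rated_ah - self.inactive_ah

    @property
    def empty_ah(self) -> float:
        # the empty portion that can still be refilled
        return self.available_ah - self.stored_ah

# a 100 Ah battery that has lost 20 Ah to aging and holds 50 Ah
b = Battery(rated_ah=100.0, inactive_ah=20.0, stored_ah=50.0)
print(b.available_ah)  # 80.0
print(b.empty_ah)      # 30.0
```

A BMS programmed only to the rated capacity would overstate what this battery can deliver; the inactive part must be deducted to obtain the available capacity.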
A BMS is programmed to a rated capacity and it measures the in-and-outflowing coulombs that relate to the available capacity. As the capacity drops, the coulomb count decreases and this discrepancy enables capacity estimation. The most accurate readings are possible when counting the coulombs from a fully discharged battery during a complete charge or discharging a fully charged battery to the cut-off point. Such clean starts are seldom possible and real-life capacity estimations get muddled over time.
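The coulomb-counting principle described above can be sketched as follows. This is a simplified illustration, not a production BMS algorithm; the capacity "learning" step assumes the clean full discharge-to-full charge that, as noted, is seldom possible in real life:

```python
class CoulombCounter:
    """Illustrative coulomb counter; real BMS add error correction and filtering."""

    def __init__(self, rated_ah: float):
        self.capacity_ah = rated_ah  # best current estimate of capacity
        self.counted_ah = 0.0        # coulombs on hand, in Ah

    def step(self, current_a: float, dt_s: float) -> None:
        # Integrate current over time: positive = charging, negative = discharging.
        self.counted_ah += current_a * dt_s / 3600.0
        self.counted_ah = max(0.0, min(self.counted_ah, self.capacity_ah))

    @property
    def soc(self) -> float:
        return self.counted_ah / self.capacity_ah

    def learn_capacity(self, full_charge_ah: float) -> None:
        # After a full discharge followed by a complete charge, the coulombs
        # accepted reveal the true (faded) capacity -- the "clean start."
        self.capacity_ah = full_charge_ah
        self.counted_ah = full_charge_ah

bms = CoulombCounter(rated_ah=60.0)
bms.learn_capacity(full_charge_ah=48.0)  # battery has faded to 80 percent
bms.step(current_a=-12.0, dt_s=3600)     # discharge 12 A for one hour
print(round(bms.soc, 2))  # 0.75
```

The discrepancy between the rated 60 Ah and the counted 48 Ah is exactly the capacity-fade signal the text describes; without the clean start, that discrepancy gets muddled over time.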
A BMS sets flags when registering a full discharge and a full charge. During a rest period, an advanced BMS may also calculate SoC from the stable open-circuit voltage and begin counting the coulombs during charge and discharge from that vantage point. Some BMS also look at voltage recovery after a load is removed to estimate SoC and/or SoH.
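The open-circuit-voltage (OCV) reset can be sketched as a simple table lookup with interpolation. The voltage-to-SoC pairs below are hypothetical values for a rested 12 V lead acid battery; real curves depend on chemistry, temperature and rest time:

```python
# Hypothetical OCV-to-SoC table for a rested 12 V lead acid battery.
OCV_TABLE = [(11.9, 0.0), (12.0, 0.25), (12.2, 0.5), (12.4, 0.75), (12.6, 1.0)]

def soc_from_ocv(ocv_v: float) -> float:
    """Linearly interpolate SoC from a stable open-circuit voltage."""
    if ocv_v <= OCV_TABLE[0][0]:
        return 0.0
    if ocv_v >= OCV_TABLE[-1][0]:
        return 1.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v0 <= ocv_v <= v1:
            return s0 + (s1 - s0) * (ocv_v - v0) / (v1 - v0)

print(round(soc_from_ocv(12.3), 3))  # 0.625
```

A BMS would use such a reading to re-anchor its coulomb count after a rest period, limiting the drift that accumulates between clean starts.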
The old Volkswagen Beetle had minimal battery problems. Its charging system applied charge to the battery and, via a relay-operated regulator, burned off the surplus energy in a resistor while cruising. The car had no parasitic loads when parked.
Since then, modern vehicles have been inundated with onboard electronics to enhance safety, convenience, comfort and pleasure; features no one knew were needed. For the accessories to function reliably, the state-of-charge of the battery must be known at all times. This is especially critical with start-stop technology that is being adopted worldwide.
When the engine of a start-stop car is off at a red light, the battery draws 25–50 amperes to feed the lights, ventilators, windshield wipers and other accessories. The battery must have enough charge to crank the engine, which requires an additional 350A for a brief moment. When the engine runs again and the car accelerates to the posted speed limit, the battery only begins charging after a 10-second delay, a deferral that allows channeling all energy into vehicle acceleration. Once back in charge mode, the lead acid battery is notoriously slow to charge.
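The demands above translate into a simple ampere-hour budget. The figures in this sketch are illustrative picks from the ranges in the text (40 A accessory draw, a one-second cranking burst), meant only to show the arithmetic:

```python
def ah_drawn(current_a: float, seconds: float) -> float:
    # ampere-hours = amperes x hours
    return current_a * seconds / 3600.0

# accessories at a 60-second red light, drawing 40 A
idle = ah_drawn(40.0, 60)
# brief 350 A cranking burst, assumed here to last about one second
crank = ah_drawn(350.0, 1.0)
print(round(idle, 2), round(crank, 2))  # 0.67 0.1
```

Each stop therefore costs well under one ampere-hour, but with roughly 2,000 micro cycles per year and a 10-second charging deferral after every start, the withdrawals add up faster than a slow-charging lead acid battery can comfortably replenish them.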
To provide vital battery information, luxury cars are fitted with a battery sensor that measures voltage, current and temperature. Figure 2 illustrates the electronic battery monitor (EBM) packaged in a small housing forming part of the positive battery clamp.
Figure 2: Battery sensor for starter battery
The sensor reads voltage, current and temperature to estimate state-of-charge and detect anomalies; capacity assessment is not possible.
The EBM works well when the battery is new, but most sensors do not adjust correctly to aging. The SoC accuracy of a new battery is about +/–10 percent. With aging, the EBM begins to drift and the accuracy can drop to 20 percent or worse. This is in part connected to capacity fade, a value most BMS cannot estimate effectively. It is not an oversight by engineers; they fully understand the complexities and shortcomings involved.
A typical start-stop vehicle goes through about 2,000 micro cycles per year. Such a strain would reduce the capacity of a standard starter battery to about 60 percent and carmakers use different battery systems that include AGM and the Advanced Lead-carbon. (Also see BU-806a: How Heat and Loading affect Battery Life)
Automakers want to ensure that no driver gets stranded in traffic with a dead battery. To conserve energy, modern cars turn off unnecessary accessories when the battery is low on charge and the motor stays on at a stoplight. Even with this measure, the state-of-charge can remain low if commuting in gridlock traffic because an idling motor does not provide much charge to the battery. With lights, windshield wipers and electric heating elements engaged there could be a net discharge.
Battery monitoring is also important on hybrid vehicles to optimize charge levels. Intelligent charge management prevents overcharge and avoids deep discharge. When the charge level is low, the internal combustion engine (ICE) engages earlier than normal and is left running longer for additional charge. On a fully charged battery the ICE turns off and the car moves on electric energy in slow traffic.
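The charge management described above amounts to a hysteresis band: engage the internal combustion engine below a low threshold, release it at a high one, and otherwise keep the current state. The thresholds below are illustrative, not figures from any particular vehicle:

```python
def ice_should_run(soc: float, ice_running: bool,
                   low: float = 0.4, high: float = 0.8) -> bool:
    """Hysteresis band for ICE charge control (thresholds are illustrative)."""
    if soc <= low:
        return True      # charge level low: engage the ICE early for charging
    if soc >= high:
        return False     # battery full: turn ICE off, drive on electric energy
    return ice_running   # inside the band: keep the current state

state = False
for soc in (0.5, 0.35, 0.6, 0.85):
    state = ice_should_run(soc, state)
    print(soc, state)
```

The band prevents rapid on/off cycling of the engine: once engaged at low charge, the ICE is left running until the battery is well charged, matching the "left running longer" behavior the text describes.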
An EV driver expects the same accuracy in reading the energy reserve as is possible with a fuel-powered vehicle, but current technology does not allow this. To compensate, the EV battery is overrated and the fuel gauge is adjusted to preserve extra energy when the charge drops low to cover for inaccuracies. The EV driver is advised not to let the charge go too low but to charge more often. A mid-charge range is best for the battery.
The EV driver also anticipates the same driving range as the car ages. This is not possible and the drivable distance gets shorter with each passing year, but the BMS makes allowances. A new battery may only charge to about 80 percent and discharge to 30 percent. As the capacity fades, the bandwidth gradually widens, providing a similar driving range to that of a new battery. The distances traveled will be noticeably shorter when driving in cold temperatures because of reduced battery performance, and once the battery has aged beyond the energy compensation band of the BMS. (See BU-1003: Electric Vehicle.)
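The widening charge window can be sketched numerically. Assume, purely for illustration, a pack rated at 60 Ah of which the driver needs 30 Ah per trip; when new, the BMS uses the 30–80 percent band from the text, and as state-of-health (SoH) falls, the window grows to keep the same deliverable energy:

```python
def charge_window(soh: float, target_ah: float = 30.0,
                  new_capacity_ah: float = 60.0):
    """
    Widen the usable SoC window as capacity fades so the same energy
    (target_ah) remains drivable. All numbers are illustrative; a new
    pack here uses 30%..80% of its capacity (mid-point 55%).
    """
    actual_ah = new_capacity_ah * soh
    needed = target_ah / actual_ah  # fraction of remaining capacity required
    if needed > 1.0:
        return None  # beyond the compensation band: range must shrink
    low = max(0.0, 0.55 - needed / 2)   # keep the window centred as when new
    high = min(1.0, low + needed)
    return round(low, 2), round(high, 2)

print(charge_window(1.0))  # (0.3, 0.8) -- the new-pack window from the text
print(charge_window(0.8))  # faded pack: window widens to cover the same Ah
```

Once SoH drops far enough that even a 0–100 percent window cannot supply the target energy, the function returns `None`: the compensation band is exhausted and the driving range visibly shrinks, as the article notes.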
The EBM has limitations in that it cannot estimate capacity effectively. This can be overcome by adding capacity estimations. (See BU-904: How to Measure Capacity) Figure 3 shows a BMS with common sensing points to which the ability to measure capacity has been added. Spectro™ stands for electrochemical impedance spectroscopy (EIS) with complex modeling. This converts a simple battery sensor to the state-of-function (SoF) level.
Figure 3: Spectro-BMS™ adds capacity as a key element to estimate battery state-of-health.
Knowing SoF improves battery validation, but some device manufacturers refuse to reveal capacity readings below 100 percent to the consumer, especially during the warranty period. To conceal unwanted information, the data can be made code-accessible for service personnel only. (See also BU-602: How does a Battery Fuel Gauge Work?)
Consumer concerns aside, SoF signifies a momentous improvement to BMS in terms of battery reliability, as it tracks capacity fade and calculates the true runtime from the available energy. A capacity-based BMS can also predict eventual replacement, an issue that cannot be fully satisfied with current BMS technologies. Future BMS will combine the information of the “digital battery” with that of the “chemical battery” to provide reliable SoF data through advanced learning algorithms.
Last updated 2016-01-29