Understanding test challenges for emerging battery management systems

Article By : Brent Rousseau, Teradyne

This article focuses on current and future trends in BMS devices for automotive applications, and some key test challenges the industry is working to address.

Battery cell stacks and battery management systems (BMS) are already in use all around us, from power tools, robotic vacuum cleaners, and drones to micro-mobility applications like eScooters and eBikes. Less noticeable items like uninterruptible power supplies (UPS) and renewable energy storage require large numbers of battery cells. Every battery stack needs to be monitored to ensure that it is being safely charged and discharged, and to measure the overall health of the battery. Rechargeable batteries present challenges that require testing to very accurate voltage levels. Additionally, because batteries are tested in a stack, precision measurements must be made at high common-mode voltages. The trend going forward is to add more cells to battery stacks as a way to drive toward higher-voltage systems.

How battery management systems work

A BMS device typically consists of a number of cell measurement pins (12 to 24 or more channels) that feed into a front-end analog-to-digital converter (ADC). This ADC measures the voltage of each cell, allowing precise measurement of the individual battery cells. There is an additional pair of cell-balancing pins per cell, which also have ADC inputs; the purpose of these pins is to help balance the voltages between the cells in the stack. The remaining pins are power supply pins and analog and digital control lines. Each stack of cells in a battery requires a BMS, so in some cases there could be six to twelve or more BMS devices in a single electric vehicle, not counting devices used for redundancy. These devices are typically powered from the lower and upper rails in each of the cell modules, so each BMS device floats on the BMS or cell module below it. As a result, all of these devices require a digitally isolated, daisy-chained communication link to one another, and out to a master controller, typically a microcontroller unit (MCU).
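The architecture described above can be sketched as a simple data model: each device monitors one module of series cells, and a master polls every device over the daisy chain. All names, cell counts, and voltages below are illustrative assumptions, not part of any real BMS API.

```python
# Minimal data model of a daisy-chained BMS stack (illustrative only).
from dataclasses import dataclass, field

@dataclass
class BmsDevice:
    """One BMS device monitoring a module of series battery cells."""
    cell_voltages: list[float]  # per-cell front-end ADC readings, in volts

    def module_voltage(self) -> float:
        return sum(self.cell_voltages)

@dataclass
class BmsChain:
    """Devices daisy-chained up the pack; each floats on the one below."""
    devices: list[BmsDevice] = field(default_factory=list)

    def pack_voltage(self) -> float:
        return sum(d.module_voltage() for d in self.devices)

    def read_all(self) -> list[list[float]]:
        # A master MCU polls each device in turn over the isolated chain.
        return [d.cell_voltages for d in self.devices]

# Example: 6 modules of 12 cells each at a nominal 3.7 V per cell.
chain = BmsChain([BmsDevice([3.7] * 12) for _ in range(6)])
print(round(chain.pack_voltage(), 1))
```

In real hardware the per-device reads travel over an isolated, asynchronous serial link rather than direct function calls; the sketch only captures the topology.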

Electric vehicles will drive BMS advances

The largest market for battery systems going forward, and the main focus of this article, is automotive applications. These include (see Figure 1) full battery electric vehicles (BEVs, 400 Volts and up), internal combustion engine (ICE) vehicles with start/stop technology (typically 48 Volt systems), and both mild hybrid (48V battery-driven) and full hybrid EVs. In 2022, less than 5% of new vehicle sales were electric vehicles; however, many automotive manufacturers expect electric vehicles to ramp to 50% of sales by 2030. Given this, electric vehicle technology is one of the fastest-growing markets for many semiconductor manufacturers.

Figure 1: Application of BMS on different types of vehicles

There are some key deliverables required to drive electric vehicle adoption, the first of which is the need for a higher-quality BMS, which affects driving range. With a more accurate battery management system, the consumer can get more range from the same battery pack. For example, if the BMS can sense with 1% accuracy or better, the battery can be charged closer to its maximum storage level. Think of it like guard-banding: with a 5% margin for error, the battery should only be used between 15% and 85% of its capacity. If the BMS is more accurate, there is less need to guard-band the usable charge in the battery. Going from 5% error to 1% error allows the use of 8% more of the stored charge (4% at each end of the range), translating to more miles per charge. See Figure 2 for details.
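The guard-banding arithmetic above can be worked through explicitly. The sketch below assumes a baseline 10-90% usable window with a symmetric guard band equal to the measurement error at each end (these baseline values are an assumption for illustration, consistent with the 15-85% figure quoted for a 5% error):

```python
# Usable state-of-charge (SOC) window given BMS measurement error.
# Assumes a baseline 10-90% window, narrowed by the error at each end
# (illustrative values only).
BASELINE_LOW, BASELINE_HIGH = 10.0, 90.0  # percent SOC

def usable_window(error_pct: float) -> tuple[float, float]:
    """Return (low, high) usable SOC limits for a given error margin."""
    return BASELINE_LOW + error_pct, BASELINE_HIGH - error_pct

def usable_capacity(error_pct: float) -> float:
    """Usable fraction of the pack, in percentage points of SOC."""
    low, high = usable_window(error_pct)
    return high - low

# 5% error -> 15-85% window; 1% error -> 11-89% window.
gain = usable_capacity(1.0) - usable_capacity(5.0)
print(usable_window(5.0), usable_window(1.0), gain)
```

The 8-point gain in usable capacity is what translates directly into extra miles per charge.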

Figure 2: Battery Charge Limits

Secondly, in terms of safety and reliability, precise state of charge (SOC) enables greater battery utilization while maintaining battery safety (avoiding catastrophic failure) and increasing range projection accuracy. Greater battery utilization and efficiency also allows for smaller, lower weight battery packs, driving down vehicle cost.

There are a few trends driving changes in electric vehicle batteries. The first, as already mentioned above, is better accuracy, which translates directly to more mileage between recharging and more moderate aging of cells. Better “fuel gauge” accuracy also improves driver safety and confidence.

The second big trend is that battery stacks are moving to higher voltages, requiring more cells in a stack, which drives a need for more front-end ADC channels and cell-balancing pins per BMS device. Mainstream battery packs today run at approximately 400 Volts, with some performance EVs already using 800 Volt systems. Looking forward, these levels are expected to trend up to 1000+ Volts in just a couple of years, enabling faster charging that helps EVs achieve recharging times closer to the time it takes to refuel an internal combustion engine vehicle. These capabilities create a competitive advantage for semiconductor suppliers who can add value to a battery pack.

However, higher overall voltages mean more batteries need to be packed into the EV. Currently, manufacturers assemble batteries into modules, then assemble these modules into battery packs. This modularization requires more interconnections, which drive up the cost and weight of the battery packs. Cell-to-pack architecture is a novel approach that puts the battery cells directly in the battery pack, avoiding the module carrier model. Moving to a cell-to-pack architecture means that manufacturers can either fit the same number of cells into a smaller, lighter-weight assembly or add more cells into the existing area. This may lead to more cells in a stack, which translates to more front-end ADC channels on the BMS device.

ATE testing challenges

These trends in battery management systems create new challenges for automated test equipment (ATE) companies. The first challenge concerns the improved accuracy of the BMS. When measuring the discharge curve of a battery, the majority of the usable range falls along a very flat portion of the curve. The full Li-ion state of charge (SOC) range, from 100% down to 0%, spans ~4.3V (fully charged) down to 2.2V (discharged). Looking at this full range, measuring the change appears to be an easy task (a ~2.1V voltage range, or 21mV per 1% SOC change).

A typical Li-ion discharge cycle uses 80-20% or 90-10% of the full battery range. In the 80-20% region, the SOC voltage curve is quite flat at 3.75-3.65V (~100mV total, or 1.7mV per 1% SOC change). This explains why BMS suppliers are targeting 100uV or even 50uV of measurement accuracy in the 5V range. Figure 3 below illustrates a typical Li-ion discharge voltage curve.
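The per-percent voltage slopes quoted above follow from simple arithmetic on the curve's endpoints (cell voltages taken from the figures in this article; the calculation itself is just a sketch):

```python
# Voltage change per 1% SOC over two regions of a Li-ion discharge curve.
def mv_per_pct(v_high: float, v_low: float, soc_span_pct: float) -> float:
    """Slope in millivolts per 1% state-of-charge change."""
    return (v_high - v_low) * 1000.0 / soc_span_pct

# Full range: ~4.3 V down to 2.2 V over 100% of SOC -> ~21 mV per 1% SOC.
full_range = mv_per_pct(4.3, 2.2, 100.0)

# Flat region: 3.75 V to 3.65 V over the 80-20% span (60 points of SOC)
# -> ~1.7 mV per 1% SOC.
flat_region = mv_per_pct(3.75, 3.65, 60.0)

print(round(full_range, 1), round(flat_region, 2))
```

The roughly 12x smaller slope in the flat region is why sub-millivolt resolution, and hence 100uV-class tester accuracy, matters for SOC estimation.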

Figure 3: Typical Li-ion discharge voltage curve

This sub-100uV level of accuracy directly translates to the forcing accuracy of the ATE stimulus channel driving the ADC input for each cell of a BMS device. While this should be easily achievable on a low-voltage supply, the accuracy must hold across a 5-6V range in some test cases. The stimulus must also be very low-noise and very low-drift so that a fixed, known voltage biases the ADC input.

The second challenge is that a BMS device can have 16 to 24 or more cell inputs that must be biased with common-mode voltages to mimic a stack of battery cells. In some cases this can exceed 120 Volts for the upper cell pins. This drives a requirement for dense, high-precision, low-noise floating voltage/current sources (VIs) from the ATE. Because of how these devices are tested, there is typically a need to multiplex a high-precision measurement system to all cell pins and the corresponding discharge pins of the device. This can require a large number of relays on the device interface board (DIB) if the ATE system does not have sufficient internal matrices, consuming a very large application area on the DIB. In most cases, semiconductor manufacturers want to test as many devices as possible in parallel to drive down the cost of test. For BMS devices this is especially important due to the extremely high volumes that BEVs are expected to require in the near future. The ideal ATE system includes all of the matrices and multiplexing built into the system to allow for maximum site count.

One other challenge is the daisy-chained communication from each BMS device out to the MCU. All of the BMS devices send data onto this bus. Since the bus runs inside the battery cell modules or cell pack, it can be a very noisy environment. The communication also occurs at stacked voltages, which requires an isolated interface. This isolated communication typically runs asynchronously and requires enough digital instrument capability to handle the unique pattern generation. Key ATE features needed in this case are on-the-fly format and period switching, and the ability of the digital instrument to read asynchronous data.

Batteries and BMS devices are evolving to fit the needs of the automotive market. These improvements will drive the need for new test methodologies. From 1000+ Volt systems to new battery chemistries, many challenges will arrive in a short time. The explosive growth in EVs will necessitate new tester capacity with short production ramps, requiring ATE providers to stay ahead of BMS requirements and provide cost-effective solutions at high site counts.


This article was originally published on Power Electronics News.

