As we seek to reduce power consumption, we must comply with standards. The keys are accurate voltage and current measurements, power calculations, and minimized harmonics.
In the 1990s, experts calculated that without efforts to increase efficiencies and reduce "no-load" power consumption, external power supplies—the ones used to power everything from cell phones to game consoles and laptops—would account for a shocking 30 percent of total energy consumption in less than 20 years. Power draw has since improved, but power supplies must now be tested to prove that they meet regulations.
At the time, the efficiency of power supplies could be as low as 50 percent and, even worse, they would still draw considerable power when the devices they were powering were turned off. Beginning in 1992, the US Environmental Protection Agency launched voluntary programs to promote energy efficiency, including the Energy Star program. Not until 2004, when the California Energy Commission (CEC) got into the act, did governments begin to set mandatory standards for efficiency and no-load power draw.
Since then, the global regulatory environment surrounding external power supply efficiency has rapidly evolved, confronting the electronics industry with many challenges and uncertainties as it tries to keep up. Still, there’s little question that the effort has been worthwhile. The EPA, for instance, estimates that external power supply efficiency regulations implemented over the past decade have reduced energy consumption by 32 billion kilowatt-hours, saving $2.5 billion annually and reducing CO2 emissions by more than 24 million tons per year.
In the U.S., power efficiency standards have remained relatively stable since the Department of Energy (DOE) published Level VI standards that went into effect in February 2016. Looking globally, however, the situation is murkier. The next round of legislation is expected to come from Europe once the current voluntary code of conduct (CoC) becomes mandatory. The EU’s more stringent CoC Version 5 Tier 2 voluntary standard was supposed to go into effect in January 2018, but final implementation has been delayed for a year or two.
Regardless of whether the standards are voluntary or mandatory, we shouldn't assume that authorities are slowing down the pace of regulatory advancement. Power supply designers face difficult challenges to continually improve efficiency while still maintaining specified performance over a range of input and load conditions. Beyond efficiency, designers also must cope with international regulations for current harmonics, safety, and EMI/EMC.
Modifying a power supply to meet stricter standards, such as a lower no-load power consumption limit, is typically not a simple component change. In such a case, for example, the control IC may need to be changed to one that draws less power, and energy-saving techniques like pulse skipping may need to be implemented. In turn, changes to primary-side circuitry may require that products be resubmitted to have various safety certifications updated. Both the redesign and the re-certification process are expensive, underscoring the need for pre-compliance testing early in the design cycle to help reduce re-runs at third-party compliance labs.
Test method overview
Although stricter standards are on the horizon, the principles and underlying method used for DOE Level VI testing apply to any AC-DC power supply and have broad applicability. The official test methods are published under Title 10 of the U.S. Code of Federal Regulations, Appendix Z to Subpart B of Part 430, and are available from the U.S. Government Printing Office.
Level VI testing requires that an external power supply be tested under load and no-load conditions. After a 30-minute warm-up period, AC input power must be monitored for 5 minutes to check for device stability. The power level drift must stay within 1% of the maximum value observed, and the 5-minute stability check must be performed at each loading condition. Under Level VI, the unit under test must be tested at 100%, 75%, 50%, 25%, and 0% of de-rated load (±2%), in that specific order. For power supplies that you expect to be on the market for some time, you will also want to refer to CoC Version 5 Tier 2, which adds a requirement for testing at 10% load, a low-load condition where many types of power supplies have been notably inefficient.
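The sequencing and stability logic above can be sketched in a short script. This is a minimal illustration only: the `read_input_power` callback and the sampling interval are assumptions standing in for real instrument I/O, which you would replace with your power analyzer's own API.

```python
import time

# DOE Level VI load sequence, as fractions of de-rated load, in required order.
LOAD_POINTS = [1.00, 0.75, 0.50, 0.25, 0.00]
# CoC Version 5 Tier 2 adds a 10% point; insert 0.10 before 0.00 if needed.

def is_stable(samples, tolerance=0.01):
    """True if every sample drifts no more than 1% from the maximum observed."""
    peak = max(samples)
    if peak == 0:
        return True  # all-zero readings: nothing to drift against
    return all((peak - s) / peak <= tolerance for s in samples)

def run_stability_check(read_input_power, duration_s=300, interval_s=10):
    """Sample AC input power for 5 minutes and apply the 1% drift criterion."""
    samples = []
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        samples.append(read_input_power())
        time.sleep(interval_s)
    return is_stable(samples)
```

In practice you would run `run_stability_check` once per entry in `LOAD_POINTS`, after setting the electronic load, and only record efficiency data for points that pass.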
Per the standards, power efficiency is calculated in real time as a ratio, expressed as a percentage, of the output active power to the AC input active power at each loading condition and recorded separately. The power that the supply consumes but does not convert to output power, on the other hand, is calculated as the difference between the active AC input power and the active output power at each loading condition and again reported separately.
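As a worked example, the two figures reported per loading condition reduce to simple arithmetic. The numbers below are illustrative only, not taken from any standard.

```python
def efficiency_pct(p_out_w, p_in_w):
    """Efficiency: active output power over active AC input power, as a percentage."""
    return 100.0 * p_out_w / p_in_w

def power_loss_w(p_out_w, p_in_w):
    """Power consumed by the supply but not delivered to the output."""
    return p_in_w - p_out_w

# Illustrative 60 W adapter at full load, drawing 68.2 W from the AC line:
print(efficiency_pct(60.0, 68.2))  # ~88% efficient
print(power_loss_w(60.0, 68.2))    # ~8.2 W of internal loss
```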
Note that no-load power consumption testing must also conform to the requirements specified in “Test Method for Calculating the Energy Efficiency of Single-Voltage External AC-DC and AC-AC Power Supplies” published by the California Energy Commission.
Test equipment requirements
The primary workhorse for power-supply efficiency testing is the power analyzer, which measures input and output power on AC-DC power supplies and computes efficiency. The requirements for power analyzers are defined by the IEC 62301 standard. In general, the standard requires the equipment to have an uncertainty rating of 1 or 2 percent at the 95 percent confidence level, depending on the power level.
The standards also define requirements for an active or passive load, which is required to set output current of the DUT while testing efficiency. Electronic loads, though relatively expensive, can provide significant benefits over passive loads due to their versatility and programmability and can also help automate efficiency testing, reducing testing time and minimizing user error. Both passive and electronic loads need to maintain the load current at all required output voltages with an accuracy of ±0.5%.
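The ±0.5% load-accuracy requirement is easy to verify in software when a programmable electronic load can read back its actual output current. A minimal sketch, assuming your load driver exposes the setpoint and a measured readback:

```python
def load_current_in_spec(setpoint_a, measured_a, tolerance=0.005):
    """Check that the load holds current within ±0.5% of the setpoint."""
    if setpoint_a == 0:
        return measured_a == 0  # no-load point: expect zero current
    return abs(measured_a - setpoint_a) / setpoint_a <= tolerance

# A 3.000 A setpoint measured as 2.988 A is 0.4% low: still within spec.
print(load_current_in_spec(3.000, 2.988))  # True
```

An automated test sequence would run this check at each required output voltage and loading condition before logging efficiency data.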
In addition to the load, AC sources are required in most cases for efficiency testing to regulatory standards. The input voltage from an AC wall socket is unreliable and can vary significantly with factors like time of day, wiring impedance, and load variations. Electronic AC sources solve these problems by providing a traceable and reliable voltage output with a calibrated tolerance within the required standards. Most electronic AC sources are also programmable, which makes it easy to automate efficiency testing and reduce time and user errors. AC sources used for efficiency testing per the standards must hold the input voltage and frequency within ±1 percent of the specified values and must keep the total harmonic content (THC) of the input voltage at or below 2 percent, up to and including the 13th harmonic.
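The THC limit can be checked directly from the harmonic RMS voltages that many power analyzers report. A hedged sketch of that calculation, assuming the analyzer provides per-harmonic RMS values (the voltages below are made-up illustrative readings):

```python
import math

def total_harmonic_content_pct(harmonic_rms):
    """THC as a percentage of the fundamental, using harmonics 2 through 13.

    harmonic_rms: list of RMS voltages [V1, V2, ..., V13], index 0 = fundamental.
    """
    v1 = harmonic_rms[0]
    # RSS of harmonics 2..13 (list indices 1..12), per the 13th-harmonic limit.
    distortion = math.sqrt(sum(v * v for v in harmonic_rms[1:13]))
    return 100.0 * distortion / v1

# 230 V fundamental with small residue on the 3rd and 5th harmonics:
harmonics = [230.0, 0.0, 2.3, 0.0, 1.15] + [0.0] * 8
print(total_harmonic_content_pct(harmonics))  # ~1.12%, inside the 2% limit
```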
[Continue reading on EDN US: Running the tests]

Josh Brown is a Senior Applications Engineer in the Keithley division of Tektronix.