Use Copper to Temperature-Compensate High Current Measurements

Article By : Jerry Steele

Measuring high currents via cable voltage drop is an economical solution, but how can the large copper tempco be compensated?

Large currents flowing in a length of cable can be measured using the voltage drop along the cable. This eliminates the need for a bulky shunt or an expensive magnetic measurement method. Accuracy is limited, however, by the +0.39%/°C tempco (temperature coefficient) of copper.

Temperature sensors can facilitate compensation, but they are point-measurement devices whose relevance is questionable over the length of a cable. Consider that a mere 2.5°C error, or difference from the actual cable temperature, introduces 1% of error.
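To put numbers on that, here is a minimal sketch (assuming the simple linear copper model, with the current reading taken as voltage drop divided by the nominal cable resistance) of how a temperature mismatch translates into reading error:

# Error introduced when the assumed copper temperature differs from the
# actual cable temperature (uncompensated, or compensated by a point
# sensor that reads the wrong temperature).
ALPHA_CU = 0.0039   # +0.39%/°C copper tempco

def reading_error(temp_mismatch_c):
    """Fractional current-reading error for a temperature mismatch in °C."""
    return ALPHA_CU * temp_mismatch_c

print(f"{reading_error(2.5):.1%}")   # ~1% for a 2.5°C mismatch
print(f"{reading_error(75):.0%}")    # ~29% over the 75°C span tested later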

If at least 10 mV is dropped at maximum current, you can readily measure it with modern zero-drift amplifiers (auto-zero, chopper, etc.), whose ultra-low offset enables accurate sensing of such low full-scale voltage drops.

What remains is what to do about the tempco. The solution proposed in this Design Idea takes advantage of the fact that high-current cables are made up of many fine strands. The example here is based on AWG 4 cable with 1,050 strands of AWG 34 wire.

In Figure 1, the operational amplifier's non-inverting input senses the cable drop at the load end of the cable. The MOSFET sits in the output/feedback path, which continues through the temperature-sensing strand (what would normally be a gain-setting resistor) and ends at the power supply. The circuit forces a drop across this gain-setting element that exactly equals the main cable drop. In this case, of course, the gain-setting element is a single insulated strand of 34-gauge wire (lacquered, like magnet wire) embedded within a custom insulated cable assembly alongside the high-current cable.

Figure 1  Temperature-compensated high-current measurement using a ratioed cable.

AWG 34 = 265.8 Ω/1,000 ft.
AWG 4 = 0.248 Ω/1,000 ft. (source: http://www.brimelectronics.com/AWGchart.HTM)
e.g., 0.474 ft. of #4 = 117.6 µΩ; 10 mV drop @ Iin = 85 A; Iout = 80 mA
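These example numbers are easy to verify; the short sketch below is illustrative only and simply reuses the per-1,000-ft resistance values quoted above:

# Quick check of the Figure 1 example numbers (illustrative only).
R_AWG4_PER_KFT = 0.248      # Ω per 1,000 ft of AWG 4, from the chart above
STRANDS = 1050              # AWG 34 strands making up the AWG 4 cable
I_IN = 85.0                 # A, example full-scale current

length_ft = 0.474                               # example sensed length from Figure 1
r_cable = R_AWG4_PER_KFT / 1000 * length_ft     # ≈117.6 µΩ
v_drop = I_IN * r_cable                         # ≈10 mV across the sensed length
i_out = I_IN / STRANDS                          # ≈81 mA in the sense strand (≈80 mA in the figure)

print(f"R = {r_cable*1e6:.1f} µΩ, V = {v_drop*1e3:.1f} mV, Iout = {i_out*1e3:.0f} mA")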

 

Since the cable is composed of 1,050 strands, this arrangement causes a current to flow in the MOSFET and gain element that is proportional to the total current divided by 1,050. Because the gain element and the cable are both made of copper, and are in intimate thermal contact, output variation over temperature is cancelled.

The feedback current flows out the MOSFET drain through RLoad to ground, providing a ground-referred output voltage.
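Put another way, the loop forces the strand drop to equal the cable drop, so the strand current is the load current scaled by the cable-to-strand resistance ratio; since both are copper at essentially the same temperature, the tempco terms cancel. The sketch below is a minimal model of this behavior, reusing the Figure 1 cable resistance and the 50 Ω load resistor from the test setup described later:

# Minimal model of the feedback loop: the op amp forces the drop across the
# sense strand to equal the drop across the main cable.
ALPHA_CU = 0.0039          # copper tempco, per °C
STRANDS = 1050             # same length and material, so R_strand = STRANDS * R_cable
R_LOAD = 50.0              # Ω, load resistor used in the test setup

def v_out(i_in, r_cable_nom, delta_t):
    """Output voltage for load current i_in with the cable delta_t above nominal."""
    k = 1 + ALPHA_CU * delta_t
    r_cable = r_cable_nom * k               # cable resistance drifts with temperature...
    r_strand = r_cable_nom * STRANDS * k    # ...and the strand drifts by the same factor
    i_strand = (i_in * r_cable) / r_strand  # equal drops -> i_strand = i_in / STRANDS
    return i_strand * R_LOAD

# Identical output at nominal temperature and 75°C hotter: the tempco cancels.
print(v_out(85.0, 117.6e-6, 0), v_out(85.0, 117.6e-6, 75))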

Using the wire strand solves two primary problems with other temperature sensors:

  1. The wire is a "distributed" sensor that runs the length of the cable and better senses the overall temperature effects.
  2. Since the wire is copper like the main cable, temperature compensation is perfect.

 

Actual tests

Our setup used four feet of JSC 1666 AWG 4 cable. The insulation was slit along its length, and 34-gauge magnet wire was inserted under it. An NCS333 op amp was used in the circuit. Since the op amp's common-mode voltage equals its supply rail, it must have rail-to-rail input capability (or be run from a higher supply). Furthermore, it should be a zero-drift (chopper) amplifier, as standard rail-to-rail op amps generally have poor performance near the positive rail.

 


Figure 2  Test setup: Since the sense-wire length affects absolute accuracy, the two grey wires connecting it to the board are heavier gauge.

Measurements

RLoad = 50 Ω, 1%

At no load, Vout reads 94 µV.

At a 10 A load, Vout = 454.6 mV (5.85% error).

At a 58 A load, Vout = 2.604 V (5.7% error).

 

This setup was then placed in a temperature chamber and tested from room temperature up to 100°C. It demonstrated less than 0.1% of additional error. Several factors can contribute to that residual error, such as op-amp offset drift, as well as resistance and thermocouple effects in the cable terminations.

 

Wire-tolerance contribution to errors

To see what might be achievable with practical cables, I found the wire data in Figure 3, which lists a 2% tolerance for 34-gauge wire. One would expect a similar overall tolerance for the 4-gauge. This suggests that commercial wire built to standard tolerances will limit accuracy to roughly 4% from the cable alone, as the worst-case sketch after Figure 3 shows. The electronics would add some error of their own, but that could of course be trimmed by the user, or matched to a supplied cable.

Figure 3  Wire data (source: weicowire.com)
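As a quick worst-case check, assuming the 4-gauge cable carries the same ±2% resistance tolerance listed for the 34-gauge strand:

# Worst-case gain error from wire tolerance alone (illustrative).
TOL_STRAND = 0.02   # ±2% resistance tolerance for AWG 34 (Figure 3)
TOL_CABLE = 0.02    # assumed similar tolerance for the AWG 4 cable

# The output scales with R_cable / R_strand, so the tolerance extremes are:
worst_high = (1 + TOL_CABLE) / (1 - TOL_STRAND) - 1   # about +4.1%
worst_low = (1 - TOL_CABLE) / (1 + TOL_STRAND) - 1    # about -3.9%
print(f"{worst_low:+.1%} to {worst_high:+.1%}")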

 

As a final note, it seems cumbersome to build the cable that accomplishes this function. This concept was triggered by, and is targeted at, OEMs who could specify a custom cable that includes one enameled strand to act as the gain resistor. There are many high-current cable runs in electric and hybrid vehicles that an OEM could take advantage of, eliminating large shunts. This method can provide accuracy and temperature performance competitive with magnetic sensing, but at lower cost, particularly in OEM volumes.

At low volumes, it’s feasible to wrap or otherwise bundle the sensing strand on the outside of the cable; it would still have the advantage of distributed temperature sensing. Because of the cable insulation, sensing would be more responsive to ambient temperature, with a weaker coupling and longer time constant to actual cable copper temperature.

  


Jerry Steele is Applications Manager at ON Semiconductor.
