Extreme-Precision Numbers and Skepticism

Article By : Bill Schweber

It’s normal to be skeptical of data reported with many significant figures or extreme precision, but a NIST thermometer with 0.001°C precision legitimately supports that impressive claim.

One of the many irritations I have with so much of the “reporting” I read is the poor use (intentional or not) of statistics and data. I have seen stories claiming that a change in some parameter from 4% to 8% is a rise of 4 percent, when it is really a rise of four percentage points, or a doubling of the initial value. I have seen misuse of significant figures, such as a “one-kilometer” distance also being reported as 0.6213 miles, which is arithmetically correct but implies precision far beyond the original casual metric measure.
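
To make the distinction concrete, here is a short illustrative snippet (Python, using the same numbers as the examples above) that separates a percentage-point change from a relative change, and shows how reporting too many digits implies precision the casual “one kilometer” never had:

```python
# Illustrative only: the two complaints above, computed explicitly.

old_share = 0.04   # 4%
new_share = 0.08   # 8%

point_rise = (new_share - old_share) * 100                 # 4 percentage points
relative_rise = (new_share - old_share) / old_share * 100  # 100%, i.e., a doubling
print(f"Rise: {point_rise:.0f} percentage points")
print(f"Relative rise: {relative_rise:.0f} percent (a doubling)")

# Significant figures: a casual "one kilometer" does not justify four decimal places.
km = 1.0
print(f"Over-precise: {km * 0.621371:.4f} miles")       # implies sub-meter knowledge
print(f"Matched precision: {km * 0.621371:.1g} miles")  # one significant figure, like the original
```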

Sad to say, I have even seen that mistake in scientific-journal papers; either the author is naïve, to be polite, or the presumed reviewer didn’t do the job. And try explaining the difference between accuracy and precision—often a waste of time. Note that the subject of numerical illiteracy is not new; it was forcefully discussed in the 1988 book “Innumeracy: Mathematical Illiteracy and Its Consequences” by math professor John Allen Paulos (who apparently coined the very descriptive term “innumeracy” as well). Claims of extreme precision and accuracy are especially critical in analog design, as it is very tough to develop a circuit and system with consistent performance to 0.01%, while in the digital world you can claim as much precision as you like just by adding more bits to the numerical representation.
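
As a quick illustration of that last point, consider how easily a digital representation outruns the analog reality behind it; the full-scale value and bit depths here are hypothetical and not tied to any particular converter:

```python
# Illustrative only: more bits buy resolution, not accuracy.

FULL_SCALE_V = 10.0  # hypothetical 10 V full-scale range

for bits in (12, 16, 24):
    lsb_v = FULL_SCALE_V / (2 ** bits)
    print(f"{bits}-bit word: LSB = {lsb_v * 1e6:8.2f} uV "
          f"({100 * lsb_v / FULL_SCALE_V:.6f} % of full scale)")

# A 24-bit word resolves ~0.6 uV steps, but if the analog front end is only
# trustworthy to 0.01% (1 mV on 10 V), the extra digits are empty precision.
```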

That’s why I was somewhat shocked when I saw a headline indicating that a new thermometer could read to 0.001°C. No way, I figured: that’s either a simple error in writing or some excessive zeal in unit conversion. But the story was taken directly from a NIST (National Institute of Standards and Technology) announcement, and the NIST people are pretty careful with their significant figures, precision, accuracy, and related numeracy issues. After all, look at the care they and complementary institutes around the world took when redefining the SI base units in terms of fixed fundamental physical constants, creating the artifact-free SI system, Figure 1 and References 1 through 4.


Figure 1. The recent and significant redefinition of the fundamental SI units, promulgated by NIST and similar world bodies after extensive research and validation, relies on reproducible definitions and eliminates the need for physical artifacts as primary standards. (Image source: NIST)

So I read further, and found that the 0.001°C number was correct: a NIST team had developed a thermal-infrared radiation thermometer (TIRT) for the -50°C (-58°F) to 150°C (302°F) range (corresponding to infrared wavelengths between 8 and 14 micrometers) that can measure temperatures with a precision of a few thousandths of a degree Celsius. Even better, it does not require cryogenic cooling, as many other high-performance infrared temperature sensors do.

The first question, of course, is: who needs that? They say it is needed for medical and scientific studies, such as satellite-based measurement of the surface temperature of oceans. The second question is: how did they achieve this? It’s a combination of a new approach, the usual tactic of understanding and analyzing every error source and then seeking to minimize or eliminate it, and a feedback mechanism to stabilize performance around known points.

NIST’s Ambient-Radiation Thermometer (ART), Figure 2, uses a set of internal thermometers to constantly gauge temperatures at different points in the instrument; those readings are then used in a feedback loop that keeps the 30-cm (12-inch) core cylinder containing the detector assembly at a constant temperature of 23°C (72°F).
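
The NIST announcement does not spell out the control loop itself, so the following is only a minimal sketch of how such temperature stabilization works in principle: internal sensors report the core temperature, and a proportional-integral (PI) controller drives a heater/cooler to hold a toy thermal model at 23°C. The gains, thermal coefficients, and time step are invented for illustration and are not NIST’s.

```python
# Minimal PI temperature-stabilization sketch (all constants hypothetical).

SETPOINT_C = 23.0      # target core temperature
KP, KI = 2.0, 0.05     # proportional and integral gains (made up)
DT = 1.0               # control period, seconds

def simulate(ambient_c=20.0, steps=600):
    core_c = ambient_c     # core starts at ambient temperature
    integral = 0.0
    for _ in range(steps):
        error = SETPOINT_C - core_c
        integral += error * DT
        heater_w = KP * error + KI * integral   # control effort (heat or cool)
        # Toy thermal model: applied heat minus leakage toward ambient.
        core_c += DT * (0.01 * heater_w - 0.002 * (core_c - ambient_c))
    return core_c

print(f"Core temperature after 10 minutes: {simulate():.3f} C")
```

In the real instrument the same idea is applied with calibrated sensors, a characterized thermal plant, and far tighter noise budgets; the point is simply that feedback, not open-loop construction, is what holds the detector environment constant.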


Figure 2. Optical design arrangement of the ART showing the ZnSe lenses and the temperature-stabilized compartment holding the tilted-field stop, aperture stop, lenses, and detectors. The entire assembly to the right of the chopper wheel is temperature stabilized at 23°C; the distance from the objective lens to the pyroelectric detector is about 55 cm. Note that the outer case of the ART is not temperature stabilized. (Image source: NIST)

The unit also includes additional focusing features to reduce errors due to IR radiation getting into the instrument from outside the targeted field of view (called the size-of-source effect), Figure 3. The design is described in full detail in the paper with the very modest title “Improvements in the design of thermal infrared radiation thermometers and sensors,” published in Optics Express from the Optical Society of America.

Figure 3. In the NIST Ambient Radiation Thermometer, (1) infrared light from a fixed-temperature calibrated source (at right, not shown) enters the thermometer enclosure through this lens, which focuses the radiation onto a “field stop,” analogous to the f-stop aperture in photography; (2) a circular metal chopper slices the IR beam into a sequence of pulses; (3) the first lens inside the central cylinder converts the light from the field stop to a parallel beam; (4) the light passes through this insulated cylinder (about 30 cm long) which is temperature-controlled by a feedback system; stray radiation is blocked by another stop; (5) a second lens focuses the light onto a pyroelectric detector; and (6) the detector output is routed to an amplifier that boosts the signal levels. (Image source: NIST)
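
Why chop the beam at all? Turning a near-DC thermal signal into pulses at a known frequency lets the electronics recover a very small signal by averaging it against a reference synchronized to the chopper, which rejects drift and low-frequency noise. The sketch below demonstrates that general idea only; the chopper frequency, noise levels, and demodulation scheme are invented and are not claimed to be the ART’s actual signal chain.

```python
# Illustrative synchronous (chopper-referenced) recovery of a weak chopped signal.
# All numbers are made up; this is not the ART's electronics.

import math
import random

F_CHOP = 37.0     # hypothetical chopper frequency, Hz
FS = 4000.0       # sample rate, Hz
DURATION = 10.0   # seconds of data
SIGNAL = 0.01     # amplitude of the chopped signal (arbitrary units)

random.seed(0)
n = int(FS * DURATION)
accum = 0.0
for i in range(n):
    t = i / FS
    open_phase = math.sin(2 * math.pi * F_CHOP * t) > 0   # chopper open?
    chopped = SIGNAL if open_phase else 0.0                # detector sees signal or blank
    drift_and_noise = 0.02 * t + 0.05 * random.gauss(0, 1)
    sample = chopped + drift_and_noise
    reference = 1.0 if open_phase else -1.0                # reference derived from the chopper
    accum += sample * reference                            # multiply by reference ...
estimate = 2.0 * accum / n                                 # ... and average

print(f"True chopped amplitude: {SIGNAL}")
print(f"Recovered estimate:     {estimate:.4f}")
```

The slow drift and random noise largely average away because they are uncorrelated with the chopper reference, while the signal, which is locked to the chopper, survives.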

My third question was: how do you calibrate such an instrument to be able to claim such performance? Again, they have certainly done their homework, as you would expect. The paper describes two techniques using their advanced infrared radiometry and imaging facility along with standard platinum resistance thermometers (SPRTs). This was done with both water-bath and ammonia heat-pipe blackbodies, along with extensive corrections for the real-world physics of blackbody radiation and the thermometers themselves. No doubt about it: this is impressive theoretical and applied science and engineering, and one of the few cases where I can accept claiming measurement to 0.001°C as meaningful.
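
To get a feel for why millikelvin radiometry in this band is even conceivable, the sketch below integrates Planck’s law over 8 µm to 14 µm and shows how much the in-band blackbody radiance changes for a 1 mK step near 23°C. Only textbook physics appears here; none of the blackbody, size-of-source, or thermometer corrections described in the NIST paper are modeled.

```python
# In-band (8-14 um) blackbody radiance sensitivity near room temperature.
# Textbook Planck's law only; no instrument or calibration corrections.

import math

H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance, W / (m^2 * sr * m)."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    x = H * C / (wavelength_m * KB * temp_k)
    return a / math.expm1(x)

def band_radiance(temp_k, lo=8e-6, hi=14e-6, steps=2000):
    """In-band radiance over [lo, hi] by trapezoidal integration, W / (m^2 * sr)."""
    dlam = (hi - lo) / steps
    total = 0.0
    for i in range(steps + 1):
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * planck_radiance(lo + i * dlam, temp_k) * dlam
    return total

t0 = 296.15                       # 23 C expressed in kelvin
l0 = band_radiance(t0)
l1 = band_radiance(t0 + 0.001)    # one millikelvin warmer
print(f"In-band radiance at 23 C:  {l0:.4f} W/(m^2*sr)")
print(f"Change for +1 mK:          {l1 - l0:.3e} W/(m^2*sr) ({100 * (l1 - l0) / l0:.4f} %)")
```

A 1 mK step changes the in-band radiance by only about 0.0016%, which is why every stray reflection, temperature gradient, and size-of-source error has to be hunted down before a claim like 0.001°C can be taken seriously.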

Have you ever had a design or function that misled you due to unjustified claims of precision in the initial analysis or the actual specifications?

References (all from NIST)

  1. For All Times, For All Peoples: How Replacing the Kilogram Empowers Industry
  2. A Turning Point for Humanity: Redefining the World’s Measurement System
  3. Toward the SI System Based on Fundamental Constants: Weighing the Electron
  4. Universe’s Constants Now Known with Sufficient Certainty to Completely Redefine the International System of Units
  5. Precise Temperature Measurements with Invisible Light

