Although the basic thermal analysis for forced-air cooling of electronics and for the wind-chill factor relies on the same principles, the latter carries unavoidable ambiguity.
Forced-air convection cooling via fans is a widely used approach for keeping the temperature of electronics below some threshold value. It’s more aggressive than “natural” convection cooling and ensures that air flows across the electronics and removes heat (ensuring that the heat actually has a place called “away” to go to is another story and challenge). While engineers might prefer natural, unforced convection cooling to avoid the perceived negatives of fans (cost, noise, reliability, space, to cite a few), the reality is that the available airflow mass and rate in many unforced cases are either inadequate or unreliable, so fans must be used.
The obvious question is, “How much airflow do I need?” followed quickly by “What fan(s) will deliver it?” Using basic data, such as the thermal path from heat source to surface, the maximum allowed temperature, and the worst-case ambient temperature, gives designers an approximate sense of the situation. Next, detailed thermal-modeling software can provide a more comprehensive assessment, including airflow patterns, maximum overall temperature, and localized “hot spots” (Figure 1). Finally, fan vendors have good application notes that show how to decide on specific fan models, the use of a single fan or multiple smaller fans, and even multiple-fan serial or parallel airflow.
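That first-cut airflow estimate comes straight from the heat balance Q = ṁ·cp·ΔT. The sketch below is a minimal illustration of that calculation (the function name and the air-property constants are my own choices, using typical sea-level values, not figures from any particular vendor’s application note):

```python
# First-cut estimate of required cooling airflow from the heat balance
# Q = m_dot * c_p * dT, so volumetric flow = Q / (rho * c_p * dT).
RHO_AIR = 1.2      # kg/m^3, air density at roughly 20 degC, sea level
CP_AIR = 1005.0    # J/(kg*K), specific heat of air at constant pressure

def required_airflow_cfm(power_w: float, delta_t_c: float) -> float:
    """Volumetric airflow (CFM) needed to carry away power_w watts with
    an allowable delta_t_c (degC) rise in the air passing through the box."""
    flow_m3_per_s = power_w / (RHO_AIR * CP_AIR * delta_t_c)
    return flow_m3_per_s * 2118.88   # 1 m^3/s = 2118.88 CFM

# Example: 100 W dissipated, 10 degC allowable air-temperature rise
print(f"{required_airflow_cfm(100, 10):.1f} CFM")  # about 17.6 CFM
```

Note that this is only the bulk airflow requirement; it says nothing about how that air distributes across the board, which is exactly where the thermal-modeling software earns its keep.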
Figure 1 Design engineers rely on these images of PC boards created by software using thermal models of components and the board itself. (Source: Mentor/Siemens)
Forced-air cooling analysis does not apply only to electronics. Living creatures also experience it via the wind-chill effect (although the wind is, of course, a natural phenomenon). You would think that given the accuracy and detail with which we can model the effect of moving air on electronics, the same would be true for calling out the wind-chill factor, which simply expresses how cold it feels to a person at a given ambient temperature and wind speed.
It turns out that assessing the wind-chill factor for humans is very difficult. It’s also one of those cases where the standard, accepted, and widely cited numbers were based on some very tenuous data, until recently. A little background here: For many years, the wind-chill factor was based on research done “on site” in 1945 by Antarctic explorers Paul Siple and Charles Passel, who measured the rate of freezing of water in a suspended container over a range of temperatures and wind speeds from -9°C to -56°C (16°F to -69°F) and from calm to 12 m/sec (27 mph). This was not a well-controlled test situation, and their “model” container does not truly represent the human target.
There are lots of sources of uncertainty in this wind-chill analysis, and the American Meteorological Society (AMS) has long recognized this. A short 1993 paper, “Wind Chill Errors,” crisply and clearly summarized the problem and decried the widespread use of the wind-chill factor number as if it were “golden.” The paper put it bluntly: “Neither the heat index nor the wind chill index serve any purpose other than to provide the news media with a tool with which to excite the general public.”
It’s not enough to complain about the inadequacies and misleading nature of this metric. Fortunately, the AMS went further, and in 2005 published a paper that provided a much more analytical framework for what they called the Wind Chill Equivalent Temperatures (WCT) concept. This very readable paper, “The New Wind Chill Equivalent Temperature Chart,” constructs some basic thermal models (Figure 2), which look very much like the ones engineers use when modeling heat flow and thermal paths from a device to its package and to the ambient environment.
Figure 2 In this model, R1 is the thermal resistance from the core temperature of 38°C to the skin, and there are two parallel resistances from the skin to the outside air: convection (R2) and radiation (R3). Note that heat flow is from right to left in this model. (Source: American Meteorological Society)
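The Figure 2 network is the same series/parallel thermal-resistance arithmetic an electronics engineer would use for a junction-to-ambient path. As a minimal sketch (the function names and the resistance values in the example are illustrative placeholders of my own, not the paper’s figures):

```python
# Sketch of the Figure 2 network: core-to-skin resistance R1 in series
# with the parallel combination of convection (R2) and radiation (R3).
# Thermal resistances combine exactly like electrical ones.

def parallel(r_a: float, r_b: float) -> float:
    """Two thermal resistances (K/W) in parallel."""
    return (r_a * r_b) / (r_a + r_b)

def heat_flow(t_core_c: float, t_air_c: float,
              r1: float, r2: float, r3: float) -> float:
    """Heat flow (W) from the body core at t_core_c to outside air at
    t_air_c through R1 in series with R2 || R3 (all in K/W)."""
    r_total = r1 + parallel(r2, r3)
    return (t_core_c - t_air_c) / r_total

# Illustrative numbers only: 38 degC core, -2 degC air
print(heat_flow(38, -2, 0.1, 0.3, 0.6))
```

Higher wind speed lowers the convective resistance R2, which raises the heat flow for the same temperatures; that increased heat loss is what the equivalent-temperature concept translates back into a “feels like” number.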
The authors of the paper do acknowledge that any model must make assumptions. They used a cylinder approximation of the cheek as the exposed area. They first had to determine the thermal resistance between the body core and the human cheek, which they did with data from another experimenter’s results (Figure 3). The AMS authors do note that there are still many variables that affect their conclusions, including amount of body fat, skin thickness, metabolism, and others.
Figure 3 This graph shows the steady-state thermal resistances between the human-body core and the skin of the cheek. (Source: American Meteorological Society)
Finally, they developed a wind-chill equation, to four significant figures:
WCT = 35.74 + 0.6215T – 35.75V^0.16 + 0.4275TV^0.16 (temperature T in degrees F and wind speed V in mph).
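The equation is trivial to evaluate; here it is transcribed directly into code (the function name is mine), checked against a value from the published wind-chill chart:

```python
# The AMS/NWS wind-chill equation, transcribed from the article:
# T in degF, V in mph, with 0.16 as the exponent on wind speed.

def wct_f(t_f: float, v_mph: float) -> float:
    """Wind Chill Equivalent Temperature in degF."""
    return (35.74 + 0.6215 * t_f
            - 35.75 * v_mph ** 0.16
            + 0.4275 * t_f * v_mph ** 0.16)

# Example from the published chart: 0 degF with a 15 mph wind
# feels like about -19 degF.
print(round(wct_f(0, 15)))  # -19
```

Note the purely empirical character of the fit: the coefficients carry no physical units you could derive from first principles, which is exactly the curve-fitting flavor discussed next.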
A personal comment: to me, this process of wind-chill equation “fitting” is somewhat analogous to the curve-fitting technique and equation derived by physicist Wilhelm Wien in 1896, to closely fit inexplicable data from black-body radiation experiments:
Rλ = (c1/λ^5)·e^(–c2/λT) (“Physics,” Halliday & Resnick)

where Rλ is the spectral radiancy at a given wavelength λ, T is the absolute temperature, and c1 and c2 are empirical constants.
Subsequently, Max Planck modified the equation slightly to yield a more precise fit, but it was still an empirical equation and did not constitute a theory. Planck did not stop with curve-fitting; he developed the then-radical theory of quantum-based radiation, which fully explained the data and is the basis for our modern quantum physics. But I digress.
No doubt about it: while the basic principles of thermal modeling required to determine wind chill are the same as the ones for assessing the effect of forced-air cooling in electronics, electronic and mechanical engineers have a much, much easier challenge than meteorologists.
Have you ever had design situations where the analysis should have been straightforward in principle, but the many uncertainties prevented you from developing a good model? Or have you ever wondered where some widely used, accepted data actually comes from, and maybe even questioned it?