All sorts of physical processes in this analog world exhibit some degree of randomness. Think of noise, for example. Many noisy processes are described by Gaussian probability distributions. We should take a look at the mathematics of that.
Consider the equation of the “bell curve” for a Gaussian probability distribution by starting with a very simple equation:
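The article's displayed equations did not survive extraction; reconstructing from the steps that follow, a starting form consistent with the rest of the derivation is:

```latex
y = e^{-x^{2}/2}
\qquad \text{(Equation 1)}
```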
Without having to draw a picture, we know that the curve described by Equation 1 has a y-value of one when the x-value is zero and we further know that the y-value goes toward zero as the x-value goes toward either minus infinity or plus infinity.
Next, we add a “zero” and the number “1” as in Equation 2, which doesn’t really change anything, but will lead us to the next step. Equation 3 introduces the mean and the standard deviation.
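Consistent with the surrounding text, these two equations can be reconstructed as:

```latex
y = e^{-\dfrac{(x-0)^{2}}{2(1)^{2}}}
\qquad \text{(Equation 2)}

y = e^{-\dfrac{(x-\mu)^{2}}{2\sigma^{2}}}
\qquad \text{(Equation 3)}
```

where μ is the mean and σ is the standard deviation.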
Our “mean” is a new center value around which changes in the x-value will affect the y-value. Now the y-value goes to a value of one when the x-value equals the mean, where the mean could be zero or it could be any non-zero value. The “standard deviation” will affect how quickly or how slowly the y-value declines toward zero as the x-value departs from the mean.
A large value of the standard deviation would require the x-value to depart quite far from the mean to appreciably lower the y-value. On the other hand, when the standard deviation is a small number, a small departure of the x-value from the mean will bring the y-value toward zero more quickly.
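As a quick numerical check of that behavior, here is a short Python sketch (the function name `bell` and the sample points are illustrative, not from the article):

```python
import math

# Unnormalized bell curve, matching the article's build-up:
# y = exp(-(x - mean)^2 / (2 * sigma^2))
def bell(x, mean=0.0, sigma=1.0):
    return math.exp(-(x - mean) ** 2 / (2 * sigma ** 2))

# At x = mean, y is one regardless of sigma.
print(bell(5.0, mean=5.0, sigma=0.1))   # 1.0

# One unit away from the mean, a small sigma pulls y toward
# zero far faster than a large sigma does.
print(bell(1.0, mean=0.0, sigma=0.25))  # ≈ 3.4e-4
print(bell(1.0, mean=0.0, sigma=2.0))   # ≈ 0.88
```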
Now we draw a couple of pictures to graphically see how the mean and the standard deviation do their respective things (Figure 1).
Figure 1 This graph shows the effects of mean and standard deviation.
Next, we scale the y-value by the reciprocal of the standard deviation. Please don’t worry just now about why we do this; just follow along.
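With that 1/σ scaling, the reconstructed expression reads:

```latex
y = \frac{1}{\sigma}\, e^{-\dfrac{(x-\mu)^{2}}{2\sigma^{2}}}
```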
If we look again at the graphics, we see an additional effect of standard deviation (Figure 2).
Figure 2 This graph shows an additional effect of standard deviation.
Next, we put in another scaling factor, the reciprocal of the square root of 2*π. Here again, don’t worry why; just keep following along.
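Including this factor, the expression takes the familiar form:

```latex
y = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\dfrac{(x-\mu)^{2}}{2\sigma^{2}}}
```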
We now introduce a new term called the “variance.” The variance is simply the square of the standard deviation. We can rewrite our equation as:
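With the variance written as σ², the same expression becomes:

```latex
y = \frac{1}{\sqrt{2\pi\sigma^{2}}}\, e^{-\dfrac{(x-\mu)^{2}}{2\sigma^{2}}}
```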
At this point, we’ve done it. If we integrate either of these last two expressions for the y-values over the range of x-values going from minus infinity to plus infinity, the area under the curve comes to unity, or a value of one. The seemingly arbitrary scalings we made are precisely what force that integration result.
Whatever values of the mean or the standard deviation or variance you choose, the integral comes out always to be one and this is the Gaussian probability distribution. Feel free to take your pick.
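That unit-area property is easy to verify numerically. The Python sketch below (function names are illustrative) integrates the final expression with a simple midpoint rule over a wide range for several choices of mean and standard deviation:

```python
import math

def gaussian(x, mean, sigma):
    # Final form of the derivation, scaled by 1/(sigma * sqrt(2*pi))
    return math.exp(-(x - mean) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def area_under(mean, sigma, lo=-60.0, hi=60.0, n=100000):
    # Midpoint-rule integration; [lo, hi] is wide enough to stand in
    # for minus infinity to plus infinity at these parameter values.
    dx = (hi - lo) / n
    return dx * sum(gaussian(lo + (i + 0.5) * dx, mean, sigma) for i in range(n))

for mean, sigma in [(0.0, 1.0), (3.0, 0.5), (-2.0, 4.0)]:
    print(round(area_under(mean, sigma), 4))  # each prints 1.0
```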
Now, I want to vent a little about an issue that I endure as a mathematical novice trying to understand the works of a great master such as Carl Friedrich Gauss. When Gauss did his work, it was pure intellect in action.
He deduced, among other things, that the square root of 2*π in the above equations was required, but HOW did Gauss arrive at his results? I have never seen any explanation of that, nor any explanation of how l’Hospital derived his rule for finding limits, nor of Pappus’ theorem about finding the volume of a toroid, nor of other works. I have only been taught the end results of these geniuses’ efforts; how to apply their results, but nothing more.
Somehow, I feel cheated by the lack of deeper explanations.
John Dunn is an electronics consultant, and a graduate of The Polytechnic Institute of Brooklyn (BSEE) and of New York University (MSEE).