Yesterday, I had a trip into town for a medical visit. While I was in the waiting room, a television newscast was on,
and the talk was about the warm weather lately. If you've watched any of these, they'll often compare current conditions
against a "normal" day. But what is "normal"?
Normal in these contexts is just an average. Like all average calculations, you sum up the observed temperatures over
time, and divide by the number of observations to get the average. The news people just call that a normal temperature.
However, it is an oversimplified way of looking at what is "normal" and what isn't.
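Just to make that concrete, here is a minimal sketch of the calculation in Python. The temperatures are made-up
placeholders, not actual records for any station:

```python
# The newscast's "normal": just the arithmetic mean of past observations.
# These highs are hypothetical placeholders, not real data for any station.
highs_on_this_date = [74, 81, 69, 77, 72, 79, 83, 66, 75, 71]

normal_high = sum(highs_on_this_date) / len(highs_on_this_date)
print(f"'Normal' high: {normal_high:.1f} degrees F")  # 74.7 with these numbers
```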
So, over the course of years, the temperature in this region has been noted, and a history of temperatures on this date
has been compiled. Let's say there are a hundred observations going back to 1923. The temperatures would vary over a
range that one might expect for this time of the year. In this region, even in late winter, you can
observe highs in the nineties. Sometimes, even in triple digits. Lows can also range down to below freezing. There
is a natural variation in the observations.
The variations can be statistically calculated too. Square each difference from the average, so that you always get
a positive number. Add those squares up, divide by the number of observations, and then take the square root. This
yields the typical variation in the temperature. This number is called the standard deviation from the mean. The mean
is another way of saying average.
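Here is the same idea in a short Python sketch, again with made-up numbers rather than real observations:

```python
import statistics

# Same hypothetical highs as before, not real observations.
highs_on_this_date = [74, 81, 69, 77, 72, 79, 83, 66, 75, 71]

mean = statistics.mean(highs_on_this_date)
# Square the differences from the mean, average the squares, take the square root.
variance = sum((t - mean) ** 2 for t in highs_on_this_date) / len(highs_on_this_date)
std_dev = variance ** 0.5

print(f"Mean: {mean:.1f} F, standard deviation: {std_dev:.1f} F")
# statistics.pstdev(highs_on_this_date) gives the same population figure directly.
```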
If the temperatures follow a normal distribution, they will trace out the often discussed "bell curve" when plotted.
In order to really zero in on what's normal or not, you look at the standard deviations from the mean. In a normal
distribution, about two thirds of all observations (roughly 68 percent) will fall within one standard deviation of the
mean. If the mean high for this part of the world at this time of year is 75 degrees Fahrenheit, the standard deviation
tells you how far from that "normal" temperature a given day really is. Since this time of year has a lot of variation
to it (it can get hot, and it can also get cold), the standard deviation would be large, would it not?
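If you want to see where that "two thirds" figure comes from, here is a quick simulation sketch. The mean of 75 is the
figure assumed above; the standard deviation of 8 degrees is just a placeholder to run the check with:

```python
import random

random.seed(0)
mean, sd = 75.0, 8.0  # assumed mean from the text; SD is an arbitrary placeholder

# Draw a large number of simulated daily highs from a normal distribution.
simulated_highs = [random.gauss(mean, sd) for _ in range(100_000)]

within_one_sd = sum(1 for t in simulated_highs if abs(t - mean) <= sd)
print(f"Share within one SD: {within_one_sd / len(simulated_highs):.1%}")  # about 68%
```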
Also in a normal distribution, three standard deviations from the mean would encompass virtually all observations.
For this area, three standard deviations either side of the mean would cover just about every observation on record
for that date. I would estimate that the temperature range could span as much as 50 degrees. The standard deviation
would be larger if that range were larger, and smaller if it were smaller. As a very rough guess, take the square root
of 50 and call the standard deviation a little over seven degrees. That would yield a range of 54 to 96 degrees for
virtually all observations, if the mean high is 75. If the low is similarly calculated from a mean of 55, then you
would obtain a range of 34 to 76. Those numbers may be a bit off, but they are in the ballpark. A rough estimate. To
get accurate numbers, you'd have to have all the observations.
About ninety-five percent of all observations will fall within two standard deviations, so that range would be 61 to 89 degrees.
A temperature above 89 would be considered most unusual. A temperature above 96 would be considered unheard of.
This has happened before, but not very often. I've seen unusual things in this area in the last 6 1/2 years.
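Here is that back-of-the-envelope arithmetic written out, using the guessed mean of 75 degrees and the standard
deviation of roughly 7 degrees from above (both rough assumptions, not measured values):

```python
# Rough-guess figures from the discussion above, not measured values.
mean_high, sd = 75, 7

for k in (1, 2, 3):
    low, high = mean_high - k * sd, mean_high + k * sd
    print(f"Within {k} standard deviation(s): {low} to {high} degrees F")

# Within 1 standard deviation(s): 68 to 82   (about two thirds of observations)
# Within 2 standard deviation(s): 61 to 89   (about 95 percent)
# Within 3 standard deviation(s): 54 to 96   (virtually all)
```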
It would seem, then, that ten degrees above "normal" is really not that much to comment about. This all may
seem rather pedantic, but the use of words means something. These people are always pushing the idea that the world
is getting hotter, but the data hasn't shown it. Even if it did, how does anyone really know what "normal"
means when temperatures haven't been measured for all that long?
TV weathermen wouldn't want to include a discussion like this, because it is rather boring. It is easier to say it is
above "normal," but the whole point of this is: what is that? I could go on about this bit of manipulation of public
opinion, but maybe you get the idea. There is a lot of wiggle room when one uses a word like "normal" too loosely.