When you want to know the temperature outside, you check your phone, the news, or the internet. When you want to know the temperature of your oven, you check a thermometer. Have you ever wondered how temperature was measured before the modern age?
From the earliest times to Galileo
As far as historians know, there were no scales for measuring heat and cold before the age of Galileo. Earlier civilizations may have had ways to assess temperature, but no historical records of them survive. Instead, temperatures were defined by fixed points, such as the colors metals took on as they were heated, or the amount of heat needed to boil water or to melt substances like wax. Galileo Galilei invented the world’s first known thermometer in approximately 1592. It took the form of a long tube attached to a glass bulb. With the open end of the tube dipped into a vessel of liquid, the bulb was warmed, causing the air inside to expand and some of it to escape. When the heat was removed, the air remaining in the tube contracted, drawing the liquid up the tube and thus indicating a change in temperature.
Many different temperature scales had been created by the early 1700s. Daniel Gabriel Fahrenheit devised two types of thermometers in 1714. One type used mercury in a sealed capillary tube, so unlike Galileo’s air thermometers it was not affected by atmospheric pressure. Fahrenheit also invented thermometers that employed alcohol. Since mercury freezes at about -39 degrees Celsius, a mercury thermometer cannot measure any temperature below that point. Alcohol freezes at about -114 degrees Celsius, so alcohol thermometers can measure much colder temperatures.
Celsius’ hundred degree scale
In 1742, Anders Celsius, a Swedish astronomer, devised a scale that divided temperature into 100 degrees, rather than the 96-degree scale Fahrenheit had used. He designated the point at which water freezes as 100 degrees, and the point at which it boils as zero degrees. The scale was reversed in 1744 by the Swedish botanist Carolus Linnaeus, and from then on zero degrees has been the freezing point. The unit was called “centigrade” until 1948, when it was officially renamed in Celsius’ honor.
Lord Kelvin’s “infinite cold”
William Thomson, who became Baron Kelvin, was a British engineer and mathematical physicist who, in 1848, calculated the value of absolute zero, the coldest possible temperature, to be around -273 degrees Celsius; the accepted modern value is -273.15. Thomson wrote about the need for a scale in which absolute zero, or “infinite cold,” was the lowest point. Today, we call the scale he made possible the “Kelvin thermodynamic temperature scale.”
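The three scales discussed above are related by simple linear conversions: the Kelvin scale shifts Celsius so that absolute zero sits at 0, while Fahrenheit stretches and offsets it. A minimal sketch in Python (the function names are illustrative, not from any particular library):

```python
# Conversions among the Celsius, Fahrenheit, and Kelvin scales.

def celsius_to_kelvin(c):
    """Kelvin shifts Celsius by 273.15, placing absolute zero at 0 K."""
    return c + 273.15

def celsius_to_fahrenheit(c):
    """Fahrenheit puts 180 degrees between water's freezing point (32)
    and boiling point (212), versus Celsius' 100 degrees."""
    return c * 9 / 5 + 32

print(celsius_to_kelvin(-273.15))   # absolute zero -> 0.0
print(celsius_to_fahrenheit(100))   # water boils -> 212.0
print(celsius_to_fahrenheit(0))     # water freezes -> 32.0
```

The same shift-and-scale pattern works in reverse, which is why any temperature reading can be converted freely between the three scales.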
Our temperature measurement systems have evolved from very primitive descriptors that did not even employ mathematics or numbers into highly sophisticated and extremely accurate scales. From the kinetic theory of gases to the zeroth law of thermodynamics, the study of temperature continues both to break new ground in scientific thought and to enable other areas of science and technology to develop.