The kelvin (symbol: K) is the base unit for temperature in the International
System of Units (SI). The Kelvin scale is an absolute temperature scale that
starts at the lowest possible temperature (absolute zero), taken to be 0 K.[1][2][3][4]
By definition, the Celsius scale (symbol °C) and the Kelvin scale have increments of
exactly the same magnitude; that is, a rise of 1 K equals a rise of 1 °C and vice
versa, and any temperature in degrees Celsius can be converted to kelvin by
adding 273.15.[1][5]
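A minimal sketch of this conversion in Python (the function names are illustrative, not from any standard library):

```python
def celsius_to_kelvin(temp_c: float) -> float:
    """Convert a Celsius temperature to kelvin by adding the fixed offset 273.15."""
    return temp_c + 273.15


def kelvin_to_celsius(temp_k: float) -> float:
    """Convert a kelvin temperature to degrees Celsius by subtracting 273.15."""
    return temp_k - 273.15


# Water's conventional freezing and boiling points:
print(celsius_to_kelvin(0.0))     # 273.15
print(celsius_to_kelvin(100.0))   # 373.15
print(kelvin_to_celsius(273.15))  # 0.0
```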
The 19th-century British scientist Lord Kelvin first developed and proposed
the scale.[5] It was often called the "absolute Celsius" scale in the early 20th
century.[6] The kelvin was formally added to the International System of Units
in 1954, defined by setting the triple point of water at exactly 273.16 K. The
Celsius, Fahrenheit, and Rankine scales were redefined in terms of the Kelvin
scale using this definition.[2][7][8] The 2019 revision of the SI now defines the
kelvin in terms of energy by setting the Boltzmann
constant to exactly 1.380649×10⁻²³ joules per kelvin;[2] every 1 K change
of thermodynamic temperature corresponds to a thermal energy change of
exactly 1.380649×10⁻²³ J.
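As an illustration of the 2019 definition, the short sketch below multiplies a temperature change by the Boltzmann constant to give the corresponding thermal energy change; the constant's value is exact by definition, while the helper name is ours:

```python
# Boltzmann constant in J/K, exact by the 2019 SI definition
BOLTZMANN_CONSTANT = 1.380649e-23


def thermal_energy_change(delta_t_kelvin: float) -> float:
    """Return k * dT in joules for a temperature change given in kelvin."""
    return BOLTZMANN_CONSTANT * delta_t_kelvin


print(thermal_energy_change(1.0))  # 1.380649e-23 J for a 1 K change
```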
History
See also: Thermodynamic temperature § History
Precursors
[Image caption: An ice-water bath offered a practical calibration point for thermometers (shown here in Celsius) before the physical nature of heat was well understood.]
During the 18th century, multiple temperature scales were developed,[9]
notably Fahrenheit and centigrade (later Celsius). These scales predated
much of the modern science of thermodynamics, including atomic
theory and the kinetic theory of gases which underpin the concept of
absolute zero. Instead, their creators chose defining points within the range of human
experience that could be reproduced easily and with reasonable accuracy,
but that lacked any deep significance in thermal physics. In the case of the
Celsius scale (and the long-defunct Newton and Réaumur scales),
the melting point of ice served as such a starting point, with Celsius being
defined (from the 1740s to the 1940s) by calibrating a thermometer such
that:
Water's freezing point is 0 °C.
Water's boiling point is 100 °C.
This definition assumes pure water at a specific pressure chosen to
approximate the natural air pressure at sea level. Thus, an increment of 1 °C
equals 1/100 of the temperature difference between the melting and boiling
points. The same temperature interval was later used for the Kelvin scale.
Charles's law
From 1787 to 1802, it was determined by Jacques
Charles (unpublished), John Dalton,[10][11] and Joseph Louis Gay-Lussac[12] that,
at constant pressure, ideal gases expand or contract
linearly (Charles's law) by about 1/273 of their volume at 0 °C for each degree
Celsius of temperature change, between 0 °C and 100 °C. Extrapolation
of this law suggested that a gas cooled to about −273 °C would occupy zero
volume.
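A brief numerical sketch of this extrapolation, using the historical coefficient of 1/273 rather than the modern value (the function name and example volume are ours):

```python
def volume_from_charles_law(volume_at_0c: float, temp_c: float) -> float:
    """Historical form of Charles's law: V(T) = V0 * (1 + T/273)."""
    return volume_at_0c * (1 + temp_c / 273)


print(volume_from_charles_law(1.0, 100.0))   # ~1.366: expansion of about 1/273 per degree
print(volume_from_charles_law(1.0, -273.0))  # 0.0: extrapolated zero volume at about -273 °C
```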