I have borrowed from several authors the convention of
expressing the ENTROPY σ in explicitly dimensionless form
[the logarithm of a pure number is another pure number].
By the same token, the simple definition of TEMPERATURE
given by Eq. (10) automatically gives τ dimensions of energy,
just like U. Thus τ can be measured in joules or ergs or other
more esoteric units like electron-volts; but we are
accustomed to measuring TEMPERATURE in other, less
``physical'' units called degrees. What gives?
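Written out, the dimensional argument is a one-line sketch; it assumes Eq. (10) has the usual form for this dimensionless convention, namely 1/τ = (∂σ/∂U) at fixed N and V (if Eq. (10) is written differently, the argument is the same up to notation):

\begin{displaymath}
  \frac{1}{\tau} \;=\; \left( \frac{\partial \sigma}{\partial U} \right)_{N,V} ,
  \qquad [\sigma] = 1
  \quad \Longrightarrow \quad
  [\tau] \;=\; [U] \;=\; \hbox{energy} .
\end{displaymath}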
The story of how temperature units got invented is fascinating
and sometimes hilarious; suffice it (for now) to say that
these units were invented before anyone knew what
temperature really was!15.18
There are two types of ``degrees'' in common use:
Fahrenheit degrees15.19 and Celsius degrees (written °C),
which are moderately sensible in that the interval between
the freezing point of water (0°C) and the boiling point
of water (100°C) is divided up into 100 equal ``degrees''
[hence the alternate name ``Centigrade''].
However, in Physics there is only one kind of ``degrees''
in which we measure temperature: degrees absolute
or ``Kelvin,''15.20 which are written just ``K'' without any
° symbol. One K is the same size as one °C,
but the zero of the Kelvin scale is at
absolute zero, the coldest temperature possible,
which is itself an interesting concept.
The freezing temperature of water is at 273.15 K,
so to convert °C into K you just add 273.15 degrees.
Temperature measured in K is always written T.
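A quick worked example of the conversion, using only the offset quoted above:

\begin{displaymath}
  T[\hbox{K}] \;=\; t[^\circ\hbox{C}] + 273.15 ,
  \qquad \hbox{e.g.} \qquad
  20^\circ\hbox{C} \;\longrightarrow\; 20 + 273.15 \;=\; 293.15 \hbox{ K} ,
\end{displaymath}

and running it backwards puts absolute zero at -273.15°C.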
What relationship does τ bear to T?
The latter had been invented long before the development of
Statistical Mechanics and the explanation of what
temperature really was; but these clumsy units
never go away once people have gotten used to them.
The two types of units must, of course, differ by some
constant conversion factor. The factor in this case is
k_B, BOLTZMANN'S CONSTANT:

\begin{equation}
  \tau \;\equiv\; k_B \, T
  \tag{15.13}
\end{equation}

By the same token, the ``conventional entropy'' S
is defined by the relationship

\begin{equation}
  S \;\equiv\; k_B \, \sigma .
  \tag{15.14}
\end{equation}
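To make the conversion factor concrete, here is a minimal numerical sketch in Python; it simply evaluates Eqs. (15.13) and (15.14) at everyday values, and the function names are only illustrative, not anything defined in the text:

\begin{verbatim}
# A small numerical illustration of Eqs. (15.13) and (15.14):
# converting a conventional temperature T (in kelvin) into the
# "natural" temperature tau (in energy units), and a dimensionless
# entropy sigma into the conventional entropy S (in J/K).

k_B = 1.380649e-23       # Boltzmann's constant in J/K (exact SI value)
eV = 1.602176634e-19     # one electron-volt in joules (exact SI value)

def tau_from_T(T_kelvin):
    """Eq. (15.13): tau = k_B * T, in joules."""
    return k_B * T_kelvin

def S_from_sigma(sigma):
    """Eq. (15.14): S = k_B * sigma, in J/K."""
    return k_B * sigma

if __name__ == "__main__":
    T_room = 293.15                      # 20 degrees Celsius in kelvin
    tau = tau_from_T(T_room)
    print(f"tau = {tau:.3e} J = {1000 * tau / eV:.1f} meV at T = {T_room} K")
    print(f"S   = {S_from_sigma(1.0):.3e} J/K for sigma = 1")
\end{verbatim}

Running it gives τ ≈ 4.05 × 10^{-21} J ≈ 25 meV at room temperature, the familiar rule of thumb that k_B T is roughly 1/40 of an electron-volt.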