Boltzmann's Law states that at equilibrium the probability of finding an energy H in some molecular "degree of freedom", such as motion along the x axis or vibration of a particular C-H bond, is proportional to e^{-H/kT}. This is the same as saying that the equilibrium ratio between two states of different energy (ΔH) is given by e^{-ΔH/kT}.
This law is a result of a very simple statistical idea, but deriving it analytically requires mathematical techniques that are not comfortable for the person on the street. To get the idea we can use the same approach we used in elementary school for getting the idea of multiplication by counting up 3 sets of 4 toothpicks to find that 3 x 4 = 12:
Suppose there are four identical bits of energy to share among four atoms (each with only one degree of freedom). Suppose that all possible ways of distributing the energy among the four atoms are equally likely.
What is the likelihood that an atom should have 4 bits of energy, or 3, or 2, or 1, or none?
This problem is simple enough that we can just write down all the possibilities and count to see how often the different amounts occur.
Consider the number of permutations that involve putting 4 (or 3, 2, 1, or 0) bits of energy in the first atom:
With all 4 in the first atom, there must be 0 in each of the others. So there is only one way to do this.
With 3 in the first atom, the remaining bit of energy can be in any of the three other atoms. Hence 3 ways.
With 2 in the first atom, there can be 2 in any one of the others (3 ways), or 1 in each of two of the others, leaving one vacant (3 ways to choose the vacant atom). Thus 6 permutations in all.
With 1 in the first atom, there are 3 bits to permute among the other atoms, 3 in any one (3 ways), 2 in one and 1 in either of the two others (6 ways), or 1 in each of the three atoms (1 way). Thus 10 permutations in all.
With none in the first atom, there are 4 bits to permute among 3 atoms: 4 in any one (3 ways), 3 in one and 1 in one of the others (6 ways), 2 in each of 2 with the other vacant (3 ways), or 2 in one and 1 in each of the others (3 ways). Thus 15 permutations in all.
This yields the following table of the number of permutations of bits of energy among the other atoms for a certain number of bits of energy in the first atom:

Number in first atom:   0    1    2    3    4
Permutations:          15   10    6    3    1

[Incidentally, it is crucial in counting these permutations correctly that one consider the bits of energy indistinguishable, so that you don't care which energy bits are in which atom, just how many are in each atom. If they were distinguishable, one would derive a different distribution, the Poisson distribution, which gives a much higher weight to intermediate occupancies. Boltzmann was certainly not disturbed by this indistinguishability, because he didn't really think that energy came in little bits anyway. He was just using the model as a computational device and would ultimately take the limit of lots of infinitesimal bits to model continuous energy.]
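The counting above can also be checked by brute force. The following short Python sketch (an illustration, not part of the original argument) enumerates every distribution of 4 indistinguishable bits of energy among 4 atoms and tallies the occupancy of the first atom:

```python
from itertools import product

BITS, ATOMS = 4, 4

# Every way of distributing 4 indistinguishable bits among 4 atoms:
# an occupancy tuple (n1, n2, n3, n4) with n1+n2+n3+n4 = 4, each counted once
# (indistinguishable bits, so only the occupancies matter).
distributions = [occ for occ in product(range(BITS + 1), repeat=ATOMS)
                 if sum(occ) == BITS]

# For each occupancy of the first atom, count the permutations of the
# remaining bits among the other atoms.
counts = {e: sum(1 for occ in distributions if occ[0] == e)
          for e in range(BITS + 1)}
print(counts)  # {0: 15, 1: 10, 2: 6, 3: 3, 4: 1}
```

The tally reproduces the 15, 10, 6, 3, 1 of the table, out of 35 equally likely distributions in all.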
If we then plot the number of instances of having a certain amount of energy against the amount of energy as blue diamonds, we find a distribution that is reminiscent of a decaying exponential (e^{-x}), the dashed red curve.
As the number of energy units and containers increases, the distribution approaches an exponential curve more closely, as shown by the large blue diamonds and red curve in the following plot, which were calculated for 16 bits of energy in 16 atoms. In this case we represent the number of permutations as a percentage of all possible permutations rather than as a raw number.
Note that 16 energy units in 16 atoms is the same energy density (temperature) as 4 energy units in 4 atoms. The other points/curves in this plot are for 16 energy units in 8 or 4 atoms, that is, twice and four times the energy per molecule of the previous plot. Again the points are well described by exponential curves, but they decay less rapidly, and therefore the percentage of the sample represented by the E=0 population is smaller (30% and 16% vs. 48%). Still, E=0 is the most likely single value for the energy, for the obvious reason that it allows the maximum number of arrangements of energy bits among the other atoms. After looking at these points and curves, it will not surprise you to learn that in 1877, when Boltzmann, who was good at math, found the limiting case for many atoms and infinitesimal energy packets, the distribution turned out to be exactly his decaying exponential (e^{-H/kT}). [You'll learn how to do this for yourself in PChem.] You can see from the plot above why temperature occurs in the denominator of the exponent. Higher temperature slows the exponential decay.

One way of having a different amount of energy in one molecule as compared to another is for them to have isomeric structures. Thus the Boltzmann formula gives the equilibrium constant between isomers from their energy difference. Since all chemical transformations can be considered isomerizations (rearrangement of a fixed set of atoms), this approach can predict any equilibrium constant.
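The E=0 percentages quoted above (48%, 30%, and 16%) can be recomputed without enumeration, using the standard "stars and bars" count of distributions. The sketch below (the function name is mine, chosen for illustration) does this in Python:

```python
from math import comb

def fraction_with_energy(e, bits, atoms):
    # "Stars and bars": putting e units in the first atom leaves
    # bits - e units to distribute among the remaining atoms - 1 atoms,
    # which can be done in C(bits - e + atoms - 2, atoms - 2) ways.
    ways = comb(bits - e + atoms - 2, atoms - 2)
    total = comb(bits + atoms - 1, atoms - 1)  # all possible distributions
    return ways / total

for atoms in (16, 8, 4):
    p0 = fraction_with_energy(0, 16, atoms)
    print(f"16 bits in {atoms} atoms: P(E=0) = {p0:.0%}")
# 16 bits in 16 atoms: P(E=0) = 48%
# 16 bits in 8 atoms: P(E=0) = 30%
# 16 bits in 4 atoms: P(E=0) = 16%
```

The same function with 4 bits in 4 atoms gives 15/35, matching the table of the first example.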
Ludwig Boltzmann (1844-1906)
Entropy
Remember that statistics enters into determining an equilibrium constant in another way as well, one that has in a sense to do with nomenclature. If you're talking about the equilibrium between gauche- and anti-butane, you need to consider not only the Boltzmann energy factor, you also need to consider the fact that there are two gauche-butane structures and only one anti-butane. This means there should be a factor of about 2 in favor of gauche that attenuates the room-temperature energy factor (of 10^{(3/4)·0.9} ≈ 5) in favor of anti. Overall you would expect anti to be favored by a factor of about 2.5 at room temperature.
Here we have profited from the useful fact that at room temperature e^{-ΔH/RT} is the same as 10^{-(3/4)ΔH}, if ΔH is in kcal/mol.
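A quick numeric check of this rule of thumb (a sketch, assuming room temperature T = 298 K and R = 1.987 cal/mole/K):

```python
import math

R = 1.987e-3  # gas constant in kcal/mole/K (assumed value)
T = 298       # room temperature in K (assumed)

# e^{-dH/RT} = 10^{-dH/(2.303*R*T)}; the rule of thumb says the
# conversion factor 1/(2.303*R*T) is about 3/4 per kcal/mole.
factor = 1 / (math.log(10) * R * T)
print(round(factor, 2))  # 0.73, i.e. close to 3/4
```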
In the case of cyclohexane, the chair form is favored by about 6 kcal/mole (a factor of about 30,000), but the flexible twist-boat form includes many more structures (about 7 times as many), so the equilibrium constant falls to about 4,000 in favor of the chair.
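Both equilibria can be sketched in a few lines of Python using the room-temperature rule of thumb (the function name is mine, for illustration); the numbers land near the factors of about 2.5 and 4,000 quoted above:

```python
def k_equilibrium(dH_kcal, statistical_factor):
    # Room-temperature rule of thumb: each kcal/mole of dH costs a factor
    # of 10^(3/4); the statistical factor favors the higher-energy form
    # when it has more structures.
    return statistical_factor * 10 ** (-(3 / 4) * dH_kcal)

# Butane: gauche lies about 0.9 kcal/mole above anti, with 2 gauche forms.
K_gauche_over_anti = k_equilibrium(0.9, 2)
print(round(1 / K_gauche_over_anti, 1))  # anti favored by ~2.4

# Cyclohexane: twist-boat about 6 kcal/mole up, with ~7 times as many forms.
K_twist_over_chair = k_equilibrium(6, 7)
print(round(1 / K_twist_over_chair))     # chair favored by ~4500
```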
This kind of statistical factor is very closely related to what entropy is about, though entropy is usually couched in terms that make it seem mystical to most people. To me entropy seems needlessly obscure.
Entropy (S) is typically multiplied by T and combined with H to make up Gibbs free energy, G = H - TS. This free energy (G) is then substituted for H in the Boltzmann factor to give e^{-ΔG/RT}.
Now a very common value of ΔS is 1.377 cal/mole/K. This value turns out to be R ln 2, the gas constant (Boltzmann constant times Avogadro's Number) times the natural log of 2. Consider what this means to an equilibrium constant: the entropy term of ΔG contributes a factor of e^{TΔS/RT} = e^{ΔS/R} = e^{ln 2} = 2.
That is, after all this rigamarole 1.377 cal/mole/K just came out to mean a factor of 2 in the equilibrium constant.
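A two-line check of that arithmetic in Python (an illustration, assuming R = 1.987 cal/mole/K):

```python
import math

R = 1.987             # gas constant in cal/mole/K (assumed value)
dS = R * math.log(2)  # the "very common" entropy value
print(round(dS, 3))   # 1.377 cal/mole/K

# In K = e^{-dG/RT} = e^{-dH/RT} * e^{dS/R}, the entropy term is just:
print(round(math.exp(dS / R), 6))  # 2.0
```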
Are you surprised that the entropy difference between gauche and anti butane is about 1.377 cal/mole/K?
Might it be more user-friendly to say that gauche-butane has a statistical factor of two favoring it, rather than to say that it has 1.377 cal/mole/K more entropy than the anti-isomer?
You bet!
While I'm on the subject, here's another pet peeve:
Because ΔG = ΔH - TΔS, it is often said that "ΔS becomes more important as temperature increases". This is nonsense, because when you go to use free energy in calculating an equilibrium or rate constant, you divide by T in the Boltzmann expression. A given ΔS contributes a factor to the equilibrium (or rate) constant that is independent of temperature. What happens with increasing temperature is that ΔH becomes less important. This is sensible because at higher temperature there is more energy to go around, so it does not "cost as much probability" to put a little more energy into one place (the exponential curve in the graph above becomes flatter).
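The point can be made numerically. In this sketch (the function name and the sample values ΔH = 900 cal/mole and ΔS = 1.377 cal/mole/K are chosen for illustration), the entropy factor in the equilibrium constant is the same at every temperature, while the enthalpy factor creeps toward 1 as T rises:

```python
import math

R = 1.987  # gas constant in cal/mole/K (assumed value)

def k_eq(dH_cal, dS_cal, T):
    # K = e^{-dG/RT} = e^{-dH/(RT)} * e^{dS/R}
    return math.exp(-dH_cal / (R * T)) * math.exp(dS_cal / R)

# The entropy factor never changes with T; the enthalpy factor
# approaches 1 (matters less and less) as T rises.
for T in (200, 300, 600):
    entropy_factor = math.exp(1.377 / R)
    enthalpy_factor = math.exp(-900 / (R * T))
    print(T, round(entropy_factor, 2), round(enthalpy_factor, 3))
```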
This discussion was designed to give you a fundamental idea of what entropy is about without being intimidating. Statistical thermodynamics does involve some pretty sophisticated ideas, and lots of math. Those who created it were not being intentionally obscurantist; they were just trying to be careful and rigorous. How you should do the counting to decide how many "structures" there are for a certain molecule can be a delicate question. For example, the more you warm a molecule, the floppier it becomes, and the more structures you should consider for it. Furthermore, the questions of indistinguishability and symmetry restrictions can get really hairy.