Read the fourth chapter of Instant Physics (Rothman) and
write a one-page reaction to or summary of it.
Due: Nov. 10
Monday:
Not posted yet.
Wednesday:
We tried to address some confusion about the differences between the information (Shannon)
entropy and the thermodynamic/statistical-mechanics entropy. In both situations we can think
about having macrostates and microstates. For example, in a gas the macrostate is characterized
by pressure, volume, temperature, etc., while the microstate is characterized by the positions
and velocities of the molecules. In the dice we simulated in lab, the macrostate would be the sum
of the dice and the microstate would be the face values of the individual dice. The important
distinction to keep in mind is that in the information theory context we are going to reveal
the microstate, and then the entropy measures the information conveyed by that revelation (we
considered how many true-false questions it would take to pinpoint the microstate). Does this mesh
with our idea of entropy as a measure of disorder? Well, consider for instance how much more
difficult it would be to locate a particular book if the library stacks were not ordered in some
way.
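The twenty-questions picture can be made concrete. As a sketch in Python (using the two-dice setup from lab as the assumed example): each well-chosen yes/no question at best halves the set of candidate microstates, so pinpointing one of N equally likely microstates takes about log2(N) questions, which is exactly the Shannon entropy in bits.

```python
import math

# Two fair dice: 6 * 6 = 36 equally likely microstates.
n_microstates = 6 ** 2

# Shannon entropy in bits = average number of yes/no questions needed
# to pinpoint the microstate, when each question halves the candidates.
entropy_bits = math.log2(n_microstates)
print(entropy_bits)  # about 5.17 bits

# A binary-search questioner needs ceil(log2(N)) questions in the worst case.
print(math.ceil(entropy_bits))  # 6 questions always suffice
```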
In thermodynamics/statistical-mechanics, on the other hand, we will not reveal the microstate. In
this case, entropy will be a measure of what we do not know. When there are more microstates
for a given macrostate, there is more that we do not know. If we cannot resolve a pattern in
some situation, we tend to think of it as "disordered".
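"More microstates for a given macrostate" can also be counted directly in the dice language. A small sketch, again assuming the two-dice example: tally how many microstates (ordered pairs of face values) produce each macrostate (the sum), and note that the macrostates with more microstates carry more entropy (log of the count).

```python
import math
from collections import Counter
from itertools import product

# Count microstates (ordered pairs of faces) for each macrostate (the sum).
microstate_counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

# The middle macrostate is realized by the most microstates...
print(microstate_counts[7], microstate_counts[2])  # 6 microstates vs. 1

# ...and therefore has the most "we do not know": entropy ~ log2(W).
for total in (2, 7, 12):
    W = microstate_counts[total]
    print(total, W, math.log2(W))
```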
Thermodynamics got its start in considering the efficiency of engines. An engine typically starts
off with two distinct regions -- often of differing temperature. Consider using the steam coming
from a boiling kettle to turn a turbine. If we were just in a steam room with uniform steaminess, we
could not get the flow from one region to another from which we extract useful work. Engine efficiency
is then typically tied to the degree of distinction between the two regions: the bigger the temperature
difference, the more efficient the engine. (We are fortunate here on Earth to have a built-in temperature
differentiation -- the sun is much hotter than the Earth.)
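"The bigger the temperature difference, the more efficient" has a precise classical form: the Carnot limit, eta = 1 - T_cold / T_hot (temperatures in kelvins). A quick sketch (the specific reservoir temperatures are illustrative, not from the notes):

```python
# Carnot limit on engine efficiency: eta = 1 - T_cold / T_hot (kelvins).
def carnot_efficiency(t_hot, t_cold):
    return 1 - t_cold / t_hot

# The same room-temperature cold reservoir with hotter and hotter sources:
print(carnot_efficiency(400, 300))   # 0.25
print(carnot_efficiency(600, 300))   # 0.5
print(carnot_efficiency(6000, 300))  # 0.95 -- a sun-like source temperature
```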
To understand what entropy has to do with temperature differentiation (and therefore efficiency) we
will consider a concept known as Maxwell's demon. (James Clerk Maxwell is a giant of 19th Century
physics contributing to the two main thrusts of 19th Century physics -- thermodynamic/statistical-mechanics
and electromagnetism.) In Maxwell's demon we consider a volume of gas with a distribution of molecules
with different speeds buzzing around. We introduce a divider between two halves and a demon who will
allow faster molecules to move to the left side and slower molecules to move to the right side but not
the other way around. Since temperature is related to average kinetic energy of molecules, Maxwell's
demon would be introducing a temperature difference. But one of the statements of the Second Law of
Thermodynamics is that one cannot have such a separation arise spontaneously. It would decrease entropy.
But Maxwell's demon would have to have information about the molecules to effect the separation. So
once again we see a connection between entropy, disorder and information. The demon uses information
to sort out the molecules, i.e. to order them.
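The demon's sorting can be sketched as a toy simulation (the speed distribution and threshold here are hypothetical choices, just to show the mechanism): the demon knows each molecule's speed and uses that information to send fast molecules left and slow molecules right, so the two halves end up at different "temperatures" (different average kinetic energies).

```python
import random

random.seed(0)
# A toy gas: molecule speeds drawn from an arbitrary spread (m/s).
speeds = [random.gauss(500, 150) for _ in range(1000)]

# The demon's rule: faster-than-average molecules pass to the left half,
# slower ones to the right half, never the reverse.
threshold = sum(speeds) / len(speeds)
left = [v for v in speeds if v >= threshold]   # fast molecules
right = [v for v in speeds if v < threshold]   # slow molecules

# Temperature tracks average kinetic energy (~ v^2), so the halves now
# sit at different temperatures -- paid for by the demon's information.
def mean_sq(vs):
    return sum(v * v for v in vs) / len(vs)

print(mean_sq(left) > mean_sq(right))  # the left half is hotter
```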
While we might naively associate uniformity with order, this is not really the case. Consider soldiers
marching in unison versus a mob. As you observe soldiers marching by, you see a line of soldiers, then
none, then another line, etc. But with the mob, you see a steady stream of humanity -- disordered
but in a way more uniform. Increasing entropy is increasing disorder is a drive toward uniformity -- and
without distinction we cannot extract useful work from our engines. Again, we on Earth do not form a
closed system with the sun; we are in a situation of homeostasis rather than equilibrium with it.
The universe on the other hand is presumably a closed system with increasing entropy and heading toward
a condition known as "heat death" -- in which we have uniformity and no useful energy.
A new demon was suggested by John Wheeler -- Wheeler's demon. In this case the demon bundles the high
entropy part of the system and throws it into a black hole -- so that it is hidden away from the rest of
the universe. Thus for the "observable universe" we would be violating the Second Law of Thermodynamics.
But it turns out that black holes have entropy, temperature, etc. and are not isolated from the
rest of the universe thermodynamically.
Some of Stephen Hawking's work concerns radiation emitted by black holes. The idea is that near the edge of
a black hole a particle-antiparticle pair can form spontaneously. (What seems to us to be a violation
of the conservation of matter/energy is allowed by the Uncertainty Principle of Quantum Mechanics.)
Then one of the pair goes into the black hole and the other escapes. This is effectively equivalent
to the black hole radiating and bringing back into the system all of the high entropy we thought we had
hidden away using Wheeler's Demon.
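The claim that black holes have a temperature can even be put to numbers. A sketch using the standard Hawking-temperature formula T = hbar c^3 / (8 pi G M k_B), with rounded CODATA constants (the solar-mass example is an illustration, not from the notes):

```python
import math

# Physical constants (SI units, rounded CODATA values).
hbar = 1.0546e-34    # reduced Planck constant, J s
c = 2.9979e8         # speed of light, m/s
G = 6.6743e-11       # gravitational constant, m^3 kg^-1 s^-2
k_B = 1.3807e-23     # Boltzmann constant, J/K
M_sun = 1.989e30     # solar mass, kg

# Hawking temperature of a black hole of mass M.
def hawking_temperature(mass_kg):
    return hbar * c**3 / (8 * math.pi * G * mass_kg * k_B)

# A solar-mass black hole is astonishingly cold -- roughly 6e-8 K,
# far below the cosmic microwave background, but not zero.
print(hawking_temperature(M_sun))
```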
Speaking of Heisenberg's Uncertainty Principle, it too is related to our theme of physics and information.
While statistical mechanics is about what you don't know, the Uncertainty Principle is about what
you can't know.
We ended this very abstract topic by mentioning Lord Kelvin and his famous speech
"Nineteenth-Century Clouds over the Dynamical Theory of Heat and Light" where he discussed the
success of 19th Century physics but raised two clouds. One cloud involves an issue with light (really
an electromagnetic wave) and will lead to Relativity Theory. The other cloud involves the
distribution of radiation energy (some thermodynamics) and will lead to Quantum Mechanics.