tomclegg.net
Entropy

Posted December 12, 2000

Mostly Mozart is sponsored by Comfort and Joy, a unique clothing store.

Today, for reasons which should be obvious, I'm going to talk about heat. The latest theory of heat is called thermal physics, or, as some people prefer, statistical mechanics. It stems from quantum physics, and it explains why the laws of thermodynamics exist.

You probably already know the laws of thermodynamics, even if you don't recognize the name. They're numbered from zero to three. The zeroth law says that if you have three objects, A, B, and C, and A and B are at the same temperature, and B and C are also at the same temperature, then A and C are at the same temperature. If you think you missed something because that sounded too obvious, then you understand perfectly. A=B... B=C... A=C.

Here's Michael Flanders and Donald Swann, with "First and Second Law."

--- 1:00 --- flanders & swann

That was Flanders and Swann with a song called "First and Second Law." They were right about the first law: heat is work and work is heat. Heat is a way to transfer energy from one place to another, and so is work. But there is a difference between heat and work, and the difference is entropy.
Entropy is disorder. Randomness. Number of possible configurations.
Now we're getting into thermal physics a little. In thermal physics,
entropy is the number of possible ways to rearrange the energy.
According to quantum physics, energy comes in packets of finite size.
So there is a finite number of ways to rearrange the energy in a given
system. The laws of thermodynamics were invented before quantum
physics, so nobody knew exactly what entropy was. It was just a
special number that they could calculate and use to predict things.
One thing they noticed was that this number never decreases. And this is exactly what the second law says: entropy never decreases. This has several implications. One is that you can't move heat from the cooler to the hotter. Another is that you can't convert heat into work without losing some along the way. The Kelvin-Planck formulation puts it this way: "It is impossible for any cyclic process to occur whose sole effect is the extraction of heat from a reservoir and the performance of an equivalent amount of work." This would be called perpetual motion of the second kind, if it were possible.
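This counting of energy arrangements can be made concrete with a few lines of Python. The sketch below uses a simple oscillator model of my own choosing (not something from the broadcast): the number of ways to spread q identical energy quanta over N oscillators is the standard "stars and bars" count C(N+q-1, q). When two small systems share a fixed amount of energy, almost all of the combined arrangements belong to the near-even splits, which is the counting fact behind "entropy never decreases."

```python
from math import comb

def multiplicity(N, q):
    """Ways to distribute q identical energy quanta among N oscillators
    (the "stars and bars" count: C(N + q - 1, q))."""
    return comb(N + q - 1, q)

# Two tiny systems, 3 oscillators each, sharing 6 quanta of energy.
# Count the combined arrangements for every way of splitting the energy.
N_A = N_B = 3
q_total = 6
for q_A in range(q_total + 1):
    ways = multiplicity(N_A, q_A) * multiplicity(N_B, q_total - q_A)
    print(f"A holds {q_A} quanta: {ways} arrangements")
```

Running this shows the even split (3 quanta each) has 100 arrangements, while the extreme splits (0 or 6 quanta) have only 28 apiece; a system wandering at random among equally likely arrangements therefore spends most of its time near the balanced, high-entropy splits.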
The third law states that entropy approaches a constant value as
temperature reaches absolute zero, which is -273.15 degrees Celsius.
The entropy difference between different energy arrangements
disappears at extremely low temperatures. This seems to imply that being very close to absolute zero is practically the same as being at absolute zero.

Of course, the laws of thermodynamics are obsolete, now that we have thermal physics. So don't worry too much about the third law. It wasn't in the song anyway.

The lights in the station are flickering, to indicate that the intermission is almost over, so I'll come back to thermal physics after the next set. Mostly Mozart is sponsored by Comfort and Joy, a unique children's store. Here's the rest of that horn concerto that you heard at the beginning of the show: James Sommerville with the CBC Vancouver orchestra.

--- 2:30 --- James Sommerville 6

You're listening to CJLY Nelson 93.5 fm. Mostly Mozart is sponsored by Comfort and Joy, a unique children's store.

--- --- James Sommerville 7, 8

That was James Sommerville with the CBC Vancouver orchestra. You're listening to Mostly Mozart.

I said earlier that according to quantum physics, particles can only absorb discrete amounts of energy. You can't add or take away half a quantum of energy; you can only add zero, or one, or two, and so on. This means that there is a finite number of ways to rearrange the energy in a system of particles. The fundamental assumption of thermal physics is that each of these arrangements is equally likely. The rest is just math.

Well, it's a little worse than that. It's not just math, it's statistics. And that would explain the term "statistical mechanics," as well as being an effective way to discourage people from studying it.

It turns out that if you have a huge number of possibilities, each of which is equally likely, you can still find some distinct patterns. For example, if you toss 10,000 pennies and count how many come up heads, you can safely bet the whole $100 that you'll toss at least 4000 heads and at least 4000 tails. The trick is that while there is an insanely large number of possible outcomes, almost all of them involve between 4000 and 6000 pennies coming up heads.
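The penny bet is easy to check by simulation. Here's a minimal sketch (the number of repeated trials is my own choice, not from the broadcast):

```python
import random

def heads_in_toss(n=10_000):
    """Toss n fair pennies and count how many come up heads."""
    return sum(random.random() < 0.5 for _ in range(n))

# One standard deviation for 10,000 fair coins is sqrt(10000)/2 = 50 heads,
# so fewer than 4000 heads would be a 20-sigma fluke.
trials = [heads_in_toss() for _ in range(100)]
print(min(trials), max(trials))  # both land very close to 5000
```

However many times you rerun it, every trial hugs 5000: the individual outcomes are wildly numerous, but nearly all of them look the same from the outside.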
There are millions of different ways to toss 5000 heads, but there is only one way to toss 0 heads. So tossing 5000 heads is a million times more likely than tossing 0 heads. In thermal physics, you can do the same thing: if you assume that each arrangement of energy is equally likely, then you will still find that the total outcome, as viewed from outside, is very predictable.

When I mentioned entropy earlier, I said that 19th century physicists used the word without knowing what it meant. And what does it mean? Entropy is the logarithm of the number of possible arrangements.

What's a logarithm? Ah... good question. A logarithm is how many times you have to multiply by 10 before you arrive at a number. Or you might be multiplying by 2, or some other base, but in any case the nice thing about using logarithms is that they can be added together, in cases where they would otherwise have to be multiplied.

And why would you want to multiply the number of possible arrangements of energy? Well, say you have two very small objects at different temperatures. The hotter one has 100 ways to rearrange its energy, and the colder one has only 10 possibilities. For each of the 100 arrangements of the hot object, there is a choice of 10 arrangements for the cold object. So between the two objects, there are 100 x 10 which is 1000 possibilities. In other words, 10^2 * 10^1 = 10^3. Speaking in logarithms, 2 + 1 = 3.
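The 100-ways and 10-ways example works out like this in Python (math.log10 is the base-10 logarithm; isclose guards against floating-point rounding):

```python
from math import log10, isclose

hot_ways = 100   # arrangements available to the hotter object
cold_ways = 10   # arrangements available to the colder object

combined_ways = hot_ways * cold_ways   # 100 * 10 = 1000

sigma_hot = log10(hot_ways)            # 2.0
sigma_cold = log10(cold_ways)          # 1.0
sigma_combined = log10(combined_ways)  # 3.0

# Multiplicities multiply; entropies (their logarithms) add.
assert isclose(sigma_hot + sigma_cold, sigma_combined)
print(sigma_hot, "+", sigma_cold, "=", sigma_combined)
```

That additivity is the whole point of taking the logarithm: the entropy of two objects together is just the sum of their separate entropies.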
Now, if you put the two objects together, the energy will rearrange itself, and the most likely result is whichever split of energy between the two objects allows the largest total number of arrangements. So, entropy increases, and this causes thermal equilibrium.

What's the difference between hot and cold things, according to this theory? Consider an ice cube in a bowl of soup. We know that the soup is going to give energy to the ice cube, because Flanders and Swann told us so, and we also know that the total entropy is going to increase. So, we'll say the soup gives up one unit of energy. Obviously there will now be fewer ways to rearrange the energy in the soup, since there is less energy available. So we'll say that the soup loses one unit of entropy as a result. Now that one unit of energy is absorbed by the ice cube, and in order for the total entropy to increase, the ice cube's entropy has to increase by more than one unit.

And that's the difference between hot and cold: the same amount of energy change causes a bigger change in entropy in a cold thing than it does in a hot thing. So if U is energy, small sigma is entropy, and tau is fundamental temperature, and you know what a partial derivative is, then you won't be at all surprised to hear that 1/tau = dsigma/dU.
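The 1/tau = dsigma/dU relation can be seen in a sketch using the same kind of oscillator-counting model as before (an illustrative model of my own, not the broadcast's): take entropy as the log of the arrangement count, and approximate 1/tau as the entropy gained by absorbing one more quantum.

```python
from math import comb, log

def entropy(N, q):
    """sigma = natural log of the ways to spread q quanta over N oscillators."""
    return log(comb(N + q - 1, q))

def inverse_temperature(N, q):
    """Finite-difference stand-in for 1/tau = d(sigma)/dU, with U in quanta."""
    return entropy(N, q + 1) - entropy(N, q)

# Same-sized systems: one energy-poor ("cold"), one energy-rich ("hot").
cold = inverse_temperature(50, 10)
hot = inverse_temperature(50, 1000)
print(cold, hot)  # the cold system gains far more entropy per quantum
```

In this toy model the cold system's entropy jumps by about 1.7 per absorbed quantum, the hot one's by only about 0.05; that asymmetry is exactly why the ice cube, not the soup, wins the entropy trade, and why 1/tau (large for cold, small for hot) is the natural definition.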