Thursday, July 10, 2014

Can I get LaTeX to Work In Blogger

Linky.

Einstein's most famous equation is \( E=mc^2 \).

Edit: Rats! Round 2:
Linky.

Is $$\LaTeX$$ up yet? It is!
A (discrete) random variable can be described by a list of numbers \( p_i \) that are non-negative and sum to 1. Each index i represents a possible state, and \( p_i \) is the probability of being in that state. Statistical Entropy is then defined by the following equation:
 \[ S = -k_B \sum_i p_i \log(p_i) \]
where we adopt the convention \( 0 \log(0) = 0 \). This is a measure of how random, how uniform, a random variable - which is nothing but an indexed list of the probability of every possible outcome - is. It is a special measure: it is the only one that has certain desirable traits. A bachelor pad consists of a variety of locations for dishes. At first all the dishes are clean and put away. In fact, since all the dishes are in one location, \( p_{\text{that location}} = 1 \) and every other \( p_i = 0 \), so by our convention and the fact that \( 1 \cdot \log(1) = 0 \), S = 0. All the dishes can be found trivially; they are all right there. As time passes and our bachelor fails to clean, dishes start traveling around the apartment. The probability of finding a dish in a strange location starts to rise. The entropy increases. I stress that this is not a metaphor: in this model of cleanliness there really is an entropy and it really does increase. We might expect that as time goes on the dishes get moved around the whole of the apartment (and entropy is therefore maximized), but this part of the example is metaphor. Unless the man is, even by bachelor standards, lazy and dirty, entropy could stall before being maximized.
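
To make the dish example concrete, here is a small Python sketch (mine, not part of the original post; the locations and probabilities are invented for illustration) that computes statistical entropy with the \( 0 \log(0) = 0 \) convention and \( k_B \) set to 1:

import math

def entropy(p, k_B=1.0):
    # S = -k_B * sum_i p_i * log(p_i), using the convention 0*log(0) = 0.
    return -k_B * sum(p_i * math.log(p_i) for p_i in p if p_i > 0)

# All dishes put away in one cupboard: entropy is zero.
print(entropy([1.0, 0.0, 0.0, 0.0]))     # 0.0

# Weeks later the dishes have wandered: entropy has risen.
print(entropy([0.4, 0.3, 0.2, 0.1]))     # about 1.28

# Dishes equally likely to be anywhere: entropy is maximal, log(4), about 1.386.
print(entropy([0.25, 0.25, 0.25, 0.25]))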
This is the plain meaning of statistical entropy. Unless special care is taken, a random variable (in the above example, the man himself) will tend to spread things out, and that spreading is the amount of entropy in that variable. This would be important enough even if it were a purely statistical object. One could, for instance, measure inequality in a way that has desirable properties, such as decomposing into contributions from each region, so that one can search for where the inequality in an unequal country really lives (in the farms? in the cities? etc.). There is no connection between such an index and thermodynamics. Statistical entropy is its own beast.
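
For the curious, here is a sketch (again mine, with made-up numbers) of one such decomposable index, the Theil index, which is built directly out of the entropy formula; the total splits exactly into a within-region part and a between-region part:

import math

def theil(incomes):
    # Theil T index: (1/N) * sum_i (y_i / mu) * log(y_i / mu), with mu the mean income.
    n = len(incomes)
    mu = sum(incomes) / n
    return sum((y / mu) * math.log(y / mu) for y in incomes) / n

def theil_decomposition(groups):
    # groups: dict mapping region name -> list of incomes.
    # Returns (within, between); within + between equals the Theil index of the pooled data.
    pooled = [y for ys in groups.values() for y in ys]
    total = sum(pooled)
    mu = total / len(pooled)
    within = sum((sum(ys) / total) * theil(ys) for ys in groups.values())
    between = sum((sum(ys) / total) * math.log((sum(ys) / len(ys)) / mu)
                  for ys in groups.values())
    return within, between

# Hypothetical incomes by region, purely for illustration.
regions = {"farms": [10, 12, 11, 9], "cities": [30, 80, 50, 40]}
within, between = theil_decomposition(regions)
print(within + between)                                  # equals the pooled index below
print(theil([y for ys in regions.values() for y in ys]))

Running it shows which part dominates, which is exactly the "farms or cities" question from above.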

Statistical entropy is named after a physical property, the entropy of a system.

In the splashes of a wave, one can see a lot of movement that doesn't really go anywhere. Physical entropy is a measure of how much energy goes into this. The concept of physical entropy was discovered in the context of studying steam engines. An engineer/physicist named Sadi Carnot realized that in real machines there is always lost work. He developed a model of a machine with no such losses, and this is the most efficient machine possible (simple "proof": if it were not, use the more efficient machine to drive a Carnot machine in reverse; running the pair in a cycle would extract net work from heat alone, making perpetual motion, which is impossible).
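
The post doesn't quote the bound that comes out of Carnot's argument, but for reference (a standard result, not from the original text): an ideal engine run between a hot reservoir at temperature \( T_h \) and a cold one at \( T_c \) converts at most a fraction
 \[ \eta_{\text{Carnot}} = 1 - \frac{T_c}{T_h} \]
of the heat it draws into work; every real engine, with its losses, does worse.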

The remarkable thing about statistical and physical entropy is that they turn out to be the same thing - or, more correctly, physical entropy can be well modeled as an application of statistical entropy. There is confusion here: because this is by far the most important application of statistical entropy, many believe the two concepts are the same, but again, statistical entropy can be and is used in contexts without any thermodynamic meaning whatsoever. In physical thermodynamics, we have every reason to expect the entropy of a system to increase over time - but in the non-physical applications of statistical entropy this must be proven all over again, and it may simply not be true. If the above income inequality index were subject to entropy maximization, then total income equality would never be more than a breath away. If the man's apartment were, he would be sleeping on knives and forks. These concepts are different and must be differentiated. That they are related makes this all the more important.

Relatedly, Paul Samuelson wrote a remarkable article on thermodynamics that I currently do not have time to post on. Maybe next time!
