Legendre duality and statistical mechanics

I’d like to make another attempt at a topic I handled badly before: how Legendre duality shows up in statistical mechanics (or, at least, toy models thereof).

We are going to be considering systems with {n} parts, and asking how many states they can be in. The answers will be exponential in {n}, and all that we care about is the base of that exponent. For example, the number of ways to partition an {n} element set into two sets of size {n/2} (if {n} is even) is {\binom{n}{n/2}}, which Stirling’s formula shows to be {\approx 2^n \sqrt{2/(\pi n)} = \exp(n (\log 2 + o(1) ))}. All we will care about is that {\log 2}.

How many ways can we partition an {n} element set into a set of size roughly {n/3} and a set of size roughly {2n/3}? More mucking around with Stirling’s formula will get us the answer

\displaystyle \exp\left (n \left(- \frac{1}{3} \log \frac{1}{3} - \frac{2}{3} \log \frac{2}{3} + o(1) \right) \right).
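
Both rates are easy to check numerically. Here is a quick Python sketch (my own, not from the post) computing {\frac{1}{n} \log \binom{n}{k}} for a large {n}:

```python
# Numerical check of the two exponential rates claimed above.
from math import comb, log

def rate(n, k):
    """(1/n) * log C(n, k): the base of the exponent, which is all we care about."""
    return log(comb(n, k)) / n

# Splitting into two halves: the rate approaches log 2.
print(rate(3000, 1500), log(2))

# Splitting roughly 1/3 : 2/3: the rate approaches -1/3 log 1/3 - 2/3 log 2/3.
entropy = -(1 / 3) * log(1 / 3) - (2 / 3) * log(2 / 3)
print(rate(3000, 1000), entropy)
```

The {\sqrt{n}} corrections are visible in the third decimal place at {n = 3000}, consistent with their being {o(1)} in the exponent.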


I’ll show you a non-rigorous way to get that answer without getting into the details of Stirling’s formula.

Let’s suppose there is a function {h(p)} so that the number of ways to partition an {n} element set into a piece of size {pn} and a piece of size {(1-p)n} is roughly {\exp(n h(p))}. Let’s look at the generating function

\displaystyle Z_n(x) = \sum_{k=0}^n \binom{n}{k} e^{xk}.

Then we expect

\displaystyle Z_n(x) \approx \sum_{k=0}^{n} \exp\left(n h\!\left(\frac{k}{n} \right) + x n \frac{k}{n} \right).


For fixed {x}, there is presumably some {p_0} which maximizes {h(p) + p x}. The terms coming from {k/n} near that {p_0} will overwhelm the others (there are only {n+1} terms in total, so the rest contribute at most a polynomial factor, which is invisible at the exponential scale), so we should have

\displaystyle Z_n(x) \approx \exp( n \max_{0 \leq p \leq 1} (h(p) + p x))

 or

\displaystyle \frac{1}{n} \log Z_n(x) \approx \max_{0 \leq p \leq 1} (h(p) + p x).

Set {z(x) = \lim_{n \rightarrow \infty} \frac{1}{n} \log Z_n(x)}. So we should have

\displaystyle z(x) = \max_{0 \leq p \leq 1} (h(p) + p x).
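
We can test this maximum principle numerically without knowing {h} in closed form, by estimating {h(p)} as {\frac{1}{m} \log \binom{m}{pm}} for a large {m}. A sketch (my code, not the post’s):

```python
# Compare (1/n) log Z_n(x) against max_p (h(p) + p x), estimating
# h(p) ~ (1/m) log C(m, pm) directly from binomial coefficients.
from math import comb, exp, log

def z_empirical(n, x):
    """(1/n) log Z_n(x), where Z_n(x) = sum_k C(n, k) e^{xk}."""
    return log(sum(comb(n, k) * exp(x * k) for k in range(n + 1))) / n

def z_variational(x, m=2000):
    """max over p = k/m of (estimated h(p)) + p x."""
    return max(log(comb(m, k)) / m + (k / m) * x for k in range(m + 1))

for x in (-1.0, 0.0, 0.5):
    print(x, z_empirical(500, x), z_variational(x))
```

Both columns agree to a couple of decimal places, as expected when the error terms are {o(1)} in the exponent.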

In other words, we expect {z} and {h} to be Legendre dual. In particular, we expect to have all of the following formulas:

\displaystyle z(x) = \max_{0 \leq p \leq 1} (h(p) + p x) \quad h(p) = \min_{x} (z(x) - p x).


\displaystyle \frac{- \partial h}{\partial p} \circ \frac{\partial z}{\partial x} = \mathrm{Id} \quad \frac{\partial z}{\partial x} \circ \frac{- \partial h}{\partial p} = \mathrm{Id}.


To spell the last one out in words, {- \frac{\partial h}{\partial p}} is a function of {p} and {\frac{\partial z}{\partial x}} is a function of {x}; they should be inverse functions. (Keeping the signs straight is one of the real nuisances in this subject.)

In this case, we can compute {z(x)} explicitly. By the binomial theorem, {Z_n(x) = (1+e^x)^n}, so {z(x) = \lim_{n \rightarrow \infty} \frac{1}{n} \log (1+e^x)^n = \log (1+e^x)} and {\frac{\partial z}{\partial x} = \frac{e^x}{1+e^x}}. We now use the relation that {\frac{\partial z}{\partial x}} and {- \frac{\partial h}{\partial p}} are inverse functions. Inverting {p = \frac{e^x}{e^x+1}} gives {x = \log p - \log (1-p)}, so

\displaystyle \frac{\partial h}{\partial p} = - \log p + \log (1-p)

 and integrating (with the constant of integration fixed by {h(0) = 0}) gives

\displaystyle h(p) = - p \log p - (1-p) \log (1-p).
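
With both closed forms in hand, one can check the whole dual pair numerically: recover {h} from {z} by the min formula, and confirm that the two derivatives are inverse functions. A sketch using the explicit formulas above (the brute-force grid search is my own device):

```python
# Check the Legendre pair z(x) = log(1 + e^x), h(p) = -p log p - (1-p) log(1-p).
from math import exp, log

def z(x):
    return log(1 + exp(x))

def h(p):
    return -p * log(p) - (1 - p) * log(1 - p)

def h_from_z(p, lo=-30.0, hi=30.0, steps=60001):
    """h(p) = min_x (z(x) - p x), by brute-force search over a grid of x values."""
    step = (hi - lo) / (steps - 1)
    return min(z(lo + i * step) - p * (lo + i * step) for i in range(steps))

def dz_dx(x):        # z'(x) = e^x / (1 + e^x)
    return exp(x) / (1 + exp(x))

def minus_dh_dp(p):  # -h'(p) = log p - log(1 - p)
    return log(p) - log(1 - p)

for p in (0.1, 0.5, 0.9):
    print(p, h(p), h_from_z(p))      # the min formula recovers h(p)
    print(p, dz_dx(minus_dh_dp(p)))  # the composition gives back p
```

Note the sign: it is {-\frac{\partial h}{\partial p}}, not {\frac{\partial h}{\partial p}}, that inverts {\frac{\partial z}{\partial x}}.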

I’m trying to learn to write shorter posts, so I’ll stop here for now. Next time, we compute the shape of a random partition of {n}.

Comments

  1. Consider a fair 2^n-sided die. If I roll it and tell you what I got, you get n = -log_2(1/2^n) bits of information. If I paint half of it black and then roll it, then when it comes up black you only get 1 bit, but a number still gives n bits.

    Consider an unfair coin that comes up heads p of the time, tails 1-p. How much information do you expect to get when I tell you what I flipped? Answer: this h(p), assuming the logs are log_2.
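
The commenter’s claim is easy to verify: with base-2 logs, {h(p)} is exactly the expected number of bits. A quick sketch (my code):

```python
# Expected information (in bits) from a coin with heads-probability p:
# heads occurs p of the time and is worth -log2(p) bits, similarly tails.
from math import log2

def expected_bits(p):
    return p * (-log2(p)) + (1 - p) * (-log2(1 - p))

print(expected_bits(0.5))  # fair coin: exactly 1 bit
print(expected_bits(0.9))  # biased coin: less than 1 bit
```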
