I’d like to make another attempt at a topic I handled badly before: How Legendre duality shows up in statistical mechanics (or, at least, toy models thereof).

We are going to be considering systems with $N$ parts, and asking how many states they can be in. The answers will be exponential in $N$, and all that we care about is the base of that exponent. For example, the number of ways to partition an $N$-element set into two sets of size $N/2$ (if $N$ is even) is $\binom{N}{N/2}$, which Stirling's formula shows to be $\approx 2^N \sqrt{2/(\pi N)}$. All we will care about is that $\binom{N}{N/2} \approx 2^N$.
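To make the "only the base matters" point concrete, here is a small numerical sketch (my addition, using Python's exact `math.comb`): the $N$-th root of the central binomial coefficient $\binom{N}{N/2}$ should creep up toward $2$, because the polynomial correction from Stirling's formula vanishes in the $N$-th root.

```python
from math import comb

# The N-th root of C(N, N/2) should approach 2 as N grows;
# the polynomial factor sqrt(2/(pi N)) disappears in the N-th root.
for N in [10, 100, 1000]:
    base = comb(N, N // 2) ** (1 / N)
    print(N, base)
```

The printed bases increase toward $2$ as $N$ grows.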

How many ways can we partition an $N$-element set into a set of size roughly $pN$ and a set of size roughly $(1-p)N$? More mucking around with Stirling's formula will get us the answer
$$\left( \frac{1}{p^p (1-p)^{1-p}} \right)^N.$$

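Here is the same kind of numerical sketch (my addition) for general $p$: Stirling's formula says the $N$-th root of $\binom{N}{pN}$ should approach $1/\left(p^p (1-p)^{1-p}\right)$.

```python
from math import comb, log, exp

# Compare the N-th root of C(N, pN) with the predicted base
# 1 / (p^p (1-p)^(1-p)), written as exp(-p log p - (1-p) log(1-p)).
p, N = 0.3, 1000
actual = comb(N, round(p * N)) ** (1 / N)
predicted = exp(-(p * log(p) + (1 - p) * log(1 - p)))
print(actual, predicted)  # the two bases are close
```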
I’ll show you a non-rigorous way to get that answer without getting into the details of Stirling’s formula.

Let’s suppose there is a function $h(p)$ so that the number of ways to partition an $N$-element set into a piece of size $pN$ and a piece of size $(1-p)N$ is roughly $e^{h(p) N}$. Let’s look at the generating function
$$\sum_{k=0}^{N} \binom{N}{k} x^k = (1+x)^N.$$

Then we expect
$$\sum_p e^{h(p) N} x^{pN} \approx (1+x)^N.$$

For fixed $x$, there is presumably some $p$ which maximizes $e^{h(p) N} x^{pN} = e^{(h(p) + p \log x) N}$. The terms coming from near that $p$ will overwhelm the others, so we should have
$$e^{\max_p (h(p) + p \log x) N} \approx (1+x)^N,$$

or
$$\max_p \left( h(p) + p \log x \right) = \log(1+x).$$

Set $\beta = \log x$, and write $g(\beta) = \log(1+e^{\beta})$ for the right hand side. So we should have
$$\max_p \left( h(p) + \beta p \right) = g(\beta).$$

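Here is a numerical sketch (my addition) of that max formula. Taking $h(p) = -p\log p - (1-p)\log(1-p)$, which is the logarithm of the base in the Stirling count above, the maximum of $h(p)+\beta p$ over a fine grid of $p$ should match $\log(1+e^{\beta})$.

```python
import math

def h(p):
    # log of the Stirling base: h(p) = -p log p - (1-p) log(1-p)
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

beta = 0.7
grid = (k / 10000 for k in range(1, 10000))   # p ranges over (0, 1)
lhs = max(h(p) + beta * p for p in grid)
rhs = math.log(1 + math.exp(beta))
print(lhs, rhs)  # both close to 1.103
```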
In other words, we expect $h(p)$ and $g(\beta) = \log(1+e^{\beta})$ to be Legendre dual. In particular, we expect to have all of the following formulas:
$$g(\beta) = \max_p \left( h(p) + \beta p \right), \qquad h(p) = \min_{\beta} \left( g(\beta) - \beta p \right), \qquad \frac{dg}{d\beta} = p, \qquad \frac{dh}{dp} = -\beta.$$

To spell the last one out in words, $-\frac{dh}{dp}$ is a function of $p$ and $\frac{dg}{d\beta}$ is a function of $\beta$; they should be inverse functions. (Keeping the signs straight is one of the real nuisances in this subject.)
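A small sketch (my addition) of that inverse-function statement for the example at hand, with $g(\beta)=\log(1+e^{\beta})$ and $h(p)=-p\log p-(1-p)\log(1-p)$: composing $\beta \mapsto dg/d\beta$ with $p \mapsto -dh/dp$ should return $\beta$.

```python
import math

def dg(beta):
    # derivative of g(beta) = log(1 + e^beta): the logistic function
    return math.exp(beta) / (1 + math.exp(beta))

def minus_dh(p):
    # minus the derivative of h(p) = -p log p - (1-p) log(1-p): the logit
    return math.log(p / (1 - p))

beta = 1.3
print(minus_dh(dg(beta)))  # recovers 1.3, up to floating point
```

The logistic and logit functions are exact inverses, so the round trip is exact up to rounding.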

In this case, we can compute $h$ explicitly. We have $g(\beta) = \log(1+e^{\beta})$, so $\frac{dg}{d\beta} = \frac{e^{\beta}}{1+e^{\beta}}$, so $p = \frac{e^{\beta}}{1+e^{\beta}}$. We use the relation that the derivatives of Legendre dual functions are inverse. Inverting gives $\beta = \log \frac{p}{1-p}$, so
$$\frac{dh}{dp} = -\log \frac{p}{1-p} = \log \frac{1-p}{p},$$

and we can compute the integral to get
$$h(p) = -p \log p - (1-p) \log(1-p).$$

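Finally, a numerical sketch (my addition) of the integration step: integrating $\frac{dh}{dp} = \log\frac{1-p}{p}$ from $0$ to $p$, with $h(0)=0$, should reproduce $h(p) = -p \log p - (1-p)\log(1-p)$.

```python
import math

def h_closed(p):
    # the claimed antiderivative: -p log p - (1-p) log(1-p)
    return -p * math.log(p) - (1 - p) * math.log(1 - p)

# midpoint-rule integral of h'(t) = log((1-t)/t) from 0 to p
p, n = 0.3, 100000
dt = p / n
integral = sum(
    math.log((1 - (k + 0.5) * dt) / ((k + 0.5) * dt)) * dt for k in range(n)
)
print(integral, h_closed(p))  # both about 0.6109
```

The integrand blows up logarithmically at $t=0$, but that singularity is integrable, so the midpoint rule still converges.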
I’m trying to learn to write shorter posts, so I’ll stop here for now. Next time, we compute the shape of a random partition of $N$.

