If any of you were wondering where we had all gone this past week, there’s a simple answer: Maui! More than half of the contributors to this blog were far far away from the internet at “Subfactors in Maui.” This was a relentlessly laid-back conference organized by Vaughan Jones (which means it was actually organized by our very own Scott M), entirely dominated by Berkeleyites; a little internet research shows that every mathematician there either received their Ph.D., did a postdoc, or is a faculty member at Berkeley.

I’m afraid we didn’t really keep up our previous standards of talk blogging, but I’ll plead lack of internet and a somewhat exotic collection of material. I can’t hope to do justice to the talks on subfactors (given by Vaughan Jones, Emily Peters, Dietmar Bisch, and Pinhas Grossman), though they were very interesting to me as someone who’s generally been faking it when it comes to subfactors (though I do REALLY want to know what arithmetically equivalent subfactors are), nor to Laurent Bartholdi’s talk on automatic groups and subfactors (though maybe I should be able to, considering that I took a class on automatic groups with Laurent four years ago when he was at Berkeley). That leaves my talk, which was basically on material I’ve already covered, Scott M’s talk, which I’ll let him cover in his own sweet time, and the talks of the Station Q denizens Mike Freedman and Kevin Walker.

Mike’s talk on nonabelian statistics was probably the most interesting to the audience around here (and quite interesting for me, though I’ll admit, at times it strained my understanding of mathematical physics).

Of course, before there’s any discussion of non-abelian statistics, we should probably know what abelian statistics are. The way a bunch of identical particles of any given type like to arrange themselves in energy levels is known as their “statistics,” and it’s been known for a long time that these statistics depend on what happens to the wavefunction of said mass of particles when you swap two of these particles. Of course, you’ll get out a wavefunction with the same norm (since the initial and final configurations are the same), but if the wavefunction does something like pick up a sign, this will affect how these particles entangle with each other.

The simplest thing you might expect is no change. Particles that do this are called bosons (because the statistics they satisfy is called Bose-Einstein). For example, photons (and, in fact, all gauge bosons) fall into this class.

The next simplest thing that could happen is that you pick up a negative sign for every switch. These particles are called fermions (they have Fermi-Dirac statistics), and include particles like electrons and quarks (if you’re wondering where protons are, remember that protons are not elementary particles, but rather agglomerations of quarks).

In the standard model, these are all that show up, and the spin-statistics theorem says that the bosons and fermions are exactly the particles of integer and half-integer spin, respectively (you’ll often see this taken as the definition; I honestly have no idea which is better, but that is a point far beyond the scope of this post).
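In symbols (my summary, not anything from the talks): writing the two-particle wavefunction as ψ(x₁, x₂), the swap picture above looks like

```latex
% Exchanging two identical particles multiplies the wavefunction by a phase:
\psi(x_2, x_1) = e^{i\theta}\,\psi(x_1, x_2)
% Bosons:   \theta = 0,   i.e.  \psi(x_2, x_1) = +\psi(x_1, x_2)
% Fermions: \theta = \pi, i.e.  \psi(x_2, x_1) = -\psi(x_1, x_2)
% In three spatial dimensions these are the only possibilities; in two
% dimensions any phase e^{i\theta} can occur (such particles are "anyons").
```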

One of the proposals for how people might create a quantum computer is to find some “particles” (you can call them “excitations” if you’re uncomfortable labeling things that don’t fit in the standard model as “particles”) that don’t follow the spin-statistics theorem, which gives you a better chance of actually computing with them without too much noise seeping in (this is called “topological error correction” and is one of the things under study at Station Q).

Now, I’ll just note that we’re not talking about contradicting the standard model here, but rather about some kind of funny system where, out of the rules of the standard model, we get a wavefunction for some mass of entangled states that obeys non-abelian statistics.
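To pin down what “non-abelian” means here, a sketch in the standard anyon formalism (the textbook picture, not necessarily how Mike phrased it): instead of a mere phase, an exchange acts by a unitary matrix on a degenerate space of states, and the matrices for different exchanges need not commute.

```latex
% For n anyons at fixed positions, the system has a degenerate space V of
% ground states. Exchanging anyons i and i+1 acts by a unitary U_i on V:
\psi \;\longmapsto\; U_i\,\psi, \qquad U_i \in U(V).
% "Non-abelian" means U_i U_j \neq U_j U_i in general: the final state
% depends on the order in which the exchanges are performed, i.e. on the
% braid they trace out. Equivalently, the statistics is a unitary
% representation of the braid group:
\rho : B_n \to U(V), \qquad \sigma_i \longmapsto U_i.
```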

What’s remarkable is that such physical systems seem to have actually been discovered, in something called the “fractional quantum Hall effect.” Basically, one takes a crystal of gallium arsenide and interfaces it with a similar crystal in which some of the gallium has been replaced by aluminum. For whatever reason, these crystals are very close in structure, so they meet with almost no distortion, but the difference in composition allows us to create a chemical potential that traps a bunch of electrons at the interface.

So we have a bunch of electrons trapped in this two-dimensional world, and while the experimental confirmation is still being worked on, it sure appears that some non-abelian statistics show up when one fills up certain fractions of the energy levels of this whole interface (which means billions of electrons). The most popular fractions seem to be 1/3, 5/2, 8/5 and 13/6.

But this is not just any non-abelian statistics. It’s actually widely believed to be Chern-Simons theory for SU(2) at various levels. This is rather exciting (or should be) for knot-theorists, since expectation values in this model are Jones polynomials evaluated at roots of unity, which for big enough roots of unity (a 5th root is big enough, I think) is enough information to do quantum computing.
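For the record, here is roughly how the Jones polynomial enters (my hedged summary of the standard dictionary, so take the normalizations with a grain of salt):

```latex
% SU(2) Chern-Simons theory at level k assigns to a link L -- say, the
% worldlines traced out by braiding the anyons around each other -- an
% expectation value which, up to normalization, is the Jones polynomial
% of L evaluated at a root of unity:
\langle L \rangle \;\sim\; V_L(q), \qquad q = e^{2\pi i/(k+2)}.
% Freedman-Larsen-Wang showed the associated braid group representations
% are dense in the unitary group for k = 3 and all k \geq 5, which is
% what makes these theories universal for quantum computation.
```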

For me (someone who is not THAT excited about quantum computers) this just serves as an incredible demonstration of what Wigner called “the unreasonable effectiveness of mathematics in the natural sciences.” Mathematicians have studied Jones polynomials and quantum groups because they seemed interesting (admittedly, with some inspiration from physics at times), and now, shockingly, they turn out to have been describing bizarre physical systems the whole time. Who knew?

Not to mention that it looks good on grant applications :)

Some minor corrections:

In the upper layer of the semiconductor, the aluminum replaces the gallium rather than the arsenic – they are both group III elements, while As is group V. You can find an explanation of this in, e.g., the Britney Spears guide to semiconductor physics (really quite good – search for it).

People seem to like the 12/5 Landau level, because it is conjectured to yield some kind of universality with respect to quantum computation.