Let $\mathbb{F}_q$ denote the finite field with $q$ elements. Let $\mathcal{F}$ denote the set of complete flags in $\mathbb{F}_q^n$. Such a flag is a chain $V_0 \subset V_1 \subset \cdots \subset V_n$ of subspaces of $\mathbb{F}_q^n$, with $\dim V_i = i$ and, in particular, $V_0 = 0$ and $V_n = \mathbb{F}_q^n$. For example, if $n = 3$, an element of $\mathcal{F}$ is a quadruple $(V_0, V_1, V_2, V_3)$, where $V_2$ is a plane through the origin and $V_1$ is a line through the origin, with $V_1$ contained in $V_2$. Of course, including $V_0$ and $V_3$ is redundant, as $V_0$ is always zero and $V_3$ is always $\mathbb{F}_q^3$, but we will see that it is convenient to have them around. Now consider the following operation: forget $V_i$ and replace it by one of the other possible $i$-planes lying between $V_{i-1}$ and $V_{i+1}$. There are $q$ such choices, since the quotient $V_{i+1}/V_{i-1}$ is a two-dimensional space containing $q+1$ lines, one of which corresponds to the original $V_i$; we pick uniformly at random among the $q$ alternatives. Note that we are forbidden to just put $V_i$ back where it was. We will denote this operation by $T_i$.
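To make the setup concrete, here is a minimal Python sketch (my own illustration; representing lines and planes by vectors over $\mathbb{F}_2$ is an implementation choice, not anything from the text above). It enumerates the complete flags in $\mathbb{F}_2^3$ and checks that each $T_i$ always has exactly $q = 2$ available moves.

```python
from itertools import product

q, n = 2, 3  # work in F_2^3

# Nonzero vectors of F_2^3: each one spans a distinct line,
# and each one is the normal functional of a distinct plane.
vecs = [v for v in product(range(q), repeat=n) if any(v)]

def dot(a, v):
    return sum(x * y for x, y in zip(a, v)) % q

# A complete flag is a pair (line, plane) with the line inside the plane.
flags = [(v, a) for v in vecs for a in vecs if dot(a, v) == 0]

# The number of complete flags in F_q^3 is (1)(1+q)(1+q+q^2).
assert len(flags) == 1 * (1 + q) * (1 + q + q * q)  # 21 when q = 2

def t1_choices(v, a):
    """Other lines lying in the plane a: the moves available to T_1."""
    return [w for w in vecs if dot(a, w) == 0 and w != v]

def t2_choices(v, a):
    """Other planes containing the line v: the moves available to T_2."""
    return [b for b in vecs if dot(b, v) == 0 and b != a]

# T_i always has exactly q choices: there are q+1 intermediate
# subspaces, and the original one is forbidden.
assert all(len(t1_choices(v, a)) == q for v, a in flags)
assert all(len(t2_choices(v, a)) == q for v, a in flags)
print(len(flags), "flags")
```

Over $\mathbb{F}_2$ every line contains a single nonzero vector and every plane is the kernel of a single nonzero functional, which is what makes this representation so compact.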
Let’s do an example: starting with a given flag, in the case $n = 3$, what is the effect of $T_1 T_2 T_1$? Note that we draw our pictures projectively, so we visualize a flag as a pair (point, line through the point) in the projective plane. Let’s call this pair $(p, \ell)$. The first step moves the point along the line to a new point $p'$. The second step pivots the line about the point $p'$ to a new line $\ell'$. And the final step slides the point along the pivoted line to its final resting place at $p''$.
Now, if we are given flags $F = (p, \ell)$ and $F'' = (p'', \ell'')$, what is the probability that $T_1 T_2 T_1$ takes us from $F$ to $F''$?
Exercise: This probability is zero if $\ell = \ell''$, or if any two of the points $p$, $p''$ and $\ell \cap \ell''$ coincide. If none of these unfortunate events occur, then the probability is $1/q^3$. (Hint: note that we must take $p' = \ell \cap \ell''$ to have any chance of success.)
Now, here is the interesting thing. Run through the same analysis for $T_2 T_1 T_2$. The intermediate stages of the process will look completely different, but the end effect is precisely the same: probability zero in the degenerate cases mentioned above, and probability $1/q^3$ the rest of the time.
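Both claims can be verified by exact linear algebra in the smallest case $q = 2$, $n = 3$. The sketch below (my own code; all names are mine) builds the $21 \times 21$ transition matrices of $T_1$ and $T_2$ and checks that the two three-step processes give literally the same matrix, with every entry equal to $0$ or $1/q^3$.

```python
from fractions import Fraction
from itertools import product

q, n = 2, 3
vecs = [v for v in product(range(q), repeat=n) if any(v)]
dot = lambda a, v: sum(x * y for x, y in zip(a, v)) % q
# Flags in F_2^3 as (line, plane) incident pairs.
flags = [(v, a) for v in vecs for a in vecs if dot(a, v) == 0]
idx = {f: k for k, f in enumerate(flags)}
N = len(flags)

def step_matrix(i):
    """Transition matrix of T_i: replace the line (i=1) or the plane
    (i=2) by one of the q other choices, uniformly at random."""
    M = [[Fraction(0)] * N for _ in range(N)]
    for (v, a), r in idx.items():
        if i == 1:
            moves = [(w, a) for w in vecs if dot(a, w) == 0 and w != v]
        else:
            moves = [(v, b) for b in vecs if dot(b, v) == 0 and b != a]
        for g in moves:
            M[r][idx[g]] += Fraction(1, q)
    return M

def matmul(A, B):
    return [[sum(A[r][k] * B[k][c] for k in range(N)) for c in range(N)]
            for r in range(N)]

T1, T2 = step_matrix(1), step_matrix(2)
P121 = matmul(matmul(T1, T2), T1)
P212 = matmul(matmul(T2, T1), T2)

assert P121 == P212                                   # identical processes
entries = {e for row in P121 for e in row}
assert entries == {Fraction(0), Fraction(1, q**3)}    # 0 or 1/q^3, as claimed
```

Exact rational arithmetic (`fractions.Fraction`) matters here: with floats the equality test between the two products could fail spuriously.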
So we can write the equation $T_1 T_2 T_1 = T_2 T_1 T_2$ and, more generally, $T_i T_{i+1} T_i = T_{i+1} T_i T_{i+1}$. If you are uncomfortable talking about an equality of “processes”, you may interpret $T_i$ as a linear operator on the space of probability distributions on $\mathcal{F}$. There are also two other relations between the $T_i$. The first, which is completely obvious, is that $T_i T_j = T_j T_i$ when $|i - j| \geq 2$. The second, which takes only a moment’s more thought, says that $T_i^2 = \frac{1}{q} + \frac{q-1}{q} T_i$. In other words, if we do $T_i$ twice, there is a $1/q$ chance that we will get back the $i$-plane we started with and a $(q-1)/q$ chance that we won’t. Noah Snyder and John Baez can talk for hours about how this shows up in the tensor category of representations of a quantum group, so I won’t. Instead, I’ll explain some of the consequences of this for combinatorialists. At the end, I’ll bring up a very natural question that Jim Propp asked me, to which I don’t have a good answer yet. If no one writes in with a good solution, then I’ll pull together a sequel explaining my failed attempts.
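The quadratic relation only involves the $q+1$ possible choices of the $i$-plane, with the rest of the flag held fixed, so it can be checked on a tiny $(q+1)$-state chain: $T_i$ restricted to one position is just the walk that jumps to a uniformly random *other* state. A sketch (mine; note the matrix identity holds for any integer $q \geq 2$, even though only prime powers come from honest finite fields):

```python
from fractions import Fraction

def check_quadratic(q):
    """On q+1 states, M jumps to a uniformly random *other* state.
    Check that M^2 = (1/q) I + ((q-1)/q) M holds exactly."""
    m = q + 1
    M = [[Fraction(0) if r == c else Fraction(1, q) for c in range(m)]
         for r in range(m)]
    M2 = [[sum(M[r][k] * M[k][c] for k in range(m)) for c in range(m)]
          for r in range(m)]
    rhs = [[Fraction(1, q) * (r == c) + Fraction(q - 1, q) * M[r][c]
            for c in range(m)] for r in range(m)]
    return M2 == rhs

assert all(check_quadratic(q) for q in (2, 3, 4, 5, 7, 8, 9))
```

The verification is one line of algebra, of course; the point of the code is just that the relation lives entirely on this small chain.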
All right, first of all, I have the words Hecke algebra in the title and I haven’t defined them yet. Let’s fix that right now: the Hecke algebra is the algebra of operators on probability distributions on $\mathcal{F}$ generated by $qT_1, qT_2, \ldots, qT_{n-1}$. That factor of $q$ is just there to get rid of the denominators. (Actually, $qT_i$ is also more natural than $T_i$ from the general Hecke perspective that John Baez and Jim Dolan are presenting over at the $n$-Category Café. But $T_i$ is both a little easier to think about, and a lot more natural for Jim’s question.) We can also describe the Hecke algebra as the algebra with generators $T_1$, $T_2$, …, $T_{n-1}$ and the relations: $T_i T_j = T_j T_i$ when $|i-j| \geq 2$, $T_i T_{i+1} T_i = T_{i+1} T_i T_{i+1}$, and $T_i^2 = (q-1) T_i + q$.
A priori, the Hecke algebra could be huge! One could imagine that it has dimension as large as $|\mathcal{F}|^2$. In fact, it only has dimension $n!$. (Also huge, but not as huge.) To see why this might be the case, let’s imagine plugging in $q = 1$. This doesn’t make sense if we use the definition in terms of operators on distributions on $\mathcal{F}$, but the definition in terms of generators and relations works just fine. Our relations now read $T_i T_j = T_j T_i$ when $|i-j| \geq 2$, $T_i T_{i+1} T_i = T_{i+1} T_i T_{i+1}$, and $T_i^2 = 1$. Let $s_i$ be the element of the permutation group $S_n$ which swaps $i$ and $i+1$, leaving $1$, $2$, …, $i-1$, $i+2$, …, $n$ where they are. Then the $s_i$ satisfy the same relations. It is a fundamental result in Coxeter theory that the group generated by $s_1$, $s_2$, …, $s_{n-1}$ subject to these relations is $S_n$. So, when $q = 1$, the Hecke algebra is isomorphic to the group algebra of $S_n$. In other words, if the relation $s_{i_1} s_{i_2} \cdots s_{i_k} = s_{j_1} s_{j_2} \cdots s_{j_l}$ holds in $S_n$, then the corresponding relation $T_{i_1} T_{i_2} \cdots T_{i_k} = T_{j_1} T_{j_2} \cdots T_{j_l}$ holds in the Hecke algebra (at $q = 1$).
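The claim that the transpositions $s_i$ satisfy these relations, and that they generate all of $S_n$, is easy to check by machine for a small $n$. A quick sketch (my own code, with permutations in one-line notation):

```python
n = 4
ident = tuple(range(1, n + 1))

def s(i):
    """The simple transposition s_i: swaps i and i+1, fixes the rest."""
    return tuple(i + 1 if x == i else i if x == i + 1 else x for x in ident)

def compose(u, v):
    """(u v)(j) = u(v(j)), one-line notation."""
    return tuple(u[v[j] - 1] for j in range(n))

def word(*indices):
    w = ident
    for i in indices:
        w = compose(w, s(i))
    return w

# The three families of relations satisfied by the s_i:
assert all(word(i, i) == ident for i in range(1, n))              # s_i^2 = 1
assert all(word(i, j, i, j) == ident                              # commuting
           for i in range(1, n) for j in range(1, n) if abs(i - j) >= 2)
assert all(word(i, i + 1, i) == word(i + 1, i, i + 1)             # braid
           for i in range(1, n - 1))

# And the s_i generate all of S_n: n! = 24 permutations for n = 4.
seen, frontier = {ident}, [ident]
while frontier:
    w = frontier.pop()
    for i in range(1, n):
        u = compose(w, s(i))
        if u not in seen:
            seen.add(u)
            frontier.append(u)
assert len(seen) == 24
```

Of course this only checks that $S_n$ is a quotient of the abstract Coxeter group; that the presentation is exact is the nontrivial part of the theorem.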
When $q$ is not 1, life is trickier. However, we have the following amazing result: Suppose that the words $s_{i_1} s_{i_2} \cdots s_{i_k}$ and $s_{j_1} s_{j_2} \cdots s_{j_k}$ are both reduced. This means that $s_{i_1} \cdots s_{i_k} = s_{j_1} \cdots s_{j_k} = w$ and there is no shorter way to write $w$ as a product of $s_i$’s. Then the relation $T_{i_1} T_{i_2} \cdots T_{i_k} = T_{j_1} T_{j_2} \cdots T_{j_k}$ is still true, for arbitrary $q$. This gives us a way, for any element $w$ of $S_n$, to define a corresponding element $T_w$ in the Hecke algebra: simply take the product $T_{i_1} T_{i_2} \cdots T_{i_k}$, where $s_{i_1} s_{i_2} \cdots s_{i_k}$ is a reduced word for $w$. Furthermore, it turns out that the $T_w$ are a basis for the Hecke algebra.
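The multiplication rule hiding behind all this is the standard one: $T_w T_i = T_{w s_i}$ if $\ell(w s_i) > \ell(w)$, and $T_w T_i = (q-1) T_w + q\, T_{w s_i}$ otherwise, where $\ell$ is Coxeter length. Here is a sketch (my own code; I fix a numeric $q$ for simplicity, though the identities hold for every $q$) that uses this rule to check that two different reduced words for the longest element of $S_3$ produce the same basis element, while a nonreduced word expands with coefficients:

```python
# Elements of the Hecke algebra in the T_w basis: a dict mapping a
# permutation (one-line tuple) to its coefficient.
q = 7
n = 3
ident = tuple(range(1, n + 1))

def right_s(w, i):
    """w s_i: swap positions i, i+1 of the one-line notation."""
    w = list(w)
    w[i - 1], w[i] = w[i], w[i - 1]
    return tuple(w)

def times_generator(elem, i):
    """Right-multiply by T_i:
       T_w T_i = T_{w s_i}                 if len(w s_i) > len(w),
       T_w T_i = (q-1) T_w + q T_{w s_i}   otherwise."""
    out = {}
    for w, c in elem.items():
        ws = right_s(w, i)
        if w[i - 1] < w[i]:          # length goes up
            out[ws] = out.get(ws, 0) + c
        else:                        # length goes down
            out[w] = out.get(w, 0) + c * (q - 1)
            out[ws] = out.get(ws, 0) + c * q
    return {w: c for w, c in out.items() if c}

def T(*word):
    elem = {ident: 1}
    for i in word:
        elem = times_generator(elem, i)
    return elem

# Two reduced words for the longest element of S_3 give the same T_w:
assert T(1, 2, 1) == T(2, 1, 2) == {(3, 2, 1): 1}
# A nonreduced word expands in the T_w basis with coefficients:
assert T(1, 1) == {ident: q, (2, 1, 3): q - 1}    # T_1^2 = q + (q-1) T_1
```

The test in the last line is exactly the quadratic relation $T_i^2 = (q-1)T_i + q$ from the presentation above, rediscovered by the reduction procedure.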
Expressing nonreduced products of the $T_i$’s in the basis of $T_w$’s is an area of active combinatorial research. In the probabilistic model at the start of this section, we can describe the problem this way: Suppose we do a Markov walk on $\mathcal{F}$, performing one $T_i$ after another. Can we give an explicit description of how this walk mixes? (Of course there are many variants of this problem, because I haven’t told you how to select the sequence of $T_i$’s which we use.)
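I don't know the answer, but it is easy to watch the walk mix numerically. The sketch below (mine; I arbitrarily pick the alternating schedule $T_1, T_2, T_1, \ldots$) tracks the total-variation distance to the uniform distribution on flags for $q = 2$, $n = 3$; uniform is stationary because each $T_i$ is a symmetric kernel.

```python
from fractions import Fraction
from itertools import product

q, n = 2, 3
vecs = [v for v in product(range(q), repeat=n) if any(v)]
dot = lambda a, v: sum(x * y for x, y in zip(a, v)) % q
flags = [(v, a) for v in vecs for a in vecs if dot(a, v) == 0]
idx = {f: k for k, f in enumerate(flags)}
N = len(flags)

def step_matrix(i):
    """Transition matrix of T_i on flags (line, plane) in F_2^3."""
    M = [[Fraction(0)] * N for _ in range(N)]
    for (v, a), r in idx.items():
        moves = ([(w, a) for w in vecs if dot(a, w) == 0 and w != v]
                 if i == 1 else
                 [(v, b) for b in vecs if dot(b, v) == 0 and b != a])
        for g in moves:
            M[r][idx[g]] += Fraction(1, q)
    return M

T1, T2 = step_matrix(1), step_matrix(2)

# Start concentrated on one flag, alternate T_1, T_2, ..., and watch the
# total-variation distance to uniform shrink.
dist = [Fraction(0)] * N
dist[0] = Fraction(1)
uniform = Fraction(1, N)
tv = []
for step in range(8):
    M = T1 if step % 2 == 0 else T2
    dist = [sum(dist[r] * M[r][c] for r in range(N)) for c in range(N)]
    tv.append(sum(abs(p - uniform) for p in dist) / 2)

# Each T_i is doubly stochastic, so the distance can never increase.
assert all(x >= y for x, y in zip(tv, tv[1:]))
assert tv[-1] < tv[0]
```

An explicit description of these distances for general $n$ and $q$ is precisely the mixing question above; the exact fractions in `tv` are the data such a description would have to predict.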
This brings us to Jim’s question. In the Markov description above, there is no interpretation of sending $q$ to $1$. Jim asked whether there was an alternate Markov process we could come up with, which modeled the Hecke algebra, but in which $q$ was a continuous parameter. (Or, rather, this was my interpretation of a somewhat less specific question.) Presumably, as $q$ went to $1$, this process would become the standard random walk on the Cayley graph of $S_n$. I can do this in a stupid way, by defining a certain orientation on said Cayley graph and defining a walk which is random except that, when the walk would take an edge which is directed against it, it stays put instead with probability $1 - 1/q$. But what I think would be interesting would be to define a Markov process on flags which, as $q$ goes to $1$, becomes the Cayley walk on the permutations.
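For what it's worth, here is a sketch of the "stupid" construction on $S_3$ (my own reconstruction: I orient each Cayley edge from the longer permutation to the shorter one, and I take the stay-put probability to be $1 - 1/q$, which is an assumption, but one that makes the relations check out). The point is that the resulting matrices really do satisfy the normalized Hecke relations, now with $q$ a genuinely continuous parameter, and as $q \to 1$ the walk degenerates to the ordinary Cayley walk.

```python
from fractions import Fraction
from itertools import permutations

# q can now be any rational >= 1, not just a prime power.
q = Fraction(5, 2)
n = 3
elems = sorted(permutations(range(1, n + 1)))
idx = {w: r for r, w in enumerate(elems)}
N = len(elems)

def length(w):
    """Number of inversions = Coxeter length of the permutation."""
    return sum(w[a] > w[b] for a in range(n) for b in range(a + 1, n))

def walk_matrix(i):
    """Propose w -> w s_i.  Moves with the orientation (length down) are
    always taken; moves against it are taken only with probability 1/q
    (an assumed value), otherwise the walk stays put."""
    M = [[Fraction(0)] * N for _ in range(N)]
    for w, r in idx.items():
        ws = list(w)
        ws[i - 1], ws[i] = ws[i], ws[i - 1]
        ws = tuple(ws)
        if length(ws) < length(w):
            M[r][idx[ws]] = Fraction(1)
        else:
            M[r][idx[ws]] = 1 / q
            M[r][r] = 1 - 1 / q
    return M

def matmul(A, B):
    return [[sum(A[r][k] * B[k][c] for k in range(N)) for c in range(N)]
            for r in range(N)]

M1, M2 = walk_matrix(1), walk_matrix(2)
I = [[Fraction(r == c) for c in range(N)] for r in range(N)]

def quad_holds(M):
    """Check M^2 = (1/q) I + ((q-1)/q) M."""
    rhs = [[I[r][c] / q + (1 - 1 / q) * M[r][c] for c in range(N)]
           for r in range(N)]
    return matmul(M, M) == rhs

braid = matmul(matmul(M1, M2), M1) == matmul(matmul(M2, M1), M2)
quad = quad_holds(M1) and quad_holds(M2)
assert braid and quad
```

At $q = 1$ the stay-put probability vanishes and every proposal is accepted, which is the Cayley walk; but this lives on $S_n$, not on flags, which is exactly why it doesn't answer Jim's question.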