The class was surprisingly packed, with many familiar faces. Persi went right into the course outline. He said he will cover Chapter 7 of Stanley's Enumerative Combinatorics, Volume II (maybe not all of it).

The central object is of course the space of symmetric functions $\Lambda = \mathbb{Q}[x_1, x_2, \ldots]^{S_\infty}$, where the superscript denotes the invariant subspace. He introduced briefly several bases of this vector space: power sums $p_\lambda$, elementary $e_\lambda$, monomial $m_\lambda$, homogeneous $h_\lambda$, and Schur functions $s_\lambda$, all of which are indexed by partitions $\lambda$ of some positive integer $n$. The change-of-basis coefficients between one family and another encode much combinatorial information. For instance, if $s_\lambda = \sum_{\mu \vdash n} z_\mu^{-1} \chi^\lambda(\mu)\, p_\mu$, then $\chi^\lambda(\mu)$ is the character of $S_n$ indexed by $\lambda$ evaluated at the conjugacy class of cycle type $\mu$.
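As a tiny worked instance of these transition coefficients (using the normalization $s_\lambda = \sum_{\mu} z_\mu^{-1} \chi^\lambda(\mu)\, p_\mu$ above), take $n = 2$, where $S_2$ has the trivial character $\chi^{(2)} = (1, 1)$ and the sign character $\chi^{(1,1)} = (1, -1)$ on the classes $\mu = (1,1)$ and $\mu = (2)$, with $z_{(1,1)} = z_{(2)} = 2$:

```latex
s_{(2)}   = \tfrac{1}{2}\, p_{(1,1)} + \tfrac{1}{2}\, p_{(2)}, \qquad
s_{(1,1)} = \tfrac{1}{2}\, p_{(1,1)} - \tfrac{1}{2}\, p_{(2)}.
```

One can check directly that these are the complete homogeneous and elementary symmetric functions $h_2$ and $e_2$, respectively.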

Something I wasn’t aware of is that the Schur polynomials themselves are characters of the unitary group $U(n)$, given as follows:

$$\chi_\lambda(U) = s_\lambda(e^{i\theta_1}, \ldots, e^{i\theta_n}),$$

where the $\theta_j$’s are the eigen-angles of $U \in U(n)$. This allows one to compute integrals of the following type:

$$\int_{U(n)} |\operatorname{Tr}(U)|^{2k} \, dU = k! \qquad (n \ge k),$$

because the trace can be seen as a symmetric polynomial of the eigenvalues, namely the simplest power sum polynomial $p_1$, so $\operatorname{Tr}(U)^k = p_1^k$ can be expressed in terms of Schur polynomials by a change of basis: $p_1^k = \sum_{\lambda \vdash k} f^\lambda s_\lambda$, where $f^\lambda$ is the number of standard Young tableaux of shape $\lambda$. And since the Schur polynomials are orthonormal with respect to Haar measure, one gets that the integral above equals $\sum_{\lambda \vdash k} (f^\lambda)^2 = k!$. This is written up in Persi’s paper “Patterns in Eigenvalues” in the Bulletin of the AMS. Other topics he will cover are
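The identity $\sum_{\lambda \vdash k} (f^\lambda)^2 = k!$ is easy to sanity-check numerically. Here is a small Python sketch (the helper names are mine, not from the lecture) that computes $f^\lambda$ via the hook length formula:

```python
from math import factorial

def partitions(n, max_part=None):
    """Generate the partitions of n as weakly decreasing tuples."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

def num_syt(shape):
    """Number f^lambda of standard Young tableaux, by the hook length formula."""
    n = sum(shape)
    # conj[j] = length of column j, i.e. the conjugate partition
    conj = [sum(1 for part in shape if part > j) for j in range(shape[0])]
    hooks = 1
    for i, row in enumerate(shape):
        for j in range(row):
            hooks *= (row - j) + (conj[j] - i) - 1   # arm + leg + 1
    return factorial(n) // hooks

k = 5
moment = sum(num_syt(lam) ** 2 for lam in partitions(k))
print(moment, factorial(k))  # 120 120
```

For $k = 5$ the sum of squares over the seven partitions of $5$ is indeed $120 = 5!$.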

- Hall inner product, which I am not sure is the same as the defining inner product for Jack polynomials (maybe it is only for Hall–Littlewood polynomials).
- Schur functions and the RSK algorithm, which he says associates a permutation with a pair of Young tableaux of the same shape.
- Classical definition of Schur functions by Jacobi
- Character theory of $S_n$ and the Murnaghan–Nakayama rule
- Random matrix theory (yay) and quasi-symmetric functions, which he says are popular nowadays
- Polya theory, which seems highly combinatorial in nature and deals with counting orbits of finite group actions.
- Combinatorial Hopf algebras
- Macdonald polynomials, something I really need to learn
- Shuffling cards: here he gave a probabilistic interpretation of the RSK algorithm in terms of a card-shuffling model.
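Since RSK comes up twice in the list above, here is a minimal Python sketch of the row-insertion version of the correspondence (a standard textbook description; not necessarily the exact variant from the lecture). It sends a permutation to an insertion tableau $P$ and a recording tableau $Q$ of the same shape:

```python
from bisect import bisect_right

def rsk(perm):
    """RSK row insertion: return the insertion tableau P and recording tableau Q."""
    P, Q = [], []
    for step, x in enumerate(perm, start=1):
        row = 0
        while True:
            if row == len(P):           # start a new row at the bottom
                P.append([x])
                Q.append([step])
                break
            r = P[row]
            j = bisect_right(r, x)      # leftmost entry strictly greater than x
            if j == len(r):             # x fits at the end of this row
                r.append(x)
                Q[row].append(step)
                break
            r[j], x = x, r[j]           # bump that entry down to the next row
            row += 1
    return P, Q

P, Q = rsk([3, 1, 2])
print(P, Q)  # [[1, 2], [3]] [[1, 3], [2]]
```

Note that $P$ and $Q$ always have the same shape, since each insertion adds exactly one box to $P$ and the same box (recorded by step number) to $Q$.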

The next highlight of the lecture is three partial orderings on the set of partitions of integers. First of all, partitions are written with weakly decreasing components $\lambda_1 \ge \lambda_2 \ge \cdots$. Then we have the following three orderings:

- component-wise ordering: $\lambda \le \mu$ iff $\lambda_i \le \mu_i$ for all $i$ (the component-wise AND ordering; a component-wise OR ordering doesn’t make sense)
- dominance ordering, which is that $\lambda \unlhd \mu$ if $\lambda_1 + \cdots + \lambda_i \le \mu_1 + \cdots + \mu_i$ for all $i$.
- A common refinement of both of the above, which is a linear (i.e., total) ordering: in simple terms, it’s just the lexicographic ordering.
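The dominance order is the one that takes a moment to internalize, so here is a minimal Python sketch of the partial-sums comparison (the lexicographic order is just Python's built-in tuple comparison). The example pair $(3,3)$ and $(4,1,1)$ shows why dominance is only a partial order:

```python
def dominates(mu, lam):
    """True if mu dominates lam: every partial sum of mu weakly exceeds that of lam.
    Partitions are weakly decreasing tuples, padded with zeros as needed."""
    s_mu = s_lam = 0
    for i in range(max(len(mu), len(lam))):
        s_mu += mu[i] if i < len(mu) else 0
        s_lam += lam[i] if i < len(lam) else 0
        if s_mu < s_lam:
            return False
    return True

print(dominates((3, 1, 1, 1), (2, 2, 1, 1)))                    # True
print(dominates((3, 3), (4, 1, 1)), dominates((4, 1, 1), (3, 3)))  # False False
```

The pair $(3,3)$ and $(4,1,1)$ is incomparable in dominance, yet lexicographically $(4,1,1) > (3,3)$, illustrating that lex is a strictly finer (total) order.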

Of course one can describe the dominance ordering alternatively in terms of moving dots upward in the Ferrers diagram. It’s a serious problem to show that $\lambda \unlhd \mu$ iff $\mu' \unlhd \lambda'$, where $'$ denotes the conjugate partition. From the fact that the monomial symmetric polynomials $m_\lambda$ span $\Lambda^n$ one instantly gets that its dimension is $p(n)$, which is the number of partitions of $n$. Next recall the elementary symmetric functions are defined by

$$e_k = \sum_{i_1 < i_2 < \cdots < i_k} x_{i_1} x_{i_2} \cdots x_{i_k},$$

and for a general partition $\lambda$, $e_\lambda = e_{\lambda_1} e_{\lambda_2} \cdots$. The final punchline is the following proposition:

Theorem 1. The transition coefficients from $e_\lambda$ to $m_\mu$ give the number of 0-1 matrices with row sums equal to $\lambda$ and column sums equal to $\mu$.

To prove this, first observe that $e_k$ gives the generating function of all configurations of $k$ balls in infinitely many boxes satisfying the Fermi–Dirac allocation constraint: i.e., no two balls in the same box. This is true because each such allocation gets counted exactly once, with the indices $i_1 < i_2 < \cdots < i_k$ indicating the positions of the balls. On the other hand, the exponent of $x_j$ in a monomial counts the number of times variable $j$ is used, so a monomial term in $e_\lambda = e_{\lambda_1} e_{\lambda_2} \cdots$ corresponds to choosing $\lambda_1$ distinct variables in the first row, $\lambda_2$ distinct variables in the second row, etc., and it will be a term contributing to $x^\mu = x_1^{\mu_1} x_2^{\mu_2} \cdots$ if there are $\mu_1$ copies of $x_1$ chosen in total, $\mu_2$ copies of $x_2$, etc. Such a choice is exactly a 0-1 matrix with row sums $\lambda$ and column sums $\mu$. Now since $e_\lambda$ and $m_\mu$ are invariant under permutations of the variables, this gives the desired counting result.
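Theorem 1 can be checked by brute force for small $\lambda$ and $\mu$. This Python sketch (the helper names are my own) expands $e_\lambda$ in finitely many variables as a dictionary of exponent vectors, and counts the 0-1 matrices directly:

```python
from itertools import combinations, product
from collections import Counter

def e_lambda_coeffs(lam, nvars):
    """Expand e_lam = e_{lam_1} e_{lam_2} ... in nvars variables,
    returned as {exponent vector: coefficient}."""
    coeffs = Counter({(0,) * nvars: 1})
    for part in lam:
        new = Counter()
        for expo, c in coeffs.items():
            # e_part is a sum over choices of `part` distinct variables
            for cols in combinations(range(nvars), part):
                e2 = list(expo)
                for j in cols:
                    e2[j] += 1
                new[tuple(e2)] += c
        coeffs = new
    return coeffs

def count_01_matrices(row_sums, col_sums):
    """Brute-force count of 0-1 matrices with the given row and column sums."""
    ncols = len(col_sums)
    row_choices = [list(combinations(range(ncols), r)) for r in row_sums]
    total = 0
    for choice in product(*row_choices):   # choice[i] = columns with a 1 in row i
        cols = [0] * ncols
        for row in choice:
            for j in row:
                cols[j] += 1
        if cols == list(col_sums):
            total += 1
    return total

lam, mu = (2, 1), (1, 1, 1)
coeff = e_lambda_coeffs(lam, len(mu))[tuple(mu)]
print(coeff, count_01_matrices(lam, mu))  # 3 3
```

For $\lambda = (2,1)$ and $\mu = (1,1,1)$ both counts are $3$: the coefficient of $x_1 x_2 x_3$ in $e_2 e_1 = (x_1x_2 + x_1x_3 + x_2x_3)(x_1 + x_2 + x_3)$, and the three $2 \times 3$ matrices with row sums $(2,1)$ and column sums $(1,1,1)$.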