## Algebraic Combinatorics Lecture 2


Note that the coefficient ${M_{\lambda,\mu}}$ is symmetric in ${\lambda}$ and ${\mu}$ by its interpretation in terms of 0/1 matrices. As a recap of the last part of the previous lecture, we have the following two generating function identities:

$\displaystyle 1+ e_1 t + e_2 t^2 + \ldots = \prod_{i=1}^\infty (1 + x_i t),$

$\displaystyle \prod_{i,j} (1 + x_i y_j) = \sum_{\lambda, \mu} M_{\lambda \mu} m_\lambda(x) m_\mu(y) = \sum_\mu m_\mu(x) e_\mu(y). \ \ \ \ \ (1)$

The first one can be seen by expanding the right hand side. The second one uses the interpretation of ${M_{\lambda,\mu}}$ as the number of 0/1 matrices with prescribed row sums ${\mu}$ and column sums ${\lambda}$ (check it on one monomial term from ${m_\lambda(x)}$ and one from ${m_\mu(y)}$, and then symmetrize).
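As a sanity check, ${M_{\lambda,\mu}}$ can be computed by brute force for small partitions; the following minimal sketch (the function name is mine) confirms the symmetry and one small value:

```python
from itertools import product

def count_01_matrices(rowsums, colsums):
    """Brute-force count of 0/1 matrices with prescribed row and column sums."""
    r, c = len(rowsums), len(colsums)
    total = 0
    for entries in product((0, 1), repeat=r * c):
        rows = [entries[i * c:(i + 1) * c] for i in range(r)]
        if all(sum(row) == s for row, s in zip(rows, rowsums)) and \
           all(sum(row[j] for row in rows) == colsums[j] for j in range(c)):
            total += 1
    return total

# M_{(2,1),(1,1,1)} = 3: row one picks 2 of the 3 columns, row two is then forced.
assert count_01_matrices((2, 1), (1, 1, 1)) == 3
# Symmetry M_{lambda,mu} = M_{mu,lambda}, seen by transposing matrices.
assert count_01_matrices((1, 1, 1), (2, 1)) == count_01_matrices((2, 1), (1, 1, 1))
```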

Theorem 1 ${\{e_\lambda\}}$ form a basis for ${\Lambda}$ and hence ${\{e_i\}}$ are algebraically independent over ${\mathbb{Q}}$.

Proof: The second statement follows because every algebraic relation among the ${e_i}$'s can be written as a multivariate polynomial in which each term is a multiple of a basis vector ${e_\lambda}$. The proof of the first statement starts from the fact that the ${m_\lambda}$'s form a linear basis (by construction of ${\Lambda}$). We need the following fact, due to Gale and Ryser: ${M_{\lambda,\mu} \ge 1}$ iff ${\lambda \le \mu'}$, and ${M_{\lambda,\mu} = 1}$ if ${\mu = \lambda'}$. Here the partial order is the dominance order. In fact one could also take the lexicographic order, since it refines dominance; and if ${\lambda}$ and ${\mu}$ are partitions of different integers, ${M_{\lambda,\mu}}$ must be ${0}$ (exercise).

Persi only proved the "only if" direction of the first part and the ${M_{\lambda,\mu}=1}$ case, using a push-to-the-left argument: suppose ${M_{\lambda,\mu} \ge 1}$, and let ${A}$ be a 0/1 matrix with the prescribed row and column sums. Push all the 1's in ${A}$ to the left. This operation weakly increases, in dominance order, the partition determined by the column sums of ${A}$, and strictly increases it if the operation is nontrivial; this last part also gives the ${M_{\lambda,\mu}=1}$ case. Thus the matrix ${(M_{\lambda',\mu})}$ is unipotent upper triangular with respect to any total order ${\le_T}$ on partitions of ${n}$ that refines dominance and such that the transpose also refines transposed dominance, i.e., ${\lambda' \le_T \mu'}$ whenever ${\lambda' \le \mu'}$. Since a unipotent upper triangular matrix transforms one ${\mathbb{Q}}$-basis to another, the ${e_\lambda}$'s also form a basis. $\Box$
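The Gale-Ryser fact can be checked exhaustively for small ${n}$. Here is a brute-force sketch (all function names are mine) verifying both ${M_{\lambda,\mu} \ge 1 \Leftrightarrow \lambda \le \mu'}$ and ${M_{\lambda,\lambda'} = 1}$ over all pairs of partitions of ${4}$:

```python
from itertools import product

def partitions(n, largest=None):
    """All partitions of n as weakly decreasing tuples."""
    largest = n if largest is None else largest
    if n == 0:
        return [()]
    return [(k,) + rest for k in range(min(n, largest), 0, -1)
            for rest in partitions(n - k, k)]

def conjugate(lam):
    return tuple(sum(1 for part in lam if part > i) for i in range(lam[0]))

def dominated(lam, mu):
    """lam <= mu in dominance order (both partitions of the same n)."""
    s = t = 0
    for i in range(max(len(lam), len(mu))):
        s += lam[i] if i < len(lam) else 0
        t += mu[i] if i < len(mu) else 0
        if s > t:
            return False
    return True

def M(lam, mu):
    """Number of 0/1 matrices with row sums lam and column sums mu."""
    r, c = len(lam), len(mu)
    return sum(1 for e in product((0, 1), repeat=r * c)
               if all(sum(e[i * c:(i + 1) * c]) == lam[i] for i in range(r))
               and all(sum(e[i * c + j] for i in range(r)) == mu[j] for j in range(c)))

for lam in partitions(4):
    for mu in partitions(4):
        assert (M(lam, mu) >= 1) == dominated(lam, conjugate(mu))  # Gale-Ryser
        if mu == conjugate(lam):
            assert M(lam, mu) == 1  # unique matrix on the "diagonal"
```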

Next we introduce the homogeneous symmetric functions ${h_\lambda = \prod_i h_{\lambda_i}}$, where ${h_i = \sum_{j_1 \le \ldots \le j_i} x_{j_1} \cdots x_{j_i}}$. Note the only difference with ${e_i}$ is that the inequalities among the ${j_k}$'s are not strict. By geometric series expansion, we get

$\displaystyle \prod_{i=1}^\infty \frac{1}{1-x_i t} = \sum_{i \ge 0} h_i t^i,$
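with the convention ${h_0 = 1}$. To make the contrast with ${e_i}$ concrete, here is a quick numerical check (sample values are mine) of the definition and of the generating function above, in finitely many variables:

```python
from itertools import combinations, combinations_with_replacement
from math import prod, isclose

x = (0.1, 0.2, 0.3)   # sample values; finitely many variables

def e(i):
    """Elementary e_i: sum over strictly increasing index tuples."""
    return sum(prod(c) for c in combinations(x, i))

def h(i):
    """Complete homogeneous h_i: sum over weakly increasing index tuples."""
    return sum(prod(c) for c in combinations_with_replacement(x, i))

t = 0.5
lhs = prod(1 / (1 - xi * t) for xi in x)
rhs = sum(h(i) * t**i for i in range(60))   # truncated power series
assert isclose(lhs, rhs)
assert e(1) == h(1) and e(4) == 0 and h(4) > 0  # e_i dies past the number of variables
```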

Define ${N_{\lambda, \mu}}$ by the expansion ${h_\lambda = \sum_\mu N_{\lambda,\mu} m_\mu}$; it has the interpretation as the number of matrices with entries in ${\mathbb{N}}$ (nonnegative integers) with row sums ${\lambda}$ and column sums ${\mu}$. This makes sense since the inequalities in the sum above are not strict (which gives the Bose-Einstein allocation rather than Fermi-Dirac). Unlike ${M_{\lambda,\mu}}$, the matrix ${N_{\lambda,\mu}}$ cannot be put into upper triangular form. We have another handy generating function relation:

$\displaystyle \prod_{i,j} \frac{1}{1-x_i y_j} = \sum_{\lambda,\mu} N_{\lambda,\mu} m_\lambda(x) m_\mu(y) = \sum_\lambda h_\lambda(x) m_\lambda(y),$

which makes sense since the monomials in ${h_\lambda}$ can have repeated factors, and these are supplied by the geometric series on the left hand side. Next consider the algebra homomorphism ${\omega}$ defined by ${\omega(h_i) = e_i}$, so ${\omega(h_\lambda) = e_\lambda}$; we claim it is an involution, hence in particular an isomorphism. I wonder if this has to do with the canonical isomorphism between the fermion and boson representations of the Heisenberg algebra (see Wang Dong's paper on the KP tau function of the partition function of GUE with external source). Proof: Consider the generating functions of the ${e_i}$ and ${h_i}$, given by ${E(t) = \prod_i (1+ x_i t)}$ and ${H(t) = \prod_i (1- x_i t)^{-1}}$. The product ${E(-t)H(t)}$ is ${1}$, which means the coefficients of ${t^n}$ for ${n \ge 1}$ all vanish, i.e.,

$\displaystyle \sum_{i=0}^{n} (-1)^i e_i h_{n-i} = 0$

for all ${n \ge 1}$. This system of equations determines the ${h_i}$'s from the ${e_i}$'s inductively, Gram-Schmidt style. Applying ${\omega}$ to both sides of all these equations, we easily get ${\omega(e_\lambda) = h_\lambda}$. I would like to know a more intrinsic definition of ${\omega}$. $\Box$
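The vanishing relations, and the recursion they induce, can be checked exactly with integer sample values (a minimal sketch; the sample values are mine, and ${e_i = 0}$ once ${i}$ exceeds the number of variables):

```python
from itertools import combinations, combinations_with_replacement
from math import prod

x = (2, 3, 5, 7)   # integer sample values keep everything exact

def e(i):
    """Elementary e_i evaluated at the sample values."""
    return sum(prod(c) for c in combinations(x, i))

def h(i):
    """Complete homogeneous h_i evaluated at the sample values."""
    return sum(prod(c) for c in combinations_with_replacement(x, i))

# E(-t)H(t) = 1, so the coefficient of t^n vanishes for every n >= 1.
for n in range(1, 8):
    assert sum((-1)**i * e(i) * h(n - i) for i in range(n + 1)) == 0

# Equivalently, h_n is determined recursively from e_1, ..., e_n:
for n in range(1, 8):
    assert h(n) == sum((-1)**(i - 1) * e(i) * h(n - i) for i in range(1, n + 1))
```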

The last basis we study consists of the power sum symmetric functions, defined in the first lecture. Their generating function identity is

$\displaystyle \prod_{ij} \frac{1}{1-x_i y_j} = \sum_\lambda \frac{1}{z_\lambda} p_\lambda(x) p_\lambda(y) = e^{\sum_n \frac{1}{n} p_n(x) p_n(y)}$

where ${z_\lambda = \prod_i i^{m_i} m_i!}$ (with ${m_i}$ the number of parts of ${\lambda}$ equal to ${i}$) is the order of the centralizer of a permutation of cycle type ${\lambda}$; the corresponding conjugacy class in ${S_n}$ has size ${n!/z_\lambda}$. Thus we get a better looking formula by taking the log of the first and last expressions. A similar formula is

$\displaystyle \prod_{ij} (1 + x_i y_j) = \sum_\lambda \frac{\epsilon_\lambda}{z_\lambda} p_\lambda(x) p_\lambda(y) = e^{\sum_n \frac{(-1)^{n-1}}{n} p_n(x) p_n(y)} \ \ \ \ \ (2)$

where ${\epsilon_\lambda = (-1)^{n-l(\lambda)}}$ is the sign of a permutation of cycle type ${\lambda}$. Note one can get the second formula from the first by substituting ${y_j \rightarrow -y_j}$ and taking reciprocals, which is cleanest in the exponential form. We will only prove the first formula. Using Polya's cycle index formula

$\displaystyle \sum_{n \ge 0} t^n \sum_{\lambda \vdash n} \frac{1}{z_\lambda} \prod_i x_i^{m_i(\lambda)} = \exp\Big(\sum_{i \ge 1} \frac{1}{i} x_i t^i\Big), \ \ \ \ \ (3)$

with ${t=1}$ and ${x_i = p_i(x) p_i(y)}$ (where ${m_i(\lambda)}$ denotes the number of parts of ${\lambda}$ equal to ${i}$, so that ${\prod_i x_i^{m_i(\lambda)} = p_\lambda(x) p_\lambda(y)}$), we immediately obtain the second equality. The first equality is another expansion and counting result. Notice the ${x_i}$ and ${y_i}$ powers must match in each monomial term because the power sums in each term carry the same partition index ${\lambda}$. It would be nice to have a direct proof of the equality that skips the expression ${\sum_\lambda \frac{1}{z_\lambda} p_\lambda(x) p_\lambda(y)}$. One last observation:

$\displaystyle \omega(p_\lambda) = \epsilon_\lambda p_\lambda$

This is obtained by applying ${\omega}$ (in the ${y}$ variables) to ${\prod_{i,j}\frac{1}{1-x_i y_j} = \sum_\nu m_\nu(x) h_\nu(y)}$ proved before, and using the two formulae proved immediately above (exercise).
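Both Cauchy-type identities above, and the meaning of ${z_\lambda}$, lend themselves to a numerical sanity check (sample values are mine; the series are truncated, so agreement is up to floating point):

```python
from itertools import groupby
from math import prod, factorial, exp, isclose

def partitions(n, largest=None):
    """All partitions of n as weakly decreasing tuples."""
    largest = n if largest is None else largest
    return [()] if n == 0 else [(k,) + rest for k in range(min(n, largest), 0, -1)
                                for rest in partitions(n - k, k)]

def z(lam):
    """z_lambda = prod_i i^{m_i} m_i!, the centralizer order for cycle type lam."""
    return prod(i**m * factorial(m) for i, m in
                ((i, len(list(g))) for i, g in groupby(lam)))

# Conjugacy class sizes n!/z_lambda add up to n!.
assert sum(factorial(5) // z(lam) for lam in partitions(5)) == factorial(5)

x, y = (0.1, 0.2), (0.3, 0.15)
def p(n, v):
    """Power sum p_n evaluated at the sample values v."""
    return sum(t**n for t in v)

lhs = prod(1 / (1 - xi * yj) for xi in x for yj in y)
mid = sum(prod(p(k, x) * p(k, y) for k in lam) / z(lam)
          for n in range(25) for lam in partitions(n))
rhs = exp(sum(p(n, x) * p(n, y) / n for n in range(1, 200)))
assert isclose(lhs, mid) and isclose(lhs, rhs)

# The signed version, matching the exponential form of formula (2):
lhs2 = prod(1 + xi * yj for xi in x for yj in y)
rhs2 = exp(sum((-1)**(n - 1) * p(n, x) * p(n, y) / n for n in range(1, 200)))
assert isclose(lhs2, rhs2)
```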