Polya theory continued

The setting is as in the previous lecture. Much of the exposition is adapted from the appendix to Stanley's Volume 2, written by Sergey Fomin. Recall the weight {w(f) = y_1^{f^{-1}(r_1)} y_2^{f^{-1}(r_2)} \cdots y_r^{f^{-1}(r_r)}}, where {R = \{r_1, \ldots, r_r\}} and {f^{-1}(r)} denotes the size of the inverse image {|f^{-1}(r)|} by an abuse of notation. Last time we saw that

Theorem 1 (Polya) {\sum_{\text{pattern } f} w(f) = Z_G(p_1,p_2,\ldots, p_d)}, where {Z_G(x_1,\ldots,x_d) = \frac{1}{|G|} \sum_{g \in G} \prod_i x_i^{a_i(g)}}, {a_i(g)} is the number of {i}-cycles of {g}, and {p_i(y_1,\ldots,y_r) = \sum_{j=1}^r y_j^i}.

Note that both sides above are in {\Lambda}. This leads to connections in character theory.
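Theorem 1 can be sanity-checked numerically in the coarsest specialization {y_j \equiv 1}, where the left-hand side is simply the number of patterns (orbits of colorings). The Python sketch below is illustrative and not from the lecture; the choice of the rotation group {C_4} acting on 4 beads with 3 colors is arbitrary:

```python
from itertools import product
from fractions import Fraction

def num_cycles(perm):
    """Number of cycles of a permutation given as a tuple: i -> perm[i]."""
    seen, count = set(), 0
    for i in range(len(perm)):
        if i not in seen:
            j = i
            while j not in seen:
                seen.add(j)
                j = perm[j]
            count += 1
    return count

def Z_at_r(group, r):
    """Cycle index Z_G with every x_i = r, i.e. all color weights y_j = 1."""
    return sum(Fraction(r) ** num_cycles(g) for g in group) / len(group)

def pattern_count(group, r, d):
    """Brute-force orbit count of colorings {0..d-1} -> {0..r-1} under G."""
    reps = set()
    for f in product(range(r), repeat=d):
        reps.add(min(tuple(f[g[i]] for i in range(d)) for g in group))
    return len(reps)

d, r = 4, 3
C4 = [tuple((i + s) % d for i in range(d)) for s in range(d)]  # rotations
assert Z_at_r(C4, r) == pattern_count(C4, r, d) == 24  # 24 necklaces
```

Replacing {C_4} by the full symmetric group {S_4} (as a list of all 24 permutations) would instead count multisets of beads, {\binom{r+d-1}{d} = 15}.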
Let {G \subseteq S_d}. Then {S_d} acts on the coset space {S_d / G}. This gives a permutation representation of {S_d} of dimension {|S_d| / |G|}. Let {\chi^G} be its character.

Theorem 2

\displaystyle  Z_G(p_1,\ldots,p_d) = \text{char}(\chi^G), \ \ \ \ \ (1)

where recall that if {f} is a class function on {S_d}, {\text{char}(f) = \frac{1}{d!} \sum_{\omega \in S_d} f(\omega) p_{\lambda(\omega)}}, where {\lambda(\omega)} is the cycle type of {\omega}, a partition of {d}.

Corollary 3 If {\chi^G = \sum_{\lambda \vdash d} m_\lambda \chi_\lambda}, then {Z_G = \sum m_\lambda s_\lambda}. Since {m_\lambda} are all nonnegative, {Z_G} is Schur positive.

Proof: (of theorem) Note that {\chi^G = \text{IND}_G^{S_d}(1)}. In general, if {G \subseteq \bar{G}} and {\chi} is a character of {G}, then

\displaystyle  \text{IND}_G^{\bar{G}}(\chi)(\bar{g}) = \frac{1}{|G|} \sum_{x \in \bar{G}} \chi^0(x \bar{g} x^{-1}), \ \ \ \ \ (2)

where

\displaystyle  \chi^0(\bar{g}) = \left\{ \begin{array}{cc} \chi(\bar{g}) & \text{if } \bar{g} \in G \\ 0 & \text{otherwise}\end{array}\right. \ \ \ \ \ (3)

If {G \cap \text{CL}_{\bar{G}}(\bar{g}) = \cup_{i=1}^k \text{CL}_{G}(g_i)} (a disjoint union of {G}-conjugacy classes), then

\displaystyle  \text{IND}_G^{\bar{G}}(\chi)(\bar{g}) = c_{\bar{G}}(\bar{g}) \sum_{i=1}^k \frac{\chi(g_i)}{c_G(g_i)}, \ \ \ \ \ (4)

where {c_{\bar{G}}(\bar{g}) = |\{x \in \bar{G}: x \bar{g} x^{-1} = \bar{g}\}|} is the size of the centralizer of {\bar{g}} in {\bar{G}}. The proof uses {|\text{CL}_G(g_i)| = \frac{|G|}{c_G(g_i)}}. Going back to {G \subseteq S_d}, we have the following claim:

\displaystyle  \chi^G(\omega) = \text{IND}_G^{S_d}(1)(\omega) = \frac{|S_d|}{|S^\lambda|} \cdot \frac{|G \cap S^\lambda|}{|G|}, \ \ \ \ \ (5)

where {\omega \in S_d} has cycle type {\lambda}, and {S^\lambda \subseteq S_d} denotes the conjugacy class of permutations of cycle type {\lambda}. Finally,

\displaystyle  \text{char}(\chi^G) = \frac{1}{|S_d|} \sum_\omega \chi^G(\omega) p_{\lambda(\omega)} \\ =\frac{1}{|S_d|} \sum_\omega \frac{|S_d|}{|S^{\lambda(\omega)}|} \frac{| G \cap S^{\lambda(\omega)}|}{|G|} p_{\lambda(\omega)}\\ = \frac{1}{|G|} \sum_\lambda |G \cap S^\lambda| p_\lambda\\ = \frac{1}{|G|} \sum_{g \in G} p_{\lambda(g)}\\ = Z_G(p_1,\ldots, p_d) \ \ \ \ \ (6)

\Box
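The closed form (5) can be checked against the general induction formula (2) by brute force. In the sketch below (an illustration, not from the lecture), {G} is the cyclic subgroup {C_3 \subseteq S_3} generated by a 3-cycle and {\chi} is the trivial character:

```python
from itertools import permutations

def compose(a, b):
    """(a b)(i) = a[b[i]] for permutations given as tuples."""
    return tuple(a[b[i]] for i in range(len(a)))

def inverse(a):
    inv = [0] * len(a)
    for i, ai in enumerate(a):
        inv[ai] = i
    return tuple(inv)

def cycle_type(p):
    seen, t = set(), []
    for i in range(len(p)):
        if i not in seen:
            j, n = i, 0
            while j not in seen:
                seen.add(j)
                j, n = p[j], n + 1
            t.append(n)
    return tuple(sorted(t, reverse=True))

d = 3
Sd = list(permutations(range(d)))
c = (1, 2, 0)                              # the 3-cycle (0 1 2)
G = [tuple(range(d)), c, compose(c, c)]    # C_3 inside S_3

def induced_trivial(w):
    """Formula (2) with trivial chi: (1/|G|) * #{x in S_d : x w x^-1 in G}."""
    return sum(1 for x in Sd if compose(compose(x, w), inverse(x)) in G) / len(G)

def closed_form(w):
    """Formula (5): (|S_d| / |class of w|) * |G ∩ class of w| / |G|."""
    cls = [u for u in Sd if cycle_type(u) == cycle_type(w)]
    return len(Sd) / len(cls) * sum(1 for g in G if g in cls) / len(G)

assert all(induced_trivial(w) == closed_form(w) for w in Sd)
```

Both functions return {2} on the identity and on the 3-cycles, and {0} on the transpositions, consistent with {\chi^G} having dimension {|S_3|/|C_3| = 2}.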

Interpretations of {\sum_{\text{pattern } f} w(f) = Z_G(p_1,\ldots, p_d) = \text{char}(\chi^G)}:

  • consider {G= S_d}. {f \in \mathcal{F}(D,R)} is a placement of {d} balls into {r} boxes. Patterns consist of configurations of unlabelled balls (Bose-Einstein statistics).
    Check: set all {y_i =1}. On the left we have the number of BE configurations of {d} balls in {r} boxes. On the middle and right-hand sides, we have

    \displaystyle  \frac{1}{d!} \sum_{\omega \in S_d} r^{a_1(\omega) + \ldots + a_d(\omega)} = \frac{1}{d!} \sum_\omega r^{l(\omega)} \\ = \frac{1}{d!} r(r+1)\cdots (r+ d -1) = \binom{r+d-1}{d}, \ \ \ \ \ (7)

    where {l(\omega)} is the number of cycles of {\omega}.

    So we have

    \displaystyle  \frac{1}{\binom{d+r-1}{d}} \sum_{\text{pattern }f} w(f) = \mathbb{E}_d(y_1^{n_1} \ldots y_r^{n_r}). \ \ \ \ \ (8)

    Instead of dividing, let us multiply by {t^d} and sum over {d} from 0 to {\infty}. Then we get

    \displaystyle  \sum_{d=0}^\infty t^d \sum_f w(f) = \sum_{d=0}^\infty t^d Z_{S_d}(p_1,\ldots,p_d) \\ = \exp\Big( \sum_{i=1}^\infty \frac{t^i}{i} p_i\Big) = \prod_{i=1}^r \frac{1}{1-y_i t}  \ \ \ \ \ (9)

    We can use this: set {y_1 = y} and {y_i = 1} for {i \ge 2}, then differentiate in {y} at {y=1} to get

    \displaystyle  \sum_{d=0}^\infty \binom{d+r-1}{d}\, t^d\, \mathbb{E}_d(n_1) = \frac{t}{(1-t)^{r+1}}. \ \ \ \ \ (10)

    Do this repeatedly to get the joint moments of {n_1,\ldots, n_r}. Up to the factor {(1-t)^r}, the right-hand side of (9) is the joint moment generating function {\mathbb{E} \prod_{i=1}^r y_i^{x_i}} of independent geometric random variables with {\mathbb{P}(x_i= j) = (1-t)t^j}, {j=0,1,2,\ldots}. Then the conditional distribution {\mathcal{L}(x_1,\ldots, x_r \mid \sum_{i=1}^r x_i = d)} is Bose-Einstein! This is analogous to the picture of Poissonization of the measure on partitions induced from the uniform measure on {S_d}.
    If in (9), we multiply both sides by {(1-t)^r}, then

    \displaystyle  \text{RHS} = \prod_{i=1}^r \frac{1-t}{1-y_i t} = \prod_{i=1}^r \sum_{j=0}^\infty y_i^j\, \mathbb{P}(x_i=j). \ \ \ \ \ (11)

    On the {\text{LHS}}, we have {\sum_d (1-t)^r t^d \binom{d+r-1}{d} \mathbb{E}_d(y_1^{n_1} \ldots y_r^{n_r})}.

  • Now let {G = \{\text{id}\}}. Then {\sum_f w(f) = (y_1 + \ldots + y_r)^d}. Multiplying by {t^d / d!} and summing over {d} gives the joint moment generating function of the Poissonized multinomial distribution, whose components are iid Poisson (Fisher’s Identity):

    \displaystyle  \sum_d \frac{t^d}{d!} \sum_f w(f) = e^{t(y_1 + \ldots + y_r)}. \ \ \ \ \ (12)

  • Next try {C_d}, the cyclic group acting on {\{1, \ldots, d\}} by rotation. Then

    \displaystyle  Z_{C_d}(x_1, \ldots, x_d) = \frac{1}{d}\sum_{m |d} \phi(m)\, x_m^{d/m}. \ \ \ \ \ (13)

    Specializing to {x_i \equiv x}, {Z_{C_d}(x,\ldots,x) = \frac{1}{d}\sum_{m|d} \phi(m) x^{d/m}} counts necklaces with {d} beads and {x} colors. Replacing {\phi} by the Möbius function {\mu} gives the aperiodic necklace polynomial {M(x,d) = \frac{1}{d}\sum_{m|d} \mu(m) x^{d/m}}, which satisfies Witt’s cyclotomic identity:

    \displaystyle  \prod_{d=1}^\infty (1-t^d)^{-M(x,d)} = \frac{1}{1-xt}. \ \ \ \ \ (14)
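The standard form of Witt's cyclotomic identity uses the aperiodic-necklace polynomial {M(x,d) = \frac{1}{d}\sum_{m|d}\mu(m)x^{d/m}} (the Möbius companion of {Z_{C_d}}) and reads {\prod_{d\ge1}(1-t^d)^{-M(x,d)} = \frac{1}{1-xt}}. The sketch below checks this as an identity of truncated power series; the choices {x = 3} and truncation order {N = 8} are arbitrary:

```python
from math import comb

def mobius(n):
    """Mobius function mu(n) via trial division (n is small here)."""
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:
                return 0          # square factor
            result = -result
        p += 1
    return -result if n > 1 else result

def necklace_poly(x, d):
    """M(x, d) = (1/d) sum_{m | d} mu(m) x^(d/m): aperiodic necklaces."""
    s = sum(mobius(m) * x ** (d // m) for m in range(1, d + 1) if d % m == 0)
    assert s % d == 0
    return s // d

def series_mul(a, b, N):
    """Product of two power series (coefficient lists), truncated at t^N."""
    out = [0] * (N + 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j <= N:
                out[i + j] += ai * bj
    return out

def inv_factor(d, M, N):
    """(1 - t^d)^(-M) = sum_k C(M+k-1, k) t^(d k), truncated at t^N."""
    out = [0] * (N + 1)
    for k in range(N // d + 1):
        out[d * k] = comb(M + k - 1, k)
    return out

x, N = 3, 8
prod = [1] + [0] * N
for d in range(1, N + 1):
    prod = series_mul(prod, inv_factor(d, necklace_poly(x, d), N), N)
assert prod == [x ** n for n in range(N + 1)]   # 1/(1 - x t), truncated
```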

Back to Stanley: (1) Curtis Greene’s theorem on the rows and columns of the RSK shape, (2) Knuth equivalence. Recall RSK: {\omega \rightarrow (P,Q)}, and define the shape of {\omega} to be {\text{Sh}(\omega) := \text{Sh}(P) = \text{Sh}(Q)}. What does {\text{Sh}(\omega)} mean? Or, when do {\omega} and {\omega'} have the same shape? Let {I_k(\omega) = } the maximum number of elements in the union of {k} increasing subsequences of {\omega}.

Theorem 4 (Greene) If {\omega \mapsto (P(\omega),Q(\omega))} has shape {(\lambda_1, \ldots, \lambda_l)}, then {I_k = \lambda_1 + \ldots + \lambda_k} and {D_k = \lambda^t_1 + \ldots + \lambda^t_k}, where {D_k} is the maximum number of elements in the union of {k} decreasing subsequences.
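For short permutations this theorem can be checked by brute force. The sketch below (illustrative, not from the lecture) computes the RSK shape by row insertion and compares its partial sums against {I_k} found by exhaustive search; the search uses Dilworth's theorem, by which a set of entries splits into at most {k} increasing subsequences iff it contains no decreasing subsequence of length {k+1}:

```python
from itertools import combinations
from bisect import bisect_left

def rsk_shape(w):
    """Row-insert the word w; return the shape of the resulting P tableau."""
    rows = []
    for x in w:
        for row in rows:
            i = bisect_left(row, x)
            if i == len(row):      # x is larger than everything in this row
                row.append(x)
                break
            row[i], x = x, row[i]  # bump the smallest entry exceeding x
        else:
            rows.append([x])       # bumped entry starts a new row
    return [len(r) for r in rows]

def longest_decreasing(seq):
    """Length of the longest strictly decreasing subsequence (O(n^2) DP)."""
    dp = [1] * len(seq)
    for i in range(len(seq)):
        for j in range(i):
            if seq[j] > seq[i]:
                dp[i] = max(dp[i], dp[j] + 1)
    return max(dp, default=0)

def I_k(w, k):
    """Max elements covered by a union of k increasing subsequences of w."""
    for m in range(len(w), -1, -1):
        for S in combinations(range(len(w)), m):
            if longest_decreasing([w[i] for i in S]) <= k:
                return m

w = (5, 1, 2, 4, 3)                # the word 51243 from the chart below
shape = rsk_shape(w)               # [3, 1, 1]
assert [I_k(w, k) for k in (1, 2, 3)] == [sum(shape[:k]) for k in (1, 2, 3)]
```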

Knuth equivalence: when do {\omega} and {\omega'} have the same {P} tableau?
Answer: a “Knuth move” takes three consecutive letters with values {a < b < c} in which {a} and {c} are adjacent, and switches {a} and {c}; the admissible moves (and their inverses) are

\displaystyle  \begin{array}{cccc} acb & bac & cab & bca \\ \downarrow & \downarrow & \downarrow & \downarrow \\ cab & bca & acb & bac \end{array} \ \ \ \ \ (15)

More concretely, the following arrow charts are admissible Knuth move paths:

\displaystyle  \begin{array}{ccccccc} && 51243 &\rightarrow &15243& \rightarrow& 12543 \\ && \downarrow & & \downarrow && \\ 54123 &\leftarrow & 51423 &\leftarrow& 15423&& \end{array} \ \ \ \ \ (16)

Theorem 5 {\omega} and {\omega'} have the same {P} tableau if and only if they are Knuth equivalent in the above sense. Furthermore, they have the same {Q} tableau if and only if {\omega^{-1}} is Knuth equivalent to {(\omega')^{-1}}.
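As an illustration of Theorem 5, the six permutations in chart (16) can be checked to share the same {P} tableau; the row-insertion sketch below is mine, not from the lecture:

```python
from bisect import bisect_left

def p_tableau(w):
    """The insertion tableau P of the word w under RSK row insertion."""
    rows = []
    for x in w:
        for row in rows:
            i = bisect_left(row, x)
            if i == len(row):      # x extends this row on the right
                row.append(x)
                break
            row[i], x = x, row[i]  # bump the smallest entry exceeding x
        else:
            rows.append([x])       # bumped entry starts a fresh row
    return rows

# the Knuth-equivalence class from chart (16)
words = ["51243", "15243", "12543", "54123", "51423", "15423"]
tableaux = [p_tableau([int(ch) for ch in w]) for w in words]
assert all(t == tableaux[0] for t in tableaux)
assert tableaux[0] == [[1, 2, 3], [4], [5]]
```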


About aquazorcarson

math PhD at Stanford, studying probability
This entry was posted in algebraic combinatorics, Uncategorized. Bookmark the permalink.
