Algebraic combinatorics lecture 5: RSK algorithm

First we describe the Schensted algorithm, which associates to each permutation {\omega \in S_n} two standard Young tableaux of the same shape (but in general with different fillings); it works by bumping numbers through the rows in a greedy, patience-sorting-like procedure.

  1. In the first step we write {\omega(1)} down.
  2. After the first {k} steps, we have a Young tableau {P} with {k} boxes, filled in with {\omega(1), \ldots, \omega(k)}. Now consider {\omega(k+1)}: if it is bigger than all the numbers in the first row, we simply append {\omega(k+1)} at the end of the first row. Otherwise we replace the smallest number in the first row that is bigger than {\omega(k+1)} with {\omega(k+1)}; we say that {\omega(k+1)} bumps into {P}, denoted {P \leftarrow \omega(k+1)}. This results in a new tableau {P'} of size {k} together with the displaced number, which we denote by {x_1}. Now let {P''} be {P'} with the first row deleted, and repeat the procedure on {x_1} and {P''} as we did with {\omega(k+1)} and {P}. The end product is a Young tableau of some shape {\lambda}, denoted {P(\omega)}.
  3. We now define another Young tableau {Q}, also of shape {\lambda}, by writing the number {k} in box {(i,j)} if {(i,j)} is the box added at the {k}th step of the above bumping process for {P(\omega)}. (A code sketch of the whole procedure follows this list.)
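
Here is a minimal Python sketch of the whole procedure (the names row_insert and rsk are mine, not from the lecture): it builds {P} by repeated row insertion and builds {Q} by recording, at each step, the box that was just created.

from bisect import bisect_right

def row_insert(P, x):
    """Bump x into tableau P (a list of rows); return the (row, column) of the new box."""
    for r, row in enumerate(P):
        c = bisect_right(row, x)   # leftmost entry strictly greater than x
        if c == len(row):          # x is at least everything in this row: append
            row.append(x)
            return r, c
        row[c], x = x, row[c]      # bump: x displaces row[c], which drops a row
    P.append([x])                  # start a new row at the bottom
    return len(P) - 1, 0

def rsk(w):
    """Schensted correspondence: permutation w (one-line notation) -> (P, Q)."""
    P, Q = [], []
    for k, x in enumerate(w, start=1):
        r, c = row_insert(P, x)
        if r == len(Q):
            Q.append([])
        Q[r].append(k)             # the new box of P is at the end of row r, so append k there
    return P, Q

print(rsk([3, 1, 2]))   # ([[1, 2], [3]], [[1, 3], [2]])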

We will show that the map {\omega \mapsto (P(\omega),Q(\omega))} is a bijection onto pairs of standard Young tableaux of the same shape. Now recall that a permutation can be represented as a two-line array:

\displaystyle  \omega = \begin{array}{cccc} 1& 2 & \ldots & n\\ j_1 & j_2 & \ldots & j_n \end{array} \ \ \ \ \ (8)

Knuth extended it to allow for repeated values in both rows above, i.e.,

\displaystyle  \begin{array}{cccc} i_1 & i_2 & \ldots & i_n \\ j_1 & j_2 & \ldots & j_n \end{array} \ \ \ \ \ (9)

where {i_1 \le i_2 \le \ldots \le i_n}, and within each block of equal {i}'s the {j}'s underneath are weakly increasing as well. These two-line arrays always arise in the following way: take a finite {a \times b} matrix with natural number entries and scan its entries row by row, from top to bottom and from left to right within each row. If we see the number {k} in position {(i,j)}, we write down {i} over {j} in the two-line array {k} times. Clearly this gives a two-line array with the requisite monotonicity properties, and it is not hard to see that every such two-line array arises in this fashion from a unique matrix.

We can similarly define tableaux {P} and {Q} of the same shape {\lambda}, now semistandard (SSYT) rather than standard, associated with {\omega} (or equivalently the matrix {A}): insert {j_1, \ldots, j_n} successively into {P} by the same bumping rule, and record in {Q} the value {i_k} (rather than {k}) at the box created at the {k}th step. Notice that {P} takes values from {j_1, \ldots, j_n} and {Q} from {i_1, \ldots, i_n}. Furthermore we have
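
The passage from a matrix to its two-line array is mechanical; here is a small Python sketch of it (the name two_line_array is mine).

def two_line_array(A):
    """Scan an N-matrix row by row; entry A[i][j] = k contributes k copies of i over j."""
    top, bottom = [], []
    for i, row in enumerate(A, start=1):
        for j, entry in enumerate(row, start=1):
            top.extend([i] * entry)
            bottom.extend([j] * entry)
    return top, bottom

print(two_line_array([[1, 0, 2], [0, 2, 0]]))
# ([1, 1, 1, 2, 2], [1, 3, 3, 2, 2]): the i's weakly increase, and the j's
# weakly increase within each block of equal i's.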

Proposition 6

  1. The number of boxes in the first row of {\lambda} equals the length of the longest weakly increasing subsequence of the bottom row of {\omega},
  2. The number of boxes in the first column of {\lambda} equals the length of the longest strictly decreasing subsequence of the bottom row of {\omega}.

As a reference, C. Greene, in "An extension of Schensted's theorem", interprets the lengths of the other rows and columns as well.
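
Part 1 of Proposition 6 is easy to test numerically: the first row of {P} only ever interacts with the values being inserted, so its final length can be computed without building the rest of the tableau and compared against a direct computation of the longest weakly increasing subsequence. Below is a rough Python check (function names are mine).

from bisect import bisect_right
import random

def first_row_length(seq):
    """Simulate bumping into the first row only; its final length is that of row 1 of P."""
    row = []
    for x in seq:
        c = bisect_right(row, x)     # leftmost entry strictly greater than x
        if c == len(row):
            row.append(x)
        else:
            row[c] = x               # the displaced entry would drop to row 2; we ignore it
    return len(row)

def longest_weakly_increasing(seq):
    """Quadratic dynamic program, for comparison."""
    best = [1] * len(seq)
    for i in range(len(seq)):
        for j in range(i):
            if seq[j] <= seq[i]:
                best[i] = max(best[i], best[j] + 1)
    return max(best, default=0)

for _ in range(1000):
    seq = [random.randint(1, 5) for _ in range(12)]
    assert first_row_length(seq) == longest_weakly_increasing(seq)
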
Some open-ended questions:

  1. For {\lambda, \mu \vdash n}, one can always find an {\mathbb N}-valued matrix {A} with {\lambda} and {\mu} as its row and column sums, by working from the upper left corner, taking the maximal allowed value at each entry; this procedure can be shown not to get stuck when {N(\lambda,\mu) \neq 0} (a sketch of this greedy filling appears after this list). But is there any combinatorially canonical choice for {A}?
  2. Given {\omega}, what does the shape {\lambda} tell us about {\omega}? And is there a natural metric on the space of all matrices {A} with prescribed row and column sums?
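
For question 1, here is a sketch of one reading of the greedy filling (it coincides with the classical northwest-corner rule from transportation problems; the name greedy_matrix is mine): it never gets stuck as long as the two sums agree.

def greedy_matrix(row_sums, col_sums):
    """Fill entries from the upper left corner, taking the maximal value still allowed
    by the remaining row and column sums."""
    assert sum(row_sums) == sum(col_sums)
    r, c = list(row_sums), list(col_sums)
    A = [[0] * len(c) for _ in r]
    for i in range(len(r)):
        for j in range(len(c)):
            A[i][j] = min(r[i], c[j])
            r[i] -= A[i][j]
            c[j] -= A[i][j]
    return A

print(greedy_matrix([3, 2, 1], [2, 2, 2]))
# [[2, 1, 0], [0, 1, 1], [0, 0, 1]]: row sums (3, 2, 1), column sums (2, 2, 2)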

Proposition 7 The {P} tableau produced from a two-line array as defined above is semistandard.

Proof: It suffices to show that inserting a number into an SSYT {P} by the bumping rule yields another SSYT. First, the shape stays a Young diagram: by the strict increase down columns, the entry displaced from box {(r,c)} is smaller than the entry in box {(r+1,c)} (when that box exists), so it enters row {r+1} in a column weakly to the left of {c}; in particular a new box is only ever created weakly to the left of the end of the row above it.
Rows remain weakly increasing by inspection: the bumped entry is replaced by a smaller value that is still at least as large as everything to its left. Strict increase down the columns is preserved for the same reason the shape is: the displaced entry lands weakly to the left of the column it came from, and every entry of the row above in that column or further left is strictly smaller than it. \Box
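
The invariant in Proposition 7 is also easy to spot-check in code; the following rough Python sketch (names mine) inserts random values one at a time and verifies the semistandard conditions after every step.

from bisect import bisect_right
import random

def is_ssyt(P):
    """Rows weakly increasing, columns strictly increasing, row lengths weakly decreasing."""
    rows_ok = all(row[i] <= row[i + 1] for row in P for i in range(len(row) - 1))
    shape_ok = all(len(P[r]) >= len(P[r + 1]) for r in range(len(P) - 1))
    cols_ok = all(P[r][c] < P[r + 1][c]
                  for r in range(len(P) - 1) for c in range(len(P[r + 1])))
    return rows_ok and shape_ok and cols_ok

def row_insert(P, x):
    """Bump x into P as in the Schensted algorithm."""
    for row in P:
        c = bisect_right(row, x)
        if c == len(row):
            row.append(x)
            return
        row[c], x = x, row[c]
    P.append([x])

for _ in range(500):
    P = []
    for x in (random.randint(1, 6) for _ in range(15)):
        row_insert(P, x)
        assert is_ssyt(P)            # the SSYT property survives every insertion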

Proposition 8 {Q} is also an SSYT.

Proof: We only need to worry about the monotonicity properties defining an SSYT. Weak increase along the rows is easy to see: within each row the boxes are added from left to right, in the natural way of building a Ferrers diagram, and the recorded values {i_1 \le \ldots \le i_n} are weakly increasing. It remains to show strict monotonicity down the columns. So suppose {i_k = i_{k+1}} are two equal entries in the top row of the two-line array {\omega}; we must show that the corresponding boxes of {Q} lie in different columns. By the definition of a two-line array, {j_k \le j_{k+1}}, and one can show that the insertion path {I(P_k \leftarrow j_k)} of {j_k} (i.e., the sequence of boxes affected in each row, down to the row where a box is finally appended) lies strictly to the left of that of {j_{k+1}} (see Stanley, p. 314). In particular the box created by {j_{k+1}} is strictly to the right of the box created by {j_k}, so {i_k} and {i_{k+1}} are not in the same column.
From {P} and {Q} we can reconstruct {\omega} by undoing the bumping, working backwards from the most recently added box (which the entries of {Q} let us identify); the monotonicity requirements of the two-line array make each reverse step unambiguous. It is not too hard to show that this gives a bijection (details are in Stanley, Enumerative Combinatorics vol. 2); a sketch of the inverse procedure in code is given below. \Box
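
Here is a rough sketch of the inverse in the permutation case (names mine; the two-line-array case works the same way, except that among equal entries of {Q} one removes the rightmost first).

from bisect import bisect_left

def reverse_insert(P, r, c):
    """Undo the row insertion whose new box was (r, c); return the value originally inserted."""
    x = P[r].pop(c)
    if not P[r]:
        P.pop(r)
    for row in reversed(P[:r]):
        c = bisect_left(row, x) - 1   # rightmost entry strictly smaller than x
        row[c], x = x, row[c]         # it gets bumped back up a row
    return x

def inverse_rsk(P, Q):
    """Recover the permutation w with rsk(w) = (P, Q). Consumes P and Q."""
    n = sum(len(row) for row in Q)
    w = [0] * n
    for k in range(n, 0, -1):
        r = next(i for i, row in enumerate(Q) if row and row[-1] == k)
        c = len(Q[r]) - 1             # k occupies the last box of its row of Q
        Q[r].pop()
        if not Q[r]:
            Q.pop(r)
        w[k - 1] = reverse_insert(P, r, c)
    return w

print(inverse_rsk([[1, 2], [3]], [[1, 3], [2]]))   # [3, 1, 2]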

This bijectivity allows us to establish

Theorem 9 Schur functions satisfy

\displaystyle   \prod_{i,j} (1-x_i y_j)^{-1} = \sum_\lambda s_\lambda(x) s_\lambda(y), \ \ \ \ \ (10)

hence form a basis of {\Lambda_n}.

Proof: Expanding each factor as a geometric series, we have

\displaystyle  \prod_{i,j} (1-x_i y_j)^{-1} = \prod_{i,j} \sum_{a_{ij}=0}^\infty (x_i y_j)^{a_{ij}} \ \ \ \ \ (11)

For weak compositions {\alpha} and {\beta}, the coefficient of {x^\alpha y^\beta} in the above is {N(\alpha,\beta)}, the number of {\mathbb N}-matrices {A = (a_{ij})} with row sums {\alpha} and column sums {\beta} (the exponents {a_{ij}} are exactly the entries of such a matrix).
On the other hand, the coefficient of {x^\alpha y^\beta} on the right hand side of (10) counts pairs of SSYT of the same shape {\lambda \vdash |\alpha| = |\beta|} with types {\alpha} and {\beta} respectively. By the Schensted–Knuth bijection this equals the number of two-line arrays whose top row has type {\alpha} and whose bottom row has type {\beta}, which in turn equals the number of matrices with row sums {\alpha} and column sums {\beta}. So the two sides agree coefficient by coefficient.
Thus {\{s_\lambda\}} forms a basis by the duality result established in the last lecture. \Box
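
As an informal numerical sanity check of (10) (not from the lecture), one can set all but two of the {x}'s and {y}'s to zero; then only partitions with at most two rows contribute, and in two variables {s_{(a,b)}(x_1,x_2) = (x_1x_2)^b h_{a-b}(x_1,x_2)}, so both sides can be compared directly in Python.

def schur_2(a, b, x1, x2):
    """s_{(a,b)} in two variables: (x1*x2)**b times the complete homogeneous h_{a-b}."""
    return (x1 * x2) ** b * sum(x1 ** k * x2 ** (a - b - k) for k in range(a - b + 1))

x, y = (0.10, 0.07), (0.05, 0.12)

lhs = 1.0
for xi in x:
    for yj in y:
        lhs /= 1.0 - xi * yj

# truncate the sum over partitions (a, b) at a <= 60; the tail is negligible here
rhs = sum(schur_2(a, b, *x) * schur_2(a, b, *y)
          for a in range(61) for b in range(a + 1))

print(lhs, rhs)   # the two numbers agree to within floating point error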
