
Consider the Grassmannian $Gr(k,N)\simeq U(N)/(U(k)\times U(N-k))$ which parametrizes $k$-dimensional subspaces of $\mathbb{C}^N$. Let us put on it the $U(N)$-invariant probability measure. Let $\Delta_{I}$ be one of the Plücker coordinates. My question is:

Are there explicit formulas for the moments $m_p=\mathbb{E}\ |\Delta_I|^{2p}$, for $p$ a nonnegative integer, of the random variable $|\Delta_I|^2$?

Equivalently, if $U$ is a Haar distributed unitary matrix in $U(N)$, what are the moments of $|\Delta|^2$ where $\Delta$ is your favorite $k\times k$ minor?

For $k=1$, which corresponds to projective space, we have $$ m_p=\frac{p!}{N(N+1)\cdots(N+p-1)}\ . $$
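
As a quick numerical sanity check of this formula (my own sketch, not part of the question; it assumes only `numpy`), one can use the fact that the first column of a Haar-distributed $U\in U(N)$ is a uniform random unit vector in $\mathbb{C}^N$, so that $|\Delta|^2=|z_1|^2$ for such a vector $z$:

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(0)
N, p, n_samples = 5, 3, 200_000

# z = uniform random unit vector in C^N (the first column of a Haar unitary)
z = rng.standard_normal((n_samples, N)) + 1j * rng.standard_normal((n_samples, N))
z /= np.linalg.norm(z, axis=1, keepdims=True)
mc = np.mean(np.abs(z[:, 0]) ** (2 * p))       # Monte Carlo estimate of m_p

exact = factorial(p) / np.prod([N + j for j in range(p)])
print(mc, exact)                               # should agree up to Monte Carlo error
```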


Edit: I did a rather brutal calculation for $k=2$. The result is $$ m_p=\frac{(p+1)}{\binom{N+p-1}{p}\binom{N+p-2}{p}}\ . $$

I think I know how to do the general case now, but I would be surprised if this computation has not been done before, so any reference would be appreciated.


Aug 2023 Update:

Please see my answer below for a generalization of the problem and more context. There are still a few hours left on the second bounty, but I particularly need references on the following points.

  1. Who first proved the formula $M_{\lambda}(N)=\frac{1}{{\rm dim}_{\mathbb{C}}(\mathbb{S}^{\lambda}(\mathbb{C}^N))}$? Are there standard textbooks which explain the proof?
  2. Is there a name for functions like $|\Delta_1|^{2(\lambda_1-\lambda_2)}\cdots|\Delta_{\ell-1}|^{2(\lambda_{\ell-1}-\lambda_{\ell})}|\Delta_{\ell}|^{2\lambda_{\ell}}$ ? References, generalizations to general Lie groups?
  3. Who first proved Theorem 5.3.1 in the book by Faraut, about constructing Haar measures from decompositions $G=PQ$? What are good references for this?
  • 2
    Does it make sense to talk about the 'value' of a projective coordinate? – Sam Hopkins May 20 '20 at 23:48
  • 6
    I am presenting the variety as $U(N)/(U(k)\times U(N-k))$ instead of as a quotient of $GL$ by a parabolic subgroup. So once you put the projective coordinate inside a complex modulus squared, it does have an honest value. – Abdelmalek Abdesselam May 21 '20 at 00:00
  • I assume that the orthogonal and unitary cases are similar. In the orthogonal case it seems to me that we are essentially interested in the angles of two random subspaces; more precisely, in the singular values of $X^T Y$ if $X$ and $Y$ are random $n \times k$ matrices with orthogonal columns. (If we fix $Y_{i,j} = \delta_{i,j}$, then we get back the original setting.) I've found this paper: https://math.mit.edu/~plamen/files/AEK.pdf, and then the book Aspects of Multivariate Statistical Theory, by Muirhead. The Wishart distribution, and Theorems 3.3.1 and 3.3.3 from the book seem relevant. – user42355 Jul 20 '23 at 19:28
  • I added the automorphic form tag, because I am sure these Haar integrals are well known to folks in the area. – Abdelmalek Abdesselam Aug 02 '23 at 16:28

2 Answers

7

Notation: $\mathcal{CN}(0,1)$ denotes the complex standard normal distribution: if $Z \sim \mathcal{CN}(0,1)$, then $\sqrt{2} \operatorname{Re}(Z), \sqrt{2} \operatorname{Im}(Z)$ are independent real standard normal random variables.

Claim: Let $X = (X_{i,j})_{1 \le i,j \le N}$ be a random matrix with independent $X_{i,j} \sim \mathcal{CN}(0,1)$ (this is called the Ginibre ensemble). Then $X$ is almost surely invertible, so its QR decomposition $X = QR$, with $Q \in U(N)$ and $R \in \mathbb{C}^{N \times N}$ upper triangular with positive diagonal entries, is almost surely uniquely defined. Moreover, $Q$ and $R$ are independent, $Q$ is distributed according to the Haar probability measure on $U(N)$, and the $(R_{i,j})_{i \le j}$ are independent, with $R_{i,j} \sim \mathcal{CN}(0,1)$ for $i < j$, and $2 R_{i,i}^2 \sim \chi^2_{2(N-i+1)}$ for $i \in \{1, \dotsc, N\}$.
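
As an illustration of how the Claim is used in practice, here is a minimal sketch (mine, not part of the answer; it assumes only `numpy`) that samples Haar-distributed unitaries by taking the QR decomposition of a Ginibre matrix and absorbing the phases of $\operatorname{diag}(R)$ into $Q$, so that the triangular factor has positive diagonal:

```python
import numpy as np

def haar_unitary(N, rng):
    # Ginibre matrix: i.i.d. CN(0,1) entries, i.e. real and imaginary parts N(0, 1/2)
    X = (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))) / np.sqrt(2)
    Q, R = np.linalg.qr(X)
    # numpy's QR does not force diag(R) > 0; move the phases of diag(R) into Q,
    # which leaves the product QR unchanged
    phases = np.diag(R) / np.abs(np.diag(R))
    return Q * phases                          # multiplies column j of Q by phases[j]

rng = np.random.default_rng(0)
U = haar_unitary(5, rng)
print(np.allclose(U.conj().T @ U, np.eye(5)))  # True: U is unitary
```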

Using this Claim, the calculation of the moments becomes easy. Take $k \le N$, and let $X'$, $Q'$, $R'$ denote the top-left $k \times k$ submatrices of $X$, $Q$, $R$. Since $R$ is upper triangular, we get $X' = Q' R'$. So $\det(X') = \det(Q') \prod_{i=1}^k R_{i,i}$, and here all factors are independent. So $$ E[|\det(X')|^{2p}] = E[|\det(Q')|^{2p}] \prod_{i=1}^k E[R_{i,i}^{2p}]. $$

Recall that if $Z \sim \chi^2_n$, then $E[Z^p] = n (n+2) \dotsm (n+2p-2)$. So $$ E[R_{i,i}^{2p}] = \prod_{j=1}^p (N-i+j) = p! {N-i+p \choose p}. $$

Let $X' = \tilde{Q} \tilde{R}$ be the QR decomposition of $X'$. Using the Claim again (with $N$ replaced by $k$, since $X'$ is itself a $k \times k$ Ginibre matrix), we get $|\det(X')| = \det(\tilde{R}) = \prod_{i=1}^k \tilde{R}_{i,i}$, where the $\tilde{R}_{i,i}$ are independent and $2 \tilde{R}_{i,i}^2 \sim \chi^2_{2(k-i+1)}$. So $E[|\det(X')|^{2p}] = \prod_{i=1}^k E[\tilde{R}_{i,i}^{2p}] = \prod_{i=1}^k p! {k-i+p \choose p}$. Putting the two computations together, $$ E[|\Delta|^{2p}] = E[|\det(Q')|^{2p}] = \prod_{i=1}^k \frac{{k-i+p \choose p}}{{N-i+p \choose p}}. $$
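
A quick Monte Carlo check of this product formula (my own sketch, not part of the answer; it assumes `scipy.stats.unitary_group` for Haar sampling):

```python
import numpy as np
from math import comb
from scipy.stats import unitary_group

def exact_moment(N, k, p):
    # prod_{i=1}^{k} C(k-i+p, p) / C(N-i+p, p)
    out = 1.0
    for i in range(1, k + 1):
        out *= comb(k - i + p, p) / comb(N - i + p, p)
    return out

N, k, p, n_samples = 5, 3, 2, 50_000
U = unitary_group.rvs(N, size=n_samples, random_state=2)
minors = np.linalg.det(U[:, :k, :k])           # top-left k x k minor of each sample
print(np.mean(np.abs(minors) ** (2 * p)), exact_moment(N, k, p))
# the two numbers should agree within Monte Carlo error (here both are close to 1/50)
```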

The only reference I have found for the above Claim is in the notes https://www.dam.brown.edu/people/menon/publications/notes/intro-rmt.pdf by Menon and Trogdon; see the section titled "Other random matrix ensembles" (warning: there it is $\sqrt{2} X$, not $X$, that is called the Ginibre ensemble). The orthogonal analogue of the Claim can be found, e.g., in Theorem 2.3.18 of the book Matrix Variate Distributions by Gupta and Nagar.

user42355
  • Very elegant proof! I was not clear, but my preferred answer would have been a quotable reference, although it is possible (unlikely?) it might not exist. So +1, and bounty bonus, but no accept, for now. I will start a new bounty. – Abdelmalek Abdesselam Jul 26 '23 at 16:12
5

Since the bounty is about to expire, I am posting my findings so far.

Using the method mentioned by user42355, one can prove the following more general result. Let $\lambda=(\lambda_1,\ldots,\lambda_{\ell})$ be an integer partition with at most $\ell$ parts, i.e., we assume the $\lambda_i$'s are integers such that $\lambda_1\ge\cdots\ge\lambda_{\ell}\ge 0$. Define, for $N\ge \ell$,

$$ M_{\lambda}(N):=\mathbb{E}_{U(N)}\left[ |\Delta_1(U)|^{2(\lambda_1-\lambda_2)} |\Delta_2(U)|^{2(\lambda_2-\lambda_3)}\cdots |\Delta_{\ell-1}(U)|^{2(\lambda_{\ell-1}-\lambda_{\ell})} |\Delta_{\ell}(U)|^{2\lambda_{\ell}} \right]\ . $$ Here $\Delta_j(U)$ denotes the $j\times j$ principal minor determinant in the top left corner of the matrix $U$, and the latter is random and Haar distributed in the unitary group $U(N)$.

Let $G_{\lambda}(N)$ denote the expectation of the same integrand but where the matrix $X$ is now sampled according to the Ginibre ensemble, and let $T_{\lambda}(N)$ be the similar expectation for a random upper triangular matrix $T$ with positive diagonal entries, sampled as follows. The entries are independent. For $i<j$, $T_{ij}$ is such that $\sqrt{2}\,{\rm Re}(T_{ij})$ and $\sqrt{2}\,{\rm Im}(T_{ij})$ are independent standard $N(0,1)$ real Gaussian random variables. For $1\le i\le N$, $2T_{ii}^2$ is a chi-squared random variable with $2(N-i+1)$ degrees of freedom.

As in the answer by user42355, the key fact one needs is the Gram (or QR, for numerical analysts) decomposition $X=UT$, as an equality in distribution, with $U$ and $T$ independent and distributed as above. We then immediately have $$ G_{\lambda}(N)=M_{\lambda}(N)\, T_{\lambda}(N)\ . $$ Now, since all the entries of the Ginibre matrix $X$ are independent, and the integrand only involves its top-left $\ell\times\ell$ block, whose entries are i.i.d. $\mathcal{CN}(0,1)$ regardless of $N$, we see that $G_{\lambda}(N)=:G_{\lambda}$ is independent of $N$. As a result, $$ M_{\lambda}(N)=\frac{T_{\lambda}(\ell)}{T_{\lambda}(N)}M_{\lambda}(\ell)\ . $$ Since $|\Delta_{\ell}(U)|=1$ for $U$ unitary of size $\ell$, we get the recursion relation $$ M_{(\lambda_1,\ldots,\lambda_{\ell})}(\ell)= M_{(\lambda_1-\lambda_{\ell},\lambda_2-\lambda_{\ell}, \ldots,\lambda_{\ell-1}-\lambda_{\ell})}(\ell) $$ which strictly reduces the length of the partition, viewed as a sequence of nonnegative integers.

Since $T$ is upper triangular, $\Delta_j(T)=\prod_{i=1}^{j}T_{ii}$, so the integrand defining $T_{\lambda}(N)$ collapses to $\prod_{i=1}^{\ell}T_{ii}^{2\lambda_i}$. Using the formula for the moments of the chi-squared or Gamma distributions, we easily get $$ T_{\lambda}(N)=(N)_{\lambda_1}(N-1)_{\lambda_2}\cdots(N-\ell+1)_{\lambda_{\ell}} $$ where I used the (rising) Pochhammer symbol $(x)_n=x(x+1)\cdots(x+n-1)$.
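
To make the recursion concrete, here is a small exact-arithmetic sketch (mine, not part of the argument; it uses only Python's `fractions` module) which computes $M_{\lambda}(N)$ by iterating the two reduction steps above, with $T_{\lambda}(N)$ given by the Pochhammer product just derived:

```python
from fractions import Fraction

def T(lam, N):
    # T_lambda(N) = (N)_{lam_1} (N-1)_{lam_2} ... (N-l+1)_{lam_l}  (rising factorials)
    out = Fraction(1)
    for i, part in enumerate(lam):
        for j in range(part):
            out *= N - i + j
    return out

def M(lam, N):
    lam = tuple(part for part in lam if part > 0)    # drop zero parts
    if not lam:
        return Fraction(1)
    ell = len(lam)
    if N == ell:
        # on U(ell) we have |Delta_ell(U)| = 1, so subtract lam_ell from every part
        return M(tuple(part - lam[-1] for part in lam[:-1]), ell)
    # G_lambda does not depend on N, hence M(N) T(N) = M(ell) T(ell)
    return T(lam, ell) / T(lam, N) * M(lam, ell)

print(M((1, 1), 3))   # 1/3
print(M((2, 1), 3))   # 1/8
print(M((2, 2), 4))   # 1/20, the k = 2, p = 2, N = 4 case of the question's formula
```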

Putting everything together, and solving the recursion on $\ell$, we arrive at $$ M_{\lambda}(N)=\frac{(\ell)_{\lambda_1}(\ell-1)_{\lambda_2}\cdots(1)_{\lambda_{\ell}}}{(N)_{\lambda_1}(N-1)_{\lambda_2}\cdots(N-\ell+1)_{\lambda_{\ell}}}\times C_{\lambda} $$ where $$ C_{\lambda}=\prod_{j=2}^{\ell}\left( \prod_{i=1}^{j-1} \frac{(j-i)_{\lambda_i-\lambda_j}}{(j-i+1)_{\lambda_i-\lambda_j}} \right) =\prod_{j=2}^{\ell}\left( \prod_{i=1}^{j-1} \frac{(j-i)}{(j-i+\lambda_i-\lambda_j)} \right)\ . $$

In other words, we end up with the beautiful and most certainly known formula $$ M_{\lambda}(N)=\frac{h_{\lambda}}{\prod_{\Box\in\lambda}(N+c(\Box))}\ . $$ Here $\Box$ denotes a box in the Ferrers diagram of $\lambda$, drawn the English (rather than French) way. The content of $\Box$ is $c(\Box)=j-i$ if the box lies in the $i$-th row (from top to bottom) and the $j$-th column (from left to right). Finally, $h_{\lambda}$ is the product of hook lengths for the partition $\lambda$, which is also given by $$ h_{\lambda}=\frac{(\lambda_1+\ell-1)!\ (\lambda_2+\ell-2)!\cdots \lambda_{\ell}!}{\prod_{1\le i<j\le\ell}\ (j-i+\lambda_i-\lambda_j)}\ . $$

It is easy to see that the moment in the question corresponds to the rectangular partition $\lambda=(p,\ldots,p)$, where $p$ is repeated $k$ times. In a previous version of the post for my question, I suspected that the moment should be one over the dimension of a representation of $U(N)$, but then I deleted that because I dismissed it as too good to be true. Yet, it is true! Namely, $$ M_{\lambda}(N)=\frac{1}{{\rm dim}_{\mathbb{C}}(\mathbb{S}^{\lambda}(\mathbb{C}^N))} $$ where $\mathbb{S}^{\lambda}$ is the Schur functor associated to the partition $\lambda$. This follows from the hook content formula for ${\rm dim}_{\mathbb{C}}(\mathbb{S}^{\lambda}(\mathbb{C}^N))$, which is the number of semistandard Young tableaux of shape $\lambda$ filled with numbers from $1$ to $N$.
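
Here is a small numerical check of this identity (again my own sketch, not part of the argument; it assumes `numpy` and `scipy`): the hook content formula for ${\rm dim}_{\mathbb{C}}(\mathbb{S}^{\lambda}(\mathbb{C}^N))$ is compared against a Monte Carlo estimate of $M_{\lambda}(N)$.

```python
import numpy as np
from scipy.stats import unitary_group

def schur_dim(lam, N):
    # hook content formula: dim = prod over boxes of (N + content) / (hook length)
    lam = [part for part in lam if part > 0]
    conj = [sum(1 for part in lam if part > j) for j in range(lam[0])]  # conjugate partition
    dim = 1.0
    for i, row in enumerate(lam):
        for j in range(row):
            hook = (row - j) + (conj[j] - i) - 1
            dim *= (N + (j - i)) / hook
    return dim

N, lam, n_samples = 5, (3, 1, 1), 50_000
ell = len(lam)
U = unitary_group.rvs(N, size=n_samples, random_state=4)
integrand = np.ones(n_samples)
for j in range(1, ell + 1):
    exponent = lam[j - 1] - (lam[j] if j < ell else 0)
    integrand *= np.abs(np.linalg.det(U[:, :j, :j])) ** (2 * exponent)

print(np.mean(integrand), 1.0 / schur_dim(lam, N))  # both should be close to 1/126
```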

As for references: for the key probabilistic Gram (or QR) decomposition, I looked at the notes by Menon and Trogdon mentioned by user42355, as well as other random matrix references from the areas of statistics (e.g., the book by Eaton), probability, and mathematical physics (e.g., related to quantum transport), but I did not like the presentation therein (just my personal taste). This is because one has to go through rather tedious independence or conditional independence statements about partitioned Wishart matrices. The most illuminating presentation I found was the book "Analysis on Lie Groups, An Introduction" by Jacques Faraut. The needed Gram decomposition is done in Exercise 5 of Chapter 5, as a corollary of Theorem 5.3.1, which shows how to build a left Haar measure on a locally compact group $G$ homeomorphically decomposed as a product $G=PQ$ of two closed subgroups, using a left Haar measure on $P$, a right Haar measure on $Q$, and the modular function of $G$.

After finishing the computation of $M_{\lambda}(N)$, and contemplating the beautiful final formula, I realized one can prove this in a completely different way using the Peter–Weyl type orthogonality relations for matrix elements of irreducible unitary representations of compact groups. Indeed, the integrand of $M_{\lambda}$ is essentially of the form $$ |\langle v,\pi_{\lambda}(U)v\rangle|^2 $$ for a suitable highest weight vector $v$, and Schur orthogonality gives $\int_{U(N)}|\langle v,\pi_{\lambda}(U)v\rangle|^2\,\mathrm{d}U=\|v\|^4/\dim\pi_{\lambda}$.

Finally, note that there is a related formula in my previous post:

Integration of a function over 7-sphere

The integrand is essentially the same; the difference is whether one takes the expectation with respect to a Ginibre or a complex Wishart random matrix.