
Let $L=\{x\in \mathfrak{gl}_n(\mathbb{C}):xA+Ax^{T}=0\}$ with $A=\begin{pmatrix} 13 & 7 & 1 & 2 & -1 & 1 & 5\\ 9 & 6 & 3 & 4 & 1 & 1 & 4 \\ 5 & 3 & 2 & 3 & 2 & 10 & 1 \\ 4 & 2 & 1 & 2 & 1 & 4 & 1 \\ 1 & 1 & 0 & 1 & 1 & 5 & 0 \\ 3 & 1 & -10 & -4 & -5 & 1 & 2 \\ 5 & 2 & 1 & 1 & 0 & 0 & 2 \end{pmatrix}$.

  1. Show that $L$ is semisimple.
  2. Find how it decomposes into simple algebras.

It reminds me of the way we define the classical Lie algebras, so maybe $A$ has some property that I fail to see?

Edit: If $L_1=\{x\in\mathfrak{gl}_n(\mathbb{C}): x(A+A^T)+(A+A^T)x^T=0\}$, then $L\subseteq L_1$, where $A+A^T$ is symmetric, and thus we can use the signature argument of @KentaS. But in general this is just a subalgebra, so I am not sure whether I get anything essential from it. Maybe if $A$ were diagonalizable I could find the signature and proceed that way?

Some motivation: notice that the classical Lie algebras are defined in the same way, but with signature matrices in place of $A$, so this question is essentially a special case of "how do you study Lie algebras defined in a manner similar to the classical ones, but for arbitrary matrices?"

idocomb
  • you can freely replace $A$ with $BAB^T$ for any invertible matrix $B$ (look at $x\mapsto BxB^{-1}$). – Kenta S Jul 26 '22 at 18:34
  • @KentaS so I need to diagonalize $A$ then? – idocomb Jul 26 '22 at 18:35
  • It means only the signature of $A$ matters, and then $L$ would be of the form $\mathfrak{so}(p,q)$. – Kenta S Jul 26 '22 at 18:45
  • @KentaS how do you know $A$ has signature $I_{p,q}$ though? – idocomb Jul 26 '22 at 18:56
  • I'm a bit surprised that this $A$ is not symmetric though. What is its signature then, @KentaS? If it were symmetric, we'd use https://math.stackexchange.com/q/4336931/96384. – Torsten Schoeneberg Jul 27 '22 at 03:15
  • @TorstenSchoeneberg exactly, I do not see whether there is any property of $A$ that would help us here. – idocomb Jul 27 '22 at 13:52
  • For general non-symmetric $A$ I don't even see why this space should be a Lie algebra to begin with. Where is this problem from? – Torsten Schoeneberg Jul 27 '22 at 14:21
  • @TorstenSchoeneberg previous exercise from a course I will be TA for – idocomb Jul 27 '22 at 14:25
  • @TorstenSchoeneberg it is a Lie algebra for any $A$; it satisfies $[x,y]A+A[x,y]^T=0$. – idocomb Jul 27 '22 at 16:19
  • Alright, I believe that now. I noticed that the symmetrization of this matrix has integer entries. So maybe that's the way to go. https://en.wikipedia.org/wiki/Symmetrization – Torsten Schoeneberg Jul 27 '22 at 19:17
  • @TorstenSchoeneberg that is a nice observation. $xA+Ax^T=0\implies xA^T+A^Tx^T=0$; adding these we get $x(A+A^T)+(A+A^T)x^T=0$, so $L\subseteq L_1$ where $L_1=\{x\in\mathfrak{gl}_n:x(A+A^T)+(A+A^T)x^T=0\}$. We thus get a symmetric matrix $A+A^T$ and can use the linked question to conclude $L_1$ (and thus $L$) is semisimple. However, for part 2) this gives no structural property I can use. We can use the signature argument to reduce it to $\mathfrak{so}(p,q)$ now, I think, but still we are in a larger algebra. – idocomb Jul 28 '22 at 09:50
  • "$L_1$ (and thus $L$) is semisimple" -- subalgebras of semisimple LAs are not necessarily semisimple. So I'd hope we actually have equality $L=L_1$. If we do have that, on first sight $A+A^T$ looks as if it might actually be positive definite via Sylvester's criterion, although I have not checked the larger principal minors. Unfortunately it is not quite diagonally dominant, otherwise one could save work here. – Torsten Schoeneberg Jul 28 '22 at 18:52
  • @TorstenSchoeneberg you are right; for some reason I assumed $L$ would be an ideal. I am not sure if we can get equality, unfortunately. – idocomb Jul 28 '22 at 23:08
  • I think I was able to prove that $L\cong \left\{\begin{bmatrix}g & * \\ 0 & h \end{bmatrix} \mid g\in \mathfrak{o}_3,\ h\in \mathfrak{sp}_4 \right\}$ as a Lie algebra. I don't think this is a simple Lie algebra (since the terms with $*$ should form an ideal), do you know anything about these types of algebras @TorstenSchoeneberg ? – Levent Aug 17 '22 at 13:26
  • @Levent: I know not more than what you say: If $L$ is indeed isomorphic to that, it's not semisimple for the reason you give. So a proof of that isomorphism, if you would like to share it, would (kind of) answer the question, by refuting what it claims. – Torsten Schoeneberg Aug 17 '22 at 18:57
  • @TorstenSchoeneberg Apparently I made some mistakes in my calculations and the $*$ part should not have been there. The Lie algebra is actually isomorphic to $\mathfrak{so}_3\oplus\mathfrak{sp}_4$, hence it is semisimple. I posted the proof if you are interested. – Levent Aug 19 '22 at 12:31

1 Answer


TL;DR. The Lie algebra in question is isomorphic to the Lie algebra $\mathfrak{so}_3\oplus\mathfrak{sp}_4$, and thus it is semisimple. The property of $A$ that makes this possible is that $A$ is invertible and $rk(B)+rk(C)=rk(A)$, where $B,C$ are the symmetric and skew-symmetric parts of $A$, respectively. For any matrix with these properties, the corresponding Lie algebra will be semisimple.

First, I want to present another perspective on the way the Lie algebra in question is defined.

The group $G=GL_7(\mathbb{C})$ acts on the vector space $V$ of $7\times 7$ matrices via $g\cdot M = gMg^T$. This is not the classical conjugation action that we are familiar with; however, it is still a Lie group representation. This representation induces a Lie algebra action of $\mathfrak{gl}_7(\mathbb{C})$ on $V$ via $M\mapsto xM+Mx^T$; I omit the details. I would also like to point out that this representation is not irreducible; in fact, it splits as $$ V = S^2(n)\oplus \Lambda^2(n) $$ where $S^2(n), \Lambda^2(n)$ are the spaces of symmetric and skew-symmetric $n\times n$ matrices, respectively.
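For completeness, the omitted details amount to differentiating the curve $t\mapsto e^{tx}\cdot M$ at $t=0$: $$ \frac{d}{dt}\Big|_{t=0} e^{tx}\,M\,e^{tx^T} = xM + Mx^T. $$ The splitting is indeed invariant under this action, since $(xM+Mx^T)^T = xM^T+M^Tx^T$, which equals $xM+Mx^T$ when $M$ is symmetric and $-(xM+Mx^T)$ when $M$ is skew-symmetric.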

The Lie algebra $L=\{x\in\mathfrak{gl}_7\mid xA+Ax^T=0\}$ that you have defined is then the Lie algebra of the stabilizer $C_G(A)=\{g\in G\mid gAg^T=A\}$ of $A$. This immediately explains why $L$ is always a Lie algebra. This observation also implies that we can replace $A$ with any $gAg^T$ since the stabilizers of two vectors in the same orbit are isomorphic. This was pointed out in the comments. Moreover, if $A=B+C$ where $B\in S^2(n), C\in \Lambda^2(n)$, then we have $$ C_G(A) = C_G(B)\cap C_G(C). $$
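(For reference, the direct verification mentioned in the comments that $L$ is closed under the bracket: if $xA=-Ax^T$ and $yA=-Ay^T$, then $$ [x,y]A = xyA - yxA = -xAy^T + yAx^T = Ax^Ty^T - Ay^Tx^T = A(yx-xy)^T = -A[x,y]^T, $$ so $[x,y]\in L$.)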

I claim that $A$ is in the same orbit as the matrix $$ M = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 1 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & 1\\ 0 & 0 & 0 & -1 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & -1 & 0 & 0 \end{bmatrix}. $$ The intuition behind this normal form is the following. The identity matrix $I_n$ is the standard normal form of a positive definite symmetric matrix under the transpose action. Similarly, the matrix $$ J_n = \begin{bmatrix} 0 & I_n\\ -I_n & 0 \end{bmatrix} $$ is the standard normal form of an invertible skew-symmetric $2n\times 2n$ matrix under the transpose action. The matrix $M$ above is built from the normal forms of the symmetric and skew-symmetric parts of $A$: a $3\times 3$ identity block and a copy of $J_2$.

Assuming this claim, the rest of the proof goes easily. For a matrix $x\in\mathfrak{gl}_7$, write $$ x = \begin{bmatrix} x_1 & x_2 \\ x_3 & x_4 \end{bmatrix} $$ where $x_1$ is $3\times 3$ and $x_4$ is $4\times 4$. Then we have $$ x M+Mx^T = 0\quad\iff\quad x_1+x_1^T = 0,\; x_4J_2+J_2x_4^T=0,\; x_2 = 0,\; x_3=0. $$ Note that $\{x_1\mid x_1+x_1^T=0\}$ is $\mathfrak{so}_3$ and $\{x_4\mid x_4 J_2+J_2 x_4^T=0\}$ is $\mathfrak{sp}_4$. Thus, the Lie algebra of the stabilizer of $M$ is $\mathfrak{so}_3\oplus\mathfrak{sp}_4$, and since $M$ and $A$ are in the same orbit, the Lie algebra $L$ of the stabilizer of $A$ is isomorphic to $\mathfrak{so}_3\oplus\mathfrak{sp}_4$.
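(For the displayed equivalence: writing out the block products gives $$ xM+Mx^T = \begin{bmatrix} x_1+x_1^T & x_2J_2+x_3^T \\ x_3+J_2x_2^T & x_4J_2+J_2x_4^T \end{bmatrix}, $$ and the two off-diagonal conditions give $x_3=-J_2x_2^T$ and $x_3=(-x_2J_2)^T=J_2x_2^T$, so $J_2x_2^T=0$ and hence $x_2=x_3=0$ since $J_2$ is invertible.)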

It is not true in general that any matrix has such a normal form; however, the matrix in question has some interesting properties. Using a computer algebra system, you may verify the following claims (a short script verifying them is given after the list):

  1. The symmetric and skew-symmetric parts $B,C$ of $A$ satisfy $$ rk(B) = 3,\quad rk(C) = 4, \quad rk(A)=rk(B+C)=7. $$

  2. $B$ is positive semi-definite, i.e., its $3$ nonzero eigenvalues are all strictly positive.
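
Here is a minimal NumPy sketch of such a verification (floating-point rank and eigenvalue checks; one could redo this exactly in a computer algebra system). As an extra sanity check, it also computes $\dim L$ directly from the linear condition $xA+Ax^T=0$; by the argument above it should come out as $\dim\mathfrak{so}_3+\dim\mathfrak{sp}_4=3+10=13$.

```python
import numpy as np

# The 7x7 matrix A from the question.
A = np.array([
    [13, 7,   1,  2, -1,  1, 5],
    [ 9, 6,   3,  4,  1,  1, 4],
    [ 5, 3,   2,  3,  2, 10, 1],
    [ 4, 2,   1,  2,  1,  4, 1],
    [ 1, 1,   0,  1,  1,  5, 0],
    [ 3, 1, -10, -4, -5,  1, 2],
    [ 5, 2,   1,  1,  0,  0, 2],
], dtype=float)

B = (A + A.T) / 2   # symmetric part
C = (A - A.T) / 2   # skew-symmetric part

# Claim 1: rk(B) = 3, rk(C) = 4, rk(A) = 7.
print([int(np.linalg.matrix_rank(X)) for X in (B, C, A)])

# Claim 2: B is positive semi-definite, i.e. its eigenvalues are all >= 0
# (three strictly positive, four equal to zero up to rounding).
print(np.round(np.linalg.eigvalsh(B), 6))

# Extra sanity check: dim L = nullity of the linear map x -> xA + Ax^T,
# computed by applying the map to the standard basis matrices E_ij.
n = A.shape[0]
cols = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0
        cols.append((E @ A + A @ E.T).ravel())
T = np.column_stack(cols)                  # 49 x 49 matrix of the map on vec(x)
print(n * n - np.linalg.matrix_rank(T))    # expected: 13 = dim so_3 + dim sp_4
```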

Note that for any two matrices, $rk(B+C)\leq rk(B)+rk(C)$. Equality holds if and only if the column spaces of $B$ and $C$ intersect trivially, i.e., the image of $B+C$ is the direct sum of the images of $B$ and $C$. If we choose a basis of $\mathbb{C}^7$ consisting of eigenvectors of $B$ and of $C$ (for their nonzero eigenvalues), then we deduce that there exists $g\in GL_7$ such that $$ gBg^{-1} = \begin{bmatrix} B' & 0\\ 0 & 0 \end{bmatrix},\quad gCg^{-1} = \begin{bmatrix} 0 & 0\\ 0 & C' \end{bmatrix} $$ where $B',C'$ are diagonal matrices whose diagonal entries are the nonzero eigenvalues of $B$ and $C$, respectively. For our purposes the exact eigenvalues are not important, but their signs matter. For $B'$, since $B$ is positive semi-definite of rank $3$, the diagonal entries of $B'$ are strictly positive, say $\lambda_1,\lambda_2,\lambda_3$. For $C'$, the nonzero eigenvalues of a real skew-symmetric matrix are purely imaginary and come in conjugate pairs, so we may assume that $C'$ is of the form $$ \begin{bmatrix} \lambda_4 i & 0 & 0 & 0\\ 0 & \lambda_5 i & 0 & 0\\ 0 & 0 & -\lambda_4 i & 0\\ 0 & 0 & 0 & -\lambda_5 i \end{bmatrix}$$ where $\lambda_4,\lambda_5$ are strictly positive real numbers.

Consider the diagonal matrix $h$ with diagonal entries $(1/\sqrt{\lambda_1},1/\sqrt{\lambda_2},1/\sqrt{\lambda_3},1/\sqrt{\lambda_4},1/\sqrt{\lambda_5},1/\sqrt{\lambda_4},1/\sqrt{\lambda_5})$. Then we have $$ h (g A g^{-1}) h^T = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 & 0\\ 0 & 1 & 0 & 0 & 0 & 0 & 0\\ 0 & 0 & 1 & 0 & 0 & 0 & 0\\ 0 & 0 & 0 & i & 0 & 0 & 0\\ 0 & 0 & 0 & 0 & i & 0 & 0\\ 0 & 0 & 0 & 0 & 0 & -i & 0\\ 0 & 0 & 0 & 0 & 0 & 0 & -i \end{bmatrix}. $$ The lower $4\times 4$ block of this matrix is similar to $$ \begin{bmatrix} 0 & 0 & 1 & 0\\ 0 & 0 & 0 & 1\\ -1 & 0 & 0 & 0\\ 0 & -1 & 0 & 0 \end{bmatrix} $$ since both are diagonalizable with eigenvalues $\pm i$, each of multiplicity $2$. Hence, there exists yet another $k\in GL_7$ such that $$ k( h (g A g^{-1}) h^T )k^{-1} = M $$ where $M$ is the normal form above.

We are almost done, but not quite. Note that I claimed $A$ and $M$ are in the same orbit with respect to the transpose action, i.e., there exists $g$ such that $gAg^T = M$. However, the transformations I applied also include conjugations. To overcome this, we need the following theorem from Gantmacher's Theory of Matrices, Volume 2, Page 41, Theorem 6.

Theorem: Let $A,M$ be two $n\times n$ matrices, and let $A=B+C$ and $M=N+P$ be their decompositions into symmetric and skew-symmetric parts. Then there exists $g\in GL_n$ with $gAg^T=M$ if and only if there exist $h,k\in GL_n$ with $hBk = N$ and $hCk = P$.

Note that each transformation I applied is of the form $X\mapsto hXk$ for invertible $h,k$, and their composition sends the symmetric part $B$ of $A$ to the symmetric part of $M$ and the skew-symmetric part $C$ of $A$ to the skew-symmetric part of $M$. Hence the second condition of the theorem is satisfied, and therefore there exists $g\in GL_7$ with $gAg^{T}=M$.

Levent
  • This is great. But since we are over $\mathbb C$, not $\mathbb R$, isn't positivity of eigenvalues kind of redundant, and all we need would be the rank of $B$ and $C$? I mean your extra work there also solved the question over the reals (and gives an obvious idea for other fields), which makes this even better. – Torsten Schoeneberg Aug 19 '22 at 14:20
  • @TorstenSchoeneberg Thanks for the comment. You are certainly right, over $\mathbb{C}$ only the rank matters. – Levent Aug 19 '22 at 14:27
  • Now this makes me wonder if there's an easier criterion than a computer check of whether the rank of a matrix equals the sum of the ranks of its symmetric and skew-symmetric parts. (It's trivially the case if the matrix is symmetric or skew-symmetric itself; but more examples come up as soon as $n \ge 3$.) – Torsten Schoeneberg Aug 19 '22 at 16:51
  • @TorstenSchoeneberg Using the same logic I think one should be able to deduce that if the condition is satisfied then 1) the matrix is diagonalisable 2) the eigenvalues are all either real or purely imaginary and 3) the number of purely imaginary eigenvalues is even. The converse should also be true. – Levent Aug 21 '22 at 23:41
  • Do you happen to have a reference about why this representation splits as a sum of symmetric and skew symmetric parts? – idocomb Aug 22 '22 at 22:38
  • @Levent: I think I can believe that, but only if the ground field is $\mathbb R$ again. Somewhat annoyingly, e.g. over $\mathbb C$ there are symmetric matrices which cannot be diagonalized. – Torsten Schoeneberg Aug 23 '22 at 19:41
  • @TorstenSchoeneberg over $\mathbb{C}$ why do you need just the fact it's a direct sum? You still have to use Ganthmacher's Theorem no? – idocomb Aug 31 '22 at 09:03
  • Yes sure you still need that, but not positivity or imaginarity of any eigenvalues. – Torsten Schoeneberg Aug 31 '22 at 20:47