
Problem 11401 of AMM (the American Mathematical Monthly) states:

Let $A$ be a nonsingular square matrix with integer entries. Suppose that for every positive integer $k$, there is a matrix $X$ with integer entries such that $X^k = A$. Show that $A$ must be the identity matrix.

What if instead we drop the nonsingularity assumption and allow $\det A = 0$? For example, any idempotent matrix $A = A^2$ is a $k$-th power for all $k$ (take $X = A$), but are there any other matrices?
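A quick numerical sketch of why every idempotent matrix qualifies, using an illustrative projection matrix:

```python
import numpy as np

# A sample idempotent matrix (a projection): A @ A == A.
A = np.array([[1, 1],
              [0, 0]])

assert np.array_equal(A @ A, A)

# For any k >= 1, X = A itself satisfies X^k = A, since A^k = A
# follows from A^2 = A by induction. So every idempotent matrix
# is a k-th power for all k.
for k in range(1, 10):
    assert np.array_equal(np.linalg.matrix_power(A, k), A)
```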

Qiaochu Yuan
tomm
    You cannot assume that the determinant of $A$ is zero as $A$ is nonsingular. – KBS Sep 19 '22 at 14:36
    I think OP is asking what happens if we remove this requirement on $A$. – user1090793 Sep 19 '22 at 14:41
  • how do you solve the original problem? – Exodd Sep 19 '22 at 14:42
  • Then you have a contradiction with the "nonsingular" condition. Or did you mean to remove it, but just didn't tell us? – JonathanZ Sep 19 '22 at 14:46
  • You can prove that projections are the only solutions. Look at the generalized eigenspace of the eigenvalue $0$. The existence of vectors in it that are not in the kernel would give you that the generalized eigenspace for $0$ for an $X$ that is solution of $X^k=A$ with $k$ larger than the dimension of $A$ (and $X$) will have dimension larger than that of $X$. Take a basis formed by elements of the kernel and elements of the kernel of $p(A)$, where $p(x)$ is the characteristic polynomial of $A$ removing the factors $x$. – plop Sep 19 '22 at 14:57
  • The restriction of $A$ to the subspace generated by the vectors in that basis that are not in the kernel, is a matrix with the same property as the original problem. – plop Sep 19 '22 at 14:58
  • @user85667 I understand why the generalized eigenspace of $0$ is the kernel of $A$ in this scenario, but don't understand the rest of your argument - could you please write it as an answer? – tomm Sep 19 '22 at 17:28
  • Maybe you can think of the last part this way. The construction of the Frobenius normal form gives a basis in which the matrix is block diagonal with the companion matrices of the invariant factors on the diagonal. Because we know that only kernel vectors generate cyclic invariant subspaces intersecting the kernel non-trivially, we can split the basis into kernel vectors and non-kernel vectors, and these will form a block diagonal of a zero matrix and a non-singular matrix. – plop Sep 19 '22 at 17:59
  • Note that the Frobenius normal form doesn't require field extensions, so we are still in the rationals, and by scaling the vectors of the basis we can clear their denominators, if they have some. Now, restrict to the subspace generated by the non-kernel vectors of the basis that gives the Frobenius normal form. The matrix there is non-singular and satisfies the same conditions as the original problem. – plop Sep 19 '22 at 18:06
    @Rodrigo: it's the American Mathematical Monthly: https://www.mat.uniroma2.it/~tauraso/AMM/amm.html – Qiaochu Yuan Sep 19 '22 at 21:17

1 Answer


The original question was asked previously on math.SE, and there is a beautiful solution by @MooS which generalizes cleanly to this case. Suppose these are $n \times n$ matrices. Working $\bmod p$ for $p$ a prime, the eigenvalues of any matrix in $M_n(\mathbb{F}_p)$ live in a fixed finite field $\mathbb{F}_q$ (e.g. $q = p^{n!}$). By hypothesis, the eigenvalues of $A \bmod p$ must be $k$-th powers in $\mathbb{F}_q$ for all $k$. Taking $k = q - 1$, and using that every nonzero element of $\mathbb{F}_q$ satisfies $x^{q-1} = 1$, the only possible eigenvalues of $A \bmod p$ are $0$ and $1$. Since the minimal polynomial of $A \bmod p$ has degree $\le n$, it follows that

$$A^n (A - 1)^n \equiv 0 \bmod p.$$

From here, plugging in $X$ such that $X^{p^m} = A$ gives

$$X^{p^m n} (X^{p^m} - 1)^{n} \equiv X^{p^m n} (X - 1)^{p^m n} \equiv 0 \bmod p$$

(this is actually the key step of the proof!), so as above the only eigenvalues of $X \bmod p$ are $0, 1$. Also as above, since the minimal polynomial of $X \bmod p$ has degree $\le n$ we have $X^n (X - 1)^n \equiv 0 \bmod p$. This gives

$$X^{p^m \lceil \frac{n}{p^m} \rceil} (X - 1)^{p^m \lceil \frac{n}{p^m} \rceil} \equiv 0 \bmod p$$

which, using $X^{p^m} = A$ and $(X - 1)^{p^m} \equiv X^{p^m} - 1 \equiv A - 1 \bmod p$ again, gives

$$A^{\lceil \frac{n}{p^m} \rceil} (A - 1)^{\lceil \frac{n}{p^m} \rceil} \equiv 0 \bmod p.$$

Taking $p^m \ge n$ gives $A(A - 1) \equiv 0 \bmod p$, and since this holds for all primes $p$ we conclude that $A^2 = A$, so $A$ is idempotent.
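The key congruence $(X - 1)^{p^m} \equiv X^{p^m} - 1 \bmod p$ holds for any integer matrix $X$, since $X$ and $1$ commute and the binomial coefficients $\binom{p^m}{i}$ are divisible by $p$ for $0 < i < p^m$. A numerical sanity check of this step (the values of $p$, $m$, $n$ are illustrative):

```python
import numpy as np

p, m, n = 3, 2, 3           # prime p, exponent p^m, matrix size (illustrative)
q = p ** m
rng = np.random.default_rng(0)
X = rng.integers(0, p, size=(n, n))
I = np.eye(n, dtype=int)

def matpow_mod(M, e, p):
    """Square-and-multiply, reducing mod p at each step to keep entries small."""
    R = np.eye(M.shape[0], dtype=int)
    M = M % p
    while e:
        if e & 1:
            R = (R @ M) % p
        M = (M @ M) % p
        e >>= 1
    return R

lhs = matpow_mod(X - I, q, p)        # (X - 1)^{p^m}  mod p
rhs = (matpow_mod(X, q, p) - I) % p  # X^{p^m} - 1    mod p
assert np.array_equal(lhs, rhs)
```

Reducing mod $p$ after every multiplication keeps the entries bounded, so large exponents like $p^m \ge n$ cause no overflow.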


Edit: Here's an even shorter version of the argument which uses only matrix multiplication. First observe that it suffices to prove the result $\bmod p$ for all primes $p$, since then we can reduce $A \bmod p$ and conclude as above. Since $M_n(\mathbb{F}_p)$ is a finite monoid under multiplication (and conversely every finite monoid embeds into some $M_n(\mathbb{F}_p)$), this is equivalent to proving:

Problem: Show that if an element $a$ in a finite monoid $M$ has a $k^{th}$ root for all $k$, then $a$ is idempotent.

Proof. Recall that every element $m$ in a finite monoid satisfies $m^i = m^j$ for some $0 \le i < j \le |M|$, by pigeonhole applied to the set $\{ 1, m, m^2, \dots, m^{|M|} \}$. Then $m^a = m^b$ whenever $a, b \ge i$ and $a \equiv b \bmod (j - i)$. Since $|M|!$ is a multiple of $j - i$ and $|M|! \ge i$, the element $m^{|M|!}$ is idempotent; hence every element with an $|M|!$-th root is idempotent. $\Box$
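The pigeonhole step can be illustrated computationally: track the powers of an element until one repeats, then take a power of exponent at least $i$ and divisible by the period. A sketch with an arbitrary (illustrative) element of $M_3(\mathbb{F}_5)$:

```python
import numpy as np

p = 5
M = np.array([[2, 1, 0],
              [0, 0, 1],
              [0, 0, 0]]) % p   # an arbitrary element of M_3(F_5), chosen for illustration

# Pigeonhole: the powers I, M, M^2, ... eventually repeat, giving M^i = M^j with i < j.
seen = {}
P = np.eye(3, dtype=int)
k = 0
while True:
    key = P.tobytes()
    if key in seen:
        i, j = seen[key], k
        break
    seen[key] = k
    P = (P @ M) % p
    k += 1

period = j - i
# Any exponent N >= max(i, 1) that is a multiple of the period gives an idempotent power,
# playing the role of |M|! in the proof.
N = period * ((i // period) + 1) if i > 0 else period
E = np.linalg.matrix_power(M, N) % p
assert np.array_equal((E @ E) % p, E)
```

In the proof, $|M|!$ is simply a uniform choice of such an exponent $N$ that works for every element at once.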

Qiaochu Yuan
  • This argument is so slick that it's a little hard to tell what's really going on. Basically once we've shown that the only eigenvalues of $A$ are $0, 1$ the only thing left to do is rule out nontrivial Jordan blocks. The argument for $0$ just says that Jordan blocks with eigenvalue $0$ have bounded nilpotence degree, which we apply to $X$ satisfying $X^k = A$; this bit doesn't require working $\bmod p$. The argument for $1$ says that Jordan blocks with eigenvalue $1$ have bounded order $\bmod p$ which is a power of $p$, which we apply to $X$ satisfying $X^{p^m} = A$. In terms of eigenvalues... – Qiaochu Yuan Sep 19 '22 at 19:41
  • ...this use of $k = p^m$ reflects the fact that the only $p^m$-th root of $1 \bmod p$ is $1$, which allows us to show that $X$ satisfying $X^{p^m} = A$ also only has eigenvalues $0, 1 \bmod p$. Without this trick we'd have to deal with the possibility that $X$ has some other roots of unity as eigenvalues, which complicates the situation, although maybe the argument can still go through in this case. – Qiaochu Yuan Sep 19 '22 at 19:42
  • Amazing solution - thank you. – tomm Sep 19 '22 at 21:37
  • Btw, you could also use the fact that over a field, if $M$ is a nilpotent $n\times n$ matrix (in our case $M=X(X-1)$) then $M^n=0$. – tomm Sep 19 '22 at 21:44
  • @tomm: I've edited in a second version of the argument which no longer uses matrix addition. – Qiaochu Yuan Sep 19 '22 at 22:35