
I’m stuck solving this problem. Given $T^n=0$, I know that the column vectors of $T$ will be linearly dependent, i.e. $T$ is singular. However, it’s unclear to me why subtracting such a singular matrix from the identity should give a matrix with linearly independent columns, i.e. an invertible one. How do I begin to show this?

The problem continues: show that $(I-T)^{-1} = I + T + \cdots + T^{n-1}$, and guess how one would arrive at this formula. Perhaps that leap of intuition will occur once I understand the first half better, but a hint here would be appreciated as well.
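One way to see both halves at once, assuming the intended route is the usual telescoping argument, is to multiply the candidate inverse by $I-T$ and watch the cross terms cancel, exactly as in the scalar geometric series $\frac{1}{1-x} = 1 + x + x^2 + \cdots$:

$$(I-T)\left(I+T+\cdots+T^{n-1}\right) = \left(I+T+\cdots+T^{n-1}\right) - \left(T+T^2+\cdots+T^{n}\right) = I - T^{n} = I.$$

The product in the other order telescopes the same way, so $I-T$ is invertible with exactly the claimed inverse.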

  • It seems so. Should I delete the question then? – cryptograthor Dec 28 '18 at 18:01
  • $T^n = 0$ implies $\sigma(T) = \{0\}$, so $\sigma(I-T) = 1 - \sigma(T) = \{1\}$. This doesn't contain $0$, so $I-T$ is invertible. – mechanodroid Dec 28 '18 at 18:08
  • @mechanodroid I’m unfamiliar with the sigma you’re referring to. Where can I read about it? – cryptograthor Dec 28 '18 at 18:09
  • @ThorKamphefner $\sigma(T)$ is the spectrum of the matrix $T$, i.e. the set of all eigenvalues of $T$. A matrix is invertible if and only if $0$ is not an eigenvalue. – mechanodroid Dec 28 '18 at 18:11
  • @ThorKamphefner: You can find an answer here: https://math.stackexchange.com/questions/119904/units-and-nilpotents – Robert Lewis Dec 29 '18 at 00:22
  • If $I-T$ were singular (not invertible), then $1$ would be an eigenvalue of $T$; but since $1$ is not a root of the polynomial $X^n$ that by assumption annihilates $T$, this cannot be the case. Or, restated without referring to eigenvalues: if $I-T$ were singular, there would be a nonzero vector $v$ in its kernel, which would satisfy $T(v)=v$. But then $v=T(v)=T^2(v)=\cdots=T^n(v)=0$, contradicting that $v$ is nonzero. – Marc van Leeuwen Dec 29 '18 at 13:39
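For a concrete sanity check of both the telescoping identity and the eigenvalue observation in the comments above, here is a minimal NumPy sketch (the $3\times 3$ upper shift matrix is an illustrative choice, not part of the original problem):

```python
import numpy as np

# An example nilpotent matrix: the 3x3 upper shift matrix, which satisfies T^3 = 0.
T = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
n = 3
I = np.eye(n)

# Candidate inverse from the geometric-series formula: I + T + ... + T^{n-1}.
S = sum(np.linalg.matrix_power(T, k) for k in range(n))

# Both products should be (numerically) the identity.
print(np.allclose((I - T) @ S, I))   # True
print(np.allclose(S @ (I - T), I))   # True

# Matching the spectrum argument: every eigenvalue of I - T is 1, so 0 is
# not an eigenvalue and I - T is invertible.
print(np.linalg.eigvals(I - T))      # [1. 1. 1.]
```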

0 Answers