Suppose $G$ is a complex $n\times n$ matrix. Could anyone help me prove the following, where the $\sigma$'s denote the singular values of $G$?

  1. $\det G\ne 0 \Leftrightarrow \sigma_{\min}[G]>0$.

  2. $\sigma_{\max}[G^{-1}]=\frac{1}{\sigma_{\min}[G]}$ if $\sigma_{\min}[G]>0$.

  3. $\sigma_{\min}[I+G]\ge 1-\sigma_{\max}[G]$.

  4. $\sigma_{\max}[G_1G_2]\le \sigma_{\max}[G_1]\sigma_{\max}[G_2]$ for complex matrices $G_1,G_2$.

Myshkin

1 Answer

  1. $\sigma_{\min}[G] \gt 0 \Rightarrow \det G \neq 0$ is proved by contrapositive. Note that all singular values are real and non-negative, so $\sigma_{\min}[G] \neq 0 \Leftrightarrow \sigma_{\min}[G] \gt 0$, and recall that the singular values of $G$ are the non-negative square roots of the eigenvalues of $G^*G$. $$ \begin{align} \det G = 0 &\Rightarrow 0\text{ is an eigenvalue of }G\\ &\Rightarrow \exists v \neq 0 \text{ s.t. } Gv=0\\ &\Rightarrow \exists v \neq 0 \text{ s.t. } G^*Gv = 0 = 0\cdot v\\ &\Rightarrow \sigma_{\min}[G] = 0 \end{align} $$

$\det G \neq 0 \Rightarrow \sigma_{\min}[G] \gt 0$ is also proved by contrapositive.

$$ \begin{align} \sigma_{\min}[G] \not\gt 0 &\Rightarrow \sigma_{\min}[G]=0\\ &\Rightarrow \exists v \neq 0 \text{ s.t. } G^*Gv = 0\\ &\Rightarrow \exists v \neq 0 \text{ s.t. } \|Gv\|^2 = v^*G^*Gv = 0, \text{ i.e. } Gv=0\\ &\Rightarrow G \text{ is not injective and hence singular}\\ &\Rightarrow \det G=0 \end{align} $$
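If it helps to see the equivalence numerically, here is a small NumPy sketch (the random matrix and the $10^{-12}$ tolerance are arbitrary choices, not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)

# A generic complex matrix: det G != 0, and indeed sigma_min[G] > 0.
G = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
s = np.linalg.svd(G, compute_uv=False)                 # singular values, descending
print(abs(np.linalg.det(G)) > 1e-12, s.min() > 1e-12)  # True True

# Force det G = 0 by duplicating a column: sigma_min collapses to ~0.
G[:, 0] = G[:, 1]
s = np.linalg.svd(G, compute_uv=False)
print(abs(np.linalg.det(G)), s.min())                  # both ~0 up to round-off
```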

  2. If $\sigma_{\min}[G] \not\gt 0$, then $G$ is not invertible, as proved in 1. So assume all singular values $\sigma_{1},\dots,\sigma_n$ of $G$ are non-zero. Writing the SVD of $G$ as $G=U\Sigma V^*$, we get $G^{-1}=V\Sigma^{-1}U^*$, which is (up to reordering the diagonal entries) an SVD of $G^{-1}$. So $\Sigma^{-1}=\operatorname{diag}(\frac{1}{\sigma_1},\dots,\frac{1}{\sigma_n})$ contains the singular values of $G^{-1}$ along its diagonal. Hence $\sigma_{\max}[G^{-1}]=\max\{\frac{1}{\sigma_1},\dots,\frac{1}{\sigma_n}\}=\frac{1}{\sigma_{\min}[G]}$.
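The same kind of numerical sanity check for 2, on a random invertible matrix (again just an illustration, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))

s     = np.linalg.svd(G, compute_uv=False)                # descending order
s_inv = np.linalg.svd(np.linalg.inv(G), compute_uv=False)

# sigma_max[G^{-1}] should equal 1 / sigma_min[G]
print(np.isclose(s_inv.max(), 1.0 / s.min()))             # True
```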

The remaining two inequalities require more involved proofs; here are pointers to them:

  3. This follows from a more general perturbation property: $\sigma_i(A+B) \geqslant \sigma_i(A) - \sigma_1(B)$, where the $\sigma_i$'s are in decreasing order. From this it directly follows that $\sigma_{\min}[I+G] \geqslant \sigma_{\min}[I] - \sigma_1[G] = 1 - \sigma_{\max}[G]$, since all singular values of $I$ are $1$.

  4. This also follows from a more general property (sometimes referred to as Horn's lemma), which states that $\prod_{i=1}^{k}\sigma_i[G_1G_2] \leqslant \prod_{i=1}^{k}\sigma_i[G_1]\sigma_i[G_2]$. If both $G_1$ and $G_2$ are square matrices of size $n$, equality holds for $k=n$. In your case, taking $k=1$ proves the required result. (A quick numerical sanity check of inequalities 3 and 4 is sketched after this list.)
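Here is the promised sanity check for 3 and 4; the factor $0.1$ just keeps $\sigma_{\max}[G_1]<1$ so that the bound in 3 is non-trivial, and everything else is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
# Scale G1 so that sigma_max[G1] < 1, making 1 - sigma_max[G1] > 0.
G1 = 0.1 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
G2 = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

smin = lambda A: np.linalg.svd(A, compute_uv=False).min()
smax = lambda A: np.linalg.svd(A, compute_uv=False).max()

# 3: sigma_min[I + G1] >= 1 - sigma_max[G1]
print(smin(np.eye(n) + G1) >= 1 - smax(G1))       # True

# 4: sigma_max[G1 G2] <= sigma_max[G1] * sigma_max[G2]
print(smax(G1 @ G2) <= smax(G1) * smax(G2))       # True
```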

The proofs of both inequalities can be found in: R. A. Horn and C. R. Johnson, *Topics in Matrix Analysis*, Cambridge University Press, 1991.