0

This is the statement

Consider the inner product space $(\mathbb{R}^n, \left \langle \cdot ,\cdot \right \rangle)$ over $\mathbb{R}$. Let $A \in M_{n\times n}(\mathbb{R})$. Let's define $$\begin{matrix} a_A:&\mathbb{R}^n \times \mathbb{R}^n & \rightarrow & \mathbb{R} \\ &(x,y) & \mapsto & x^{T}Ay \end{matrix}$$

The goal is to show that if for all $x \in \mathbb{R}^n \setminus \{0\}$ we have $x^T A x > 0$, then $a_A$ is bilinear and symmetric, and furthermore there is an $\alpha>0$ such that $\forall x\in \mathbb{R}^n$ we have $$\alpha a_{I}(x,x)\equiv \alpha\left \langle x,x \right \rangle \equiv \alpha\left \| x\right \|^2\leq a_A(x,x)$$

Proof: to keep this short, I have already shown that $a_A$ is bilinear and symmetric. For the other part, here is roughly my idea:

let's define $$\alpha := \min\left \{ a_{kk} : k=1,\dots,n\right \};$$ note that

$$\alpha \left \langle x,x \right \rangle \equiv \alpha a_{I}(x,x)=\alpha (x^{T}Ix)=\alpha \sum_{k=1}^{n}x_{k}^{2}= \sum_{k=1}^{n}\alpha x_{k}^{2}\leq \sum_{k=1}^{n}a_{kk} x_{k}^{2}+ \text{(remaining terms)}= \sum_{i=1}^{n}\sum_{j=1}^{n}x_{i}a_{ij}x_{j}=x^T Ax=a_A(x,x)$$

My question is as follows:

I have not yet been able to prove that $\alpha$ is positive.

I have tried, but I don't know how to proceed. I would be very grateful if you could tell me how to continue.

User203940
  • 2,473
F.R.
  • 178
  • 1
    The proof is wrong. Consider $$\begin{pmatrix} 2 &1\\ 1& 2\end{pmatrix}$$ Then $\alpha=2$ and for $x=(1,-1)$ we have $\langle Ax,x\rangle =2$ while $\alpha \langle x,x\rangle =4.$ The correct $\alpha$ should equal the smallest eigenvalue of $A$ if $A$ is symmetric, or the smallest eigenvalue of $(A+A^T)/2$ in general. – Ryszard Szwarc Jun 16 '22 at 22:28
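The counterexample in this comment is easy to verify numerically. A minimal sketch in plain Python (the 2×2 case is hard-coded; the helper name `quad_form` is ours, not from the thread):

```python
import math

# Counterexample from the comment: A = [[2, 1], [1, 2]], x = (1, -1).
A = [[2, 1], [1, 2]]
x = [1, -1]

def quad_form(A, x):
    # Quadratic form x^T A x, written out for a 2x2 matrix.
    return sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))

alpha = min(A[0][0], A[1][1])   # the question's alpha = min diagonal entry = 2
norm_sq = x[0]**2 + x[1]**2     # <x, x> = 2

print(quad_form(A, x))          # 2
print(alpha * norm_sq)          # 4 > 2, so the proposed alpha fails

# Smallest eigenvalue of this symmetric 2x2 matrix: (tr - sqrt(tr^2 - 4 det)) / 2.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
lam_min = (tr - math.sqrt(tr**2 - 4 * det)) / 2
print(lam_min)                  # 1.0, and lam_min * <x,x> = 2 <= x^T A x = 2
```

This confirms both halves of the comment: the diagonal minimum overshoots, while the smallest eigenvalue gives a valid constant for this $x$.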

2 Answers

1

First, you don't appear to assume that $A$ is symmetric, but you need this to prove that $a_A$ is symmetric, so I will assume that you forgot to write it.

The condition $\forall x\in\mathbb{R}^n\setminus\{0\} : x^tAx > 0$ is known as the matrix $A$ being positive definite. Positive definite (real symmetric) matrices have a Cholesky decomposition $A = LL^t$, where $L$ is a (real) lower-triangular matrix with positive diagonal entries. You can then write $$a_A(x,x) = x^tAx = x^t(LL^t)x = \lVert L^tx\rVert_2^2 = a_I(L^tx, L^tx).$$
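The identity $x^tAx = \lVert L^tx\rVert_2^2$ can be sanity-checked numerically. A minimal sketch in plain Python, with a hand-rolled 2×2 Cholesky factorization and $\begin{pmatrix}2&1\\1&2\end{pmatrix}$ as an illustrative positive definite matrix (our choice, not from the answer):

```python
import math

# Illustrative symmetric positive definite matrix.
A = [[2.0, 1.0], [1.0, 2.0]]

# Cholesky factor L (lower triangular, positive diagonal) of a 2x2 matrix:
# A = L L^t with l11 = sqrt(a11), l21 = a21 / l11, l22 = sqrt(a22 - l21^2).
l11 = math.sqrt(A[0][0])
l21 = A[1][0] / l11
l22 = math.sqrt(A[1][1] - l21**2)

def quad_form(x):
    # x^T A x for the 2x2 matrix above.
    return sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))

def Lt_x(x):
    # L^t x for the 2x2 factor above (L^t is upper triangular).
    return [l11 * x[0] + l21 * x[1], l22 * x[1]]

x = [0.3, -1.7]
y = Lt_x(x)
lhs = quad_form(x)          # x^T A x
rhs = y[0]**2 + y[1]**2     # ||L^t x||_2^2
print(lhs, rhs)             # the two agree up to rounding
```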

Let $y := L^tx$, so $x = L^{-t}y$ (note that $L^t$ is invertible, as its diagonal is positive). We then have that $$a_I(x, x) = \lVert L^{-t}(L^tx)\rVert_2^2 = \lVert L^{-t}y\rVert_2^2 \leq \lVert L^{-t}\rVert_{op}^2\,\lVert y\rVert_2^2 = \lVert L^{-t}\rVert_{op}^2\,a_A(x,x).$$

Here, we use that

  1. $L^t$ is invertible (as it has a positive diagonal, and therefore positive eigenvalues)
  2. matrices commute with their inverses,
  3. the operator norm $\lVert \cdot\rVert_{op}$ is sub-multiplicative.

Therefore, we get the desired inequality with the constant $\alpha = \lVert L^{-t}\rVert_{op}^{-2}$.
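The constant $\lVert L^{-t}\rVert_{op}^{-2}$ can also be computed explicitly in the 2×2 case. A sketch in plain Python, again using $\begin{pmatrix}2&1\\1&2\end{pmatrix}$ as an illustrative matrix (for a symmetric positive definite $A$, this constant works out to the smallest eigenvalue of $A$, matching the first comment on the question):

```python
import math

# Illustrative symmetric positive definite matrix (our choice, not from the answer).
A = [[2.0, 1.0], [1.0, 2.0]]

# Cholesky factor of A = L L^t.
l11 = math.sqrt(A[0][0])
l21 = A[1][0] / l11
l22 = math.sqrt(A[1][1] - l21**2)

# M = L^{-t} (upper triangular), inverted by hand.
m11, m12, m22 = 1 / l11, -l21 / (l11 * l22), 1 / l22

# Operator norm of M = largest singular value = sqrt(lam_max(M^t M)).
g11 = m11**2
g12 = m11 * m12
g22 = m12**2 + m22**2
tr, det = g11 + g22, g11 * g22 - g12**2
sigma_max_sq = (tr + math.sqrt(tr**2 - 4 * det)) / 2   # ||L^{-t}||_op^2

alpha = 1 / sigma_max_sq   # the answer's constant ||L^{-t}||_op^{-2}
print(alpha)               # 1.0 here, the smallest eigenvalue of this A
```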

0

Just as a note, you also need that $A^T = A$.

What you've done so far is show that $a_A$ is an inner product. We can define a norm induced by the inner product by $\|x\|_A^2 = a_A(x,x).$ Notice that $\|\cdot\|_A^2 \geq 0$ with equality only at $0$. Let $$\Sigma := \{x \in \mathbb{R}^n \ | \ \|x\|_A = 1\}.$$

Notice that $0 \notin \Sigma$, so we have that

$$\|\cdot\| : \Sigma \rightarrow (0,\infty),$$

that is, it is positive. Next, observe that $\Sigma$ is compact and $\|\cdot\|$ is continuous. A continuous function on a compact set attains its minimum and maximum, and here the minimum is positive, so we have $c, C > 0$ so that $c \leq \|x\| \leq C$ for all $x \in \Sigma$. Finally, let $x \in \mathbb{R}^n$ be non-zero. We have that $x/\|x\|_A \in \Sigma$, so

$$c \leq \left\| \frac{x}{\|x\|_A}\right\| \leq C \implies c \|x\|_A \leq \|x\| \leq C \|x\|_A.$$

This gives your result: taking $\alpha = 1/C^2$, the inequality $\|x\| \leq C\|x\|_A$ becomes $\alpha\|x\|^2 \leq \|x\|_A^2 = a_A(x,x)$.
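The compactness argument can be illustrated numerically: the ratio $\|x\|/\|x\|_A$ is the value of $\|\cdot\|$ at the point $x/\|x\|_A \in \Sigma$, so it should stay pinned between $c$ and $C$. A sketch in plain Python with the illustrative matrix $\begin{pmatrix}2&1\\1&2\end{pmatrix}$ (our choice; its eigenvalues $1$ and $3$ give $c = 1/\sqrt{3}$ and $C = 1$ for this particular $A$):

```python
import math

# Illustrative symmetric positive definite matrix.
A = [[2.0, 1.0], [1.0, 2.0]]
lam_min, lam_max = 1.0, 3.0   # eigenvalues of this particular A

def norm_A(x):
    # ||x||_A = sqrt(x^T A x) for the 2x2 matrix above.
    q = sum(x[i] * A[i][j] * x[j] for i in range(2) for j in range(2))
    return math.sqrt(q)

# For this A the optimal constants are c = 1/sqrt(lam_max), C = 1/sqrt(lam_min).
c, C = 1 / math.sqrt(lam_max), 1 / math.sqrt(lam_min)

# Sample non-zero vectors around the unit circle; the ratio stays in [c, C].
for k in range(100):
    t = 2 * math.pi * k / 100
    x = [math.cos(t), math.sin(t)]
    ratio = math.hypot(*x) / norm_A(x)   # = ||.|| evaluated at x/||x||_A
    assert c - 1e-12 <= ratio <= C + 1e-12

print(c, C)
```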

User203940
  • 2,473
  • This is just the classic proof that all norms on $\mathbb{R}^n$ are equivalent: see https://math.stackexchange.com/questions/57686/understanding-of-the-theorem-that-all-norms-are-equivalent-in-finite-dimensional – User203940 Jun 16 '22 at 22:26
  • You don't need symmetry as the form is defined on $\mathbb{R}^n\times\mathbb{R}^n$. For example $$A=\begin{pmatrix} 2&1\\ 0&2\end{pmatrix}$$ is positive definite. – Ryszard Szwarc Jun 16 '22 at 22:35
  • I meant you need $A^T = A$ to get symmetry (as the other answer points out), which is needed for it to be an inner product. – User203940 Jun 16 '22 at 22:37
  • In the real case you do not need symmetry. The matrix in my first comment is associated with the inner product $2x_1^2+x_1x_2+2x_2^2.$ – Ryszard Szwarc Jun 17 '22 at 02:04