Consider an $n \times n$ matrix $A$ with the property that the row sums all equal the same number $s$. Show that $s$ is an eigenvalue of $A$. [Hint: Find an eigenvector]

My attempt:

By definition, $s$ is an eigenvalue of $A$ if $Ax = sx$ for some nonzero $x$, which implies that $(A - sI)x = 0$ has a nontrivial solution.

$s$ is an eigenvalue for $A$ iff $\det(A - sI) = 0$

When you form $A - sI$, the sum of each row is now $0$. I think that's important, but I don't know what it means. So this is where I'm stuck.

2 Answers


Try the vector of all 1's. That'll do it.


If this is not clear, then think about it this way. Let $$e_1 = \left( \begin{array}{c}1\\0\\ \vdots \\ 0\end{array} \right), e_2 = \left( \begin{array}{c}0\\1\\ \vdots \\ 0\end{array} \right), ..., e_n = \left( \begin{array}{c}0\\0\\ \vdots \\ 1\end{array}\right)$$

The product $Ae_1$ is the first column of $A$, and $Ae_2$ is the second column. Thus $A(e_1 + e_2)$ is the vector that results from adding the first and second columns. The vector of all 1's is given by $e_1 + e_2 + \cdots + e_n$. Applying $A$ to this vector yields $$Ae_1 + Ae_2 + \cdots + Ae_n,$$ which is the vector obtained by summing all of the columns.
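Written out entrywise, the row-sum hypothesis does the rest: $$A \left( \begin{array}{c}1\\1\\ \vdots \\ 1\end{array} \right) = \left( \begin{array}{c}a_{11} + a_{12} + \cdots + a_{1n}\\ \vdots \\ a_{n1} + a_{n2} + \cdots + a_{nn}\end{array} \right) = \left( \begin{array}{c}s\\ \vdots \\ s\end{array} \right)$$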


Since each row of $A$ sums to $s$, this means $A(1, 1, \ldots, 1)^T = (s, s, \ldots, s)^T = s\,(1, 1, \ldots, 1)^T$, and we are done.
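As a quick sanity check (a minimal sketch, not part of the original answer), the following NumPy snippet builds a random matrix rescaled so that every row sums to $s$, then verifies that the all-ones vector is an eigenvector with eigenvalue $s$; the values $n = 5$ and $s = 7$ are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, s = 5, 7.0  # illustrative choices, not from the original problem

# Random positive entries, then rescale each row so every row sums to s.
A = rng.random((n, n))
A *= s / A.sum(axis=1, keepdims=True)

ones = np.ones(n)

# The i-th entry of A @ ones is the i-th row sum, which is s by construction,
# so the all-ones vector is an eigenvector of A with eigenvalue s.
assert np.allclose(A @ ones, s * ones)

# s also appears among the numerically computed eigenvalues of A.
assert np.any(np.isclose(np.linalg.eigvals(A), s))
```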

Joel

What can you deduce about the linear dependence/independence of the columns of $A-sI$?

Erick Wong
  • Well, I can't say they're linearly dependent because I don't know all the entries. I only know that the entries in each row sum to $0$. Hmm, the columns of $A - sI$ are linearly independent iff $(A - sI)x = 0$ has only the trivial solution, which only occurs if $A - sI$ has a pivot in every column... I'm not seeing it; all I know is that adding all the columns of $A - sI$ gives me the $0$ vector. – user3672888 Jul 23 '14 at 21:27
  • @user3672888, this means that the columns add to the zero vector and so they are not linearly independent. Hence $A-sI$ is not invertible, since it has rank less than $n$. This means that $\det(A-sI) = 0$ and $s$ must be an eigenvalue. – Joel Jul 23 '14 at 22:14
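To spell out the dependence relation in these comments: writing $c_1, c_2, \ldots, c_n$ for the columns of $A - sI$, the fact that each row of $A - sI$ sums to $0$ says exactly that $$c_1 + c_2 + \cdots + c_n = (A - sI)\left( \begin{array}{c}1\\ \vdots \\ 1\end{array} \right) = 0,$$ a nontrivial linear combination of the columns equal to the zero vector. So the columns are linearly dependent, $\det(A - sI) = 0$, and $s$ is an eigenvalue of $A$.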