
Is there an analytical way, a good approximation, or any other mathematical method to diagonalise a sparse symmetric matrix with non-zero entries only on some of its diagonals?

For example $$ \begin{bmatrix} B & 0 & 0 & A & 0\\ 0 & B & 0 & 0 & A\\ 0 & 0 & B & 0 & 0\\ A & 0 & 0 & B & 0\\ 0 & A & 0 & 0 & B \\ \end{bmatrix} $$

or similar...

(Is there an index-notation way of writing the above matrix, like $A_{m,n} = \cdots$?)

  • I know there are fast solvers for this especially if the matrix is diagonally dominant. – lightxbulb Jan 18 '19 at 00:02
  • The diagonal is always roughly a factor of $2$ larger than any off diagonal. Have you got any names for these fast solvers? – SuperCiocia Jan 18 '19 at 00:03
  • Is $A$ a submatrix or a number? – Ben Grossmann Jan 18 '19 at 00:12
  • Assuming that $A$ and $B$ are square submatrices of (identical) size $n$, we can write your matrix as $$ M = I_5 \otimes B + \begin{pmatrix} 0&0&0&1&0\\ 0&0&0&0&1\\ 0&0&0&0&0\\ 1&0&0&0&0\\ 0&1&0&0&0 \end{pmatrix} \otimes A $$ where $\otimes$ denotes the Kronecker product. – Ben Grossmann Jan 18 '19 at 00:15
  • If $A$ and $B$ are numbers, it's fairly easy to diagonalize this analytically. – Ben Grossmann Jan 18 '19 at 00:17
  • They're numbers. Does this decomposition make the diagonalisation easier? – SuperCiocia Jan 18 '19 at 00:17
  • "Solving Sparse, Symmetric, Diagonally-Dominant Linear systems". – lightxbulb Jan 18 '19 at 00:17
  • @Omnomnomnom Even if I increase the size of the matrix, and the number of non-zero diagonals with it? – SuperCiocia Jan 18 '19 at 00:18
  • I tried using $Xv = \lambda v$ to find $\lambda$: the third row gives $\lambda = B$. However, for the remaining equations to then hold, either $A$ has to be $0$ or $v_1, v_2, v_4, v_5$ all have to be $0$. So one eigenvector is $(0,0,1,0,0)$ with eigenvalue $B$. – lightxbulb Jan 18 '19 at 00:34
  • If we set $v=(0,\frac{1}{\sqrt{2}}, 0, 0, \pm\frac{1}{\sqrt{2}})$, then from the 2nd and 5th equations we get $\lambda = B \pm A$. And similarly for $v=(\frac{1}{\sqrt{2}}, 0, 0, \pm\frac{1}{\sqrt{2}}, 0)$. – lightxbulb Jan 18 '19 at 00:48
  • I am interested in a scalable procedure though. I can make the matrix bigger, but it will keep the same structure (symmetric, with only some diagonals non-zero and always with the same value). Are all these methods specific to the size of the matrix I gave above? – SuperCiocia Jan 18 '19 at 00:51
  • The last derivations are specific to the matrix above, but it seems to have a nice structure, so if the extension is similar to this one you would again be able to find the eigenvalues and vectors easily. – lightxbulb Jan 18 '19 at 00:52
  • @SuperCiocia changing the number of non-zero diagonals throws things off, unfortunately. Depending on the diagonals that are occupied, you might be able to use facts about circulant matrices; for your particular example these give us a nice trick. – Ben Grossmann Jan 18 '19 at 02:50
  • Wait, but mine is not a circulant matrix, right? The middle column does not have an $A$. – SuperCiocia Jan 18 '19 at 10:42
  • @SuperCiocia Please mark the question as answered or clarify if you need something more. – lightxbulb Jan 18 '19 at 10:56
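
A minimal NumPy sketch of the Kronecker-product construction from the comments above (added here for illustration only; the block values are arbitrary, and the scalar case corresponds to $1\times 1$ blocks):

```python
import numpy as np

# Illustrative 1x1 blocks; in the scalar case A and B are just numbers.
A = np.array([[1.0]])
B = np.array([[2.0]])

# P marks the occupied off-diagonals (entries 3 above and 3 below the main diagonal).
P = np.zeros((5, 5))
P[0, 3] = P[1, 4] = P[3, 0] = P[4, 1] = 1.0

# M = I_5 (kron) B  +  P (kron) A, as in the comment above.
M = np.kron(np.eye(5), B) + np.kron(P, A)
print(M)
```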

1 Answer


By using $Xv_i = \lambda_i v_i$ you can derive the eigenvectors $(\frac{1}{\sqrt{2}},0,0,\pm\frac{1}{\sqrt{2}},0)$, $(0,0,1,0,0)$, $(0, \frac{1}{\sqrt{2}},0,0,\pm\frac{1}{\sqrt{2}})$, with corresponding eigenvalues $\lambda = (B\pm A,\, B,\, B\pm A)$. Let $Q$ be the orthogonal matrix whose columns are these eigenvectors (in the given order); then $X = Q\,\operatorname{diag}(\lambda)\,Q^T$, where $X$ is your initial matrix. Note that this can be extended straightforwardly to larger matrices with the same structure.

lightxbulb
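
For illustration, a quick numerical check of the decomposition above (a sketch added by the editor, not part of the original answer; the scalar values of $A$ and $B$ are arbitrary):

```python
import numpy as np

A, B = 1.0, 2.0  # arbitrary scalar values for the check

# The matrix from the question.
X = B * np.eye(5)
for i, j in [(0, 3), (1, 4), (3, 0), (4, 1)]:
    X[i, j] = A

s = 1 / np.sqrt(2)
# Columns of Q are the eigenvectors above, in the order they are listed.
Q = np.array([
    [s,  s, 0, 0, 0],
    [0,  0, 0, s, s],
    [0,  0, 1, 0, 0],
    [s, -s, 0, 0, 0],
    [0,  0, 0, s, -s],
])
lam = np.array([B + A, B - A, B, B + A, B - A])

# Q is orthogonal, so X should equal Q diag(lam) Q^T up to round-off.
print(np.allclose(X, Q @ np.diag(lam) @ Q.T))  # True
```
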
  • I will confirm this as answered; just out of curiosity, did you just brute-force the eigenvectors, or did you use any tricks/known methods? – SuperCiocia Jan 18 '19 at 11:01
  • @SuperCiocia What do you mean by brute force? I just wrote out the equations $Xv_i = \lambda_i v_i$ and noticed the structure (the 3rd component doesn't affect the others, components 1 and 4 only affect each other, and likewise 2 and 5). – lightxbulb Jan 18 '19 at 11:08