
I need to solve a real generalized eigenvalue problem

$$Ax = \lambda Bx \tag{*}$$

$A$ and $B$ are computed from the equations below:

$$A=\sum_{i,j=1}^{N}W_{ij}(K_{i}-K_{j})\beta\beta^{T}(K_{i}-K_{j})^{T}$$

$$B=\sum_{i=1}^{N}D_{ii}K_{i}\beta\beta^{T}K_{i}^{T}$$

where $W$ is a real symmetric $N \times N$ matrix with zero diagonal entries and off-diagonal entries in $(0,1)$.

$D$ is an $N \times N$ diagonal matrix with $D_{ii}=\sum_{j=1}^N W_{ij}$.

$K_i$ is an $N \times M$ matrix with all entries positive.

$\beta>0$ is an $M$-dimensional column vector.
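In NumPy-like code, the construction looks like this (the sizes $N$, $M$ and the random $W$, $K_i$, $\beta$ below are just placeholders for illustration, not my actual data):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 5, 3

# W: real symmetric, zero diagonal, off-diagonal entries in (0, 1)
W = rng.uniform(0.1, 0.9, size=(N, N))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

D = np.diag(W.sum(axis=1))                  # D_ii = sum_j W_ij
K = rng.uniform(0.1, 1.0, size=(N, N, M))   # N matrices, each N x M, all positive
beta = rng.uniform(0.1, 1.0, size=M)        # beta > 0, length M

v = K @ beta                                # v[i] = K_i beta, each a length-N vector

# A = sum_{i,j} W_ij (K_i - K_j) beta beta^T (K_i - K_j)^T
A = sum(W[i, j] * np.outer(v[i] - v[j], v[i] - v[j])
        for i in range(N) for j in range(N))

# B = sum_i D_ii K_i beta beta^T K_i^T
B = sum(D[i, i] * np.outer(v[i], v[i]) for i in range(N))

print(np.allclose(A, A.T), np.allclose(B, B.T))  # True True: both symmetric
```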

From the equations above, $A$ and $B$ should be symmetric positive semi-definite, and $B$ should be positive definite (I did a proof myself).

Perhaps because of numerical error (I am not sure :( ), $B$ appears to have small negative eigenvalues (I compute the eigenvalue decomposition with the LAPACK routine dsyev()), and $(*)$ yields complex eigenvalues.

I want to select the $P$ smallest eigenvalues of this generalized eigenvalue problem, so complex values here are a real problem. Is there any way to avoid complex eigenvalues in such a case?

By the way, I use Armadillo as the linear algebra library and solve $(*)$ directly with the LAPACK routine dggev().

Any suggestions will be appreciated.

ZeyuHu
I assume there's a mistake in how $A$ and $B$ are defined: right now, they both appear to be rank-1 matrices, $A=\alpha_1 \beta \beta^T$, $B=\alpha_2 \beta \beta^T$. – Nico Schlömer May 14 '13 at 09:12
Hmm, I guess the sum of $N$ rank-1 matrices is not always rank-1? $A$ and $B$ are sums of $N$ of those rank-1 matrices. – ZeyuHu May 14 '13 at 09:23
  • Ah, I see now that the $K_i$s are matrices. – Nico Schlömer May 14 '13 at 09:43

1 Answer


If, as you say, you are sure that you have a symmetric-definite pencil (that is, $\mathbf A$ is symmetric and $\mathbf B$ is symmetric positive definite), then LAPACK already has something for directly handling your problem: dsygv(). It performs a Cholesky decomposition of $\mathbf B$ (if your $\mathbf B$ is in fact not symmetric positive definite, you should see a warning), after which the Cholesky triangle thus produced is used to convert your generalized eigenproblem into a regular symmetric eigenproblem that can be solved with all the usual methods. Since this method requires inverting the Cholesky triangle of $\mathbf B$, you will probably want to check first that $\mathbf B$ is well-conditioned; you can use dsycon() for this purpose.
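For quick prototyping before wiring dsygv() up through Armadillo, the same symmetric-definite driver is exposed (as far as I know, via the same LAPACK ?sygv family) as scipy.linalg.eigh(A, B). A minimal sketch on small random stand-in matrices, not your actual data:

```python
import numpy as np
from scipy.linalg import cholesky, eigh

rng = np.random.default_rng(1)
n = 4

# Random symmetric A and symmetric positive-definite B as stand-ins.
X = rng.standard_normal((n, n))
A = (X + X.T) / 2
Y = rng.standard_normal((n, n))
B = Y @ Y.T + n * np.eye(n)        # diagonally loaded, so safely SPD

# cholesky() raises LinAlgError exactly when B is not SPD,
# mirroring the error dsygv()/dpotrf() would report.
L = cholesky(B, lower=True)

# eigh with two arguments solves the symmetric-definite pencil (A, B);
# eigenvalues come back real and sorted ascending.
w, V = eigh(A, B)

# Check the defining relation A v = lambda B v for every pair.
print(np.allclose(A @ V, B @ V * w))  # True
```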


There is an alternative method, also discussed in Golub and Van Loan, based on the eigendecomposition of $\mathbf B$, for when the Cholesky route fails. Briefly, the procedure goes like so: given the eigendecomposition $\mathbf B=\mathbf V\mathbf D\mathbf V^\top$, form the matrix $\mathbf W=\mathbf V\sqrt{\mathbf D}$, where $\sqrt{\mathbf D}$ is obtained by taking the square roots of the diagonal entries. Having formed $\mathbf W$, form $\mathbf C=\mathbf W^{-1}\mathbf A\mathbf W^{-\top}$, which has the same eigenvalues as the pencil $(\mathbf A,\mathbf B)$. (I'll leave the procedure of how to form the eigenvectors as an exercise.)
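The steps above can be sketched like so (again on small random stand-in matrices; the clipping line is a guard for the tiny negative round-off eigenvalues you may see, not part of the exact algorithm):

```python
import numpy as np
from numpy.linalg import eigh, eigvals, eigvalsh, solve

rng = np.random.default_rng(2)
n = 4

X = rng.standard_normal((n, n))
A = (X + X.T) / 2                  # symmetric stand-in
Y = rng.standard_normal((n, n))
B = Y @ Y.T + np.eye(n)            # SPD stand-in

# Eigendecomposition route: B = V D V^T, then W = V sqrt(D).
d, V = eigh(B)
d = np.clip(d, 0.0, None)          # guard against tiny negative round-off
W = V * np.sqrt(d)                 # same as V @ diag(sqrt(d))

# C = W^{-1} A W^{-T}; use solves rather than explicit inverses.
C = solve(W, solve(W, A.T).T)
C = (C + C.T) / 2                  # re-symmetrize against round-off

# The eigenvalues of C match those of the pencil (A, B),
# here cross-checked against the eigenvalues of B^{-1} A.
lam_ref = np.sort(eigvals(solve(B, A)).real)
print(np.allclose(np.sort(eigvalsh(C)), lam_ref))  # True
```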

J. M.
  • I mean $B$ should be positive definite: given an arbitrary $N$-dimensional column vector $x\neq 0$, let $v_{i}=K_{i}\beta$, which is an $N$-dimensional column vector with $v_{i}>0$ entrywise; then $$x^TBx=\sum_{i=1}^{N}D_{ii}x^Tv_{i}v_{i}^Tx=\sum_{i=1}^{N}D_{ii}(x^Tv_{i})^2>0.$$ But numerically it isn't, for I get negative eigenvalues of $B$ after dsyev(). I want to find a way to make this $B$ "numerically positive definite" before going any further with the dsygv() you quoted. Could this be possible, or when $B$ is ill-conditioned can this problem not be solved efficiently? – ZeyuHu May 14 '13 at 12:10
  • That's why I told you to check with dsycon(); what is it returning for your right matrix? – J. M. May 14 '13 at 12:21
  • Ah, I get an error using dsycon(), but I do get the reciprocal of the condition number via $RCOND = 1 / (norm() * norm(inv()))$. Here $RCOND(B)=1.2296\mathrm{e}{-20}$. Hmm, it's really ill-conditioned, right? – ZeyuHu May 15 '13 at 02:47
  • Then yes, your $\mathbf B$ is certainly ill-conditioned. I'll edit my post later to include an alternative method. Unfortunately, that method isn't explicitly implemented in LAPACK, so you'll have to implement it yourself. – J. M. May 15 '13 at 02:49
  • This method is quite new to me, and again thank you for your help, I'll post the results later after I finish the implementation. $;)$ – ZeyuHu May 15 '13 at 10:15
  • I think I know how to form the eigenvectors: $Ax=\lambda Bx\Leftrightarrow B^{-1}Ax=\lambda x$. After we do the eigendecomposition $B=VDV^{T}$ and form $B=WW^{T}$, we get an invertible $W$, so $B^{-1}=W^{-T}W^{-1}$; then we have $W^{-T}W^{-1}Ax=\lambda x$. Multiplying by $W^T$ on the left gives $W^{-1}Ax=\lambda W^{T}x$, and inserting $W^{-T}W^{T}$ we form $$W^{-1}AW^{-T}(W^{T}x)=\lambda (W^{T}x).$$ Let $C=W^{-1}AW^{-T}$; so we see $C$ has eigenvectors $W^{T}x$. Is that correct? But another question came up: when doing the eigendecomposition of $B$, I get some small negative doubles; should I simply abandon them? – ZeyuHu May 15 '13 at 13:00
  • You got it. :) Good show! – J. M. May 15 '13 at 13:02
  • Maybe you didn't see what I wrote later (I edited the reply too slowly...), so should I just set the tiny negative eigenvalues of $B$ (from dsyev()) to zero, or do something else? – ZeyuHu May 15 '13 at 13:18
  • Okay, try this: how about computing the singular value decomposition of $\mathbf B$? – J. M. May 15 '13 at 13:21
  • Right this time!!! But I wonder if it will be dangerous to invert $W$ and $W^{T}$, because they might be ill-conditioned as well. – ZeyuHu May 15 '13 at 13:34
  • That's why you look at your matrices with a condition estimator first, before inverting them. – J. M. May 15 '13 at 13:58
  • Ah I see... condition estimating comes first. – ZeyuHu May 15 '13 at 14:07
  • The results show that a few eigenvalues of $C$ are negative, mainly because $W$ is also ill-conditioned, with $RCOND(W)=1.82103\mathrm{e}{-14}$; I don't know what to do now. Will the pencil $(A,B)$ generate negative eigenvalues if $B$ is positive definite? – ZeyuHu May 16 '13 at 02:52
  • Certainly, it is possible for a symmetric-definite pencil to have negative eigenvalues. The only thing guaranteed about a symmetric-definite pencil is that all of its eigenvalues are real. – J. M. May 16 '13 at 03:01
  • What if $A$ is symmetric semi-definite? – ZeyuHu May 16 '13 at 03:09
  • Again, let me remind you what it means to have a symmetric-definite pencil $(\mathbf A,\mathbf B)$: the condition is that $\mathbf A$ should be symmetric (definiteness does not come into play) and that $\mathbf B$ is symmetric positive definite. The usual eigenvalue problem can be treated as a symmetric-definite pencil, as $\mathbf I$ is after all SPD. – J. M. May 16 '13 at 03:18
  • I do know that it is unnecessary to have a semi-definite $A$ to form a symmetric-definite pencil, but here the $A$ in my project is sure to be semi-definite, because the equation it comes from is similar to that of $B$: $A=\sum_{i,j=1}^{N}W_{ij}(K_{i}-K_{j})\beta \beta^{T}(K_{i}-K_{j})^{T}$. Since a semi-definite matrix has only non-negative eigenvalues, I guess what I get from the result is not right. Maybe I should check my project to keep ill-conditioned matrices from coming out... – ZeyuHu May 16 '13 at 04:17
  • Hmm, you're right, if $\mathbf A$ is semidefinite, then your pencil's eigenvalues ought to be nonnegative. What happens if you take the SVD of $\mathbf W$ (and while you're at it, compare with the results of eigendecomposing $\mathbf W$)? – J. M. May 16 '13 at 04:28
  • I just compare the eigenvalues and singular values of $W$ and they are totally different. – ZeyuHu May 16 '13 at 05:07
  • You understand that they were supposed to be the same, no? It seems you might need to reformulate your problem; I'm no longer seeing any way to salvage this, unless you can do arbitrary precision. – J. M. May 16 '13 at 05:35
  • Hmm, they should not be the same here, and I don't get what to compare them for. – ZeyuHu May 16 '13 at 06:27
  • Thank you so much for your answer! I'll go check previous part of my work hoping to find something useful. – ZeyuHu May 16 '13 at 06:37
  • "Hmm they should not be the same here" - they should, if your supposition of positive semi-definiteness is correct. Recall the forms of the SVD and the eigendecomposition, and compare. – J. M. May 16 '13 at 10:36
  • When I do the SVD $B = UDV^{T}$, the $U$ and $V$ are not the same as the $V$ in $B=VDV^{T}$; I think that's why the results are no good. – ZeyuHu May 16 '13 at 12:58
  • That's a diagnostic, isn't it? It means your $\mathbf B$ is too ill-conditioned, and you will need to look for a better formulation of your problem. – J. M. May 16 '13 at 13:11
  • I have a feeling you'll want to read this. – J. M. May 16 '13 at 22:05