Suppose we have incomplete observations of the square matrix $X$. Most matrix completion algorithms assume the matrix is low-rank. What if instead we assume the matrix of eigenvectors is a tensor product: $V = A\otimes B$, $X=VDV^{-1}$? This still dramatically reduces the space of possible matrices, so it may be possible to exactly recover a matrix of this form. How would we solve this algorithmically?
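One way to see the question concretely is to pose recovery as nonlinear least squares over the factors. The sketch below is only an illustrative assumption, not an established algorithm: it parameterizes $X = V D V^{-1}$ with $V = A \otimes B$ (Kronecker product) and fits the observed entries with a generic optimizer. All names here (`unpack`, `loss`, etc.) are hypothetical.

```python
# Hypothetical sketch: fit X = V D V^{-1}, V = kron(A, B), to a partially
# observed matrix by least squares over the factors (A, B, diag(D)).
# This is an illustrative formulation, not a recovery guarantee.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m = 2                      # A, B are m x m, so X is m^2 x m^2
n = m * m

# Ground-truth factors used to synthesize a structured X
A_true = rng.standard_normal((m, m))
B_true = rng.standard_normal((m, m))
d_true = rng.standard_normal(n)
V_true = np.kron(A_true, B_true)
X_true = V_true @ np.diag(d_true) @ np.linalg.inv(V_true)

# Observe a random subset of entries
mask = rng.random((n, n)) < 0.7

def unpack(p):
    A = p[:m * m].reshape(m, m)
    B = p[m * m:2 * m * m].reshape(m, m)
    d = p[2 * m * m:]
    return A, B, d

def loss(p):
    A, B, d = unpack(p)
    V = np.kron(A, B)
    # pinv instead of inv so the optimizer survives near-singular iterates
    X = V @ np.diag(d) @ np.linalg.pinv(V)
    return np.sum((X - X_true)[mask] ** 2)

p0 = rng.standard_normal(2 * m * m + n)
res = minimize(loss, p0, method="L-BFGS-B")
print("initial masked loss:", loss(p0))
print("final masked loss:  ", res.fun)
```

Note the parameter count: the factored form has $2m^2 + m^2$ degrees of freedom versus $m^4$ entries of $X$, which is the "dramatic reduction" the question refers to. The objective is nonconvex, so a local method like this may stall; whether exact recovery is possible from partial observations is exactly the open part of the question.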
- If $V$ is the tensor product of two vectors $A,B$ (I suppose this is what you mean), then it has rank 2 and $V^{-1}$ in your formula does not exist unless you are only considering $2\times 2$ matrices. – Wolfgang Bangerth Feb 08 '14 at 13:05
- No, $A$ and $B$ can be full matrices. For instance, the 2D Fourier basis is the tensor product of two 1D Fourier bases, both of which are invertible matrices. – David P Feb 09 '14 at 02:38
- Do you mean $V$ is the Kronecker product of matrices $A$ and $B$? – sebas Feb 09 '14 at 09:26
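The Fourier example raised in the comments can be checked numerically: the 2D DFT matrix is the Kronecker product of two 1D DFT matrices, and it is invertible, so $V = A \otimes B$ need not be rank-deficient.

```python
# Check: the 2D DFT equals the Kronecker product of two 1D DFT matrices
# acting on the row-major vectorization of the image.
import numpy as np
from scipy.linalg import dft

n = 4
F = dft(n)           # unnormalized 1D DFT matrix, n x n, invertible
F2 = np.kron(F, F)   # candidate 2D DFT on vectorized n x n images

img = np.random.default_rng(1).standard_normal((n, n))
lhs = np.fft.fft2(img).ravel()   # fft2(img) = F @ img @ F.T
rhs = F2 @ img.ravel()           # vec(F img F^T) = (F kron F) vec(img)
print(np.allclose(lhs, rhs))     # expect True
```

This uses the identity $\operatorname{vec}(A M B^{\mathsf T}) = (A \otimes B)\operatorname{vec}(M)$ for row-major vectorization, which is why the Kronecker structure on $V$ corresponds to a separable (row/column) change of basis.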