I have a time-series observation dataset that has been distorted, and I want to recover the best possible approximation of the original signal. Disclaimer: I only know the basics of linear algebra, so please bear with me.
I have a good model of the distortion, represented by a square matrix. In theory, if I can find the inverse of that matrix, I can recover the original signal. However, the distortion matrix is ill-conditioned (i.e. almost singular).
Is it possible to take this matrix model and generate an invertible, better-conditioned matrix that approximates the original one, so that its inverse gives a usable reconstruction?
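For concreteness, here is a minimal toy sketch of what I think I am asking for (Python/NumPy). Everything in it is made up for illustration: the Gaussian-blur matrix `A` stands in for my distortion model, the sine wave `x` stands in for the original signal, and the `rcond=1e-3` cutoff is just a guess, not a principled choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for my problem: a clean signal x and a known distortion A.
n = 200
x = np.sin(np.linspace(0, 8 * np.pi, n))            # "original" signal (illustrative)

# Gaussian smoothing matrix as the distortion model; matrices like this
# have many tiny singular values, i.e. they are ill-conditioned.
i, j = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
A = np.exp(-0.5 * ((i - j) / 3.0) ** 2)
A /= A.sum(axis=1, keepdims=True)

y = A @ x + 1e-8 * rng.standard_normal(n)           # distorted observation + slight noise

print("condition number of A:", np.linalg.cond(A))  # huge => almost singular

# Naive recovery with the exact inverse: the near-zero singular values
# blow up the noise and the result is useless.
x_naive = np.linalg.solve(A, y)
print("naive recovery error:", np.linalg.norm(x_naive - x))

# The kind of thing I have in mind: drop singular values below a cutoff and
# invert what is left (a truncated-SVD pseudoinverse), so the "inverse" comes
# from a nearby, better-behaved matrix.
x_approx = np.linalg.pinv(A, rcond=1e-3) @ y
print("approximate recovery error:", np.linalg.norm(x_approx - x))
```

If discarding small singular values like this is not what "an invertible approximation of the matrix" should mean, that is exactly the sort of correction I am hoping for.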