Ridge regression can be posed as minimizing the following objective function (over $x$):
$$\frac{1}{2} \lVert Ax - b \rVert_2^2 + \frac{\lambda}{2} \lVert x \rVert_2^2$$
This has the closed-form solution:
$$x = (A^TA + \lambda I)^{-1} A^T b $$
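As a sanity check, the closed form can be verified numerically: at the stated $x$, the gradient of the objective, $A^T(Ax - b) + \lambda x$, should vanish. A minimal Python/NumPy sketch (with made-up random data):

```python
import numpy as np

# Hypothetical problem data for illustration.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1

# Closed-form ridge solution: x = (A^T A + lambda I)^{-1} A^T b
x = np.linalg.solve(A.T @ A + lam * np.eye(5), A.T @ b)

# Gradient of (1/2)||Ax - b||^2 + (lambda/2)||x||^2 at the solution.
grad = A.T @ (A @ x - b) + lam * x
print(np.allclose(grad, 0))  # True: x is a stationary point
```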
Let's say we want to solve this problem for several different values of $b$. Then it's a good idea to cache the LU decomposition of $A^TA + \lambda I$. In MATLAB this translates to:
[L,U] = lu(A'*A + lambda*eye(size(A,2)));  % factor once
x1 = U \ (L \ (A'*b1));  % forward, then back substitution
x2 = U \ (L \ (A'*b2));
% etc...
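The same caching pattern in Python, sketched with SciPy's `lu_factor`/`lu_solve` on hypothetical data:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Hypothetical problem data for illustration.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 6))
lam = 0.5
M = A.T @ A + lam * np.eye(6)

lu_piv = lu_factor(M)  # factor once, O(n^3)

# Each new right-hand side costs only two triangular solves, O(n^2).
b1 = rng.standard_normal(30)
b2 = rng.standard_normal(30)
x1 = lu_solve(lu_piv, A.T @ b1)
x2 = lu_solve(lu_piv, A.T @ b2)

# Agrees with solving the system from scratch.
print(np.allclose(x1, np.linalg.solve(M, A.T @ b1)))  # True
```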
Here's my question: if $\lambda$ is updated after several solves, is there a simple/efficient way of updating $L$ and $U$? Or do I have to completely redo the LU factorization?