2.2 Tikhonov Regularization
Tikhonov regularization, named for the Russian mathematician Andrey Tikhonov, addresses the issue that arises when the least squares method is applied to an ill-posed inverse problem by adding a penalty term to the minimization: $$\min_x\{\Vert Ax-b \Vert ^2_2 + \lambda^2\Vert x\Vert^2_2\}$$ In the new minimization, the penalty term is the squared norm of $x$, and $\lambda$ is a non-negative constant chosen in advance, acting as a weight on the strength of the penalty. Some equivalent ways to write Tikhonov regularization include $$\bbox[lightgreen]{(A^T A + \lambda^2 I)x=A^{T}b}$$ and $$\min \left\Vert\genfrac{[}{]}{0pt}{}{A}{\lambda I}x - \genfrac{[}{]}{0pt}{}{b}{0}\right\Vert$$
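As a quick numerical sanity check of the equivalence between the green normal-equations form and the stacked least-squares form, here is a short NumPy sketch (the matrix sizes and $\lambda$ value are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((10, 4))   # arbitrary overdetermined system
b = rng.standard_normal(10)
lam = 0.5                          # regularization weight lambda

n = A.shape[1]

# Green form: solve the normal equations (A^T A + lambda^2 I) x = A^T b
x_normal = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)

# Stacked form: min || [A; lambda I] x - [b; 0] ||_2 via ordinary least squares
A_aug = np.vstack([A, lam * np.eye(n)])
b_aug = np.concatenate([b, np.zeros(n)])
x_stacked, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)

print(np.allclose(x_normal, x_stacked))  # → True: both forms give the same minimizer
```

The stacked system's own normal equations are $(A^T A + \lambda^2 I)x = A^T b$, which is why the two solutions agree.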
Can someone explain the alternate formulation highlighted in green? I am not able to convert it to the standard formulation; I end up with an extra factor of $(A^T)^{-1}$. The second formulation can easily be converted to the original one, though. Thanks.