Let's say we want to solve $Ax=b$ ($A$ symmetric positive *semi*definite) with the conjugate residual/gradient method. $A$ comes from FEM where the mesh is being refined. The exact solution is $x_*$ and the approximate solution after $m$ iterations is $x_m$. Say that the initial guess is $x_0=0$.
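(As a sanity check on the quantities involved: since $x_*-x_m=A^{-1}r_m$, the $A$-inner product of the error equals $(r_m, A^{-1}r_m)$, which is why the error norm is not directly computable even though it is expressible through the residual. A quick numerical confirmation, my own sketch with a random SPD matrix standing in for the FEM matrix:)

```python
import numpy as np

rng = np.random.default_rng(0)
# Random well-conditioned SPD matrix standing in for the FEM matrix A.
M = rng.standard_normal((20, 20))
A = M @ M.T + 20 * np.eye(20)
b = rng.standard_normal(20)
x_star = np.linalg.solve(A, b)
x_m = x_star + 0.01 * rng.standard_normal(20)  # some approximate solution
r_m = b - A @ x_m

e = x_star - x_m
lhs = e @ A @ e                        # (x_* - x_m, A(x_* - x_m))
rhs = r_m @ np.linalg.solve(A, r_m)    # (r_m, A^{-1} r_m)
print(np.isclose(lhs, rhs))  # prints True
```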
We have that $$\varepsilon=\left(\frac{(x_*-x_m, A(x_*-x_m))}{(x_*-x_0, A(x_*-x_0))}\right)^{1/2}=\left(\frac{(A^{1/2}(x_*-x_m), A^{1/2}(x_*-x_m))}{(A^{1/2}(x_*-x_0), A^{1/2}(x_*-x_0))}\right)^{1/2}\le2\left(\frac{\sqrt{\kappa}-1}{\sqrt{\kappa}+1}\right)^m$$ with $\kappa$ being the condition number of $A$. But how can I calculate or estimate this in practice? I don't have $x_*$; all I have is $x_m$ and thus $r_m=b-Ax_m$. Sure, I can calculate \begin{align} \tilde{\varepsilon}=\left(\frac{(r_m,r_m)}{(r_0,r_0)}\right)^{1/2}&=\left(\frac{(b-Ax_m,b-Ax_m)}{(b-Ax_0,b-Ax_0)}\right)^{1/2} \\ &=\left(\frac{(A(x_*-x_m),A(x_*-x_m))}{(A(x_*-x_0),A(x_*-x_0))}\right)^{1/2}=\left(\frac{(A(x_*-x_m),A(x_*-x_m))}{(b,b)}\right)^{1/2} \end{align} but then what?

For a fixed $A$ I think I'll still get the same asymptotic decrease in the $A$-norm when using $\tilde{\varepsilon}$ as a stopping criterion, namely $\left(\frac{\sqrt{\kappa}-1}{\sqrt{\kappa}+1}\right)^m$, so if I iterate until $\tilde{\varepsilon}<\tilde{\varepsilon}_{\mathrm{max}}$ then the number of iterations needed is $m=c\sqrt{\kappa}$. But even so, $c$ depends on $A$! That means my iteration count could behave oddly if I'm using the algorithm for different matrices $A$ (corresponding to a mesh that has been refined a different number of times), does it not? Using $\tilde{\varepsilon}$ might not give the wanted $\sqrt{\kappa}$ growth of the iteration count. (Now, I'm not sure why $\varepsilon$ is a better stopping criterion than $\tilde{\varepsilon}$, but using one for the theory and the other in the implementation without understanding how they're connected feels very immoral.)
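(To get a feel for how $\varepsilon$ and $\tilde{\varepsilon}$ compare in practice, here is a small experiment of my own, using a 1D Laplacian as a stand-in for the FEM stiffness matrix so that $x_*$ is available for reference. The factor $\sqrt{\kappa}$ printed alongside reflects the norm-equivalence bound $\varepsilon\le\sqrt{\kappa}\,\tilde{\varepsilon}$, which follows from $\|e\|_A^2=(r,A^{-1}r)$ and $\lambda_{\min}\le\lambda\le\lambda_{\max}$.)

```python
import numpy as np

# Stand-in for the FEM stiffness matrix: 1D Laplacian (tridiagonal, SPD).
n = 50
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x_star = np.linalg.solve(A, b)
kappa = np.linalg.cond(A)

def cg(A, b, m):
    """Plain conjugate gradient, m iterations, x0 = 0; returns (x_m, r_m)."""
    x = np.zeros_like(b)
    r = b.copy()          # r_0 = b - A x_0 = b
    p = r.copy()
    rs = r @ r
    for _ in range(m):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, r

for m in (5, 10, 20):
    x_m, r_m = cg(A, b, m)
    e_m = x_star - x_m
    eps = np.sqrt(e_m @ A @ e_m) / np.sqrt(x_star @ A @ x_star)  # needs x_*
    eps_tilde = np.linalg.norm(r_m) / np.linalg.norm(b)          # computable
    print(f"m={m:2d}  eps={eps:.2e}  eps_tilde={eps_tilde:.2e}  "
          f"sqrt(kappa)*eps_tilde={np.sqrt(kappa) * eps_tilde:.2e}")
```

In this experiment the residual-based $\tilde{\varepsilon}$ tracks the true $A$-norm error only up to the $\sqrt{\kappa}$ factor, and $\kappa$ grows as the mesh is refined, which is exactly the mesh-dependence worried about above.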