Formulate the data fitting problem as a least squares problem
$\frac {1}{2} \Vert Ax-b \Vert_2^2 $
I thought I was supposed to write it like this:
$ \frac {1}{2} x^THx + g^Tx + \gamma$, but that is an unconstrained quadratic program; any help?
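For concreteness, here is a quick numpy sanity check I put together (random data of my own, nothing from the problem statement) that the two objectives agree once you take $H = A^TA$, $g = -A^Tb$, $\gamma = \frac{1}{2}b^Tb$:

```python
import numpy as np

# Sanity check: expanding (1/2)||Ax - b||^2 gives the QP objective
# (1/2) x^T H x + g^T x + gamma with H = A^T A, g = -A^T b, gamma = (1/2) b^T b.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))   # arbitrary data matrix
b = rng.standard_normal(5)        # arbitrary right-hand side
x = rng.standard_normal(3)        # arbitrary evaluation point

ls = 0.5 * np.linalg.norm(A @ x - b) ** 2        # least squares form
H, g, gamma = A.T @ A, -A.T @ b, 0.5 * b @ b     # QP data
qp = 0.5 * x @ H @ x + g @ x + gamma             # quadratic program form

print(np.isclose(ls, qp))  # True: same objective, two notations
```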
In data fitting, we are interested in solving:
$$\hat{\boldsymbol \theta} = \underset{{\boldsymbol{\theta}} \in \mathbb{R} ^{D+1}}{\arg\min}\, J(\boldsymbol{\theta})$$
The error function $J\colon \mathbb{R}^{D+1}\to\mathbb{R}$ is given by $$J(\boldsymbol{\theta}) = \frac{1}{2N} \sum\limits_{n=1}^N \{h_{\boldsymbol{\theta}}(\boldsymbol{\phi}^{(n)})-t^{(n)}\}^2$$
where $\boldsymbol{\phi}^{(n)}$ is the feature vector of the $n$-th sample, produced by a map/transformation $\boldsymbol{\phi}$ of the raw inputs and augmented with a bias component $\phi^{(n)}_0 = 1$, so that $\boldsymbol{\phi}^{(n)} \in \mathbb{R}^{D+1}$. The hypothesis $h_{\boldsymbol{\theta}}(\boldsymbol{\phi}^{(n)})$ we want to fit is given by:
$$h_{\boldsymbol{\theta}}(\boldsymbol{\phi}^{(n)}) = h(\boldsymbol{\phi}^{(n)},\boldsymbol{\theta}) = \theta_0 + \theta_1 \phi^{(n)}_1 + \theta_2 \phi^{(n)}_2 + \dots + \theta_D \phi^{(n)}_D = \sum\limits_{d=0}^D \theta_d \phi^{(n)}_d, \quad \phi^{(n)}_0 = 1$$
If we define the parameter vector $\boldsymbol{\theta} = [\theta_0, \theta_1, \dots, \theta_D]^T \in \mathbb{R}^{D+1}$, the vectorized forms of the hypothesis and the error function are $$h_{\boldsymbol{\theta}}(\boldsymbol{\phi}^{(n)}) = \boldsymbol{\theta}^T \boldsymbol{\phi}^{(n)}$$ and $$J(\boldsymbol{\theta}) = \frac{1}{2N} (\boldsymbol{\Phi}\boldsymbol{\theta}- \mathbf{t})^T(\boldsymbol{\Phi}\boldsymbol{\theta}- \mathbf{t}) = \frac{1}{2N}||\boldsymbol{\Phi}\boldsymbol{\theta}- \mathbf{t}||^2$$ with
$$\boldsymbol{\Phi} = \begin{bmatrix} (\boldsymbol{\phi}^{(1)})^T \\[0.3em] (\boldsymbol{\phi}^{(2)})^T \\[0.3em] \vdots \\[0.3em] (\boldsymbol{\phi}^{(N)})^T \end{bmatrix}= \begin{bmatrix} 1 & \phi^{(1)}_1 & \phi^{(1)}_2 & \dots & \phi^{(1)}_D \\[0.3em] 1 & \phi^{(2)}_1 & \phi^{(2)}_2 & \dots & \phi^{(2)}_D \\[0.3em] \vdots & \vdots & \vdots & & \vdots \\[0.3em] 1 & \phi^{(N)}_1 & \phi^{(N)}_2 & \dots & \phi^{(N)}_D \end{bmatrix} \in \mathbb{R}^{N \times (D+1)}$$
Finally, expanding the square gives the quadratic form of the error function:
$$J(\boldsymbol{\theta}) = \frac{1}{2N} \Bigg\{ \boldsymbol{\theta}^T \boldsymbol{\Phi}^T\boldsymbol{\Phi}\boldsymbol{\theta} -2 \boldsymbol{t}^T \boldsymbol{\Phi} \boldsymbol{\theta} + \boldsymbol{t}^T\boldsymbol{t} \Bigg\}$$
This is exactly an unconstrained quadratic program $\frac{1}{2}\boldsymbol{\theta}^T H \boldsymbol{\theta} + \mathbf{g}^T \boldsymbol{\theta} + \gamma$ with $H = \frac{1}{N}\boldsymbol{\Phi}^T\boldsymbol{\Phi}$, $\mathbf{g} = -\frac{1}{N}\boldsymbol{\Phi}^T\mathbf{t}$, and $\gamma = \frac{1}{2N}\mathbf{t}^T\mathbf{t}$, which is why every linear least squares problem can also be written as an unconstrained QP.
PS: This methodology is also called multivariable linear regression.
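A minimal numpy sketch of this formulation (the data is synthetic and all variable names here are my own):

```python
import numpy as np

# Synthetic data: N samples with D raw features each (values are made up).
rng = np.random.default_rng(1)
N, D = 50, 3
features = rng.standard_normal((N, D))   # phi^{(n)}_1 ... phi^{(n)}_D
t = rng.standard_normal(N)               # targets t^{(n)}

# Design matrix: prepend the bias column phi_0 = 1 to every sample.
Phi = np.hstack([np.ones((N, 1)), features])

# Least squares fit: minimize (1/2N)||Phi theta - t||^2.
theta, *_ = np.linalg.lstsq(Phi, t, rcond=None)

# Check that the residual form and the quadratic form of J agree.
J_residual = (Phi @ theta - t) @ (Phi @ theta - t) / (2 * N)
J_quadratic = (theta @ Phi.T @ Phi @ theta
               - 2 * t @ Phi @ theta + t @ t) / (2 * N)
print(np.isclose(J_residual, J_quadratic))  # True
```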
Formulate the data fitting problem:
$\hat y(\theta,t)={\theta}_0+{\theta}_1e^{-t}+{\theta}_2t^2$, $\quad\theta=[{\theta}_0,{\theta}_1,{\theta}_2]^T$
$y_i=\hat y(\theta,t_i)+e_i$, $\quad i=1,2,\dots,m$
as a least squares problem
$\frac {1}{2} \Vert Ax-b \Vert_2^2 $
I did it this way:
A=$\begin{bmatrix}1& e^{-t_1}& t_1^2\\1& e^{-t_2}& t_2^2 \\ \vdots&\vdots&\vdots\\1& e^{-t_m}& t_m^2\end{bmatrix} $ $\in \mathbb{R}^{m \times3}$
x=$\begin{bmatrix} {\theta}_0 \\ {\theta}_1 \\ {\theta}_2 \end{bmatrix} $
$b=[y_1, y_2, \dots, y_m]^T \in \mathbb{R}^m$
and the number of unknowns is $n=3$.
Is it right?
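As a quick numerical check of this construction (the sample times $t_i$ and the "true" $\theta$ used to generate the data below are made up for illustration):

```python
import numpy as np

# Made-up sample times and parameters, just to exercise the construction.
rng = np.random.default_rng(2)
m = 20
t = np.linspace(0.0, 2.0, m)
theta_true = np.array([1.0, -2.0, 0.5])

# Each row of A is [1, e^{-t_i}, t_i^2], evaluated at sample time t_i.
A = np.column_stack([np.ones(m), np.exp(-t), t**2])
y = A @ theta_true + 0.05 * rng.standard_normal(m)  # y_i = yhat(theta, t_i) + e_i

b = y                                               # stacked observations
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)       # minimizes (1/2)||Ax - b||^2
print(x_hat)  # close to theta_true, so the construction is consistent
```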