
So I'm trying to derive an analytical solution for an MLE that estimates a static value corrupted by multiplicative Gaussian noise.

The vector of measurements $\tilde{\boldsymbol{d}}$ has entries $\tilde{d}[n] = a[n]x_u$, where $a[n]$ is the $n$-th realization of the random variable $A \sim \mathcal{N}(\mu_A, \sigma^2_A)$ and $x_u$ is an unknown constant that is to be estimated from $\tilde{\boldsymbol{d}}$.

I got as far as the log-likelihood function, which I now need to maximize with respect to $x$. To do this, I need to take the derivative of the log-likelihood, set it to zero, and solve for $x$. However, the summation term at the end is giving me a headache, as I can't figure out its derivative.

$\ell(x, \tilde{\boldsymbol{d}}) = -N \cdot \log x - N\cdot \log\sigma_A - \frac{N}{2} \log 2\pi - \frac{1}{2\sigma^2_A}\sum\limits_{n=1}^N\bigg(\frac{\tilde{d}[n]}{x}-\mu_A\bigg)^2$
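
As a side note, a brute-force numerical check of this log-likelihood is easy to set up. The sketch below (NumPy/SciPy, with made-up values for $\mu_A$, $\sigma_A$, $x_u$ and $N$ that are not from the problem itself) simply maximizes $\ell$ numerically, which is handy for verifying whatever closed-form estimator the derivation eventually produces:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Made-up parameters, only for a numerical sanity check.
rng = np.random.default_rng(0)
mu_A, sigma_A, x_u, N = 2.0, 0.5, 3.0, 1000
d = rng.normal(mu_A, sigma_A, N) * x_u          # d[n] = a[n] * x_u

def neg_log_likelihood(x):
    # l(x, d) = -N*log(x) - N*log(sigma_A) - (N/2)*log(2*pi)
    #           - 1/(2*sigma_A**2) * sum((d[n]/x - mu_A)**2)
    ll = (-N * np.log(x) - N * np.log(sigma_A) - N / 2 * np.log(2 * np.pi)
          - np.sum((d / x - mu_A) ** 2) / (2 * sigma_A ** 2))
    return -ll

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 10.0), method="bounded")
print(res.x)   # should land close to x_u = 3.0
```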

sobek

2 Answers


OK, let's have a look at one of the problematic terms: $$ \frac{\mathrm{d}}{\mathrm{d}x} \bigg[ \bigg(\frac{\tilde{d}[n]}{x}-\mu_A\bigg)^2 \bigg] = - \frac{2 \tilde{d}[n] \big(\tilde{d}[n] - \mu_A x\big)}{x^3}, $$ which can be verified with Wolfram Alpha.

The full derivative of the summation term is then just this summed over $n$.
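
If you'd rather not rely on Wolfram Alpha, the same identity can be checked symbolically, for example with SymPy (a small sketch; the symbol names are just placeholders):

```python
import sympy as sp

# Symbolic check of the per-term derivative, as an alternative to Wolfram Alpha.
x, d_n, mu_A = sp.symbols('x d_n mu_A', positive=True)
term = (d_n / x - mu_A) ** 2
derivative = sp.diff(term, x)
claimed = -2 * d_n * (d_n - mu_A * x) / x**3
print(sp.simplify(derivative - claimed))   # prints 0, i.e. the two expressions agree
```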

sobek
Peter K.

To be clear:

I'm assuming $\tilde{\boldsymbol{d}}$ is given as $a[n]x$, with a deterministic variable $x$, so that under the i.i.d. assumption on $\tilde{\boldsymbol{d}}$ you have $$ A \sim \mathcal{N}(\mu_A, \sigma^2_A) \Longrightarrow \tilde{d}[n] \sim \mathcal{N}(\mu_A x, \sigma^2_A x^2). $$

With this, only the first and last terms of your log-likelihood depend on $x$. I'm also assuming your $\log(\cdot)$ is the natural logarithm $\ln(\cdot)$; otherwise you'll have to adjust the quadratic term accordingly. This then gives you

\begin{align} \frac{\textrm{d}\left[\ell(x, \tilde{\boldsymbol{d}})\right]}{\textrm{d}x}&=\frac{-N}{x}-\frac{1}{2\sigma^2_A}\sum_{n=1}^N\left[ -2\frac{\tilde{d}[n]}{x^2}\left(\frac{\tilde{d}[n]}{x}-\mu_A\right)\right]\\ &=\frac{-N}{x}+\frac{1}{\sigma^2_A{x^3}}\sum_{n=1}^N\tilde{d}[n]\left(\tilde{d}[n]-\mu_A x\right) \end{align}

You can proceed from here.
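
For what it's worth, here is a small numerical sketch (with assumed values for $\mu_A$, $\sigma_A$, $x_u$ and $N$, not part of the analytical derivation) that finds the root of this derivative and checks that it lands near the true $x_u$:

```python
import numpy as np
from scipy.optimize import brentq

# Assumed parameters, just to check the derivative numerically.
rng = np.random.default_rng(1)
mu_A, sigma_A, x_u, N = 2.0, 0.5, 3.0, 10000
d = rng.normal(mu_A, sigma_A, N) * x_u           # d[n] = a[n] * x_u

def dldx(x):
    # dl/dx = -N/x + (1 / (sigma_A^2 * x^3)) * sum(d[n] * (d[n] - mu_A * x))
    return -N / x + np.sum(d * (d - mu_A * x)) / (sigma_A ** 2 * x ** 3)

x_hat = brentq(dldx, 1e-3, 100.0)                # root of dl/dx on a bracketing interval
print(x_hat)                                     # should be close to x_u = 3.0 for large N
```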

Gilles
  • Why do you say that all $x$ in the log-likelihood would become $x_u$? This does not make sense to me.

    $x_u$ is only contained in $\tilde{\boldsymbol{d}}$. Also, $x$ is not a function of $n$. If our estimator is unbiased, then at the maximum of $\ell$ with respect to $x$, $x$ is the estimate of $x_u$.

    – sobek Jul 09 '16 at 17:20
  • @sobek You're right, $x$ is not a function of $n$, and my formulation was wrong. However, if $x_u$ is to be estimated, then your PDF is a function of $x$ (and not $x_u$), $\mu_A$, and $\sigma^2_A$. – Gilles Jul 09 '16 at 19:59
  • The PDF of the estimator, yes, but the PDF of $\tilde{\boldsymbol{d}}$ I don't really understand. Thanks for the hint about the logarithm; you are right: since it is only used to make the exponential of the Gaussian PDF easier to handle, the natural logarithm must be used. – sobek Jul 09 '16 at 20:35