
Let $X_1,...,X_n$ be independent copies of a real-valued random variable $X$ where $X$ has Lebesgue density

\begin{align*} p_\theta(x) = \begin{cases} \exp(\theta-x), & x>\theta, \\ 0, & x\leq \theta, \end{cases} \end{align*} where $\theta\in \mathbb{R}$ is an unknown parameter. Let $S:=\min(X_1,...,X_n)$.

Find the Uniform Minimum Variance Unbiased (UMVU) estimator of $\theta$.

I already know that $S$ is sufficient for $\theta$ and that $T:=S-1/n$ is an unbiased estimator of $\theta$. My idea is to apply the Lehmann–Scheffé theorem, since then the UMVU estimator is given by

\begin{align*} \mathbb{E}[T|S]=\mathbb{E}[S-1/n|S]=S-1/n. \end{align*}
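As a quick numerical sanity check (a minimal sketch, assuming NumPy, with arbitrary illustration values of $\theta$ and $n$; not part of the argument), one can simulate $S$ and confirm that $S-1/n$ is centred at $\theta$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.5, 5, 200_000   # arbitrary illustration values

# Under p_theta, X - theta is standard exponential, so X = theta + Exp(1).
X = theta + rng.exponential(scale=1.0, size=(reps, n))
S = X.min(axis=1)                  # S = min(X_1, ..., X_n) for each replication

print(S.mean() - 1 / n)            # should be close to theta = 2.5
```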

Is this the correct approach? If so, to apply Lehmann–Scheffé I would also need $S$ to be a complete statistic. How do I show this properly?

Edit: I tried to show completeness by definition, i.e. I set up the equation $\mathbb{E}_\theta[g(S)]=0 \;\forall \theta$ for some function $g$ and now want to show that $g(S)=0 \; \mathbb{P}_\theta$-a.s. for all $\theta$. Since the $X_i$ are i.i.d., it is easy to see that the cdf of $S$ is $F_S(x)=1-(1-P_\theta(x))^n$, where $P_\theta(x)$ is the cdf of $X_i$. Taking the derivative gives the pdf of $S$: $f_S(x)=n\cdot p_\theta(x)(1-P_\theta (x))^{n-1}$. Since $P_\theta(x)=1-e^{\theta-x}$ for $x>\theta$, this simplifies to \begin{align*} f_S(x)=n\cdot e^{n(\theta-x)},\quad x>\theta. \end{align*}
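For what it's worth, this differentiation can be double-checked symbolically (a minimal sketch assuming SymPy; not needed for the argument):

```python
import sympy as sp

x, theta = sp.symbols('x theta', real=True)
n = sp.symbols('n', positive=True, integer=True)

P = 1 - sp.exp(theta - x)            # cdf of X_i on x > theta
F_S = 1 - (1 - P)**n                 # cdf of S = min(X_1, ..., X_n)
f_S = sp.simplify(sp.diff(F_S, x))   # pdf of S

print(f_S)                           # equals n*exp(n*(theta - x)) on x > theta
```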

Hence, $\mathbb{E}_\theta[g(S)]=\int_\theta^\infty g(x)\,n e^{n(\theta-x)}\,dx$ has to be $0$ for all $\theta$.

Is it now enough to say that $g(S)=0 \; \mathbb{P}_\theta$-a.s. for all $\theta$, since the exponential function is always positive? Or is there a more rigorous way to show it?

KCd
  • 46,062
  • This is the correct approach. To show completeness, did you set up the equation $E_{\theta}(g(S))=0$ for all $\theta$ and for some function $g$? Can you show, as per the definition, that $g(S)=0$ with probability 1? – StubbornAtom Aug 06 '19 at 17:26
  • Ok, so $\mathbb{E}_\theta (g(S))=\int_\theta^{\infty} g(x)e^{\theta-x}dx = e^{\theta}\int_\theta^{\infty}g(x)e^{-x}dx$, which is $0$ iff $\int_\theta^{\infty}g(x)e^{-x}dx=0$ for all $\theta$. Since the exponential function is always positive, this is only possible if $g(S)=0$ $\mathbb{P}_\theta$-a.s. for all $\theta$, and therefore, by definition, $S$ is complete. Is this the correct justification, or can one show it more rigorously? – CauchySchwarz Aug 06 '19 at 18:29
  • You have to use pdf of $S$ in the integral. – StubbornAtom Aug 06 '19 at 18:53
  • Oh, of course! So the pdf of $S$ is $f_{\min(X_1,...,X_n)}(x)=n\,p_{\theta}(x)(1-P_\theta(x))^{n-1}$ and therefore $\mathbb{E}_\theta(g(S))=\int_\theta^{\infty} n\,e^{\theta-x}g(x)(1-(1-e^{\theta-x}))^{n-1}dx=\int_{\theta}^{\infty}g(x)\,n\,e^{\theta-x}e^{(\theta-x)(n-1)}dx=\int_{\theta}^{\infty}g(x)\,n\,e^{n(\theta-x)}dx$ and the argumentation from above would then still hold? – CauchySchwarz Aug 06 '19 at 19:06
  • Please add your work in the post, not in comments. How did you derive the pdf of $S$? – StubbornAtom Aug 06 '19 at 19:52
  • https://math.stackexchange.com/q/3250333/321264 – StubbornAtom May 29 '20 at 18:35

1 Answer


For some measurable function $g$, suppose

$$\mathbb E_{\theta}\left[g(S)\right]=\int_{\theta}^\infty g(x)ne^{-n(x-\theta)}\,dx=0\quad\,\forall\,\theta\in\mathbb R$$

That is, dividing by $ne^{n\theta}>0$, $$\int_{\theta}^\infty g(x)e^{-nx}\,dx=0\quad\forall\,\theta$$

Now fix any $a\in\mathbb R$. For every $\theta<a$ we can rewrite the last equation as

$$\int_{\theta}^a g(x)e^{-nx}\,dx+\int_a^\infty g(x)e^{-nx}\,dx=0\quad\forall\,\theta<a$$

Differentiating both sides of the last equation with respect to $\theta$ (only the first integral depends on $\theta$), we get

$$-g(\theta)e^{-n\theta}=0\quad\text{for a.e. }\theta<a$$

Since $e^{-n\theta}>0$ for every $\theta$ and $a$ was arbitrary, you can conclude that $g$ is zero almost everywhere.

This perhaps is a more convincing argument.
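A quick symbolic check of the differentiation step above (a minimal sketch assuming SymPy, with a placeholder function $g$; not part of the original answer):

```python
import sympy as sp

theta, a, x = sp.symbols('theta a x', real=True)
n = sp.symbols('n', positive=True)
g = sp.Function('g')

# Only the integral with lower limit theta depends on theta;
# differentiating it yields the Leibniz-rule boundary term.
I = sp.Integral(g(x) * sp.exp(-n * x), (x, theta, a))
print(I.diff(theta))   # -exp(-n*theta)*g(theta), i.e. the minus sign matters
```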

StubbornAtom
  • 17,052
  • How about using Laplace transformation on the second equation? And what is the purpose of finding $a\in(\theta,\infty)$. What would be wrong if taking the derivative w.r.t $\theta$ directly for the second equation? – Tan Jan 10 '21 at 01:49
  • @Tan I'm not sure how that works out here given my limited knowledge of Laplace transforms. The $a$ was introduced to work under the familiar setup of the Leibniz integral rule, where the integration limits are finite. I don't think anything is wrong with taking the derivative directly, but it probably requires some additional justification. – StubbornAtom Jan 10 '21 at 20:37
  • "Now for some $a\in(\theta,\infty)$". I think "some" should be "all/any", right? – Jackie Feb 06 '23 at 21:17
  • After differentiating, I think you miss a negative sign. It should be $$-g(\theta)e^{-n\theta}=0\quad\forall\,\theta$$ – Jackie Feb 12 '23 at 00:17