I need some help with this exercise:
The density function of the random variable $X_i$ is:
\begin{equation*} f_\theta(x)=\exp(\theta-x)1_{[\theta,\infty)}(x) \end{equation*}
The likelihood function is:
\begin{equation*} L(x,\theta)=\prod_{i=1}^n\exp(\theta-x_i)1_{[\theta,\infty)}(x_i)=\exp\bigg(n\theta-\sum_{i=1}^n x_i\bigg)1_{(-\infty,\,\min\{x_1,\dots,x_n\}]}(\theta) \end{equation*}
And its (unique) maximum is attained at $\hat\theta_{ML}=\min\{X_1,\dots,X_n\}$.
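(The way I convinced myself of this, assuming I set up the indicator correctly: for fixed $x_1,\dots,x_n$ the log-likelihood is strictly increasing in $\theta$ on the set where the likelihood is positive,
\begin{equation*} \frac{\partial}{\partial\theta}\log L(x,\theta)=n>0 \qquad\text{for }\theta\le\min\{x_1,\dots,x_n\}, \end{equation*}
so the maximum sits at the right endpoint $\min\{x_1,\dots,x_n\}$ rather than at an interior stationary point.)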
Using the factorization theorem, I find that the statistic $Y=\min\{X_1,\dots,X_n\}$ is sufficient, and now I have to prove that it is also complete.
First of all, I've calculated the density of $Y$, which is:
\begin{equation*} g_\theta(y)= n\exp(n\theta-ny)\,1_{[\theta,\infty)}(y) \end{equation*}
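(In case I slipped somewhere, here is how I got it, via the survival function of the minimum of $n$ i.i.d. copies:
\begin{equation*} P_\theta(Y>y)=\prod_{i=1}^n P_\theta(X_i>y)=\exp\big(-n(y-\theta)\big), \qquad y\ge\theta, \end{equation*}
and differentiating $1-P_\theta(Y>y)$ with respect to $y$ gives the density above.)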
Then, by the definition of a complete statistic, I have to prove that:
\begin{equation*} E_\theta(h(Y))=0 \quad\text{for all }\theta \;\Rightarrow\; h=0 \ \text{(a.e.)} \end{equation*}
and here $E_\theta(h(Y))$ writes out as:
\begin{equation*} E_\theta(h(Y))=\int_\theta^\infty h(y)\,n\exp(n\theta-ny)\,dy=0 \qquad\text{for all }\theta. \end{equation*}
How can I conclude from the last expression that $h(y)$ has to be zero for (almost) all $y$?
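The only simplification I've managed so far (assuming no algebra mistakes): since $n\exp(n\theta)>0$ does not depend on $y$, I can divide it out, so the condition is equivalent to
\begin{equation*} \int_\theta^\infty h(y)\exp(-ny)\,dy=0 \qquad\text{for every }\theta, \end{equation*}
but I don't see how to pass from this to $h=0$ almost everywhere.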