Let $(X_{1},X_{2},\ldots,X_{n})$ be a random sample from the distribution with density $$f(x)=\begin{cases} e^{-(x-\delta)},&\text{if } x > \delta\\ 0, &\text{otherwise.} \end{cases} $$ Find the MVUE (minimum-variance unbiased estimator) for $\delta$.
I tried to use the Lehmann-Scheffé theorem, which says: Let $X_{1},X_{2},\ldots,X_{n}$, with $n$ a fixed positive integer, denote a random sample from a distribution with pdf $f(x;\theta)$, where $\theta \in \Omega$. Let $Y_{1}=u_{1}(X_{1},X_{2},\ldots,X_{n})$ be a sufficient statistic for $\theta$, and let the family $\{f_{Y_{1}}(y_{1};\theta):\theta\in\Omega\}$ be complete. If there is a function of $Y_{1}$ that is an unbiased estimator of $\theta$, then this function of $Y_{1}$ is the unique MVUE.
Following this approach, I found $\sum_{i=1}^{n}X_{i}$ as a sufficient statistic. If I can show the corresponding pdf family is complete, then I will be done, because $\frac{\sum_{i=1}^{n}X_{i}}{n}-1$ is an unbiased estimator that is a function of $\sum_{i=1}^{n}X_{i}$; applying the theorem above, I could then conclude that $\frac{\sum_{i=1}^{n}X_{i}}{n}-1$ is the MVUE. The issue is showing completeness, because it is difficult to find the pdf of $\sum_{i=1}^{n}X_{i}$, even by convolution. Can you see how to do it?
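For completeness of the argument, the unbiasedness claim itself is easy to verify directly (substituting $u = x - \delta$ in the integral):

```latex
\[
E[X_{i}] \;=\; \int_{\delta}^{\infty} x\, e^{-(x-\delta)}\,dx
        \;=\; \int_{0}^{\infty} (\delta + u)\, e^{-u}\,du
        \;=\; \delta + 1,
\]
so that
\[
E\!\left[\frac{\sum_{i=1}^{n} X_{i}}{n} - 1\right]
        \;=\; \frac{n(\delta+1)}{n} - 1
        \;=\; \delta .
\]
```

So the only missing piece is indeed the completeness (or sufficiency) question for $\sum_{i=1}^{n}X_{i}$.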