Let $X_1,...,X_n$ be independent copies of a real-valued random variable $X$ with Lebesgue density
\begin{align*} p_\theta(x) = \begin{cases} \exp(\theta-x), & x>\theta, \\ 0, & x\leq \theta, \end{cases} \end{align*} where $\theta\in \mathbb{R}$ is an unknown parameter. Let $S:=\min(X_1,...,X_n)$.
Find the Uniform Minimum Variance Unbiased (UMVU) estimator of $\theta$.
I already know that $S$ is sufficient for $\theta$ and that $T:=S-1/n$ is an unbiased estimator of $\theta$. My idea is to apply the Lehmann-Scheffé theorem, since then the UMVU estimator is given by
\begin{align*} \mathbb{E}[T|S]=\mathbb{E}[S-1/n|S]=S-1/n. \end{align*}
Is this the correct approach? If yes, then to apply Lehmann-Scheffé I would also need $S$ to be a complete statistic. How do I show this properly?
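For completeness, here is the computation behind the unbiasedness of $T$: since $\mathbb{P}_\theta(X_1>s)=e^{\theta-s}$ for $s>\theta$, independence gives $\mathbb{P}_\theta(S>s)=e^{n(\theta-s)}$, and therefore
\begin{align*} \mathbb{E}_\theta[S-\theta]=\int_0^\infty \mathbb{P}_\theta(S-\theta>u)\,du=\int_0^\infty e^{-nu}\,du=\frac{1}{n}, \end{align*}
i.e. $\mathbb{E}_\theta[S]=\theta+1/n$ and hence $\mathbb{E}_\theta[T]=\theta$ for all $\theta$.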
Edit: I tried to show completeness by definition, i.e. I set up the equation $\mathbb{E}_\theta[g(S)]=0 \;\forall \theta$ for a measurable function $g$, and now want to show that $g(S)=0 \; \mathbb{P}_\theta$-a.s. for all $\theta$. Since the $X_i$ are iid, the cdf of $S$ is $F_S(x)=1-(1-P_\theta(x))^n$, where $P_\theta(x)$ is the cdf of $X_i$. Differentiating gives the pdf of $S$: $f_S(x)=n\cdot p_\theta(x)(1-P_\theta (x))^{n-1}$. Since $P_\theta (x)=1-e^{\theta-x}$ for $x>\theta$, this simplifies to \begin{align*} f_S(x)=n\cdot e^{n(\theta-x)},\quad x>\theta. \end{align*}
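As a quick numerical sanity check of this density (a minimal simulation sketch; the values of $\theta$, $n$ and the number of replications are arbitrary choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative choices, not part of the problem.
theta, n, reps = 2.0, 5, 200_000

# p_theta(x) = exp(theta - x) for x > theta means X_i = theta + Exp(1).
X = theta + rng.exponential(scale=1.0, size=(reps, n))
S = X.min(axis=1)

# f_S(x) = n * exp(n * (theta - x)) for x > theta says S - theta ~ Exp(n),
# so E[S - theta] should be 1/n and T = S - 1/n should be unbiased for theta.
print((S - theta).mean(), 1 / n)   # both ~ 0.2
print((S - 1 / n).mean(), theta)   # both ~ 2.0
```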
With this density, $\mathbb{E}_\theta[g(S)]=\int_\theta^\infty g(x)\,ne^{n(\theta-x)}\,dx$ must equal $0$ for every $\theta$.
Is it now enough to say that $g=0$ Lebesgue-a.e., and hence $g(S)=0 \; \mathbb{P}_\theta$-a.s. for all $\theta$, since the exponential factor is strictly positive? Or is there a more rigorous way to show it?
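Edit 2: One attempt at a rigorous version (I would appreciate a check that this is sound): pulling out the strictly positive factor $ne^{n\theta}$, the condition $\mathbb{E}_\theta[g(S)]=0$ for all $\theta$ is equivalent to
\begin{align*} G(\theta):=\int_\theta^\infty g(x)e^{-nx}\,dx=0 \quad \forall \theta\in\mathbb{R}. \end{align*}
Since $x\mapsto g(x)e^{-nx}$ is integrable on $(\theta,\infty)$ for every $\theta$ (otherwise $\mathbb{E}_\theta[g(S)]$ would not be defined), $G$ is absolutely continuous with $G'(\theta)=-g(\theta)e^{-n\theta}$ for Lebesgue-a.e. $\theta$. Then $G\equiv 0$ forces $g(\theta)e^{-n\theta}=0$ a.e., so $g=0$ Lebesgue-a.e. and hence $g(S)=0 \; \mathbb{P}_\theta$-a.s. for all $\theta$.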