I'm trying to reproduce the results of the paper "Discrete Time Techniques for Time Delay Estimation" (doi:10.1109/78.193195) for both the direct-correlation and least-squares estimators.
I've generated a random (Gaussian-distributed) signal and filtered it with a Gaussian low-pass filter to make it band-limited. Then I added increasing amounts of noise to the signals.
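Concretely, the generation step I mean is something like this (a minimal sketch; SciPy's `gaussian_filter1d` for the low-pass and the SNR-in-dB noise definition are my illustrative choices):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(0)

N = 4096                             # samples per realization (illustrative)
x = rng.standard_normal(N)           # white Gaussian signal
x = gaussian_filter1d(x, sigma=4)    # Gaussian low-pass -> band-limited
x /= x.std()                         # normalize to unit power

def add_noise(sig, snr_db):
    """Add white Gaussian noise at the given SNR in dB."""
    noise_power = sig.var() / 10 ** (snr_db / 10)
    return sig + np.sqrt(noise_power) * rng.standard_normal(sig.size)
```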
I've tried delays of zero, a sub-sample fraction, and more than one sample. In all cases, the variance of my direct-correlation (DC) estimate keeps decreasing as I increase the SNR (in fact, its behavior closely matches that of the least-squares estimator, or average square difference function, as the paper calls it).
Is there any reason why I don't see the asymptotic behavior (the variance leveling off at high SNR) that the paper predicts? Does it have to do with the signal properties, or with the way I'm processing the data?
Also, if possible, can someone explain conceptually why that limit exists for the correlation estimator? I can't seem to work it out from the paper.
Here is a simplified version of the code that shows the problem.
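Something along these lines (a minimal, self-contained sketch of the pipeline above; the FFT-based fractional delay, the ±5-lag search, the 200-trial Monte Carlo, and the three-point parabolic peak interpolation are my illustrative choices, not necessarily the original code):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(1)
N = 4096
TRUE_DELAY = 0.3                 # delay in samples (sub-sample case)
LAGS = np.arange(-5, 6)          # integer lags searched

def band_limited_signal():
    """Gaussian white noise through a Gaussian low-pass, unit power."""
    s = gaussian_filter1d(rng.standard_normal(N), sigma=4)
    return s / s.std()

def frac_delay(x, d):
    """Delay x by d samples (fractional allowed) via an FFT phase shift."""
    f = np.fft.rfftfreq(x.size)
    return np.fft.irfft(np.fft.rfft(x) * np.exp(-2j * np.pi * f * d), n=x.size)

def add_noise(s, snr_db):
    """Add white Gaussian noise at the given SNR in dB."""
    p = s.var() / 10 ** (snr_db / 10)
    return s + np.sqrt(p) * rng.standard_normal(s.size)

def parabolic_offset(y, k):
    """Sub-sample offset of the extremum of a parabola through y[k-1..k+1]."""
    return 0.5 * (y[k - 1] - y[k + 1]) / (y[k - 1] - 2 * y[k] + y[k + 1])

def estimate(x, y):
    """Return (DC estimate, ASDF estimate) of the delay of y w.r.t. x."""
    L = LAGS.max()
    n = np.arange(L, N - L)                     # indices valid for all lags
    R = np.array([np.dot(x[n], y[n + m]) for m in LAGS])           # correlation
    A = np.array([np.mean((x[n] - y[n + m]) ** 2) for m in LAGS])  # ASDF
    kR, kA = np.argmax(R), np.argmin(A)
    # (assumes the extremum is not at the edge of the search range)
    return (LAGS[kR] + parabolic_offset(R, kR),
            LAGS[kA] + parabolic_offset(A, kA))

for snr_db in (0, 10, 20, 30, 40, 50):
    dc, ls = [], []
    for _ in range(200):                        # Monte Carlo trials
        x = band_limited_signal()
        y = frac_delay(x, TRUE_DELAY)
        d1, d2 = estimate(add_noise(x, snr_db), add_noise(y, snr_db))
        dc.append(d1)
        ls.append(d2)
    print(f"SNR {snr_db:2d} dB  var(DC) = {np.var(dc):.3e}  "
          f"var(ASDF) = {np.var(ls):.3e}")
```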
Any help would be greatly appreciated.
This is a plot from the article, showing the effect I can't reproduce.
EDIT/UPDATE: I also tried different reference correlation/least-squares window sizes, as shown in the code. Essentially, I considered both the case where I also compare the signal against the data surrounding the time window under analysis, and the case where I ignore that data.
My reasoning is that a time delay in a continuous signal will bring uncorrelated samples into the window. If I include the surrounding data, the number of uncorrelated samples is constant and does not depend on the delay (as long as the delayed window is still contained within the larger reference window). This also makes the least-squares estimate more intuitive, since I always divide by the same number of points. Either way, I get the same result.
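For concreteness, a sketch of the two windowing schemes (helper names and the index bookkeeping are my own illustration):

```python
import numpy as np

def asdf_fixed_count(x, y_win, start, lags):
    """Reference region includes the data surrounding the analysis window,
    so every lag averages the same number of points.
    Caller must ensure x covers start+min(lags) .. start+len(y_win)+max(lags)."""
    n = np.arange(y_win.size)
    return np.array([np.mean((x[start + n + m] - y_win[n]) ** 2)
                     for m in lags])

def asdf_truncated(x, y_win, start, lags):
    """Reference region equals the analysis window: the overlap, and hence
    the number of points averaged, shrinks as |m| grows."""
    out = []
    win = y_win.size
    for m in lags:
        n = np.arange(max(0, -m), win - max(0, m))   # part that stays inside
        out.append(np.mean((x[start + n + m] - y_win[n]) ** 2))
    return np.array(out)
```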



From my understanding, then, you get a fixed bias due to the sub-optimal interpolation method. My question, in that case, is why the least-squares estimator's variance keeps decreasing according to the paper, since it uses the exact same interpolation method.
– LDPC Oct 08 '17 at 10:34

Either way, in my simulation I get decreasing values of variance for whichever delay I try.
– LDPC Oct 08 '17 at 11:36
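For reference, on the interpolation point raised in the comments: assuming the usual three-point parabolic fit around the discrete extremum at lag $k$ (which is how I read the paper's sub-sample step), both estimators apply

$$\hat{D} = k + \frac{R[k-1] - R[k+1]}{2\,\bigl(R[k-1] - 2R[k] + R[k+1]\bigr)}$$

to the correlation peak or the ASDF minimum, respectively. The interpolation bias comes from the mismatch between this parabola and the true shape of the underlying continuous function near its extremum.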