I am trying to measure the delay that smoothing filters introduce into my data, including a double-exponential smoothing filter. I run my series data through this function:
def double_exponential_smoothing(series, alpha, beta, n_preds=1):
    """
    Given a series, alpha, beta and n_preds (number of
    forecast/prediction steps), perform the smoothing and prediction.
    """
    # first value is the same as the series
    result = [series[0]]
    for n in range(1, len(series) + n_preds):
        if n == 1:
            # initialise level and trend from the first two observations
            level, trend = series[0], series[1] - series[0]
        if n >= len(series):  # forecasting beyond the observed data
            value = result[-1]
        else:
            value = series[n]
        last_level, level = level, alpha*value + (1-alpha)*(level+trend)
        trend = beta*(level-last_level) + (1-beta)*trend
        result.append(level+trend)
    return result
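For illustration, this is roughly how I call it; the noisy sine below is only a stand-in for my real data, and the alpha/beta values are arbitrary:

import numpy as np

# stand-in for the real data: a 0.5 Hz sine with noise, sampled at 100 Hz
sample_rate = 100
t = np.arange(0, 10, 1 / sample_rate)
raw = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(len(t))

# arbitrary example parameters; the result has len(raw) + n_preds samples
# because the forecast step(s) are appended at the end
smoothed = double_exponential_smoothing(raw, alpha=0.3, beta=0.1)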
I then measure the time delay by comparing the filtered data with the raw data (y2 and y1, respectively):
import numpy as np
from scipy import signal

def lag_finder(y1, y2, sample_rate):
    n = len(y1)
    # normalised cross-correlation of the filtered and raw signals
    corr = signal.correlate(y2, y1, mode='same') / np.sqrt(
        signal.correlate(y1, y1, mode='same')[n // 2] * signal.correlate(y2, y2, mode='same')[n // 2])
    # convert the index of the correlation peak into a time shift in seconds
    delay_arr = np.linspace(-0.5 * n / sample_rate, 0.5 * n / sample_rate, n)
    delay = delay_arr[np.argmax(corr)]
    return delay
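On the stand-in data above, I then call it like this (I trim the smoothed series back to the raw length so both inputs have the same number of samples):

# measure the lag between the raw and the smoothed stand-in signals
y1 = raw
y2 = np.asarray(smoothed[:len(raw)])
delay = lag_finder(y1, y2, sample_rate)
print(f"estimated delay: {delay:.3f} s")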
From the results of this function, and from plotting the data, I can see that the filtered data can sometimes lead the raw data. Why is that? Am I calculating the delay between the two signals correctly?