There are instances when a neural network, after being trained continually on a subset of the data, suddenly loses much of its performance. I've attached an image below illustrating this. I am looking for an algorithm to detect this sudden fall in performance.
I'm quite new to this area and therefore don't know where to start. My initial guess would be to first smooth the plot in order to reduce the noise.
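To make that guess concrete, here is a rough sketch of what I have in mind: smooth the raw accuracy values with a simple moving average, then flag a drop when the latest smoothed value falls well below the best smoothed value seen so far. The function names, window size, and threshold are just placeholders I made up, not anything established.

import numpy as np

def moving_average(values, window=10):
    # Smooth the raw accuracy curve with a simple moving average.
    if len(values) < window:
        return np.asarray(values, dtype=float)
    return np.convolve(values, np.ones(window) / window, mode="valid")

def detect_drop(values, window=10, drop_fraction=0.5):
    # Flag a sudden fall: the latest smoothed accuracy is below
    # `drop_fraction` times the best smoothed accuracy seen so far.
    smoothed = moving_average(values, window)
    if len(smoothed) < 2:
        return False
    return smoothed[-1] < drop_fraction * smoothed.max()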
Edit 1: I greatly appreciate the good advice I have received in the comments and answers. I have added some code below that generates a live time series, which should help to better illustrate my problem. The code is designed to run in Jupyter Notebook cells.
import matplotlib.pyplot as plt
from IPython.display import clear_output
import time
import numpy as np

def test():
    accuracy = []
    for _ in range(1000000000):
        time.sleep(1)
        # Simulate a new accuracy reading each second.
        accuracy.append(np.random.randint(30))
        # Redraw the plot in place so the notebook shows a live curve.
        clear_output(wait=True)
        plt.figure(figsize=(20, 5))
        plt.plot(accuracy)
        plt.show()

test()
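For completeness, this is roughly how I picture plugging a check into the loop of test() above; detect_drop is the placeholder from my sketch, not an existing library function.

        # Inside the for-loop of test(), right after accuracy.append(...):
        if detect_drop(accuracy, window=10, drop_fraction=0.5):
            print("Possible sudden fall in accuracy at step", len(accuracy))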