I work in an area of biophysics that uses light-based detection to measure small movements of molecules (nanometer and piconewton scale) via a quadrant photodiode (QPD). The signal contains a lot of information but is riddled with noise. One of the challenges is denoising this signal: conventional methods such as Savitzky-Golay filtering tend to work well, but they rely on fixed cutoff/window and threshold values, which makes them less practical here.
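To make concrete what I mean by fixed values, this is roughly how I filter a trace now (a sketch; the file name is a placeholder, and the window length and polynomial order are the hand-tuned numbers I would like to get away from):

```python
import numpy as np
from scipy.signal import savgol_filter

# Noisy QPD trace, uniformly sampled in time (placeholder file name)
trace = np.load("qpd_trace.npy")

# window_length and polyorder are the fixed, hand-tuned values I mean:
# a setting that works for one noise level over-smooths or under-smooths
# elsewhere once the noise changes along the trace.
smoothed = savgol_filter(trace, window_length=51, polyorder=3)
```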
Time-series traces from this measurement look like a sawtooth curve, and as the particle moves in space and time, the noise changes (so the noise is not the same everywhere) (figure attached below).
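For reference, here is a synthetic trace that roughly mimics what I described: a sawtooth shape plus Gaussian noise whose amplitude drifts along the trace (the frequency and noise levels here are made up, just for illustration):

```python
import numpy as np
from scipy.signal import sawtooth

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 10_000)           # time axis (arbitrary units)
clean = sawtooth(2 * np.pi * 0.5 * t)    # idealised sawtooth events
sigma = 0.1 + 0.3 * t / t.max()          # noise level drifts with position/time
noisy = clean + sigma * rng.standard_normal(t.size)
```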
My question is: I have noise-only measurements from this setup (recordings where the sawtooth event never happens and only noise remains). Can I train a self-supervised learning method to denoise this signal using these known noise recordings? For example, is there something like a high-frequency bandpass filter that takes in samples of this noise and can be trained to automatically smooth the curve toward what we would expect the ground truth to be? Is there a better approach? If my question is unclear, please let me know and I can provide more information.
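To make the question concrete, the kind of thing I have in mind is sketched below: take slices of my recorded noise-only traces, add them to simulated clean sawtooths, and train a small 1D conv net to map noisy to clean, so the network learns my actual noise statistics instead of a fixed cutoff. (Strictly this uses simulated clean targets rather than being purely self-supervised, which is part of what I am unsure about.) The file name, network size, and training settings are placeholders I have not validated:

```python
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import sawtooth

# Recorded noise-only QPD traces, shape (n_records, n_samples) (placeholder file name)
noise_bank = np.load("noise_only_recordings.npy")
rng = np.random.default_rng(0)

def make_batch(batch_size=32, n=1024):
    """Simulated clean sawtooths plus randomly chosen slices of the real recorded noise."""
    t = np.linspace(0, 1, n)
    clean = np.stack([sawtooth(2 * np.pi * rng.uniform(2, 6) * t) for _ in range(batch_size)])
    rows = rng.integers(0, noise_bank.shape[0], size=batch_size)
    starts = rng.integers(0, noise_bank.shape[1] - n, size=batch_size)
    noise = np.stack([noise_bank[r, s:s + n] for r, s in zip(rows, starts)])
    noisy = clean + noise
    to_tensor = lambda a: torch.tensor(a, dtype=torch.float32).unsqueeze(1)  # (B, 1, n)
    return to_tensor(noisy), to_tensor(clean)

# Small 1D conv denoiser; the depth and width are guesses
model = nn.Sequential(
    nn.Conv1d(1, 16, 9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 16, 9, padding=4), nn.ReLU(),
    nn.Conv1d(16, 1, 9, padding=4),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(2000):
    noisy, clean = make_batch()
    opt.zero_grad()
    loss = loss_fn(model(noisy), clean)
    loss.backward()
    opt.step()
```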


