I have a low-voltage amplifier with an analog LPF at ~106 Hz and an HPF at 0.482 Hz (both first-order RC). I can choose to apply a calibration pulse that shuts off the input and sends a 500 µV, 5 Hz square wave through this analog filter chain. The amplifier is followed by an ADC, and I'm reading all of the samples in software.
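For context, here are the back-of-the-envelope numbers I get for what those corners do to the 5 Hz square wave (Python sketch, assuming the filters really are ideal first-order RC):

```
import numpy as np

# Filter corners and calibration frequency as described above.
f_hp = 0.482    # Hz, first-order RC high-pass corner
f_lp = 106.0    # Hz, first-order RC low-pass corner
f_cal = 5.0     # Hz, calibration square wave

# Magnitude of each first-order response at the 5 Hz fundamental.
mag_hp = (f_cal / f_hp) / np.sqrt(1.0 + (f_cal / f_hp) ** 2)
mag_lp = 1.0 / np.sqrt(1.0 + (f_cal / f_lp) ** 2)

# HPF time constant and how much a flat top decays over one half-period (100 ms).
tau_hp = 1.0 / (2.0 * np.pi * f_hp)                   # ~0.33 s
droop = 1.0 - np.exp(-(1.0 / (2.0 * f_cal)) / tau_hp)

print(f"|H_hp(5 Hz)| ~ {mag_hp:.4f}, |H_lp(5 Hz)| ~ {mag_lp:.4f}")
print(f"HPF tau ~ {tau_hp * 1e3:.0f} ms, flat-top decay per half-cycle ~ {droop * 100:.0f}%")
```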
What's the best way to use this calibration signal? Is it possible to "reverse" the effects of the analog filters if I know what the input signal is, so that I could then calculate gain values?
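To make the question concrete, the kind of "reversal" I have in mind is something like the sketch below: digitally undoing the (assumed ideal) first-order HPF on the recorded calibration segment. Since H(s) = s / (s + wc), the original input should be recoverable as x(t) = y(t) + wc * ∫ y dt. The function name and the `fs` value are just placeholders for my actual code and sample rate.

```
import numpy as np

def undo_first_order_hpf(y, fs, f_hp=0.482):
    """Approximately undo a first-order RC high-pass filter.

    For H(s) = s / (s + wc), the input can be recovered as
    x(t) = y(t) + wc * integral of y dt.  The integral here is a plain
    cumulative sum, so any DC offset in y accumulates as drift -- probably
    tolerable over a short calibration burst, not over long records.
    """
    wc = 2.0 * np.pi * f_hp    # HPF corner in rad/s
    dt = 1.0 / fs              # sample period
    return y + wc * np.cumsum(y) * dt

# e.g. x_est = undo_first_order_hpf(cal_segment, fs=1000.0)  # fs is a placeholder
```

I'm guessing the 106 Hz LPF can be ignored for the plateau values, since its time constant is only ~1.5 ms and it should mostly just round the edges at 5 Hz, but I'd like a sanity check on that too.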
EDIT: I should add that I have the basic calibration idea down: I average the top and bottom values of the square wave (ignoring the transition samples) and scale to the expected 500 µV. What I'm looking for is a way to account for the settling (droop) of the square wave caused by the HPF.
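Here's roughly what that looks like at the moment (simplified sketch; `exclude_frac` and the median-based plateau split are just stand-ins for however I actually segment the waveform):

```
import numpy as np

def gain_from_cal(cal, expected_pp=500e-6, exclude_frac=0.25):
    """Current approach, simplified: average the top and bottom plateaus of
    the calibration square wave, ignore samples near the transitions, and
    scale the measured swing to the expected 500 uV peak-to-peak.
    exclude_frac is an arbitrary threshold I made up for this sketch.
    """
    cal = np.asarray(cal, dtype=float)
    mid = np.median(cal)
    span = cal.max() - cal.min()
    # Treat samples within exclude_frac * span/2 of the midpoint as transitions.
    top = cal[cal > mid + exclude_frac * span / 2.0]
    bot = cal[cal < mid - exclude_frac * span / 2.0]
    measured_pp = top.mean() - bot.mean()
    return measured_pp / expected_pp   # gain in (output units) per volt at the input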