I am getting into signal processing and am working on a project.
My analog-to-digital converter has a sample rate of 10 kHz, whereas my microcontroller can only handle a sample rate of up to 3 kHz.
- What does that mean in practice?
- Will my microcontroller get overwhelmed with samples and overheat?
- Do I need to downsample after the ADC? If so, what would the benefits be?
My input is a 4 kHz sensor signal.
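To make my setup concrete, here is a rough Python sketch of what I think naive downsampling would look like with these numbers (the decimation factor of 3 and the pure-tone sensor signal are just my assumptions):

```python
import numpy as np

fs_adc = 10_000    # ADC sample rate in Hz
f_signal = 4_000   # sensor signal frequency in Hz (assumed pure tone)
duration = 0.01    # 10 ms of data

# Samples as the ADC would produce them
t = np.arange(0, duration, 1 / fs_adc)
x = np.sin(2 * np.pi * f_signal * t)

# Naive downsampling: keep every 3rd sample
decim = 3
x_down = x[::decim]
fs_down = fs_adc / decim   # effective rate ~3.33 kHz

# New Nyquist limit is fs_down / 2 ~ 1.67 kHz, which is below
# my 4 kHz signal frequency
print(fs_down)
print(fs_down / 2 < f_signal)
```

Is something like this what downsampling means, and is the fact that the new Nyquist limit falls below my signal frequency the problem I should be worried about?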