Are there ways to reduce the smearing / spectral leakage of data interpolated by zero-padding?
I learned that, given a small collection of samples, one can increase the frequency resolution of an FFT by padding the input with zeros. This makes it easier to identify a peak frequency that would otherwise fall between bins of the original samples, but it also increases smearing.
To illustrate this, I generated 4 cycles of a 50 Hz sine wave at a sample rate of 48 kHz. That gave 3,840 samples, which I zero-padded to 32,768 samples and ran through the FFT:
(written in Kotlin and using JTransforms for FFT operations)
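Roughly, the experiment looks like the sketch below (simplified, and assuming JTransforms' `DoubleFFT_1D.realForward` with its packed re/im output format):

```kotlin
import org.jtransforms.fft.DoubleFFT_1D
import kotlin.math.PI
import kotlin.math.hypot
import kotlin.math.sin

fun main() {
    val sampleRate = 48_000.0
    val freq = 50.0
    val cycles = 4
    val n = (cycles * sampleRate / freq).toInt()   // 3,840 real samples
    val padded = 32_768                            // zero-padded FFT length

    // 4 cycles of a 50 Hz sine, followed by zeros out to the padded length.
    val buffer = DoubleArray(padded)
    for (i in 0 until n) {
        buffer[i] = sin(2.0 * PI * freq * i / sampleRate)
    }

    // In-place real FFT; result is packed as re/im pairs.
    DoubleFFT_1D(padded.toLong()).realForward(buffer)

    // Magnitude spectrum; bin spacing is sampleRate / padded ≈ 1.46 Hz.
    val magnitudes = DoubleArray(padded / 2) { k ->
        if (k == 0) kotlin.math.abs(buffer[0])
        else hypot(buffer[2 * k], buffer[2 * k + 1])
    }

    val peakBin = magnitudes.indices.maxByOrNull { magnitudes[it] } ?: 0
    println("Peak at ${"%.2f".format(peakBin * sampleRate / padded)} Hz")
}
```

Running this puts the peak near 50 Hz, but its energy is spread across many neighbouring bins.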

The smearing makes sense, since zero-padding does not fundamentally increase the amount of useful data. Consequently, I don't expect to be able to eliminate it completely (as nice as that would be). But it would be nice if there were filters/algorithms/etc. that reduce its severity, given that we know how many samples were padded.
Additional context, if helpful:
I am building an open-source light organ that colors LEDs based on the average frequency (weighted by magnitude) between 20 Hz and 120 Hz. Given that this is processing music in real time and updating the LEDs accordingly, low latency is essential.
I am using a buffer to increase responsiveness, but increasing the buffer size necessarily increases how far back in time I am looking, and thus the perceived latency. I've found that 4,096 samples @ 48 kHz (i.e., ~85 ms) is the largest sample size I can use before the delay becomes a nuisance.
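For concreteness, the color mapping is driven by a magnitude-weighted average of the bin frequencies in that band, roughly like this (a sketch rather than my exact code; it assumes the packed output of JTransforms' `realForward`):

```kotlin
import kotlin.math.ceil
import kotlin.math.floor
import kotlin.math.hypot

// Magnitude-weighted average frequency between lowHz and highHz,
// given the packed re/im output of DoubleFFT_1D.realForward.
fun weightedAverageFrequency(
    fft: DoubleArray,               // packed spectrum, length == fftSize
    fftSize: Int,                   // e.g. 4096, or the zero-padded length
    sampleRate: Double = 48_000.0,
    lowHz: Double = 20.0,
    highHz: Double = 120.0
): Double {
    val binWidth = sampleRate / fftSize
    val lowBin = ceil(lowHz / binWidth).toInt().coerceAtLeast(1)
    val highBin = floor(highHz / binWidth).toInt().coerceAtMost(fftSize / 2 - 1)

    var weightedSum = 0.0
    var magnitudeSum = 0.0
    for (k in lowBin..highBin) {
        val mag = hypot(fft[2 * k], fft[2 * k + 1])
        weightedSum += mag * k * binWidth   // weight each bin's frequency by its magnitude
        magnitudeSum += mag
    }
    return if (magnitudeSum > 0.0) weightedSum / magnitudeSum else 0.0
}
```

(With a 4,096-sample FFT the bin spacing is 48,000 / 4,096 ≈ 11.7 Hz, so there are only a handful of bins between 20 Hz and 120 Hz.)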
I have identified some compromises if smearing cannot be overcome, but would much prefer to have cleaner data to work with.