Consider a D/A converter for audio signals consisting of a zero-order-hold interpolator followed by a continuous-time lowpass filter with passband from 0 to 20 kHz and stopband starting at f_a = 40 kHz.
Assume we want to convert a digital signal originally sampled at 16 kHz. What is the minimum oversampling factor that we need to use?
For the 16 kHz signal I used 2 as my oversampling factor, which seems to be wrong. Can you elaborate on this? I chose 2 because 40 kHz / 20 kHz = 2.
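A quick numerical sketch of the constraint that actually matters here (assuming the signal occupies the full 0–8 kHz Nyquist band of the original 16 kHz rate): after upsampling by L, the images of the baseband sit around multiples of L · 16 kHz, so the lower edge of the first image lies at L · 16 kHz − 8 kHz, and that edge must land at or beyond the 40 kHz stopband edge for the analog filter to suppress it. The ratio 40/20 does not enter this condition.

```python
# Find the minimum oversampling factor L such that the first spectral
# image after the zero-order hold falls entirely in the filter stopband.
# Assumption: the baseband signal occupies 0-8 kHz, i.e. the full
# Nyquist band of the original 16 kHz sampling rate.

F_ORIG = 16_000      # original sampling rate (Hz)
B = F_ORIG / 2       # signal bandwidth: 8 kHz
F_STOP = 40_000      # stopband edge of the analog lowpass filter (Hz)

def first_image_edge(L: int) -> float:
    """Lower edge of the first image after upsampling by factor L."""
    return L * F_ORIG - B

L = 1
while first_image_edge(L) < F_STOP:
    L += 1

print(L, first_image_edge(L))  # → 3 40000.0
```

With L = 2 the first image starts at 2 · 16 − 8 = 24 kHz, which is inside the filter's transition band (20–40 kHz) and therefore not guaranteed to be attenuated; the loop above increments L until the image edge reaches the stopband.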