Q1) I understand that linear-phase filters have a longer delay, whereas minimum-phase filters have a shorter delay but introduce phase-distortion artifacts that linear-phase filters avoid. For which real-time DSP applications is minimum phase more useful than linear phase?
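A quick way to see the trade-off is to convert a linear-phase FIR design to its minimum-phase counterpart and compare group delays. A minimal sketch in Python/SciPy (the filter length and cutoff are arbitrary values chosen for illustration):

```python
import numpy as np
from scipy import signal

# Linear-phase lowpass FIR: symmetric taps, so the group delay is a
# constant (numtaps - 1) / 2 = 63 samples at every frequency.
h_lin = signal.firwin(numtaps=127, cutoff=0.2)

# Minimum-phase counterpart: approximately the same magnitude response,
# far less delay, but the delay now varies with frequency.
h_min = signal.minimum_phase(h_lin, method='homomorphic')

# Compare group delays across the passband.
w = np.linspace(0, 0.15 * np.pi, 64)
_, gd_lin = signal.group_delay((h_lin, [1.0]), w=w)
_, gd_min = signal.group_delay((h_min, [1.0]), w=w)

print("linear phase :", gd_lin.round(1))   # ~63 samples everywhere
print("minimum phase:", gd_min.round(1))   # a few samples, frequency-dependent
```

The constant group delay is what preserves the waveshape; the minimum-phase version trades that away for low latency.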
- Perhaps when delay is more important than preserving the waveshape of a signal. – robert bristow-johnson Mar 24 '16 at 00:05
- Also, a minimum phase filter can be inverted and still be stable. – CMDoolittle Mar 25 '16 at 17:38
- @CMDoolittle So linear-phase filters are unstable when inverted? Is there any document you have that explains this concept? – Akhilesh Rao Mar 25 '16 at 19:50
- A minimum phase filter has all its poles and zeros inside the unit circle. When you invert it, the poles and zeros switch places, but this means the poles are still inside the unit circle, and it is therefore stable. – CMDoolittle Mar 26 '16 at 23:01
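To illustrate the point in the comment above, here is a minimal sketch (Python/NumPy; the coefficients are made-up illustrative values): inverting H(z) = B(z)/A(z) gives A(z)/B(z), so the zeros of the original become the poles of the inverse.

```python
import numpy as np

# A one-pole, one-zero minimum-phase filter H(z) = B(z)/A(z):
b = np.array([1.0, -0.5])   # zero at z = 0.5 (inside the unit circle)
a = np.array([1.0, -0.9])   # pole at z = 0.9 (inside the unit circle)

# Inverting swaps numerator and denominator: H_inv(z) = A(z)/B(z),
# so the original zeros become the inverse filter's poles.
b_inv, a_inv = a, b

poles_inv = np.roots(a_inv)
print("poles of inverse:", poles_inv)                    # [0.5]
print("stable:", bool(np.all(np.abs(poles_inv) < 1.0)))  # True
```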
- @CMDoolittle, what about zeros on the unit circle? – Jazzmaniac Mar 27 '16 at 12:38
- @Jazzmaniac I just found that someone has posted about this. Maybe it will be of help :) http://dsp.stackexchange.com/questions/2241/what-is-the-true-meaning-of-a-minimum-phase-system – Akhilesh Rao Mar 27 '16 at 19:35
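On the unit-circle edge case Jazzmaniac raises: a zero exactly on the unit circle inverts to a pole on the unit circle, so the inverse is only marginally stable. A minimal sketch (Python/SciPy, illustrative coefficients):

```python
import numpy as np
from scipy import signal

b = np.array([1.0, 1.0])   # zero at z = -1, exactly on the unit circle
x = np.zeros(10)
x[0] = 1.0                 # unit impulse

# Impulse response of the inverse filter 1 / B(z): it alternates
# +1, -1, +1, ... forever and never decays (marginally stable).
y = signal.lfilter([1.0], b, x)
print(y)
```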