All filters introduce delay, and the sharper the roll-off, the more delay is required; the two are directly related. It is specifically the variation in group delay that can distort a signal when we care about the time alignment of components at different frequencies. Group delay is the negative derivative of phase with respect to frequency, so a phase that changes over frequency corresponds to a time delay. When the phase slope is linear, the delay is constant at all frequencies and causes no distortion, since every frequency component stays aligned in time. When the phase slope varies over frequency, the group delay varies with it, and we get what is referred to as "group delay distortion".
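As a minimal sketch of this relationship (the 5th-order elliptic filter below is an arbitrary stand-in for a sharp roll-off design, not a specific filter from the question), we can compute the group delay both directly with scipy.signal and as the negative derivative of the unwrapped phase:

```python
import numpy as np
from scipy import signal

# Hypothetical example: a 5th-order elliptic low-pass (sharp roll-off),
# 0.5 dB passband ripple, 40 dB stopband, cutoff at 0.2 of Nyquist.
b, a = signal.ellip(5, 0.5, 40, 0.2)

# Group delay directly from SciPy (in samples).
w, gd = signal.group_delay((b, a), w=1024)

# Group delay as the negative derivative of unwrapped phase vs. frequency.
w2, h = signal.freqz(b, a, worN=1024)
phase = np.unwrap(np.angle(h))
gd_from_phase = -np.diff(phase) / np.diff(w2)

# The two estimates agree closely (the finite difference falls between
# frequency bins, hence the averaging below); the change of gd across the
# passband is the "group delay distortion" described above.
print(np.max(np.abs(0.5 * (gd[1:] + gd[:-1]) - gd_from_phase)))
```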
When group delay is a concern in an analog filter, a Bessel filter is a better choice, as it provides nearly constant (maximally flat) group delay across the passband at the expense of significantly less stopband roll-off and out-of-band attenuation. Alternatively, the signal can be equalized for the group delay variation. Doing this in the digital domain as an all-pass filter is a simple and attractive solution, since we can determine and equalize all of the distortion effects introduced elsewhere in the analog system as well (see this post for details on determining the filter coefficients; for a fixed equalizer the computation can be done off-line, so the implementation is just an FIR filter). An all-pass equalizer can also be implemented as an analog filter.
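To see the Bessel trade-off concretely, one quick sketch is to compare its group delay against another classic design of the same order. The 5th-order Bessel/Butterworth pair and the 1 rad/s normalization below are arbitrary illustration choices, not values from the question:

```python
import numpy as np
from scipy import signal

# Hypothetical comparison: 5th-order analog Bessel vs. Butterworth,
# both normalized near 1 rad/s.
b_bes, a_bes = signal.bessel(5, 1.0, analog=True)
b_but, a_but = signal.butter(5, 1.0, analog=True)

w = np.linspace(0.01, 3.0, 500)   # rad/s, spanning the passband and beyond

def analog_group_delay(b, a, w):
    """Group delay (seconds) as -d(phase)/d(omega) of the analog response."""
    _, h = signal.freqs(b, a, worN=w)
    phase = np.unwrap(np.angle(h))
    return -np.gradient(phase, w)

gd_bes = analog_group_delay(b_bes, a_bes, w)
gd_but = analog_group_delay(b_but, a_but, w)

# Peak-to-peak delay variation inside the passband (w <= 1 rad/s):
pb = w <= 1.0
print("Bessel passband delay variation:     ", np.ptp(gd_bes[pb]))
print("Butterworth passband delay variation:", np.ptp(gd_but[pb]))
```

The Bessel design shows far less delay spread across the passband, which is exactly the property traded for its gentler roll-off.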
I suggest first reviewing the actual group delay variation for the specific filter used (either by computation or measurement) and determining whether that variation is even large enough to be a concern.
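A rough sketch of that check is below; the 6th-order Butterworth, the passband edge, and the 2-sample tolerance are placeholder assumptions to be replaced with the actual filter and the delay spread your system can tolerate:

```python
import numpy as np
from scipy import signal

# Placeholder filter: 6th-order Butterworth low-pass, cutoff at 0.25 of Nyquist.
b, a = signal.butter(6, 0.25)

w, gd = signal.group_delay((b, a), w=2048)   # gd in samples, w in rad/sample
passband = w <= 0.25 * np.pi                 # keep only the passband

variation = np.ptp(gd[passband])             # peak-to-peak delay spread
print(f"Passband group delay variation: {variation:.2f} samples")

tolerance_samples = 2.0                      # placeholder system requirement
if variation < tolerance_samples:
    print("Within tolerance: equalization likely unnecessary.")
else:
    print("Exceeds tolerance: consider a Bessel design or phase equalization.")
```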