Electrical – Phase Shift in Audio Amplifier

Tags: active-filter, filter, phase-shift

I am trying to design a desktop audio amplifier, and I'd like to better understand the effects of phase shift on my output. I want this amplifier to be of the highest quality I can manage, and as I understand it, phase shift cannot be completely eliminated. Is there an approximate level of phase shift that is considered "acceptable" by some standard? For example, a common filter I see recommended for flat phase response is the biquad filter, but even that one has a phase shift that goes like arctan(f/fc), so that a corner frequency even at 100 kHz produces significant phase shift in the audible region. What are some good topologies/techniques to reduce phase shift in an audio amplifier?
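For concreteness, here is a quick check (a minimal NumPy sketch; the 100 kHz corner is the example figure from the question, assumed to be a single real pole) of how much phase that pole contributes at the top of the audible band:

```python
# Minimal sketch: phase of a single real pole at an assumed 100 kHz corner,
# evaluated at 20 kHz (top of the audible band).
import numpy as np

fc = 100e3                                  # assumed corner frequency, Hz
f = 20e3                                    # highest audible frequency, Hz
phase_deg = -np.degrees(np.arctan(f / fc))  # first-order low-pass phase
print(f"phase at 20 kHz: {phase_deg:.1f} degrees")  # about -11.3 degrees
```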

Best Answer

Unfortunately, unless you have expensive linear-phase planar speakers or similar high-quality speakers, the speakers' own group-delay distortion is quite significant: it smears the cues the ear uses to triangulate the position of a sound source on a stage.

But let's assume you have an anechoic room with perfect speakers.

The phase response has only shifted 50% (45 degrees of the ultimate 90 degrees per pole) at the half-power point, so the corner must sit about two decades above the band of interest if phase shift is critical; beyond the corner the amplitude then rolls off at 20 dB per decade per order of the filter. Our perception tends to ignore the small residual in-band phase shift.
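A short numeric illustration (plain NumPy, a single first-order pole assumed) of both the 45-degrees-at-the-corner point and the 20 dB/decade/order roll-off:

```python
# Sketch: first-order low-pass, H(f) = 1 / (1 + j f/fc).
# At f = fc the magnitude is -3 dB and the phase is 45 deg,
# i.e. 50% of the ultimate 90 deg per pole.
import numpy as np

fc = 1e3
f = np.array([fc / 100, fc, fc * 100])  # 2 decades below, corner, 2 decades above
h = 1 / (1 + 1j * f / fc)

for fi, hi in zip(f, h):
    mag_db = 20 * np.log10(abs(hi))
    phase = np.degrees(np.angle(hi))
    print(f"f = {fi:8.0f} Hz: {mag_db:6.1f} dB, {phase:6.1f} deg")
# f =       10 Hz:   -0.0 dB,   -0.6 deg
# f =     1000 Hz:   -3.0 dB,  -45.0 deg
# f =   100000 Hz:  -40.0 dB,  -89.4 deg
```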

My rule of thumb for phase response: keep at least 1.5 decades of bandwidth above the usable bandwidth for less than 10 degrees of phase shift. For 1 degree, as in TV baseband color video, you need at least 2 decades of extra bandwidth.
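The rule is easy to sanity-check per pole (a minimal sketch; the assumption is a single first-order pole, and the phase of cascaded poles adds roughly in proportion to filter order):

```python
# Sketch: first-order phase shift as a function of the margin (in decades)
# between the signal frequency and the filter corner.
import numpy as np

for decades in (0.5, 1.0, 1.5, 2.0):
    ratio = 10.0 ** (-decades)        # signal frequency / corner frequency
    phase = np.degrees(np.arctan(ratio))
    print(f"{decades:3.1f} decades of margin: {phase:5.2f} deg per pole")
# 0.5 decades of margin: 17.55 deg per pole
# 1.0 decades of margin:  5.71 deg per pole
# 1.5 decades of margin:  1.81 deg per pole
# 2.0 decades of margin:  0.57 deg per pole
```

At 1.5 decades even a few cascaded poles stay under the 10-degree budget, and at 2 decades a single pole is down to roughly the 1-degree video requirement.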

The ideal LPF response has a flat group delay, found in the filter below, normalized at 1 kHz.

[image: filter response with flat group delay, normalized at 1 kHz]
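Since the original plot is not reproduced here, a sketch of the same idea (assuming a Bessel alignment, the classic maximally-flat-group-delay low-pass, and SciPy for the math):

```python
# Sketch: 4th-order analog Bessel low-pass normalized at 1 kHz; its group
# delay is nearly constant across the passband, which is what preserves
# waveform shape and stereo imaging.
import numpy as np
from scipy.signal import bessel, freqs

fc = 1e3                               # normalization frequency, Hz
b, a = bessel(4, 2 * np.pi * fc, btype='low', analog=True, norm='phase')

f = np.logspace(1, 5, 2000)            # 10 Hz .. 100 kHz
w = 2 * np.pi * f
_, h = freqs(b, a, worN=w)

# Group delay = -d(phase)/d(omega).
tau = -np.gradient(np.unwrap(np.angle(h)), w)

for f_probe in (100, 500, 1000):
    i = np.argmin(np.abs(f - f_probe))
    print(f"{f_probe:5d} Hz: group delay = {tau[i] * 1e6:6.1f} us")
```

The printed delays should be nearly identical below the corner, in contrast to a Butterworth or Chebyshev of the same order.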
