[ardour-dev] The new pan paradigm

Fons Adriaensen fons.adriaensen at skynet.be
Tue Aug 24 10:27:29 PDT 2004

On Mon, Aug 23, 2004 at 08:35:06PM -0500, Matt Walker wrote:

> Panning is really a pretty "artificial" concept. In the real world, a sound 
> source exists in a stereo sound stage, and we "locate it" (the eyes do this 
> much better than the ears) by virtue of the amplitude AND phase at each ear. 
> Reading Phillip Newell's monitoring book really drove this home. When you 
> turn a pan pot, you are ONLY locating the amplitude.

This is wrong. If you calculate the signals _at the ears_ that result from
playing an amplitude panned (sin/cos) signal through stereo speakers, you
will find that there is a phase difference between the sounds reaching the
ears, and at low frequencies there will hardly be any amplitude difference.
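As a quick numerical check of this (my own sketch, not anything from Ardour): sum the free-field contributions of two speakers at +/-30 degrees at each ear, ignoring head shadowing, which is a fair approximation at low frequencies. The speaker distance, ear spacing, pan position and test frequency below are illustrative assumptions.

```python
import numpy as np

f = 200.0            # a low test frequency, Hz
c = 343.0            # speed of sound, m/s
k = 2 * np.pi * f / c

# Speakers at +/-30 degrees, 2 m away; ears 17.5 cm apart.
spk = {'L': np.array([-np.sin(np.pi/6), np.cos(np.pi/6)]) * 2.0,
       'R': np.array([ np.sin(np.pi/6), np.cos(np.pi/6)]) * 2.0}
ears = {'left':  np.array([-0.0875, 0.0]),
        'right': np.array([ 0.0875, 0.0])}

# sin/cos pan law, source panned 3/4 of the way to the right
theta = np.pi / 2 * 0.75
g = {'L': np.cos(theta), 'R': np.sin(theta)}

def ear_signal(ear):
    """Complex pressure at one ear: both speaker contributions summed,
    each with 1/d spreading loss and e^{-jkd} propagation delay."""
    s = 0j
    for name, pos in spk.items():
        d = np.linalg.norm(pos - ears[ear])
        s += g[name] * np.exp(-1j * k * d) / d
    return s

sl, sr = ear_signal('left'), ear_signal('right')
print("amplitude ratio left/right:", abs(sl) / abs(sr))
print("interaural phase difference (deg):", np.degrees(np.angle(sl / sr)))
```

Despite the heavy amplitude panning at the speakers, the level difference at the ears is only a few percent, while a clear interaural phase difference appears.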

The only thing that's really wrong with amplitude panning between pairs of
speakers is that the ratio of the amplitudes of the pressure and velocity
waves comes out wrong. This is because the pressure contributions add as
scalars, while the velocity contributions add as vectors. The only way to
get this right is to use more speakers. Phase differences between the
speaker signals will only make things worse.
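The scalar-vs-vector point is easy to see numerically. Below is a small sketch (my own, with speaker angle and pan positions as assumptions) computing a Gerzon-style ratio of velocity-vector magnitude to pressure for a sin/cos pan law; for a single real source this ratio would be exactly 1.

```python
import numpy as np

# Unit vectors from the listener towards speakers at +/-30 degrees.
az = np.radians(30.0)
uL = np.array([-np.sin(az), np.cos(az)])
uR = np.array([ np.sin(az), np.cos(az)])

for pan in (0.0, 0.25, 0.5):        # 0 = hard left ... 0.5 = centre
    theta = np.pi / 2 * pan
    gL, gR = np.cos(theta), np.sin(theta)
    p = gL + gR                     # pressure: gains add as scalars
    v = gL * uL + gR * uR           # velocity: gains add as vectors
    rV = np.linalg.norm(v) / p      # = 1 for a single real source
    print(f"pan = {pan:.2f}   |v|/p = {rV:.3f}")
```

Only at the hard-panned positions does the ratio reach 1; at the centre it drops to cos(30 deg) ~ 0.866, which is exactly the mismatch described above.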

What *is* completely unnatural is to listen to panpotted stereo with
headphones. To solve this, convert to sum and difference, high-pass the
difference (first order, around 800 Hz) and convert back to L and R.
At low frequencies this translates an amplitude difference into a phase
difference, as the ears require.
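That recipe can be sketched in a few lines. The 800 Hz first-order high-pass on the difference is from the description above; the bilinear-transform filter design, function name and sample rate are my own choices.

```python
import numpy as np

def headphone_shuffler(left, right, fs, fc=800.0):
    """Convert L/R to sum/difference, first-order high-pass the
    difference around fc, convert back to L/R (a sketch of the
    shuffler described in the post)."""
    mid  = 0.5 * (left + right)
    side = 0.5 * (left - right)

    # First-order high-pass H(s) = s/(s + wc), discretised by the
    # bilinear transform with prewarping.
    K  = np.tan(np.pi * fc / fs)
    b0 = 1.0 / (1.0 + K)            # b = [b0, -b0]
    a1 = (K - 1.0) / (1.0 + K)      # a = [1, a1]

    # Direct-form II transposed, one state variable.
    side_hp = np.zeros_like(side)
    z = 0.0
    for n in range(len(side)):
        y = b0 * side[n] + z
        z = -b0 * side[n] - a1 * y
        side_hp[n] = y

    return mid + side_hp, mid - side_hp
```

At DC the difference is removed entirely, so a hard-panned static signal comes out equally in both channels; higher up, the stereo difference passes through unchanged.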


More information about the Ardour-Dev mailing list