Electronic – Why do audio op amps require such high rail voltages

adc, audio, operational-amplifier

This might be a silly question, but I haven't been able to find it addressed directly anywhere on the internet yet. I also have a handful of related questions in-line, which I hope don't stray too far off-topic.

In pro gear, line-level audio signals are roughly 3.5 V peak-to-peak, so why do audio circuits routinely require or recommend rail voltages of +/-12 V or higher?

Is this purely a headroom thing? Or do non-linearities in op amps depend on the supply voltages?
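
For concreteness, here is the rough arithmetic behind those numbers (a sketch in Python; the +4 dBu reference, the sine-wave assumption, and the ~20 dB of headroom are typical pro-audio assumptions on my part, not anything from a datasheet):

```python
import math

def dbu_to_vrms(dbu):
    """Convert a dBu level to RMS volts (0 dBu = 0.7746 Vrms)."""
    return 0.7746 * 10 ** (dbu / 20)

def vrms_to_vpp(vrms):
    """Peak-to-peak voltage of a sine wave with the given RMS value."""
    return 2 * math.sqrt(2) * vrms

nominal = dbu_to_vrms(4)      # +4 dBu "pro" line level, ~1.23 Vrms
peaks = dbu_to_vrms(4 + 20)   # allowing ~20 dB of headroom for transients (assumption)

print(f"+4 dBu nominal: {vrms_to_vpp(nominal):.2f} Vpp")  # ~3.47 Vpp
print(f"+24 dBu peaks:  {vrms_to_vpp(peaks):.1f} Vpp")    # ~34.7 Vpp -> roughly +/-18 V rails
```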

Or is it to support cheaper components? Looking at the TL072 datasheet, the maximum output voltage can be as low as 2/3 of the rail when the load resistance gets low (2 kOhm), but is typically about 90% of the rail for a 10 kOhm load. Then again, couldn't you just use a higher-end op amp that's rail-to-rail?
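
And a quick sanity check on those swing figures (the 2/3 and 90% fractions are the datasheet numbers above; the +/-12 V rail is just an example value I picked):

```python
rails = 12.0  # example +/-12 V supply

for label, fraction in [("2 kOhm load, worst case (~2/3 of rail)", 2 / 3),
                        ("10 kOhm load, typical (~90% of rail)", 0.90)]:
    vpp = 2 * rails * fraction
    print(f"{label}: ~{vpp:.1f} Vpp usable swing")
# Even the worst case (~16 Vpp) dwarfs a 3.5 Vpp line-level signal,
# so the question is really about headroom, not bare necessity.
```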

The main thing that prompts this question is looking at the datasheet for the Cirrus CS4272 and the schematics/data for the evaluation board. In that case, even though the ADC operates from 0 V to 5 V, they still opt to use a bipolar +/-18 V supply for the input buffer. In that particular example, they're using the NE5532D8, which has a worst-case output swing of 80% of the rail and supports rails as low as +/-3 V.

So why would they use +/-18 V supplies if the ADC only supports 0-5 V audio (presumably biased around 2.5 V), when a +/-3 V supply would still easily accommodate the 3.5 V peak-to-peak range?

According to the datasheet, there is also no scaling (gain or attenuation) happening in this circuit:

XLR connectors supply the CS4272 analog inputs through unity gain, AC-coupled differential circuits. A 2 Vrms differential signal will drive the CS4272 inputs to full scale.

So any signal that's over line level would wind up getting clipped by the ADC anyway. Is it better to have the clipping happen in the ADC rather than in the op amp? Or is the higher rail required for the output stage, even though it would still only provide a ~3.5 V peak-to-peak line-level output signal?
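
For concreteness, here is what that 2 Vrms differential full-scale level looks like at each ADC pin, assuming a sine wave and the ~2.5 V common-mode bias I guessed at above:

```python
import math

vrms_diff = 2.0   # full-scale differential input, per the datasheet quote above
bias = 2.5        # assumed common-mode bias ("presumably around 2.5 V")

vpk_diff = vrms_diff * math.sqrt(2)  # differential peak, ~2.83 V
vpk_leg = vpk_diff / 2               # each leg carries half of the differential swing

print(f"Each input pin swings {bias - vpk_leg:.2f} V to {bias + vpk_leg:.2f} V")
# ~1.09 V to ~3.91 V, comfortably inside 0-5 V -- so the +/-18 V rails aren't
# needed for the ADC's sake; presumably they buy headroom and linearity upstream.
```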

In the context of driving a 5 V single-supply ADC, what are the reasons that an input stage with higher, bipolar supplies is better than something like an LT1215 running from a single 5 V supply? (I can't post a link because I don't have 10 reputation on this particular Stack Exchange yet… it's easy enough to google.)

Thanks!

Best Answer

  1. Today's high-definition audio equipment that works with 24-bit audio, such as mixers, uses 600 ohm differential inputs and internal bus connections (until the signal is digitized) that need a voltage swing of +/-10 volts to handle the 120 dB dynamic range of SACD / DVD / Blu-Ray audio tracks. The differential driver IC is often an SSM2142, which has supply rails of +/-18 volts (in mixers). (Some worked numbers on dynamic range are sketched after the EDIT at the end of this answer.)
  2. Yes, some of this extra voltage is there to make up for the voltage drop across distant 600 ohm loads, so a +/-10 volt swing is available at the load. This higher drive voltage began with CDs (early 1980s), which have about 90 dB of dynamic range and overloaded the inputs of many older stereos, so the sound was very distorted.
  3. The fix was to install -6 dB phono-plug adapters so the level was dropped to what the old stereos were used to (a quick divider calculation is sketched after the EDIT at the end of this answer). I cannot say why a designer would use a high-voltage op-amp to drive a low-voltage ADC, unless the input follows the SACD-standard +/-10 volt level and is then divided down before the ADC.
  4. Today's CD / SACD / DVD / Blu-Ray outputs have a +/-10 volt swing as a maximum output, so only the analog output IC has to have the high-voltage rails. Before this IC the signal is still in digital form. MP3 players can work with much lower voltages, as their dynamic range is about 60 dB.
  5. A higher-quality player that can play CD files may offer 90 dB of dynamic range by boosting the supply voltage for the headphone/earbud driver IC only.
  6. Unless you're handling 24-bit audio or ultra-precision DC measurements, 5 volt single-ended supplies and ultra-quiet 5 volt and 3.3 volt ADCs and DACs are good enough for CD, MP3, and conventional audio. So-called 'chopper' amplifiers can also perform precision DC measurements from a 5 volt supply, though the sample rate is limited to maybe 100 SPS; this will increase over time. Power amplifiers use +/-15 volt rails (or higher) for their op-amps to handle today's high-definition audio with almost no distortion.
  7. If a designer installs +/-12 volt op-amps in 5 volt ADC circuits, it could be for ease of design, for driving lower-impedance loads, and/or to avoid buying 'special' low-voltage op-amps that may just add to inventory cost.
  8. If you look up SACD, it is a studio-grade 24-bit recording format that is mixed down to DVD/Blu-Ray audio tracks, with music concerts also getting a mix-down to 16-bit format for CDs. +/-10 volts is the maximum line level for 600 ohm or higher loads, and today's home audio equipment expects these levels from these devices.
  9. The standard 'Tape In' and 'Tape Out', MP3, and cassette inputs still work at their original signal levels, about +/-1 volt. The old level standards still exist, but now alongside high-definition (24-bit) audio.
  10. Five volt and 3.3 volt ADCs are capable of much more dynamic range (a very low noise floor) than in the past. I do not mean to imply that 5 or 3.3 volt ADCs cannot handle high-definition audio, as long as they are purpose-built for that task. Their cost was initially very high but is coming down year by year.
  11. So called "high voltage" op-amps are used for many reasons, including audio and ultrasonic sounds. Supply rails of +/- 12 volts is actually on the low side, as the LTC6090 has supply rails of +/- 70 volts, for ultra-sound and motion control.
  12. Some audio power amplifiers, such as those in the Cerwin-Vega Metron series, have supply rails of +/-130 volts for an output of 1,500 watts, so a preamp stage with +/-15 volt rails seems like low voltage by comparison.
  13. In this case and others, the higher voltages are needed because the next stage or target device needs that wide voltage range to work properly. The load may be a voltage divider designed as a low-impedance, low-noise load, or simple impedance matching.
  14. By the way, "xxdB" is in reference to the noise floor of the circuitry or the recording, such that the highest noise at the floor level is too low to detect or be noticeable, compared to 'full volume'. Also, it does not indicate the voltage an IC is powered by, but how quiet the IC is with no signal applied, compared to full volume sound levels.
  15. Supply rails of +/-12 volts and higher are increasingly confined to the devices that actually need them, such as power amps, power supplies, test equipment, etc.

    EDIT: Due to constant improvements in the noise floor of DC and audio ICs, it is worth mentioning that at some point the old 'standard' of +/-10 volts into 600 ohms will be superseded by lower voltages into 600 ohms. The 600 ohm standard comes from the days of running 32-channel shielded-twisted-pair (STP) snake cables from a concert stage to a sound tower 200 feet away, and at the time it was convenient for analog mixers with 600 ohm XLR inputs.

    All of these expensive steps were needed to keep noise from sources such as overhead lights, strobe lights, lasers, and the walkie-talkies the stage crew used out of the microphone and instrument cables.

    That being said, it is clear that as time passes more sound sources are being digitized at the source, avoiding the need for the 200-foot 'snake'. At some point in the future all sources will be digitized right away and may not become analog again until the signal reaches the headphones and speakers, rendering my current answer obsolete in terms of signal levels, but not power.
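
A rough numerical companion to points 1 and 14 above (a sketch only: the ideal-quantizer formula and the +/-10 volt sine are textbook idealizations, not measurements of any particular device):

```python
import math

def ideal_dynamic_range_db(bits):
    """Theoretical SNR of an ideal N-bit quantizer with a full-scale sine input."""
    return 6.02 * bits + 1.76

def level_db(v_signal, v_reference):
    """Level of one voltage relative to another, in dB."""
    return 20 * math.log10(v_signal / v_reference)

print(f"16-bit: ~{ideal_dynamic_range_db(16):.0f} dB")  # ~98 dB (CD)
print(f"24-bit: ~{ideal_dynamic_range_db(24):.0f} dB")  # ~146 dB theoretical ceiling

# A +/-10 volt sine is ~7.07 Vrms; for it to sit 120 dB above the noise floor,
# that floor must be down around 7 microvolts RMS.
v_fullscale = 10 / math.sqrt(2)
v_floor = v_fullscale / 10 ** (120 / 20)
print(f"Noise floor 120 dB below {v_fullscale:.2f} Vrms: {v_floor * 1e6:.1f} uVrms")
print(f"Check: full scale is {level_db(v_fullscale, v_floor):.0f} dB above that floor")
```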
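
And a minimal sketch of the kind of -6 dB pad mentioned in point 3 (the 10k resistor values are purely illustrative; a real pad would also account for source and load impedance):

```python
import math

def divider_attenuation_db(r_top, r_bottom):
    """Attenuation of a simple two-resistor voltage divider, in dB (negative = loss)."""
    return 20 * math.log10(r_bottom / (r_top + r_bottom))

# Two equal resistors halve the voltage: 20*log10(0.5) ~= -6.02 dB
print(f"10k over 10k pad: {divider_attenuation_db(10e3, 10e3):.1f} dB")
```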