I want to be able to digitally switch the input of a line-level preamplifier. I was considering using relays to physically connect or disconnect each source at the input of the amp, which would be perfect in terms of fidelity. But is that overkill? Would CMOS transmission gates be good enough? Am I overlooking something far simpler and more obvious? Thanks.
With a bit more searching I think I may have come across the solution.
It looks like, for simply combining inputs, a resistive circuit as shown in the question with passive mixing of the outputs is fine; if gain control is required on the individual inputs, a virtual-earth mixer is better... so I think what I propose should work.
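To make the virtual-earth option concrete, here is a minimal sketch of the ideal output of an inverting (virtual-earth) summing mixer. The component values and function name are my own illustrative assumptions, not from the post:

```python
def virtual_earth_mix(inputs_and_rins, rf):
    """Ideal op-amp summing junction: Vout = -Rf * sum(Vin_i / Rin_i).
    Each input's gain is set independently by its own input resistor."""
    return -rf * sum(v / r for v, r in inputs_and_rins)

# Two line-level sources, 10k input resistors, 10k feedback -> unity gain each
vout = virtual_earth_mix([(0.5, 10e3), (0.25, 10e3)], rf=10e3)  # -> -0.75 V
```

The per-input resistor is what gives this topology its individual gain control; the summing node sits at virtual ground, so the sources do not interact.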
I'll leave the question open for a couple more days to see if anyone can add any further input.
Yes, that solved it. I have three pots dividing the voltage across the three speakers (left, right, centre) at the speaker outs on the receiver. The outputs of the pots are combined (mixed) into one input for the mid-woofer crossover and then through to the amp... pretty much perfect now... one day (when we get a bigger home) I'll be able to get proper speakers and get the best sound out, but this is a great workaround for a small house.
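The pot arrangement above is, electrically, just a voltage divider per channel. A quick sketch of the attenuation involved, with purely illustrative numbers (a 10 V peak speaker-level signal and the wiper at 10% of travel — neither figure is from the post):

```python
import math

def pot_attenuation(v_in, wiper_fraction):
    """Treat the pot as an ideal voltage divider: Vout = Vin * wiper fraction."""
    return v_in * wiper_fraction

v_out = pot_attenuation(10.0, 0.10)      # 10 V speaker level down to 1.0 V
loss_db = 20 * math.log10(v_out / 10.0)  # -> -20 dB of attenuation
```

In practice the pot value should be large relative to the speaker tap and small relative to the crossover's input impedance so the divider ratio holds.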
- Today's high-definition audio equipment that works with 24-bit audio, such as mixers, uses 600 ohm differential inputs and internal bus connections (until the signal is digitized) that need a voltage swing of +/- 10 volts to handle the 120 dB dynamic range of SACD / DVD / Blu-Ray audio tracks. The diff driver IC is often an SSM2142, which has supply rails of +/- 18 volts (in mixers).
- Yes, some of this extra voltage is to make up for the voltage drop across distant 600 ohm loads, so a +/- 10 volt swing is available at the load. This higher drive voltage began with CDs (early 1980s), which have about 90 dB of dynamic range; they overloaded the inputs of many older stereos, so the sound was very distorted.
- The fix was to install -6 dB phono-plug adapters so the level was dropped to what the old stereos were used to. I cannot say why a designer would use a high-voltage op-amp to drive a low-voltage ADC, unless the input is at the SACD-standard +/- 10 volts and is divided down before the ADC.
- Today's CD / SACD / DVD / Blu-Ray outputs have a +/- 10 volt swing as a maximum output, so only the analog output IC has to have the high-voltage rails. Before this IC the signal is still in digital form. MP3 players can work with much lower voltages, as their dynamic range is about 60 dB.
- A higher-quality player that can play CD files may have 90 dB of dynamic range by boosting the voltage for the headphone/earbud driver IC only.
- Unless you're handling 24-bit audio or ultra-precision DC measurements, 5 volt single-ended supplies and ultra-quiet 5 volt and 3.3 volt ADCs and DACs are good enough for CD, MP3 and conventional audio. So-called 'chopper' amplifiers can also perform precision DC measurements from a 5 volt supply, though the sample rate is limited to maybe 100 sps; this will increase over time. Power amplifiers use +/- 15 volt rails (or higher) for the op-amps to handle today's high-definition audio with almost no distortion.
- If a designer installs +/- 12 volt op-amps in 5 volt ADC circuits, it could be for ease of design, for driving lower-impedance loads, and/or to avoid buying 'special' low-voltage op-amps that may just add to inventory cost.
- If you look up SACD, it is a studio-grade 24-bit recording format that is mixed down to DVD/Blu-Ray audio tracks, with music concerts also getting a 16-bit mix-down for CDs. +/- 10 volts is the maximum line level for 600 ohm loads or higher, and today's home audio expects these levels from such devices.
- The standard 'Tape In' and 'Tape Out', MP3 and cassette inputs still work at their original signal levels, about +/- 1 volt. The old level standards still exist, alongside high-definition audio (24 bit).
- Five volt and 3.3 volt ADCs are capable of much more dynamic range (a very low noise floor) than in the past. I do not mean to imply that 5 or 3.3 volt ADCs cannot handle high-definition audio, as long as they are purpose-built for that task. Their cost was initially very high but is coming down year by year.
- So-called "high voltage" op-amps are used for many reasons, including audio and ultrasonic work. Supply rails of +/- 12 volts are actually on the low side; the LTC6090 has supply rails of +/- 70 volts, for ultrasound and motion control.
- Some audio power amplifiers, such as those in the Cerwin-Vega Metron series, have supply rails of +/-130 volts, for an output of 1,500 watts, so a preamp stage with +/- 15 volts rails seems like low voltage.
- In this case and others, the higher voltages are needed because the next stage or target device needs that wide voltage range to work properly. The load may be a voltage divider designed for a low-impedance, low-noise load, or simple impedance matching.
- By the way, "xx dB" refers to the noise floor of the circuitry or the recording: the residual noise at the floor is too low to detect or be noticeable compared to 'full volume'. It does not indicate the voltage an IC is powered by, but how quiet the IC is with no signal applied, compared to full-volume sound levels.
- Supply rails of +/- 12 volts and higher are increasingly reserved for those devices that actually need them, such as power amps, power supplies, test equipment, etc.
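The dynamic-range figures quoted in the points above track bit depth. A quick sketch using the standard ideal-quantizer formula (6.02·N + 1.76 dB — the formula is textbook, not from the post; real converter chains achieve less, e.g. the roughly 120 dB quoted for 24-bit paths):

```python
def ideal_dynamic_range_db(bits):
    """Theoretical SNR of an ideal N-bit quantizer: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

cd = ideal_dynamic_range_db(16)  # ~98 dB ideal; ~90 dB quoted above for CDs
hd = ideal_dynamic_range_db(24)  # ~146 dB ideal; real 24-bit chains manage ~120 dB
```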
EDIT: Due to constant improvements in the noise floor of DC and audio ICs, it is worth mentioning that at some point the old 'standard' of +/- 10 volts into 600 ohms will be superseded by lower voltages into 600 ohms. The 600 ohm standard comes from the days of running 200 feet of 32-channel shielded-twisted-pair (STP) cable from a concert stage to the sound tower, and at the time it was convenient for analog mixers with 600 ohm XLR inputs.
All of these expensive steps were needed to keep noise out of the microphone and instrument cables: noise from overhead lights, strobe lights, lasers, the walkie-talkies the stage crew used, etc.
That being said, it is clear that as time passes more sound sources are being digitized at the source, avoiding the need for the 200 foot 'snake'. At some point in the future all sources will be digitized right away and may not become analog again until the signal reaches the headphones and speakers, rendering my current answer obsolete in terms of signals, but not power.
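The -6 dB phono-plug pad mentioned earlier is just a two-resistor divider; equal resistors halve the voltage, and 20·log10(0.5) is about -6.02 dB. A minimal sketch, with resistor values that are illustrative assumptions rather than from any actual adapter:

```python
import math

def divider_gain_db(r_top, r_bottom):
    """Gain of an unloaded resistive divider, expressed in dB."""
    ratio = r_bottom / (r_top + r_bottom)
    return 20 * math.log10(ratio)

pad = divider_gain_db(10e3, 10e3)  # equal resistors -> about -6.02 dB
```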