Input voltage for the ADC PCM1807


In the datasheet (http://www.ti.com/lit/gpn/pcm1807) of the Texas Instruments PCM1807, in section ABSOLUTE MAXIMUM RATINGS (page 2), it says:

Analog Input Voltage, \$V_{INL}, V_{INR}, V_{REF}\$: \$-0.3\ \text{V to } (V_{CC} + 0.3\ \text{V})\$

OK, that means it's a standard ADC for which an AC signal must be level-shifted before it's fed to the input pins. Oh, and there's a convenient \$V_{REF}\$ pin for the DC offset (a.k.a. the centre voltage) as well.
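To make that concrete (a quick sketch with generic symbols, not values from the datasheet): biasing the signal at mid-supply,

\$\$v_{\text{pin}}(t) = \frac{V_{CC}}{2} + v_{in}(t),\$\$

keeps the pin voltage inside the \$0\$ to \$V_{CC}\$ window for any \$|v_{in}(t)| \le V_{CC}/2\$.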

But in the schematic in section TYPICAL CIRCUIT CONNECTION DIAGRAM (page 25), both \$V_{INL}\$ and \$V_{INR}\$ are connected through capacitors, which remove the DC offset and make the signal AC again, apparently conflicting with the absolute maximum ratings (and with the whole logic of level-shifting the input voltage into the \$0\$ to \$V_{CC}\$ range).

What did I overlook?

Best Answer

Here's an application sheet for the PCM1807. Note the circuitry contained inside the device:

[Schematic from the PCM1807 application sheet, showing the AC-coupled inputs and the device's internal input circuitry]

Clearly, the external analogue circuitry only makes an AC connection to the device's inputs. From this I assume that the inputs of the PCM1807 are internally biased to a suitable voltage, and that this bias voltage should not be disturbed externally without risking some loss of performance.
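To put a rough number on it (a sketch only; the \$1\ \mu\text{F}\$ coupling capacitor and the \$\sim 20\ \text{k}\Omega\$ internal input impedance are illustrative assumptions, not datasheet figures, so check the electrical characteristics table): the external capacitor working against the internal bias impedance forms a first-order high-pass filter,

\$\$f_{-3\,\text{dB}} = \frac{1}{2\pi R_{in} C} \approx \frac{1}{2\pi \times 20\ \text{k}\Omega \times 1\ \mu\text{F}} \approx 8\ \text{Hz},\$\$

comfortably below the audio band. The pin's DC level is then set entirely by the chip's internal bias, the audio signal rides on top of it, and the capacitor also blocks any external DC that could push the pin outside the \$-0.3\ \text{V}\$ to \$V_{CC} + 0.3\ \text{V}\$ absolute maximum window.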