The proper dB unit to use when expressing the amplitude of a sample in a digital audio recording


I've seen 20*log10(amplitude) given as the proper way to calculate dB, where 0 < amplitude <= 1. (So, when signed integer types are used to represent the samples, you get the amplitude by dividing by the maximum value for positive samples and by the minimum value for negative samples.)
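That normalization and conversion can be sketched in a few lines. This is a minimal illustration, not a reference implementation; `sample_to_db` is a hypothetical helper name, and the silence case (amplitude 0) is mapped to negative infinity since the log is undefined there:

```python
import math

def sample_to_db(value, bits=16):
    # Normalize a signed integer sample to the range 0..1:
    # positive samples divide by the max positive value,
    # negative samples divide by the (negative) minimum, so the
    # ratio comes out positive either way.
    max_pos = 2 ** (bits - 1) - 1   # e.g.  32767 for 16-bit
    min_neg = -(2 ** (bits - 1))    # e.g. -32768 for 16-bit
    amplitude = value / max_pos if value >= 0 else value / min_neg
    if amplitude == 0:
        return float("-inf")        # silence has no finite dB value
    return 20 * math.log10(amplitude)

print(sample_to_db(32767))   # 0.0 (full scale)
print(sample_to_db(16384))   # roughly -6 dB (half scale)
```

Note that both extremes (32767 and -32768) map to 0 dB, and every other sample maps to a negative number, matching the behavior described above.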

I know this formula generates 0 for extreme values and negative numbers for everything else. I also know it's showing a relationship between the sample and an extreme sample value. Is this properly called dBu or dBv or something else? I've seen both, plus just "dB".

I didn't study electrical engineering, so as close as you can get to a layman's explanation of why a particular unit is correct would also be helpful.

Best Answer

Your confusion seems to be related to not understanding exactly how decibel measurements work. You say \$20·\log_{10}(amplitude)\$, but this isn't quite correct.

The proper formula is \$20·\log_{10}(\frac{amplitude}{reference})\$, where \$reference\$ is your reference value. The choice of reference value is what determines the full symbol used; if \$reference = 1\mathrm{μV}\$, for example, then the unit would be written as dBμV or dBuV. Likewise, with a millivolt reference, you would write it as dBmV. Current quantities are also written this way, dBμA for example.

Note that the formula is different when dealing with power quantities. With a power measurement, you use \$10·\log_{10}(\frac{power}{reference})\$. The reason 10 is used instead of 20 is the quadratic relationship between power and voltage (or power and current): since \$P = V^2/R\$, squaring the voltage ratio inside the logarithm becomes a factor of 2 outside it, so the two formulas report the same number of decibels for the same physical change.
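A quick numeric check of that consistency, assuming a purely resistive load (the 8 Ω value is arbitrary and cancels out of the ratio): doubling the voltage quadruples the power, and both formulas report the same +6 dB change.

```python
import math

R = 8.0                          # assumed load resistance in ohms
v1, v2 = 1.0, 2.0                # voltage doubles
p1, p2 = v1**2 / R, v2**2 / R    # so power quadruples (P = V^2 / R)

db_voltage = 20 * math.log10(v2 / v1)  # field-quantity formula
db_power = 10 * math.log10(p2 / p1)    # power-quantity formula

print(db_voltage, db_power)  # both come out to about 6.02 dB
```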