
If I put my notes here, I might be able to find them again later!

Quick note on line level output currents

Posted 18th July 2016 at 03:58 AM by rjm
Updated 22nd July 2016 at 10:41 PM by rjm

Standard consumer audio line level output is -10 dBV, or 0.316 V rms [dBV = 20 * log10(V / 1 V)]. Some devices, such as computer sound cards, can boost that at maximum volume settings: my Asus Xonar can do +6 dBV, or 2 V rms. Quite a lot of digital audio gear produces 2 V rms output: DACs and CD players, not just computer sound cards.
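
As a quick numerical check of the bracketed formula, here is a minimal Python sketch of the dBV conversion, using the levels quoted above:

```python
import math

def dbv_to_vrms(dbv):
    """Convert a level in dBV (dB re 1 V rms) to volts rms."""
    return 10 ** (dbv / 20)

def vrms_to_dbv(v_rms):
    """Convert volts rms to a level in dBV."""
    return 20 * math.log10(v_rms)

print(dbv_to_vrms(-10))  # 0.316... V rms, consumer line level
print(dbv_to_vrms(6))    # 1.995... V rms, a sound card at full volume
print(vrms_to_dbv(2.0))  # 6.02... dBV
```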

The output current required from the line driver is the signal level divided by the load impedance, so to estimate the worst case we have to consider the smallest practical load and the largest likely signal. The input impedance of consumer audio gear is typically 10k to 100k. 10k is the lowest common design point, but sometimes people do strange things like driving two components at once, which halves the effective load, or driving headphones, or pro audio gear with 600 ohm inputs.

The long and short of it, though, is that consumer audio inputs are never normally going to draw more than about 1 mA. For pro audio, meanwhile, the maximum is around 3 mA. A 5 mA bias current through the driver output stage will therefore be enough to keep it in class A under all foreseeable circumstances short of driving headphones.
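
To sanity-check those figures, here is a small Python sketch of the arithmetic above. The loads are the ones discussed; I've used peak rather than rms current, since that is what the class A bias has to cover, and taken +4 dBu (about 1.23 V rms) as the pro audio signal level, which is my assumption rather than anything stated above:

```python
import math

def dbu_to_vrms(dbu):
    """dBu is referenced to 0.775 V rms (1 mW into 600 ohms)."""
    return 0.775 * 10 ** (dbu / 20)

def peak_current_ma(v_rms, load_ohms):
    """Peak current in mA for a sine wave of v_rms into a resistive load."""
    return v_rms * math.sqrt(2) / load_ohms * 1000

cases = [
    ("2 V rms into a 10k consumer input", 2.0, 10_000),
    ("2 V rms into two 10k inputs in parallel", 2.0, 5_000),
    ("+4 dBu pro level into 600 ohms", dbu_to_vrms(4), 600),
]
for label, v_rms, z in cases:
    print(f"{label}: {peak_current_ma(v_rms, z):.2f} mA peak")
```

The consumer cases come out well under 1 mA even with two inputs in parallel, and the 600 ohm pro case lands right around 3 mA peak, consistent with the figures above.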

*****

To be brutally honest, outside of the little exercise above with currents, I don't know how to put this knowledge to much other practical use. Mostly our interest in signal levels is about gain and, at bottom, loudness: "How much gain should my phono stage have to play at the same level as my CD player?" "Do I need an active preamp?" "How do I match the amplifier sensitivity to the preamp gain?" The main difficulty, as I see it, is that the relationship between the peak digital signal level and the typical music signal level is not defined, just as the relation between the -10 dBV "line level" signal and the typical music signal level doesn't seem to be fixed. So a 2 V CD player could sound 16 dB louder than a "line level" tuner or phono stage, but as far as I can tell there is no basis for assuming it will.

For phono stages with MC inputs (0.5 mV, -66 dBV typical signal), 55 dB of gain is common, which implies an output of about -10 dBV, i.e. consumer line level. MM inputs (5 mV, -46 dBV typical signal) with 35 dB of gain work out to the same -10 dBV output. To reach "digital level" you'd need to boost that by about 16 dB, though this is rarely done - leaving me to wonder whether the real loudness difference is really that large. It's hard to get a handle on without test recordings on both LP and digital formats, which I don't have, so instead I'm going with the same album on both LP and CD and looking at where the volume control sits for what I estimate is the same loudness...
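
The gain arithmetic above is just addition in dB; a quick Python sketch, using the cartridge levels quoted:

```python
import math

def dbv(v_rms):
    """Level in dBV, i.e. dB relative to 1 V rms."""
    return 20 * math.log10(v_rms)

print(dbv(0.5e-3) + 55)        # MC: -66 dBV + 55 dB gain = about -11 dBV
print(dbv(5e-3) + 35)          # MM: -46 dBV + 35 dB gain = about -11 dBV
print(dbv(2.0) - dbv(0.5e-3))  # gain for MC to reach 2 V rms: about 72 dB
```

Both work out to about -11 dBV, roughly consumer line level; reaching the 2 V rms "digital level" from an MC cartridge would take about 72 dB of gain, in line with the 16 dB boost mentioned above.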
Posted in The Lab

Comments

  1. Comment by jan.didden
    Good summary. I only want to comment that your dB values seem to actually be dBV - dB relative to 1 V. Since there are other 'dB values' like dBu (ref. 0.775 V, from 1 mW into 600 ohms) and dBr (ref. some explicitly stated reference) I think this should be mentioned.

    Jan
    Posted 21st July 2016 at 10:37 AM
  2. Comment by rjm
    Hi Jan,

    Thanks for your comment.

    I have read passionate arguments insisting that dB should not be adulterated to dBv or dB(V) since the expression is unitless and 20 dB is still 20 dB irrespective of whether we refer to voltage (10x) or power (100x). So I tend to adhere to that convention.

    The existence of pro-audio standards like dBu muddies this, of course, and since we are discussing line output levels in absolute terms I'll add a note above.
    Posted 22nd July 2016 at 12:09 AM
 
