Analogue vs Digital attenuation

Status
This old topic is closed. If you want to reopen this topic, contact a moderator using the "Report Post" button.
Hi,

I have a Minidsp Balanced, set to 2 V in, 2 V out.

My CD player, for example, outputs 2 V RMS, so if I connect it directly to the Minidsp, I make full use of the bit depth in the DSP, as the signal is at full scale. If I attenuate digitally by 6 dB using the output faders (with a pot connected to the board), I get 1 V at the output, have lost a certain amount of bit depth, and have a certain signal-to-noise ratio in the DSP.

The question is: if I instead attenuate the analogue output of the CD player by 6 dB to 1 V and keep the faders at maximum in the Minidsp, I still get 1 V at the output, but is the loss of bit depth and SNR the same as in the first scenario?

Or, to put it another way: if I have the Minidsp after a preamp and attenuate with the preamp, is this equivalent to feeding the DSP directly and attenuating digitally?
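The 6 dB arithmetic in the question can be sketched out numerically. This is a rough illustration assuming an ideal converter; the 2 V full scale comes from the post, while the ~6.02 dB-per-bit rule of thumb is a standard figure, not a MiniDSP specification:

```python
import math

def attenuation_effect(db, full_scale_v=2.0):
    """Relate a dB attenuation to output voltage and resolution given up.

    Assumes an ideal converter driven to full scale; the 6.02 dB-per-bit
    figure is the usual rule of thumb for quantisation SNR.
    """
    ratio = 10 ** (-db / 20)       # amplitude ratio for the attenuation
    v_out = full_scale_v * ratio   # resulting RMS output voltage
    bits_given_up = db / 6.02      # ~6.02 dB of dynamic range per bit
    return v_out, bits_given_up

v, bits = attenuation_effect(6.0)
print(f"{v:.2f} V out, ~{bits:.1f} bit(s) of resolution given up")
```

So a 6 dB cut, whether applied digitally or in the analogue domain, halves the voltage and corresponds to roughly one bit of the converter's range; the interesting question is where in the chain the noise is added.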

Thanks
 
No. Digital attenuation is applied after the ADC, so it scales the noise of the ADC built into the Minidsp down along with the signal; attenuating the input signal in the analogue domain leaves the ADC noise at its full level while the signal is smaller, so the SNR through the DSP is worse.

On the other hand, analogue attenuation gives you headroom while digital attenuation does not. For example, if due to production tolerances or intersample overshoots the CD player produces 2.1 V RMS, you get hard clipping only in the case without any analogue attenuation.
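That clipping point can be shown with a toy NumPy sketch. Everything here is an illustrative assumption: amplitudes are normalized so the ADC full scale is 1.0 (standing in for the 2 V input), the "hot" source peaks 5% over full scale (the 2.1 V case), and the attenuation factor is 0.5 (about 6 dB):

```python
import numpy as np

FS = 1.0  # normalized converter full scale (stands in for the 2 V input range)

def adc_clip(x):
    """Hard-clip anything beyond converter full scale, as a real ADC would."""
    return np.clip(x, -FS, FS)

# A tone whose peaks exceed full scale by 5% (the "2.1 V into a 2 V input" case)
t = np.linspace(0, 1, 1000, endpoint=False)
hot = 1.05 * np.sin(2 * np.pi * 5 * t)

# Digital attenuation happens AFTER the ADC: the peaks are already clipped off.
digital_path = adc_clip(hot) * 0.5

# Analogue attenuation happens BEFORE the ADC: the peaks fit within full scale.
analogue_path = adc_clip(hot * 0.5)

print("digital path peak :", digital_path.max())   # 0.5, flat-topped
print("analogue path peak:", analogue_path.max())  # ~0.525, undistorted
```

The digital path ends up quieter but with flattened peaks (hard clipping), while the analogue path preserves the waveform, which is the headroom trade-off described above.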
 