Sound quality degradation from clock jitter with I2S?

One project I want to start next year is to build an “as digital as possible” system, i.e. the audio path is completely digital until the very last stage. I have been thinking of using I2S to transmit the audio signal from digital sources (CDs, Bluetooth streaming etc.) and something like a TAS5756-based amp that can accept I2S as input.

Reading up on how this could be done, I have come across the concept of jitter. A number of articles imply that low jitter is important for sound quality: lower jitter equates to better quality. I can understand this when sampling analogue signals or when extracting a clock from a digital pulse train, but I have not yet understood why it should matter for something like I2S. My understanding is that the I2S bit clock is used to time the sampling/latching of the data line, i.e. whether it is a one or a zero at the point the bit clock rises. As long as any jitter on the bit clock is within a certain tolerance (i.e. less than the width of the data “pulse”), the correct bit will be latched and the digital information is not corrupted.
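To put some numbers on that intuition, here is a back-of-envelope sketch in Python. The figures are my own assumed but typical ones (44.1 kHz, 64 bit-clock cycles per stereo frame), not from any datasheet:

```python
# Rough check: how much bit-clock jitter would it take to actually
# corrupt I2S data? Assumed setup: 44.1 kHz sample rate, 64 BCLK
# cycles per stereo frame (2 channels x 32-bit slots), data latched
# on the rising BCLK edge.

sample_rate = 44_100                        # frames per second
bclk_cycles_per_frame = 64                  # 2 channels x 32-bit slots
bclk = sample_rate * bclk_cycles_per_frame  # ~2.8224 MHz

bit_period_ns = 1e9 / bclk                  # ~354 ns per data bit
margin_ns = bit_period_ns / 2               # crude worst case: half a bit

print(f"BCLK             : {bclk / 1e6:.4f} MHz")
print(f"Bit period       : {bit_period_ns:.0f} ns")
print(f"Corruption margin: ~{margin_ns:.0f} ns")

# Typical oscillator jitter is measured in picoseconds, thousands of
# times smaller than this margin, so the data itself does survive.
# The question in this thread is about something else: the timing of
# the conversion instants at the DAC.
```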

Am I missing something?
 
Thanks MarcekvdG. I see what you mean - basically the opposite of what happens when a jittery clock samples a signal in an ADC, with all the adverse effects that can occur.

So what do I need to consider in my design? Does this mean that I cannot rely on any bit clock coming from an external source, but need to regenerate/reclock it into something with low jitter before converting the I2S signal into anything analogue? Or is this overcompensating?
 
What you might do may depend on the I2S source you plan to use. Does it have low-jitter clocks? If so, can you ensure that the clock signals are transported from the I2S source to the DAC with minimal jitter degradation? Things like that to think about.

What is often considered optimal for 'good' DACs is for the master clocks to be located very close to the DAC chip. Copies of the clocks can then be sent to the I2S sources for use as their own reference clocks.

However, the chip you are considering probably has performance limitations compared to a particularly 'good' DAC. How much difference the clocks make is something that might require some experimentation to find out.
 
Thanks MarcekvdG. I see what you mean - basically the opposite of what happens when a jittery clock samples a signal in an ADC, with all the adverse effects that can occur.

Indeed, and it is even aggravated to some extent by the ultrasonic content of the DAC output signal (aliases, out-of-band quantization noise).
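For a feel of the magnitudes, here is a small Python sketch of the textbook jitter-limited SNR estimate for a full-scale sine, SNR(dB) = -20·log10(2π·f·t_jitter). The frequencies and jitter figures below are my own illustrative picks, not measurements of any particular chip:

```python
import math

def jitter_limited_snr_db(f_signal_hz: float, jitter_rms_s: float) -> float:
    """Best-case SNR when the only error source is random clock jitter."""
    return -20 * math.log10(2 * math.pi * f_signal_hz * jitter_rms_s)

# Audio tones plus ultrasonic content, at two assumed rms jitter levels.
for f in (1_000, 20_000, 100_000):
    for tj in (1e-9, 100e-12):
        print(f"f = {f / 1000:>5.0f} kHz, jitter = {tj * 1e12:>5.0f} ps"
              f" -> SNR limit ~ {jitter_limited_snr_db(f, tj):.0f} dB")

# Note how the limit worsens as frequency rises: jitter acting on the
# ultrasonic images and out-of-band noise at the DAC output produces
# errors that can fold back into the audio band, which is the
# aggravation mentioned above.
```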

So what do I need to consider in my design? Does this mean that I cannot rely on any bit clock coming from an external source, but need to regenerate/reclock it into something with low jitter before converting the I2S signal into anything analogue? Or is this overcompensating?

It depends... If the audio signal comes from a computer, there are some USB-to-I2S converter boards with pretty good clock generators. The computer then synchronizes its output data to the clock generator on the converter board.
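The mechanism behind that synchronization comes from the USB Audio Class rather than from this thread: with an asynchronous isochronous endpoint, the device's clock is the master and the host paces its data using a feedback value the device reports (nominal samples per 1 ms frame in 10.14 fixed-point format on full-speed USB). A minimal sketch of how that value is formed, assuming that feedback scheme:

```python
def usb_async_feedback_10_14(sample_rate_hz: float) -> int:
    """Encode samples-per-1ms-frame as a 10.14 fixed-point feedback value
    (full-speed USB asynchronous audio feedback)."""
    samples_per_frame = sample_rate_hz / 1000.0   # full-speed frame = 1 ms
    return round(samples_per_frame * (1 << 14))   # shift into 10.14 format

# 44.1 kHz -> 44.1 samples/frame -> 0xB0666
print(hex(usb_async_feedback_10_14(44_100)))

# The device nudges this value up or down as its FIFO drifts, so the
# host ends up tracking the converter board's low-jitter local
# oscillator rather than the other way around.
```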

If you want to play an S/PDIF signal, you have to synchronize to the incoming signal without copying all of its jitter (an unfortunate feature of the S/PDIF standard). That usually means either using a narrow-bandwidth PLL that synchronizes a good local oscillator to the S/PDIF signal, or using an asynchronous sample rate converter with a slow tracking loop to convert the data to a local clean clock.
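As a rough illustration of the second option, here is a sketch of a ratio-tracking loop for an ASRC. All names and gain values are hypothetical, purely to show the idea of a loop slow enough to filter out the incoming jitter:

```python
class RatioTracker:
    """First-order tracking loop: nudge the resampling ratio so the FIFO
    between the S/PDIF receiver and the local clean clock stays half full."""

    def __init__(self, nominal_ratio: float, gain: float = 1e-6):
        self.ratio = nominal_ratio  # estimate of output_rate / input_rate
        self.gain = gain            # tiny gain = very slow loop, so jitter
                                    # on the S/PDIF side is averaged away

    def update(self, fifo_fill: float, fifo_target: float) -> float:
        # If the FIFO is filling up, the source is running fast relative
        # to the local clock: lower the ratio slightly so each output
        # sample consumes a little more input, draining the FIFO.
        error = fifo_fill - fifo_target
        self.ratio -= self.gain * error
        return self.ratio

# Usage: called once per output block; the returned ratio would feed the
# polyphase interpolator that computes the actual output samples.
tracker = RatioTracker(nominal_ratio=1.0)
ratio = tracker.update(fifo_fill=520, fifo_target=512)
print(ratio)
```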