Toslink popular?

Much depends on the device receiving the digital signal. If the Toslink receiver has a good ASRC and galvanic isolation between devices is important to you, then the optical link is very convenient. One way or another, a lot depends on the specific task: in some cases a coaxial cable may be preferable, and in others a USB cable.
 

TNT

Member
Joined 2003
Paid Member
It's not superior in the sense that it has lower bandwidth than both coaxial and USB. Toslink is a consumer-grade opto link - its only superiority is complete galvanic isolation and immunity to carrying and/or picking up disturbances on the line.

I prefer it, so here it's popular ;)

//
 
TOSLINK has a poor reputation for adding jitter to a digital signal.
It is the transmission principle itself, not Toslink as such, that is susceptible to jitter. I suppose there simply were no good, cheap optical transmitters and receivers before, which led to more jitter in optics than in copper. Today there is no difference in jitter between optics and copper; ASRC and modern optical receivers have reduced the problem to a level where it no longer makes sense to pay attention to it.
 
It is the transmission principle itself, not Toslink as such, that is susceptible to jitter. I suppose there simply were no good, cheap optical transmitters and receivers before, which led to more jitter in optics than in copper. Today there is no difference in jitter between optics and copper; ASRC and modern optical receivers have reduced the problem to a level where it no longer makes sense to pay attention to it.
TOSLINK is rather highly susceptible to inducing jitter into the physical interface, because no serious consideration was given to the jitter-inducing mechanisms of the link when it was chosen - only to its basic function, and cost. As with any transmission link, impedance mismatch causes signal reflections within the link. For an optical link, this largely comes down to index-of-refraction mismatching between the plastic fiber and the cheap butt connectors, although there are additional lesser causes of signal degradation.
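To put a rough number on the refraction mismatch: the Fresnel equation gives the fraction of light reflected at each face of a butt joint. A back-of-envelope sketch (the indices below are typical textbook values for PMMA plastic fiber and an air gap, assumed for illustration):

```python
# Fresnel reflectance at normal incidence between two media:
# R = ((n1 - n2) / (n1 + n2))^2
def fresnel_reflectance(n1: float, n2: float) -> float:
    return ((n1 - n2) / (n1 + n2)) ** 2

n_pmma = 1.49  # typical core index of PMMA plastic optical fiber
n_air = 1.00   # air gap at a cheap butt connector

r = fresnel_reflectance(n_pmma, n_air)
# Light crossing fiber -> air -> fiber hits two such interfaces.
print(f"per-interface reflectance: {r:.1%}")   # ~3.9%
print(f"two interfaces (air gap):  {2 * r:.1%}")
```

A few percent of the light bouncing between connector faces is the kind of thing that puts structure on the received edges.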

I recall reading that the S/PDIF interface was originally intended only for manufacturing production testing of the data signal coming from the transport section of a CD player. Jitter was of little concern, so long as it wasn't severe enough to provoke data read errors. Jitter-produced spurious sidebands were not a consideration for the interface. Such mechanisms didn't become a concern until dedicated DAC boxes, connecting to a transport via the S/PDIF port, entered the market.

ASRC remains controversial. According to Bruno Putzeys, if I understood him correctly, what ASRC does computationally is transform jitter into a sort of time-smeared, small data error. It eliminates the jitter-originated spurious sidebands, but also irreversibly changes the sample data sent to the D/A chip. Apparently, ASRC can be conceptually over-simplified as follows. In analog terms, jitter may be conceived of as producing the correct sample value, but located at some incorrect time instant. ASRC eliminates that error, essentially, by converting/computing that correct sample value at an incorrect time instant into an incorrect sample value at the correct time instant. Again, conceptually speaking. That incorrect sample value is then locked into the data stream forever. At least, I think that's what Bruno was saying.
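If I've followed Bruno's picture correctly, it can be caricatured in a few lines: take the correct sample values at jittered instants, then interpolate back onto the nominal time grid. The values now sit at the correct instants, but they are no longer the original values. A toy sketch (linear interpolation stands in for the long polyphase filter a real ASRC uses; the 1 ns jitter figure is just an assumption):

```python
import numpy as np

fs = 48000.0
n = 64
t_nominal = np.arange(n) / fs                  # correct time instants
jitter = np.random.normal(0, 1e-9, n)          # 1 ns RMS timing error (assumed)
t_jittered = t_nominal + jitter                # where the samples actually "happened"

f_tone = 10000.0
samples = np.sin(2 * np.pi * f_tone * t_jittered)  # correct values, wrong instants

# "ASRC": compute values at the correct instants from the jittered stream.
resampled = np.interp(t_nominal, t_jittered, samples)

ideal = np.sin(2 * np.pi * f_tone * t_nominal)
print("RMS error locked into the data:",
      np.sqrt(np.mean((resampled - ideal) ** 2)))
```

The residual printed at the end is exactly the "incorrect sample value at the correct time instant" that can never be undone downstream.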
 

TNT

Member
Joined 2003
Paid Member
As I understand it, the limited BW is in itself jitter-inducing. A lower-BW link will always have higher jitter than a higher-BW link.
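A crude way to see it: a band-limited link slows the signal edges, and noise at the receiver's decision threshold then converts into timing error, roughly jitter ≈ noise voltage / edge slew rate. A sketch with assumed numbers (1 V swing, 10 mV RMS noise):

```python
# Noise-on-a-slow-edge model: timing jitter ~ voltage noise / edge slew rate.
# For a first-order channel, 10-90% rise time ~ 0.35 / bandwidth.

def edge_jitter_rms(bandwidth_hz, swing_v, vnoise_rms_v):
    rise_time = 0.35 / bandwidth_hz          # seconds
    slew = 0.8 * swing_v / rise_time         # V/s across the 10-90% region
    return vnoise_rms_v / slew               # seconds RMS

for bw in (6e6, 25e6):                       # older vs. newer receiver BW
    j = edge_jitter_rms(bw, swing_v=1.0, vnoise_rms_v=0.01)
    print(f"BW {bw/1e6:.0f} MHz -> ~{j*1e12:.0f} ps RMS edge jitter")
```

Same noise, four times the bandwidth, roughly a quarter of the edge jitter.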

The s/pdif interface's problems could all have been solved and improved by two simple measures that, cost-wise, would probably have cancelled out in the long term for all involved, while giving a big technical upper hand:

- A double link, where the data is sent on a downstream link and the second link is used to send the clock back to the sender for synchronisation purposes, making the DAC effectively the master and thus making jitter on the line a non-issue. The oscillator could then be placed very close to the conversion point. Opto would have been a good choice for the reason stated above, i.e. galvanic isolation. (See the sketch at the end of this post.)

- Skip the second frequency hierarchy - the difference between 44 and 48 is just too small to do any magic. Here comes the cost/complexity reduction to pay for the double opto link. This was just plain stupid and probably driven by the pro side, which wanted something of its "own"...

And the Sony boss would still get Beethoven's 9th onto one disc ;)
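For what it's worth, a toy model of the double-link idea (structure and names are mine, purely illustrative) - everything is paced by the DAC's local oscillator, and the source only sends when asked:

```python
from collections import deque

# Toy model of the proposed double link: the DAC owns the clock.
# The upstream link is the on_clock_tick() call - the source only
# releases a sample when the DAC's returned clock requests one.

class Source:
    def __init__(self, samples):
        self.samples = iter(samples)

    def on_clock_tick(self):
        # Driven by the clock sent back from the DAC; the source is a slave.
        return next(self.samples)  # downstream link: one sample per request

class Dac:
    def __init__(self, source):
        self.source = source
        self.fifo = deque()  # tiny elastic buffer

    def local_clock_tick(self):
        # The DAC's local low-jitter oscillator paces both the upstream
        # request and the conversion instant, so jitter on the links
        # never reaches the D/A conversion.
        self.fifo.append(self.source.on_clock_tick())
        return self.fifo.popleft()  # sample handed to the converter

dac = Dac(Source(range(8)))
print([dac.local_clock_tick() for _ in range(8)])  # [0, 1, ..., 7]
```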

//
 
- A double link, where the data is sent on a downstream link and the second link is used to send the clock back to the sender for synchronisation purposes, making the DAC effectively the master and thus making jitter on the line a non-issue.
Synchronizing the source and receiver frequencies will not completely solve the jitter problem. In my opinion, the most rational solution is to use ASRC and to reduce the amount of jitter both on the source side and on the transmission line. Of course, the receiver must have a low-jitter clock generator. In my opinion, as soon as ASRC and low-jitter clock generators appeared in DACs, jitter ceased to be a problem worth paying attention to.
 

TNT

Member
Joined 2003
Paid Member
Well it does, as you don't have to extract a clock anymore - the only problem with jitter is the clock; the jitter impact on the data is zero, zilch, nada. ASRC means recalculation of the incoming stream, so no "bit correct" reception anymore...

//
 
Well it does, as you don't have to extract a clock anymore - the only problem with jitter is the clock; the jitter impact on the data is zero, zilch, nada
As I understand it, this is only true if you have no jitter in the source and receiver clocks and no jitter on the data line. And as I understand it, in this configuration all three jitter contributions will add up.
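If they do combine, my understanding is that uncorrelated jitter sources add in RMS (root-sum-of-squares) rather than linearly, so the largest contributor dominates. A quick check with assumed numbers:

```python
import math

# Independent (uncorrelated) jitter contributions add in quadrature.
source_clock_ps = 50.0    # assumed RMS values, in picoseconds
line_ps = 200.0
receiver_clock_ps = 20.0

total = math.sqrt(source_clock_ps**2 + line_ps**2 + receiver_clock_ps**2)
print(f"total: {total:.0f} ps RMS")  # ~207 ps, dominated by the line term
```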

ASRC means recalculation of the incoming stream, so no "bit correct" reception anymore...
I think it's worth asking the question: how important is bit perfection in a 24-bit data stream if the DAC itself, at the output of its analog section, has an ENOB of only ~18-20 bits?
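For context, ENOB is conventionally tied to measured SINAD by SINAD = 6.02·ENOB + 1.76 dB. A quick calculation of what 18-24 bits would require:

```python
def sinad_from_enob(enob_bits: float) -> float:
    # Standard relation: SINAD = 6.02 * ENOB + 1.76 dB
    return 6.02 * enob_bits + 1.76

for bits in (18, 20, 24):
    print(f"{bits} bits -> SINAD ~ {sinad_from_enob(bits):.0f} dB")
# 18 bits -> ~110 dB, 20 -> ~122 dB, 24 -> ~146 dB:
# errors down at the 24th bit sit ~25-35 dB below an 18-20 bit noise floor.
```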
 

TNT

Member
Joined 2003
Paid Member
As I understand it, this is only true if you have no jitter in the source and receiver clocks and no jitter on the data line. And as I understand it, in this configuration all three jitter contributions will add up.
I'm afraid this is not a correct understanding. Also, if you have the clock in the receiver as well as a link for backwards synchronisation, a very rudimentary re-clocker will get rid of all incoming data signal jitter. But it's not really needed, as on the reception side everything is now clocked by the local clock, so you're good.

But still - jitter is a clock problem, not a data problem. I hope you agree on this. In s/pdif, the clock and data are intertwined - this is the problem. But after the data is extracted from the s/pdif signal and you have a local, stable clock - what happened on the link is forgotten and not a problem anymore...

Re: ENOB - do you think ENOB would be affected if you have a jittery incoming clock extracted from the s/pdif and used as the clock for conversion? I mean the usual good ol' (bad!) way... instead of fancy backwards-sent sync clocks (à la Linn) or the latest ASRC?
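There is a standard back-of-envelope for exactly that: a full-scale sine at frequency f sampled with RMS clock jitter σ has an SNR ceiling of -20·log10(2π·f·σ). Plugging in a jittery recovered clock versus a good local oscillator (both σ values assumed for illustration):

```python
import math

def jitter_limited_snr_db(f_hz: float, jitter_s: float) -> float:
    # SNR ceiling set by sampling-clock jitter for a full-scale sine.
    return -20 * math.log10(2 * math.pi * f_hz * jitter_s)

def enob(snr_db: float) -> float:
    return (snr_db - 1.76) / 6.02

f = 20e3  # worst-case audio frequency
for label, sigma in (("recovered s/pdif clock, 1 ns", 1e-9),
                     ("good local XO, 1 ps", 1e-12)):
    snr = jitter_limited_snr_db(f, sigma)
    print(f"{label}: {snr:.0f} dB -> ENOB ceiling ~ {enob(snr):.1f} bits")
```

A nanosecond of conversion-clock jitter caps you around 13 effective bits at 20 kHz, so yes - ENOB is very much affected.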

//
 
I'm afraid this is not a correct understanding.
It is quite possible that I am mistaken. I'm not an expert in this area.
instead of fancy backwards-sent sync clocks (à la Linn) or the latest ASRC?
If I am offered a choice between a DAC that requires synchronization with the source and one that does not, and both have an ENOB of 18 bits, then I will choose the DAC that does not require synchronization with the source.
For me, ENOB matters more than a bit perfection that is visible only in the digital part of the DAC but not at the output of its analog section. But this is purely my preference; I am quite sure there are people for whom bit perfection is more important than ENOB.
Synchronization complicates the data line, and I want to avoid that complication if the DAC's output parameters remain the same.
 
But still - jitter is a clock problem, not a data problem. I hope you agree on this. In s/pdif, the clock and data are intertwined - this is the problem. But after the data is extracted from the s/pdif signal and you have a local, stable clock - what happened on the link is forgotten and not a problem anymore...
That holds when the input and output frequencies are equal, that is, when no conversion between frequency grids is needed. Isn't that conversion what ASRC does?
As I understand it, it is the principle of asynchronous data reception itself that suppresses jitter in an ASRC, not the frequency conversion, which is the part some people complain about.
 
Member
Joined 2004
Paid Member
transceivers would have improved if adoption had been better.
Transceivers have improved. The ones that are reasonably priced at Digikey are made by Everlight, and according to the datasheet, they are tested at 25MHz using "standard plastic optic fiber cable". That's a lot higher than the 6Mb/s of the earlier Toshiba devices.
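For scale, one can work out what S/PDIF actually demands of the link: 64 bits per stereo frame, biphase-mark coded, so the signal's fundamental can reach the bit rate itself. A rough calculation:

```python
# S/PDIF line rate: 64 bits per stereo frame, biphase-mark coded.
# An all-ones pattern toggles twice per bit cell, so the signal's
# fundamental frequency can reach the bit rate itself.
for fs_khz in (44.1, 48, 96, 192):
    bit_rate_mbps = fs_khz * 1e3 * 64 / 1e6
    print(f"{fs_khz:g} kHz audio -> {bit_rate_mbps:.2f} Mb/s on the wire "
          f"(fundamental up to ~{bit_rate_mbps:.1f} MHz)")
# The old ~6 Mb/s parts top out just below 96 kHz; 25 MHz has real headroom.
```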

I've always thought of USB as a "force-fit" for audio, and I doubt that it will continue to survive in the future. The best way to send audio, IMHO, is HDMI. Most HDMI receiver chips extract the digital audio and output it using SPDIF formatting. If that signal isn't available in your system, you can buy a 4K HDMI audio extractor for less than $35. The audio extractor has a TOSLINK output, and as a bonus it can drive a headphone from the analog output.

As others have pointed out, the optical cables eliminate ground loops, and that feature solves a lot of connection problems.
 
I've always thought of USB as a "force-fit" for audio, and I doubt that it will continue to survive in the future. The best way to send audio, IMHO, is HDMI.
IMO HDMI is technically very similar to USB adaptive mode: data is sent in packets, the audio clock needs to be recovered from the incoming stream, and the transmitter is the clock master. There is no feedback channel as in USB async. A capture channel must be provided by a completely different technology (IIUC ARC is basically SPDIF: https://e2e.ti.com/support/audio-gr...9211-hdmi-arc-cable-direct-connection-pin-map ). IMO HDMI is a much bigger compromise than proper USB async for audio.
 
Member
Joined 2004
Paid Member
OK, if you are sending the audio to a DAC, there are benefits to USB async. But I believe the evolving model is to keep the audio in digital form and convert to analog later in the audio chain, preferably in the power amplifier. Sources that are masters get resampled by an ASRC. I think WiFi audio and HDMI with SPDIF audio are going to be the standards that survive in the long run, although USB DACs may still be around for boutique or "high-end" audio.