FFT: am I doing this right?


So the setup, in no particular order:
  • DAC is silent, scope is on the left channel, DAC is connected to a 10 kΩ line in.
  • "Default" the scope
  • Auto tune
  • Set time division really wide to allow analysis of lower freqs (1 s per division, 14 Mpts in my case)
  • Math->FFT
  • Exclusive screen
  • Units dBm
  • Horizontal reference and Hz per division set to show 0-22 kHz (35 kHz actually).
  • Label peaks, show table, show frequency.
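As a sanity check on the timebase choice: the FFT bin width is just one over the capture time, so a wide timebase is what buys low-frequency resolution. A rough sketch of the arithmetic (the 1 s/div and 14-division screen width are my assumptions, not values read off the scope):

```python
# Sketch: FFT bin width from capture settings (timebase values assumed).
n_points = 14_000_000                     # 14 Mpts record length
capture_time_s = 14.0                     # e.g. 1 s/div across 14 divisions (assumed)
sample_rate = n_points / capture_time_s   # ~1 MSa/s effective
bin_width_hz = sample_rate / n_points     # == 1 / capture_time_s
print(f"{bin_width_hz:.3f} Hz per FFT bin")   # ~0.071 Hz: plenty to resolve 50 Hz
```

With bins that narrow, 50 Hz mains should land in its own bin well clear of the DC bin.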

I don't see any 50 Hz, unless that unlabelled peak is it? However, it's more likely to be FFT bin 0, a.k.a. DC.

Not shown is the rest of the spectrum, which has two very obvious spikes, way up at -20 dBm, at ~3 MHz and exactly 24.576 MHz. No surprises where they come from.

Odd thing is, this is post "filter". It is a "filterless" DAC in that it requires no output filter, but it isn't doing a very good job of cleaning all the MHz crap out of there. Not that I will hear it; it just doesn't need to be there, and it might upset things downstream. Besides, routing 24.576 MHz around on audio cables won't do your EMI environment much good.
 

Attachments

  • SDS00006.png (13 KB)
If I am reading it right (and it should really have been in peak-hold mode), the noise floor is about what the DAC chip claims, ~100 dB, but there are quite a lot of low-level noise spikes.

I should maybe have captured a "dead rail" noise trace and done an X-Y plot first. I don't know if that noise is from the DAC or the neighbour's cat.
 
The first question is "what are you trying to measure?". (The noise floor and audio-band harmonics of a DAC is my guess.) The second question would be: "do you know the resolution of the ADC in the scope?". If you are using a 16-bit ADC, you won't be able to resolve the full dynamic range of a (nominally) 24-bit DAC, and the noise floor will be from the ADC, not the DAC. Third question: "what signal were you feeding to the DAC in the first place?". Fourth: "why are you sampling at 59 MHz?" I would have thought 96 or 192 kHz would be enough to try to see the noise performance of a DAC. HTH
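For what it's worth, the usual ideal-converter rule of thumb makes the resolution gap concrete. This is the generic SNR ≈ 6.02·N + 1.76 dB formula for an N-bit quantiser, not anything measured in this thread:

```python
# Ideal quantiser dynamic range: SNR ≈ 6.02*N + 1.76 dB (rule of thumb).
def ideal_snr_db(bits: int) -> float:
    return 6.02 * bits + 1.76

for bits in (8, 12, 16, 24):
    print(f"{bits:2d} bit -> {ideal_snr_db(bits):5.1f} dB")
# 8 bit gives ~49.9 dB; fully resolving a 24-bit DAC needs ~146 dB.
```

So even a perfect 16-bit front end falls ~48 dB short of a nominal 24-bit converter's range.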
 
The scope, a Siglent SDS1104X-E, is 12 bit, 1 GSa/s, 100 MHz bandwidth, but with multi-staged inputs. So it's currently looking at only the highest (smallest) resolution of the noise. The entire 12-bit range only covers about 250 mV, or about 1/8th of the output range. You could argue that makes it still only 16 bit. But I'm not measuring its "output", just the noise floor.

I'm not sampling at 59 MHz; I believe that is the natural frequency of the zero-cross trigger given the input wave. I'm sampling closer to 100 MHz. I have limited the FFT view on the scope; it extends way out to 50 MHz and beyond. The lowest bandwidth limit I have is 20 MHz (for characterising noise on electronics power supplies!)

I did actually test the noise without the DAC being powered and... the noise mostly disappears, but then what remains will just be the clock noise. I didn't FFT it.

The total noise is 120 mVpp (a lot!), and both the bit clock and master clock are there as expected. The full FFT has two massive spikes up there. Even from eyeballing it, you can see distinct clocks interacting. Those clocks put a load of drain on the digital power rails and thus cause sympathetic noise everywhere. It's well, well away from the audio band, so in terms of audio alone I don't much care. There are other considerations around those noise spikes I will need to work on.

I wanted to know what noise it will be outputting without any tweaking or tuning, in the worst-case scenario of it sitting on a breadboard with jumper wires.

There is no ADC. It's optical in (24 bit @ 192 kHz), I2S across jumper wires, DAC out. 3.3 V supply rail.

If I am reading it right there are a couple of undesirables in the band, but with the highest at -59 dBm, I can live with that. Even if I gain it ×12, the noise still won't be a consideration to my old ears.
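Converting that -59 dBm peak to a voltage depends on the reference impedance the scope assumes for dBm (commonly 50 Ω; audio kit sometimes uses 600 Ω). A sketch of the conversion, with the 50 Ω reference being my assumption:

```python
import math

# dBm -> Vrms, assuming the scope references dBm to 50 ohm (an assumption).
def dbm_to_vrms(dbm: float, r_ohms: float = 50.0) -> float:
    p_watts = 1e-3 * 10 ** (dbm / 10)     # dBm is dB relative to 1 mW
    return math.sqrt(p_watts * r_ohms)    # P = V^2 / R

print(f"{dbm_to_vrms(-59) * 1e6:.0f} uV rms")   # roughly 250 uV into 50 ohm
```

A few hundred µV rms on a line-level output would indeed be inaudible after any sane gain staging.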

As to characterising the output, that is where the debate around bit resolution might come into play. While I think that might be fun, I don't have the equipment, the knowledge or the will to do such a test justice. The most I am going to do is run a few Bode plots with a signal gen (a digital one) and check I'm not accidentally lopping any holes out of my band with a dodgily specced cap or resistor.
 
Yes, the scope has a vertical resolution of 8 bits. This will limit the depth to which you can measure noise. These scopes are built for speed, not depth, and as such are not ideal for audio forensics. It's probable that the built-in ADC in your laptop has better resolution and can measure lower noise than this scope. A normal cheap sound card (100 €) and the free software REW will outperform it easily in terms of noise and distortion, though limited to perhaps 176 kHz observable bandwidth (fs = 384 ksps).

//
 
The reason I FFT'd the noise floor on this prototype is not because of what is there, but because of what is not.

You see, normally if I attach a scope to one of my projects, all I see is a 50 Hz wave with a wall of noise modulated on it, and somewhere, barely discernible in all that noise, is some music. Music modulated on the 50 Hz mains, then the noise from 100 kHz up through tens of MHz modulated on top of that.

However, the moment I hooked this setup to the scope I got a flat noise floor, and when I played music and put the scope into "Roll" mode I could actually see proper audio waveforms. I was quite surprised.

The only thing I can put this down to is that it's optical in, and there is no USB power or anything else involved. So I'm not sharing my grounds with the PCs.

Not audiophiling, just tinkering.
 
Interestingly, I rearranged some cables such that the breadboard and the headphone amp are running off battery.

The scope on maximum gain with the 20 MHz bandwidth limit will not trigger on the noise, it's that low. Less than 40 mV peak to peak, and well over 20 MHz.

When the beer has left me tomorrow I might try and run a set of FFTs with varying conditions with controls.

By the way, there is no way in hell you are seeing the supposed "step" of 16 bit or 24 bit on the output, even at 100 MHz.
 
Can I clarify that you are feeding the signal-free analogue output of a DAC into one of these: https://siglent.co.uk/product/siglent-sds1104x-e-super-phosphor-oscilloscope/ ? There is an ADC in the scope, which has a nominal 8-bit resolution. While it will be possible to look at the DAC's noise floor, be very careful to keep the DAC output level within the dynamic range of the scope, otherwise all the artefacts that you see will be harmonics caused by clipping. Would this publication be worth a read? https://www.analog.com/en/education/education-library/mixed_signal_dsp_design_book.html
 
Yes. (I thought it was 10 bit, but you are right.) The 8 bits are enough when the front-end gain is set for about 20 mV per division, certainly for the noise floor. For an actual signal, yes, that clips, generating a lot of high-frequency garbage. It works at around 200 mV per division, though of course it doesn't have the same precision across that range with 8 bits.

It's interesting, but as I don't need the high bandwidth (in fact it's undesirable), I can use the probe in 1x mode rather than 10x, so I gain a lot more sensitivity and lose a lot of bandwidth I don't need.

Anyone who says you can't process 24-bit resolution with an 8-bit window never wrote a line of C code for an 8-bit micro, processed 24-bit addresses on a 16-bit architecture, etc. etc.

On sound cards: the front-end hardware you would require to accurately, precisely and repeatably resolve the LSB of 16 bits or more is not going to come in a laptop, or for under £10k. Scopes may be 8, 10 or 12 bit, but they are extremely accurate, high-bandwidth 8 bits. They probably have enough bandwidth to supersample the audio band 100 times over at different offsets and produce a higher-quality 32-bit output than a sound card ever could. Why would you, though?
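The "supersample at different offsets" idea is essentially oversampling plus averaging: with enough noise acting as dither, averaging many coarse samples can resolve well below one LSB (roughly half a bit gained per doubling of the sample count). A toy sketch of the effect, not a claim about any particular scope:

```python
import random

# Toy sketch: with ~1 LSB of noise acting as dither, averaging many 8-bit
# readings of a fixed level resolves well below one LSB.
random.seed(1)
true_value = 100.37            # level in LSB units, deliberately between codes
noise_lsb = 1.0                # rms noise, enough to hop across code boundaries
n = 100_000
samples = [round(true_value + random.gauss(0.0, noise_lsb)) for _ in range(n)]
estimate = sum(samples) / n
print(f"averaged estimate: {estimate:.3f} LSB (true value {true_value})")
```

The catch, of course, is that this only works for signals that are slow compared to the averaging window, and only if the noise really does straddle code boundaries.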

Yes, the ICs have those stats, but just as cheap Chinese scopes are fit for nothing despite what they advertise, most consumer audio gear comes nowhere near the ideal characteristics needed to achieve the digital accuracy claimed by the IC stats. And professional audio gear with calibration and certification costs hundreds of grand.

Put it this way: if sound cards in laptops actually had 16.8 million voltage levels, approximately 60 nV at line level, a lot of precision instrumentation companies like Fluke would be out of business. I don't think I've seen proof of concept there. If your sound card actually has the full dynamic range of 24 bits, I'd be very surprised. It might output a 24-bit word, but I expect a bit of statistical analysis would find groupings with a much lower effective bit depth.

It would be very easy to test.

Take some text, say a forum message, encode it as 24-bit binary and send it through your sound card's DAC (it'll sound lovely), then back through its ADC, and display the text.

If it produces anything but absolute garbage I would be incredibly surprised. If it can't round-trip 24-bit raw binary via its analogue side, it's not 24-bit accurate. Precise, maybe; accurate, no.
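For the curious, the framing half of that round-trip test is trivial; it's the analogue loopback through real converters that would do the damage. A sketch of the packing only (names are mine, and nothing here touches audio hardware):

```python
# Sketch of the proposed test's framing: pack arbitrary bytes into 24-bit PCM
# words and unpack them again. The analogue DAC-out -> ADC-in loop is the part
# that would actually stress the converters; this only shows the encoding.
def encode_24bit(data: bytes) -> list[int]:
    padded = data + b"\x00" * (-len(data) % 3)   # pad to 3-byte boundary
    return [int.from_bytes(padded[i:i + 3], "big")
            for i in range(0, len(padded), 3)]

def decode_24bit(samples: list[int]) -> bytes:
    return b"".join(s.to_bytes(3, "big") for s in samples)

msg = b"forum message"
assert decode_24bit(encode_24bit(msg)).rstrip(b"\x00") == msg
```

In the digital domain this round-trips perfectly; whether it survives the analogue leg is exactly the point of contention.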
 
Yes, but in audio you need resolution, not bandwidth; that is what you miss when you bring up Fluke etc. Find a box that does 16 bit / 1 GHz... Nope!
Do you have a full grip on PCM audio? There is nothing bit-transparent about a complete ADC-DAC chain... "supersample the audio band 100 times over at different offsets"... say what? 16 bit means 96 dB SNR in a 22 kHz bandwidth: easy for a modern design...

//
 
What you measure is certainly not DAC output noise. An oscilloscope is not capable of that. If your DAC has a -100 dB noise level with 2 Vrms nominal output, that would be 20 µV total noise in the 20 Hz – 20 kHz bandwidth.

The actual noise floor at any specific audio frequency is √19980 ≈ 141 times lower, or about 142 nV/√Hz. Only 3500 times less than the best one-division sensitivity of your oscilloscope. Hmm. 🙂
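Those figures check out arithmetically (assuming a flat noise density across the band):

```python
import math

# Checking the numbers: -100 dB below 2 Vrms, spread over 20 Hz - 20 kHz.
v_out = 2.0                               # Vrms nominal output
total_noise = v_out * 10 ** (-100 / 20)   # 20 uV integrated noise
bw = 20_000 - 20                          # ~19.98 kHz noise bandwidth
density = total_noise / math.sqrt(bw)     # ~141 nV/sqrt(Hz), flat-noise assumption
print(f"{total_noise * 1e6:.1f} uV total, {density * 1e9:.0f} nV/sqrt(Hz)")
```

Against a best-case 500 µV/div scope setting, that density is indeed thousands of times below what the front end can see.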
An 8-bit ADC is 8-bit, and it can't measure with higher resolution using software workarounds.
 
An 8-bit ADC is 8-bit, and it can't measure with higher resolution using software workarounds.

You, sir, are obviously not a software engineer. That is categorically incorrect.

I have a tape measure. It's 5 metres long. Its accuracy and precision are 1 mm. What you are saying is that I can't use that tape measure to measure 100 metres of road to an accuracy of 1 mm.

"Software workaround".

No. Please go and look up what an analogue front end to an ADC in a scope actually does.

Again, I challenge you to use your DAC and ADC as a 24-bit line encoder. If it can't do that, it's not 24-bit accurate.
 
You wouldn't know, as you can only measure 8 bits at each point in time.

It seems you are saying (but it must be a language problem) that if you take a binary file of a picture (the Mona Lisa, TIFF, 16-bit colour), feed that to a DAC, take the analogue signal out from said DAC, feed that to an ADC, and then get the Mona Lisa back as a .tiff file... you need to check your sources again.

//
 
The front-end hardware you would require to accurately, precisely and repeatably resolve the LSB of 16 bits or more is not going to come in a laptop, or for under £10k.
I challenge you to use your DAC and ADC as a 24-bit line encoder. If it can't do that, it's not 24-bit accurate.
First it was 16 bits, now 24 bits. While not 24 bits, there are many DACs/ADCs capable of resolving 20 bits. Much better at measuring a DAC noise floor than a scope.
 
You, sir, are obviously not a software engineer. That is categorically incorrect.
But, but... I am. 🤣
You didn't actually comment on the expected noise voltage levels vs. the measured values.

The example with the tape measure is not appropriate. Yes, you can process 24-bit data with an 8-bit processor, but that analogy doesn't apply here either. The problem is the hardware that performs the digitisation of the analogue value, not the processor or software behind it. And that hardware has only 256 discrete steps. No software voodoo can change that. Explanation: https://www.electronics-tutorials.ws/combination/analogue-to-digital-converter.html
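The 256-step limit is easy to demonstrate for a single capture: quantising a full-scale sine to 8 bits pins the SNR near the ideal 6.02·8 + 1.76 ≈ 49.9 dB, no matter what software runs afterwards. A toy simulation, not a measurement:

```python
import math

# A full-scale sine through an ideal 8-bit quantiser: the SNR of a single
# capture sits near the ~49.9 dB theoretical limit for 8 bits.
n, bits = 100_000, 8
full_scale = 2 ** (bits - 1) - 1          # 127 counts
sig = [full_scale * math.sin(2 * math.pi * 17 * i / n) for i in range(n)]
quant = [round(s) for s in sig]           # the 256-step hardware quantiser
sig_power = sum(s * s for s in sig) / n
err_power = sum((s - q) ** 2 for s, q in zip(sig, quant)) / n
snr_db = 10 * math.log10(sig_power / err_power)
print(f"SNR = {snr_db:.1f} dB")           # about 50 dB, regardless of software
```

Averaging across many captures with dither is a different story (the earlier oversampling argument), but for one pass through the converter, this is the ceiling.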

ADC.png