Geddes on Distortion perception

I have experienced everything from large differences, quite humiliating for an exotic power cable, to small differences that are almost unnoticeable. But the problem is always: what measurement metrics do you use to show the difference? This is critical, because if one wants to say the power supply is the crappy part, you have to show data to prove it to the supplier, and in a way that correlates with the different power cables. I once showed data on capacitors to prove a supplier was not delivering what he was supposed to, and they admitted it.
Oh, and on a recent visit to a new studio close by, the owner also used very specific power cables for the equipment, as well as a separate 100 amp line dedicated to the control room.

Another thing to keep in mind is that with cables, as with any man-made artifacts, one would need to run a statistically valid test and report a statistically qualified result to arrive at something that makes scientific sense.

Let's say, for a start, testing the RLC parameters of 100 exotic cables (same model) against 100 of the cheapest Monoprice ones (same model), before and after typical use. I've never seen the results of such research published. Have you?
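
To make "statistically qualified result" concrete, here is a minimal sketch (Python with NumPy/SciPy) of how one of those comparisons might be run and reported. The numbers are placeholders, not real measurements, and series resistance is just one of the R, L, and C parameters one would test this way:

Code:
import numpy as np
from scipy import stats

# Placeholder data: series resistance in milliohms for 100 cables per group.
# In a real test these would be actual measurements, taken before and after use.
rng = np.random.default_rng(0)
exotic_r = rng.normal(loc=24.0, scale=1.5, size=100)
budget_r = rng.normal(loc=25.0, scale=0.8, size=100)

# Welch's t-test: does not assume the two groups have equal variance
t, p = stats.ttest_ind(exotic_r, budget_r, equal_var=False)

# A statistically qualified report gives the means, the spread, and the p-value
print(f"exotic: {exotic_r.mean():.2f} +/- {exotic_r.std(ddof=1):.2f} mOhm")
print(f"budget: {budget_r.mean():.2f} +/- {budget_r.std(ddof=1):.2f} mOhm")
print(f"Welch t = {t:.2f}, p = {p:.4f}")

# Count specimens more than one standard deviation from their group mean,
# which speaks to the outlier question discussed below.
out = np.abs(exotic_r - exotic_r.mean()) > exotic_r.std(ddof=1)
print(f"exotic cables outside +/-1 SD: {out.sum()} of {len(exotic_r)}")

The same comparison would then be repeated for inductance and capacitance, and for the before- and after-use measurements.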

I would expect the exotic ones, on average, to perhaps test a tiny bit closer to straight zeros on the R, L, and C characteristics.

But the outliers could be non-negligible. Specific specimens of exotic or cheap cable could measure well outside the one-standard-deviation range. This could be due to deliberate and unintended production process variations in the exotic cables, insufficient quality control, or damage during shipping and/or use of the cables.

Why do I emphasize the "deliberate and unintended production process variations"? Because exotic cables are often low-run and hand-made, with unspecified quality control, and as such their parameters can vary much more significantly than those of mass-produced ones.

If the interaction between a formally out-of-spec exotic cable and a typically sophisticated, uniquely tuned audiophile system turns out to be subjectively different, it immediately constitutes "proof" for the cable buyer.

Confirmation bias may have more or less influence on specific people, and could even be the dominant factor in the vast majority of cases, yet we can't rule out that there could be an objective basis for some of the "improvement" claims.

An interesting corollary to the above is that one might expect about as high a probability of "improvement" from an old cable that has served under harsh conditions. "Junkyard cables" may become a hot new trend, just like artificially worn-out jeans :)

As to cable damage during transportation and use, this is a well-known factor budgeted for by live sound stage engineers. They usually don't care much about how a specific model of microphone cable sounds - cables don't sound noticeably different until you get to lengths of half a mile.

Good stage engineers care a lot about a cable's interference rejection and its robustness against physical abuse and the weather. Also, they are absolutely religious about the way they coil their cables and obsessive about protecting them during transportation.
 
I haven't read the whole thread, only the first post. Just curious - surely there comes a point when low order distortion becomes intrusive?

Of course, 100% 3rd-order is going to be noticeable. But in a test of sound quality in compression drivers, THD numbers of 20-25% were deemed not to be significant factors.

What makes a system "transparent", with very good soundstage and imaging, while not being bright and/or fatiguing?

"transparent" and "sound stage" are not well enough defined terms to be answered. "Imaging" can be well defined if we stick to the standard text definition. It requires a large free time of the speaker signals to the listener, i.e. eliminate the earliest reflections as much as possible through speaker Q and room control. I would interpret "transparent" to be free of distortions, but these turn out to be the linear aspects rather than the nonlinear. Solve these problems and fatigue will go away.
 
It’s been a decade and a lot has changed in the marketplace since the OP stated that “THD and IMD are meaningless measurements of distortion as far as perception is concerned.”

Does all this explain why the world has moved to 3” Bluetooth-enabled speakers and laptop speakers as being good enough, while grey-haired audio enthusiasts/audiophiles are clinging to, or still buying, speakers that range from the size of shoeboxes (at the very minimum) all the way up to refrigerators?

Not trying to be facetious here, but to me nonlinear distortion is related to SPL, and thus to purpose.

Do you want something that tries to emulate the sound of a jazz club, a bar, a nightclub, or a concert hall?

Or do you just want to listen to your favourite tunes in the background whilst cooking dinner?

So if you’re playing at 60 dB in your bedroom, any 3” full-range driver is good enough from 100 Hz to 10 kHz, where most of the music can be captured (“enough bass, mids, and treble”) and the THD is below the noise floor of your room.

Now take that portable speaker outdoors and turn it all the way up, and it sounds “harsh, too loud, turn it down!”

Well, what’s happening is that HD3 is skyrocketing up to 10-100%, and all the higher-order harmonics from H5 to H9 have gone up too, well beyond -30 dB.
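
Since the thread keeps switching between percentages and dB, here is a tiny sketch (Python; the 60 dB playback level and the 1% THD figure are just illustrative assumptions) relating the two, and showing why low-level THD can genuinely vanish under a room's noise floor:

Code:
import math

def pct_to_db(percent):
    """Level of a harmonic, given as % of the fundamental, in dB re: the fundamental."""
    return 20.0 * math.log10(percent / 100.0)

for pct in (0.1, 1.0, 10.0, 30.0, 100.0):
    print(f"{pct:6.1f}% -> {pct_to_db(pct):6.1f} dB")
# 0.1% = -60 dB, 1% = -40 dB, 10% = -20 dB, 30% ~ -10.5 dB, 100% = 0 dB

# At a quiet 60 dB SPL playback level, 1% THD puts the harmonics near
# 60 + (-40) = 20 dB SPL, below the roughly 25-35 dB(A) floor of a quiet room.
print(f"harmonics at ~{60.0 + pct_to_db(1.0):.0f} dB SPL")

At the outdoor, turned-all-the-way-up extreme, the 10-100% figures above put the harmonics anywhere from only 20 dB below the fundamental to level with it, which is exactly when "turn it down!" gets shouted.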

To say “Pursuing a loudspeaker design to lower the distortion is a waste of time if it's nonlinear distortion that you are trying to lower. It simply doesn't matter.” is a blunt instrument, because it implies you don’t have to care whether you are playing at 60 dB @ 1 m or at 96 dB at 10 m - but you do need to care about playing loudly and cleanly when hitting 96 dB at 10 m.

I’ve yet to see double-blind tests where subjects are allowed to play with the volume control freely (this is needed for external validity), to see if and when they prefer the JBL M2, the Geddes Summa, or the diminutive JBL Charge 3…
 
You obviously have not read everything herein, because I've said exactly what you are saying. There is a limit to any system, and we should know where that is, but alas that is not common. I have always intended to mean that nonlinearity means nothing - until it means everything, like clipping. But even then I have shown that if you clip "softly", it is much less offensive.
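
To illustrate the hard-versus-soft distinction, here is a small illustrative sketch (Python/NumPy; the tanh limiter is just one common soft-clip shape, chosen here purely for illustration) comparing the harmonic spectra of the two:

Code:
import numpy as np

def hard_clip(x, limit=1.0):
    # Abrupt ceiling: anything over the limit is flattened
    return np.clip(x, -limit, limit)

def soft_clip(x, limit=1.0):
    # tanh-style limiter: gradual compression instead of an abrupt ceiling
    return limit * np.tanh(x / limit)

fs = 48000
t = np.arange(fs) / fs
x = 2.0 * np.sin(2 * np.pi * 1000 * t)   # 1 kHz tone driven 6 dB past the ceiling

for name, clipper in (("hard", hard_clip), ("soft", soft_clip)):
    spec = np.abs(np.fft.rfft(clipper(x))) / len(x)
    fund = spec[1000]                     # the tone is bin-exact, so bin k is k Hz
    for h in (3, 5, 7, 9):
        print(f"{name} clip  H{h}: {20 * np.log10(spec[1000 * h] / fund):6.1f} dB")

The higher-order terms fall off much faster for the soft limiter, which is the usual explanation for why it is less offensive.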

It's absurd to think that any system is without limitations. Stay below that limit and I stand by my claims. And I've always said that we can make speakers whose nonlinearity is inaudible; I have never said that this is true of all speakers.

Let me also add that I have shown that the audibility of linear effects such as group delay and resonances is not independent of level. In my experience, in a "good" system it is the linear effects at high SPL that are the more likely contributors to low preference, not the nonlinear ones.
 
…to play with the volume control freely (this is needed for external validity) and see if and when they prefer the JBL M2, the Geddes Summa, or the diminutive JBL Charge 3…
I don't know that it needs repeating, but I also find that it is the linear effects that cause us to be careful about where we put the volume control. A speaker can effectively be free of that.
 
OK, it's easy to find current woofers, mids, CDs, and domes with THD down more than 40 dB. I use Acoustic Elegance, Faital, Eighteen Sound, B&C, RCF, Wavecore, and JBL devices of all sorts in my home hi-fi designs. My designs all have reasonably smooth frequency and phase responses and sub -40 dB THD even at or below 60 Hz. As a builder, I mostly have to make good device choices and classic design and construction choices to get low THD.

What else should I/we be looking for when choosing a device?

What design considerations should we be targeting when putting those devices to their best use?

I tend to gravitate toward woofers and mids with low Le, larger Vas, and well-behaved cone breakup, but not too dead a cone for midbass/midrange.

I like devices I can take out of the packaging, calculate a quick-and-dirty crossover for, load them, hook them up, and like the sound for a few weeks until I can sort out a proper crossover. Using REW with $100 worth of mic and USB DAC, I can get good frequency and phase response in anywhere from 30 minutes to over 30 hours of an iterative process of calculation, measurement, and listening. Then enjoy the system for years.

Recently, I moved away from traditional 12" and 15" two-ways, crossed from 500-1600 Hz to quality CD/horn combos, to another traditional three-way with a 6.5" mid and a dome. The two-ways measure better in every way than the three-way, especially further into the room where directivity proves its value, and they use much more expensive parts. However, the two-ways sound like half of the music is missing. The three-way is much more detailed in the vocal range. Why is that?

 
OK, it's easy to find current woofers, mids, CDs, and domes with THD down more than 40 dB. I use Acoustic Elegance, Faital, Eighteen Sound, B&C, RCF, Wavecore, and JBL devices of all sorts in my home hi-fi designs. My designs all have reasonably smooth frequency and phase responses and sub -40 dB THD even at or below 60 Hz. As a builder, I mostly have to make good device choices and classic design and construction choices to get low THD.

What else should I/we be looking for when choosing a device?
What design considerations should we be targeting when putting those devices to their best use?
I don't pay a lot of attention to woofers in general. Below about 700 Hz is not a critical part of perception, maybe 40%. Of that the most important would be the upper octave where cone breakup happens; a poor woofer there can ruin a system - that aspect is about 20%, leaving a mere 20% importance for all the low frequency stuff like BL, etc. Displacement and power handling are critical, of course, but little else matters.

Why is that?

I can't account for anyone's individual tastes, but on average, people like Toole have shown what matters and why. To me, crossovers in the critical 700 Hz - 7 kHz region are to be avoided at all costs, and the system has to be constant directivity.
 
Hmm, OK, I'm starting to see your design process.

I mean, that severely limits your woofer choices - any 5.25" to 6.5" and most 8" drivers probably won't fit the bill.

Some 10", and many 12-15" and some 18" woofers that can be crossed over to a compression driver.

I can see now why distortion hardly matters... it's just not a factor when drivers of that size are playing at any reference listening level (85-105 dB)...

Most of the public don't want anything more than a Sonos One or a soundbar... Earl, it must be difficult to market your best product when "good enough" is enough for the listening public... I mean, I've got an OLED TV, but most people buy the biggest LCD they can afford...

I went and had an eye test at my optometrist because I was complaining about subtitles not being perfectly clear on my TV. The optometrist tested my vision, found it was 20/20, and said, "There's nothing wrong with your eyesight; you're probably a bit fussy."

But my wife gets angry when she finds out that I take the kids to Burger King.

Horses for courses...
 
I don't pay a lot of attention to woofers in general. Below about 700 Hz is not a critical part of perception, maybe 40%.

Hi,

I've never been able to understand the emphasis you give to the upper octaves.
I get that the ear is more sensitive in the upper octaves, etc.
Seems to me that vocals and instruments lie more in the bottom five octaves than in the top.

Of that the most important would be the upper octave where cone breakup happens, a poor woofer there can ruin a system - that aspect is about 20%,

OK, but that assumes a woofer is used up to 700 Hz...
That's been a no-go to my ears.

leaving a mere 20% importance for all the low frequency stuff like BL, etc.

I don't remember Toole's specific weighting on sub importance, but memory has it way above 20%. I keep remembering the importance he placed on the bass...??

To me crossovers in the critical 700 - 7kHz region are to be avoided at all costs and the system has to be constant directivity.

Agreed, historically.
Although today, I've found that is not true when complementary linear-phase crossovers (FIR) are used.
IME, you can put the crossovers anywhere in that range, and even make them crazy steep, with the effect on sound quality being confined solely to changes in the directivity curve... without other sonic penalty.
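
As a concrete example of what I mean by a complementary linear-phase pair, here is a minimal sketch (Python/SciPy; the 1.2 kHz crossover point and the tap count are arbitrary examples, and a real implementation would add driver equalization on top of this):

Code:
import numpy as np
from scipy.signal import firwin, freqz

fs = 48000     # sample rate, Hz
ntaps = 511    # odd length -> integer group delay of (ntaps - 1) / 2 samples
fc = 1200.0    # example crossover frequency, Hz

lp = firwin(ntaps, fc, fs=fs)     # linear-phase FIR lowpass
delta = np.zeros(ntaps)
delta[(ntaps - 1) // 2] = 1.0     # unit impulse delayed by the filter's group delay
hp = delta - lp                   # complementary highpass: lp + hp is a pure delay

# The summed response has flat magnitude and pure-delay phase no matter how
# steep the individual slopes are; only the drivers' directivity overlap changes.
w, H = freqz(lp + hp, worN=4096, fs=fs)
print(np.max(np.abs(np.abs(H) - 1.0)))   # ~0, down at numerical precision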

I know my post is just a list of disagreements.
Sorry, especially given your ongoing gracious contributions... I just feel the need to disagree. :)
 
I ended up in the group with approach 2 here:
Design-Criteria

I find a 3-1/2" upper mid beneficial, and I try to keep crossovers out of the 1.2-4 kHz range if possible. Two-ways may measure well but still do not do it for me (as someone said above, some of the music seems "missing").

Speaker pairing with amps also comes into play. On the Pass forum, amp circuits with 2nd- and 3rd-order distortion are used to get different flavors, while the higher orders taper off or are just kept low.
 
I don't remember Toole's specific weighting on sub importance, but memory has it way above 20%. I keep remembering the importance he placed on the bass...??
From the first page of Chapter 8 in the 2nd edition of Sound Reproduction:

"In assessing the factors contributing to subjective judgments of sound
quality, discussed in Section 5.7, it was shown that about 30% of the overall
rating is contributed by factors related to low-frequency performance
(Olive, 2004a, 2004b)."