Compression drivers in undersized horns

Status
Not open for further replies.
1) 97 dB at 400 Hz would mean 109 dB at 600 Hz (12 dB higher in level), so one would think that should work out fine?

2) Regarding playing a 400 Hz tone and then a 1 kHz tone: is it intermodulation distortion you're aiming at here? Or something unique to horns?

1) No, the driver is dropping at 20 dB per octave below 600 Hz, and distortion is rising dramatically below 1000 Hz.
2) IM is not unique to horns, but crossed too low, the horn driver will have far more of it than the cone driver at the same frequencies.
Even-order harmonic distortion is fairly tolerable; IM just sounds lousy: "gargly", "blatty", "trashy"...
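One way to see why distortion climbs so quickly: below its loading range a compression driver behaves more like a direct radiator, and a piston's excursion for constant SPL scales as 1/f². A rough sketch of just that scaling (plain piston math, not a model of any particular driver):

```python
import math

# Excursion required for constant SPL from a piston scales as 1/f^2,
# i.e. +12 dB per octave of extra excursion demand as you go down in frequency.
for f in (1000, 600, 400, 300):
    extra_db = 40 * math.log10(1000 / f)   # demand relative to 1 kHz, in dB
    print(f"{f} Hz: +{extra_db:.1f} dB -> {10 ** (extra_db / 20):.1f}x the excursion")
```

At 400 Hz that is already roughly six times the excursion needed at 1 kHz for the same output, which is why a driver with ~0.5 mm of clearance gets into trouble fast.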

Art
 

Attachments

  • "Don't cross, Don't cross too low to me".png
Hi AllenB,

with all due respect, does your answer not replace one undefined term "undersized horn" with another "the right choice of horn"?

In other words, what are the characteristics of "the right choice of horn"? Coverage angle? Pattern control? Loading/radiation resistance?

Apparently, I am the only moron here, because everybody else happily participates in the debate.

Kindest regards,

M
 
diyAudio Moderator
mefistofelez, I agree with GM, but to go deeper: the assumption is that the cutoff of the shortened horn is the same as that of the full-sized horn which is the "best or most common fit" for the driver, and that the driver is still being used in that range.

Why do I assume this? Because it's usually the choice that is made and usually what is being talked about, even though it is not always a good choice. A little foreshortening, yes, but not too much. (And yes, the term cutoff applies differently to different horns, but interestingly that isn't discussed as often.)

I can only guess that some see it as practical, and some may assume that the narrow throat is the part that does the work, so they want to keep it, even though it is not that simple.

In any case, a thread asking what the differences are seems the logical step?
 
Hi GM, AllenB,

thank you for the answers. Regrettably, they do not make it clearer for me. "Ideal full-size horn" is another undefined term.

Similarly with "'best or most common fit' for the driver". I have seen horns for, e.g., the TAD 2001 ranging from conical to OS to radial, etc. So which one is the "best or most common"?

In any case, a thread asking what the differences are seems the logical step?

I was always taught that to ascertain difference(s), there must be a definition of the items to be compared and of the attribute or attributes whose difference is to be ascertained. Hence my inquiries.

In any event, please do not mind me, I will bow out of this thread.

Kindest regards,

M
 
diyAudio Moderator
I agree. The first thing I'd want to ask is: do you know what the full-sized horn gives you? Do you know what you need it to do? Did you choose it for these reasons, or was it copying what everyone else does? (Defining cutoff and explaining horns and waveguides isn't easy.)
 
In my opinion, the SPL response would be more or less the same even with no horn at all (the worst case), while with a horn fitted the same output is simply dispersed differently. Thus, what differs (and matters) is the stress on the compression driver's diaphragm (and the THD, potential damage, etc.). If such issues are found to be absent (or negligible), then an undersized horn may be used, as long as the resulting pattern is acceptable to the listener.
 
Defo,

At 400 Hz, a wide-angle (90 degree) "waveguide" or "undersized horn" will provide very little gain; the output level would be little more than that of a similar-size cone driver with similar excursion.
Most HF compression drivers have little more than 0.5 mm of clearance between the diaphragm and phase plug; few have more than 0.8 mm.

The DE250 has a 1.7" (44 mm) diameter diaphragm; at 400 Hz it should do about 87 dB before the diaphragm hits the phase plug, while a 3" could do about 97 dB (peak) at one meter.
Polyimide diaphragms (like in the DE250) won't sound as bad as metal ones when they hit the phase plug, kind of like the difference between substituting a Frisbee for a cymbal ;).
If the drivers don't ever exceed that level, no problem.
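For anyone who wants to check numbers like these, the excursion-limited level can be estimated from plain piston math. The sketch below is a rough sanity check only: it assumes half-space radiation and treats the phase-plug clearance (0.5 mm assumed here) as the peak excursion limit, so it lands within a few dB of the figures above depending on those assumptions:

```python
import math

def displacement_limited_spl(diameter_m, x_peak_m, f_hz, r_m=1.0, rho=1.18):
    """SPL of a sine at peak excursion x_peak for a rigid piston
    radiating into half space (2*pi sr), measured at distance r_m."""
    area = math.pi * (diameter_m / 2) ** 2          # diaphragm area
    omega = 2 * math.pi * f_hz
    p_peak = rho * area * omega ** 2 * x_peak_m / (2 * math.pi * r_m)
    p_rms = p_peak / math.sqrt(2)
    return 20 * math.log10(p_rms / 20e-6)           # dB re 20 uPa

# 1.7" (44 mm) diaphragm, assumed 0.5 mm clearance, 400 Hz: roughly 90 dB
print(round(displacement_limited_spl(0.044, 0.5e-3, 400), 1))
```

Note the 40·log10(f) dependence: for the same excursion limit the achievable SPL rises 12 dB per octave, which is why a modest increase in crossover frequency buys so much headroom.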

Just for the experience, play a 400 Hz tone through the horn/driver at a "pretty loud" level, then add in a 1 kHz tone and tell us what you think of the 1400 & 1800 Hz tones you hear in addition.
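The extra tones in that experiment are ordinary intermodulation products, and they fall straight out of any nonlinearity. A toy demonstration (the polynomial below is an arbitrary stand-in, not a measured driver characteristic): pass 400 Hz + 1 kHz through a mildly nonlinear transfer function and inspect the spectrum. Second-order products land at 600 and 1400 Hz, third-order at 200, 1600, 1800 and 2400 Hz.

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs                                  # 1 s -> 1 Hz bin spacing
x = np.sin(2 * np.pi * 400 * t) + np.sin(2 * np.pi * 1000 * t)  # the two test tones

# Arbitrary memoryless nonlinearity standing in for an overdriven driver
y = x + 0.1 * x**2 + 0.05 * x**3

spec = np.abs(np.fft.rfft(y)) / len(t)                  # normalized magnitude
for f in (600, 1400, 1800, 2400):                       # IM product frequencies (Hz)
    print(f"{f} Hz: amplitude {2 * spec[f]:.4f}")
```

Harmonics of each tone appear as well (800, 1200, 2000, 3000 Hz), but the IM products are the ones that bear no musical relationship to either input, which is a big part of why IM sounds so unpleasant.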

Art

What is the source of this xmax/SPL calculator? Freely available?

For 18Sound drivers the diaphragm-to-phase-plug distance is generally 0.5-0.6 mm. These drivers won't do much excursion, so IMHO good horn loading is essential down to the low end. The good old rule: cross over at 1.5x to 2x cutoff, or simply look at the FR curve of the measured driver in a horn/WG. What can we expect from a WG less than 10 cm long at frequencies below 1 kHz? I have never seen IMD measurements when the driver is pushed by 10-20 dB in such a waveguide.
 
1) No, the driver is dropping at 20 dB per octave below 600 Hz, distortion is rising dramatically below 1000 Hz.
Art

For a 2.5-inch diaphragm, max SPL at 400 Hz should be 93 dB before hitting xmax. The SPL difference between 400 Hz and 600 Hz on the Faital combo is 12 dB, which means it should be able to do 105 dB at 600 Hz.

Provided distortion still remains acceptably low, of course. audioXpress' distortion measurements are a bit questionable in my experience.

2) IM is not unique to horns, but crossed too low, the horn driver will have far more than the cone driver at the same frequencies.
Art

Surely only if the cone driver is actually of a larger diameter, right?

Don't mean to question your insight here btw, just trying to learn :D

Here are some interesting distortion graphs comparing an RCF ND350 (on a Dayton H812) to a couple of cone drivers. The RCF remains considerably lower in distortion up until around 800 Hz, even though it's only a 1.75-inch-diaphragm driver.
 

Attachments

  • Overlay-100db-RS270ND350.jpg
  • Overlay-100db-SS26wND350.jpg

ICG

Are you sure? Horn loading is not so much about holding the diaphragm back as it is about radiation resistance.

Gee, that's the same thing! If the acoustical impedance (= horn loading) goes down the drain, the excursion goes up, dramatically! That might be acceptable in some HiFi applications (read: single-ended class A with 3 W output), but you'll typically lose ~12-20 dB of max SPL and dynamics.
 

ICG

What is the source of this xmax/SPL calculator? Freely available?

You can use WinISD; it does a good job of simulating max SPL, power and excursion. You won't get T/S parameters for compression drivers, though, but that's not so important, because CDs are usually not used near their fs.

For 18Sound drivers the diaphragm-to-phase-plug distance is generally 0.5-0.6 mm. These drivers won't do much excursion, so IMHO good horn loading is essential down to the low end. The good old rule: cross over at 1.5x to 2x cutoff, or simply look at the FR curve of the measured driver in a horn/WG. What can we expect from a WG less than 10 cm long at frequencies below 1 kHz? I have never seen IMD measurements when the driver is pushed by 10-20 dB in such a waveguide.

That rule applies to 'traditional' horns. On many WGs you can actually cross over a bit lower; a WG usually fades out more slowly. But that doesn't mean you have any leeway regarding the excursion. That's physics, and you can't cheat it. The worst thing about it is that these are non-linear distortions, and unlike K2 (which is often actually preferred) or K3 (already quite unpleasant), these make your ears bleed.

If you want to use a CD low, or even below the horn loading, consider a narrower horn dispersion, because those are typically smaller in size. Or simply use a CD with a much larger diaphragm which can still make the SPL you need. Or simply cut your SPL requirements down. A lot.
 

ICG

Try a lens if a short horn is so important to you. It helped when I tried it. Interesting effect, but not my cup of tea.

While that widens the dispersion (if matched with the driver and horn), these acoustical lenses do not extend the lower end/cutoff of the horn. And they work best at the upper end, since the long horns back in the day started beaming early on. I'm not a fan of them either, because they have a lot of problems too, and there are much better horns and WGs available nowadays that do a much better job.
 

ICG

Displacement increases at lower frequencies, but is it due to horn loading around cutoff? Can you demonstrate this?

Just take two horns of the same type but with different size/cutoff frequency and measure the distortion. The smaller one distorts up to a higher frequency.

The acoustical impedance describes the coupling of the air to the membrane. Below a certain frequency, depending on the size of the driver, the coupling gets worse and the sound pressure level rolls off.

Acoustic impedance - Wikipedia
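ICG's coupling argument can be illustrated with the textbook result for a rigid piston in an infinite baffle, whose normalized radiation resistance is 1 − 2·J1(2ka)/(2ka); it falls roughly as (ka)²/2 below ka ≈ 1 and approaches 1 above ka ≈ 2 (a = piston radius, k = 2πf/c). A quick numerical sketch, with J1 computed from its integral form to avoid extra dependencies:

```python
import math
import numpy as np

def j1_integral(x, n=4001):
    # Bessel J1 via J1(x) = (1/pi) * integral_0^pi cos(theta - x*sin(theta)) dtheta
    theta = np.linspace(0.0, math.pi, n)
    return float(np.cos(theta - x * np.sin(theta)).mean())

def piston_rad_resistance(ka):
    # Normalized radiation resistance of a baffled rigid piston: 1 - 2*J1(2ka)/(2ka)
    return 1.0 - 2.0 * j1_integral(2.0 * ka) / (2.0 * ka)

# Coupling is poor (R ~ (ka)^2/2) at low ka and levels off near 1 above ka ~ 2
for ka in (0.1, 0.5, 1.0, 2.0, 5.0):
    print(f"ka = {ka}: R/(rho*c*S) = {piston_rad_resistance(ka):.4f}")
```

A horn postpones this falloff by presenting the small diaphragm with a higher radiation resistance down to the horn's cutoff; shorten the horn and the diaphragm sees something closer to the bare-piston curve at the low end.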
 
diyAudio Moderator
Cutoff has to do with the rate of change as you move along the horn.

I took those models and put them on a compression driver to see what its displacement would be on each horn. As shown, they are similar.
 

Attachments

  • disp.png