I've always wondered how the output specifications for these multi-channel amplifiers are devised. Typically, they'll specify 100w per channel (200w total) in stereo mode and the same 100w per channel (500w total) in cinema mode.
No way on God's green earth can these units output 500w. What do they do? Do they use 5 x 100w chips with a 200w power supply? Or do they just lie?
When dealing with suspicious claims, I tend to look for the rail voltage and derive any potential output from that. For typical class A/B amps, divide the rail voltage by 1.41 to get the RMS output voltage, then square it and divide by 8 for a reasonable estimate of the output into 8 ohms.
If it’s not possible to find that, the voltage ratings of the power supply capacitors will sometimes narrow it down a bit.
Actually, it is more like take the rail voltage and divide by *two* to estimate the RMS output voltage under load: one factor of 70% for peak-to-RMS, and a second 70% for the drop under load. The fake power rating is probably derived from the no-load rail voltage without any regard to what it really measures. It might even use peak voltage in the power calculation, adding yet another optimistic factor of two. It really is back to the Wild West now in terms of power “ratings”.
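That rule of thumb can be sketched as follows (a minimal Python illustration; the function names and the ±55 V example figure are mine, not from the thread):

```python
# Rough class-AB output estimate from the rail voltage, per the rule of thumb
# above: rail / 2 approximates the RMS output voltage under load
# (one factor of ~0.7 for peak-to-RMS, another ~0.7 for sag under load).

def estimated_power(rail_volts: float, load_ohms: float = 8.0) -> float:
    v_rms = rail_volts / 2          # 0.7 * 0.7 is roughly 0.5
    return v_rms ** 2 / load_ohms

def optimistic_power(rail_volts: float, load_ohms: float = 8.0) -> float:
    # the "marketing" figure: no-load rail voltage straight into P = V^2 / R
    return rail_volts ** 2 / load_ohms

print(estimated_power(55.0))    # +/-55 V rails: about 95 W into 8 ohms
print(optimistic_power(55.0))   # same rails, optimistic math: about 378 W
```

The gap between those two numbers (roughly a factor of four) is exactly the kind of spread that lets a modest amplifier carry an impressive sticker.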
The old FTC test of part power steady for an hour followed by FULL power was always more extreme than ANY speech/music signal.
You never run ALL channels to FULL output at the SAME time.
And as you add more channels the likelihood of them all working hard at once decreases.
The question then is: what sort of test, duty cycle, power distribution, shall we agree(!) to use? I suspect Dolby has suggested working conditions since Dolby organized the whole post-Quad world. (And if we all agree, is it a lie?)
The only part of the old FTC test that I consider “too extreme” is the preconditioning requirement. Nobody ever runs at full power - even my DJ abuse only runs at about a 30% duty cycle.

BUT I have found over the years that the only measure of how LOUD an amplifier will play before it sounds too mushy is the power it can put out with both channels driven to maximum. The higher the number here, the louder it can GET. You’re really not supposed to run it that hard either, because you’re already clipping, but the maximum undistorted and maximum possible distorted power ARE directly correlated. AV receivers can get away with fewer channels driven because the power is spread out, but when comparing you would want TWO channels driven to maximum, because that’s how you would use it when listening to music.

Any amplifier that cannot tolerate a sine wave test for a few seconds - long enough to take the reading - without something bad happening will probably experience field failures in long-term use and should be avoided.
Fuses are sized for expected maximum current draw when used with typical program material, and up-sized a bit to prevent nuisance blowing. When running a sine wave test you might need to bypass the fuse temporarily. If that causes a blowout, the amp wasn’t worth having because it would have failed eventually, even in more normal use. Same for power consumption rating - it is now customary to rate it at what would be considered maximum volume with normal program. Run a quick sine wave test and it will exceed that, sometimes by a LOT. A good quality amplifier will tolerate a short sine wave test, have good power output when doing so (something at least close to what they said it would), and serve you well for many years.
If it is running up at +/-55 volts it needs *two* pairs of output transistors per channel, not one.
Thank you for these suggestions - although applying any of them will get you thrown out of the store.
Unfortunately the only things you’ll be able to see in the store are what the salesman wants you to see. They might not even let you turn it up loud enough to determine which of two receivers puts out more real power - you're truly at the mercy of sales and marketing. The only thing you CAN do in a store is determine whether it sounds clean enough for you (if the speakers they have are capable), and maybe whether it plays loud enough for your purposes. You might end up with an amp that can put out 100 watts, or it might only do 30 (and 30 may be enough). It may blow up in 6 months, or it might last 20 years. The only measure of which end of the spectrum you’re on might just be the price tag.
They are not all specced the same, so you need to dig deeper on the specific unit to get a better understanding of how it would perform.
I usually do a first sanity check on how much power it can draw from the mains. This is often a joke - like a 5 x 100w 8 ohm amp with a max draw of 200w from the mains, while another might be 5 x 250w into 8 ohms with a max of 2.5kw. If this is within reason I would check the cooling, the output stages, the rail voltage, and which transistors - and how many - per channel.
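That mains-draw sanity check can be sketched like this (a minimal Python illustration; the ~60% class-A/B efficiency figure is my assumption, not from the post):

```python
def mains_sanity_check(channels: int, watts_per_channel: float,
                       max_mains_watts: float, efficiency: float = 0.6) -> bool:
    """Could the supply even deliver the claimed all-channels-driven output?
    efficiency ~0.6 is a generous assumption for a class-A/B amp at full power."""
    claimed_output = channels * watts_per_channel
    deliverable_output = max_mains_watts * efficiency
    return deliverable_output >= claimed_output

print(mains_sanity_check(5, 100, 200))    # the 5 x 100w amp with a 200w draw
print(mains_sanity_check(5, 250, 2500))   # the 5 x 250w amp with a 2.5kw draw
```

The first claim fails immediately (a 200w supply can deliver perhaps 120w of audio, nowhere near 500w), while the second passes with headroom to spare.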
I'm restoring a Telefunken TRX-2000 quad receiver from around 1976 that claims 4 x 50W or 2 x 60W rms into 4 ohms (it has 32V rails). However it also gives a "maximum" power of 4 x 90W (<1% THD). The sticker on the back says maximum power draw is 540VA and it has a 4A fuse for 230V supply. These figures would seem to be much more realistic than what is typically given nowadays.
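Running the rail-voltage rules of thumb from earlier in the thread against those numbers (a back-of-envelope Python check; everything beyond the quoted 32 V rails and 4 ohm load is my arithmetic):

```python
def power_into_load(v_rms: float, load_ohms: float) -> float:
    return v_rms ** 2 / load_ohms

rail = 32.0
# rail / 2 rule (RMS voltage under load): about 64 W per channel into 4 ohms
conservative = power_into_load(rail / 2, 4.0)
# no-load rail with no sag allowance: about 128 W per channel
optimistic = power_into_load(rail / 1.414, 4.0)
print(conservative, optimistic)
```

The claimed 50W rms and 90W "maximum" fall between the sag-corrected and no-sag estimates, which fits the impression that those 1976 figures were honestly rated.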
They are not all specced the same so you need to dig deeper on the specific unit
If I wanted to "dig deeper" I would not be shopping on the AV Amplifier side of the store. AV Amplifiers are "designed to please", within price limits.
...how much power it can draw from the powerlines, this is many times a joke like a 5*100w 8 ohm amp with a max draw of 200w ...
The average listener WILL be pleased with 2 or 3 channels peaked over 50 Watts, and would never know if it drew 220 Watts.
The present (not 1976) guidelines use "1/8th power" a lot. I've abused large amps with lightly clipped speech/music and I agree that the several-second average is rarely more than 1/8 of what I'd get on the testbench.
All this is not much different from car and truck engines. A "300HP" car engine is MUCH smaller than a 300HP truck engine. The car may do 300HP-- for some minutes. Send it across Wyoming (uphill upwind both ways with snow) and it will boil or burst.
One of the things that has always confused, or misled, the public is what “continuous average power” on a spec sheet even really means. You hear that and you get the idea that it’s like the total kilowatt-hours that show up on your bill at the end of the month - a true average. Amplifiers cannot do this; it would sound so bad you couldn’t stand to listen to it, and the speakers would go up in flames. “Continuous” only refers to the fact that the test signal is a sine wave, as opposed to dots and dashes like Morse code. “Average power” is simply what laymen refer to as “RMS power”. It really only refers to a short-term test. It is, however, the only real measure of how loud an amplifier can play. Real average “kilowatt-hour meter” power will only be a fraction of this.
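For a concrete sense of what that short-term figure means for a sine test tone (a trivial Python sketch; the 40 V figure is just an example of mine):

```python
import math

# "Continuous average power" of a sine tone is V_rms^2 / R, i.e. V_peak^2 / (2R).
# It is a short-term test figure, not a kilowatt-hour style long-term average.

def sine_average_power(v_peak: float, load_ohms: float = 8.0) -> float:
    v_rms = v_peak / math.sqrt(2)
    return v_rms ** 2 / load_ohms

print(sine_average_power(40.0))   # 40 V peak into 8 ohms: 100 W "RMS power"
```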
What they do now is play games with the duration required, and whether or not there is any real long-term loading on the amplifier. Even the FTC test requirements have gradually backed down over the years - from requiring a full-power burn-in, to 1/3 power, and finally to 1/8 power. It has moved from sine wave to pink noise for the burn-in. And then the power measurement was still supposed to be a sine wave, swept over the full bandwidth. Now no one bothers with an FTC rating anymore - even most pro equipment dispenses with it. Most amplifiers “rated” at 100 watts can’t even put out 100 watts of average power for a full cycle at 20 Hz anymore - they might for 1/2 a cycle at 1 kHz, assuming no other loading on the amp so the capacitors remain essentially at full charge. If you want one that is capable of the “spec” power rating for at least a few seconds, you really do have to pay for it. It won’t be the cheapest one, and it probably won’t be on sale.
The only saving grace is that most people just don’t NEED that much power to play movies and listen to music. What is usually the problem is the speakers. Better sounding speakers are often a difficult load to drive. Amplifiers that have trouble putting out any real power often have trouble with difficult loads, even if being run at a sane volume. “Trouble” can be anything from overheating to excess distortion (often very obvious) to blown output transistors.
It is *possible* that a total power of 850 watts can be obtained from the sum of all channels in a short term test. It is also possible that it could be a lie, and just the sum of what the individual channels can each do independently. And the length of time could be anything from a millisecond to a minute.
It is also at 10% distortion, which is driven about 3dB into clip.
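As a rough check on that figure, here is an idealized hard-clipping model computed in closed form (Python; it assumes perfectly symmetric hard clipping, which real amplifiers only approximate). At 3 dB of overdrive it lands in the low teens of percent - the same ballpark as the quoted 10%:

```python
import math

def clipped_sine_thd(overdrive_db: float) -> float:
    """THD of an ideally hard-clipped sine driven overdrive_db past clip onset."""
    a = 10 ** (overdrive_db / 20)       # input amplitude relative to clip level 1.0
    if a <= 1:
        return 0.0                      # not clipping: no distortion in this model
    phi = math.asin(1 / a)              # phase where the waveform hits the rail
    core = phi / 2 - math.sin(2 * phi) / 4
    b1 = (4 / math.pi) * (a * core + math.cos(phi))          # fundamental amplitude
    p_total = (2 / math.pi) * (a * a * core + (math.pi / 2 - phi))  # mean square
    p_fund = b1 * b1 / 2                # mean square of the fundamental alone
    return math.sqrt((p_total - p_fund) / p_fund)

print(f"THD at 3 dB into clip: {clipped_sine_thd(3.0):.1%}")
```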