Class D amplifier output not voltage dependent.

I've been playing with a couple of cheap (£4) TPA3116 based Class D amplifiers from ebay and can't fathom their output.
The TI chart for Power vs Supply Voltage seems pretty straightforward. Not exactly a straight line, but certainly a progression.
I've been running one of them from a 4S (14.8V) LiPo battery and wondered if a 6S (22.2V) would make much of a difference to the output volume.
Turns out it makes none at all.

I've purchased a cheap sound level meter from Amazon and tried it out today with the amp connected to a bench power supply.
With the voltage at 14 V and all volume and tone controls set to maximum, the amp produces 93 dB of output into two 6 ohm speakers, one on each channel, with the input split between them.
With the voltage at 24 V, it also produces 93 dB.

The other board behaves in exactly the same way, except it only produces 86 dB.

I can see the power LED on the amps changing in brightness as the voltage changes, so I know the voltage is changing.
The boards are pretty compact, so no room for a voltage regulator.

Why aren't these things following the script?

Adrian.
 
You are probably not measuring the sound level precisely enough.
More supply voltage means more available output power, which means a somewhat higher sound pressure. But the dB scale is logarithmic, not linear: doubling the output power only adds about 3 dB. The speaker guys can tell you more about that, and probably also how to measure the sound level more precisely.
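For a sense of scale, here is a quick back-of-envelope check (plain dB arithmetic, nothing specific to this amp): even if the output power scaled with the square of the supply voltage, going from 14 V to 24 V would only be worth about 4.7 dB on the meter.

```python
import math

# dB change for a given power ratio: 10*log10(P2/P1)
def db_from_power_ratio(ratio):
    return 10 * math.log10(ratio)

print(db_from_power_ratio(2))              # doubling the power is only ~3 dB
print(db_from_power_ratio((24 / 14)**2))   # even Vcc^2 scaling from 14 V to 24 V is only ~4.7 dB
```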

One exception is if you run into the current limit at both supply voltages. Then the maximum output power is set not by the supply voltage but by the current limit, so the output power, and therefore the speaker sound level, is the same in both cases.
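A sketch of that current-limited case, with a made-up current limit (not a TPA3116 figure): once the amp clamps the peak output current, the maximum sine power into a given load stops depending on how much further the supply is raised.

```python
# Maximum sine power into a resistive load when the amp clamps the peak
# output current. The 5 A figure is purely hypothetical, not a TPA3116 spec.
R_LOAD = 6.0        # ohms
I_PEAK_LIMIT = 5.0  # amps (hypothetical)

p_max = I_PEAK_LIMIT**2 * R_LOAD / 2   # average power of a current-limited sine
print(p_max)  # 75 W, the same for any supply high enough to reach the limit
```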

I just noticed Davey's valid comment: the supply voltage does not act as a volume control, such that increasing it raises the output. That is a matter of gain, and the gain most likely stays constant, so the output power does as well.
 
Many thanks for the comments.

So, my basic assumption was that a Class D is a switch, and by increasing the supply voltage, the amp would be switching a higher voltage across the speakers, which are a fixed impedance, hence more power flows and more sound comes out of the speakers.

If I'm reading you correctly, with the current input signal I only have sufficient gain in the amplifier to generate a signal at the speaker which is less than 14 V, hence no noticeable distortion. Increasing the supply voltage doesn't make any difference because the amplifier hasn't got the gain to make use of the increased headroom.

So, to get more dB out, I need to put more signal in, assuming I don't then overdrive it?

Or see if I can find the TPA3116 gain resistor network.
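For what it's worth, here is a back-of-envelope version of that headroom argument. It assumes an idealised BTL output that can swing roughly the full supply voltage as a peak across the load, and uses the fixed gain steps I believe the TPA3116D2 offers (20/26/32/36 dB, selected by resistors on the gain pin); double-check against the datasheet before trusting the numbers.

```python
import math

# Input level (Vrms) needed to swing the output up to the supply rail,
# assuming an ideal BTL stage that can put ~Vcc peak across the load.
def input_vrms_to_clip(vcc, gain_db):
    gain = 10 ** (gain_db / 20)   # convert dB to V/V
    return vcc / gain / math.sqrt(2)

for gain_db in (20, 26, 32, 36):  # TPA3116D2 gain steps (from memory)
    print(f"{gain_db} dB gain: "
          f"{input_vrms_to_clip(14, gain_db):.2f} Vrms clips a 14 V rail, "
          f"{input_vrms_to_clip(24, gain_db):.2f} Vrms clips a 24 V rail")
```

If your source level times the gain lands well below 14 V peak, the extra headroom at 24 V simply never gets used, which would match what you measured.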
 
Theory would say that because the output transistors are just switches, increasing Vcc would increase the output voltage.

However, theory also says that the output is fed back negatively to the amp chip in some way or other, so the eventual output is still decided by the gain setting.
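In other words, with a very simple closed-loop model (nothing TPA3116-specific), the output follows gain times input, and the supply only sets the clipping ceiling:

```python
# Idealised closed-loop amp: output is Av * Vin until it hits the rail.
def output_peak(v_in_peak, gain, vcc):
    return min(gain * v_in_peak, vcc)    # hard clip at the supply rail

v_in_peak, gain = 0.5, 20                # example: 0.5 V peak in, 26 dB gain
print(output_peak(v_in_peak, gain, 14))  # 10 V peak
print(output_peak(v_in_peak, gain, 24))  # still 10 V peak -> same loudness
```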
 
I have a 3116 on a variable-voltage supply, and have also found that moving from 12 V to 24 V while listening does nothing. I didn't look into it further, but was surprised. I suspected the PSU might need a restart to take its new setting, since it switches in about six increments. It sounds like the PSU is fine, though.


I use motor controllers whose duty cycle is simply however long it takes for the current to reach the level it's looking for. Increasing the voltage won't affect the motor current, so torque isn't noticeably affected. Speed is, though, as the drive moves on quicker. Round about 3×.
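That motor behaviour falls straight out of the inductor equation; a toy sketch (idealised, ignoring winding resistance and back-EMF, with made-up values):

```python
# Current-chopping drive: the winding stays on only until the current hits
# its target, so a higher supply just shortens the on-time (V = L * dI/dt).
L_WINDING = 0.002   # henries (made-up)
I_TARGET = 2.0      # amps (made-up)

for v_supply in (12.0, 24.0):
    t_on = L_WINDING * I_TARGET / v_supply
    print(f"{v_supply:g} V supply -> {t_on * 1e6:.0f} us on-time, same 2 A, same torque")
```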



Just putting it out there. It's 4:30am and I can't process much. My neighbour just had a house invasion. So we have been hot on their trail, planning on our feet, instead of sleeping. I'm exhausted. Which is a lot better than he is. Poor bugger.
 
Nugget: if changing the supply voltage directly affected the output voltage, the amp would have a PSRR of 0 dB.

The higher Vcc does let the output node charge up quicker. But once the voltage reaches the intended output level, it stops charging. Because feedback.

In other words, if you use a higher Vcc, the duty cycle becomes smaller and the output voltage remains the same, controlled by the gain setting.

This is in many ways similar to a regulated buck converter. If you are thinking of an unregulated buck converter, then yes, for a fixed duty cycle, output depends on input voltage.
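A minimal sketch of that analogy, assuming an idealised buck converter in continuous conduction with no losses: regulated, the duty cycle shrinks as the input rises so the output stays put; unregulated (fixed duty cycle), the output tracks the input.

```python
# Idealised buck-converter relations (continuous conduction, no losses):
#   regulated:   feedback adjusts D so that D = Vout/Vin  -> Vout stays fixed
#   unregulated: D is fixed,       so Vout = D * Vin      -> Vout tracks Vin
V_OUT_TARGET = 10.0     # what the feedback loop is asked to deliver
D_FIXED = 10.0 / 14.0   # duty cycle an unregulated converter might sit at

for v_in in (14.0, 24.0):
    print(f"Vin={v_in:g} V: regulated D={V_OUT_TARGET / v_in:.2f} (Vout stays 10 V), "
          f"fixed-D Vout={D_FIXED * v_in:.1f} V")
```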

Most of these class-D chip amps have an input stage that is powered by low-voltage DC (5 V or less), so your maximum input amplitude is going to be limited. Changing the gain would be sensible. However, most speaker amps have a gain of at least 20×, which should be sufficient to max out the supply voltage even with a 1 Vrms input. Find where your bottleneck is and decide your fix from there.
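A quick numeric check of that last point, under the same idealised "output can swing roughly to the rail" assumption as above:

```python
import math

# Peak output voltage that a given source level asks for at a given gain.
def requested_peak(v_in_rms, gain):
    return v_in_rms * math.sqrt(2) * gain

print(requested_peak(1.0, 20))   # ~28 V peak: 1 Vrms at 20x would clip even a 24 V rail
print(requested_peak(0.3, 20))   # ~8.5 V peak: never touches 14 V, let alone 24 V
```

If the amp really were clipping at both supplies, 24 V should still measure a few dB louder, so identical readings at 14 V and 24 V suggest the input level or gain setting is the bottleneck rather than the supply.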
 