
Transformer Drive

FractalAudio

Administrator
Fractal Audio Systems
Moderator
Part of the sound of certain tube amps, particularly those that derive their distortion from power-amp overdrive, is attributable to the output transformer. The distortion produced by an overdriven output transformer isn't particularly pretty, but it plays a role in the tone, and without it the distortion would not be authentic.

When a transformer is overdriven, the iron core saturates. This happens because all the magnetic domains are aligned with the field and no more can be "rotated". In engineering terms, the flux density (B) no longer increases linearly with the field intensity (H).
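
To show the shape of that curve, here is a minimal sketch using a tanh soft-saturation approximation. This is only an illustration of the general behavior, not the actual core model used in the Axe-Fx, and the numbers are arbitrary.

```python
import numpy as np

def b_of_h(h, b_sat=1.6, mu=0.02):
    """Illustrative soft-saturating B-H curve (tanh approximation).
    Linear (slope mu) for small H, flattening toward b_sat as the
    magnetic domains run out. Not the Axe-Fx model; units are arbitrary."""
    return b_sat * np.tanh(mu * h / b_sat)

# B stops growing in proportion to H as the core saturates.
for h in (10, 50, 100, 200, 400):
    print(f"H = {h:3d} -> B = {b_of_h(h):.2f}")
```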

Since a transformer presents an inductance to the power tubes, the field intensity is inversely proportional to the applied frequency. Therefore the distortion increases at lower frequencies.
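
As a rough illustration of why low notes hit saturation first, the standard transformer EMF relation gives B_peak = V_rms / (4.44 * f * N * A): for a fixed drive voltage, halving the frequency doubles the peak flux density. The sketch below uses made-up numbers, not the specs of any real output transformer.

```python
def peak_flux_density(v_rms, freq_hz, turns, core_area_m2):
    """Peak core flux density from the standard transformer EMF equation
    B_peak = V_rms / (4.44 * f * N * A). Example values only."""
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

# Same drive voltage, dropping frequency: flux scales as 1/f,
# so low frequencies push the core much closer to saturation.
for f in (400.0, 100.0, 50.0):
    print(f"{f:5.0f} Hz -> B_peak = {peak_flux_density(300.0, f, 2000, 0.0012):.2f} T")
```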

Manufacturers frequently specify the frequency response of the transformer at its rated power. For example, Hammond specifies most of its output transformers as having a bandwidth of 70 Hz to 15 kHz (re. 1 kHz, +/- 1 dB) at rated power. The bandwidth of the transformer, however, will be much greater when operated below its rated power.

The Axe-Fx II allows the user to adjust the amount of output transformer saturation via a parameter called XFRMR DRV (Transformer Drive). Lower values decrease the amount of distortion; higher values increase it.

The parameter is normalized to a rated-power lower-frequency cutoff of 40 Hz, i.e. a value of 1.0 means that the virtual output transformer will have a lower cutoff frequency (-3 dB point) of 40 Hz when the virtual power amp is operating at the rated power of the transformer. So, if the transformer has a rated power of 50W and the lower cutoff frequency is 40 Hz at that power, setting XFRMR DRV to 1.0 will duplicate that behavior.

The formula for the rated-power cutoff frequency is simply D = f_c / 40, where D is the drive level and f_c is the desired cutoff frequency in Hz. For example, to duplicate the aforementioned Hammond transformer we first need to estimate the equivalent -3 dB frequency. Since Hammond strangely specifies +/- 1 dB, assume it's -2 dB at 70 Hz; the -3 dB point is then roughly 3/4 of that, or about 50 Hz. Plugging into the formula gives D = 50 / 40 = 1.25.
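
For convenience, here is the same calculation as a tiny helper; it is just the D = f_c / 40 formula above, and the 50 Hz figure is the Hammond estimate worked out in the previous paragraph.

```python
def xfrmr_drv(cutoff_hz, reference_hz=40.0):
    """XFRMR DRV setting for a desired rated-power -3 dB low cutoff,
    per the formula above: D = f_c / 40."""
    return cutoff_hz / reference_hz

print(xfrmr_drv(50.0))  # ~1.25, the Hammond example above
print(xfrmr_drv(40.0))  # 1.0, the normalization point
```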

As always, use your ears. I personally prefer a setting of around 1.5 - 2.0 for clean-to-lightly distorted tones. I find it adds a bit of richness to the bass frequencies. For higher-gain tones I prefer less, as it can sound muddy. Note that the effect of output transformer distortion is highly dependent upon how hard the virtual power amp is driven, which is a function of Master Volume and overall gain.

There are lots of strange things that happen when an OT saturates, but those are trade secrets and I can't elaborate further.
 

shasha

Fractal Fanatic
I don't think that I can ever remember any company giving such thorough descriptions of any kind of parameter to the end user. Most of the time it'd be something like:

Transformer Drive:
Adjusting this affects the way that the transformer distorts. Distortion is a nonlinear characteristic that can be more or less pleasing to the ear. Changing the level will change the point at which the transformer distorts. See note 1 for further details.

Note 1: Oh yeah, we don't have this parameter.
 

rsf1977_again

Power User
wow! xformer drive has a great deal of effect on the low end. I was just messing with it with the 5153 red, and lowering it just .20 from its default was a very different sound. Brighter and tighter low end. Thanks!
 

Kriig

Fractal Fanatic
I don't think that I can ever remember any company giving such thorough descriptions of any kind of parameter to the end user. Most of the time it'd be something like:

Transformer Drive:
Adjusting this affects the way that the transformer distorts. Distortion is a nonlinear characteristic that can be more or less pleasing to the ear. Changing the level will change the point at which the transformer distorts. See note 1 for further details.

Note 1: Oh yeah, we don't have this parameter.
+1. It makes me speechless sometimes...
 

miketheman

Experienced
Yeah, many are taking the "can't elaborate further" thing too seriously. Almost to the extent that it makes you wonder why they even bothered to write and print a "manual" in the first place anyway...

With threads like these, there really is no excuse to not experiment IMHO (if time permits of course).
 

Stratoblaster

Fractal Fanatic
Great information, I love reading stuff like this. It's interesting to have the details to match a given transformer to the AFX if needed. I experiment with this parameter in each of my presets to see if adjusting it either way adds to the tone/feel/breakup characteristic of a given amp that I may prefer...

This is an important parameter IMO. A lot of tube amp designers have indicated that the output transformer is a huge part of an amp's character, which is why many builders use custom ones in their designs. Mercury Magnetic transformers swapped into a Marshall is apparently a very sweet mod/upgrade, from what I hear...
 

Patzag

Fractal Fanatic
Thanks for this. Even I understood it :mrgreen
All joking aside, I love Cliff's Notes. Thanks!
 

Stratoblaster

Fractal Fanatic
would be cool to have some real world Axe-Fx values to simulate some specific transformers for the users without a degree in physics ;)
Yeah, that would be interesting for sure; I was curious and had a quick look around to see if I could find any transformer data, but I didn't really find anything useful. Then again, I didn't spend a lot of time on it...
 

DLC86

Fractal Fanatic
Is the formula still valid for recent firmwares?
I just checked what's supposed to be the -3 dB cutoff frequency of my beloved Hiwatt's output transformer, and it is 3.6 Hz, according to this page at least: http://hiwatt.org/tech.html

Using the formula I end up with a value of 0.09, which is pretty different from the default 1.00 for the Hiwatt models. Why? Which one is wrong?
I know that Partridge transformers were pretty high-spec'd and famous for their exceptional "hi-fi" bandwidth; xformer drive set at 1.00 (a 40 Hz cutoff point) doesn't seem in line with this anyway.
 

FractalAudio

Administrator
Fractal Audio Systems
Moderator
Is the formula still valid for recent firmwares?
I just checked what's supposed to be the -3 dB cutoff frequency of my beloved Hiwatt's output transformer, and it is 3.6 Hz, according to this page at least: http://hiwatt.org/tech.html

Using the formula I end up with a value of 0.09, which is pretty different from the default 1.00 for the Hiwatt models. Why? Which one is wrong?
I know that Partridge transformers were pretty high-spec'd and famous for their exceptional "hi-fi" bandwidth; xformer drive set at 1.00 (a 40 Hz cutoff point) doesn't seem in line with this anyway.
The stated bandwidth of the Hiwatt transformer is not its full-power bandwidth (they never are). Typically these ratings are taken at 1 W, not at full power. The default value is accurate.
 

DLC86

Fractal Fanatic
The stated bandwidth of the Hiwatt transformer is not its full-power bandwidth (they never are). Typically these ratings are taken at 1 W, not at full power. The default value is accurate.
Oh ok, thanks for the explanation! :)
So, if I got it right, xformer low cut and high cut set the low-power bandwidth... Should I set high cut to max instead of the default 22000 Hz, or is there something else lowering the high corner frequency (e.g. power tube and primary impedances)?
 