Part of the sound of certain tube amps, particularly those that derive their distortion from power amp overdrive, is attributable to the output transformer. The distortion produced by an overdriven output transformer isn't particularly pretty, but it does play a role in the tone, and without it the distortion would not be authentic.
When a transformer is overdriven the iron core saturates. This happens because all the magnetic domains are aligned with the field and no more can be "rotated". In engineering terms the flux density (B) no longer increases linearly with the field intensity (H).
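One simple way to picture the B-H behavior described above is a soft-clipping curve: linear for small H, flattening out as the core runs out of domains to align. The snippet below is just an illustrative sketch using tanh; it ignores hysteresis, and the saturation flux density of 1.6 T is an assumed ballpark figure for transformer steel, not a value from any particular transformer.

```python
import math

def b_from_h(h, b_sat=1.6):
    """Toy soft-saturating B-H curve.

    Nearly linear for small H, asymptotically flat at b_sat
    (assumed saturation flux density in tesla) for large H.
    Real cores also exhibit hysteresis, which is ignored here.
    """
    return b_sat * math.tanh(h / b_sat)

# Small drive: B tracks H almost linearly.
# Large drive: B flattens out near b_sat -- the core is saturated
# and the waveform gets clipped, producing distortion.
```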
Since a transformer presents an inductance to the power tubes, the flux density for a given drive level is inversely proportional to the applied frequency. Therefore the distortion increases at lower frequencies.
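The inverse relationship between flux and frequency follows from the standard transformer EMF equation. A quick sketch, with hypothetical turns count and core area just to show the scaling:

```python
def peak_flux_density(v_rms, freq_hz, turns, core_area_m2):
    """Peak core flux density for sinusoidal drive, from the
    transformer EMF equation: B_peak = V / (4.44 * f * N * A).
    At fixed drive voltage, halving the frequency doubles B_peak,
    which is why an overdriven OT distorts low notes first.
    """
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

# Hypothetical winding (1000 turns, 10 cm^2 core):
b_80 = peak_flux_density(100.0, 80.0, 1000, 1e-3)
b_40 = peak_flux_density(100.0, 40.0, 1000, 1e-3)
# b_40 is exactly twice b_80 -- same voltage, twice the flux.
```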
Manufacturers frequently specify the frequency response of the transformer at its rated power. For example Hammond specifies most of its output transformers as having a bandwidth of 70 Hz to 15 kHz (re. 1 kHz, +/- 1 dB) at rated power. The bandwidth of the transformer, however, will be much greater when operated below its rated power.
The Axe-Fx II allows the user to adjust the amount of output transformer saturation via a parameter called XFRMR DRV (Transformer Drive). Lower values decrease the amount of distortion, higher values increase it.
The parameter is normalized to a rated-power lower-frequency cutoff of 40 Hz, i.e. a value of 1.0 means that the virtual output transformer will have a lower cutoff frequency (-3 dB point) of 40 Hz when the virtual power amp is operating at the rated power of the transformer. So, if the transformer has a rated power of 50W and the lower cutoff frequency is 40 Hz at that power, setting XFRMR DRV to 1.0 will duplicate that behavior.
The formula for rated-power cutoff frequency is simply D = f_c / 40, where D is the drive level and f_c is the desired cutoff frequency. For example, if we wanted to duplicate the aforementioned Hammond transformer we would first need to estimate the equivalent -3 dB frequency. Assuming the response is -2 dB at 70 Hz (since they strangely specify +/- 1 dB), the -3 dB point is roughly 3/4 of that, or about 50 Hz. Plugging into the formula we get D = 50 / 40 = 1.25.
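The formula is trivial, but here it is as a helper for working through other datasheets. The function name is mine; the 40 Hz reference is the normalization stated above.

```python
def xfrmr_drv(f_c_hz, ref_hz=40.0):
    """XFRMR DRV setting that places the virtual OT's rated-power
    lower cutoff (-3 dB point) at f_c_hz, given the Axe-Fx II's
    40 Hz normalization: D = f_c / 40.
    """
    return f_c_hz / ref_hz

# Hammond example: estimated -3 dB point of ~50 Hz at rated power.
print(xfrmr_drv(50.0))  # -> 1.25
print(xfrmr_drv(40.0))  # -> 1.0 (the reference case)
```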
As always, use your ears. I personally prefer a setting of around 1.5 - 2.0 for clean-to-lightly-distorted tones. I find it adds a bit of richness to the bass frequencies. For higher-gain tones I prefer less, as it can sound muddy. Note that the effect of output transformer distortion is highly dependent upon how hard the virtual power amp is driven, which is a function of Master Volume and overall gain.
There are lots of strange things that happen when an OT saturates, but those are trade secrets and I can't elaborate further.