ICONS accurate input calibration

Cheap meter here also - I say we split the diff and call it 17.5
Fwiw, I've measured my FM9's input calibration several times: it goes from 17.5 dBu with A/D sensitivity set at 50% to 17.7/17.8 dBu towards the extremes.
So yeah, I'd call it 17.5. I guess cheap DMMs plus tolerances in the analog input circuitry make trying to find a more precise value a fool's errand.
1Vp=0dBFS @+18dB => 17.3dBu sounds good too.

PS: for anyone interested, some time ago I (with the help of ChatGPT) made this handy little tool on my website to convert between dBu, dBV, Vrms, Vp and Vpp. It can also calculate values for other common waveforms or a custom crest factor, and there's a visualizer as well:
https://shop.thetonescientist.com/pages/tts-levels-calculator
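The conversions a tool like this performs boil down to a few lines. Here's an illustrative Python sketch (not the calculator's actual code), using the standard 0 dBu = 0.775 V RMS reference:

```python
import math

DBU_REF_VRMS = 0.775  # 0 dBu = 0.775 V RMS (1 mW into 600 ohms)

def dbu_to_vrms(dbu):
    return DBU_REF_VRMS * 10 ** (dbu / 20)

def vrms_to_dbu(vrms):
    return 20 * math.log10(vrms / DBU_REF_VRMS)

def vrms_to_vp(vrms, crest_factor=math.sqrt(2)):
    # Crest factor = Vp / Vrms; sqrt(2) for a sine wave
    return vrms * crest_factor

# Example: the ~17.5 dBu calibration discussed above
vrms = dbu_to_vrms(17.5)   # ~5.81 V RMS
vp = vrms_to_vp(vrms)      # ~8.2 V peak, assuming a sine wave
vpp = 2 * vp               # ~16.4 V peak-to-peak
```

Passing a different `crest_factor` covers other waveforms (e.g. 1.0 for a square wave).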
 
convert between dBu, dBv, Vrms, Vp and Vpp
Speaking of which…

So we have a spec saying that maximum input level is, say, 16 dBu

Looking at your tools (and similar ones), conversion from dBu to all kinds of voltages is done assuming that dBu value is RMS (most tools also assume sine wave)

But what kind of sense does it make for music, and guitar in particular, where crest factor, and therefore peak voltage, will be much higher?
 
The headroom values I mentioned in my posts above were taken from my DAW's peak meter, as yielded by the 0 dBu signal fed to the Axe-Fx input.
 
Well my question is maybe more about what the spec means.

If you run a sine wave into an input rated at 16 dBu (and assuming that spec is RMS), then your DAW should show 0 dBFS peak when your RMS voltage is a bit less than 5 volts.

But if that's the case, it says nothing about the actual peak handling of the input, because guitars don't produce sine waves, and while for a sine wave the peak will be around 7 volts, it'll quickly go closer to 9 or 10 with actual music at the same RMS value.

So what does that 16 dBu even mean, if anything?
 
So what would you suggest, then, as a straightforward practical method for everyday users to measure their particular interface's headroom and use that value to calculate a plugin input compensation, when the plugin author has provided an input gain reference value to facilitate using their plugin accurately and as intended? (Maybe a standard factor of some sort that can be applied to headroom measured via a sine wave, to account for guitar signal peaks?)
 

dBu units are specified in reference to 0.775V RMS. So 0dBu = .775V RMS.

0.775 V RMS because that's the voltage that results in 1 mW of power into a 600 Ω load. 600 Ω comes from the telephone era: the impedance of telephone transmission lines.

For dBV units, the reference is 1V RMS instead.

When manufacturers of audio interfaces declare a spec like "16 dBu max input", the usual standard is a 1 kHz sine wave referenced to 0.775 V RMS, so you can work out the peak and peak-to-peak voltage spec of the interface. In this case, 13.83 Vp-p; if your guitar signal exceeds this you will clip.
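The arithmetic behind that spec can be sketched in Python (assuming, as above, a sine-referenced spec and the 0.775 V RMS dBu reference):

```python
import math

def max_input_peaks(dbu_max):
    """Peak and peak-to-peak voltage implied by a sine-referenced dBu spec."""
    vrms = 0.775 * 10 ** (dbu_max / 20)  # 0 dBu = 0.775 V RMS
    vp = vrms * math.sqrt(2)             # sine peak = sqrt(2) * RMS
    return vrms, vp, 2 * vp

vrms, vp, vpp = max_input_peaks(16)  # ~4.89 V RMS, ~6.9 V peak, ~13.8 V p-p
```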
 
I got lost on the peak-to-peak aspect, but rather than ask probably dumb questions I'll try and study up a bit first 😳

Wondering though, do the peaks characteristic of typical guitars, as mentioned by Mr Vangrieg above, really matter in the context of just calculating a compensating plugin input value (to retain authenticity), where the plugin's author has provided an input gain reference value which is different from the headroom of the interface being used to access the plugin? (Once I'm set up for accuracy with the plugin, I'll worry about guitar transient peaks the same as I would in any other scenario of using my guitar into my interface.)
 
Can somebody confirm my thinking, as I'm pretty new to plugin gain staging? If I'm using a first-generation Apollo X4, Hi-Z input with gain set to minimum, that's about 8 dB louder compared to the DI coming from my AM4. When using the AM4, the plugin recommends setting the input calibration to +18 dB. When I'm using the Apollo's Hi-Z, I think I should set the input calibration to +10 dB to match levels. Am I right?

From the AM4 manual:
[screenshot: AM4 manual input specs]

From the X4 manual:
[screenshot: X4 manual input specs]
 
UAD Apollo is 12.2 dBu, so you'd set the plugin to +13 (just like what @James Freeman recommended for the Focusrites, as they have the same specs).

I think the AM4 specs are misleading: 20 dBu is likely referring to the (analog) input circuitry. The DI level coming from the converters matches the Axe-Fx III specs at 17.4 dBu.
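One way to read that recommendation as arithmetic (my assumption, not an official formula): shift the plugin's reference calibration by the difference in interface headroom. The helper below is hypothetical, assuming the +18 dB reference corresponds to a 17.4 dBu interface:

```python
def calibration_setting(reference_setting_db, reference_headroom_dbu, interface_headroom_dbu):
    """Hypothetical helper: shift the plugin's reference calibration
    by the headroom difference between the two interfaces."""
    return reference_setting_db - (reference_headroom_dbu - interface_headroom_dbu)

# UAD Apollo at 12.2 dBu: 18 - (17.4 - 12.2) = 12.8, i.e. roughly +13
apollo_setting = calibration_setting(18, 17.4, 12.2)
```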
 
What @AlbertA said: dBu is an RMS value by definition.
Gear specs are made by measuring the input or output with a 1 kHz sine wave, so you automatically know that the peak value is 3 dB higher than the dBu value.
Or, in voltage terms, √2 times higher.

PS: the clipping point of a given converter corresponds to a specific peak voltage. The measured dBu/RMS level of the signal doesn't really matter; it's the peak voltage that matters, and that remains the same for any kind of signal.

E.g. these two 8 Vp waveforms would both reach about 0 dBFS on the Axe-Fx input, even though the first one measures 17.26 dBu and the second one 2.21 dBu.

[screenshots: two 8 Vp waveform plots]
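Those numbers are easy to check with a short sketch; the `dbu_from_vp` helper here is just illustrative:

```python
import math

DBU_REF_VRMS = 0.775  # 0 dBu reference

def dbu_from_vp(vp, crest_factor):
    # Crest factor = Vp / Vrms, so Vrms = Vp / crest_factor
    vrms = vp / crest_factor
    return 20 * math.log10(vrms / DBU_REF_VRMS)

sine_dbu = dbu_from_vp(8, math.sqrt(2))  # ~17.3 dBu for an 8 Vp sine
pulse_dbu = dbu_from_vp(8, 8.0)          # ~2.2 dBu for an 8 Vp wave with crest factor 8
```

Same 8 V peak, wildly different RMS levels: exactly why the converter cares only about the peak.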
 
the clipping point of a given converter corresponds to a specific peak voltage, the measured dBu/rms level of the signal doesn't really matter, it's the peak voltage that matters and that remains the same for any kind of signal.
Well exactly.
 
I got lost on the peak to peak aspect, but rather than ask probably dumb questions I'll try n study up a bit first 😳
I just meant that since it's a known reference by convention (a 1 kHz sine wave at 0.775 V RMS), we know what the peak voltage is, since it's a sine wave.

So an interface advertising a 16 dBu max input means we can feed it a 1 kHz, 4.89 V RMS sine wave without clipping. That sine wave has peaks of +6.9 V and -6.9 V (13.8 V peak-to-peak). So if my guitar signal's instantaneous voltage exceeds +6.9 V or goes below -6.9 V, it will clip with that interface.

Wondering though, do the peaks characteristic of typical guitars as mentioned by Mr Vangrieg above really matter in the context of just calculating a compensating plugin input value (to retain authenticity),
Not really no.

where the plugin's author has provided an input gain reference value which is different from the headroom of the interface being used to access the plugin (once I am set up for accuracy with the plugin I will worry about guitar transient peaks the same as I would in any other scenario of using my guitar into my interface).
Yeah, the plugin only sees digital values. It has no idea what voltage generated a given digital value, so it has to make some assumption.
 