Input Clipping Thread

First I'll say that I very much appreciate the addition of this input clip warning. I'd noticed it before on an ultra-clean '80s studio-style preset, but couldn't figure out what was going on because nothing appeared to be clipping.

That said, this enhancement has made it clear that very little of the input level range was ever actually usable. Seems like a future enhancement should be re-scaling the parameter. If people are needing to set this in single digits to avoid clipping, this question is going to just keep coming up.
Some users are. Some users are not.

Why would the scale be changed?
 
For the same reason that you wouldn't create a volume control where 2 was too loud for a majority of your listeners.

You mean like most guitar amps? :)

And how did you determine this non-issue affects a majority of AxeFx users? Hopefully not from assuming the squeaky wheels in this thread are a representative sample. There are many, many more players out there who aren't interpreting this as some sort of problem.
  • the AxeFx still works perfectly fine even if the input sensitivity is set to a low value
  • guitars overall have a wide range of signal level
  • there are physical limits with A/D conversion, which means even the best converters will clip with sufficient input signal
  • more isn't always better
 
Though I get the sense that, in most cases, there is a single usable input sensitivity value that will fit a user's weakest and strongest guitars, it might be good to allow input sensitivity to be set per preset for those who can't find a single value to suit all their guitars and so have to adjust it when changing guitars (assuming one would be dialling in presets per guitar). No biggie though, since you're changing guitars anyway - I just put input sensitivity on the Perform page for easy access. Maybe there's a reason it's not a good idea to have input sensitivity vary by preset (popping? .. dunno).
 
it might be good to allow input sensitivity to be set per preset for those who can't find a single value to suit all their guitars ...
Or maybe a “learn” feature that sets the input sensitivity automatically with the push of a button and hit of the strings
 
Or maybe a “learn” feature that sets the input sensitivity automatically with the push of a button and hit of the strings
I don't think that's needed, since finding a workable value is easy - just find a spot where you get a signal into the grid with light and aggressive strums, while staying above +5% sensitivity and avoiding input clipping (as shown on the new clip indicator). Even with a range of guitars it's easy - just dial in sensitivity with the strongest guitar, and more often than not that same setting will work for the weakest guitar.

The 2nd case I refer to above is where there is no one sens setting that works for both the strongest and weakest guitar (the setting for the strongest guitar is >5% but too weak for the weakest guitar, and/or the setting for the weakest guitar clips with the strongest guitar). This seems to happen with the range of guitars a few folks have and would require changing the setting with guitar changes - so having input sensitivity by preset would allow different presets to be set up for different guitars' sensitivity values.

The last (3rd) scenario is a guitar that no sens setting will accommodate (so strong it needs <5%, or so weak that even 100% doesn't get a strong enough signal into the converter).

I don't know how many people have a guitar in the 3rd scenario (no viable setting) but I can't imagine there are many.
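To put rough numbers on those three scenarios, here's a quick Python sketch - every number in it, and the assumption that the sensitivity control is a simple linear gain, is made up purely for illustration, not how the firmware actually works:

Code:
# Hypothetical peak levels at the instrument input, expressed as a fraction of
# the converter's full scale with sensitivity at 100%. Pure illustration.
GUITARS = {
    "strat (weak)": 0.15,
    "hot humbucker": 4.0,
    "absurdly hot active": 30.0,
}

MIN_SENS = 0.05    # the thread's ~5% rule of thumb
MIN_LEVEL = 0.10   # assumed minimum usable peak at the converter (made up)

def usable_sens_range(peak):
    """Range of sensitivity settings (0..1) that neither clips nor starves
    the converter for this guitar, under a made-up linear gain model."""
    lo = max(MIN_SENS, MIN_LEVEL / peak)   # enough level to be usable
    hi = min(1.0, 1.0 / peak)              # any higher and the peak clips
    return (lo, hi) if lo <= hi else None  # None = scenario 3, no viable setting

ranges = {name: usable_sens_range(peak) for name, peak in GUITARS.items()}
print(ranges)
# One setting covers every guitar only if the ranges overlap (scenario 2 is
# when they don't):
valid = [r for r in ranges.values() if r]
print("one setting fits all:", valid and max(r[0] for r in valid) <= min(r[1] for r in valid))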

Some have commented that the absolute strongest signal possible over 5% without input clipping is needed to avoid tone degradation and anything less than that (even though still >5%/not input clipping) will start to degrade tone. If this is true, having one sensitivity setting for a wide (strongest-to-weakest) range of guitars may not be optimal - but I really don't believe this to be true, or at least I can't hear any tonal difference at all with any sens setting between 15% and 90% on my weakest guitar, a strat. (I have my sensitivity set to 15% to accommodate my strongest guitar, which clips the input at values higher than 15% - but that 15% value also works perfectly for my strat, which does not clip the input or even tickle red at 90+% sensitivity values.)
 
“just find a spot where you get a signal into the grid while staying above +5% sensitivity and avoiding the new clip indicator.”

This is not possible with my SG sporting a '57 Classic Plus… and I'm not hitting the guitar so hard that it loses tune or anything, just palm-mute chugs on G. I need to set the input sensitivity sub-4% to avoid the warning. However, I can turn my strat up to 100% safely. This is why a learn function would help. Just hit it every time you plug in and no worries.
Additionally, it could adjust input level up to compensate for any lost gain with <5% guitars.
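Something along these lines, in rough Python terms - entirely hypothetical, just to show what "hit it once and forget it" could look like (the 6 dB headroom figure and the linear behaviour of the control are my assumptions, not anything Fractal has described):

Code:
def learn_input_sensitivity(samples, current_sens, headroom_db=6.0):
    """Hypothetical 'learn' routine: find the loudest peak captured while the
    player strums hard, then rescale the current sensitivity so that peak would
    sit headroom_db below the converter's clip point (1.0 = full scale here).
    Assumes the control behaves linearly, which is just a guess."""
    peak = max(abs(s) for s in samples)            # loudest sample during the strum
    if peak == 0.0:
        return current_sens                        # nothing played, leave it alone
    target = 10 ** (-headroom_db / 20.0)           # 6 dB under full scale ~= 0.5
    new_sens = current_sens * target / peak        # rescale so the peak hits the target
    return min(1.0, max(0.01, new_sens))           # clamp to the control's range

# e.g. if hard strums peaked at 0.9 of full scale with sensitivity at 40%:
print(round(learn_input_sensitivity([0.2, -0.9, 0.6], 0.40), 3))   # ~0.223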
 
This is not possible with my SG sporting a '57 Classic Plus… I need to set the input sensitivity sub-4% to avoid the warning. ... Additionally, it could adjust input level up to compensate for any lost gain with <5% guitars.
Sounds like you are in the 3rd category, with no viable sens setting for your SG (surprised though, as my SG with what I thought were more aggressive 490R/498T pickups - compared to the '57 Classic - is fine at 40%; could be a setup difference though). I'm not totally sure adding input gain is really an exact fix to compensate for a sensitivity setting that's starting to affect the converters' capabilities by being less than 5%.

In any case, I don't see how a learn function helps dramatically or materially - you can twist the knob while you strum and watch the meter, or you can press the learn button and strum while you watch the meter - kinda the same imo.
 
Some have commented that the absolute strongest signal possible over 5% without input clipping is needed to avoid tone degradation and anything less than that (even though still >5%/not input clipping) will start to degrade tone.
There is no 5% requirement. You can dial it all the way down to zero with no tone degradation.

It doesn't "blow up the input". There's nothing fundamentally wrong with setting it to 8%. You can set it as low as 0% with no detrimental effect.
 
Nothing is crapping out below 5%. The converters work just as well at 0% as they do at any other setting. They are still utilizing every bit of their 24 bit range. It's the automatic gain compensation that is not as balanced at settings below 5% so your level going into the grid may end up being slightly lower than it otherwise would be. Increasing the Input Gain parameter is exactly what is needed to compensate for that reduced level.

When you adjust the input sensitivity, you are adjusting the level going into the converter, but an equal and opposite change is also applied after the converter to ensure that you have unity gain through the input. However, below 5% that compensation isn't exactly equal and may need to be manually compensated a bit. If you set your Input Sensitivity down below 5% and you feel like your tones have lost some gain, then bump up the Input Gain a bit to compensate. That's it.
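If a few lines of code help, the gain staging described above boils down to something like this toy model (not the actual firmware - the hard clip at ±1.0 stands in for the converter, and the not-quite-equal compensation below 5% isn't modelled):

Code:
def through_the_input(x, sens, input_gain=1.0):
    """Toy model of the input stage described above: scale by the sensitivity
    setting, hard-clip at the converter's full-scale limits, then apply the
    equal-and-opposite make-up gain (plus any manual Input Gain)."""
    at_converter = x * sens                        # level actually hitting the A/D
    converted = max(-1.0, min(1.0, at_converter))  # the converter's hard 0 dBFS ceiling
    return converted / sens * input_gain           # inverse gain after the converter

# A hot transient at 3x full scale: with sensitivity at 25% it passes clean and
# comes out at 3.0 again; at 100% it clips at the converter and no amount of
# boosting afterwards can undo that.
print(through_the_input(3.0, 0.25))   # 3.0
print(through_the_input(3.0, 1.00))   # 1.0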
 
It's the automatic gain compensation that is not as balanced at settings below 5% so your level going into the grid may end up being slightly lower than it otherwise would be. Increasing the Input Gain parameter is exactly what is needed to compensate for that reduced level.

This is what I have always thought too. But apparently this may have changed somewhere along the way. I don't recall any mention of it in any firmware release notes.

I did an experiment to see for myself.
https://forum.fractalaudio.com/threads/input-clipping-thread.193293/post-2417018
 
It's possible. I know I've seen the level reduction below 5% in my tests, but it's been a while and there's been quite a few FW updates since then.

[Attached image: all-waves.jpg]


I've been meaning to redo this test with the looper block feeding the Instrument Input to get more consistency, but your synth block test does basically the same thing.
 
Nothing is crapping out below 5%. The converters work just as well at 0% as they do at any other setting. ...
my bad - I always understood <5%=bad.

What about the other assertion I've seen expressed that sensitivity needs to be as strong as possible for optimal tone? IE - in my case I'm set to 15% sens to suit my Carvin so it does not clip input. When I switch to strat, I leave it at 15% even though my strat will not clip or tickle red even at 100% sensitivity. Is my strat tone compromised somehow at 15% sensitivity? I don't hear any degradation between 15% and 100% sens with the strat, but there was some discussion here a while ago that seemed to say otherwise - or maybe I misunderstood.
 
The level is compensated so the sensitivity setting will have basically no impact on tone except at possibly very low settings. That's when you might need to adjust things with the Input Gain.

Think of it like a camera with a zoom lens. The sensor of the camera can only capture so much area of an image. If you're trying to take a picture of a fly, you'll need to zoom way in with the lens so the tiny fly fills up more of the sensor for maximum detail. Going the other direction, if you're trying to take a picture of a huge elephant, you'd have to zoom out with the lens so the whole elephant will fit on the sensor. In either case the goal is to fill as much of the sensor as possible with the subject of the picture.

The input converters work the same way. If you're feeding them a small signal from a very weak pickup, you might need to boost that signal up so it fills up a good amount of the converter's dynamic range for an optimum signal-to-noise ratio. Going the other way, if you're feeding them a big signal from a very high-output pickup, you might need to reduce the signal so the whole thing can be captured without clipping. The Input Sensitivity setting is how much you are adjusting that input signal level (the zoom amount in the camera analogy) to optimally fit in the converter's dynamic range.

The other side of that coin is the inverse compensation that happens in the digital realm after the converters. Cliff has said that processing on the Grid uses floating point math, so it's not limited to the hard 0 dBFS ceiling the input converter has. You can easily boost the virtual input level beyond that by just multiplying the floating point sample level. So your big, dynamic input signal gets reduced by the low Sensitivity setting so it fully fits through the converters without clipping, and then gets boosted right back to where it was in the digital realm where it can beat the crap out of the virtual amp's tubes in all its glory.
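To hang a rough number on the "fill the sensor" idea, here's some back-of-envelope Python using the textbook ideal-converter figure (real converters have less range than this, so treat it as an upper bound):

Code:
import math

def range_left_db(peak_fraction, bits=24):
    """Back-of-envelope: if your peak only reaches peak_fraction of the
    converter's full scale, you give up that many dB off the top, but the
    rest of the (ideal) dynamic range is still there underneath you."""
    unused_db = -20 * math.log10(peak_fraction)   # dB between your peak and 0 dBFS
    ideal_range_db = 6.02 * bits + 1.76           # textbook ideal for an N-bit converter
    return ideal_range_db - unused_db

# Even filling only 10% of the converter's range leaves ~126 dB between the
# signal peak and the theoretical noise floor - far more than any guitar rig needs.
print(round(range_left_db(0.10), 1))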
 
The level is compensated so the sensitivity setting will have basically no impact on tone except at possibly very low settings. ... Think of it like a camera with a zoom lens. ... The input converters work the same way.
Thanks - I understood the compensation side of it, but the explanation of the low vs. high sensitivity setting difference while not clipping was eluding me. The camera sensor analogy nails it (should put that in the wiki) - it was probably explained here before, but sometimes certain word combos unlock my understanding. So in my case, with my strat set at 15% sens I'm adding a bit more noise (not changing gain) as opposed to 100%, where my strat still would not clip the input. I can't hear a difference, so it's fine for me to stay at 15% to accommodate both my Carvin (which clips above 15%) and my strat - but if I could hear a difference, or for someone else with better ears than me, I/they'd wanna be twisting the sens knob between 15 and 100 with Carvin/strat guitar changes.
 
What about the other assertion I've seen expressed that sensitivity needs to be as strong as possible for optimal tone?
...
Here's the deal:

You want A/D Sensitivity low enough that there's no clipping. You also want it to be as high as practical to get the best signal-to-noise ratio. But here's the deal about that:

Your guitar, all by itself, produces way more noise than your FM9 does. So even if you have A/D Sensitivity set way low, the noise from your guitar is still overpowering any noise from the FM9. So you can afford to set it ridiculously low without affecting the noise level at all.
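Easy to sanity-check with made-up but plausible numbers - both noise floors below are pure assumptions for illustration, not measurements:

Code:
import math

def combined_noise_db(*levels_db):
    """Sum uncorrelated noise sources (given in dB relative to the signal)."""
    return 10 * math.log10(sum(10 ** (lvl / 10.0) for lvl in levels_db))

GUITAR_NOISE = -85.0      # assumed hum/hiss from the guitar itself, relative to signal
CONVERTER_NOISE = -120.0  # assumed effective converter noise floor, relative to signal

# Lowering the sensitivity 20 dB (and making it up after the converter) raises
# the converter noise 20 dB relative to the signal; the guitar's own noise doesn't move.
print(round(combined_noise_db(GUITAR_NOISE, CONVERTER_NOISE), 2))         # ~-85.0
print(round(combined_noise_db(GUITAR_NOISE, CONVERTER_NOISE + 20.0), 2))  # ~-84.86

Roughly a 0.14 dB change in the total noise floor in this example - nowhere near audible.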


TLDR: Clipping is a bigger enemy than noise here. Set A/D Sensitivity low enough that your hottest guitar doesn't cause clipping, and you're golden. It's time to play guitar.




IE - in my case I'm set to 15% sens to suit my Carvin so it does not clip input. When I switch to strat, I leave it at 15% even though my strat will not clip or tickle red even at 100% sensitivity. Is my strat tone compromised somehow at 15% sensitivity?
No.
 