" UltraRes " -vs- " Standard " ..... does it matter for *very short* IR's (?)

ben ifin

Experienced
Hi all !

If I'm using an IR that is, say, only 5-10 ms in length, does making it "UltraRes" as opposed to "Standard" make any difference to its audio quality?
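For context, here's the rough arithmetic behind my thinking (assuming a 48 kHz sample rate; the ~1024-sample figure often quoted for a Standard IR is just an assumption on my part, not an official spec):

```python
# Back-of-the-envelope sample counts for very short IRs.
# Assumptions: 48 kHz sample rate; a "Standard" IR slot ~= 1024 samples
# (a commonly quoted figure, not an official Fractal spec).
SAMPLE_RATE_HZ = 48_000
STANDARD_IR_SAMPLES = 1024  # assumed

def ir_samples(length_ms: float, sample_rate_hz: int = SAMPLE_RATE_HZ) -> int:
    """Number of samples an IR of the given length occupies."""
    return round(length_ms / 1000 * sample_rate_hz)

for ms in (5, 10):
    n = ir_samples(ms)
    print(f"{ms:>2} ms IR -> {n} samples "
          f"({n / STANDARD_IR_SAMPLES:.0%} of an assumed 1024-sample Standard slot)")
```

If those assumptions hold, a 5-10 ms IR (240-480 samples) already fits comfortably inside a Standard slot, so I'm wondering whether the longer UltraRes window would mostly be padding.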

Thanks,
Ben
 
I'm not sure how UltraRes works exactly, but I'm pretty sure Cliff uses some tricks to achieve that IR length with that low processing power.

So my recommendation: use "Standard", because that way you definitely get everything out of the IR.

But I'm also pretty sure you won't hear any difference.
 
I look at it this way: In a band situation nobody will be able to tell whether you're at UltraRes or Normal because there's so much else going on. If you're playing in a solo situation or recording then it might be an audible difference to you, but the average listener won't notice.

So if you're inclined and have CPU to spare, use UltraRes; but it's a subtle difference, so otherwise use Normal and don't sweat it.
 
UltraRes adds more low end, and I tend to cut that anyway, so for me Standard is the way to go.
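If I had to guess why, it's probably simple frequency resolution: the spectral detail an IR can capture is roughly 1 / (its length), so a longer capture resolves the low end more finely. A tiny sketch of that textbook rule of thumb (nothing specific to Fractal's processing):

```python
# Rule of thumb: an IR's frequency resolution is roughly 1 / (its length).
# Longer IRs resolve finer low-frequency detail; a very short IR simply
# doesn't contain that information, regardless of format.
for length_ms in (5, 10, 20, 40):
    resolution_hz = 1000.0 / length_ms  # 1 / (length in seconds), in Hz
    print(f"{length_ms:>3} ms IR -> ~{resolution_hz:.0f} Hz resolution")
```

Which is another way of saying that for a 5-10 ms IR there's little extra low-end information for UltraRes to preserve in the first place.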
 
IMHO - when I was still running 3.03 beta (which was still ARES modeling), I could hear a pretty noticeable difference when comparing Standard to UltraRes IRs.

Now that I'm running 4.00 beta 5, I've found that Cygnus modeling has made any desire I had to use UltraRes obsolete. I hear absolutely zero difference when comparing Standard to UltraRes now. Not worth the extra ~5% CPU usage per IR, for me.

Hope this helps.
 