Latency reporting to DAW

This was driving me insane so I called it quits on this issue and bought a dedicated interface with SPDIF I/O (Antelope Zen Go). Overkill for just tracking guitars and bass at home but we have the same interface at our studio so I was already familiar with it.

I plug my guitar cable into the Antelope and route that down the SPDIF output (into the AX3). This way I can record the dry guitar on one track and the SPDIF wet signal from the AX3 on another mono or stereo track.

I could have just gone with a used Scarlett 8i6 3rd gen for <$200 since I wouldn't be running through the Scarlett's analog inputs. That's the unit I'd recommend for most people.

I’m much happier with this setup:
  1. No latency issues.
  2. I can leave all my I/O permanently connected to the Antelope (headphones, guitar cable, monitor cables, SPDIF cables).
  3. When I bring the AX3 in and out of the studio, all I have to connect is power, SPDIF I/O, and an XLR for the FC12, all of which I leave routed behind my desk so all four are right there waiting for me.
  4. I have more flexibility with routing and levels by using Antelope's mixer application.
Did you need to adjust the SPDIF output level (and if so, how exactly)? How do you get the same input level going through the SPDIF out (as a digital input into the Axe) as you do when plugging the guitar directly into the Axe's analog input?

I really struggle to achieve this. When I go into the Hi-Z input of my RME UCX and then route the signal to SPDIF out, it is way too hot for the Axe and I have a pretty high noise floor.

Edit: Link added to two files.
Here, no issues: analog input into the Axe, SPDIF out into the RME UCX.


Here, issues when reamping: analog input into the RME, SPDIF out as digital input into the Axe, then SPDIF out from the Axe back into SPDIF in on the RME, with hiss/noise floor.
 
Due to the unstable reamp offset, I'm starting to look into the SPDIF option for reamping as well. I think @GlennO 's "Axe-FX For The Recording Musician" might briefly cover this, but I don't think it gets into the weeds. Defining the SPDIF reamp process in more depth/step by step would be helpful...at least for me.

Things that @DonProm mentions are a concern for me, but I'd also like to know how to route everything properly. Does it require two SPDIF cables for the bidirectional stuff, or do you just use something like ASIO4ALL to aggregate? Maybe @strabes can elaborate some?
 
Did you need to adjust the SPDIF output level (and if so, how exactly)? How do you get the same input level going through the SPDIF out (as a digital input into the Axe) as you do when plugging the guitar directly into the Axe's analog input?
The simple solution is: don't plug your guitar into your Axe-FX when using a spdif connection to an audio interface. See option #5 here for complete instructions on how to use spdif with an audio interface:

https://forum.fractalaudio.com/threads/axe-fx-for-the-recording-musician.177592/

That gives you a DI you can record for reamping. No configuration changes are necessary on the Axe-FX when reamping. Simply redirect the USB input from the DAW to spdif out in your audio interface when the time comes to reamp.
 
The simple solution is: don't plug your guitar into your Axe-FX when using a spdif connection to an audio interface. See option #5 here for complete instructions on how to use spdif with an audio interface:

https://forum.fractalaudio.com/threads/axe-fx-for-the-recording-musician.177592/

That gives you a DI you can record for reamping. No configuration changes are necessary on the Axe-FX when reamping. Simply redirect the USB input from the DAW to spdif out in your audio interface when the time comes to reamp.
Maybe what I wrote was misleading. I have the hiss when the signal is as follows:
Guitar into RME, SPDIF into Axe, SPDIF back into RME.
So, no guitar into the Axe directly in that scenario. I just compared it to: direct input into the Axe, from there into the RME. That was where I had the better signal and no issues (listen to the sound examples in my previous post).
 
Did you need to adjust the SPDIF output level (and if so, how exactly)? How do you get the same input level going through the SPDIF out (as a digital input into the Axe) as you do when plugging the guitar directly into the Axe's analog input?

I really struggle to achieve this. When I go into the Hi-Z input of my RME UCX and then route the signal to SPDIF out, it is way too hot for the Axe and I have a pretty high noise floor.

I experience high levels for digital in as well (unrelated to reamping) -- see thread https://forum.fractalaudio.com/threads/aes-effect-loop-input-return-level.176641/#post-2146873
I have to reduce the global AES input level by 18 dB for my presets that use a digital loop to be level-matched with factory presets that don't use a digital loop.
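For context, 18 dB is a big trim in linear terms. Here's a quick sketch of the arithmetic (plain Python, nothing Fractal- or RME-specific):

```python
# Quick sanity check: what does an 18 dB trim mean as a linear amplitude factor?
def db_to_linear(db: float) -> float:
    """Convert a dB change to a linear amplitude factor."""
    return 10 ** (db / 20)

print(db_to_linear(-18))  # ~0.126 -- the signal is cut to roughly 1/8 of its amplitude
print(db_to_linear(+18))  # ~7.94  -- i.e. the digital input arrives roughly 8x hotter
```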
 
Maybe what I wrote was misleading. I have the hiss when the signal is as follows:
Guitar into RME, SPDIF into Axe, SPDIF back into RME.
So, no guitar into the Axe directly in that scenario. I just compared it to: direct input into the Axe, from there into the RME. That was where I had the better signal and no issues (listen to the sound examples in my previous post).
Is that noise present in your DI recording? Or only when you send the signal to the Axe-FX? In other words, compare DI recordings when plugging into the two different devices without any spdif connections.
 
Did you need to adjust the SPDIF output level (and if so, how exactly)? How do you get the same input level going through the SPDIF out (as a digital input into the Axe) as you do when plugging the guitar directly into the Axe's analog input?
I have to gain up my interface's Hi-Z input in order to hit the AX3's input 1 block with the same level as plugging into the AX3 directly. I just A/B'd it a few times till I got the level right. As far as I'm aware, there is no "SPDIF output level" control on the Antelope control panel. I can't hear a difference in the noise floor and to me it sounds identical. Maybe someone who uses Antelope stuff can show me a better way to do this.
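If you'd rather not A/B purely by ear, one option is to record a short DI clip through each path and compare RMS levels in a script. A minimal sketch, assuming numpy and soundfile are installed and using made-up file names:

```python
# Compare the level of the same riff recorded two ways: plugged straight into the AX3
# vs. through the interface's Hi-Z input. The dB difference is how much to trim.
import numpy as np
import soundfile as sf

def rms_dbfs(path: str) -> float:
    audio, _ = sf.read(path)
    if audio.ndim > 1:            # fold stereo to mono for a single figure
        audio = audio.mean(axis=1)
    return 20 * np.log10(np.sqrt(np.mean(audio ** 2)))

direct = rms_dbfs("di_direct_into_axe.wav")          # hypothetical file name
via_interface = rms_dbfs("di_via_interface_spdif.wav")  # hypothetical file name
print(f"Adjust the interface's Hi-Z gain by {direct - via_interface:+.1f} dB")
```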

Things that @DonProm mentions are a concern for me, but I'd also like to know how to route everything properly. Does it require two SPDIF cables for the bidirectional stuff, or do you just use something like ASIO4ALL to aggregate? Maybe @strabes can elaborate some?
I'm using setup #5 in GlennO's guide. The specifics will depend on your interface but for me, I set it up so that my interface sends the following to my DAW:
  1. mic pre / line / hi-z jack 1
  2. mic pre / line / hi-z jack 2
  3. spdif in L
  4. spdif in R
My guitar is plugged into the Antelope's second combo jack, so in my DAW I can record the dry signal using input 2 and the wet signal with inputs 3+4. I had to select which inputs are sent to the AX3 over SPDIF, so I set that to input 2, which (again) is where my guitar is plugged in.

For reamping I can just set the output of my DAW track to one of my interface's outputs, which I then route to SPDIF out (in the same section of the antelope control panel that I route my guitar to when recording guitar).
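To make that signal flow a bit more explicit, here it is written out as plain data. The channel numbers and labels are from my Antelope setup and purely illustrative; your interface's mixer will name things differently:

```python
# Rough map of the routing described above, written as data just to make it explicit.
DAW_INPUTS = {
    1: "interface mic/line/Hi-Z jack 1",
    2: "interface mic/line/Hi-Z jack 2 (guitar -> recorded as the dry DI track)",
    3: "SPDIF in L (processed signal coming back from the AX3)",
    4: "SPDIF in R (processed signal coming back from the AX3)",
}

SPDIF_OUT_SOURCE = {
    "tracking": "input 2 (the live guitar, mirrored to the AX3 over SPDIF)",
    "reamping": "the DAW playback output carrying the recorded DI track",
}
```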
 
Sorry if I'm being a bit daft, but does this sound right?
  • Axe-FX grid/routing options/setup configured appropriately (off the top of my head I can't remember the exact settings)
  • Focusrite Saffire Pro 24's SPDIF out >>> Axe-FX III's SPDIF in
  • Axe-FX III's SPDIF out >>> Focusrite SPDIF In
  • in DAW set interface to Focusrite
  • in DAW set my DI track to output to SPDIF OUT on the Focusrite
  • in DAW set up another track for reamp, set to receive from SPDIF in
  • Reamp and measure Focusrite's loopback latency/offset with this setup
  • enter the offset into DAW settings
  • any new reamps should now line up, as presumably the Focusrite doesn't have the same unstable reamp/latency offset that I get with the Axe-FX
 
Yes, that's basically it. There is no change to be made for the processed track you're recording in your DAW, since you always record the same processed source (Saffire spdif in) regardless of whether you're recording live or re-amping. You only need to change the source of the Saffire spdif out when re-amping.

You probably won't need to measure/set any DAW settings. That's only for when recording an Axe-FX via USB. If you measure it, you'll find it's off a small amount, and it'll vary a bit, but that's always the case with outboard gear.

For anybody who's following this, you'll find a routing diagram to illustrate how to do this, and detailed instructions, in the recording guide (configuration #5):
https://forum.fractalaudio.com/threads/axe-fx-for-the-recording-musician.177592/
 
Yes, that's basically it. There is no change to be made for the processed track you're recording in your DAW, since you always record the same processed source (Saffire spdif in) regardless of whether you're recording live or re-amping. You only need to change the source of the Saffire spdif out when re-amping.

You probably won't need to measure/set any DAW settings. That's only for when recording an Axe-FX via USB. If you measure it, you'll find it's off a small amount, and it'll vary a bit, but that's always the case with outboard gear.

For anybody who's following this, you'll find a routing diagram to illustrate how to do this, and detailed instructions, in the recording guide (configuration #5):
https://forum.fractalaudio.com/threads/axe-fx-for-the-recording-musician.177592/
Awesome. Thank you so much.

I'm already routing from the Axe-FX to my interface for hitting my studio monitors. Setting the rest of the config there should be straightforward. From there, I just need to get another SPDIF cable to send to the Axe-FX from my interface.

Hoping the rest will be smooth sailing!
 
Is that noise present in your DI recording? Or only when you send the signal to the Axe-FX? In other words, compare DI recordings when plugging into the two different devices without any spdif connections.
Checked the DIs: it is not present in the DI, only when/after sending it to the Axe and back (regardless of whether I send it analog or via SPDIF).
 
In that case I would guess it's a level balancing issue, with your signal coming into the grid at a different level when you do it from your audio interface compared to when you plug your guitar into the Axe-FX. This illustrates one of the benefits of using USB, since you don't have to deal with this :).
 
I got you. Let's say during recording you activate and deactivate blocks. This changes the latency, right? If so, in such a scenario I don't think the DAW would be able to adjust the latency further.

And sorry if I wasn't clear, I did not mean this only for reamping, but for general latency compensation when recording. With all this said, just as I don't reamp, I also don't use the Axe-FX as an audio interface.
Some perspective-taking for this thread, because it's really easy to let a thread like this become a mindf$#% exercise that has folks (like me) irrationally worrying away their weekend rather than rationally thinking through a problem and testing to see to what extent it is a real-world problem for them personally. Not at all meant to be an apology or to diminish the importance of fixing this issue AT ALL, but this thread kept me holding off on pulling the trigger on a III for several months and it really shouldn't have. If I'd thought it through rationally, taken the numbers that are available in this thread and done some testing with the audio interface I was using (all of which is super easy to do), I would have quickly realized that to the extent there is a problem, it's easily fixable, and to the extent there is variation around the solution to that problem...that variation is at most the same as the variations that I've been a-okay with for my entire life as a home recording guitarist.

Executive summary of a weekend of testing/comparing: (1) I don't think I would be able to identify even 12ms of offset in separately recorded tracks (i.e., two different takes), though I can hear that kind of offset easily when re-recording a track. Listening to double-tracked guitars where neither has a 12ms offset, where both have a 12ms offset, and where one has no offset and the other has a 12ms offset, I'm pretty certain I'd never be able to pick out a difference amongst any of them. (2) Based on this, it's worth setting my DAW to correct by ~10ms, which is the average offset I get, just for peace of mind. (3) There is absolutely no way I or, I would wager, anyone else, is going to be able to hear an uncorrected offset of 1-2ms, so there is no reason to worry about whether or not there are any slight variations of uncorrected offset from use-case to use-case, from project to project, or from preset to preset. (4) I tested the Behringer UMC1820 interface I've been using worry-free for years and it is offsetting tracks 40 samples EARLY -- i.e., it's printing what I'm inputting to it 0.8ms earlier than when it actually received the signal input relative to its playback of the monitor mix.

Here are some audio examples of recordings that were tracked with various degrees of offset in the range of 10ms. The double tracks include various combinations of tracks recorded with either the same or different offsets in this 10ms range.




The uncorrected offset I measured when sending an audio track of a rim shot (this started life as a MIDI track using an EZDrummer rim shot; just to be consistent, I then converted that to an audio track within Logic) out of my DAW to my III, out of its analog Out 1, back into the Instrument input, and back to my DAW over USB to re-record it was 460 samples. Sending this track to my III and back to my DAW without ever reverting to the analog world (i.e., the same scenario as "reamping" a DI with the Axe Fx III) was somewhere in the low 400-sample range, I believe -- when I saw that this variation was roughly the same as the error that the Behringer audio interface I'd used worry-free for years has in it, I just stopped worrying about it. I am using a 2012 Mac Mini on Catalina OS and its Core Audio drivers (i.e., a two-generations-old OS). That is less than 10ms (460/48000 = 9.6 ms) for the base offset of recording an analog input. So I've put a -460 sample recording delay in Logic and don't plan on revisiting the issue unless there is an update to the Axe III that requires me to.
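For anyone who wants to sanity-check those numbers, here's the same arithmetic as a tiny Python sketch. The 460-sample and 40-sample figures are just the ones from this post, and 48 kHz is the session rate I'm assuming:

```python
# Converting measured offsets from samples to milliseconds at a 48 kHz session rate.
SAMPLE_RATE = 48_000

def samples_to_ms(samples: int, sr: int = SAMPLE_RATE) -> float:
    return samples / sr * 1000.0

print(samples_to_ms(460))  # ~9.6 ms: the analog round-trip offset, hence a -460 sample recording delay in Logic
print(samples_to_ms(40))   # ~0.8 ms: the Behringer's "early" error, for comparison
```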

While nobody wants to give @Ugly Bunny another reason to start talking about latency again (cheers, UB!), his video on latency gives some context -- he measured Variax acoustic models to introduce 8ms of latency and found that undetectable when playing; when introducing tuning to electric guitar models on the Variax, the latency went up to 19ms and "you could feel it a LITTLE bit, but it's not gonna screw you up or anything". Plenty of folks have said there is a "feel" issue with using a Variax to play downtuned electric guitar, but I don't think I've ever read anyone saying "I just can't use it for recording because of that level of latency -- I can hear it in the final recorded product. The timing of the recordings I make doesn't feel/sound right." Indeed, I've never even seen anyone ask the question in terms of final musical output. I've never heard anyone complain about the rhythmic tightness of the Variax-using band Twelve Foot Ninjas, or say that Rabea Massad is less rhythmically tight when playing his Variax with tuning effects engaged compared to when he plays a non-digital guitar.

Ugly Bunny measured a Line6 G10S wireless system to introduce 5ms of latency on its own. While plenty of folks complain about using wireless for tonal issues, it's the rare bird that says they can feel a latency problem with modern devices. And again, anyone that does talk latency talks about it within the context of feel, not concerns over what the audience/DAW is hearing. Feeding that into an Axe Fx III gives 7ms total latency. How many metal bands that are supposedly the pinnacle of needing super tight rhythms run guitar into wireless to Axe Fx without complaint about timing?

Have you ever read/heard anyone worry about the rhythmic timing issue of recording guitars with a mic 6 feet away from a cab vs. close mic'ing it (which introduces ~6-7ms of "latency")?

Back to the issue raised by unFILTERed: latency based on signal chain in the Axe Fx is just a fact of life when using digital processors -- modelers, pedals, wireless systems, whatever. Many, many, MANY folks run the analog output from their modeler into the analog input of their audio interface and never sweat that the track is being laid down a couple ms later than what they actually played because they are using an Axe Fx, much less worry about whether that latency is changing based on turning blocks off/on in the patch. In other words, the only reason this worry ever even manifested is simply because once there is one inaccuracy, folks (me definitely included) start irrationally worrying about every possible inaccuracy, whether they are real-world relevant or not.

Again, not meant to be an apology, or to imply that this pretty basic level of audio interface functionality is an optional thing. Especially years after the product's release. Just giving some context in case there is another person out there similarly situated to me that has been holding off on buying a Fractal product over this issue -- take a basic measurement, put in the appropriate offset in your DAW, stop thinking about it.
 
We know that you can set a static reamp offset in the DAW's settings and this solves the problem for most. However, for a seemingly small subset of people, the reamp offset -- even when you account for it in the DAW -- is random enough that you have no idea where the reamp will land. Therefore the "just account for it in your DAW and then it's not that big of a deal" is not a viable workaround for those people--no matter how many times this solution gets presented. The amount the reamp is off is not always an insignificant amount, and when accumulating across multiple tracks, results in amateur sounding recordings.

As it is right now, for me (and a handful of others) reamping offset timing with the Axe-FX USB is unreliable enough to not want to do it. Due to the randomness of the offset, you have to manually nudge every track to be in time. I personally do not want to do that, especially if there is a better solution on the horizon. I'm probably going to have to resort to the SPDIF workaround (posts #88 & #89) if this issue doesn't get addressed soon.
 
We know that you can set a static reamp offset in the DAW's settings and this solves the problem for most. However, for a seemingly small subset of people, the reamp offset -- even when you account for it in the DAW -- is random enough that you have no idea where the reamp will land. Therefore the "just account for it in your DAW and then it's not that big of a deal" is not a viable workaround for those people--no matter how many times this solution gets presented. The amount the reamp is off is not always an insignificant amount, and when accumulating across multiple tracks, results in amateur sounding recordings.

As it is right now, for me (and a handful of others) reamping offset timing with the Axe-FX USB is unreliable enough to not want to do it. Due to the randomness of the offset, you have to manually nudge every track to be in time. I personally do not want to do that, especially if there is a better solution on the horizon. I'm probably going to have to resort to the SPDIF workaround (posts #88 & #89) if this issue doesn't get addressed soon.
Reading your posts, it appears you are reamping the same track multiple times in a single project. If the goal is to have all of those reamps play back essentially as identical copies of the original DI, even small variations would be problematic.
 
We know that you can set a static reamp offset in the DAW's settings and this solves the problem for most. However, for a seemingly small subset of people, the reamp offset -- even when you account for it in the DAW -- is random enough that you have no idea where the reamp will land. Therefore the "just account for it in your DAW and then it's not that big of a deal" is not a viable workaround for those people--no matter how many times this solution gets presented. The amount the reamp is off is not always an insignificant amount, and when accumulating across multiple tracks, results in amateur sounding recordings.

As it is right now, for me (and a handful of others) reamping offset timing with the Axe-FX USB is unreliable enough to not want to do it. Due to the randomness of the offset, you have to manually nudge every track to be in time. I personally do not want to do that, especially if there is a better solution on the horizon. I'm probably going to have to resort to the SPDIF workaround (posts #88 & #89) if this issue doesn't get addressed soon.
All good points, but to clarify for anybody reading this thread, the problem is not restricted to those who are re-amping. Even if one never re-amps, you'll still encounter this alignment problem whenever you record your Axe-FX when using it as an audio interface.

As a refresher, here are the details on the topic sixtystring and Watt are discussing:
https://forum.fractalaudio.com/threads/latency-compensation-measurement.177851/
 
I'm trying to establish a baseline so that I can even begin to have something accurate. When I try to establish a value to set in my DAW for latency offset, sometimes it's 100 samples off, sometimes it's 800 samples off, sometimes it's more, etc. This offset can be either early or late...so where do I even begin to pick a value to start with?

If I can't get a reliable starting point...it makes me question whether or not I can trust my reamps to be in time. The expectation (at least for me) is that the start of my reamp should perfectly line up with the start of my DI. If this is a misunderstanding--and we should expect reamps to maybe be off by a certain # of ms from the DI--then I believe that the manual should highlight this to help alleviate the confusion & frustration.
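If it helps anyone measure this the same way, here's a rough sketch of how the offset between a DI and its reamp can be measured in a script rather than by eyeballing waveforms. It assumes numpy and soundfile are installed, the file names are hypothetical, and it works best when the DI starts with a sharp transient (a click or rim shot) so the correlation peak is unambiguous:

```python
# Measure how far a reamped track lands from the original DI via cross-correlation.
import numpy as np
import soundfile as sf

di, sr = sf.read("di_track.wav")          # hypothetical file names
reamp, _ = sf.read("reamped_track.wav")
if di.ndim > 1:
    di = di.mean(axis=1)
if reamp.ndim > 1:
    reamp = reamp.mean(axis=1)

n = min(len(di), len(reamp), sr * 5)       # the first few seconds are enough
corr = np.correlate(reamp[:n], di[:n], mode="full")
lag = int(np.argmax(corr)) - (n - 1)       # positive lag = reamp arrives late

print(f"Reamp offset: {lag} samples ({lag / sr * 1000:.2f} ms)")
```

Running it on several reamps of the same DI would show directly how much the offset wanders from pass to pass.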
 
Some perspective-taking for this thread, because it's really easy to let a thread like this become a mindf$#% exercise that has folks (like me) irrationally worrying away their weekend rather than rationally thinking through a problem and testing to see to what extent it is a real-world problem for them personally. Not at all meant to be an apology or to diminish the importance of fixing this issue AT ALL, but this thread kept me holding off on pulling the trigger on a III for several months and it really shouldn't have. If I'd thought it through rationally, taken the numbers that are available in this thread and done some testing with the audio interface I was using (all of which is super easy to do), I would have quickly realized that to the extent there is a problem, it's easily fixable, and to the extent there is variation around the solution to that problem...that variation is at most the same as the variations that I've been a-okay with for my entire life as a home recording guitarist.

Executive summary of a weekend of testing/comparing: (1) I don't think I would be able to identify even 12ms of offset in separately recorded tracks (i.e., two different takes), though I can hear that kind of offset easily when re-recording a track. Listening to double-tracked guitars where neither has a 12ms offset, where both have a 12ms offset, and where one has no offset and the other has a 12ms offset, I'm pretty certain I'd never be able to pick out a difference amongst any of them. (2) Based on this, it's worth setting my DAW to correct by ~10ms, which is the average offset I get, just for peace of mind. (3) There is absolutely no way I or, I would wager, anyone else, is going to be able to hear an uncorrected offset of 1-2ms, so there is no reason to worry about whether or not there are any slight variations of uncorrected offset from use-case to use-case, from project to project, or from preset to preset. (4) I tested the Behringer UMC1820 interface I've been using worry-free for years and it is offsetting tracks 40 samples EARLY -- i.e., it's printing what I'm inputting to it 0.8ms earlier than when it actually received the signal input relative to its playback of the monitor mix.

Here are some audio examples of recordings that were tracked with various degrees of offset in the range of 10ms. The double tracks include various combinations of tracks recorded with either the same or different offsets in this 10ms range.




The uncorrected offset I measured when sending an audio track of a rim shot (this started life as a MIDI track using an EZDrummer rim shot; just to be consistent, I then converted that to an audio track within Logic) out of my DAW to my III, out of its analog Out 1, back into the Instrument input, and back to my DAW over USB to re-record it was 460 samples. Sending this track to my III and back to my DAW without ever reverting to the analog world (i.e., the same scenario as "reamping" a DI with the Axe Fx III) was somewhere in the low 400-sample range, I believe -- when I saw that this variation was roughly the same as the error that the Behringer audio interface I'd used worry-free for years has in it, I just stopped worrying about it. I am using a 2012 Mac Mini on Catalina OS and its Core Audio drivers (i.e., a two-generations-old OS). That is less than 10ms (460/48000 = 9.6 ms) for the base offset of recording an analog input. So I've put a -460 sample recording delay in Logic and don't plan on revisiting the issue unless there is an update to the Axe III that requires me to.

While nobody wants to give @Ugly Bunny another reason to start talking about latency again (cheers, UB!), his video on latency gives some context -- he measured Variax acoustic models to introduce 8ms of latency and found that undetectable when playing; when introducing tuning to electric guitar models on the Variax, the latency went up to 19ms and "you could feel it a LITTLE bit, but it's not gonna screw you up or anything". Plenty of folks have said there is a "feel" issue with using a Variax to play downtuned electric guitar, but I don't think I've ever read anyone saying "I just can't use it for recording because of that level of latency -- I can hear it in the final recorded product. The timing of the recordings I make doesn't feel/sound right." Indeed, I've never even seen anyone ask the question in terms of final musical output. I've never heard anyone complain about the rhythmic tightness of the Variax-using band Twelve Foot Ninjas, or say that Rabea Massad is less rhythmically tight when playing his Variax with tuning effects engaged compared to when he plays a non-digital guitar.

Ugly Bunny measured a Line6 G10S wireless system to introduce 5ms of latency on its own. While plenty of folks complain about using wireless for tonal issues, it's the rare bird that says they can feel a latency problem with modern devices. And again, anyone that does talk latency talks about it within the context of feel, not concerns over what the audience/DAW is hearing. Feeding that into an Axe Fx III gives 7ms total latency. How many metal bands that are supposedly the pinnacle of needing super tight rhythms run guitar into wireless to Axe Fx without complaint about timing?

Have you ever read/heard anyone worry about the rhythmic timing issue of recording guitars with a mic 6 feet away from a cab vs. close mic'ing it (which introduces ~6-7ms of "latency")?

Back to the issue raised by unFILTERed: latency based on signal chain in the Axe Fx is just a fact of life when using digital processors -- modelers, pedals, wireless systems, whatever. Many, many, MANY folks run the analog output from their modeler into the analog input of their audio interface and never sweat that the track is being laid down a couple ms later than what they actually played because they are using an Axe Fx, much less worry about whether that latency is changing based on turning blocks off/on in the patch. In other words, the only reason this worry ever even manifested is simply because once there is one inaccuracy, folks (me definitely included) start irrationally worrying about every possible inaccuracy, whether they are real-world relevant or not.

Again, not meant to be an apology, or to imply that this pretty basic level of audio interface functionality is an optional thing. Especially years after the product's release. Just giving some context in case there is another person out there similarly situated to me that has been holding off on buying a Fractal product over this issue -- take a basic measurement, put in the appropriate offset in your DAW, stop thinking about it.

The part of your post which is a response to me is not really related to what I was talking about. I never said I care about 1-2ms latency or that I can hear it. I was only pointing out that "if blocks are adding latency when they are added to the chain or activated, then a DAW cannot calculate and compensate for the latency of the Axe FX, and here lies the problem."

I am still not sure if this is the reason for the problem or not.
 