knoll
rsf1977 said:
I was reading somewhere that higher quality s/pdif cables have less jitter? Is there any truth to that?

steverosburg said:
Not unless you're using such a poor quality cable that you're actually getting bit errors. That's the same argument as a fancy HDMI versus a cheap one -- it's all digital, and you either get a perfect picture or bit errors, which are generally very noticeable.

Clawfinger said:
This. Digital either works or not. That's it.

I used to think the same, but now I'm not so sure anymore. I've learned more about digital connections since Cliff's statement about spdif quality. I must state here that I'm no spdif expert, so the following may or may not be correct.
"Digital either works or not." While that is true, there can still be jitter issues, i.e. a bit transition happening too soon or too late.
Jitter arises when the clock is not perfectly in sync, and no real-world clock ever is. With spdif it's supposedly even worse, because the clock signal is embedded within the actual data signal: the receiver must extract the clock from that signal, and that isn't easy to do accurately. A separate clock signal would be better. <wondering why the axe doesn't have a word clock output?>
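To illustrate what "the clock is embedded in the signal" means: S/PDIF uses biphase-mark coding, where every bit cell starts with a transition and a 1 bit adds an extra transition mid-cell, so the receiver can recover timing from the edges alone. A minimal sketch (just the encoding idea, not a full S/PDIF implementation):

```python
# Biphase-mark coding sketch: two half-cell levels per input bit.
# Every cell begins with a transition (this is what carries the clock);
# a '1' bit transitions again in the middle of the cell.

def biphase_mark_encode(bits, level=0):
    """Encode a list of 0/1 bits into half-cell line levels."""
    out = []
    for b in bits:
        level ^= 1            # transition at the start of every cell
        out.append(level)
        if b:                 # a '1' bit transitions again mid-cell
            level ^= 1
        out.append(level)
    return out

encoded = biphase_mark_encode([1, 0, 1, 1, 0])
print(encoded)  # two line levels per bit
```

Because the guaranteed cell-start transitions are the clock, any noise that shifts those edge times in the receiver's clock-recovery circuit is exactly the jitter being discussed.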
An spdif cable can also cause jitter. Well, actually it's not the cable itself but outside RF (and the position of the moon) affecting the signal. It probably won't change the intended bit value from zero to one, but it can change the point in time where the transition from zero to one happens (zero turns into one too soon or too late, not in sync). That is jitter, right there in the cable. So shielding is important in an spdif cable. I don't recommend buying any of the hyped best-of-all cables. I use a diy spdif cable made from normal good quality tv cable + rca connectors. Tv cables should have pretty good shielding, they are 75 ohm, and they're cheap.
So far I haven't done any jitter measurements on my gear to get into more details.