This is funny because I remember getting my first digital audio device at work and trying to align it so that +4dBu=0VU was 0dBFS. I mean a meter is a meter, right?
I figured out pretty quickly that about -15dBFS with my reference level was as high as I could go without clipping, and I didn't understand why for literally years. I just experimented until I stopped getting clipping. It makes me jealous of the knowledge that's out there today vs. when I was coming up in this stuff. If I knew half of what I know today, then I'd have known twice as much as I did back then (linearly).
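The arithmetic behind that mystery headroom is just a fixed dB offset set by the calibration point: a VU meter reads average level, so program peaks ride 15-20 dB above 0 VU, and if you pin +4dBu (0 VU) to 0dBFS those peaks have nowhere to go. A toy sketch (the function name and defaults are mine, not from any standard):

```python
def dbu_to_dbfs(level_dbu, ref_dbu=4.0, ref_dbfs=-20.0):
    """Map an analog level in dBu to dBFS, given one calibration point.

    The calibration fixes a single point (ref_dbu maps to ref_dbfs);
    everything else is the same linear offset in dB.
    """
    return level_dbu - ref_dbu + ref_dbfs

# A peak 15 dB over +4dBu nominal, with 0 VU aligned to 0dBFS:
hot = dbu_to_dbfs(19.0, ref_dbfs=0.0)    # +15 dBFS -> clipped, hard
# Same peak with the -20dBFS = +4dBu alignment:
ok = dbu_to_dbfs(19.0, ref_dbfs=-20.0)   # -5 dBFS -> fine
```

Which is exactly the "-15dBFS was as high as I could go" experience: with the meters matched 0-for-0, any peak more than ~15 dB over nominal was eating the entire digital scale.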
But what really sucks in my opinion is that you can get material that sits at -22dBFS, -20dBFS, -18dBFS, -15dBFS...and they're all technically acceptable, because no one bothered to specify a true analog-to-digital reference level standard. I never understood that. Hell, if memory serves we had a system that tried to use an analog-style representation for the metering, but it was scaled for something like -10dBFS, so you'd constantly clip. Then we had to record, play back, and convert the signal several times just to figure out where unity was, and then work out the offset for the bullshit meter.
I just set everything for -20dBFS = +4dBu these days and if it's off I live with it. Nobody seems to care anymore.
/old angry guy rant