VU Calibration


pittsburgh
I just finished assembling four VU buffers for a meter bridge. How should I calibrate them? I'm thinking I want +3 VU to be digital clipping.
 
They're VUs, so average-reading, not peak meters. On transient programme material it will be hard to judge peak level, so you need to allow far greater headroom than you might expect.

0VU should be calibrated to -16 dBFS.

 

Attachment: Analogue - Digital scale.JPG
MagnetoSound said:
0VU should be calibrated to -16 dBFS.
Where does this come from? I was aware of two accepted "standards": an "American" one with 0dBFS at +24dBu and a "European" one with 0dBFS at +18dBu. I must say that 0dBFS at +20dBu makes more sense, since most equipment runs on ±15V, which sets clipping at ca. +20dBu.
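For what it's worth, here's a quick back-of-envelope sketch of that rail-to-clip arithmetic. The 1.5V of swing lost to each rail is a typical op-amp assumption on my part, not a figure from this thread:

```python
import math

def dbu(v_rms):
    """Convert an RMS voltage to dBu (0 dBu = 0.775 V RMS)."""
    return 20 * math.log10(v_rms / 0.775)

# Assume a typical op-amp output swings to within ~1.5 V of each supply rail.
for rail in (15.0, 18.0):
    v_peak = rail - 1.5               # maximum undistorted peak voltage
    v_rms = v_peak / math.sqrt(2)     # RMS of a full-swing sine
    print(f"±{rail:.0f} V rails: clipping at about {dbu(v_rms):+.1f} dBu")
```

Running it gives roughly +21.8 dBu on ±15V and +23.6 dBu on ±18V, consistent with the "ca. +20dBu" figure above.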
 
Hi,

For a standard VU meter, set your sig gen to, say, a 1kHz sine and set the o/p voltage to read 1.23V RMS.

This will give the "traditional" 0VU at +4dBm (where 0dBm = 1mW into 600R = 0.775V).

It's really +4dBu in this situation, as most of our modern-day signal lines terminate in a Hi-Z (i.e. >10k) load.

...but it's good enough for most recording purposes, and after all a VU is a VU and not a bargraph or digital PPM!

Useful for many applications though.
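If you'd rather compute that calibration voltage than look it up, the conversion is just 0.775V scaled by 10^(dBu/20). A minimal sketch:

```python
def dbu_to_vrms(level_dbu):
    """RMS voltage of a sine at the given dBu level (0 dBu = 0.775 V RMS)."""
    return 0.775 * 10 ** (level_dbu / 20)

print(dbu_to_vrms(4.0))   # ~1.228 V RMS -> Mark's "1.23V" 0VU calibration tone
print(dbu_to_vrms(0.0))   # 0.775 V RMS -> the 0 dBu reference
```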

Mark
 
pittsburgh said:
So I should set my signal generator to -18 or -24 and set the meters to 0?
Again, context... What equipment are the meters connected to?
As we've seen, it seems there are 3 different "standards"; you need to know what analog level corresponds to your 0dBFS.
Interestingly enough, by checking some websites, it seems that the +20dBu standard is being adopted by more and more companies (360 Systems and Tascam, for example), which seems a good thing to me. Lavry continues to adhere to the most stringent standard of +24dBu, but in fact their units are adjustable over a range of about 10dB.
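To see where the various 0VU figures in this thread come from, here's a small sketch that works out where a conventional +4dBu 0VU tone lands in dBFS under each alignment (the list is just the alignments mentioned here, not an exhaustive survey):

```python
# 0 dBFS reference level in dBu for each alignment mentioned in this thread
alignments = {
    "SMPTE-style": 24,
    "'+20 dBu' (360 Systems, Tascam)": 20,
    "EBU-style": 18,
}

VU_ZERO_DBU = 4  # conventional 0 VU = +4 dBu

for name, clip_dbu in alignments.items():
    vu_zero_dbfs = VU_ZERO_DBU - clip_dbu  # where a +4 dBu tone lands in dBFS
    print(f"{name} (0 dBFS = +{clip_dbu} dBu): 0 VU sits at {vu_zero_dbfs} dBFS")
```

That gives -20, -16 and -14 dBFS respectively; note that the -16dBFS figure quoted earlier drops straight out of the +20dBu alignment.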
 
Mark makes the correct point regarding VU meter calibration in the conventional manner, referenced to +4dBu (see diagram), and this would be the right place to start if your meters are going onto console outputs, or analogue recorder inputs. With reference to 0dBFS however, which was the original question, matters are a little more complicated ...


abbey road d enfer said:
MagnetoSound said:
0VU should be calibrated to -16 dBFS.

Where does this come from? I was aware of two accepted "standards": an "American" one with 0dBFS at +24dBu and a "European" one with 0dBFS at +18dBu. I must say that 0dBFS at +20dBu makes more sense, since most equipment runs on ±15V, which sets clipping at ca. +20dBu.


I found that diagram here some time ago, whilst doing some research on the subject.

From reading this, it's actually pretty arbitrary: there is no rigid standard, opinions vary, and it has a lot to do with your line of work - are you primarily tracking, mixing, or mastering? There is a ballpark within which to calibrate, though, and exactly what level you choose ought to take into account not only your main area of work but, as Abbey points out, your analogue headroom as well.

I do know one thing for sure, though: you certainly should not be anywhere close to 0dBFS at +3VU, or you will be hitting overs pretty much continuously.
 
A standard seems to have developed in the movie industry, also adopted in the USA by National Public Radio and some other broadcast outlets: 0 VU = -20dBFS when measured with mid-frequency tone (e.g. 440Hz or 1kHz). With a 24-bit system that'll leave plenty of headroom and still be good and quiet.

Peace,
Paul
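A bit of arithmetic behind "plenty of headroom and still good and quiet" (a sketch; the 6.02dB-per-bit figure is the usual ideal-quantisation estimate, so real converters will do somewhat worse):

```python
BITS = 24
# Ideal sine-to-quantisation-noise ratio for a linear PCM system
dynamic_range_db = 6.02 * BITS + 1.76

ZERO_VU_DBFS = -20
headroom_db = 0 - ZERO_VU_DBFS                        # from 0 VU up to clipping
floor_below_0vu_db = dynamic_range_db + ZERO_VU_DBFS  # from 0 VU down to the floor

print(f"Headroom above 0 VU: {headroom_db} dB")
print(f"Quantisation floor below 0 VU: ~{floor_below_0vu_db:.0f} dB")
```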
 
I'm surprised that nobody has stated one other requirement... that the tone be a SINE.

(You'll find that the readings change if you're using a square, sawtooth etc.)

I generally set 0VU = -20dBFS @ 1kHz sine.

Set +3VU to 0dBFS, and you'll never see the needle twitch very much at all before you're clipping.
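As a sanity check on those two calibrations (a sketch that assumes the meter scale tracks dB exactly, which a real VU only approximates on steady tone):

```python
def vu_to_dbfs(vu_reading, zero_vu_dbfs):
    """dBFS of a steady sine that reads `vu_reading` on a meter whose
    0 VU point is calibrated to `zero_vu_dbfs`."""
    return zero_vu_dbfs + vu_reading

# 0 VU = -20 dBFS: a needle pinned at +3 is still 17 dB below clipping
print(vu_to_dbfs(+3, -20))   # -17 (dBFS)

# +3 VU = 0 dBFS (the original proposal): 0 VU is only 3 dB below clipping
print(vu_to_dbfs(0, -3))     # -3 (dBFS)
```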

Keith
 
SSLtech said:
I'm surprised that nobody has stated one other requirement... that the tone be a SINE.

Actually, Mr. Burnley did.

Personally, I like to calibrate all my gear using triangle waves, because I am such a pervert.  ;)

 
I built a handy 2U rackmount VU box for easily monitoring varying levels - just the VU meter buffer with a +12dB gain stage before it, and a switched attenuator in front of that. The attenuator had 5 positions with 6dB steps between each.

So if the centre step is 0dB (0VU = +4dBu), the lower two steps make it less sensitive (i.e. 0VU = +10dBu), the lowest position being 0VU = +16dBu (with VU full scale at +19dBu); the two settings above 0dB then put the 0VU point at -2dBu and -8dBu. This lets you patch into anywhere between low and blistering line levels and use the "useful" part of the VU meter scale!
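For reference, the five positions work out as below (a sketch of the arithmetic only; the "+3 VU = full scale" convention is my assumption for the top-of-scale figures):

```python
STEP_DB = 6       # attenuator step size
ZERO_VU_DBU = 4   # centre position: 0 VU = +4 dBu

# Positions from most gain (most sensitive) to most attenuation (least sensitive)
for pos in (+2, +1, 0, -1, -2):
    zero_vu = ZERO_VU_DBU - pos * STEP_DB  # more gain -> 0 VU at a lower line level
    print(f"position {pos:+d}: 0 VU = {zero_vu:+d} dBu, "
          f"full scale (+3 VU) = {zero_vu + 3:+d} dBu")
```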

...I also have one in my toolbox which continues this idea with another 6dB range on either side - battery powered and ruggedised, and it has been extremely useful!

Mark
 
The digital peak meter directly tells you when you run out of bits.

The VU meter evolved in systems which did not "hit the wall", did not generally clip, which just rose slowly from 0.5%THD to a mellow 5%THD.

This is true for magnetic tape: soft overload.

This is not true for film-sound, and film recordists used neon peak indicators.

One notable exception: AM radio clips violently at -100% modulation. Yet the custom was to set zero VU just 10 dB down from clipping. Note though, that operators rarely touched zero VU. Also S/N was only 40-50dB, thus occasional peak-clip had to be balanced against being lost in hiss. And broadcast is "live"; if you think you heard a clip, you generally can't go back and check, you go on with your life. And ultimately this 10dB lead on unprotected transmitters was unsatisfactory, and limiters came into general use.

When radio consoles were repurposed for recording (where you may hear a clip many times while producing), they generally added another 4dB of headroom (nominal level dropping from +8dBm to +4dBm). This avoided a lot of clipped peaks (and there was generally ample headroom above nominal "max" level).

Running 16dB between zero VU and clipping makes clips very rare, a few times a day.

To have NO! clips, 18 or 20dB is needed.

Which all proves that a VU meter is NOT a tool for clip-protection, unless this is 1939 all over again and you have nothing better (at your budget).

If you "must" run a VERY HOT digital level.....

Run normal signals right "at" digital clipping. "OL" light flashing every beat.

Trim VU to kick just into the red.

Run different "normal" signals. Vocals, drums, band, orchestra. Each will need a different "lead". Compromise.

No further accuracy is warranted. The peak light has little correlation to ear-loudness. The VU sorta reflects ear-loudness but is designed to be lazy about short peaks.

-OR- .... accept that "VU calibration" is meaningless today. Adjust for mid-scale readings in loud passages of correctly-recorded material, and just use that as a guide to "approximate Loudness".
 
-16dBFS = 0VU was used on the original Tascam DA-88 machines. They were later modified to EBU -18.
There are only two standards (Sydec please note!!): SMPTE or EBU, and these are pretty much based on whether you are using ±18V or ±15V for the analog circuitry.
Using VU meters in digital-land misses the point. They were barely adequate for old-style analog, which is why the Germans developed 10ms rise time and the BBC developed 5ms rise time peak-reading meters.
The AES peak meter standard is 100µs rise time (or something like that). What's a VU meter? 300ms up and down - and hardly any of those long-arc ones managed even that.
This is not a Europe-bashes-USA discussion, but in modern recording facilities the VU is just a means of setting digital reference levels, not a tool for program monitoring and measurement.
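To put some numbers on that rise-time difference, here's a rough simulation contrasting VU-style and PPM-style ballistics on a short burst. Single-pole envelope followers are a crude stand-in for the real ANSI/IEC ballistics, and the time constants are just the ones quoted above:

```python
import math

FS = 48000  # sample rate in Hz

def follower(signal, tc_s):
    """Crude envelope follower: one-pole smoothing of the rectified signal."""
    a = math.exp(-1.0 / (FS * tc_s))
    env, out = 0.0, []
    for s in signal:
        env = a * env + (1 - a) * abs(s)
        out.append(env)
    return out

# A 10 ms full-scale 1 kHz burst followed by silence, 100 ms total
burst = [math.sin(2 * math.pi * 1000 * n / FS) if n < FS // 100 else 0.0
         for n in range(FS // 10)]

vu  = follower(burst, 0.300)   # ~300 ms: VU-style averaging
ppm = follower(burst, 0.010)   # ~10 ms: DIN PPM-style rise

print(f"VU-style reading on the burst:  {max(vu):.3f} of full scale")
print(f"PPM-style reading on the burst: {max(ppm):.3f} of full scale")
```

The VU barely registers a transient that the PPM reads an order of magnitude higher (and that a sample-peak meter would show at full scale), which is exactly why a VU can't protect you from overs.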
 