Getting your pro audio to behave and stay locked with your video can sometimes be a daunting task. While the video quality of smartphones and inexpensive cameras improves with every generation, audio is sorely lagging behind. If you’re going to stream musical performances that are important to you or that need a professional feel, you’re going to want to capture that audio directly, and likely separately, from your video. Here are some tips to put it all back together before you stream it to the teeming masses.
Step one in your audio process should be to set up as if you were tracking in your studio. If possible, get everything coming into an audio interface that allows for latency-free monitoring and mixing. While I use the Universal Audio Apollo series, it’s not the only choice. Focusrite, RME, MOTU and many other top companies offer interfaces with accompanying mixing software that lets you create a live mix without a DAW.
If your audio interface includes preamp simulation or plugins that will help you get a studio sound in a live situation, now’s the time to enable them. When I’m tracking acoustic guitars, I’m likely going to use a compressor, EQ, and some ambience plugins - so why not use them live if the option is available? Many audio interfaces allow you to put plugins on your mix with no discernible latency. If you don’t have an interface that does this, you can always open up your favorite DAW, enable input monitoring, load your plugins in a mixer window, then route the audio from that computer to ANOTHER computer which handles the live stream. Don’t worry about the latency; we’ll be monkeying with it on the streaming side anyway.
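To get a feel for how much delay a DAW monitoring chain adds, the buffer math is simple: one buffer of audio takes buffer-size-divided-by-sample-rate seconds to fill. A rough sketch (the buffer size and sample rate below are illustrative assumptions, not measurements from any particular interface):

```python
# Rough round-trip latency math for a DAW monitoring chain.
# The numbers below (256-sample buffer, 48 kHz) are illustrative
# assumptions; plug in your own interface's settings.

def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    """One buffer's worth of delay, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000.0

# A DAW adds at least one input buffer and one output buffer;
# converter and plugin delay are ignored here for simplicity.
sample_rate = 48_000
buffer = 256
round_trip = 2 * buffer_latency_ms(buffer, sample_rate)
print(f"{round_trip:.1f} ms")  # 10.7 ms for 256 samples at 48 kHz
```

Real round-trip figures will be a little higher once converters and plugin processing are counted, but this puts the software-monitoring number in perspective.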
Open Broadcaster Software, or OBS, is extremely popular not only because it’s free, but also because it’s immensely powerful. Apps like OBS let you stream to a variety of services while combining multiple cameras, angles, audio sources and graphics before the stream reaches the site you’re using. Think of this kind of app as a software version of a big video mixing board. You put everything into your streaming app, choose your video and audio sources (which you can switch on the fly) and you’re off to the races.
Now that you’ve added in your devices (like your camera capture device and audio interface) you’ll want to adjust the sync offset for each device. The offset can be entered as a negative or positive value, depending on whether the audio is earlier or later than the video. Even if you’re using tons of plugins and have some latency, some video capture devices may be slower than your audio interface due to encoding and compression happening in the background, so don’t assume your audio is always behind!
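The sign logic is easy to get backwards, so here’s a small sketch of the arithmetic. One common convention (assumed here; check your streaming software’s documentation, since conventions differ) is that a positive offset delays the audio, which is what you want when the audio arrives early:

```python
# Sketch: turn a measured audio lead/lag into a sync-offset value.
# Sign convention assumed here: a positive offset DELAYS the audio.
# Verify against your streaming software's docs before relying on it.

def sync_offset_ms(audio_time_s: float, video_time_s: float) -> int:
    """Offset to enter so the audio lands on the video.

    audio_time_s / video_time_s: when the same event (e.g. a snare
    hit) occurs in each stream, in seconds on a common clock.
    """
    # Audio early (audio_time < video_time) -> delay it: positive offset.
    # Audio late  (audio_time > video_time) -> advance it: negative offset.
    return round((video_time_s - audio_time_s) * 1000)

print(sync_offset_ms(10.250, 10.320))  # audio 70 ms early -> 70
print(sync_offset_ms(10.400, 10.320))  # audio 80 ms late -> -80
```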
You can test this out in one of two ways. The first: enable a live stream and have someone watch it, then report whether your audio is a little ahead of or behind the video. I usually use a snare drum as a ‘slate,’ since it’s a simple percussive sound that’s easy to judge against the video. If you don’t have the manpower to monitor the feed, you can instead record a test run with OBS or your favorite streaming software. The recorded test run uses the same audio/video sync values, so if it looks good in the recording, it will likely look good to the world on the streaming side.
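If you’d rather not eyeball the slate, you can measure the lag numerically by cross-correlating the transient in two tracks. A minimal sketch, assuming NumPy is available; the signals below are synthetic stand-ins for audio you’d extract from a real test recording:

```python
# Sketch: estimate an offset by cross-correlating a "slate" transient
# (like the snare hit) in a reference track and a delayed track.
# The signals here are synthetic stand-ins for real recorded audio.
import numpy as np

def estimate_offset_samples(reference: np.ndarray, delayed: np.ndarray) -> int:
    """Lag (in samples) at which `delayed` best lines up with `reference`."""
    corr = np.correlate(delayed, reference, mode="full")
    return int(np.argmax(corr)) - (len(reference) - 1)

rate = 48_000
n = 4_800                         # analyze a 100 ms window around the slate
click = np.zeros(n)
click[100:110] = 1.0              # a sharp transient, like a snare hit
late = np.roll(click, 240)        # the same click, 240 samples later

lag = estimate_offset_samples(click, late)
print(f"{lag} samples = {lag / rate * 1000:.1f} ms")  # 240 samples = 5.0 ms
```

In practice you’d slice a short window around the slate hit from each track rather than correlate whole recordings, since the brute-force correlation gets slow on long signals.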
This is only one small aspect of professional AV broadcasting, but it’s an important one, and it’s easily addressed before you stream. Now stop using that built-in camera mic and start using that expensive audio interface you’ve got in your studio!