What Is Latency? And How to Deal With It!

  • Latency is the time between the signal and the response. For example, the time between me clapping my hands and you hearing it
  • Latency has internal and external factors. For example, the room we are in, and the distance between us
  • Latency is inevitable, but can be limited
  • Only some latency is under the control of the user
  • Controlling latency is a balance between low delay and stable performance
  • Read more for tips on reducing latency

If you’ve ever lost a shootout because of lag in an online game, or stood at the edge of a cliff and shouted to hear your own echo, you’ve experienced the phenomenon of latency. But when latency affects performance or reliability in our online recordings, it can be frustrating and even prohibitive, especially as latency can change in an online setting. Let’s look at the causes of latency, which of them we can control, and which we cannot.

What Is Latency?

In audio, latency refers to the delay between when a signal is sent and when the audio is heard in the monitors. Latency is everywhere, even in live situations between a speaker and a listener in the same space. To understand why, we’ll need to look at a little math:

there's a latency between speaking and hearing

The speed of sound is 343 meters per second. (You don’t have to memorize this; we’ll use meters because it illustrates the point either way.) This means that if you are standing one meter away from someone you are speaking to, there is a small delay between when you say something and when the listener hears it.

In our example, we would take the distance (one meter) and divide it by the speed of sound (343 m/s) to get the latency between the speaker and the listener: about 3 milliseconds (ms), or three one-thousandths of a second. Increasing the distance increases the delay, so a 2-meter distance produces about a 6 ms delay.
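As a quick sketch of that arithmetic, here is the same calculation in Python (the distances are just example values):

# Acoustic delay: the time for sound to cross a given distance.
SPEED_OF_SOUND_M_S = 343

def acoustic_delay_ms(distance_m):
    """Return the delay, in milliseconds, for sound to travel distance_m meters."""
    return distance_m / SPEED_OF_SOUND_M_S * 1000

print(round(acoustic_delay_ms(1), 1))  # ~2.9 ms at one meter
print(round(acoustic_delay_ms(2), 1))  # ~5.8 ms at two meters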

The delay, while perceptible to the ear and the body, is not perceived by the brain as a mismatch between the eyes and the ears. Even though light travels much faster than sound, our brains are designed to compensate for up to roughly 15-30 ms of latency to keep our ears in sync with our eyes.

Musicians, however, will be affected by timing issues at lower latencies than that, because in addition to our hearing, our bodies feel sound, and our timing in music is based on that feel. Actors, too, can find it challenging to keep pacing in a scene when acting together if the delay exceeds 20 ms.

Latency on Your Computer

Latency happens. Every process that we add to the audio chain adds latency, though often in minuscule amounts. Some we can control. Others we cannot.

Audio Signal Flow

The chart above illustrates the signal flow of audio when recording on your computer. Each stage in the process introduces a little latency. From the voice to the microphone, we have the normal delay from distance, which is likely about 1 or 2 milliseconds with proper microphone positioning.

The cable carries the signal as electricity, introducing only a trivial amount of delay, as the signal now travels at close to the speed of light instead of the speed of sound.

Your audio interface and its drivers add some processing latency on the way to the RAM and CPU. RAM speed and size can cause bottlenecks, as can the speed of your CPU and the number of processor cores. And finally, the distance of the monitors from your ears can add some latency.

Any additional processes, including your DAW (your recording and editing software), the number of tracks recorded or playing, plugins and additional background processes on your computer can also add to latency.
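To get a feel for how these small contributions stack up, here is a rough sketch in Python. The per-stage figures are assumed, illustrative values, not measurements:

# Assumed, illustrative per-stage latencies (ms) along the recording chain.
stages_ms = {
    "voice to microphone (distance)": 1.5,
    "interface, converters and drivers": 3.0,
    "DAW buffer (256 samples at 48 kHz)": 5.3,
    "plugins and background processes": 2.0,
    "monitors to ears (distance)": 2.9,
}

total_ms = sum(stages_ms.values())
for stage, ms in stages_ms.items():
    print(f"{stage:<38} {ms:>5.1f} ms")
print(f"{'total':<38} {total_ms:>5.1f} ms")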

How to Reduce Computer Latency

Reducing latency requires striking a balance between acceptable delay and computer performance. Increasing performance means you can use more plugins and tracks at once, at the cost of a higher delay due to larger buffers. Pushing latency too low can quickly lead to dropouts, clicks and other artifacts when listening or recording. The goal is to find a sweet spot between the two.

1. Direct Monitoring

Direct monitoring on a Focusrite Scarlett

Many modern audio interfaces and USB microphones feature direct monitoring. Direct monitoring is not the same as input monitoring in your DAW. Input monitoring lets you hear the signal after processing, latency included. With direct monitoring (also known as zero-latency monitoring), the audio signal is routed straight back to the monitors or headphones without processing. This means you will hear your voice without any effects or processing, and with no perceivable delay.

To be fair, direct monitoring doesn’t reduce latency on your computer. It simply bypasses all of the processes that produce latency during recording.

2. Turn off Unnecessary Background Processes

CCleaner

Your computer runs a lot of background processes related to the operation of the OS and other installed software. Some applications, like Chrome, continue to run in the background even when you have closed them. These programs continue to tax your system resources.

A program like CCleaner allows you to see which programs and processes run at startup and disable unnecessary ones, freeing up precious system resources so that more audio data can be processed at once.

3. Change Your DAW Buffer Settings

Your DAW’s audio buffer size (or block size) controls how much audio data is held before it is processed. Larger buffers allow more processing time and stability at the cost of latency. Smaller buffers decrease latency, but may also decrease performance.

It is common practice to use lower buffer settings when recording and use higher settings when editing, allowing the full processing power of the DAW for use with plugins.

A typical setting for recording is 128 or 256 samples; 512 to 1024 and higher is often used when editing.
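The delay a buffer adds is simply its length in samples divided by the sample rate. A quick Python sketch (assuming a 44.1 kHz session) shows why those numbers matter:

# Latency contributed by an audio buffer: samples / sample rate.
def buffer_latency_ms(buffer_samples, sample_rate_hz):
    """Return the delay, in milliseconds, added by a buffer of the given size."""
    return buffer_samples / sample_rate_hz * 1000

for size in (128, 256, 512, 1024):
    print(f"{size:>4} samples at 44.1 kHz: {buffer_latency_ms(size, 44_100):.1f} ms")
# 128 -> 2.9 ms, 256 -> 5.8 ms, 512 -> 11.6 ms, 1024 -> 23.2 ms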

4. Check Your Sample Rate

Your sample rate and bit depth directly affect the amount of data being processed. However, as long as your processor can handle the load, increasing your sample rate to 48 kHz (studio quality) gives roughly 8% less buffer latency than recording at 44.1 kHz (CD quality), for the same buffer size.

This seems counter-intuitive, as the file size and packet size will increase. However, because each buffer now covers a shorter span of time, it is filled and passed on sooner, which decreases latency. It does, however, come at the cost of larger files and more data for the processor to handle.
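Using the same buffer-latency formula as above, here is a comparison of the two sample rates at a fixed 256-sample buffer (an assumed, typical recording setting):

# Same buffer size, different sample rates: higher rate, lower buffer latency.
buffer_samples = 256
for rate in (44_100, 48_000):
    print(f"{rate} Hz: {buffer_samples / rate * 1000:.2f} ms")
# 44100 Hz: 5.80 ms
# 48000 Hz: 5.33 ms  (roughly 8% lower for the same buffer)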

5. Remove or Bypass Plugins

Plugins are great for processing your audio to achieve the desired sound, but they are a tax on system resources depending on the size and number of plugins used. Plugin suites, especially, can use a lot of system resources as they often are performing the functions of many plugins. Remove or bypass plugins when recording to reduce latency from processing the audio.

6. Faster System Components

If changing the DAW settings doesn’t improve latency, it may be time to upgrade either the components in your computer or the computer itself.

Faster RAM and a faster CPU will help decrease latency by processing data more quickly. More CPU cores increase the amount of data that can be processed at once by splitting the work across two or more processors.

For more information on system specifications for podcasting, check out What’s the Best Computer for Podcasting & Audio Production?

Latency in Internet Recording

Audio latency while recording over the internet is a compounded problem. In addition to the latency caused by your local machine and your guest’s computer, there is latency from the geographic distance between you and from the number and capacity of any junctions along the route. Many of these limitations are beyond our control. However, there are practices we can implement to reduce the latency we experience.

Here are the best online call recording apps for podcasters.

Looking at Internet Latency

Let’s look at a best-case scenario. We’ll assume that we’ve optimized the host’s and guest’s computers to around 10 ms of latency each. Already we have a compounded latency dilemma, as the combined latency of the two systems is around 20 ms. This puts us right at the cutoff of perceptible latency before we’ve even added the latency of the network. And more than likely, unless our guests work in audio, their latency will not be optimized.

Distance

One of the biggest factors affecting latency is distance. There’s not much we can do about it, as we are limited by the speed of light. Using our example above, let’s say I am the host in Buffalo, NY, and the guest I am interviewing is in Scotland. The guest is about 5,360 kilometers (3,331 miles) from me. The speed of light, rounded for easy math, is about 300,000 kilometers per second. Even if our computers were directly connected over that distance, it would add 17.9 ms to our latency, putting us at about 37.9 ms of delay, well above a noticeable latency period.
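Here is that back-of-the-envelope calculation as a short Python sketch, combining the assumed 10 ms of local latency on each machine with the propagation delay over the distance:

# Propagation delay over distance, plus assumed local latency on each end.
SPEED_OF_LIGHT_KM_S = 300_000          # rounded, as above
distance_km = 5_360                    # Buffalo, NY to Scotland (approx.)

propagation_ms = distance_km / SPEED_OF_LIGHT_KM_S * 1000
total_ms = 10 + 10 + propagation_ms    # host machine + guest machine + distance
print(f"Propagation: {propagation_ms:.1f} ms, total: {total_ms:.1f} ms")
# Propagation: 17.9 ms, total: 37.9 ms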

Hops

In reality, there is no scenario in which recording online connects directly from computer to computer. If we use Zoom to record the conversation, we connect to a server that acts as a junction for our conversation.

If we were to connect to the same junction, the distance between Dundee, Scotland, and the junction would be added to the distance from Buffalo to the junction, creating even more latency. But in practice, Zoom would connect us to separate junctions that then communicate with one another through yet another master junction, so our delay increases even more.

In addition to the hops introduced by your online communication tool, there are hops through the junctions that handle the data flow of the internet itself, and there are several of them.

Ping Plotter

The data above shows there are ten hops between my laptop and Google.com. Each hop adds its own latency to the transmission of data, and there is significant latency at the local junction of my cable provider at row 3. This gives me a total round-trip latency of 26 ms. Despite the lag from my provider, a 25 to 35 ms response is about average for a ping. All of these junctions, however, add latency along the path of the data.
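If you want a rough measurement of your own round-trip latency without a tool like Ping Plotter, a minimal Python sketch like the one below times a TCP handshake to a host (Google is used here purely as an example). A TCP connection takes roughly one round trip, so this approximates a ping:

import socket
import time

def tcp_round_trip_ms(host, port=443, attempts=5):
    """Approximate round-trip latency by timing a TCP handshake."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass  # connection established; close it immediately
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples)  # the best attempt filters out transient spikes

print(f"Round trip to google.com: {tcp_round_trip_ms('google.com'):.1f} ms")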

Internet Latency Solutions

There isn’t a lot we can do to control the number or quality of the junctions in our data path. There are, however, some workarounds that improve the efficiency of the data we send, making it easier for it to get through.

1. Cable Up!

If you connect to your router via Wi-Fi, you can speed up your data transmission by connecting with an ethernet cable instead. Ethernet carries data at significantly higher and more consistent speeds. The latency gain will be small, about 1-5 ms, but every ms counts. An ethernet cable also prevents data loss due to interference from other devices operating in the area.

2. Reduce Internet Usage While Recording

Downloads, streaming media, browser usage and other internet-related services compete for the limited resources of your internet connection. Freeing up these resources allows your modem to dedicate more bandwidth to your online meeting session.

3. Kill the Video (and Screen Sharing)

Video adds to the amount of information each packet of data carries in and out of your session. Turning off the video on both ends of an online meeting often significantly improves the quality of the audio, with fewer dropouts and less latency. The same goes for utilities like screen sharing. Text-based messengers, since they deal with much less data, should not significantly impact your performance.

4. Call Your Internet Provider

If you, like me, experience a slowdown from your internet provider, contact them and let them know about the issue. Most often it won’t provide an immediate solution, but it’s worth making the call anyway, as many companies will not take action until a significant number of cases stack up. Some providers may recommend increasing your bandwidth. This may improve your ability to stream video and audio at the same time, but it will not solve latency problems caused by a bottlenecking junction.

Latency: So What Did We Learn From All This?

Latency, like noise and reverb, is something we cannot eliminate entirely. There are a great number of factors that affect latency. We only covered a fraction of them here, and they stack up the more processes are involved.

Due to the nature and the design of the internet, there is currently no way to reduce the latency over the internet to the point where it is imperceptible. The best that we can do is reduce the latency in our own equipment as much as possible and limit the amount of data we send to find an acceptable middle ground between performance and latency.

Need More Help?

Whether it’s more audio and tech-based stuff you’re struggling with, or have questions around content and audience growth, we can work with you in Podcraft Academy. There, you’ll find all of our courses, resources, and get access to our weekly live Q&A sessions. It’d be great to see you in there!
