Latency can occur anywhere along the delivery chain -- first, middle, or last mile. The first mile, the connection between your origin server (where you store and serve your video files) and the Internet, can add significant latency. Maybe your bandwidth (or your Network Interface Card) fills up with too much traffic. Or maybe your server isn't tuned for small-object delivery -- for example, persistent connections aren't enabled -- so each user session requires too many round trips to deliver the best experience.
Of course, the middle mile (the Internet itself) can add an unforeseen amount of latency. There's no way to predict how much traffic your video will be competing with (Netflix, for instance, accounts for more than 30% of all Internet traffic during prime time). Finally, there's the last mile, the connection between users and the Internet (through their ISP). Again, a host of issues can creep into delivery, making your video experience less than perfect: throttling by the ISP, computer issues (such as low memory and hard disk space), and home network connectivity.
So how do you identify where latency is occurring? The simple solution is to use server-side and player-side analytics. But there are other things you can do to help ensure that your video experience is latency-free. I've provided a few below:
1. Deliver using scalable, elastic infrastructure. Your video origin (even across multiple data centers) is still a bottleneck. Using a third-party delivery infrastructure like a CDN ensures that you can handle not only your projected traffic but also sudden spikes. And always make sure that persistent connections are enabled on your origin -- they create a better delivery environment for small chunked files like HTTP video segments.
2. Serve video using multiple bitrates (adaptive bitrate, or ABR). This can cut down on buffering, as the player switches bitrates to match the viewer's available bandwidth. This is especially helpful in mitigating last-mile latency.
3. Ensure a fast start time. If you are using a cache with chunked content (whether your own or via a CDN), prewarm your cache before an event (or right after publishing) to ensure that video content is in cache when users request it. You can do this by requesting your video from multiple regions to help fill the cache.
4. Improve the performance of everything around your video. People aren't watching your video in just your player. It's embedded in a website or application. When the content surrounding your video is slow, it can delay your video's start time, especially if your video isn't the first thing that loads on your Web page.
Latency isn't a mythical unicorn. It's real -- and every video experience suffers from it in some way. But learning how to identify and mitigate it can help ensure that your users have the best video experience possible.