Deep Inside The Internet Backbone


Speaking at video-industry events this year, Charles Kraus noticed the same question popping up at each one: Just how many over-the-top offerings exist?

So Kraus, the senior product marketing manager for content delivery network (CDN) services provider Limelight Networks, took on a daunting task. He tallied up every OTT offering he could find around the world.

His rough data: the number of OTT services available today — small, medium and large, from the most niche-oriented to the most popular — has passed the 3,000 mark. And that’s a low-end approximation, he said.

The rapid growth and sheer number of online video services mean big business for CDN companies: According to a June report from Cisco Systems, CDNs carried just over 50% of internet traffic worldwide in 2016. By 2021, Cisco forecasts, 71% of all internet traffic will run through CDNs, with those services handling a video load expected to account for more than 80% of all internet protocol traffic.

All of this growth has put the backbone of the internet under closer scrutiny. CDNs form a layer of the internet, offering a range of content delivery services: video streaming, downloads, caching and more.

The added video traffic has CDNs focused on the associated challenges: reducing latency, ensuring content security and keeping pace with every delivery standard used by content companies. Kraus and others in the CDN space discussed how they’re tackling today’s onslaught of entertainment services.

Latency, Latency, Latency

It’s a painful reality: if you’re watching a live event via an online service, you’ll be behind the over-the-air or satellite broadcast, even if it’s just by a few seconds.

A lot of that has to do with the video specs most widely used today for live, digital content delivery, according to Ted Middleton, chief product officer for Verizon Digital Media Services, which operates the Edgecast CDN service.

Take the popular HTTP Live Streaming (HLS) protocol as an example. “Traditional recommendations in the HLS spec typically induce an approximately 30-second time-behind-live [delay], based on the number of segments and segment durations required to generate a live playback manifest,” Middleton said. “Other formats have different recommendations, but nearly all of the modern HTTP-based, adaptive bit rate formats have some amount of latency induced in order to provide reliability and reduce buffering.”

It makes sense: Delivering live video content digitally must account for internet bandwidth and the number of people viewing at any one time. But that doesn’t make the end user any more patient.
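
Middleton’s roughly 30-second figure falls out of simple arithmetic: players typically wait for several full segments to appear in a live manifest before starting playback, on top of encode, packaging and delivery time. The sketch below illustrates that math; the segment duration, segment count and overhead values are illustrative assumptions, not figures from the HLS spec or from Verizon.

```python
# Rough back-of-the-envelope estimate of HLS time-behind-live.
# Segment duration, buffered-segment count and overhead values are
# illustrative assumptions, not numbers quoted in the article.

def hls_time_behind_live(segment_duration_s: float = 6.0,
                         segments_buffered: int = 3,
                         encode_package_s: float = 6.0,
                         network_overhead_s: float = 2.0) -> float:
    """Approximate delay between the live event and what the viewer sees."""
    # A player typically waits for several full segments to show up in the
    # live manifest before playback starts, so the minimum delay is
    # segments * segment length, plus the time to encode/package the newest
    # segment and push it through the CDN.
    return segments_buffered * segment_duration_s + encode_package_s + network_overhead_s

print(f"Estimated time behind live: ~{hls_time_behind_live():.0f} seconds")
# Estimated time behind live: ~26 seconds -- in the ballpark of the
# roughly 30-second delay Middleton describes.
```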

An early July study from live-streaming tech company Wowza Media Systems found that any delay of 15 seconds or longer for a live-streaming sports app was detrimental to viewers, with spoilers readily accessible on social media. Wowza’s study of live streaming apps in general found latency to be as low as nine seconds, but also as high as 101 seconds.

“[Live events] only happen once, and there’s no tolerance for error,” John Bishop, chief technology officer for CDN heavyweight Akamai, said. “Unlike the earlier days of on-demand streaming, when the occasional glitch was accepted, any hiccup in live delivery isn’t an inconvenience to the audience — it’s a disruption.”

What Akamai and other CDNs are doing to bring live digital video closer to truly “live” is twofold. First, they ingest content as close to its point of origin as possible (say, where the signal comes off a production truck or master control), so it can be quickly transcoded and delivered across the network. Second, they coordinate with client-side software (especially video apps, but also hardware) so players can communicate more easily with CDNs.

“This can help make sure the video is optimized for the viewer’s device capabilities, network type and conditions,” Bishop said.
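
A minimal sketch of the kind of client-side coordination Bishop alludes to is an adaptive bit rate chooser: the player measures its throughput and knows its screen size, then requests the highest rendition it can sustain. The rendition ladder and the 20% safety margin below are hypothetical, not Akamai’s actual player logic.

```python
# Minimal sketch of client-side adaptive bit rate (ABR) rendition selection.
# The rendition ladder and the 20% headroom are hypothetical values.

RENDITIONS = [  # (name, bit rate in kbps, vertical resolution)
    ("240p", 400, 240),
    ("480p", 1_200, 480),
    ("720p", 3_000, 720),
    ("1080p", 6_000, 1080),
    ("2160p", 16_000, 2160),
]

def pick_rendition(measured_kbps: float, screen_height: int) -> str:
    """Choose the highest rendition the measured throughput and screen can support."""
    usable_kbps = measured_kbps * 0.8  # keep ~20% headroom to avoid rebuffering
    best = RENDITIONS[0][0]
    for name, kbps, height in RENDITIONS:
        if kbps <= usable_kbps and height <= screen_height:
            best = name
    return best

print(pick_rendition(measured_kbps=8_000, screen_height=1080))  # -> 1080p
```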

Facilitating the former is part of the reason you’ll increasingly hear “CDN” and “edge” in the same sentence in the years to come: A CDN’s best bet for a seamless user experience is to deploy servers and nodes as widely as possible, shrinking the geographic distance between the network and the end user as much as it can.

“One of the critical aspects of providing a quality streaming experience for consumers is moving the content as close to the edge of the network as possible, [something] traditionally done with an edge CDN provider,” said Ed Haslam, chief marketing officer for Conviva, a digital video analytics firm whose roughly 200 clients (mostly in the OTT space) are largely reliant on CDN services.

“They’re doing that to decrease the latency when you hit play in the video player, get the content as close to the viewer as possible, to minimize the video start time, which is often critical for consumers,” Haslam said.
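
Why proximity translates into faster start times comes down to round trips: startup typically involves DNS, connection handshakes, a manifest fetch and the first media segments, so every millisecond of round-trip time is paid several times over. The sketch below illustrates the effect; the round-trip count and timings are illustrative assumptions, not Conviva measurements.

```python
# Rough illustration of how edge proximity affects video start time.
# The round-trip count and timings are illustrative assumptions.

def estimated_start_time_ms(rtt_ms: float,
                            round_trips: int = 8,
                            first_segments_ms: float = 500.0) -> float:
    """Approximate time from pressing play to the first frame."""
    # Startup costs several round trips (DNS, TCP/TLS handshakes, the
    # manifest fetch, the first segments), so each millisecond of RTT
    # is paid multiple times before video appears.
    return round_trips * rtt_ms + first_segments_ms

for label, rtt in [("nearby edge server", 15), ("distant origin", 120)]:
    print(f"{label}: ~{estimated_start_time_ms(rtt):.0f} ms to first frame")
# nearby edge server: ~620 ms to first frame
# distant origin: ~1460 ms to first frame
```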

Content Security

As CDNs are challenged with handling more content for worldwide audiences, they will need to store it in and move it to more locations. And security concerns among content owners are always at the forefront, CDN representatives agreed.

“As a shared service, the main challenge for us is that not all of our customers have the same content security needs, and many leverage different security solutions,” Jennifer Cleveland, VP of content sales for CDN company Level 3, said. “The challenge becomes making sure our system is flexible enough to work with each customer’s needs, without having to create a completely customized solution for each. [And] it can require extra time and diligence in the onboarding process.”

Limelight’s Kraus echoed those thoughts: CDNs now widely offer a “suite of protections for customers,” bundling in distributed denial-of-service (DDoS) and firewall protections that were previously sold separately. There’s also an advantage to having your CDN supply security services, he added.

“Because most CDNs are massive global networks, customers are protected without having to do much, because the CDNs can absorb things like DDoS attacks,” he said. “Customers won’t even know they’re being attacked.”

While movies and TV programs may be the most sought-after content, Akamai’s Bishop stressed, they are far from the only assets CDNs are being tasked with protecting.

“Security doesn’t apply solely to content,” he said. “Web security in general is of vital importance to any organization delivering premium content. Keeping sites and apps up and running, even if they’re under attack, is critical to any organization’s business and brand.”

4K, VR … and Bandwidth Needs

While 4K and virtual reality (VR) digital content delivery may feel like a slow burn so far, both applications will increasingly push ISPs and CDNs to figure out how to get high-bit-rate content through low-megabit-per-second pipelines.

Cisco estimates that 4K IP VOD content will account for 30% of global VOD traffic by 2021, while combined VR and augmented reality (AR) video traffic will grow 20-fold during the same span.

But while Netflix and other service providers recommend a connection of at least 25 Mbps to stream 4K content, Akamai’s first-quarter State of the Internet report found that the average download speed for U.S. internet users was less than 19 Mbps (though still up 22% year over year).

“[And] virtual reality makes it worse,” Limelight’s Kraus said. “Not only do you have to accommodate a 4K stream, you have to accommodate for the adjacent views when a person moves their head up, right, left or down. You have to take that 25 [Mbps] and multiply it by four. Right now, 4K VR is a dream, with the amount of bandwidth needed.”
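
Kraus’s arithmetic is straightforward when worked out against the figures above: four times the 25 Mbps recommended for a single 4K stream lands around 100 Mbps, several times the average U.S. connection Akamai measured. The short calculation below just restates those numbers.

```python
# Worked version of Kraus's arithmetic: a 4K VR stream that also carries
# the adjacent views needs roughly four times the 4K recommendation,
# which dwarfs the average U.S. connection Akamai measured.

MBPS_4K_RECOMMENDED = 25      # Netflix's recommended minimum for 4K streaming
VR_VIEW_MULTIPLIER = 4        # Kraus: multiply by four for the adjacent views
AVG_US_DOWNLOAD_MBPS = 19     # rounded from Akamai's Q1 State of the Internet report

vr_4k_mbps = MBPS_4K_RECOMMENDED * VR_VIEW_MULTIPLIER
print(f"4K VR needs roughly {vr_4k_mbps} Mbps, "
      f"about {vr_4k_mbps / AVG_US_DOWNLOAD_MBPS:.1f}x the average U.S. connection.")
# 4K VR needs roughly 100 Mbps, about 5.3x the average U.S. connection.
```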

Akamai’s Bishop struck a more optimistic tone: CDNs have faced similar challenges in the past, tasked with last-mile delivery of high-definition and higher-frame-rate video content over less-than-ideal pipelines.

“It’s not a matter of throwing more servers at the challenge,” he said. “It’s making the platform smarter and more nimble, being able to deliver the higher bit rates needed for these emerging formats with greater efficiency.”