Proponents of Ultra HD/4K are still waiting for the pixel-packed format to deliver the kind of “wow” factor that excites consumers to snap up new, shiny 4K televisions in droves.
Early on, the only wow factor delivered by 4K was the reaction to the price of early TVs themselves.
While 4K delivers more pixels—about four times the resolution of today’s HD—and while price points have come down, today’s 4K content isn’t demonstrably better than the best available 1080p HD and still leaves something to be desired.
Still, the entire video industry seems now to be in hot pursuit of “better” pixels. The secret sauce of this new and improved 4K recipe is something called High Dynamic Range (HDR) imaging.
Following the Ultra HD buzz that enveloped last year’s International CES, HDR is expected to be one of the hot topics at next month’s gadget-fest in Las Vegas, even if much of what will be shown on the floor is of the pre-standard, prototype variety.
Cable operators, programmers, equipment makers and software companies will be fixated on CES’ 4K news. Among major U.S. cable operators, just one—Comcast—has announced plans to launch a 4K offering before year-end. DirecTV, meanwhile, was the first U.S.-based multichannel video programming distributor to introduce a small, on-demand 4K streaming offering, one that will pave the way for limited live 4K programming sometime next year. Almost all other operators are expected to follow, and Amazon announced its own 4K service last week.
Part of the issue is that the 4K sets and programming now on the market are based on standards that haven’t kept up with the pace of technology. And that’s where HDR, whether it’s used to enhance today’s HDTV services or newer 4K services, will look to step in.
Seeing will be believing, of course, but HDR promises to vastly improve the spectrum and depth of colors offered on HD and 4K sets. The blacks will be blacker and the blues will be bluer. The picture will simply be better and brighter—presenting more than just a denser bundle of pixels all scrunched together on the screen.
In addition to making the pixels appear more luminous, the increased bit depth will make for smoother transitions and support a much larger range of color increments.
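The arithmetic behind that claim is straightforward. A rough sketch comparing today’s 8-bit video with the 10-bit depth associated with HDR (a generic comparison, not tied to any particular spec):

```python
# Why higher bit depth means smoother gradients: each extra bit per
# channel quadruples (2 bits -> 4x) the number of distinct color steps.
def levels_and_colors(bits_per_channel: int) -> tuple[int, int]:
    """Return (levels per channel, total RGB colors) for a given bit depth."""
    levels = 2 ** bits_per_channel   # distinct steps per color channel
    return levels, levels ** 3       # three channels: R, G, B

for bits in (8, 10):
    levels, colors = levels_and_colors(bits)
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")
```

Going from 256 to 1,024 steps per channel is what smooths out the visible banding in gradients such as skies and shadows.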
“HDR is one of the factors that will be important for making a large step into improving the experience,” said Daryl Malas, principal architect in the video-application technologies group at CableLabs, the cable industry’s R&D consortium.
Sean McCarthy, an engineering fellow at Arris, a maker of video set-tops and network equipment for cable operators, telcos and other types of service providers, added: “It boils down to better pixels and more contrast and more bits per pixel … [creating] a feeling of reality that we don’t have in television now. I think in the case of HDR, it will be pretty clear that it brings a realness to TV that is not available otherwise.”
And they warn CES show-goers not to be fooled by HDR demos they might see on the floor. There is still lots of work to be done, particularly when it comes to developing standards around HDR technology.
While Hollywood is already accustomed to creating content with higher bit depth and a broad palette of colors, post-production and distribution is where that work will be concentrated.
One area that will need work is the electro-optical transfer function (EOTF), which governs how the electrical video signal is converted back into light at the display; its camera-side counterpart, the opto-electronic transfer function, describes how captured light is turned into that signal in the first place. The industry would likely start with a standard developed by the Society of Motion Picture and Television Engineers (SMPTE), but the International Telecommunication Union is also mulling over options. Meanwhile, makers of Blu-ray Disc players are working on 4K standards that are expected to include aspects of HDR, and the Consumer Electronics Association and the Moving Picture Experts Group are working on HDR-related standards as well.
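One concrete output of that SMPTE work is the “PQ” (perceptual quantizer) EOTF standardized as ST 2084, which maps a normalized code value to absolute display luminance. A minimal sketch of the curve, using the constants from the published spec (real displays would also apply tone mapping on top of this):

```python
# Sketch of the SMPTE ST 2084 "PQ" EOTF: maps a normalized code value
# n in [0, 1] to absolute display luminance in cd/m^2 (nits).
# Constants come from the spec; tone mapping for less-capable
# displays is deliberately omitted.
M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(n: float) -> float:
    """Decode a PQ-encoded code value to luminance (0..10,000 nits)."""
    p = n ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000.0 * y

print(pq_eotf(0.0))  # 0.0 nits (black)
print(pq_eotf(1.0))  # 10000.0 nits (peak of the PQ range)
```

The 10,000-nit ceiling is the design limit of the curve, far beyond what consumer panels can actually emit, which is part of why display-side behavior still needs standardization.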
Other areas of focus for standards will include the encoding and decoding of HDR content, and how to make legacy TVs and other equipment backward-compatible with HDR-encoded content.
Encoding players are trying to keep pace by making their architectures flexible enough to accommodate HDR. Arris is trying to achieve this with the ME-7000, a multi-codec (MPEG-2, AVC, HEVC) and multi-format encoding/transcoding platform it launched at the IBC show in September.
But despite the emergence of stopgaps for carrying and creating the signal, there’s lots to be ironed out with HDR before the TV ecosystem can even begin to form anything resembling a consensus.
Meanwhile, Technicolor, Philips and Dolby (with its end-to-end “Dolby Vision” platform) are already stepping into the breach with proprietary HDR systems as the standards are shaken out. Dolby, which has already gotten Sharp, TCL and Vizio to sign up for Dolby Vision, claims that its approach can deliver signals that are 40 times brighter than today’s TVs, with up to 1,000 times greater contrast.
“There’s a lot of activity that’s happening that has not been standardized,” Malas of CableLabs acknowledged. “We will adapt as necessary down the road.”
“My message is, let’s get behind the standards,” Arris’ McCarthy said. “We’ve got to make sure the train gets all the way to the station.”
Given the state of that work today, the expectation is that several major TV makers, including Sony, Panasonic, LG Electronics and Samsung, will present HDR demos next month, and that the introduction of products won’t occur until the 2016 CES confab.
While 4K generally requires more bandwidth than traditional HD video, the good news for cable operators is that HDR won’t require much more bandwidth, at least relative to the premium required to handle pre-HDR 4K streaming.
To keep 4K bandwidth in check, operators are looking to High Efficiency Video Coding (HEVC)/H.265, a codec that is 50% more efficient than H.264/MPEG-4. If some forecasts are correct, operators will need to squeeze as much life out of their pipes as they can.
Malas estimated that adding HDR, with its 10-bit depth and wider color gamut, will require a “double-digit” percentage increase in bandwidth—well below the 300% increase cable operators had to worry about in the initial jump to 4K without new encoding schemes such as HEVC.
McCarthy put a finer number on it, estimating that HDR will require 20% to 30% in additional bandwidth headroom.
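Putting those figures together gives a rough sense of the math operators face. The 15 Mbps H.264 starting point below is a hypothetical illustration, not a number from either source; the 50% HEVC gain and 20-30% HDR overhead are the estimates quoted above:

```python
# Back-of-the-envelope bandwidth math from the figures quoted above.
# The 15 Mbps H.264 baseline for a 4K stream is a hypothetical
# starting point chosen for illustration.
H264_4K_MBPS = 15.0          # hypothetical 4K bitrate under H.264
HEVC_GAIN = 0.50             # HEVC is ~50% more efficient than H.264
HDR_OVERHEAD = (0.20, 0.30)  # McCarthy's 20-30% HDR headroom estimate

hevc_4k = H264_4K_MBPS * (1 - HEVC_GAIN)
low, high = (hevc_4k * (1 + o) for o in HDR_OVERHEAD)
print(f"4K HEVC: {hevc_4k:.1f} Mbps; with HDR: {low:.1f}-{high:.2f} Mbps")
```

Under those assumptions, HEVC more than pays for the HDR overhead: a 4K HDR stream still lands well under the bitrate of a 4K H.264 stream without HDR.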
4K: A BANDWIDTH-BUSTER?
The 4K streaming era is on the horizon, but will broadband networks buckle under the stress? That’s one question that will come under scrutiny following a forecast from ACG Research that sees 4K delivering a big broadband punch in the years ahead.
According to ACG, the average household bandwidth requirement (during the 7-11 p.m. daily peak period) will jump to 7 Mbps, from 2.5 Mbps in 2014, representing an anticipated compound annual growth rate of 31%.
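A quick sanity check on that growth rate (assuming a four-year 2014-to-2018 horizon, which is an inference here; ACG’s exact forecast window may differ):

```python
# Compound annual growth rate implied by growing from 2.5 Mbps to
# 7 Mbps over four years (the horizon is an assumption, not a figure
# stated by ACG).
def cagr(start: float, end: float, years: int) -> float:
    """Annualized growth rate that takes `start` to `end` in `years` years."""
    return (end / start) ** (1 / years) - 1

print(f"{cagr(2.5, 7.0, 4):.1%}")  # ~29.4%, close to the quoted 31%
```

The small gap between the computed ~29% and the quoted 31% suggests ACG’s endpoints or time window differ slightly from those round numbers.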
That total household fixed bandwidth average consists of broadband (Internet access) and a multichannel subscription-TV service (cable or telco TV; satellite-TV service was excluded from the study).
ACG said it believes that 4K video streams will cause the average to climb higher because they require “at least” 20 Mbps.
“4K TVs will capture a significant share of total bandwidth in 2018,” the firm said in its report, noting that the average household views more than 40 hours of traditional subscription TV services per week, versus just a couple of hours per week of Internet-delivered TV.
“The move from broadcast service, which is multicast across the metro network, to broadband video service, which is unicast, and the use of many more devices in each household will have a massive impact on the required bandwidth capacity of the metro network,” ACG added.