A 21st Century Blueprint

Stations and networks create a new infrastructure to meet the demands of file-based distribution

Ten years ago, digital technology grabbed a place in the market with DVDs and HDTV. Today, it is transforming the relationships among content owner, content provider, advertiser and consumer.

“It was 50 years ago that Warner Bros. realized that TV was here to stay and began shooting TV programs on film,” says Ed Hobson, VP of National TeleConsultants and president of the Society of Motion Picture and Television Engineers. “Now it’s up to the TV industry to do the same and figure out how to work with portable devices, TiVo and suppliers of Internet video.”

The approach a station or facility takes toward its plant design goes a long way toward addressing those new distribution methods, says Hobson, whose company designed ESPN’s digital center in Bristol, Conn., as well as facilities for MTV Networks and A&E Networks. The control room of today relies on information technology (IT) instead of traditional playback devices like videotape recorders. Even videotape itself is so 20th century.

From Frames To Files

“The industry is moving from a frame-based model where pictures are sent one frame at a time to one where the program material is stored as a file,” Hobson says. Once program material is in file form, it can be moved into and around a facility in a number of ways. Reporters in the field can transmit stories via broadband connections or even wireless Internet (WiFi). And once content is in the facility, it can be moved over a variety of pipes, from Fibre Channel to Gigabit Ethernet or even 10 Gigabit Ethernet.
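
To make the difference between those pipes concrete, here is a minimal back-of-the-envelope sketch in Python, assuming a one-hour program encoded at 25 Mb/s and roughly 70% effective link throughput (both figures are assumptions, not from the article):

```python
# Rough transfer times for moving a one-hour program as a file.
# Assumptions (not from the article): 25 Mb/s encoding rate,
# ~70% effective throughput after protocol overhead.

PROGRAM_SECONDS = 3600
BITRATE_MBPS = 25      # assumed encoding rate
EFFICIENCY = 0.70      # assumed usable fraction of raw link speed

file_megabits = PROGRAM_SECONDS * BITRATE_MBPS

links_mbps = {
    "Fibre Channel (2 Gb/s)": 2_000,
    "Gigabit Ethernet": 1_000,
    "10 Gigabit Ethernet": 10_000,
}

for name, raw in links_mbps.items():
    minutes = file_megabits / (raw * EFFICIENCY) / 60
    print(f"{name}: ~{minutes:.1f} minutes")
```

The point of the exercise: on a 10 Gigabit Ethernet pipe, the same hour of programming moves in well under a minute, which is what makes faster-than-real-time, file-based workflows possible.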

Getting those pipes into a facility is one of the most important ways a company can prepare for the future. Traditionally, TV facilities have relied on cables connecting videotape decks and other gear to switchers that then route the signal between devices.

The move to file storage and transfer requires a change. For any new signal-routing infrastructure, Hobson recommends a mix of traditional video cabling and data pipes, with the latter making up at least 50% of a new facility’s routing infrastructure.

Rob Hunter, ESPN’s senior director of systems engineering and media technology, keeps the sports channel’s 120,000-square-foot plant humming. As an early adopter, he has advice for other stations and cable networks that will soon build out with digital: Wider pipes are better. A new facility needs cabling with enough bandwidth to meet future demands.

Digital In Stages

“Everybody should go to fiber or 10 Gigabit Ethernet as a core backbone,” Hunter says. “If you’re going to be in the HD business someday, you’ll be moving around 100-megabit streams of data, and you’ll run into saturation issues if you only have 1-gigabit links.” That, of course, will require more capital: Today the cost of a 10-gigabit plant is about three times that of links with less capacity.
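
Hunter’s saturation warning is simple arithmetic. A minimal sketch, taking his 100-megabit HD stream figure at face value and assuming roughly 70% usable link capacity (the overhead figure is an assumption):

```python
# How many concurrent 100 Mb/s HD streams fit on a link?
STREAM_MBPS = 100    # Hunter's per-stream HD figure
EFFICIENCY = 0.70    # assumed usable fraction after overhead

for link_name, link_mbps in [("1 Gigabit Ethernet", 1_000),
                             ("10 Gigabit Ethernet", 10_000)]:
    streams = int(link_mbps * EFFICIENCY // STREAM_MBPS)
    print(f"{link_name}: ~{streams} concurrent streams")
```

A single gigabit link tops out at a handful of simultaneous HD streams; the wider backbone buys an order of magnitude of headroom.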

But once those cables are pulled, it becomes much easier for a facility to move from a videotape-based environment to one that relies heavily on video servers and computer-based ingest, storage, editing and playout.

Hunter says expectations should be tempered: Tapeless doesn’t mean all tape disappears. “We still have a large amount of tape-based material,” he says, estimating that nearly 150 hours a day of incoming material are recorded on servers. “We’re still a hybrid analog/digital facility. The transition doesn’t need to be abrupt.”
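
That 150-hours-a-day ingest figure translates directly into storage. A minimal sketch, assuming a 50 Mb/s ingest bitrate (the bitrate is an assumption; the article gives only the hours):

```python
# Daily storage consumed by server-based ingest.
HOURS_PER_DAY = 150    # from the article
BITRATE_MBPS = 50      # assumed ingest encoding rate

megabits = HOURS_PER_DAY * 3600 * BITRATE_MBPS
terabytes = megabits / 8 / 1_000_000   # Mb -> MB -> TB (decimal)
print(f"~{terabytes:.2f} TB of new material per day")
```

At that rate, the archive grows by several terabytes a day, which is why storage planning is inseparable from the cabling decision.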

While ESPN is in transition mode, going digital has helped it address multiplatform issues. Files can be easily copied, making it easier to repurpose content for video-on-demand, Internet streaming and mobile-phone applications simultaneously.
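
The repurposing step is essentially a fan-out: one master file feeding several transcode jobs. A minimal sketch, with hypothetical output profiles (the filename, sizes and bitrates are illustrative, not ESPN’s actual settings), using standard ffmpeg options:

```python
# Fan one master file out to several distribution profiles.
# The profiles below are hypothetical, not ESPN's real targets.
MASTER = "program_master.mxf"   # hypothetical source file

PROFILES = {
    "vod":    {"size": "720x480", "bitrate": "2500k"},
    "web":    {"size": "640x360", "bitrate": "800k"},
    "mobile": {"size": "320x240", "bitrate": "300k"},
}

for name, p in PROFILES.items():
    cmd = (f"ffmpeg -i {MASTER} -c:v libx264 "
           f"-s {p['size']} -b:v {p['bitrate']} {name}.mp4")
    print(cmd)   # in practice these would go to a transcode queue
```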

But ESPN operates in a fiscal realm that TV stations can only dream about. So for cost efficiency, some broadcasters, such as Clear Channel, are building digital transmission systems that serve several stations.

Clear Channel’s facility in Tulsa, Okla., is now the master-control locale for KVOS Bellingham, Wash., and two other stations. A fourth, KFTY Santa Rosa, Calif., will hook up in February. “We’ll be able to get KFTY on-air for about $350,000,” says Mike DeClue, Clear Channel TV VP and director of engineering.

All programming, commercial playback and transmitter monitoring are handled in Tulsa; the stations send their content there via tape or streaming.

The Clear Channel operation center is based on the IT/IP model. It’s filled with IBM, Gateway and HP servers and storage systems that cost less than $5,000 each and hold more than 100 hours of content apiece.
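
Those capacity figures are easy to sanity-check. A minimal sketch, assuming standard-definition material at 8 Mb/s (the bitrate is an assumption; the article states only the 100-hour and sub-$5,000 numbers):

```python
# Disk needed to hold 100 hours of content on one server.
HOURS = 100          # from the article
BITRATE_MBPS = 8     # assumed SD MPEG-2 rate

gigabytes = HOURS * 3600 * BITRATE_MBPS / 8 / 1_000
print(f"100 hours at {BITRATE_MBPS} Mb/s is ~{gigabytes:.0f} GB")
```

Roughly 360 GB: comfortably within the disk budget of a commodity server, which is what makes the sub-$5,000 price point plausible.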

Cost-effective technologies like those are increasingly important as broadcasters move into digital multicasting. “When broadcasters talk about adding incremental channels, it will cost us between $50,000 and $75,000 to get on-air,” says DeClue, “and that’s a channel that will be programmed and interesting, not another weather channel.”

Perhaps surprisingly, moving to digital creates some decidedly low-tech problems. Like air conditioning. As facilities get packed with equipment, ESPN’s Hunter says, air-conditioning costs can quadruple.
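
The cooling load follows directly from the equipment’s power draw, since virtually every watt a server consumes ends up as heat in the room. A minimal sketch, assuming a hypothetical 200 kW equipment load (the load figure is an assumption, not an ESPN number):

```python
# Convert IT equipment power draw into a cooling requirement.
IT_LOAD_KW = 200    # hypothetical equipment load

btu_per_hour = IT_LOAD_KW * 1_000 * 3.412   # 1 W ~= 3.412 BTU/h
cooling_tons = btu_per_hour / 12_000        # 1 ton = 12,000 BTU/h
print(f"{IT_LOAD_KW} kW of gear needs ~{cooling_tons:.0f} tons of cooling")
```

Double the gear, double the heat: the cost curve Hunter describes is built into the physics.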

And that introduces yet another problem: noise. “If you triple the amount of air conditioning,” he explains, “you triple the amount of noise, and managing that can be tricky.” The noise problem will be even bigger when ESPN builds another digital facility, this time in Los Angeles. The building will be smaller, which ought to intensify the air-conditioning drone. Says Hunter, “We’re still figuring out how we’ll handle that issue with limited space.”

“Workflow is What Matters”

Air-conditioning and power requirements can seem trivial compared with the changes in job duties, business models and workflows that follow. But they underscore why Hunter says it’s best for a facility to take its time in the transition and gather plenty of input.

“Workflow is what matters, not the technology,” he says. “We invited our production and operations people in on day one and asked them to imagine what it is they wanted the system to do and how they wanted it to work. The buy-in was unbelievable.”
