Next Gen TV Dominates IEEE-BTS Gathering
ARLINGTON, VA.--Broadcast engineers from around the globe converged here for three days last week to exchange information about the industry and broadcast technology developments at the annual IEEE Broadcast Technology Society Fall Symposium.
In spite of the demands of the ongoing television spectrum repack, some 160 engineering consultants, equipment manufacturers, and broadcast group personnel took time away from their duties to join academics and others from as far away as Russia, Brazil and South Korea at the Oct. 9-11 conference.
Next-Gen TV was the hot topic this year, with nearly half of the presentations falling into this category and nine directly focused on ATSC 3.0’s potential, field testing, deployment, business models, and evaluation metrics. Presentations even included some “one step beyond” views into future television experiences: one described a methodology for transmitting information to provide sensory stimulation beyond sight and sound, and a lunchtime keynote presenter described taking virtual reality out to the ball game.
BRINGING BACK THE ACTION WITH VR
In her presentation, Uma Jayaram, engineering director for Intel Sports True VR division, described why her company decided to launch a virtual reality TV sports production element, and discussed some of the technical issues that had to be resolved to make it possible.
“The timeline for [introduction and acceptance] of new use cases is dramatically shortening,” Jayaram said. “The telephone took 75 years to reach the 50 million user mark; ‘Angry Birds’ took just over a month to reach this mark.
“Companies such as Intel realize that you cannot afford to join the party after it’s well underway and things are moving really fast," she continued. "So, you place some bets, you get into the ecosystem, you play with it, you try to move it, and essentially see what’s going on in a more involved manner.
“When you think of Intel, you ordinarily think of the more traditional segments—CPUs, graphics and so on, but some of the big bets we are making have to do with artificial intelligence, VR, hygiene, and automated driving,” she said, observing that all of these emerging technologies involve moving and processing massive amounts of data, an Intel specialty.
“These areas are moving so fast that you want to be in that ecosystem,” said Jayaram in explaining Intel’s decision to launch the sports business unit two years ago.
“On ‘game day’ we show up with our cameras, we set up alongside the networks and stream VR experiences in near real time,” she said. “We provide [feeds] to about 10 rights-holding broadcasters.”
In her presentation, Jayaram did flag some VR limitations, noting that perhaps the biggest centers on the cumbersome headgear that consumers must don.
“I can’t watch with a headset for more than a few hours,” she said.
As one solution, her unit is creating shorter duration or “snackable content” sports highlights packages. “We’re also working to provide a better user interface…something that would allow a person to do VR viewing while still interacting with friends,” she said.
She also described some technical considerations in VR that don’t really exist in ordinary television coverage.
“Time synchronization is very important when you have six separate cameras that all have to be synched together along with the audio," she said. "Seamless ‘stitching’ of the multiple views is also very important.”
Branding of the viewing apps for consumers is also something of a challenge as Intel's Sports unit is dealing with multiple broadcast entities operating in several countries. “In the U.S., NBC has the rights, so we have to have an NBC app with the 'NBC look and feel' and colors. We now have to put out about 33 of these apps [to accommodate the various broadcast entities and viewing devices].”
Jayaram also noted that a business model has to be established before widespread deployment of VR sports coverage can take place.
“We’re looking at subscription models,” she said. “In the end you have to make money. This is still being worked out.”
WHAT’S HAPPENING OUT IN THE [3.0] FIELD?
Status reports on ATSC 3.0 rollouts in three U.S. markets were part of the conference program, with WRAL-TV’s Transmitter Supervisor Matt Brandes describing activities at his company’s Raleigh-Durham, N.C. test facility. Fred Baumgartner, director of Next Gen TV implementation for Sinclair Broadcast Group, provided an update on the station group's Dallas-Ft. Worth ATSC 3.0 single-frequency network initiative, and Pearl TV’s Dave Folsom described goings-on in connection with the "Phoenix Model Market" project.
All offered lessons learned along with some dos-and-don'ts takeaways.
“If you’re a person blessed with the privilege of setting up a brand new ATSC 3.0 facility, please download and read the ATSC recommended practices on the Physical Layer, go play with your toys, and then come back and read the recommended practices again,” said Brandes. “Also, buying a new encoder might do more for your coverage than putting in two transmitters. If you can get three dB improvement [in encoding], this is equivalent to doubling your power.”
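Brandes' rule of thumb follows directly from the definition of the decibel: a power ratio in dB is 10·log10(P2/P1), so 3 dB works out to a ratio of about 2. A quick sketch of the arithmetic (the helper function name here is illustrative, not anything from the presentation):

```python
import math

def db_to_power_ratio(db: float) -> float:
    """Convert a decibel figure to a linear power ratio: 10^(dB/10)."""
    return 10 ** (db / 10)

# A 3 dB coding gain corresponds to roughly doubling effective power,
# which is why a better encoder can rival adding a second transmitter.
print(f"3 dB -> {db_to_power_ratio(3):.2f}x power")   # 3 dB -> 2.00x power
print(f"10 dB -> {db_to_power_ratio(10):.0f}x power")  # 10 dB -> 10x power
```

(Strictly, 3 dB is a ratio of about 1.995; "3 dB equals double" is the standard engineering approximation.)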
Speaking for Pearl TV, Folsom noted that early equipment implementations are still not complete and that software versioning is a big issue.
“Anytime someone makes a software upgrade, everything downstream quits working, including the receivers," he said.
Folsom added that the ability to interchange encoding, packaging and encapsulation units isn’t fully developed yet. “You should be able to connect up the unit and it should work, but it doesn’t," he said. "The standards are clear, [but] there’s a lot of misunderstanding among manufacturers about what needs to be there.”
Baumgartner observed that monitoring and control in connection with SFN transmitter sites is a potentially big issue. “We’re just starting to put our toes in that water and we can run [the Dallas-Ft. Worth installations] from our handheld devices, but at some point, it’s going to start to look like MediaFLO and then you’ve got 300 transmitters out there and you have to start thinking about trucks, spare parts, and support infrastructure," he said. "You have to figure out how to make it all run economically.”
‘HACKING’ NEXT-GENERATION TELEVISION
Although the ATSC 3.0 racehorse is barely out of the stable, due to its hybrid broadcast/internet delivery, concerns are already surfacing about vulnerability of signals to manipulation by pranksters or others with darker agendas. Broadcast cybersecurity expert Wayne Pecena examined this in his presentation, “Hacking ATSC 3.0.”
He noted that the broadcast community was not exactly a stranger to nefarious intercepts of program streams, citing the 1986 takeover of HBO’s satellite transponder by “Captain Midnight,” and the “Max Headroom” incident the following year in which the studio-to-transmitter feeds of Chicago stations WGN-TV and WTTW were breached and third-party content aired.
Pecena observed that those incidents had required considerable technical savvy and equipment costing many thousands of dollars. “Now someone with a tablet computer sitting in a coffee shop can wreak a lot more havoc,” he said.
He cited the potential risks to broadcasters of ATSC 3.0 stream hijacking, including dead air, loss of revenue, public embarrassment, data breaches, potential legal liability, loss of public trust, and impact on station resources.
Pecena urged station operators to gain a thorough understanding of their IP systems and think about ways in which they might be compromised, observing that “the Internet of Things really provides only minimal, if any, safeguards against hacks, with security often being an afterthought or a ‘neverthought.’”
He said that even with the security components inherent in ATSC 3.0 and a tightening of security at broadcast facilities, there still could be breaches occurring within the home environment.
“The consumer industry must adopt stronger IoT security features,” he said. “The weakest link determines the overall security of any system.”
During the gathering the society also recognized Merrill Weiss with its highest honor, the Jules Cohen award for excellence in broadcast engineering.
The BTS Symposium moves to Hartford, Conn. in 2019. Conference dates are Oct. 1-3.
For a comprehensive list of TV Technology’s ATSC 3.0 coverage, see our ATSC3 silo.
James E. O’Neal has more than 50 years of experience in the broadcast arena, serving for nearly 37 years as a television broadcast engineer and, following his retirement from that field in 2005, moving into journalism as technology editor for TV Technology for almost the next decade. He continues to provide content for this publication, as well as sister publication Radio World, and others. He authored the chapter on HF shortwave radio for the 11th Edition of the NAB Engineering Handbook, and serves as editor-in-chief of the IEEE’s Broadcast Technology publication, and as associate editor of the SMPTE Motion Imaging Journal. He is a SMPTE Life Fellow, and a Life Member of the IEEE and the SBE.