Streaming video
Streams and files are two key concepts inextricably intertwined with the technology we use to produce and deliver content. Files are defined by their start and end. By definition, a stream is “not bounded,” though in practice we use the terminology in ways that muddy that distinction. When a file is received, one may know its precise size from the wrapper (though not necessarily) and the length of its essence from the metadata inside. We don't think of baseband digital video as a stream, but it fits the classical definition: One can enter a SMPTE 292 HD-SDI stream at any frame boundary and continue to decode until the stream ends, yet in most cases one has no idea how long the stream will continue.
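To make the distinction concrete, here is a loose sketch, in Python, of the difference as software sees it. The frame size and file name are purely illustrative stand-ins (one 8-bit 4:2:2 HD frame, a hypothetical MXF file), not any real system's values:

    import os
    import sys

    FRAME_BYTES = 1920 * 1080 * 2    # one 8-bit 4:2:2 HD frame, for illustration

    # A file is bounded: its size, and so its frame count, is knowable
    # before we read a single byte.
    path = "program.mxf"             # hypothetical file name
    print(f"{path}: {os.path.getsize(path)} bytes, known before we start")

    # A stream is not: we can join at any frame boundary, but we learn
    # where the end is only by reaching it.
    frames = 0
    while (frame := sys.stdin.buffer.read(FRAME_BYTES)):
        frames += 1                  # decoding would happen here
    print(f"stream ended after {frames} frames; unknowable in advance")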
Similarly, one could enter an MPEG stream by finding the program map table (PMT) and following the chain to decode the essence, but one may still not know when the stream will end. Yet we can also “stream” a file, meaning a bounded content item is played out as a stream of content; in fact, in common usage, this specific case is what we more often mean by “stream.” It would be silly to argue we need to change the lexicon at this point (though I wish we could).
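For readers who want to see what “finding the PMT and following the chain” involves, here is a minimal sketch of a transport-stream walk. It assumes a single-program stream, 188-byte packets and tables that fit in one packet; a real demultiplexer handles far more:

    import sys

    PACKET_SIZE = 188

    def parse_packet(pkt):
        """Return (pid, payload_unit_start, payload) for one TS packet."""
        assert pkt[0] == 0x47, "lost sync"
        pusi = bool(pkt[1] & 0x40)
        pid = ((pkt[1] & 0x1F) << 8) | pkt[2]
        afc = (pkt[3] >> 4) & 0x03
        offset = 4
        if afc & 0x02:                           # adaptation field present
            offset += 1 + pkt[4]
        return pid, pusi, (pkt[offset:] if afc & 0x01 else b"")

    def section_of(payload):
        """Skip the pointer_field that precedes a new PSI section."""
        return payload[1 + payload[0]:]

    pmt_pid = None
    with open(sys.argv[1], "rb") as f:
        while (pkt := f.read(PACKET_SIZE)):
            if len(pkt) < PACKET_SIZE:
                break
            pid, pusi, payload = parse_packet(pkt)
            if not pusi or not payload:
                continue
            s = section_of(payload)
            if pid == 0x0000 and pmt_pid is None and s[0] == 0x00:
                # Program association table: program_number/PMT PID pairs.
                length = ((s[1] & 0x0F) << 8) | s[2]
                pos, end = 8, 3 + length - 4     # entries end before CRC_32
                while pos < end:
                    program = (s[pos] << 8) | s[pos + 1]
                    if program != 0:             # program 0 points at the NIT
                        pmt_pid = ((s[pos + 2] & 0x1F) << 8) | s[pos + 3]
                        print(f"PMT on PID 0x{pmt_pid:04X}")
                        break
                    pos += 4
            elif pid == pmt_pid and s[0] == 0x02:
                # Program map table: one entry per elementary stream.
                length = ((s[1] & 0x0F) << 8) | s[2]
                info_len = ((s[10] & 0x0F) << 8) | s[11]
                pos, end = 12 + info_len, 3 + length - 4
                while pos < end:
                    es_pid = ((s[pos + 1] & 0x1F) << 8) | s[pos + 2]
                    es_info = ((s[pos + 3] & 0x0F) << 8) | s[pos + 4]
                    print(f"  stream_type 0x{s[pos]:02X} on PID 0x{es_pid:04X}")
                    pos += 5 + es_info
                break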
My point in making this distinction is that we need to be careful how we use technical terms to describe what we are doing. In the broadcast realm, we create files that are later streamed, at least for the most part, so we more often deal with the details of streaming video as a matter of creating files suitable for streaming. To be sure, there are live events that we stream in the etymologically correct sense, but repurposing content into streamable files is, more generally, the technology question we face.
Formats
Charlie Jablonski, ex-NBC executive and SMPTE past president, has said that NBC never met a format it didn't like. As an industry, we appear perilously close to that paradigm when talking about streaming formats. That is in no small measure driven by the proliferation of consumer delivery platforms and technologies. Ultimately, broadcasters do not have control of the final delivery pipes to the home. Who would want to?! Managing a global distribution system for content associated with your station's website would require expertise outside that of broadcast engineers and could tie up financial resources better applied to content creation and management. As a result, everyone buys service from providers that are happy to take on that responsibility, and the profit that comes with it.
So we have to find ways to cost-effectively manage a proliferation of delivery formats. It would be nice if this were easily solvable. It is not, in part because it is not a stable ecosystem; new consumer delivery options pop up every year. Some operators on the Web choose to sidestep the problem by embedding a YouTube player and not hosting at all. But broadcasters want to deliver to mobile phones (of many types), ATSC-M/H, VOD files on cable, and hosted content sites like YouTube and Hulu, among many other options. That requires knowing the details of those delivery specifications so content can be efficiently created and pushed out with certainty that the consumer will have access that works the first time and every time. Consumers will quickly stop trying to watch your content after at most a few unsuccessful attempts.
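In practice, that often reduces to maintaining a table of per-platform encode profiles and fanning one master out to all of them. The sketch below assumes ffmpeg is on the PATH; the profile names, source file and bit rates are illustrative, not any platform's published spec:

    import subprocess

    # Illustrative profiles; real values come from each platform's delivery spec.
    PROFILES = {
        "mobile_360p": {"size": "640x360",  "vbit": "800k",  "abit": "64k"},
        "web_720p":    {"size": "1280x720", "vbit": "2500k", "abit": "128k"},
    }

    def transcode(source, name, p):
        subprocess.run([
            "ffmpeg", "-y", "-i", source,
            "-c:v", "libx264", "-b:v", p["vbit"], "-s", p["size"],
            "-c:a", "aac", "-b:a", p["abit"],
            "-movflags", "+faststart",   # index up front for progressive playback
            f"{name}.mp4",
        ], check=True)

    for name, profile in PROFILES.items():
        transcode("master.mxf", name, profile)   # hypothetical master file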
All of this is new knowledge for old engineers, though younger, and perhaps more computer-savvy, techies seem to take to it more quickly. I do think, however, that a simplistic view of content as just mathematical transformations of existing files is dangerous to our long-term progress as an industry. We are losing sight of the reason we do this, which I am still convinced is to supply compelling, enlightening and high-quality content to consumers. So I interpret our responsibility as technologists as a requirement that we understand what quality is and find the best tools possible to make the content accessible in the highest quality we can achieve.
Many blades, but are they sharp?
I will share a pet peeve of mine. Look at the cut sheets for most file transcode engines, and you will likely be struck by the focus on handling many formats, not on the intrinsic quality of the output. It's not hard to grasp why this is the marketing approach: We are faced with a mounting challenge created by a multiplicity of formats that must be supported, and it would be nice if a single tool could solve all problems, like a Swiss Army knife. Marketing that way gets our attention. In fact, it draws our attention away from the desire to do quality work, which might come at the expense of speed and simplicity of workflow. We are all pressed for time, and we are offered a daunting array of choices to build the transcode workflow we need to make delivery in this multistream world possible. If we can solve the problem easily with the Swiss Army knife, we might try it.
But the fact is that each of the transcode tools we find has strong and weak points in its arsenal. That, of course, leads to specialized tool choices for some stream delivery applications. In past eras of our technological life, we sometimes chose companies that offered specialized services to solve particularly thorny problems with proprietary hardware (and software) systems. For instance, at one time, transferring video to film could be done at moderate quality with hardware you bought at high cost, or it could be jobbed out to a company like Image Transform that provided high quality at moderate cost, and without capital investment.
Today, we are offered systems “in the cloud,” service offerings that will receive content and transform it to any output format. This has no capital cost, no ongoing labor for maintenance and should allow the provider to incrementally improve the output quality by spreading fixed costs across a wide array of customers. Even better, it allows them to pick the best tools, taking into account both quality and efficiency of operation and workflow. Where appropriate, a supplier can invest in custom development to fill a niche it perceives as not well-served by competitors. Such custom development is out of the question for most, though not all, broadcasters.
If we can refocus our energy on delivery of quality, instead of everyone looking at every possible tool for each possible transcode, we might find a new paradigm has taken shape in our industry. I have high hopes that the workflow and quality challenge will be solved by the inclusion of services and the implementation of SOA-based tools, like those envisioned by the AMWA and EBU in their Framework for Interoperable Media Services (FIMS) project, which has garnered a lot of attention lately. This engineer hopes they are successful at developing cost-effective and powerful tools that are broadly adopted.
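To give a flavor of what a service-oriented transform job might look like, here is a sketch of a REST-style request. The endpoint, field names and profile ID are hypothetical illustrations in the spirit of such services, not the published FIMS interfaces:

    import json
    import urllib.request

    job = {
        "jobType": "transform",
        "input":  {"uri": "file:///media/master.mxf"},                    # hypothetical source
        "output": {"uri": "file:///media/out/", "profile": "h264-720p"},  # hypothetical profile ID
        "notify": "http://station.example/jobs/callback",                 # hypothetical callback
    }

    req = urllib.request.Request(
        "http://transform.example/api/jobs",                              # hypothetical endpoint
        data=json.dumps(job).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())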
John Luff is a television technology consultant.
Send questions and comments to: john.luff@penton.com