Video networks

Tapeless workflows have arrived. But are your files in the right place? Do you need to get them elsewhere? Reliably? Quickly? The list of potential requirements is very long.

We take for granted the reliability designed into technical broadcast systems: they work, they run in real time, and they fail rarely. Of course, that perceived reliability is the result of carefully established routes, relationships with telecom and satellite service suppliers, and multipath and failover design, all of it deeply embedded in the heart of current broadcast systems. Then file-based workflows came along, moving files via computers from site to site, to and from remote locations, uploading and downloading, usually over some form of FTP.

FTP was established and standardized through the Internet Engineering Task Force (IETF) and its predecessors, originally as RFC 114 in April 1971. The much-updated protocol today provides simple transfers, a wide choice of clients and many forms of added value: security, authorization, efficient transfer, embedded encryption and notification.
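
By way of orientation, the sketch below shows what a plain, unaccelerated FTP upload looks like in code, using Python's standard ftplib module; the host, credentials and file names are placeholders rather than anything from a real system.

```python
# Minimal plain-FTP upload using Python's standard ftplib.
# Host, credentials and file names are placeholders, not from a real system.
from ftplib import FTP

def upload(host: str, user: str, password: str, local_path: str, remote_name: str) -> None:
    with FTP(host) as ftp:                            # control connection on port 21
        ftp.login(user, password)                     # plain-text authentication
        with open(local_path, "rb") as f:
            ftp.storbinary(f"STOR {remote_name}", f)  # binary-mode transfer

# upload("ftp.example.com", "anonymous", "guest@", "clip.mxf", "clip.mxf")
```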

On UNIX-like systems, FTP service is provided by a daemon. There are thousands of variants, some tied to particular UNIX-like systems, some providing additional functions. Most, but not all, interoperate via the standardized protocol. Private versions abound, affording utility and even security through obscurity, though they require root access at both ends for installation.

Today, there are many forms of IP-based transfer: FTP, managed FTP, transfer over HTTP, direct link technologies with security such as Point-to-Point Tunneling Protocol (PPTP), and more recently peer-to-peer (P2P) technologies, such as BitTorrent and Skype. (Skype has efficient file transfer built in and good firewall traversal.)

So what is happening in the file transfer market that is of interest to broadcasters? The answer is a great deal. Software technologies, services and DIY options exist, using cheap or open-source forms of accelerated FTP. Several highly developed products and integrated services now lift that base capability into a delivery system that achieves the levels of reliability broadcasters require.

Myths about file delivery

First, let us nail down a few myths about file delivery systems:

  • Myth: FTP is not secure and loses packets. The reality: It all depends on what type of FTP you use.
  • Myth: User Datagram Protocol (UDP) is faster than FTP. The reality: Sometimes it is, but it has other issues.
  • Myth: FTP is used to load content to video servers. The reality: The FTP implementations used to load content onto video servers are often private variants.
  • Myth: Video networks offer fast FTP services but don't integrate easily. The reality: That's a big myth; many providers offer APIs to their services today.
  • Myth: The software only works on Windows and is proprietary. The reality: Most vendors offer software on several platforms.

If you already encrypt your content, can you continue to do so? Yes: transferred files are agnostic to their content; they are just bits. However, encryption applied in or before transit affects the efficiency of the transfer, so reliable transfer with integrated management of the encryption overhead is of high value.
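
The overhead is easy to demonstrate: data encrypted before transfer looks statistically random, so the compression stage inside an accelerated transfer gains nothing from it. The short sketch below illustrates this with Python's zlib, using random bytes as a stand-in for ciphertext; the payloads are purely illustrative.

```python
# Why encrypting before transfer hurts an accelerator's compression stage:
# ciphertext is statistically random and cannot be shrunk, while the same
# cleartext compresses well. Payloads are illustrative only.
import os
import zlib

cleartext = b"slug: evening-bulletin rundown item 0042 " * 4096   # repetitive, text-like
ciphertext = os.urandom(len(cleartext))                           # stand-in for pre-encrypted data

for label, payload in (("cleartext", cleartext), ("pre-encrypted", ciphertext)):
    compressed = zlib.compress(payload, level=6)
    print(f"{label:14s} {len(payload):8d} -> {len(compressed):8d} bytes "
          f"({len(compressed) / len(payload):.0%} of original)")
```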

File transfer acceleration

A need to back up large amounts of corporate data over a variety of network links has emerged. Typically, a large organization, such as the networking equipment company Cisco, has a high-value core of its own software in constant development, producing close to 1TB of changing data daily. Cisco's well-known router code is held and backed up on multiple sites, transferred frequently and securely by a Signiant system originally created by Nortel. The problem addressed was simple to state: allow Cisco to reliably transfer its internal files across multiple sites, building in high efficiency, manageability and security, with tolerance of poor-quality networks.

To do this, Nortel turned to the team that had developed its own IP stack — the same stack that runs a large part of many international telecom networks today. This is serious, hard stuff and not the domain of small software companies. The requirements began in the corporate WAN backup area, attracted high investment and gained maturity. This is good news for broadcasters as these systems are now tuned for large file (video) delivery.

A needs analysis showed that files must be managed while they are being transferred, and that a job ticket could be created to give control over the in-transfer session. Imagine initiating a terabyte transfer from home, or even at your station. It's likely you'd need some help!

Thus an important concept is introduced — initiating a transfer and then managing it all the way through to a final destination, which may be a video server.
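
As a rough illustration of the idea, a job ticket can be thought of as a small record that travels with the transfer and is updated as it progresses. The sketch below gives one hypothetical shape for such a ticket in Python; the field names and statuses are assumptions, not any vendor's actual schema.

```python
# A hypothetical "job ticket" for a managed transfer, as described above.
# Field names and statuses are illustrative assumptions, not any vendor's schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class TransferTicket:
    ticket_id: str
    source_uri: str                        # e.g. an export share at the originating site
    destination_uri: str                   # e.g. an ingest folder watched by a video server
    size_bytes: int
    priority: str = "normal"               # "normal" or "urgent"
    encrypt_in_transit: bool = True
    max_rate_mbps: Optional[float] = None  # None = let the management server decide
    status: str = "queued"                 # queued -> transferring -> delivered / failed
    created_utc: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

ticket = TransferTicket(
    ticket_id="T-0001",
    source_uri="smb://bureau-01/exports/story.mxf",
    destination_uri="ftp://playout-ingest/incoming/story.mxf",
    size_bytes=12_884_901_888,             # a 12GB programme file
)
```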

Today in broadcasting, there are several companies offering services and technology that are built on these key values of reliability, manageability, security and acceleration, and in some cases with integrated workflows.

Technically, most of the acceleration is achieved through a combination of repacketization (adjusting the delivery datagrams to suit the latency of the network connection in use) and moving the transport from high-overhead FTP over TCP to lower-overhead UDP.
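
The arithmetic behind the latency point is simple enough to sketch. A single TCP-based FTP session cannot exceed its window size divided by the round-trip time, whereas a UDP-based sender can size its in-flight data to the bandwidth-delay product and keep the link full. The link figures below are illustrative assumptions.

```python
# The latency arithmetic behind acceleration, with illustrative link figures.
# A single TCP stream is capped at window / round-trip time; a UDP-based sender
# instead keeps roughly one bandwidth-delay product of data in flight.

def tcp_ceiling_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Maximum throughput a fixed TCP window allows on a link with this RTT."""
    return (window_bytes * 8) / (rtt_ms / 1000.0) / 1e6

def bdp_bytes(link_mbps: float, rtt_ms: float) -> int:
    """Bandwidth-delay product: bytes that must be in flight to fill the link."""
    return int(link_mbps * 1e6 / 8 * (rtt_ms / 1000.0))

rtt = 120.0                                              # transatlantic-style round trip, ms
print(f"{tcp_ceiling_mbps(64 * 1024, rtt):.1f} Mb/s")    # ~4.4 Mb/s from a 64KB window
print(f"{bdp_bytes(155, rtt) / 1e6:.1f} MB in flight")   # ~2.3 MB needed to fill 155 Mb/s
```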

Additionally, most of the software players have their own secret sauce: patented code that combines the compression algorithms with the packetization algorithms, providing a form of load shaping that moves data faster. The math behind this is complex but is partly aided by the fact that cryptography and packetization use similar mathematical approaches. In certain circumstances, they can be combined to provide an efficient transport.
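
A simplified view of the compress-then-packetize idea is sketched below: each chunk of the file is compressed and wrapped with a sequence number so it can travel as an independent datagram and be reassembled at the far end. The chunk size and header layout are illustrative assumptions, not any vendor's wire format.

```python
# Sketch of compress-then-packetize: compress each chunk, prepend a small header
# with a sequence number, and reassemble in order at the receiver.
# Chunk size and header layout are illustrative assumptions only.
import struct
import zlib

CHUNK = 8192          # payload bytes per datagram before compression

def packetize(data: bytes):
    """Yield (header + compressed payload) datagrams for 'data'."""
    for seq, offset in enumerate(range(0, len(data), CHUNK)):
        payload = zlib.compress(data[offset:offset + CHUNK])
        header = struct.pack("!IH", seq, len(payload))   # sequence number, payload length
        yield header + payload

def reassemble(datagrams) -> bytes:
    """Inverse of packetize(); assumes every datagram arrived (no loss handling)."""
    parts = {}
    for dgram in datagrams:
        seq, length = struct.unpack("!IH", dgram[:6])
        parts[seq] = zlib.decompress(dgram[6:6 + length])
    return b"".join(parts[i] for i in sorted(parts))

original = b"interview transcript, take 3 " * 2000
assert reassemble(list(packetize(original))) == original
```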

Management is then the key, and this leads us to how such services are integrated, either through Web services over Simple Object Access Protocol (SOAP) or through a deeper API. This capability, which most providers offer to some extent, is central to efficient transport integration and to success in file-based transfer. In evaluating any such system, consider how its services integrate into your workflows. Deep APIs are attractive for complex integration, and most vendors also offer a simplified product based on the core enterprise product with a straightforward start-up, bringing these systems into business use in short order.
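
As a sketch of what such integration can look like from station automation, the example below posts a job ticket to a hypothetical management-server endpoint over HTTP. The endpoint, paths and field names are assumptions; a real integration would follow the vendor's SOAP or REST interface, but the shape of the call is similar.

```python
# Driving a transfer service from station automation: POST a job ticket to a
# hypothetical management server. Endpoint, paths and field names are assumptions.
import json
import urllib.request

def submit_transfer(management_server: str, ticket: dict) -> dict:
    """Submit a job ticket and return the management server's reply."""
    req = urllib.request.Request(
        f"https://{management_server}/api/transfers",
        data=json.dumps(ticket).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# reply = submit_transfer("transfer-mgmt.example.net",
#                         {"source": "smb://bureau-01/exports/story.mxf",
#                          "destination": "node://playout/ingest",
#                          "priority": "urgent"})
```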

Urgent delivery

Many of these software products originate from the complex world of telecom software and were developed originally for UNIX platforms. (UNIX itself was born at Bell Labs, the research arm of a telephone company.) Today, companies offer fully compiled versions for the major UNIX-like systems (HP-UX, IBM AIX and Sun Solaris) and for many Linux distributions, as well as a full compile for Windows servers. Core UNIX-like systems offer the advantage of efficiency right at the heart of the IP stack.

Some systems operate with node servers, which can act as industrial P2P nodes, affording internode transfer as an additional function, together with a management server, usually doubled up for reliability. Thus a server is established at each major site and can send, receive or initiate transfers, with local connectivity through a LAN. The management server communicates with the remote nodes and, by using network probes, adjusts each transfer as required to be the most efficient, the least costly or the fastest.
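
A toy version of that decision is sketched below: given probe measurements for the candidate paths to a destination, the management server picks the fastest usable route, or the cheapest if cost matters more. The node names, probe fields and scoring rule are illustrative assumptions.

```python
# Picking a route from probe results, trading speed against cost.
# Node names, probe fields and the scoring rule are illustrative assumptions.
def pick_route(probes: list[dict], prefer: str = "fastest") -> dict:
    """Choose the best route from probe measurements for one destination."""
    usable = [p for p in probes if p["loss_pct"] < 5.0]      # drop badly degraded paths
    if prefer == "cheapest":
        return min(usable, key=lambda p: p["cost_per_gb"])
    return max(usable, key=lambda p: p["throughput_mbps"])   # default: fastest

probes = [
    {"via": "node-london", "throughput_mbps": 310.0, "loss_pct": 0.1, "cost_per_gb": 0.09},
    {"via": "node-public", "throughput_mbps": 85.0,  "loss_pct": 1.2, "cost_per_gb": 0.01},
    {"via": "node-satcom", "throughput_mbps": 40.0,  "loss_pct": 6.0, "cost_per_gb": 0.40},
]
print(pick_route(probes)["via"])                 # node-london (fastest)
print(pick_route(probes, "cheapest")["via"])     # node-public (least cost)
```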

These servers are scaled to the duty required and typically have strong connections to nearby storage. The file transfers usually run at less than real time, though they can exceed real time depending on the nature of the files being moved and the compression and encryption in use. The better systems offer operational control over the trade-off between encryption and the cost of network connections, affording a valuable capability to cost-control transfers and set up routes that are business-effective.

An example network in full deployment is at NBC Universal (NBCU), where multiple nodes (50 at the last count) on multiple sites provide all file-based transfers within the company, including video files, scripts, corporate documents and multimedia. NBCU effectively uses its gigabit connectivity across its major sites and can integrate content other than video and audio files with ease.

The real value of all of these systems is in delivering your content reliably and securely — with proof of delivery — and at high speed, even over poor-quality networks.

So what happens when a news story already in transit needs to be delivered more urgently? In a simple FTP system, there is no way to speed the process. With today's managed systems, however, it can be done, as can rerouting to another site mid-transfer. From the management server's real-time overview, an operator can intercede partway through a particular transfer and command a different route, a different transfer rate or even a new target site.
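
Continuing the hypothetical API sketched earlier, interceding in an in-flight transfer might look like the example below, which asks the management server to raise the rate or redirect the job to a new target site. The endpoint and field names remain assumptions.

```python
# Interceding in a transfer that is already running: ask the (hypothetical)
# management server for a new rate or a new destination. Names are assumptions.
import json
import urllib.request

def amend_transfer(management_server: str, ticket_id: str, **changes) -> dict:
    """Amend an in-flight transfer, e.g. a new maximum rate or a new destination."""
    req = urllib.request.Request(
        f"https://{management_server}/api/transfers/{ticket_id}",
        data=json.dumps(changes).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PATCH",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# amend_transfer("transfer-mgmt.example.net", "T-0001",
#                max_rate_mbps=400, destination="node://backup-playout/ingest")
```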

Another scenario is the journalist in the field, operating across the public Internet. With integrated automation, the newsroom can be alerted to inbound content, and the state of the transfer can be monitored, even accelerated.

Publicly generated material and user-generated content present a new set of problems to the news desk. Lightweight forms of the software can be supplied over the public Internet through standard Web browsers that initiate and manage such transfers, aiding the reliability of submission. Authority and job tickets can be issued over the telephone. Thereafter, the inbound content is managed by the station and routed directly to the right desk, taking advantage of deals on bandwidth costs.

Conclusion

It is time to review such software carefully (not to be confused with simple point-to-point file acceleration) and to remember that encryption added afterward, rather than built into the software design, usually slows things down. Costs can be controlled through efficient use of IP-based connectivity, and the load can be partitioned and scheduled to business advantage. Broadcasters can take advantage of multimillion-dollar developments in telco and corporate WAN software to add file-based transfers at reasonable cost.

Glenn Hall is a senior associate at the Bakewell House Consultancy.