Content distribution networks

The media and entertainment industry is in the midst of an enormous transition. Business models are constantly being evaluated, and new services are being offered. Large media conglomerates are implementing media 360 programs, in which content is made available on multiple distribution venues (terrestrial TV, satellite, broadband, mobile, etc.) and devices. IPTV initiatives are well underway, and the worldwide push toward broadband connectivity is progressing rapidly.

These new business initiatives have produced an explosion of digital content, file formats, content transformation and platform-specific packaging, along with an incessant need to deliver content to more places in less time.

File-based workflows and digital distribution

To address these new and continually changing business models, content creators and manufacturers are adopting file-based workflows. Content must be in the proper format, in the proper location, and with the proper essence and metadata so that it can be appropriately consumed, regardless of the listening or viewing device being employed. In practice, the media and entertainment industry is well underway in its transition to digital file-based acquisition, manipulation (editing, etc.) and distribution. An ever-increasing number of digital files are being generated, packaged and distributed, and much of this electronic distribution is occurring over the open Internet.

Because time is a huge factor in generating and distributing files, file transfer protocols and file acceleration methods are being examined and adopted. It is important to understand how file transfer methods operate and what functionality to look for when evaluating ways to move files within a digital media distribution strategy.

Moving files?

Digital files, especially large digital media files such as HD video, need to move in the most efficient way possible. WAN optimization technology can help, but file acceleration alone does not solve the problem. There are many other components that should be considered.

Files must be secure, the integrity of the data must be maintained, assets must be tracked and their delivery verified, and a variety of integration points with editing, transcoding and playout systems must be supported.

With all these moving parts to think about when developing a digital media distribution strategy, the following checklist should be considered before moving forward:

  • File acceleration techniques: There are two fundamental approaches to WAN acceleration. One technique is to minimize data via compression, and the other is to minimize round-trip delays caused by network latency. (See Figure 1 and Figure 2.) With compression, users will see improvements when network capacity is relatively small; for large pipes, however, network latency is the main issue. Further, digital media files are typically already compressed and do not benefit from additional data compression. There are generic data compression techniques and data-specific techniques. On the generic side, Huffman encoding is typically used for text files, and run-length encoding is used for data exhibiting consistent patterns. Network comparisons, or differential transfers, are another method that can reduce the amount of data sent, because the sender first checks what already exists on the target side.
  • Industry-standard client connectivity: By using standard methods of connecting client to client, client to server, and server to server, the benefits of commodity IT extend beyond the distribution system, because other systems and applications can use a standard, common interface.
  • Data confidentiality and data integrity: Data confidentiality refers to securing content so that it is not at risk of being pirated or “snooped” in transit. This differs from data integrity, which is the verification that the data sent is identical to the data received. The latter is especially important for corporate governance requirements.
  • Authentication of users: A properly implemented certificate authority provides a means of exchanging public/private keys to ensure that senders and receivers are bona fide members of the intended distribution network.
  • Access control (specific directories, firewall ports): Transmission Control Protocol (TCP) is the connection-oriented protocol built on top of Internet Protocol (IP). Each TCP stream is identified by a source and destination IP address/port pair, and firewalls filter IP traffic using port information, among other things. To handle FTP, a firewall must interpret the FTP protocol, determine which TCP ports are being used and dynamically alter its rules. Many organizations employ FTP servers to facilitate business-to-business transactions, but FTP has a number of issues — among them, the inability to resume an interrupted file transfer from where it stopped, and the difficulty of scaling the operation to large numbers of users. Low-end firewalls and filtering routers are not designed to implement adequate controls on FTP traffic, and in high-latency networks, the protocol's round trips cause transfer times to increase.
  • Automation of basic tasks, batch processing: Once files are being moved as efficiently as possible, it is necessary to interface and interoperate with the other applications that make up the desired workflow. Files often have to be modified in some way during the creative process; changes may involve both the essence and the metadata. File types often have to be changed, file names often have to be normalized, and individuals and groups need to be notified that new content has arrived and is available.
  • Centralized management vs. federated transfer model: There is a fundamental difference between the two, and the choice must be made before deciding which file transfer protocols to implement. In a centrally managed model, the precise amount of bandwidth for any file transfer can be set and deterministically adjusted on the fly without affecting other transfers, enabling business-based policy prioritization. Centrally managed transfers are also much easier to control, secure and audit for access logs and download patterns; in a federated model, each endpoint manages its own transfers, making that kind of global control harder to achieve.
  • Auditing access logs and download patterns: It may seem simple, but the ability to audit user access, control different levels of access and analyze download patterns is not always available. These features are strongly recommended because they create a controlled environment that can be monitored and tracked clearly.
  • Cross-platform support: There is also a requirement to support multiple platforms and operating systems. For example, it is common to run a content creation application on Macintosh OS X, submit that content to a compositing application running on Linux, and then submit the final content to a transcoding application running on Windows. This requires that the distribution system software support multiple operating systems and platforms.
  • End-to-end security: By implementing a secure system from source to target and creating a secure network, the payload of the data being sent is secured in transit. Additionally, media encryption can be introduced to encrypt the actual data bits. Through the use of a certificate authority, all data movement can be tracked as it occurs, and a “certified delivery” receipt can be generated and used as an electronic affidavit of content movement.
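
To make the compression half of the acceleration discussion concrete, here is a minimal run-length encoder in Python. It is an illustrative sketch only, not any vendor's implementation; it shows why highly repetitive data compresses well while already-compressed media files generally do not.

```python
def rle_encode(data: bytes) -> bytes:
    """Run-length encode as (count, byte) pairs; run counts are capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        run = 1
        while i + run < len(data) and data[i + run] == data[i] and run < 255:
            run += 1
        out += bytes([run, data[i]])
        i += run
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Invert rle_encode by expanding each (count, byte) pair."""
    out = bytearray()
    for count, value in zip(data[::2], data[1::2]):
        out += bytes([value]) * count
    return bytes(out)

# Repetitive data shrinks dramatically; data with no runs actually grows,
# which is why compressed media files gain nothing from a second pass.
raw = b"\x00" * 1000
assert rle_decode(rle_encode(raw)) == raw
assert len(rle_encode(raw)) < len(raw)
```

Data without consistent patterns would double in size under this scheme, which is the simplest demonstration of why media files that are already compressed see no benefit from generic compression in a WAN accelerator.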
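
The data-integrity check described in the checklist is typically a cryptographic digest computed independently on both ends of a transfer. A minimal sketch using Python's standard hashlib (the streaming chunk size here is an arbitrary choice):

```python
import hashlib

def file_digest(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large media files need not fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# The sender publishes the digest alongside the file; the receiver recomputes
# it after the transfer and compares. Any mismatch means the copy is corrupt.
```

Because the digest is recomputed from the received bytes, this verifies exactly the property the checklist names: that the data received is identical to the data sent.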
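
The inability of plain FTP to resume an interrupted transfer can be contrasted with a protocol that negotiates a byte offset before sending. The sketch below is a simplified illustration of that idea; the function names and the offset handshake are invented for this example.

```python
import os

def resume_offset(path: str) -> int:
    """Byte offset at which an interrupted transfer should resume (0 for a fresh file)."""
    return os.path.getsize(path) if os.path.exists(path) else 0

def append_chunk(path: str, offset: int, chunk: bytes) -> None:
    """Append a chunk at the agreed offset; refuse mismatched offsets
    so sender and receiver cannot silently drift out of sync."""
    if resume_offset(path) != offset:
        raise ValueError("offset mismatch; renegotiate before sending")
    with open(path, "ab") as f:
        f.write(chunk)
```

After an interruption, the receiver simply reports resume_offset() back to the sender, and the transfer continues from that byte rather than starting over, which matters enormously for multi-gigabyte media files.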
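
File-name normalization, one of the basic tasks worth automating, might look like the following sketch; the normalization rule and the drop-folder layout are assumptions made for illustration.

```python
import re
from pathlib import Path

def normalize_name(name: str) -> str:
    """Lowercase the name and collapse spaces and other odd characters
    to underscores (an illustrative rule, not an industry standard)."""
    cleaned = re.sub(r"[^a-z0-9._-]+", "_", name.lower())
    return cleaned.strip("_")

def normalize_inbox(inbox: Path) -> list[Path]:
    """Rename every file in a hypothetical drop folder and return the new paths,
    ready for a downstream step to notify interested users or groups."""
    renamed = []
    for src in inbox.iterdir():
        if src.is_file():
            dst = src.with_name(normalize_name(src.name))
            if dst != src:
                src.rename(dst)
            renamed.append(dst)
    return renamed
```

In a real workflow, a step like this would run automatically whenever a transfer completes, followed by notification and hand-off to transcoding or editing systems.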
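
A “certified delivery” receipt can be thought of as an authenticated digest binding a recipient to the exact bytes delivered. The sketch below uses a shared-key HMAC from Python's standard library for simplicity; a production system would use certificate-based signatures issued by the certificate authority described above rather than a shared key.

```python
import hashlib
import hmac

# Stand-in credential for illustration; real systems derive this from certificates.
SHARED_KEY = b"distribution-network-key"

def delivery_receipt(payload: bytes, recipient: str) -> str:
    """Sign (recipient, payload digest) so the sender can later prove
    exactly what was delivered, and to whom."""
    digest = hashlib.sha256(payload).hexdigest()
    message = f"{recipient}:{digest}".encode()
    return hmac.new(SHARED_KEY, message, hashlib.sha256).hexdigest()

def verify_receipt(payload: bytes, recipient: str, receipt: str) -> bool:
    """Recompute the receipt and compare in constant time."""
    return hmac.compare_digest(delivery_receipt(payload, recipient), receipt)
```

Because the receipt covers both the content digest and the recipient's identity, altering either one invalidates it, which is the property an electronic affidavit of content movement needs.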

Conclusion

To address the needs of the media and entertainment industry, a comprehensive study of an organization's ingest, manipulation and distribution requirements must first be undertaken. The above 10 points boil down to four pillars: automate, secure, manage and accelerate. By addressing every item in this checklist, files can be moved securely and more efficiently, reducing the costs associated with moving digital media, increasing control over the movement of files and enhancing collaborative content creation.

Tom Ohanian is chief strategy officer for Signiant.