Evaluating Professional Online M&E Platforms Part I

LUGANO, SWITZERLAND—The retooling of the entertainment industry’s production technology through digital image capture means that workflows from acquisition through distribution for television and film are now exclusively file-centric. 

The emergence of a mature digital supply chain represents a tremendous growth opportunity for content owners. It is creating new markets for content and new life for legacy assets. Much of this content will be played back exclusively on phones, tablets and other devices owned by millennial consumers (among others) hungry for engaging content.

As business relationships become more liquid and new market models emerge, content suppliers must execute faster, more flexibly, with greater control over cost—and with the confidence that their digital assets are being properly stored, managed and protected. 

Emerging digital asset management and IP delivery solutions will provide media companies with better control over their assets while enhancing security and reducing costs. New on-demand, software-based media processing systems, supporting established digital cinematography cameras, nonlinear editors and VFX software are now appearing on the market.

These systems will provide alternatives for clients looking beyond traditional facilities or consumer-style standalone tools for file transfers, delivery of marketing media and other functions.

For industry professionals making a rational assessment of what a cloud-based system needs in order to support the principal tasks of popular digital workflows for top-notch entertainment content, the following sections touch on the most important considerations. This guide is by no means exhaustive; there is more to be said about each focal point. But it’s a good place to begin.


Does the end-user really need a platform of this type or is the system currently in place for handling content from inception through distribution sufficient for the workload and/or the size of the archive?

That’s really the first question that needs to be asked and answered. Answering it usually means a close, detailed inspection of how materials are currently handled (and for what purpose) and whether that process could stand some updating, security enhancement, streamlining or cost containment.

If email attachments, Dropbox, videotape, hard drives and FedEx constitute the framework for moving around a company’s intellectual property, there is definite room for improvement. Beyond current practices, the size of the company’s library and the frequency, type and destination of the deliverables are important concerns.

If it’s a production company, either creating content or acquiring it from third parties, it’s worth looking at how production and post-production elements—from dailies through interim cuts—are trafficked among the stakeholders. If it’s a distribution company or sales agency, the question becomes, “Is the primary focus on promotion, marketing and sales?”


The taxonomy of content for feature films, documentaries and episodic television series is quite specific to the entertainment industry. For those working in the business and tasked with editorial and distribution, there’s a particular way of organizing and delivering shows according to technical “bibles”—and an expectation that an assortment of versions and formats will be created for every unique title. It’s standard operating procedure for those in the know, and a complete mystery to everyone else.

In other industries, although there are digital asset management systems in place that do a great job of ingesting and cataloging multiple types of content, they are often “procrustean” for M&E (i.e. you have to force your content into their scheme by whatever means necessary, whether you like it or not). Finding a platform that follows a broadcast hierarchy should be the goal.


Using an M&E-compatible platform is not without its operational intricacies. That’s to be expected in a system with the depth to thoroughly manage a large volume of complex content. However, once mastered, the system should have a logical design and be reasonably intuitive in the execution of key tasks.

This is where heuristics (problem-solving aids built into the software) are valuable. These include safeguards that catch and prevent common keystroke errors, coupled with some form of contextual online help for the end user.

The application should, of course, be browser-based and browser-agnostic, making it accessible from any connected location on any operating system.


Search is the backbone of any digital asset management system. It must be an engine that can access the essential descriptive fields, filter content and include at least one customizable function that lets the end user define a unique category. The idea is to accelerate the search process without sacrificing accuracy, and to make it less of a back-breaking exercise.
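To make the requirement concrete, here is a minimal sketch of that kind of filtered search. The record schema, field names (`asset_type`, `territory`) and catalog contents are all hypothetical, chosen only to show descriptive-field filtering plus one end-user-defined category:

```python
from dataclasses import dataclass, field

@dataclass
class AssetRecord:
    title: str
    asset_type: str          # e.g. "feature", "episodic", "trailer" (illustrative)
    language: str
    custom: dict = field(default_factory=dict)  # end-user-defined categories

def search(catalog, **filters):
    """Return records whose built-in fields or custom keys match every filter."""
    return [rec for rec in catalog
            if all(getattr(rec, k, None) == v or rec.custom.get(k) == v
                   for k, v in filters.items())]

# Hypothetical catalog:
catalog = [
    AssetRecord("Night Train", "feature", "en", {"territory": "EMEA"}),
    AssetRecord("Night Train", "trailer", "en"),
]
hits = search(catalog, asset_type="feature", territory="EMEA")
```

A real platform would back this with an indexed search engine rather than a linear scan, but the shape of the query — fixed descriptive fields plus at least one custom category — is the point.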


The more integrated this process is, the better. This component should include the ability to establish a basic content record (that can be augmented later), trigger an accelerated file transfer protocol (e.g. Signiant or Aspera) and store the content in a secure, private cloud—all in a single, coordinated step.
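The "single, coordinated step" can be sketched as below. Everything here is hypothetical: the `transfer` and `storage` callables stand in for a real accelerated-transfer client and private-cloud store, and the file path and URL schemes are invented for illustration:

```python
import uuid

def ingest(title, file_path, transfer, storage, catalog):
    """One coordinated ingest step (illustrative): create a stub record,
    hand the file to an accelerated-transfer client, pin it in the cloud."""
    record = {"id": str(uuid.uuid4()), "title": title, "status": "ingesting"}
    catalog.append(record)                    # basic record, augmented later
    remote_url = transfer(file_path)          # e.g. a Signiant/Aspera wrapper
    record["location"] = storage(remote_url)  # store in the private cloud
    record["status"] = "stored"
    return record

# Stubs standing in for real transfer/storage services:
rec = ingest("Night Train", "/media/nt_master.mov",
             transfer=lambda p: "xfer://" + p,
             storage=lambda u: u.replace("xfer://", "s3://vault"),
             catalog=[])
```

The design point is that the catalog record exists before the transfer completes, so the asset is trackable from the first moment of ingest rather than after the upload finishes.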

Moving large files to online platforms is a function of file size, connectivity speed and, to some degree, the distance between the data center and the file’s point of origin. Moreover, DAM systems of this type must interface with third-party transfer clients for secure, high-speed transfers, both for the sender who uploads and stores content and, eventually, for the recipient who receives and potentially downloads the delivery.
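A back-of-the-envelope estimate makes the size/speed/distance relationship tangible. The efficiency figures below are illustrative assumptions, not vendor benchmarks; they model the way long-haul TCP throughput degrades with latency while UDP-based acceleration holds closer to the raw link rate:

```python
def transfer_hours(file_size_gb, link_mbps, efficiency):
    """Rough transfer-time estimate: payload bits / effective throughput.
    `efficiency` models protocol overhead and distance-related loss."""
    megabits = file_size_gb * 8 * 1000   # GB -> gigabits -> megabits
    return megabits / (link_mbps * efficiency) / 3600

# A 200 GB camera-original package on a 500 Mbps link (illustrative numbers):
tcp_hours = transfer_hours(200, 500, 0.25)   # long-haul TCP, assumed 25% efficiency
udp_hours = transfer_hours(200, 500, 0.90)   # accelerated UDP, assumed 90% efficiency
```

Under these assumptions the same package takes several times longer over plain long-haul TCP than over an accelerated UDP transfer, which is the gap the third-party clients exist to close.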

Without this crucial feature, any large transfer is untenable, even with decent available bandwidth. With advanced file transfer technology in place, both sender and recipient leverage the power of the platform’s UDP acceleration license at reasonable cost.

This function must also accommodate collateral content, such as key art, script and all manner of text files that can be linked to the master content so that a particular project can be fully packaged for delivery.


Metadata is the rich repository of data that describes a piece of digital content. It’s the glue that holds the database together and defines its contents. This includes both descriptive data, manually entered or retrieved from other sources, and technical information located in the file header. The media platform must be able to mine metadata on ingest and auto-populate multiple fields, effectively acting as a container check on the master file. Once the content has been uploaded, the platform should allow the addition of further metadata, including information imported from other platforms.
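The mine-and-auto-populate step can be sketched as a mapping from header fields onto catalog fields. The header keys below follow the shape a probe tool typically reports, but the exact keys, the catalog field names and the sample values are all hypothetical:

```python
def auto_populate(record, header):
    """Map technical header fields onto the catalog record,
    leaving any manually entered fields untouched."""
    mapping = {                      # header key -> catalog field (illustrative)
        "codec_name": "codec",
        "width": "frame_width",
        "height": "frame_height",
        "duration": "runtime_seconds",
    }
    for src, dest in mapping.items():
        if src in header and dest not in record:
            record[dest] = header[src]
    return record

# Hypothetical header as mined from a master file on ingest:
header = {"codec_name": "prores", "width": 1920, "height": 1080, "duration": 5400.0}
record = auto_populate({"title": "Night Train"}, header)
```

The `dest not in record` guard is the design choice worth noting: header-derived values fill gaps but never overwrite descriptive metadata a human has already entered.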

This is the first in a two-part article on cloud-based media management platforms. Part two explores how cloud-based platforms manage technical operations including the management of master files, support for codecs, transcoding and review and approval.

Kenneth Yas is Managing Director—Americas for WCPMedia Services, a developer of cloud-based content management and distribution systems. Yas is a 25-year veteran of the post-production industry whose background includes senior management and marketing roles with Lightning Media, Thomson Grass Valley, The Post Group and Lucasfilm.