Ingest for deep archives

Many broadcasters are faced with tape libraries that are turning to dust as binders decay and tapes disintegrate; what was once an asset becomes a liability. Investment now can turn that tape into a file asset that can be used in the future, but time is not on the broadcaster’s side.

Broadcast Engineering talked to Bruce Devlin, CTO of AmberFin, about some of the issues and processes, particularly about a preservation master versus a restored master that uses today’s technology.

Devlin explained that there are two aspects to archive management. “One is determining whether or not you can find the thing in the archive. Can I generate a business process with it?” he said. “Is the stuff in the archive any good? If I am putting content in an archive, how can I be sure I can generate revenue in 12 months or 10 years when I need to take it out again?”

AmberFin does not play in the “what is it, where can I find it” space, but the company does focus on “is it any good (QC), and could I have put it in better (encoding)?”

What may be considered acceptable encoding for today’s analog SD broadcasts may look very poor when upconverted to HD in the future. If a decaying tape asset, quad or 1in, is rescued and encoded as a suboptimal compressed file to save storage costs, that may be a decision the broadcaster will regret. Once quality has been downgraded below the original, and the master tape has been disposed of because of its poor condition, the file becomes the new master. The potential quality of the original is lost forever. Even if the tape has not degraded, the format may be unplayable for lack of an appropriate VTR. This can be the case with legacy formats such as D3.

Devlin believes that for these reasons, extreme care needs to be taken when preparing a master for archive to ensure the maximum quality. AmberFin specializes in encoding, transcoding and QC. When a broadcaster ingests a tape, performing the best possible encode and the best possible QC (automatic, semiautomatic or manual) turns a liability into a real asset.

Devlin stressed that metadata should be captured to give downstream processes more information about the original material. This includes item history, such as interlace, 3:2 pulldown and scaling. Poor encoding without QC can deliver an archive file unable to be viewed. Devlin cited an example of encoding for the Web that had been captured from film to video with 3:2 pulldown at 30fps and then poorly deinterlaced; it had then been scaled and reinterlaced. The result had been stored without QC, and the poor quality came as a surprise to the company that had encoded the material.
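The item history Devlin describes can be carried alongside the essence as a simple sidecar record. A minimal sketch in Python, assuming a hypothetical JSON layout (the field names here are illustrative, not AmberFin's actual schema):

```python
import json

# Hypothetical processing-history record for one ingested item.
# Field names and values are assumptions for illustration only.
history = {
    "source_format": "quad",            # original tape format
    "scan_type": "interlaced",          # interlace state of the source
    "pulldown": "3:2",                  # cadence applied at film-to-video transfer
    "frame_rate": 29.97,
    "scaling": "none",                  # any resizing applied during ingest
    "deinterlaced": False,              # whether a deinterlace pass was applied
}

# Write the sidecar next to the media file so downstream processes
# (restoration, transcode, QC) know what was done to the material.
with open("item_0001.history.json", "w") as f:
    json.dump(history, f, indent=2)
```

A downstream transcoder reading this record would know, for example, that the 3:2 cadence must be removed before deinterlacing rather than after, which is precisely the mistake in the Web-encoding example above.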

Some broadcasters have evolved a methodology that creates a preservation master file for the deep archive and a service master. Preservation ingest captures a file as close as possible to what can be read off the oldest version available. This means that future technologies that may be able to do a better job at restoration have the maximum possible information to work with.

A service master can then be created from that preservation master. This can be restored with current technology: noise reduction, painting over errors, judicious editing, etc. In the future, new and better tools still have the preservation master to work with, so “you don’t put 2009 compression footprints on an item that may be processed in 2020,” Devlin said. “What you thought was visually lossless in 2009 will look terrible in 2020.”

The preservation master may be an uncompressed MOV at the original resolution. At SD, the service master may be I-frame MPEG at 50Mb/s, kept interlaced for video-originated material or deinterlaced before encoding where appropriate. AmberFin iCR tools are currently based on DCT compression, but a wavelet version is in beta trials for release next year.
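The storage commitment of an uncompressed preservation master can be seen with a back-of-envelope calculation. A sketch, assuming 8-bit 4:2:2 sampling for 625-line SD (the figures are illustrative, not AmberFin's specification):

```python
# Rough bitrate comparison: uncompressed SD preservation master
# versus a 50Mb/s I-frame MPEG service master.
width, height = 720, 576      # active picture, 625-line SD
bytes_per_pixel = 2           # 8-bit 4:2:2: one luma + shared chroma sample
fps = 25

uncompressed_mbps = width * height * bytes_per_pixel * fps * 8 / 1e6
service_mbps = 50

print(f"preservation (uncompressed): {uncompressed_mbps:.0f} Mb/s")
print(f"service master:              {service_mbps} Mb/s")
print(f"ratio: {uncompressed_mbps / service_mbps:.1f}x")
```

At roughly 166Mb/s against 50Mb/s, the uncompressed master costs a little over three times the storage, which is the price of leaving no 2009-era compression footprint on material that may be reprocessed in 2020.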

The AmberFin approach to encoding and transcoding is to do the best preprocessing before compression to realize the optimum compression. The encoded file is then put through a QC process that ensures what goes to archive will deliver the best results when recalled from storage in the future. Devlin stressed that encoding without rigorous QC can be a waste of time and can result in the loss of the original content in a viewable form.