
Archives need standards

July 20, 2011

At a recent seminar on archive storage, I listened to a number of presentations that confirmed my view that we have a long way to go to get standards into common use in the media and entertainment (M&E) sector. The issue is coming to the fore with the demise of videotape, and the parlous state of many videotape and film archives.

Since the beginning of television, the fallback for the program archive has been the original camera negative (OCN) or the source videotapes. If the final program master was damaged, then the material could be reingested and recut as a last resort.

As film and television cameras move to file-based capture, there are no OCNs or videotapes. This raises the urgent need for archives more secure than a mix of external hard drives, cloud storage and proprietary data tapes.

There are many issues around archiving, not least the cost of storage versus the possible future value of an asset. In the past, the focus has been on the life of the storage medium: the master tape was also the master copy of the content. In a file-based system, the content is abstracted from the medium, and a program of timely migration to the latest storage technology solves the problems of media obsolescence and decay.

Resolving the long-term storage of files is one thing, but the other issues are codecs and metadata. That is where the fun starts, with very little agreement across the industry. M&E, in common with the heritage and pharmaceutical industries, must store data for much longer than the decade or less typical of most IT applications. In drug development, that could be the lifetime of a patient. For M&E and heritage, it could be forever.

The solution so far has been to leave tape in a warehouse for 30 years, then scrabble around for a budget to restore and migrate the content once it is found to be decaying. The current answer to everything seems to be "the cloud," although its track record so far has not been inspiring.

What struck me most from the seminar was the approach to metadata. The card indices of the past have morphed into proprietary file formats expressed as XML. Without internationally agreed schemas or document type definitions (DTDs), we will not create open, interoperable archives. The difficulty lies in ambiguous definitions of the tags used in the XML. XML may be a standard, and human-readable, but a house DTD is not a standard.
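To see why a house DTD is not enough, consider a minimal sketch of the problem. The two XML fragments below are invented for illustration; the tag names, date formats and structure are assumptions, not any real broadcaster's schema. Both describe the same clip, yet a system built for one cannot read the other without hand-maintained mapping rules, and even then the ambiguity bites: is "20/07/2011" a transmission date or an ingest date, and is it day/month or month/day?

```python
# Illustration: two hypothetical "house" XML formats describing the same
# archived clip. All tag names here are invented for this sketch.
import xml.etree.ElementTree as ET

house_a = """<asset>
  <title>Evening News</title>
  <tx_date>2011-07-20</tx_date>
  <duration>00:28:30</duration>
</asset>"""

house_b = """<item>
  <name>Evening News</name>
  <broadcast>20/07/2011</broadcast>
  <runtime units="seconds">1710</runtime>
</item>"""

def get_title(doc):
    """Without an agreed schema, every archive needs its own lookup rules."""
    root = ET.fromstring(doc)
    for tag in ("title", "name"):  # per-house mapping, maintained by hand
        element = root.find(tag)
        if element is not None:
            return element.text
    return None  # a third house format would silently fail here

print(get_title(house_a))  # Evening News
print(get_title(house_b))  # Evening News
```

The mapping table inside get_title is exactly the kind of fragile, undocumented knowledge that an internationally agreed schema would make unnecessary: with a shared definition of each tag, any archive could read any other's metadata without per-supplier code.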

SMPTE has been strong in this area, but many see the organization as taking too long and as creating standards that are too complex, MXF being one example. I don't see that as a reason to give up and take the proprietary route. Archivists should join and contribute to standards bodies if we are to find a sensible way forward that future archivists will thank us for.
