In today’s digital age, data is our most valuable asset. Lose it and you lose not only your work but also your reputation, and possibly your entire business. Still, many post-production houses fail to set up appropriate backup and recovery systems to secure the business-critical data they hold, putting the company at grave risk. Just ask Pixar, which almost lost all of “Toy Story 2” when both its main and backup systems failed. For post-production houses, the ability to recover data quickly is also of the utmost importance. A single post-production house can receive up to 100 restore requests a day, and with the average restore taking 15 to 20 minutes, a team of three can spend a combined 25 hours or more per day on data restoration alone, time that is simply not well spent in an increasingly competitive market.
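The arithmetic behind that figure can be checked with a quick back-of-the-envelope script. The request volume and per-restore times are the numbers cited above; everything else is purely illustrative:

```python
# Back-of-the-envelope check of the restore workload described above.
# 100 requests/day and 15-20 minutes per restore are the figures from
# the article; the team size and variable names are illustrative.

REQUESTS_PER_DAY = 100
MIN_MINUTES_PER_RESTORE = 15
MAX_MINUTES_PER_RESTORE = 20
TEAM_SIZE = 3

# Total effort in person-hours per day, across the whole team.
low = REQUESTS_PER_DAY * MIN_MINUTES_PER_RESTORE / 60   # 25.0 hours
high = REQUESTS_PER_DAY * MAX_MINUTES_PER_RESTORE / 60  # ~33.3 hours

print(f"Total restore effort: {low:.0f}-{high:.1f} person-hours/day")
print(f"Per engineer (team of {TEAM_SIZE}): "
      f"{low / TEAM_SIZE:.1f}-{high / TEAM_SIZE:.1f} hours/day")
```

In other words, restoration alone can consume the equivalent of a full working day for each member of a three-person team.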
As everyone in the media and broadcast industry knows, file sizes are growing rapidly. With trends such as 4K, 8K and HDR becoming increasingly popular, post-production houses are under immense pressure to put effective systems in place to manage files efficiently. Whilst increased image resolution and other picture-improvement technologies can significantly enhance the viewer experience, they all increase the amount of data required to deliver even a short clip. For anyone in the business of producing and distributing content, these radical increases in file size bring a range of new challenges. First, companies need far more storage space than before, and it must be organised and at least partly automated so as not to create more work for technical teams. With bigger volumes of data, moving files around and finding the right ones quickly becomes a challenge in its own right.
The only way to secure and retrieve data efficiently is to have effective backup and recovery systems in place. To achieve this, businesses first need to get familiar with both the data they possess and the systems they already have. This lack of awareness is in fact the main threat to data security: Pixar’s systems didn’t all crash at the same time, but the failing backup system only came to light when the main system broke down. Whilst the company was saved by an employee who had additional backup files at home, the team could certainly have been spared a massive scare had they known the systems they hosted and carried out regular backup and recovery testing.
With a clear overview of existing systems, storage can then be optimized by organizing data according to relevance. The most recent, top-priority files can be accessed more quickly than others, whilst older and less relevant files are archived so they don’t take up too much storage space. A cost-effective, automated tiering system might keep the most relevant and recent data at hand as “green” data for around 30 days, move it to a lower “amber” level of the storage system after 45 days, and then archive any files untouched for over 60 days onto tape or an object storage system. This automated process saves engineers enormous amounts of time and effort, allowing them to direct their energy elsewhere whilst being fully confident their data is in safe hands. Combined with a clear cataloging scheme, the system tells the end user exactly where their data is held, and where it is being moved, at all times.
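A policy like this is straightforward to express in code. The sketch below uses the day counts mentioned above (45 days to the amber tier, 60 days to archive); the function name, thresholds and tier labels are our own illustration, not any particular vendor’s API:

```python
from datetime import datetime, timedelta

# Illustrative tiering thresholds, taken from the lifecycle described above.
AMBER_AFTER_DAYS = 45    # move to a slower, cheaper disk tier
ARCHIVE_AFTER_DAYS = 60  # move to tape or object storage

def tier_for(last_accessed: datetime, now: datetime) -> str:
    """Return the storage tier a file belongs to, based on its last access."""
    age_days = (now - last_accessed).days
    if age_days >= ARCHIVE_AFTER_DAYS:
        return "archive"  # untouched for over 60 days
    if age_days >= AMBER_AFTER_DAYS:
        return "amber"    # ageing, but still on disk
    return "green"        # recent, top-priority data on fast storage

# Example: classify three files of different ages.
now = datetime(2024, 1, 1)
print(tier_for(now - timedelta(days=10), now))  # green
print(tier_for(now - timedelta(days=50), now))  # amber
print(tier_for(now - timedelta(days=90), now))  # archive
```

In practice a scheduled job would run a check like this across the catalog and trigger the actual data movement, which is exactly the kind of work engineers shouldn’t be doing by hand.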
To save storage space, old files need to be moved out of the way to make room for more relevant ones. Once a project is done, files tend to be left unprocessed, sitting somewhere in the vast data library and gathering dust indefinitely. We’ve seen that very few steps are normally taken to either archive or delete old files, which fills up and overwhelms libraries, making them much harder to navigate. Most of the data will, and should, be kept for future reference, as clients often return to old projects wanting to rerun or rework existing material. That doesn’t mean it can simply be left untouched, however: setting up proper archiving and recovery systems makes everyday life easier for those working in post-production houses, and can even save the business, as clients often work with several post-production houses at once and will always go for whichever can provide the quickest turnaround.
Post-production houses hold huge amounts of data and are constantly challenged by time wasted locating, recovering, managing and moving files. Unreliable legacy systems protecting this data add to engineers’ headaches, whilst putting the whole business that relies on it in danger. And with producers often needing to turn projects around as quickly as possible, it is not unusual for them to work with several post-production houses at once and go with the one that can meet those demands. With the right tools, recovering files no longer needs to be an exhausting and time-consuming task, and may even help your business win clients.
Claire Goodall is the senior client executive at Tectrade.