Storage: From film to optical

Throughout the 1940s, television began to come to the forefront. Producers looked at the recently developed magnetic tape technology in an effort to apply it to television for the same reasons Bing Crosby had applied it to radio — rebroadcast and editing capabilities, according to Paul Cameron in his article “VTR Technology” in the Broadcast Engineer's Reference Book.

Crosby had invested $50,000 in a small startup called Ampex after seeing the AEG Magnetophon technology that Signal Corps engineer John Mullin had brought back from Germany following World War II, according to Cameron.

In the late '40s, Mullin approached Crosby about applying this tape technology to video, and, with Ampex, he developed a prototype in the early '50s that led to the introduction of what is widely recognized as the first video recorder, Ampex's VR-1000, in 1956, Cameron says.

“The breakthrough that Ampex had was a two-fold breakthrough. One was that they came up with the concept of using spinning heads,” says Mark Schubin, technical editor for Videography magazine. “But [unlike what the Germans were doing with spinning heads during World War II] what Ampex did was have the heads spin transversely across the tape, and there were four heads, which is why the machines got known as quadruplex machines.” With the advent of this quadruplex machine came the 2in quadruplex videotape and the ability to edit TV content.

“The early form of videotape editing was with a razor blade — splicing, the same as with film editing,” says Anthony Gargano, industry consultant. “It was the actual cutting and splicing of the videotape.”

Videotapes and their corresponding machines continued to develop through the '60s, with only incremental technological changes, until about 1970, when Sony introduced the U-matic.

“U-matic was really the all-time champ because Sony itself sold, over the life of the U-matic tape … well over 1.5 million U-matic machines,” Gargano says.

Originally designed as a consumer product, the U-matic failed to excel in that market because of its size and cost; its 3/4in tape held one hour of content, according to Schubin. From there, various analog videotape formats were introduced, including Sony's 1/2in Betamax in 1975; Bosch's 1in Type B in '76 (which, according to Gargano, became the standard in Europe); Sony and Ampex's 1in Type C in '76; JVC's 1/2in VHS in '76; and Sony's 1/2in Betacam in '82.

Digital videotape

Digital videotape came into play in the late '80s with the introduction of Sony's D1, the first digital videotape recorder, followed closely by D2, which recorded a PAL or NTSC signal with no compression and used 19mm tape. Both formats were relegated mainly to post-production applications.

“[D1] used 3/4in tape, just like the U-matic — large reels, but [it was] amazing that you could have a cassette that recorded what we refer to today as 4:2:2 video,” Schubin says.

Cameron says digital videotape continued to develop to include Sony's 1/2in Digital Betacam in 1993; 1/4in DV and MiniDV in '95; Panasonic's 1/4in DVCPRO in '95; Sony's 1/4in DVCAM in '96; Sony's 1/2in Betacam SX in '96; JVC's 1/2in Digital S in '96; Sony's 1/2in HDCAM in '97 (adding to the burgeoning interest in HD video); Panasonic's DVCPRO 50 in '98; Sony's 1/2in IMX in 2000; and Sony's MicroMV in 2001.

IT and storage

IT and broadcasting technologies have converged many times throughout the years, yet the IT industry has never specifically tailored products to target the broadcast market. There may be a reason for this: according to market research firm IBISWorld, the computer and peripheral manufacturing industry's revenue in 2008 was about $62 billion, whereas the TV broadcasting industry's came in at about $37 billion. Years ago, the disparity was even greater.

“No storage technology has been invented, other than the videotape … for television purposes. All of the storage technologies that we talk about, linear data tapes and so forth, hard disc, were not invented because of video; they were invented for something else, and television co-opted their use,” says John Luff, broadcast technology consultant.

In September 1956 (the year of the introduction of the first videotape recorder), the first model of the IBM 350 disc storage unit was released — a large cabinet that was part of the even larger IBM 305 RAMAC (Random Access Method of Accounting and Control) system. Jumping ahead to 1973, IBM released the 3340 direct access storage facility, code-named “Winchester,” which featured a smaller, lighter read/write head that could ride closer to the disc surface.

“It was IBM that developed the Winchester drive, and that was the underlying technology for all the hard drive-based products,” Gargano says. “RCA developed CED; that was capacitive head pickup technology, which actually required contact with the disk surface.”

Then beginning in the early '90s, innovations in storage systems in the IT world also revolutionized those in the broadcasting space with the introduction of redundant array of independent discs (RAID) systems. These coordinated multiple hard disc drives to provide higher levels of redundancy and performance than could be achieved with a single drive. The emergence of smaller, more powerful drives also helped the proliferation of RAID storage arrays, which became fairly standard in the mid-1990s, according to a 2003 article in IBM Systems Journal by R.J.T. Morris and B.J. Truskowski. And with the improvements in computer technology in the '80s and '90s, storage area networks (SAN) and network-attached storage (NAS) enabled IT technologists, as well as broadcasters, to work with shared storage over a computer network, according to the article.
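
As a rough, purely illustrative sketch of the redundancy principle behind RAID (the parity scheme used by RAID levels 3 through 5), the Python snippet below shows how an XOR parity block lets the contents of any single failed drive be rebuilt from the surviving drives; the function and data names are hypothetical.

```python
# Illustrative sketch: XOR parity as used (conceptually) by RAID 3/4/5.
# Three "data drives" hold stripes of bytes; a parity stripe is the XOR
# of the three. If any one data stripe is lost, XORing the remaining
# stripes with the parity recovers it.

def xor_bytes(*blocks: bytes) -> bytes:
    """XOR equal-length byte blocks together."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            result[i] ^= b
    return bytes(result)

# Stripes written across three hypothetical data drives.
stripe0 = b"frame-A"
stripe1 = b"frame-B"
stripe2 = b"frame-C"
parity = xor_bytes(stripe0, stripe1, stripe2)

# Simulate losing drive 1: rebuild its stripe from the others plus parity.
rebuilt = xor_bytes(stripe0, stripe2, parity)
assert rebuilt == stripe1
print("recovered:", rebuilt)
```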

First NLE and video server

According to Schubin, the emergence of what could technically be considered the first storage and playout server system goes back to the evolution of editing and a partnership between CBS and Memorex.

“This company called CMX, which was a joint venture between CBS and Memorex … came up with a machine [the CMX 600] that even today is mind-boggling in its technology,” Schubin says. “So here we are at the end of the 1960s, and they have taken hard drive disc packs, where the discs themselves were about the size of a microwave oven, and the disc reader, or drive, was about the size of a washing machine. So it's like sticking a microwave oven on top of a washing machine.”

These packs, Schubin says, could record about a half-hour of black-and-white video onto disc and play it back. The offline editor's interface consisted of a light pen with a photo sensor; the editor could select shots directly on the screen by touching it with the pen, and those choices built an edit decision list (EDL) in the computer.
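
To make the idea concrete, the sketch below (Python, purely illustrative; the field layout loosely echoes later CMX-style event lists rather than the CMX 600's actual internal format) shows the kind of information an EDL records for each edit: a source reel plus in and out timecodes for both the source and the program.

```python
# Hypothetical sketch of an edit decision list: each event names a
# source reel and the source/record timecodes for one cut.

from dataclasses import dataclass

@dataclass
class EdlEvent:
    event: int        # running event number
    reel: str         # source reel/tape identifier
    track: str        # "V" video, "A" audio
    transition: str   # "C" = cut
    src_in: str       # source in timecode (HH:MM:SS:FF)
    src_out: str      # source out timecode
    rec_in: str       # record (program) in timecode
    rec_out: str      # record out timecode

edl = [
    EdlEvent(1, "TAPE01", "V", "C", "01:00:10:00", "01:00:20:00",
             "00:00:00:00", "00:00:10:00"),
    EdlEvent(2, "TAPE02", "V", "C", "02:15:00:00", "02:15:05:00",
             "00:00:10:00", "00:00:15:00"),
]

for e in edl:
    print(f"{e.event:03d}  {e.reel:<8} {e.track}  {e.transition}  "
          f"{e.src_in} {e.src_out} {e.rec_in} {e.rec_out}")
```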

“Well, the question was, how do you get that information over to the online editors? So CMX built an online editing system,” Schubin says. “This is in the era before floppy discs, so there was no way to transfer the information over to the computer in the online edit room except by punched paper tape, which was what teletypewriters used.”

Editors would then take these massive stacks of punched paper tape, he says, from the offline room to the online editor, and they would redo the edits on the actual content that was to go to air. And so the CMX 600 is widely recognized today as the first nonlinear editor.

Then in 1988, around the time the first digital videotape recorders were being introduced, the now-defunct Editing Machines Corp. (EMC) had the idea of editing within a computer.

“So [what] they came out with didn't really catch on,” Schubin says. “But the following year, another company came out with the concept of nonlinear editing a little more thought out than the EMC version, and that company was called Avid. So that was Avid's first product.”

According to Schubin, the initial Avid product was an offline editor that would eventually take the place of many U-matic machines. As technology improved and floppy discs shrank from 8in to 3.5in, the EDL was transferred from offline to online editing on floppy disk. The quality of the Avid system then improved to the point that the online edit stage wasn't necessary, Schubin says, and that's when it went from being a nonlinear editor to a server.

But the Avid system, along with similar systems from companies including Abekas and Ampex that eventually replaced the tape cart systems in broadcast facilities, was used mainly for production, according to Don Craig, CTO and co-founder of Omneon.

“Prior to 1990, the ‘servers’ were used strictly for production applications, and it was companies like Abekas and Ampex who made very special-purpose [systems]. They weren't called servers; they were just called disc recorders, recording relatively short clips that were used in production,” Craig says.

In the broadcast space, beyond production, it was time delay that helped popularize the use of video servers, according to Luff. “Time zone delay was actually one of the first things servers were used for. People were using it for delay of content for rebroadcast either on the same day or literally just a time zone delay — set an input channel recording and one hour later start an output channel playing the same video, and it becomes a continuous loop,” Luff says. “The earliest uses of servers had absolutely nothing to do with backup and had everything to do with cache, which time delay is really just a long cache.”
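
Luff's time-zone delay is essentially a long circular cache: one channel keeps recording while another plays the same material back a fixed interval later. The Python sketch below is a purely illustrative model of that loop, with a handful of frames standing in for an hour of video; all names are hypothetical.

```python
# Illustrative model of time-zone delay as a circular cache:
# an input channel writes frames into a fixed-size ring buffer,
# and an output channel airs the same frames DELAY_FRAMES slots later.

from collections import deque

DELAY_FRAMES = 5          # stands in for "one hour" of video
ring = deque(maxlen=DELAY_FRAMES)

def playout(frame_number: int, frame: str) -> None:
    print(f"air frame {frame_number}: {frame}")

for n in range(12):                   # incoming frames from the input channel
    if len(ring) == DELAY_FRAMES:     # buffer full: the oldest frame is due to air
        playout(n - DELAY_FRAMES, ring[0])
    ring.append(f"frame-{n}")         # record the newest frame, evicting the oldest
```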

Before co-founding Omneon, Craig worked at Tektronix.

“In the early '90s, we started to see very compact and practical applications of JPEG-based compression techniques, and that, coupled with the performance we started to see with 3.5in disk drives, was what enabled the [Tektronix] Profile to happen,” Craig says.

According to Craig, the creation of Profile came out of a merger between Tektronix and Grass Valley, because they were looking to build a platform rather than just an application. The first incarnation of Profile supported four channels of video for a total of 32GB of storage.

“It had to act like a tape machine, and it had to act like four tape machines in one — that was really the selling point,” Craig says. “So for the price of one tape machine, you could have four tape machines in one; the only catch was you couldn't remove the tape.” The fact that there was no removable tape didn't bother many broadcasters upon Profile's introduction.

Soon after the release of Profile, Craig left Tektronix to work abroad, but returned in 1998 to co-found Omneon. “Our initial goal was to do transmission servers along the lines of Profile, but with more comprehensive network I/O,” Craig says. “The notion with Omneon was we'd interconnect everything with a computer network, and the initial products were done with IEEE 1394, or FireWire. So [adhering to] the decisions in 1998, we elected to use Fibre Channel for disk I/O, we elected to use FireWire 1394 for video interconnect within the system, and we elected to use Ethernet for control and file transfer to these systems.”

Using FireWire meant there was no practical limit on the number of devices that could send and receive video within the context of one server, according to Craig. And that's how Omneon set itself apart from Profile, which under Grass Valley was relaunched as the K2 line in 2005, adding some clustering features to the original design, Craig says.

The future of storage

Luff believes that optical storage is likely to be adopted in the near future, once the technology is proven and becomes cost-effective, primarily because light “is an order of magnitude shorter in wavelength, so you can put that much more data in the same volume optically than you can with any other kind of electrical signal.” The adoption of optical may become as commonplace as fiber, for much the same reason.

Craig, on the other hand, has a healthy amount of skepticism about the medium.

“By the time the years roll by and this optical stuff ends up practical, magnetic for sure will be long past it, and I think solid state is going to catch magnetic. Then it'll be interesting to see how it goes. My guess is in 10 years, it'll all go solid state.”

Schubin essentially disregards “new” storage media and instead points to a concept of storage migration presented in a paper by Dave Cavena for Sun Microsystems. “[Cavena] said screw the technology; forget about the media. Instead, assume that you're going to constantly migrate your technology. So instead of getting the world's best hard drives, get the world's worst hard drives. Because with enough redundancy, they'll still store the data, and they're bound to last five years,” Schubin says. “And then every three years, technology is going to change, and you just buy whatever happens to be the cheap technology. The idea is that you're always storing only the data; you don't care about the storage mechanism because you're only going to be changing it.”
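
A minimal sketch of Cavena's migrate-the-data-not-the-medium idea, assuming nothing about the underlying hardware: the archive tracks only the bytes and a checksum, and each new storage “generation” simply receives a verified copy. All names in the snippet are hypothetical.

```python
# Purely illustrative sketch of "migrate the data, not the medium":
# the archive cares only about the bytes and their checksum; every few
# years the same bytes are copied onto whatever storage is cheapest,
# and the checksum verifies nothing was lost along the way.

import hashlib

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def migrate(data: bytes, expected: str, new_store: dict, key: str) -> None:
    """Copy data onto a new storage 'generation' and verify integrity."""
    new_store[key] = data
    assert checksum(new_store[key]) == expected, "migration corrupted the data"

program = b"master recording bits"
digest = checksum(program)

# Three hypothetical storage generations over the years.
generations = [dict(), dict(), dict()]
for store in generations:
    migrate(program, digest, store, "program-001")
print("data intact across", len(generations), "migrations")
```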

Collin LaJoie is an associate editor for Broadcast Engineering.