IP’s Impact on Imaging Tech on Full Display at NAB Show

Grass Valley LDX C180 camera
At the 2026 NAB Show, Grass Valley will feature its LDX C180, a compact version of its LDX 180 camera designed for both Steadicam and PTZ applications. (Image credit: Grass Valley)

The history of camera development is often discussed in terms of the most visible changes: SD to HD, HD to UHD and standard to high dynamic range. As improved technologies make those transitions easier to achieve, today’s user demands seem defined more by considerations of workflow, convenience and, tellingly, the kind of photographic creativity that has not always been as much of a priority in broadcast as it has in the world of single-camera drama.

Post Consultant Gary Adcock suggests technologies that have been waiting in the wings for some time are finally entering the mainstream.

“Broadcasters are finally awakening to ST2110 and how that’s going to change things,” he said. “I’m working a lot with productions who are doing virtualized stuff—LED walls—and [with] 2110 we start not being bound by frame rates. You come out of a video card, you’re bound by DisplayPort, but when you do 2110 you’re not.”

Let’s Get Small
The triumph of IP is long-awaited, but Adcock predicted changes for which the industry may be less prepared.

Post consultant Gary Adcock

Gary Adcock (Image credit: Gary Adcock)

“I don’t think some of the broadcasters are being honest with themselves about how small cameras are going to become,” he said. “You can have someone go out with a DSLR and a gimbal and a wireless feed and follow the player around. The minimal depth of field gives that dominant look to a player.

“At the final American college football game, say—NFL was there with a couple of Alexa 35s recording, the broadcasters are there with a couple of larger cameras. But 80% of the cameras on the field were FX6 or smaller, FX3, A7s.”

Adcock’s future predictions coincide with those of many others: “This big push for immersive tech—I don’t see people wearing a headset,” he said. “There are Cosm facilities in Dallas and L.A. now—that’s a mini-Sphere for live sports. They’re shooting multiple, 8K, spherically-configured cameras. It takes two days to set up, there’s someone live switching the event, and it’s a huge effort—but this is the kind of new tech that I see.”

The Real Challenges
Mike Bergeron, product manager for video infrastructure at Panasonic, reinforces the idea that “the challenge is not getting a good picture at this point.”

“[Even for HDR] we have a log system we’ve been doing for a very long time, but for our live cameras we’re mostly [encountering] HLG,” he said. “The only real, cross-vendor, end-to-end workflows for HDR that aren’t proprietary are HLG, and it tends to be what’s specced in arenas for that reason.”
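The HLG system Bergeron refers to is standardized in ITU-R BT.2100, which is part of why it travels across vendors without proprietary metadata. As a rough illustration only (not any vendor's implementation), its opto-electrical transfer function can be sketched as:

```python
import math

# HLG OETF as specified in ITU-R BT.2100 (illustrative sketch).
# Maps scene-linear light E in [0, 1] to the non-linear camera signal.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Return the HLG-encoded signal for normalized linear light e."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)       # square-root segment (SDR-compatible)
    return A * math.log(12 * e - B) + C  # logarithmic segment for highlights
```

The square-root segment is what makes an HLG signal look acceptable on an ordinary SDR display, while the logarithmic segment carries the extended highlights; the constants are chosen so the two branches join smoothly at E = 1/12.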

Bergeron attributes the triumph of IP to convergence.

“There were several islands of IP technology,” he said. “On one hand, ST2110 is pretty well-established on the broadcast side. For a larger system, it’s actually cheaper, and we’ve been pushing to get people to try to move downmarket with it. It goes back to when we introduced the Kairos platform to take advantage of that.

“The other island is your gigabit stuff with things like NDI,” Bergeron added. “Once someone’s got a 2110 network, they’d often like to run some NDI and some SRT into that. Just because you’re doing 2110 doesn’t mean you’re not bringing a Zoom feed in.”

Bergeron credited easier implementation to providers of commodity network hardware, something IP video was always intended to enable. “Netgear’s gotten into the game with the new switches they keep launching,” Bergeron pointed out. “They’ve got people working there who care about the AV industry. And then you’ve just got streaming transport—WAN stuff, which kind of seems to be converging on SRT to get things from one place to another.”

Well-Established Fundamentals
Klaus Weber, Grass Valley’s director of product marketing, begins with a familiar bit of camera philosophy.

“Cameras have always improved over time…one could say we have reached a level of performance that is close to what the human eye can appreciate. Even so, just at the end of last year, I was attending a camera shootout in North America with similar types of cameras from three vendors on the same scene, and I can tell you they’re not all the same.”

The company’s current lineup derives from well-established fundamentals, Weber said. “We were the ones who introduced CMOS into the broadcast space in 2012 with the LDX 80. From the beginning, we introduced CMOS with global shutter. In modern production, there is almost always an LED wall involved, and they are quite challenging for rolling-shutter cameras. Maybe it can be compensated for, but you may have an application where rolling shutter doesn’t work.”

Those foundations, Weber said, have allowed the company to create large-sensor cameras that satisfy broadcast workflows, which may be frustrated by accessorized cinema cameras. “Our first Super 35 imager is an in-house development—it’s based on the same components as our 2/3-inch imager, on a 6.25 times larger area,” he said. “Instead of 8.2 to 8.3 million, we have approximately 52 million pixels and a PL mount. This is our LDX 180, which looks like an LDX 150 but with a larger imager and the same workflow.”
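Weber’s figures are internally consistent: a Super 35 imager is roughly 2.5 times the linear dimensions of a 2/3-inch imager, so about 6.25 times the area, and keeping the same photosite design scales the pixel count by the same factor. A quick check of the arithmetic (illustrative only, using the approximate figures from the quote):

```python
# Illustrative check of the sensor-scaling arithmetic (approximate figures).
linear_scale = 2.5              # Super 35 vs. 2/3-inch, roughly
area_scale = linear_scale ** 2  # 6.25x the imaging area

pixels_2_3_inch = 8.3e6         # ~8.2-8.3 megapixels, per the quote
pixels_s35 = pixels_2_3_inch * area_scale

print(f"{area_scale}x area -> {pixels_s35 / 1e6:.1f} megapixels")
```

At the same photosite pitch, 8.3 million pixels scaled by 6.25 lands at about 51.9 million, matching the quoted “approximately 52 million.”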

Sensor size aside, Weber expands on Bergeron’s thoughts about the application of commodity networking—and particularly wireless options.

“In the old days, the standard was analog RF; this was replaced with digital transmission, but this equipment is now often replaced with systems that use standard technology,” he said. “The question is how to integrate 5G signals into the infrastructure. We work closely with Vislink from the U.K., who have a 5G solution integrated into our camera.”

Sony PXW-Z300 camcorder

At NAB Show, Sony will showcase its PXW-Z300 camcorder, which incorporates a dedicated AI-processing unit and an image processing engine. (Image credit: Sony)

That sort of integration is characteristic of much recent camera development, although Sony’s camera team has still been busy pushing specifications in the right direction.

High Frame Rate
“Over the past few years, I think the biggest change you saw was different cameras that could do high frame rate,” says Robert Willox, director of business management for live production systems at Sony. “Last year, we introduced a box camera that could do 8x slow motion, which has sold very well. Beyond 6x and 8x, things are on the screen too long.”

Robert Willox, director of business management for live production systems at Sony

Robert Willox (Image credit: Sony)

Collaboration between cameras and lenses has been normal in still photography for a long time. Willox reported that new applications—as well as a sheer thirst for optical performance—have provoked the adoption of more camera-lens integration in broadcast, too.

“Adjusting the circuit that puts luminance back into the corners of the image so you can use the full range of the lens—at 122:1, you can get any shot from anywhere in the stadium,” he said. “It’s an amalgamation of forces to get profiles from their lens into our camera…and it becomes even more important downstream with AR.”

Lenses are an enabling technology in many areas, Willox continued. “Very few applications were going to have all-Super-35 on a sporting event because the lenses didn’t exist until very recently,” he said. “They were for bumper shots—and those cameras were successful. We also put them into houses of worship where the front-of-house cameras would get that nice, shallow depth-of-field background—and also corporate environments, high-end telepresence.”

Willox is clear, though, that moderation is key. “For sports, especially, I think it’s always going to be a mixture of a lot of different formats,” he said. “The goal-line boundary, the line markers, are going to be very wide—probably a 1/3-inch to be able to get the coverage. Shallow depth-of-field cameras really do have a place in a reaction shot—someone runs out with a Steadicam and you can get the hero shot and use it for the creation of bumpers.”

Beyond Broadcast
The pressure to identify future technologies has sometimes led to wild speculation—though the industry’s interest in immersive formats could not be clearer. Bob Caniglia, director, sales operations, North America, for Blackmagic Design, said the growing trend of using cinematic cameras for shallow depth of field has led to more interest in the company’s larger-sensor cameras as live cameras, which was not really their original intent.

“But one thing that kind of started the last 12 months or so was the immersive camera in live events,” he said. “They did an NBA basketball game using them for the Apple Vision Pro headsets.”

Modern technology, whether applied to the camera or supporting infrastructure, seems mostly intended to create flexibility and relieve the consequences of choices as basic as framing and camera position, according to Caniglia.

Bob Caniglia, director, sales operations, North America, for Blackmagic Design

Bob Caniglia (Image credit: Blackmagic Design)

“Our cameras wind up in places you don’t expect them in…even the 17Ks are getting more use because people have been exposed to it,” he said. “Right now, when you look at broadcast, they love to over-shoot for establishing shots. You go to an event early, take the camera into the helicopter and you can get amazing footage that way, the larger the sensor the better, and we’ll pull whatever we want from it.”

New applications can demand pixel counts way beyond conventional broadcast. “In my living room I don’t see [the benefit of] an 8K TV, because my living room isn’t that big [but] spheres [use] resolution we don’t normally see,” Caniglia said. “It makes sense because of the environment you’re in. I could see a moment in time where you can go to the local movie theatre where they would have the headsets, they would have the live feed, and in an environment where you’re interacting with everyone else and cheering for the same team.”

That sort of broadcast might change more than the experience—it might change the technology, the workflow, and the entire business model. Ultimately, the practicality of that depends on audience engagement, and the short-term reality seems more prosaic. Still, if there is a sustained public interest in beer and pretzels in a video sphere, the industry could not seem more ready to oblige.