
TV Tech Talks – Remote Production – Rebuilding for the Future [ Recorded Webinar]

Published on Feb 26, 2021 | News

Wes Simpson (00:00:11):

This is Wes Simpson. I’m a Contributing Editor for TV Technology Magazine, and I’m here today representing TV Technology, providing one of our updates on remote production. This is the third in a series of three sessions we’ve held, and the time now is to focus a little more on the future.

Wes Simpson (00:00:35):

We’ve spent a lot of time talking about how people reacted in the past and some of the changes that they think have been happening, but for this panel discussion, I really want to see if we can focus on some of the changes that are going to be long lasting, the things that are really going to impact the industry over the next three to five years or however far out our panelists want to take a look. Speaking of panelists, I’m really happy to introduce, first of all, Jim Jachetta, who is CTO and co-founder of VidOvation. Hi, Jim.

Jim Jachetta (00:01:11):

Hey Wes. How’s it going?

Wes Simpson (00:01:13):

Great. Also Ian Main, who’s the Technical Marketing Principal for Teradici.

Ian Main (00:01:19):

Hi Wes. Pleased to meet you.

Wes Simpson (00:01:21):

Nice to meet you, Ian. We also have Richard Dominach, Director of Product Management for Raritan.

Richard Dominach (00:01:29):

Hey there, Wes. Hello, everybody.

Wes Simpson (00:01:32):

And last but not least, Robert Erickson, Strategic Account Manager for Sports and Venues at Grass Valley. Welcome Robert.

Wes Simpson (00:01:43):

All right. So one of the things that is really interesting to me, and I think has been a trend in the industry, is that we’ve completely changed the meaning of what we’ve been calling at-home production for many years now.

Wes Simpson (00:01:59):

Before we were talking about at-home as maybe remoting a venue back to a production facility, but come March of 2020, we started literally talking about people producing live events at their homes. So Jim, how did your business and how did your customers react to that change of circumstances?

Jim Jachetta (00:02:22):

Well, that’s a great question, Wes. For our business, we really didn’t change anything. We didn’t have to reinvent our product or our offering. We’ve been promoting the at-home, or REMI, production model, whatever you want to call it, for more than five years. The first type of event we did, and where we’ve established a niche for ourselves, is live reality TV.

Jim Jachetta (00:02:53):

The first major production we did is the Live PD cop show. For political reasons that show has now been put on hiatus, and they’re doing a Live Rescue show with fire departments and EMTs. It’s a very challenging show.

Jim Jachetta (00:03:14):

We’re using bonded cellular technology from one of our partners, Haivision, and one of the big challenges in doing an at-home, multi-camera production is maintaining frame-accurate genlock and lip sync across multiple cameras. On the Live PD and Live Rescue shows, we deploy 45 to 50 cameras on a given Friday or Saturday night.

Jim Jachetta (00:03:44):

Now, not all the cameras are in the same city or at the same location, but as many as eight cameras and eight sets of microphones are all open in close proximity. Then about a year ago, we did our first event with the PGA. We did an event in the Caribbean and the PGA was blown away. The PGA, like most at-home productions, uses a truck, and for the handheld wireless cameras, they use microwave links.

Jim Jachetta (00:04:17):

And all, if not most, of the microwave gear out there is still using H.264. So the PGA was like, “Wow, we’re doing this for a tenth of the cost. The Haivision VidOvation solution has this amazing HEVC codec. The show actually looks better.” So we’ve proven that you can do a full sports or broadcast production over an unmanaged network like cellular and/or the public internet and keep that genlock. That’s the key for us.

Wes Simpson (00:05:01):

Have you had a change in how that equipment works or how customers are using your equipment in 2020 since the lockdowns began in March?

Jim Jachetta (00:05:15):

That’s a great question, Wes. In Hollywood, early on, the networks and production companies were desperate to generate any kind of live content. These were smaller productions. We did a couple of little celebrity shows, things like cutting a celebrity’s hair at home, live. And just because you have a $5 million house in Calabasas doesn’t mean you get a good internet connection.

Jim Jachetta (00:05:50):

We learned this the hard way. Calabasas is notorious for very poor internet connectivity to the home and very poor cellular. For these types of events, the talent usually doesn’t want a technician coming in. They don’t want a crew coming into their house. They don’t want any strangers. So some of these production companies will make a little flyaway suitcase.

Jim Jachetta (00:06:15):

Hidden in there is our bonded cellular, and our bonded cellular has a relatively new feature where we make a secure VPN connection from the studio to the field, so assets in the field can actually be on the same subnet as assets in the studio. A camera operator can have the PTZ controller back in the studio, in the control room, while the bonded cellular unit sits in a little suitcase with a PTZ camera that’s sent to the talent.

Jim Jachetta (00:06:49):

All they have to do is plug it in. They don’t even have to point it. Well, assuming the camera has 360-degree movement, they put the suitcase in front of them, someone back at the studio frames the shot, and they do the whole show remotely. And because of the robustness of our cellular solution, we’re able to do a show where you normally couldn’t. So that’s something new for us.

Wes Simpson (00:07:14):

Robert, you’ve been heavily involved in the sports arena, and obviously the whole nature of sports has changed over the past eight months. What’s been your experience?

Robert Erickson (00:07:28):

It’s all in the details. Jim makes a really good point that you can’t guarantee good bandwidth in Calabasas, and I think the next step forward is that you really can’t guarantee good internet almost anywhere with at-home connections.

Robert Erickson (00:07:43):

On the quick lessons-learned front, we’re finding as we do these remote productions that anybody at home, literally working from their house or their apartment or their condo, is usually beholden to a service level agreement from their ISP that is awful.

Robert Erickson (00:08:01):

So having a solution that can accommodate things like FEC or ARQ for error control, and that has flexible codecs, whether it’s HEVC or H.264, is really important, because some of these connections are really, really bursty. You might get 40 megabit up sometimes; sometimes you might get three megabit up.
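The idea of adapting the encoder to a bursty uplink while leaving headroom for FEC parity and ARQ retransmissions can be sketched as a simple bitrate-selection rule. This is a hypothetical illustration with made-up overhead figures, not any vendor’s actual algorithm:

```python
def select_bitrate_kbps(measured_uplink_kbps, fec_overhead=0.10, arq_reserve=0.15,
                        min_kbps=800, max_kbps=20000):
    """Pick a video bitrate that leaves headroom for FEC parity and ARQ retransmits."""
    usable = measured_uplink_kbps * (1.0 - fec_overhead - arq_reserve)
    return max(min_kbps, min(int(usable), max_kbps))

# The bursty link described above: 40 Mbit/s up one moment, 3 Mbit/s the next.
print(select_bitrate_kbps(40000))  # capped at max_kbps -> 20000
print(select_bitrate_kbps(3000))   # 75% of 3000 -> 2250
```

A real adaptive-bitrate controller would also smooth the throughput estimate over time to avoid oscillating, but the headroom principle is the same.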

Robert Erickson (00:08:23):

Being able to have solutions that can accommodate that is a big thing. But what we’ve found is that one of the biggest challenges of doing at-home productions is getting the content to the home so people can make the creative decisions. What does that mean? Well, if I’ve got 16 cameras coming in from all over the world but my TD is working from home, how do I get all 16 of those images to the house?

Robert Erickson (00:08:46):

The traditional methods of taking the output of a multi-viewer and slamming it into an H.264 encoder don’t work, because once you start hitting frame-based encoders, or even worse, long-GOP encoders with I-, P- and B-frames, you have multiple frames of delay, and the delay to switch live content through that multi-viewer is way too high. So, how can we get the video to the home with lower delay? That becomes a much bigger topic.

Robert Erickson (00:09:17):

At Grass Valley what we’re able to do is create multi-viewers in AWS and Azure that have 60 to 90 milliseconds of delay. That’s within the couple-of-frames delay that the human eye still can’t perceive. So that allows us to have a very low-delay multi-viewer.
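The 60-to-90-millisecond figure can be sanity-checked with a little frame arithmetic. The frame rates below are assumptions for illustration, not stated in the talk:

```python
def frames_of_delay(latency_ms, fps):
    """Convert a latency in milliseconds to a number of video frames."""
    return latency_ms * fps / 1000.0

for ms in (60, 90):
    print(f"{ms} ms ≈ {frames_of_delay(ms, 29.97):.1f} frames at 29.97 fps, "
          f"{frames_of_delay(ms, 59.94):.1f} frames at 59.94 fps")
```

At 29.97 fps, 60 to 90 ms works out to roughly 1.8 to 2.7 frames, which matches the “couple of frames” characterization.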

Robert Erickson (00:09:36):

So if somebody is using a Grass Valley switcher or a Grass Valley control panel from home, but the switcher is 500 or 1,000 miles away, the multi-viewer supports it. That’s part of our AMPP product, part of our Grass Valley Media Universe, which allows us to bring along traditional hardware environments.

Robert Erickson (00:09:53):

Because a lot of our customers, especially with no fans in the stadiums and fewer eyeballs on television than they used to have, don’t have the budgets right now to just go buy all new gear for everything.

Robert Erickson (00:10:05):

So how can we make a hybrid environment where I use the hardware assets that I already have, the Grass Valley switchers I have, the Grass Valley cameras I have and so forth and wrap that up with a software-based abstraction layer that allows us to be able to control those assets from anywhere, to see those assets from anywhere.

Robert Erickson (00:10:22):

And that’s been the biggest challenge. That’s also been the biggest plus for us is to be able to mix and match these different environments between a hardware solution and software solution, even in a hybrid solution. There’s a lot of challenges, and I know we’re going to be talking about some of these later on that come in with doing these productions.

Robert Erickson (00:10:40):

I’m glad Jim brought up the challenges of just getting content to the home. But a lot of this is in the details. It’s how you stitch together an entire workflow, from the second a photon hits the camera all the way to the output of the switcher and the output of the audio console. Being able to chart that, design that, and work around problems, that’s really the hardest part.

Wes Simpson (00:11:02):

It sounds like you’ve had to overcome some pretty interesting challenges. Richard, at Raritan you folks are not traditionally in the broadcast arena, but I think you guys have something really valuable to add into this whole mix of technologies.

Richard Dominach (00:11:19):

Sure. Thanks Wes. Yes, we’re a broad line supplier and we’re probably a little better known in the IT space, but actually we’ve been supporting broadcast customers probably for over 20 years. We’ve had our Paragon system in the ’80s that was used by many broadcast customers over the years and we’ve worked from there. So, we do support the broadcast industry.

Richard Dominach (00:11:52):

For us what we do is give you IP-based remote management of software applications and we can do this over a LAN, a WAN or even the internet. And we’ve tuned our tools over the years to work over a variety of different networks. I mean, it works best over a high speed LAN, but we have different types of configuration that you can do to work effectively over a WAN or even working from home.

Richard Dominach (00:12:32):

So, we have a variety of different IP-based tools and KVM switches that allow you to do that at different performance levels and also at different price points. We also have a user station that’s mostly used in a control room or a studio. Your tech, your operator, can sit there, work with the user station across multiple monitors, and be accessing literally 20 different programs at once, with access to hundreds of different programs overall.

Richard Dominach (00:13:13):

So that’s typically what people use when they’re in the control room type situation. But we also offer access by a laptop or via PC. For example, IT people can be accessing those systems and those programs and maintaining them as well as broadcast engineering. So typically at home access over the internet or over a VPN is something that we’ve offered, but more in terms of emergency access. But given the pandemic, people would use that same capability when they’re working from home.

Richard Dominach (00:13:59):

So we’ve seen quite an uptick of people using our products in the pandemic, and hopefully that’s allowed them to see what’s possible in an emergency situation, but also understand what they can do in the future, right? In more of a planned control way to change their workflow, to change how their staff works at different locations.

Wes Simpson (00:14:30):

Great. Ian, tell us a little bit about what you folks are doing at Teradici and I’m particularly interested in the cloud perspective.

Ian Main (00:14:42):

Absolutely. I mean, Wes, we had sort of two classes of customers here. We had our traditional visual effects, post-production and editorial companies, who were using virtual workstations, typically on-premises. So they were already remoting to some extent, over high-quality, low-latency networks, and they were using part of what we have to offer in conjunction with some connection management.

Ian Main (00:15:10):

But of course the pandemic for those gave them an opportunity to accelerate stuff they’d already been thinking about, which was at least getting that orchestration of their workstations going. So, that happened at a really high speed.

Ian Main (00:15:24):

We have a product called Cloud Access Software, which is a remote workstation, remote desktop product, in conjunction with a product called Cloud Access Manager, which handles orchestration of your workstations, wherever they may be. So it’s a hybrid, multi-cloud orchestration layer for your editorial workstation, your Avid or your Adobe Premiere station.

Ian Main (00:15:45):

We had a whole bunch of our post-production and broadcast companies just accelerating that migration so they could have a PCoIP connection, without a VPN, right into their home, and they could take home either the zero clients they’re used to, or start working with the Macs or laptops they were used to.

Ian Main (00:16:06):

But you mentioned cloud, and this is where we had another emergency: all sorts of companies who weren’t set up for remote and virtual workstations. We had this big wave of companies who were just used to having workstations under their desks, so we set them up with the ability to access all different types of NVIDIA or AMD workstations.

Ian Main (00:16:32):

But we also have a longstanding relationship with Azure, GCP and the AWS team, where our product sits on virtual workstations in all of those clouds. So in all three cases the broadcast studios or VFX houses are aligned with one or more cloud vendors.

Ian Main (00:16:50):

So they were able to take on these virtual workstations with the most convenient cloud service provider, typically with whatever was closest to them, add this management layer, which is already in place for them on-prem, and now extend those out to the public cloud so they could have users accessing virtual workstations directly or continuing back into their broadcast infrastructure.

Ian Main (00:17:15):

So you get this really nice opportunity to mix and match your assets and I think this is, I mean, we’ll talk about this later, an ongoing migration, if you will, but in the short term, people were just grabbing assets. The price of virtual workstations was secondary just to getting a secure high-performance access.

Ian Main (00:17:37):

I think Robert was talking a bit about latency. Of course, that was critical in terms of expectations. People were working at one or two milliseconds of latency to a local workstation. Now they’re at whatever their ISP link gives them, 10 or 20, even 50 or 100 milliseconds, depending on what they’re doing. So there’s an expectations aspect there that we can probably talk about.

Wes Simpson (00:17:59):

That makes a lot of sense. One of the questions that always comes to broadcasters’ minds during any discussion that has anything to do with the cloud is security. So for any of you four gentlemen, what are your thoughts on that?

Wes Simpson (00:18:17):

Is that something that we as an industry have properly addressed or are there some big hurdles that we need to overcome to really have secure workflows for this very expensive content that we’re managing?

Ian Main (00:18:32):

I can speak to the post-production part, or editing all the way through to post-production. Teradici’s technology has been used by the big movie producers and movie houses for many years, with an encrypted protocol end to end. Typically it was on-premises or locked-down facilities, and our products are deployed with government security customers, those sorts of customers too.

Ian Main (00:18:58):

Extending to the cloud, I think there’s an educational component as well as some exciting architectural changes in zero trust and the need to authenticate people as well as devices. Some of these trends are driven by industry initiatives like the Trusted Partner Network, the CDSA and the MPAA in the movie industry.

Ian Main (00:19:19):

I think the broadcast industry will be taking on some of these constructs, and then there’s an education in locking down the cloud. Gaps in security typically come from policies that aren’t properly set up, rather than from the technologies themselves.

Richard Dominach (00:19:34):

In our case we’re used across many verticals, and we’re heavily used in government and military systems. So we’ve evolved our products over time for different types of security and followed some of the US standards for military-grade encryption. There are some new standards coming out, and we’re also seeing a lot of requests for two-factor authentication, right? Because obviously a login and password alone is no longer acceptable.

Richard Dominach (00:20:29):

The other thing we’re seeing is that some of the older protocols we’ve used, like SSL for web-based applications, are no longer considered secure. So it’s kind of a continuous struggle, or challenge, keeping up with all of this, right? Keeping up with the latest in security, making changes, moving to the next secure protocols and supporting the needs of our customers.

Richard Dominach (00:21:08):

From our broadcast customers, we don’t get the security requests that we see from, say, government, military or financial customers, but we are being driven by those industries to upgrade our products and our methods, and then everybody can benefit from that.

Wes Simpson (00:21:40):

I’d like to pivot a little bit and talk about the concept of timing and synchronization and delay, because I know a lot of people have to deal with those circumstances in a lot of current productions and especially in a live production, those things can be a killer. Jim, did you want to dig into that a little bit deeper?

Jim Jachetta (00:22:03):

That’s a great question, Wes. I think in the early days of Precision Time Protocol, I may actually have learned the basics from you, Wes. But we get asked a lot, VidOvation and our partner Haivision, how is it that you’re able to maintain this magical genlock? How are you keeping multiple cameras synchronous?

Jim Jachetta (00:22:35):

Haivision uses a transport protocol of their own. It’s proprietary. They call it Safe Streams Transport, SST for short, and there’s a control mechanism. I think Robert or Ian mentioned it: there’s forward error correction, there’s ARQ. It’s adaptive bit rate, like Robert mentioned.

Jim Jachetta (00:23:00):

You might have 40 megs of throughput one second and three megs the next, so you need a solution that can adapt to those changes. The benefit of bonded cellular is that in the field you have two very good clock references, the cellular network and GPS. It’s very common to have GPS. So you pretty much have, within a few hundredths of a second, a reference that’s in sync with the studio and the encoder, but that doesn’t get you down to the frame level.

Jim Jachetta (00:23:40):

And as we all know, Precision Time Protocol won’t work over an unmanaged network like the public internet or cellular, but I believe Haivision is using similar techniques. Their receiver, which they call the StreamHub, acts like the master clock, and then reference signals are sent back and forth, difference signals, error signals.

Jim Jachetta (00:24:04):

So they get the timing really, really close, and then, because the product also has frame synchronizers built into its outputs, if the final product is a frame off, that can be fixed with frame syncs, while also keeping track of where the audio is. Again, I don’t know all the secrets, and I don’t know if Haivision would want me to share all the secret ingredients, but at the end of the day the video and the audio are spot on.

Jim Jachetta (00:24:37):

If you watch golf without that, the show would be unwatchable. You’d hear whoosh, whoosh, multiple swings of the club. And in a live production, you can’t fix anything in post. There is no post.

Jim Jachetta (00:24:58):

So it goes out live. If it arrives out of sync, you can’t fix it. You can’t repair it.

Wes Simpson (00:25:05):

Robert, have you encountered similar circumstances?

Robert Erickson (00:25:08):

Yeah. Quite a bit. And I think when you model an end-to-end remote production solution, it gets really, really complicated. Jim and I have already talked about how just getting sources from your source position, wherever it happens to be, into the data center, to your switch timing plane, is part of the equation. But there’s so much more to it, because delay aggregates as you go.

Robert Erickson (00:25:37):

So maybe you have a frame of delay or two at source capture. Then you have a frame of delay, or hopefully less, at media processing, whether it’s going through a video switcher in software or hardware. Then you have more delay added by multi-viewing, and more delay added by any normalization you do at the back end after it’s been switched.

Robert Erickson (00:25:55):

For us, what we’ve found is a millisecond here, a millisecond there or 20 milliseconds here or 20 milliseconds there adds up to potentially seconds at the end. So for us being able to model what the determinism is or the delay is for the system is really important.

Robert Erickson (00:26:14):

If we’re going over a 2110 pipe on a LAN, we know it’s going to be sub-line delay. But we know that if we’re going to a home over some of these cable modems, we’re adding FEC and ARQ, and if it’s low-bandwidth HEVC, we’re going to have a ton of delay added to that.

Robert Erickson (00:26:32):

Understanding that is important, but I think Jim also made some really good points about what we can do now with timestamps, with being able to track time as it goes through. Because for the most part, accurate time is now ubiquitous, whether it’s NTP, which isn’t quite accurate enough for our use but pretty dang close, or SMPTE 2059 PTP, you have IEEE 1588, whichever standard you want to go off of.

Robert Erickson (00:26:57):

Almost everybody has that now in most of their source devices. So Grass Valley, as part of our AMPP solution, is solving that problem: how can we intelligently track timing, and how can we intelligently correct it? For example, what’s the possibility of timestamping every packet, regardless of where it starts as a source, comparing that to when we receive it on the switch plane, and automatically adjusting that time delay as we need to?
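The timestamp-and-compare approach can be sketched in a few lines: given a measured latency per source against a common clock, delay every faster source to match the slowest so all cameras align. This is a hypothetical illustration of the principle, not Grass Valley’s AMPP implementation:

```python
def alignment_delays_ms(path_latency_ms):
    """Given per-source path latencies (timestamp at source vs. arrival at the
    switch plane, against a common clock), return the extra delay to add to
    each source so that all of them align with the slowest path."""
    slowest = max(path_latency_ms.values())
    return {src: slowest - lat for src, lat in path_latency_ms.items()}

# Hypothetical measured latencies for three contribution paths.
sources = {"cam1": 45.0, "cam2": 180.0, "cam3": 95.0}
print(alignment_delays_ms(sources))  # {'cam1': 135.0, 'cam2': 0.0, 'cam3': 85.0}
```

The cost of this alignment is that the whole production runs at the latency of the worst path, which is why the codec and link choices discussed above matter so much.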

Robert Erickson (00:27:27):

Because Jim brings up a really good point: if you bring in sources from five or six different locations, with five or six different codecs, with five or six different types of internet bandwidth, you’re going to have delays ranging from potentially seconds, if one of those encoders uses I-, P- and B-frames, down to sub-frame if you’re doing something like JPEG XS.

Robert Erickson (00:27:47):

So one of the biggest challenges is timing. How can you intelligently manage all of those delays, and then continue tracking them as they go through the switching plane and the processing plane? That’s what Grass Valley has been spending a lot of time on within our Grass Valley Media Universe and our AMPP platform.

Robert Erickson (00:28:06):

And for us, it’s intelligent media, intelligent timing. That’s what you get with the whole Grass Valley solution. But even if you do a solution where you’re just taking bits and parts from other vendors, it’s still something that’s really, really important. And honestly, one of the biggest challenges you have to address.

Wes Simpson (00:28:24):

Well, thanks. I’d like you gentlemen all to put on your forecasting hats for a minute, get out your crystal balls, whatever you want to call it and tell me, what do you see that’s changed already in 2020?

Wes Simpson (00:28:40):

I mean, we’re in the last quarter of the year, last month of the year, what’s going to happen going forward in early 2021 and beyond? What kinds of new technologies, new solutions do you think your companies are going to have to bring to market in order to meet this changing production environment? And maybe I’ll start with you, Richard.

Richard Dominach (00:29:04):

For us, we’ve been providing remote management products for over 30 years, so it’s not something that’s new to us. It was really heartwarming to see that people could make use of what we develop. So it’s not so much that we came out with new solutions, but we had the solutions there for people to use, and people needed them more than ever.

Richard Dominach (00:29:43):

But we have had some new things come out last year and this year that I think have been very helpful to people, and our control room customers have been pushing us very hard on our user station product, which an operator or a tech would use in a studio or a lab. So we’ve increased what that product can do. In the past, it was using our KVM switches and KVM connections to the different software systems they were using.

Richard Dominach (00:30:25):

And then people also wanted things like RDP access to certain systems, VNC, SSH, use of virtual machines, web browser access. So we added that all in. So now it’s not just KVM sessions, you can get this other type of access to other different programs that are out there giving you your own mini control room at your desk.

Richard Dominach (00:30:56):

The other thing we came out with is a higher-powered user station. It could support more sessions and up to three connected monitors. And then that wasn’t enough for people. They wanted four, or they wanted six, or they wanted the ability to use nine monitors for all the programs they were working on.

Richard Dominach (00:31:19):

So our engineers came up with a way to chain together user stations such that with a single keyboard and mouse you can seamlessly move across the user stations and support up to 45 different monitors, not that anyone would do that. And then once you had all those programs laid out, you wanted to be able to save the window configurations and restore them, so we’ve added that. We didn’t do it for the pandemic, but we added it to our products and our bag of tricks.

Wes Simpson (00:32:05):

Yeah. I’m not sure there’s too many spare bedrooms that would accommodate 45 monitors, but you never know what’s out there, right?

Richard Dominach (00:32:11):

It is kind of crazy. We thought two would be good enough, and then we thought three and no, it’s amazing what people require to do their jobs these days.

Wes Simpson (00:32:26):

So, Jim, are you going to be a customer of his?

Jim Jachetta (00:32:28):

Yeah. No. I’m salivating. I need 45 monitors in every room; my wife would love that. At VidOvation we have a number of different solutions, but I feel that for live production, at-home production, Haivision is our go-to partner. And I mentioned SST, so it’s a closed system, at least from the encoder to the decoder.

Jim Jachetta (00:32:52):

We get the question a lot: I have the encoder in the field, can I go to GV AMPP directly? Can I go to vMix in the cloud? Or can I go to my website, or to AWS directly? You need software, what we call the StreamHub, to put the video back together.

Jim Jachetta (00:33:16):

I’m sure you guys, and probably most of our audience, are familiar with bonded cellular or bonded IP, but we take a variable bit rate stream of video, vary it to the available bandwidth, and then split it across up to 12 different paths: eight cellular modems, two LAN connections, and Wi-Fi.

Jim Jachetta (00:33:38):

The LAN connections could be satellite, they could be fiber, they could be MPLS, or a variety of all the above. So you have to put that all back together, and that’s the proprietary SST. But our system needs to survive in a multi-vendor, multi-protocol world. So it’s very important that the Haivision StreamHub receiver…
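The split-and-reassemble idea behind bonding can be sketched as round-robin distribution of sequence-numbered packets across paths, with the receiver merging them back into order. This is a toy model of the concept, not Haivision’s proprietary SST (which also handles per-path loss, retransmission and jitter):

```python
import itertools

def split_across_paths(packets, n_paths):
    """Sender side: round-robin sequence-numbered packets across n bonded paths."""
    paths = [[] for _ in range(n_paths)]
    for seq, payload in enumerate(packets):
        paths[seq % n_paths].append((seq, payload))
    return paths

def reassemble(paths):
    """Receiver side: merge per-path packets back into sequence order."""
    merged = sorted(itertools.chain.from_iterable(paths), key=lambda p: p[0])
    return [payload for _, payload in merged]

stream = [f"pkt{i}" for i in range(12)]
paths = split_across_paths(stream, 3)   # in practice: 8 modems + 2 LAN + Wi-Fi
assert reassemble(paths) == stream      # order recovered at the receiver
```

The sequence numbers are what let the receiver tolerate paths with wildly different latencies, which is exactly the cellular-plus-fiber-plus-satellite mix described above.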

Jim Jachetta (00:34:03):

And we call it a hub because it’s not only a receiver, it’s a transcoder, it’s an encoder. We have SDI outputs. We also have up to 16 IP outputs, and that lends itself very nicely. SMPTE 2110 is on the roadmap. NDI we have in beta; that’ll be released before the end of the year. We have hopes for SRT, so we can have an SRT IP output that will go to the GV AMPP ecosystem or to a vMix system.

Jim Jachetta (00:34:36):

SRT is the de facto standard for a lot of things. It works well over the public internet. The real challenge is that something like SRT or even RIST wouldn’t be robust enough over cellular. Cellular is horrible: changing latencies, unpredictable bandwidth.

Jim Jachetta (00:34:58):

We also have RTMP out, HLS out, even simple transport stream over IP, and Dante. A customer asked us recently, “Are you going to have an aux for Dante audio?” and it’s on the roadmap for next year to have Dante capability. All of that is very important so all of us vendors can play nicely together. Robert wants to sell you the whole system, cameras included, but many of our customers, Robert, are using the GV AMPP ecosystem to do their switching.

Wes Simpson (00:35:33):

That’s a good segue over to Robert.

Robert Erickson (00:35:36):

Yeah. I know we didn’t plan it this way, but what I actually really like about this panel is that we work with Jim and a lot of his products. We also work with Ian and a lot of Teradici’s products, alongside our media asset management system, because Grass Valley also has editors and things like that.

Robert Erickson (00:35:54):

And for a lot of our customers who have their computers and their editor workstations at their facility but want their editors to work from home, we’ve been working together with Ian and his company to make these solutions work. So I think it’s really interesting to see how having multiple vendors work together is how we’re actually getting our customers to succeed. We’ve been playing a lot more nicely with a lot of other people, and I think moving forward that’s how it has to be.

Robert Erickson (00:36:23):

I think one of the biggest things we’re going to see in the future is not a complete move, but a substantial move away from a Capex model, where I spend millions of dollars on integrated hardware, to more of an Opex model. We have customers who need a switcher that can do any production out there. I mean, the world’s largest switcher, with the most inputs, the most M/Es, the most outputs, is a Grass Valley switcher. That has its place.

Robert Erickson (00:36:54):

But what we’re also seeing with a lot of remote production is that customers want to spin up a switching platform to do just four, five, or six cameras, something they can spin up, use, and spin down. So being able to take what we’ve traditionally done in hardware and move it into software, taking it from a Capex model to an Opex model, is really important.

Robert Erickson (00:37:14):

But video is really hard to do in software. It’s incredibly high bandwidth. It takes a ton of memory. It takes a ton of compute. If you want to do a simple DVE effect, the amount of math that goes into doing a DVE on a 1080p signal is actually substantial.
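[Editor’s note: to put rough numbers behind that, these are back-of-envelope figures, not Robert’s. Even before any DVE math, just moving uncompressed HD or UHD through a server is multi-gigabit work per stream.]

```python
# Back-of-envelope raw video bitrates, illustrating why software video
# processing chews through memory and PCIe bandwidth so quickly.

def uncompressed_gbps(width: int, height: int, fps: float, bits_per_pixel: int) -> float:
    """Raw video bitrate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

# 10-bit 4:2:2 sampling averages 20 bits per pixel.
hd = uncompressed_gbps(1920, 1080, 60, 20)   # 1080p60
uhd = uncompressed_gbps(3840, 2160, 60, 20)  # 2160p60 (UHD)

print(f"1080p60: {hd:.2f} Gb/s")   # ~2.49 Gb/s of payload (hence 3G-SDI links)
print(f"2160p60: {uhd:.2f} Gb/s")  # ~9.95 Gb/s of payload (hence 12G-SDI links)
```

A handful of such streams will saturate a typical server NIC or a PCIe slot, which is the constraint Robert describes next.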

Robert Erickson (00:37:30):

So, doing video switching and video processing in a cloud or software environment maxes out everything. If you look at the PCIe bandwidth available in a standard server today, we can max it out with video very easily. Look at how much compute sits in an AWS or Azure instance; we max that out pretty quickly too, both on the PCIe bus and on the networking infrastructure between multiple nodes.

Robert Erickson (00:37:57):

So we’re limited today in what we can do in terms of video switching and video processing in the cloud. Can we do it? Yes. Can we do it to accommodate a lot of people’s workflows? Yes. Can we do it to accommodate everybody’s workflows? Absolutely not.

Robert Erickson (00:38:13):

But as compute gets faster and cheaper, as memory and the PCIe bus get more bandwidth, and as storage gets cheaper, the market will evolve as it has been. The ability to do more and more intensive processing in the cloud or in software will become available. So what we’re seeing now is that we’re going to address small to medium-sized workflows in a pure software environment with AMPP, whether that’s in a private data center or in AWS or Azure.

Robert Erickson (00:38:49):

There are things we can do now, and every month or two, as hardware gets upgraded and as the data centers get more compute, we can do more. So if we look out three, four, or five years, a lot of what requires hardware today will be doable in software in the fairly near future. This gives you a very graceful progression from a Capex model, where we’re buying a ton of hardware now, to an Opex model in the future, and it also lets you add features and functionality as the software allows.

Robert Erickson (00:39:18):

Now, the great part is that if you need to do the peak stuff, or you need to do the big productions, we can still do that with hardware. It’s not like this hardware is going away. It’s not like FPGAs just cease to exist. So what we can do is take those hardware-based solutions, whether it’s a switcher, a camera, a processor, an up/down/cross-converter, or whatever it happens to be that we need that capacity for today.

Robert Erickson (00:39:40):

We can wrap them in a software interface to be able to get them to the home or to the data center. Again, that puts us back in this hybrid model. So, that’s going to be the future that we actually…

Robert Erickson (00:39:51):

I think the industry got pushed way faster this year than everybody felt comfortable with, but we’re seeing that hybrid models work. We’re seeing that software models work in certain workflows, and we’re already seeing that software models will be able to grow to handle much larger workflow requirements in the future.

Wes Simpson (00:40:09):

That makes a lot of sense. So Ian, obviously you’ve dealt a lot with pure virtual systems, and I assume that you’ve also dabbled a little in the hybrid arena. What are your thoughts on that?

Ian Main (00:40:24):

Yeah. I mean, I think that’s an endless set of asks. Companies like Grass Valley ask us to make sure that their users get the video editorial experience they expect. We’ve been doing 1080p displays at high frame rates for many, many years, but these days it’s 4K displays. And the remote workflow historically excluded the finishing work, where you have HDR and 10- and 12-bit color on desktops, but that’s now a new need.

Ian Main (00:40:58):

Netflix and others are all doing HDR-type workflows, so production houses need to be set up for that. For the protocol, that means more pixels and more bandwidth on our side. We were fortunate that we had started a project last year called Auto Offload for our PCoIP protocol. Historically we supported our color-accurate remote desktop, which is what we were known for, at a bit of a bandwidth premium. That way, the exact colors an artist or an editor would be seeing on the desktop are what you get remotely.

Ian Main (00:41:33):

Through our relationship with Nvidia, we also supported H.264 video encoding of desktops, which gave you great WAN bandwidth efficiency, but not quite the accuracy of the desktop. And earlier this year, fortunately, we released that capability. Previously, IT would have had to switch manually between one mode or the other, depending on the user.

Ian Main (00:41:53):

Now the protocol switches automatically between them: the very low bandwidth WAN use case, if people are on cellular or far away or on very low bandwidth endpoints, all the way up to that high color accuracy. And now the push is towards 4K and 4K at high frame rates. In our industry, that’s up to 60 frames per second with animation and those types of workloads, and then, coming, HDR and 10-bit.

Ian Main (00:42:20):

And then the other one, I think it’s already been mentioned a few times, is things like SDI, which can be remoted as a video stream today. But now people want that clean feed, which is typically your third, over-the-shoulder review monitor that you would have at a local edit bay.

Ian Main (00:42:35):

You want that to be part of your remote desktop: a third channel that takes your original feed along with the remote desktop, synchronized. So you need the timing, you need the audio. We do stereo audio, but we’ve got a lot of pressure to do 5.1, 7.1, and other audio formats.

Ian Main (00:42:53):

So, on us, there’s a lot of codec work just to get audio better synchronized with video. We have synchronization today, but people need that frame-accurate synchronization, which we’re used to on the front-end production side; post-production wants it too. Then there are the extra peripherals that are coming in. Wacom tablets are one. We work closely with Wacom to terminate the endpoint so that on long-latency networks, artists don’t get a…
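[Editor’s note: for a sense of the timing windows that “frame-accurate” implies, these are illustrative figures, not Teradici specifics.]

```python
# Frame periods for common broadcast rates: the window that frame-accurate
# A/V synchronization has to hit.

def frame_period_ms(fps: float) -> float:
    """Duration of one video frame in milliseconds."""
    return 1000.0 / fps

rates = {"25p": 25.0, "29.97p": 30000 / 1001, "50p": 50.0, "59.94p": 60000 / 1001}
for label, fps in rates.items():
    print(f"{label:>7}: {frame_period_ms(fps):5.2f} ms")  # e.g. 59.94p -> 16.68 ms
```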

Ian Main (00:43:21):

And content editors are using these Wacom devices too, to get that highly interactive experience as if they were on a LAN. So there’s more work ahead of us to make that work over longer distances and worse networks. Although I think we’re all lucky that a lot of investment is going to continue, and accelerate, into the network itself, and I think that will make all of our lives on this panel that much easier.

Wes Simpson (00:43:46):

Richard, talking about desktops, do people have the equipment in their homes that they need in order to run some of these applications? Or are you seeing a big upgrade path being needed? Or how do we solve that problem?

Richard Dominach (00:44:04):

In terms of networks or in terms of [crosstalk 00:44:08]

Wes Simpson (00:44:08):

Well, in terms of both, I mean the end user equipment, the networks feeding those users and then the software tools and other technologies. How do we bring that full performance experience into somebody’s home?

Richard Dominach (00:44:25):

Yeah. I really think we’re at the mercy of the ISP and the connection you can get over the internet, unless people are willing to set up their own types of networks at home. It varies: fiber to the home does exist, I do have it at my house, and you can get higher grades of service. But frankly, it’s going to vary, so I think we’re somewhat hostage to that.

Richard Dominach (00:45:06):

But I think people in the broadcast industry, and really all industries, are going to have to look at that and see whether they’re willing to make any investment to improve the network. But like Ian was saying, what we’ve done is make our products configurable and tolerant of the network, right?

Richard Dominach (00:45:34):

That’s something we’ve invested in over the years, right? Our products work really well on a high-speed LAN and as well as possible on a WAN. And obviously if you’re at home, you’re hostage to the connection that you have, but we’ve invested in different ways to configure our products for working from home.

Richard Dominach (00:46:01):

We’ve even done some maybe somewhat crazy things, like putting in non-color modes, black-and-white modes. Over the years we have had people working over telephone modems, so that if you’re in a really stressed environment and situation, you can at least get access, even though you don’t have much of a connection. And then in terms of the users, on the user’s end, right?

Richard Dominach (00:46:34):

So historically, IT has been a big customer of ours, accessing and supporting equipment in the data center, and their tool of choice was the laptop or the PC. Our protocols have supported those devices, and we’ve made use of the GPUs to provide good performance. And luckily, laptops and PCs are pretty amazing devices for the money that we pay for them.

Richard Dominach (00:47:10):

So for IT, for broadcast engineering, and for people working from home, they can do their work with a laptop. Then maybe four or five years ago, we came out with our user station, right? Which is really tuned specifically for our… We built in a CPU and a GPU and enough resources to do a really optimized job for our customers. That’s in terms of the users and what we support.

Wes Simpson (00:47:48):

That makes a lot of sense. Does anybody else want to talk about that whole desktop-versus-bandwidth conundrum? How are we going to make sure that the users who need access to this high-end equipment you’re all producing… How do we upgrade them? How do we get them to the point where these tools are really useful at home?

Jim Jachetta (00:48:15):

Wes, our ecosystem is… I’m just talking about the Haivision ecosystem right now. When someone needs to remote in… VidOvation, we’re the master distributor in the US for Haivision, and with that comes the responsibility of first-line support, 24/7. And with Haivision, whether you’re logging into an encoder remotely or the StreamHub receiver, physically in master control or in the cloud, all their web interfaces are very light.

Jim Jachetta (00:48:51):

They don’t require a lot of bandwidth, and, if you’re familiar with web GUI design, the interface is responsive. So whether one of my technicians is out to dinner with the family when he receives a tech support call, or a support ticket is opened, anyone on our team can log in from the small screen on their phone.

Jim Jachetta (00:49:20):

You might have to do a little more scrolling to get to the information you want, but everything that would be on a big screen is folded into the smaller screen, and you can scroll and get to it. So there’s nothing you can’t do from a phone or a tablet, iOS or Android. Basically, any device that has a browser can control it. So in that regard, the Haivision part of the ecosystem is very easy to work with.

Wes Simpson (00:49:53):

Let’s bring this home. Let’s talk about things that we’ve seen, things that we’ve envisioned for the future. A few of you mentioned the idea that some existing trends were accelerated by the lockdown. And we’ve gotten some good news about vaccines recently; hopefully they’ll get approved and start getting distributed.

Wes Simpson (00:50:17):

Do you see us going back to our old way of doing things? If not, what’s going to be different about working for a broadcaster in 2021 versus working for a broadcaster in 2019? Robert, do you want to kick that off?

Robert Erickson (00:50:35):

Do I see things going back to the way they were? No. Simple as that. But I think there’s a lot that we lose by not having a lot of people in the same room, able to collaborate, work together, and learn from each other.

Robert Erickson (00:50:53):

What’s really happened this year is that the technology we’re using today to do remote productions has been around for a while, but the amount of risk that a broadcaster was willing to take just wasn’t there to justify jumping in.

Robert Erickson (00:51:07):

Look at things like eSports; they’ve been doing these styles of remote production for five or ten years, but they’re also not dealing with the kind of money you have on Monday Night Football. We’re talking significantly less. But now eSports is actually getting those kinds of ratings.

Wes Simpson (00:51:21):

They’re catching up.

Robert Erickson (00:51:22):

Yeah. The ability to do these workflows has been around for a while, but broadcasters’ aversion to risk, which is entirely warranted, had kept it from accelerating. This year forced the adoption of more risk into our workflows. That part I don’t think is going to change. I think the fact that engineers who are flexible and aggressive and willing to come up with an idea, test it, and bring it to air won’t change, because we’ve verified that that is there.

Robert Erickson (00:51:55):

But definitely, losing the collaborative environments where we can all work together has had an edge to it. And I’ll just talk personally; I’m not even talking about broadcast. Grass Valley is a very large company. We’ve got 400 engineers all around the world, and how I keep up with technology, because technology is changing so fast, is by talking to our engineers.

Robert Erickson (00:52:19):

It used to be by going to our Grass Valley, California office, because that’s where I’m based, and sitting down with our hardware engineers, saying, “Hey, what’s up? What’s going on?” And I’d learn something from them. Or if I happened to be in one of our offices, or meeting some of our engineers at a trade show, I was able to sit next to them, talk to them, pick their brains, that kind of thing.

Robert Erickson (00:52:39):

We haven’t had that environment, so I think as soon as we can, we’ll see these collaborative environments come back. But the industry’s willingness to take risk, to mitigate that risk with testing and with products that have a history of working, yet still take the risk of trying new workflows: we won’t go back to saying, “No, that’s not the…”

Robert Erickson (00:53:03):

I’ve always hated the words, “We’re not going to do that because this is how we’ve always done it.” We haven’t been able to say those words for nine months, and I think we can finally say we’ve largely deleted that from our industry. And I think that’s going to be a very good, progressive element for us moving forward.

Wes Simpson (00:53:19):

Ian, how about you?

Ian Main (00:53:20):

Yeah, I would say two of the risks Robert’s talking about, from our point of view, are user experience and security: the things that were holding technologies back from being deployed at scale. And user experience is a tough one; the video editorial industry has very tough requirements. Remote protocols, by definition, favor interactivity over frame delivery, because that’s the more important criterion.

Ian Main (00:53:49):

So we’ve added controls that can better balance that for video editors, so that at one time they can play back smoothly, but at other times they get that higher interactivity. And I think the broadcast studios are now seeing that the flexibility they’re offering their employees is highly valued, and people will accept a compromise on interactivity and performance to get that flexibility.

Ian Main (00:54:17):

I’m 100% sure that we’re going to end up in some hybrid model, with all the technologies needed to support it, where you have some users remote and some users on-prem, and the tools and collaboration need to support that in a really positive fashion.

Ian Main (00:54:32):

We have examples where a production house in the UK has artists on the island of Mauritius, or here in Canada, so transcontinental stuff. I think people value that, but they also want performance onsite, and they want the whole thing secure, so it’s going to be an exciting future.

Wes Simpson (00:54:53):

I think it will. Richard, did you want to comment on that?

Richard Dominach (00:54:56):

Yeah. I generally agree with Ian and Robert. In the pandemic, all of us were forced to act very, very quickly, right? And we did the best that we could with the tools we had or could get our hands on. I think in the future we can learn from that, adapt, and do a better job of planning, right?

Richard Dominach (00:55:27):

I definitely think we’ve all learned from this; companies have learned from it, right? I think we’ll have more flexibility in the future in terms of where people are, where they work, and how we work together. And in general, we’re all more cognizant of this being a virtual world, and if we need to be together, we can be together, right?

Richard Dominach (00:55:56):

I mean, certainly we’ve seen it in our company: a lot of people like me, in product management and marketing, are probably okay working from home. But engineering really needs that collaboration, needs to be together, and that’s hard to recreate in a virtual world.

Richard Dominach (00:56:20):

I think it will be the same in broadcast: for certain things you want people to be together, for collaboration or innovation or even performance, and other things can be more flexible, right? With the tools that we have, and with people responding to this environment, there will be a lot more flexibility in terms of where people are and how they work together, right?

Richard Dominach (00:56:49):

We’ve seen a lot of requests, even before the pandemic, for support for remote workers and remote facilities, right? Your IT person may not be onsite. They may not even be from your company, right? You also work with consultants, and you may work with people globally. So I think all of that is in the mix, and we’re providing tools to make it possible.

Richard Dominach (00:57:16):

So it’s true that this has been very tough for all of us, but we’ve kept the show going, and we’ll learn from it and do better in the future, with more flexibility and better ways of working.

Wes Simpson (00:57:38):

I certainly hope so. Jim, any closing comments from you?

Jim Jachetta (00:57:42):

Yeah. So I totally agree with Robert, Ian, and Richard. I see the future as a hybrid approach. At VidOvation, we’ve always been promoting at-home production, REMI production from a centralized location, and for the most part the PGA is doing that. I’m sure they have certain operators working from home.

Jim Jachetta (00:58:06):

[Marc Steben 00:58:06], I exchanged some funny emails with him recently. I said, “How are things going?” And he said, “Do you know of a way to protect 130 musicians in an orchestra pit?” Because he works with the Met. So the modern control room, the post-COVID control room, has Plexiglass everywhere, and there are procedures. If you have a temperature, you stay home.

Jim Jachetta (00:58:41):

Back in January, I wrote a thought leadership piece for the Broadcast Engineering Extra edition of TV Tech, and it came out just as things were breaking. One of the concepts we were promoting with at-home production was this: a big element of live reality TV and sports is instant replay. Many of our customers use EVS.

Jim Jachetta (00:59:09):

I’m sure, Robert, you have a Grass Valley equivalent of replay, whatever that is. Evertz has theirs. Everyone has one, NewTek too. So take an EVS operator. You do a basketball game in LA: you put that EVS operator on a plane, you get them a hotel room, you pay them a per diem for their food. They do that one basketball game, or maybe a series of basketball games over the weekend, but they can only do one event per day, or maybe one event per trip.

Jim Jachetta (00:59:47):

If it’s the Super Bowl, they’re only doing one event. Now, what I was proposing in January, and what we’ve been proposing for quite a while, is: what if that knowledge worker, that skilled worker, worked from a centralized location? He does an early-afternoon bowl game on the East Coast, then maybe a late-afternoon bowl game on the East Coast, then an afternoon game on the West Coast, all from the same master control he reports to every day.

Jim Jachetta (01:00:25):

Now with COVID, he goes where he knows there are COVID protocols; he’s within his pod of workers. They isolate different crews on different shifts so there’s no cross-contamination. But now this individual worker can do more than one event a day, and certainly multiple events a week, working from a common master control.

Jim Jachetta (01:00:51):

And then, as Ian and Robert and Richard alluded to, they can still collaborate, but in a safe environment. That’s part of the savings. And then… I’m not looking forward to… I have flown when I’ve had to during this lockdown, but I’m washing my hands and bathing in hand sanitizer. Employers don’t want the liability of demanding that of their workers.

Jim Jachetta (01:01:25):

You need to get on the plane, rent the car, stay in a dirty hotel room, sit in a germ-infested airplane. So I think with these technologies… The silver lining is that this is saving a tremendous amount of money, and we can do more programs, maybe another tier of programming below the Super Bowl and professional sports. Now we can use this technology for NCAA sports, college sports. We even have customers using our tech for high school sports and football.

Wes Simpson (01:02:06):

It sounds like your message is we’re saving lives, we’re saving money and we’re getting more done with the same amount of resources. I think that’s a pretty positive note to end on, Jim. All right. So let’s just recap here. We had Robert Erickson who is the Strategic Account Manager for Sports and Venues, Grass Valley. Thank you, Robert.

Robert Erickson (01:02:27):

I really appreciated it. It’s great to be able to have conversations with you and Ian and Richard and Jim. We all come from very different parts of the industry, so it’s great to share those experiences, what has worked and what hasn’t. I have a glimpse of where we’re going, and I like it.

Wes Simpson (01:02:42):

Great. Then we had Richard Dominach who’s the Director of Product Management for Raritan. Thank you, Richard.

Richard Dominach (01:02:49):

Yes. Thank you, Wes. Thank you, Robert, Ian, and Jim. This was great. It’s great to hear what leading companies are doing. We’re all pitching in to save the world here and keep things moving forward.

Wes Simpson (01:03:05):

And we have Ian Main, Technical Marketing Principal for Teradici.

Ian Main (01:03:11):

Thanks, Wes, and Robert, Richard, Jim. It was really a pleasure to see that lots of other companies are facing pretty much the same challenges as we are. And I think we’re all excited about taking these on and making things better for our customers.

Wes Simpson (01:03:29):

Thank you. And Jim Jachetta, CTO and Co-Founder of VidOvation. Thank you, Jim.

Jim Jachetta (01:03:36):

Hey, thanks, Wes. Thanks for putting together this diverse panel. It was a lot of fun, and it was a pleasure meeting Robert, Ian, and Richard. I think I speak for all of us: we really love doing our part, saving lives and saving cost, to coin your phrase, Wes.

Wes Simpson (01:03:55):

All right.

Jim Jachetta (01:03:55):

Saving lives and saving cost.

Wes Simpson (01:03:57):

There we go. All right. Well, thank you all, gentlemen, and have a great afternoon. We’ll see you again at a future virtual event, and hopefully at a few events in real life in 2021.

Richard Dominach (01:04:13):

Yeah. That’d be great.

Wes Simpson (01:04:14):

I’m looking forward to that. All right.

Ian Main (01:04:16):

[crosstalk 01:04:16] Thank you very much everybody.

Wes Simpson (01:04:18):

Okay. Take care. [crosstalk 01:04:18]

Jim Jachetta (01:04:18):

Bye guys.

