
What’s New in At-Home Production [Webinar Recording]

Published on Jun 18, 2021 | Podcast, Webinars

Join us to explore the impact of the pandemic on at-home production and what new technologies or remote work practices might come from this period of radical change and creativity. You’ll learn more about:

  • Working with talent contributing from home
  • Remote directing
  • At-home production and REMI
  • Simplifying integrated camera control
  • Handling analog audio inputs
  • New technologies, including 4K and four-channel HD

 

Register to watch Webinar Recording & Download Presentation
 
 

What’s New in At-Home Production [Webinar Sign Up]

Wednesday, June 16 at 10 am PT / 1 pm ET

Hosted by Jim Jachetta, executive VP and CTO at VidOvation, and Neal Metersky, U.S. sales and business manager at Haivision

 

Transcript

Jim Jachetta (00:00:00):

So good morning, everyone. I’m Jim Jachetta, co-founder and CTO of VidOvation. I have my good friend Neal Metersky, Sales and Business Manager for Haivision USA. Very happy to have Neal here as a guest. How are you, Neal?

Neal Metersky (00:00:17):

I’m great. I’m great. Staying cool in summertime in Las Vegas, 113 today, but-

Jim Jachetta (00:00:24):

113.

Neal Metersky (00:00:25):

Doing really well.

 

Jim Jachetta (00:00:26):

Yeah, we actually had to put the air conditioning on last night in Southern California. It was a balmy 85 degrees, not 113. But I guess it's all relative. So good morning, everyone. We're trying something a little bit new. We've set up a private text messaging number that only goes to me; it's not something my marketing people are doing. If you text me any of your questions for today's webinar to , you're sending a private message to me, Jim Jachetta, and we're trying to encourage a community for any questions you might have. Periodically, once a month, I'm going to offer something of value to people that communicate with me, people who have questions, and also send invitations to webinars like today's.

Jim Jachetta (00:01:35):

So we encourage you to use this new platform, just text message like you normally would, just to this special number. Of course, also still use the chat function if you prefer to use that in the GoToMeeting. So today, Neal and I were going to talk about what’s changed in at-home production, or maybe even broader what’s changed in everyone’s workflow. As a vendor, as a provider, there’s less people in the office here at VidOvation, where I would say 90% of the staff is working from home. We started using Slack to communicate with each other. Neal, I mean, you’re in Las Vegas. So you’re a little bit far from France. So has your work life changed much in these new times?

Neal Metersky (00:02:33):

Well, for me, it's an interesting perspective, because I've worked from home for years, right, remotely. Just the nature of the position that I have. But it's been interesting to see that as so many more people enter that world, there have certainly been changes, more cameras, more people have their cameras on. But yes, it's interesting. It's driven so many different businesses to do things differently. And obviously one of the things to talk about today is so much of that within broadcast and live production as well, and that combined with, "Hey, the technology is really there. Can we do this?" "Yes, we can." It started out when everything first shut down, where there was a lot of, "Hey, can we do this? What can we do? What can we do?"

Neal Metersky (00:03:24):

And I really am sure that for a lot of the guys out there and customers, it was probably an environment where it, number one, showed what they already had and the capabilities they had, but they didn't always use them that way. And I've found in the past that they don't know what… The head of CNN New York once said to me, "You guys have it in place before we know… You have what we need before we know we need it, and then when we need to go use it, it's there." And I think that this was really an extreme example of that new world.

Jim Jachetta (00:04:11):

Right, right, right. Yeah it just popped into my head. I’ve seen Mike Tyson a couple times on TV recently. And I remember one of his sayings was, “You can have the best laid out plans, but when you get that first punch in the face, all plans go out the window.” So the punch in the face to all of us was COVID and the lockdown and our normal way of doing things all went out the window.

Neal Metersky (00:04:41):

And I think one of the things that's really important now, as we're coming out of this, is a shift of, "Okay, where do we go from here? We've found a lot that works. We've had some things that don't work, and we know that new things are coming on the horizon that are going to enable even more. So where do we go from here?"

Jim Jachetta (00:05:04):

Right. So, this slide kind of says it all. Samuel Fleischhacker is the encoder product manager at Haivision. And most of these slides, if not all of them, were created by him. This kind of says the whole why: what are Haivision and VidOvation promoting? When COVID struck and we were all in lockdown, we were very well prepared, because we had been promoting the idea of at-home production and REMI for five or six years. And we had been doing some fishing tournaments, we had been doing Live PD, Live Rescue. A lot of multi-camera productions that needed frame-accurate genlock; you can't have multiple cameras all out of sync with each other.

Neal Metersky (00:05:57):

[crosstalk 00:05:57] Not just promoting, we were actually doing this. We were doing it, not just promoting and talking about it-

Jim Jachetta (00:06:01):

We were already doing it. Yeah, yeah. So this may seem redundant, but the benefits are fewer people. So we don't need as many people on site. Some customers like to shade on site, so they might have a video engineer on site to shade the cameras and then home-run everything. Some of our customers, like the PGA, are shading back at Master Control. So everyone has a slightly different workflow. And Neal, you can say we've learned a lot from our customers too, like we learned the importance of camera shading in an at-home or REMI production: "Hey, VidOvation. Hey, Haivision. You need to get this to work. We need to shade a camera through the public Internet and cellular." And we'll talk about that today. People are seeing that it saves time. And then… I don't know. Usually, all of our customers have some sort of a limit on their budget, right Neal?

Jim Jachetta (00:06:58):

So it's all about saving money. If we can save money, it's a no-brainer. So I apologize to any of our friends at Game Creek or NEP, people that own trucks or satellite trucks, but this is just to make a marketing point; I don't think the truck is going to go away. Maybe NEP has a bigger fleet of smaller vehicles, smaller trucks. Maybe the trucks are in a central location to support the at-home production… Because there's limited studio space, so the truck stays in Kansas, etc. But you get the idea that with bonded cellular at-home production, using cellular or the public Internet, we can minimize some of the personnel load and the equipment cost of trucks. And we're really getting close to the reliability of satellite and fiber. So here are some of the customers we've been working with: NBC Sports, Golf Channel, the PGA. We've also been working with a couple of others, Bassmaster and Major League Fishing. We're currently doing three fishing tournaments, and believe it or not, fishing has very unique challenges.

Jim Jachetta (00:08:27):

The cellular providers don't put towers around lakes. They put towers around where people live. So a lot of these lakes are in rural areas. And Haivision technology has stepped up to the challenge and has worked really well. So one of the first events we did during the lockdown, the first live sporting event in the US post-lockdown, back in May of 2020, seems like such a long time ago, was with the PGA. They had a restriction where they could only have 50 people on the course; the local health authorities came up with that. So that included camera operators, personnel, officials, as well as the players. And the event was done at the Seminole Golf Club.

Jim Jachetta (00:09:18):

Here's another silver lining: The Switch, Level 3, multiple telecom providers probably have a connection at Pebble Beach, right? Neal, you'd know more about this than me. But the Seminole Golf Club is a private club; there's probably not a fiber connection in there. So the PGA was able to do an event behind the scenes at a very nice, prestigious, private location, or alternate locations where they wouldn't traditionally be able to shoot, and bonded cellular helped facilitate that.

Jim Jachetta (00:09:56):

Then one of the on-air talent was in his home in Michigan, so they used bonded cellular for him to add color to the event. And granted, if we were doing the show over fiber, the latency would be practically zero. If you're going through the public internet or bonded cellular, there might be a second of latency. But most of the on-air talent, even some of our newer customers, we have on-air talent at QVC, get used to the one-second latency. After they've done it a few times they get used to it, pausing when they speak so as not to step on each other's toes. So, here's the setup out on the course. They had two cameras in the tee box, two cameras on the fairway, two on the green, and they had a couple of POV shots, beauty shots. They used the flagship Haivision PRO380 with eight cellular modems for the primary video shots.

Jim Jachetta (00:11:02):

But for the beauty shots, the cellular reception was good enough that they used the Haivision PRO320… I'm sorry, the AIR320, with only two cellular modems, and they could get beautiful beauty shots. For this event, they had a shot of the clubhouse and a shot of the beach, the dunes by the beach. So that was part of the show. And then Neal and I learned the importance of having analog inputs. Haivision has always had analog audio inputs on all of their field encoders. And usually, for every video, there might be four or eight audios to go with it. And any of you audio engineers out there, this is old news to you, that there's a lot of microphones, a lot of audio, and the PGA used the two-modem device, the AIR320, for a lot of the background microphones, microphones on the talent, microphones on the players, microphones on the commentators that were walking the course with the players, parabolic microphones catching the action.

Jim Jachetta (00:12:14):

And you can imagine the PGA will have about a dozen cameras live simultaneously on the course, dozens upon dozens of microphones all open simultaneously. And with Haivision SST technology, Safe Streams Transport, they're able to keep everything in perfect lip sync, keep all the cameras in genlock. If any of these microphones were a little bit out of time, it would drive the audio engineer crazy. And it would make for a horrible show. You'd hear the swish of the golf swing… You'd hear the crack of the ball strike multiple times. It wouldn't make for a very good production. And one thing the PGA noticed is, the traditional way they would do an event like this, they would have microwave camera links connected to a truck in the parking lot.

Jim Jachetta (00:13:16):

And most of the microwave stuff that's out there uses H.264. All this Haivision tech is using HEVC. So the picture in most cases looked better than what they were used to. They're like, "Why does the picture look so much better? We've saved a ton of money, we saved on personnel, we saved on travel, and the pictures look better." So it's been a real win-win for the PGA and others. Here's a little background on the product. So they used eight of the PRO380s; there were eight cameras on that event. Nine AIRs; there were some cameras connected to AIRs, and microphones, as I mentioned.

Jim Jachetta (00:14:01):

Here's a little overview of the product line. So Neal and I are planning more webinars; we'll post the schedule. At the close I have a slide that will show some of the topics we want to discuss. If you have any questions or topic suggestions, please text me at 949-xxx-xxxx. But you can see here at the top the flagship, the PRO380. It's got eight cellular modems and it does 3G and 4G LTE. Then the PRO360 5G, which we'll be launching soon, does 3G, 4G and 5G. It has six modems; we couldn't fit eight modems into the packaging. I guess the 5G modems are a little bigger. And then we have the PRO340. We don't really sell the 340 in the US; now that we have three cellular carriers instead of four, maybe the 340 might make sense.

Jim Jachetta (00:15:10):

And then we have the smaller unit, the belt-mount, more compact AIR series. And that's available in 3G, 4G and 5G. The 200 series is H.264 and the 300 series is HEVC. So if you have a little bit more of a budget-conscious show, you could go with the AIR200 or the 220 to save a little bit on the cost. But if you want that really great HEVC Haivision quality, go with the 300 or 320 AIR units. So Safe Streams Transport, I touched on that. So people ask… I'm sure you get the same question, Neal: "What is the one thing that makes Haivision different?" And I think the heart of it is this SST, right? Isn't that the case, Neal?

Neal Metersky (00:16:08):

I would say, yes. The SST is what gives us the level of performance, which is what really, I think, sets us apart. Traditionally, in this market with bonded cellular it's been, "Yeah, everybody does the same thing. They're all pretty close." And that's true, to a certain level. But combining the highest-efficiency, best-looking encoding, and tying that with the management of the links and having it all interactive, basically, at the end SST gives you the best-looking video over the least bandwidth, even when there's not much available. As you know, a number of our customers that use it in challenging environments have said… Actually, I know two separate people told me this: "I've been able to transmit live when I was not able to text."

Jim Jachetta (00:17:20):

Right.

Neal Metersky (00:17:21):

And the cameraman wasn't able to text either. And we were on different carriers.

Jim Jachetta (00:17:25):

Right. Right.

Neal Metersky (00:17:26):

And that’s really where it all comes together. And it’s the magic juice. And we all have our magic juice. And it’s just extremely efficient magic juice. Right.

Jim Jachetta (00:17:35):

Well, I think some of the folks joining us today have been listening to some of my panel discussions on Sports Video Group. And on the SVG panel I was on last month, Del Parks asked an interesting question. He asked Ken and the panel, "Do you see long-standing relationships with vendors being disrupted?" And I said, "Absolutely, yes." But one of our challenges is getting people to try Haivision. It's human nature: you like a certain brand of car or a certain brand of coffee, and getting someone to try something new is hard. We're creatures of habit, and maybe you like the rep, the sales rep, from vendor A, and getting them to switch is hard. Neal and I are very likable, the VidOvation team is very likable. But we really encourage you to try Haivision, and the customers that do really see the difference, and they're able to get shots that they can't get with other solutions that are out there.

Jim Jachetta (00:18:48):

So we really encourage you to reach out and give it a shot. And we promise you won't be disappointed. So here are some closer-up shots of the unit. Traditionally, bonded cellular has usually been in a backpack; people even refer to it as the backpack. "Where's the cellular backpack?" Haivision was the first to put a cellular field encoder on a camera. They didn't invent the idea. I think they took a page from the microwave market. The microwave camera transmitters, they've been doing that for 20, 25 years, right, Neal? They put the microwave transmitter between the battery and the camera. [crosstalk 00:19:35] Haivision was the first to do that with cellular.

Neal Metersky (00:19:37):

Yeah. And I think it's different… A lot of the deployments have been news-based deployments, which is a different world. They're not used to having the microwave on the camera, but looking at it from a production environment, it makes total sense. It's what people have been doing, they've been using microwave; they don't want to carry a backpack, but they will. And that's the thing, there are multiple options. But we give the additional option, the really clean one, of having it right on the back.

Jim Jachetta (00:20:06):

Yeah. Like the PGA's using larger cameras. They love the idea. Everything is wired in, they've got their wireless mics plugged in… Because of the analog inputs on the Haivision, they've got microphones wired into the rig. So the camera, the Haivision and the battery come out of a Portabrace bag. Boom! They're transmitting within 60 seconds; they don't need to connect anything up. But then on Live PD or other productions, they're not using a large camera, they're using a much smaller camera. On Live PD, they didn't want a backpack on their back, because they're sitting in a vehicle and jumping in and out of police cars. They actually wear the unit on their belly; they made a pouch for the Haivision on the front of their bulletproof vest.

Jim Jachetta (00:20:55):

So they've got the battery and the unit shoved on their belly, so it's in front of them and they don't get caught up on the door jamb of the car. So like you said, like Neal said, every application is different. And then over on the right we have the AIR, which is even smaller. You can see some… There's a photo of a customer who likes putting the AIR on an accessory shoe on top of the camera. You could even put it on a DSLR. Or there's a little sling so you can sling it over your shoulder, or a belt clip if you want to wear it on your belt.

Neal Metersky (00:21:32):

And I think that a big part of the story here is that we can facilitate pretty much every type of form factor. From small and light, HDMI, SDI, backpack, this and that, but it's all with the same consistent specifications, same functionality, same performance. It's just how it's packaged.

Jim Jachetta (00:21:59):

Right, right, right. Yeah, on this slide, you can see the field crew reporter in Times Square. She's got a wireless earbud in her ear, listening to the program feed and directions from the studio. Her microphone is wireless and tied into the camera; the camera operator is on comms, hearing instructions, hearing the program audio. So it's a cohesive system. In a lot of systems, the intercom is lacking. So it's a really nice setup. I touched on this earlier too: we learned the importance of analog audio inputs. So I mentioned microphones on the golf course.

Jim Jachetta (00:22:46):

I didn't know much about Toptracer, that graphic that follows the golf ball as it goes through the air. I had no idea, till we started working closely with the PGA, that the Toptracer telemetry goes through an audio channel. I had no idea. So the PGA's like, "We need a bunch of audio adapter cables." "Okay, for audio microphones?" "No, for the Toptracer as well." So they're able to put that telemetry through the bonded cellular, which was a big win-win. And you can see that the system has mini XLRs on it to save space, and we have these optional breakouts to full size.

Neal Metersky (00:23:25):

Converse to that, though, one of the interesting things that's really coming up, on the slide it talks about the hotspot, and one of the things that I really see evolving and being rolled out is the use of IP audio applications. Now that audio guy, yes, he may have several analog feeds, but then he also has an app for comms or an app that's streaming audio for another purpose.

Jim Jachetta (00:23:51):

Right.

Neal Metersky (00:23:53):

And that's all facilitated. So, I find that interesting: yes, the importance and the power of the analog audio channels, but overall, being able to facilitate pretty much what we see coming, and trying to stay on top of some of the surprises of, "Oh, we didn't realize that it was going to be Dante and not AES67." That's what I'm saying.

Jim Jachetta (00:24:21):

Right, right. Right. Right.

Neal Metersky (00:24:23):

New World, right?

Jim Jachetta (00:24:24):

Well, you bring up a good point. So I showed you the slide of the reporter and the camera operator, but in a PGA production, or Live PD, or reality, or these fishing tournaments, there could be a line producer nearby, but they might not be next to the camera. So how do they tie into comms? Well, you could hook them up with their own Haivision encoder, and we'd love to sell more encoders, but giving them an encoder just for comms may not be practical. So some of our customers, some of this support staff, a field technician, a line producer, a field producer, will integrate with Unity Intercom. But the problem with Unity is you use your cell phone for it, and there's a single cellular connection in there. So the cell phone, if it's near one of the Haivision units, can jump onto a bonded cellular WiFi connection or hotspot connection. So now you have a redundant connection on your intercom.

Jim Jachetta (00:25:29):

So if you're in a T-Mobile dead spot, and the line producer has a T-Mobile phone… Not to pick on T-Mobile, but just in this example, the WiFi connection to the Haivision unit nearby will allow him or her to stay connected with comms, and this Data Bridge, data pipe and hotspot capability has been very important for that kind of stuff. Some customers were using Zoom as the back channel. But now we have that return video, so we don't need Zoom, but there could be other needs for Zoom or an internet connection on site, and this Data Bridge capability facilitates that. So another production we did: we've done several events with Turner Sports, we've done the Ryder Cup, we've done some soccer tournaments. The workflow was a little different. They chose to shade cameras on site, so they had 16 cameras on site, 16 ISO feeds from a smaller vehicle. They had a video engineer shading those 16 cameras on site, then they were sending the ISO feeds back to Atlanta and then switching the show live in Atlanta.

Jim Jachetta (00:26:49):

So every workflow is slightly different. And Turner went with the rack-mount version of the Haivision technology, so they went with the HE4000; it's a four-channel device. Haivision has a PRO4 and a RACK4 series coming out, which will do one UHD 4K or four HDs, either portable, backpack or camera-mounted, or rack-mountable; the RACK4 is the rack-mounted version, right Neal? So this was an interesting event and, again, the PGA was really amazed at the video quality. Our customers are learning not all HEVC codecs are the same; the Haivision chipset, the Haivision codec and the SST combined give you reliability but amazing-looking pictures, and it's very efficient… It doesn't require a lot of… "We had a 100-meg stream." Oh sure, it's easy to do a great picture at 100 meg, but pictures at 2, 3, 5 megabits per second look amazing with the Haivision HEVC codec. Then we touched on some of these live reality shows, so we're hoping Live PD comes back.

Jim Jachetta (00:28:18):

The police departments are not the most appreciated people right now, at the moment. Unfortunately, a few bad apples in the community are tainting the good officers that are out there. So we're hoping the Live PD show comes back, but we're doing fishing tournaments. They brought back Live Rescue, rescue with firefighters. But it's a very challenging environment, and the SST helped facilitate it. I like to say Haivision made shows like Live PD possible, made these fishing tournaments, live, possible. You can't put a satellite truck on the back of a bass boat. You can't have a satellite truck following a police car while maintaining lock on a satellite. I mean, it wouldn't be practical. So customers come to Neal and I, and the VidOvation and Haivision teams, every day with new crazy ideas. "Hey, I got this idea for a crazy show in the middle of nowhere." "Well, as long as there's cellular nearby, we can make the show happen."

Jim Jachetta (00:29:28):

So again, you can see the importance of having frame-accurate genlock, keeping that lip sync. With Live PD, when there's a major event, several police cars can converge; there could be anywhere from four to eight cameras in close proximity, dozens of microphones in close proximity. So you can see at lower left, this camera operator preferred wearing the backpack kind of backwards. He's got it over his shoulder, wearing it on his belly so he can get in and out of the police vehicle. You see the guy at top right, he's got the bulletproof vest. I think he's got the Haivision in the pocket on the front of his bulletproof vest. And you can see they're using much smaller cameras, so there was no way to mount the Haivision on the camera. This is the full-size PRO unit in there with the eight cellular modems.

Jim Jachetta (00:30:25):

And the show wouldn't have been possible without Haivision technology. And the producers of Live PD, Big Fish Entertainment, their management was brilliant. They said, "What is a live reality show very much like?" They said it's like sports. So they hired, I think it's Johnny Gonzales, an ex-NBC director; their TD is a sports guy; all their producers are ex-sports people; Larry Barbatsoulis, I think, was with CBS Sports for years. Sorry, Larry, if it wasn't CBS. But they brought all these sports professionals in, and basically the whole show is pushed from an EVS replay system. So they dump all the live video into an EVS. You can see at your top left, line producers are watching the content as it comes in live and adding metadata: "G" if there's a gun, "D" if there's drugs in the scene. Yeah?

Jim Jachetta (00:31:30):

So everything is coded. Then another producer, an executive producer, takes another pass at it, looking at the metadata and finding segments and clips. And then they play it back live in front of the director. And the director goes… So they'll be like, "Okay, Johnny, we got a package for you out of Tulsa, three, two, one, rolling." And now the four or five cameras are being played back in front of the director and the TD, and they go, "Take one. Take two." And they're directing it like it's live, but it's already a few minutes old. So very ingenious, the workflow that they came up with to do a show like this. So we learn a lot from our customers. We provide the tech to facilitate the transmission. But they're the real geniuses. So, this slide just kind of shows you… I think Neal and I touched on this a little bit, the PRO3 can do H.264 and HEVC.

Jim Jachetta (00:32:37):

The StreamHub has the ability to transcode, so why would we need H.264? Well, with some of the competing systems, you have to transmit H.264 because their receiver can't transcode. And why is that a problem? If you're going to Facebook or YouTube, most of the social media platforms still don't support HEVC yet, so you have to give them H.264, or they'll take H.264 but they're particular about the bit rate. So Facebook… I don't remember the exact parameters, but Facebook might want constant bit rate H.264 at, ideally, four megabits per second. So we would transmit HEVC to Master Control, transcode to the format that Facebook wants, then have a second transcode engine with the parameters for YouTube, and so on and so forth. I think we can have up to eight or 16 transcode instances. Right, Neal?

Neal Metersky (00:33:35):

It depends on the power of the StreamHub transceiver. But yes, to your point, it's not a receiver, it's a transceiver, and we can accept pretty much, not just our own IP transmissions with different codecs, we can accept others as well, SRT, and we can distribute them via transcode, because there are times where you have to… And especially, some of the online distribution points are very persnickety about what they want to see.

Jim Jachetta (00:34:12):

Right.

Neal Metersky (00:34:13):

And variable bit rate is usually what you use for acquisition and more and more CBR is what we need to deliver.

Jim Jachetta (00:34:25):

Yeah. I mean, I would never pretend to be a production professional, I dabble. So, I've done some video engineering work at Saddleback Church here before COVID. The in-house production was shut down at Saddleback, so they went all online, went virtual, and we helped them facilitate some of the virtual stuff at Saddleback. Then I started volunteering at a drive-in church, so people stayed in their cars and we broadcast in a parking lot in Newport Beach. And we transmit to Facebook, and Facebook can be finicky: if you send at five megabits instead of the recommended four, it'll hiccup, it'll fight back, it won't like it. It'll drop out, or drop the resolution. Facebook might want 720p, but YouTube wants 1080i, and you can set all these parameters in the encoder. So it's a piece of cake. What do they say, "I'm not only promoting Haivision, I'm a Haivision user. I use the tech."
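
As a rough sketch of the per-platform transcode fan-out Jim and Neal describe here: the snippet below is not Haivision StreamHub code; it just drives FFmpeg (built with libsrt) from Python, and every URL, stream key and bitrate is a made-up placeholder. The idea is one HEVC contribution feed decoded once and re-encoded to H.264 with a different "recipe" per destination.

```python
"""Illustrative only: fan one HEVC contribution feed out to per-platform
H.264 streams. Not StreamHub code -- it shells out to FFmpeg (built with
libsrt), and every URL, key and bitrate below is a placeholder."""

import subprocess

# Hypothetical contribution feed arriving from the field encoder.
CONTRIBUTION_SRT = "srt://0.0.0.0:9000?mode=listener"

# Each platform gets its own H.264 recipe (illustrative numbers only).
PROFILES = [
    {   # e.g. a Facebook-style target: 720p, ~4 Mb/s held roughly constant
        "name": "facebook", "scale": "1280x720", "fps": "30",
        "bitrate": "4M", "bufsize": "8M",
        "url": "rtmp://live-api.example.com/app/STREAM_KEY_1",
    },
    {   # e.g. a YouTube-style target: 1080p at a higher rate
        "name": "youtube", "scale": "1920x1080", "fps": "30",
        "bitrate": "6M", "bufsize": "12M",
        "url": "rtmp://a.rtmp.example.com/live2/STREAM_KEY_2",
    },
]

def build_command():
    cmd = ["ffmpeg", "-i", CONTRIBUTION_SRT]
    for p in PROFILES:
        cmd += [
            "-c:v", "libx264", "-preset", "veryfast",
            "-s", p["scale"], "-r", p["fps"],
            # Pin min/max to the target rate so the output behaves close to
            # constant bit rate, which is what the social platforms prefer.
            "-b:v", p["bitrate"], "-minrate", p["bitrate"],
            "-maxrate", p["bitrate"], "-bufsize", p["bufsize"],
            "-c:a", "aac", "-b:a", "128k",
            "-f", "flv", p["url"],
        ]
    return cmd

if __name__ == "__main__":
    # One FFmpeg process decodes the HEVC feed once and encodes every output.
    subprocess.run(build_command(), check=True)
```

In practice you would swap in whatever parameters each platform actually publishes; the point is simply that one incoming feed fans out to several differently constrained outputs.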

Neal Metersky (00:35:42):

And one of the things, I was talking a lot about YouTube, but more and more I think it's going to be about broadcast-level distribution and contribution to multiple MSOs. Different people, again, want… DirecTV wants it one way, and Comcast wants it another way. And it facilitates all of that. So you can easily create and transcode and distribute multiple formats from multiple formats.

Jim Jachetta (00:36:13):

Yeah. And to that point, we have a slide coming up on it, but as you mentioned it, Neal, the StreamHub. Haivision came up with the word hub because it's not just a decoder, it's not just a receiver. It's a receiver, it's a decoder, it's a transcoder, it's an encoder; it allows return video, it manages the comms, the communications. It takes generic IP inputs, there are SRT ins and outs. So it's really a very robust platform, a very stable platform. And one of the webinars we're going to do in the coming weeks will be on the latest firmware, version 3.5 of the StreamHub. There are a lot of new features. So, Neal, I mean, do you want to speak to this slide? It kind of shows the functionalities of the PRO and AIR.

Neal Metersky (00:37:11):

Yeah. I mean, the basic overall: we can go live, record and forward files, and we can now simultaneously go live and do remote control via the Data Bridge, Hot Folders. Mission-centric, there are a lot of different things that we can do that fit a lot of different workflows.

Jim Jachetta (00:37:34):

Well, and then also, don't forget the Video Return, and the Auto Live has been very helpful. So Neal and I and both of our teams have found that with at-home production, REMI production, or when you've got an analyst working from home because of COVID or just travel restrictions or just costs, they don't have the resources to send a tech to the on-air talent's home or the weather person's home for the nightly news, to set up a camera and set up a bonded cellular unit. A lot of non-technical people have had to set this stuff up themselves. They get a shipment from Master Control with a camera, tripod and the bonded cellular unit, and they have to set it all up themselves. So this Auto Live feature, I love.

Neal Metersky (00:38:30):

The-

Jim Jachetta (00:38:31):

[crosstalk 00:38:31] So there's a little hidden switch on the PRO3 where you can turn the Auto Live on, and as soon as the unit is turned on, or plugged in, or you put a battery on it, it wakes up, connects to the towers, and goes live. So the talent, as long as they can plug the darn thing in or put the battery on it, they don't have to touch any of the buttons. It just goes live. So that has been great. And then the Data Bridge gets established, the live video gets established; you just have to put the camera on the tripod and frame the shot, or it could be a PTZ camera. We've had some customers build a whole kit in a Pelican case. You take the lid off the Pelican case, there's a PTZ camera there. The Haivision bonded cellular is bolted in there. They plug that in, the camera comes alive, the bonded cellular comes alive, and a video engineer in Master Control frames the shot with the PTZ camera and shades the shot. So customers are doing all kinds of crazy stuff.

Jim Jachetta (00:39:37):

So there are different types of architecture, and we can't cover them all. But some customers are doing all the production back at home, so it's full REMI, full at-home. Some are switching on site. So they might have a small production truck, they might have a TriCaster or Ross Carbonite in their vehicle or their trailer, switch the show in the truck, and send the program output to the internet or to Master Control, or a hybrid of both. Like Turner Sports: they shaded the cameras locally but switched the show back at Master Control. So the configurations are really up to you. Then here on the right, the remote controlling of the cameras. We see more and more usage of PTZ cameras, so you don't even have to have camera operators on site. We're controlling the cameras remotely.

Jim Jachetta (00:40:42):

Maybe you have camera operators, but you don't want the camera operators worrying about shading, so that's being done remotely. The configurations are limitless. And again, it comes down to what makes this all possible, and that's the SST, the Safe Streams Transport. We did a webinar back in 2020, I believe; Samuel Fleischhacker and I did one on SST. But so what is it? It's bonding, it's aggregating all the connections. So on the flagship PRO380, you have eight cellular modems, but you also have two LAN connections and WiFi. You actually have up to 11 connections. So it aggregates all those connections together, all those small pipes, into one bigger pipe. It's doing ARQ packet retransmission, it's doing forward error correction, data balancing with the bonding of the connections together, and then you can even prioritize. So if you have two LAN connections that are lower cost or maybe free, two internet connections, you can set those to high priority and then set the cellular to low priority.
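
The priority behavior Jim describes, LAN and WiFi marked high with cellular kept warm on low, can be pictured with a small sketch. To be clear, this is not Haivision's SST (there is no ARQ, FEC or clock recovery here); it is just a toy allocator showing the idea of preferring the high-priority links and spilling the remainder onto the cellular links so they stay connected and ready.

```python
"""Toy illustration of the high/low link-priority idea described above.
Not SST -- just a sketch of preferring cheap LAN/Wi-Fi links and letting
cellular pick up whatever they cannot carry."""

from dataclasses import dataclass

@dataclass
class Link:
    name: str
    priority: str          # "high" (LAN/Wi-Fi) or "low" (cellular)
    capacity_kbps: int     # what the link can currently carry
    up: bool = True

def allocate(links, demand_kbps):
    """Spread the stream's bitrate across links, high priority first."""
    plan, remaining = {}, demand_kbps
    for tier in ("high", "low"):
        for link in links:
            if link.priority != tier or not link.up or remaining <= 0:
                continue
            share = min(link.capacity_kbps, remaining)
            plan[link.name] = share
            remaining -= share
    return plan, remaining   # remaining > 0 means the encoder must lower its rate

if __name__ == "__main__":
    links = [
        Link("LAN-1", "high", 8000),
        Link("Wi-Fi", "high", 4000),
        Link("Cell-A", "low", 3000),
        Link("Cell-B", "low", 3000),
    ]
    print(allocate(links, demand_kbps=10000))
    # Now "unplug the LAN", as in Jim's test: the cellular links pick up the load.
    links[0].up = False
    print(allocate(links, demand_kbps=10000))
```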

Jim Jachetta (00:41:59):

We don't recommend you turn it off, because if the LAN connection takes a hit, you won't have time to turn it back on. So you set the cellular to low so it's connected, it's on standby. It's using a little bit of data. Then if the LAN… We'll do this test with customers; they'll say, "Unplug the LAN or unplug both LANs." And the unit won't even drop any packets; you'll see the cellular slowly pick up the bandwidth that the LAN connection dropped. It really works amazingly well. This is just another slide showing the topology. This is an interesting schematic; it kind of shows you the signal flow or the interconnections. So the Data Bridge function opens up an interconnection between the field and Master Control. You can even have assets in the field on the same subnet as assets in the studio. So for a PTZ camera, as far as your video controller, your CCU, or your PTZ controller in the studio is concerned, the camera looks like it's local.

Jim Jachetta (00:43:14):

But one of the things we've learned, Neal and I and the team have learned, is that some camera controllers, some shading devices, some PTZ systems don't like any latency. They're meant to be on the same network with two, three, five, 10 milliseconds of latency. In good conditions, or typical conditions, the round-trip latency on cellular is usually 40 to 60 milliseconds. So some controllers time out; that's too long for them. So we've partnered with a provider, Cyanview, and they have different devices. They make a nice RCP; they make these RIO and CIO units. They're about the size of a pack of cigarettes, a little tiny device that goes on the camera and smooths out the latency. What it does is, the little device actually emulates the CCU, and it's close to the camera. So if there is latency, the camera doesn't time out. So it's a very nice companion.
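
The "emulate the controller next to the camera" trick can be sketched generically. This is not Cyanview's or Haivision's code, and the protocol, ports and message format below are invented purely for illustration; the point is just that time-critical keep-alives get answered locally, so the camera never sees the 40-60 ms (or worse) round trip, while the real control traffic is relayed across the bonded link.

```python
"""Generic sketch of a local keep-alive proxy sitting next to the camera.
Addresses, ports and the PING/PONG message format are all hypothetical."""

import select
import socket

CAMERA_ADDR = ("192.168.1.50", 5500)    # hypothetical camera control port
REMOTE_CCU = ("203.0.113.10", 6600)     # hypothetical studio-side controller

def run_proxy():
    # Socket the camera talks to, as if this box were its CCU.
    cam_side = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    cam_side.bind(("0.0.0.0", 6600))
    # Socket the real studio controller is pointed at across the bonded link.
    wan_side = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    wan_side.bind(("0.0.0.0", 6601))

    while True:
        ready, _, _ = select.select([cam_side, wan_side], [], [])
        for sock in ready:
            data, addr = sock.recvfrom(2048)
            if sock is cam_side and data.startswith(b"PING"):
                # Answer heartbeats on the spot so the camera never sees the
                # WAN round trip and never drops the control session.
                cam_side.sendto(b"PONG", addr)
            elif sock is cam_side:
                # Status replies from the camera go back to the real CCU.
                wan_side.sendto(data, REMOTE_CCU)
            else:
                # Paint/iris/tally commands from the studio go to the camera;
                # a little extra delay here is tolerable.
                cam_side.sendto(data, CAMERA_ADDR)

if __name__ == "__main__":
    run_proxy()
```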

Jim Jachetta (00:44:28):

The PGA has been using this tech. And it really helps do shading and camera control, PTZ control, over an unmanaged connection like the public internet and cellular. You can even do tally through the device. So we encourage you to reach out about that; between VidOvation and the Cyanview and Haivision teams, we can integrate all this technology together for you. This just shows you… There's this little cloud icon that comes on with little arrows showing that you have the Data Bridge with a hotspot connection active. Some customers need a bigger pipe for the data connection. So primarily, when you're running, say, a PRO380, your primary function is live video. So if you're live with video, it won't take the whole pipe away; most of the pipe will go to live video. And then Haivision guarantees at least 500K for camera control or comms. But if you're a customer that needs more bandwidth, quite often they'll have one 380 just for the Data Bridge and hotspot connection.

Jim Jachetta (00:45:53):

And then other units for the live cameras if you need a more continuous stream, but you can use the two together and have that live video with the Data Bridge. So here's another picture. Cyanview not only helps shade cameras, but it can help with PTZ, and not all PTZ cameras are sensitive to latency. So with this BirdDog PTZ camera, we don't need to have the Cyanview RIO there to help facilitate. The RIO will convert from serial to IP but also helps to smooth out the latency issues. So let us know what kind of cameras you're using. Tell us about your workflow. Text us at 949-755-8881 with any of your questions. We'd love to hear from you, learn more about your workflow, learn about some of your challenges, and hopefully we can help you solve some of those challenges. Oh, what did I do? I killed the presentation. Uh-oh. Wait a minute, what did I do? What do I do? What do I do now?

Neal Metersky (00:47:14):

Oh, I don’t have technical difficulties a lot.

Jim Jachetta (00:47:19):

I blacked out… Okay, wait. All right. So-

Neal Metersky (00:47:22):

Here we are, here we are. You’re back. You’re back.

Jim Jachetta (00:47:25):

Let me see. They're skipping slides. Now we're back. I guess there's a black button. I didn't know. I may have hit the B button, fade to black. So Haivision, one of the things, we have a fairly deep customer base here in the US, but Haivision has been around for… what? 12 years now. 11, 12 years, and they have a very strong customer base. They are the leader in Europe. There are 100, 200 customers globally. Many, many European customers, Asia, Latin America. Neal came on board in… What was it? October. So Neal's a Managing Director here in the US. Neal helped to add to the Haivision support team, to add to the infrastructure here in the US. Neal and the Haivision team have been working very, very closely with VidOvation. We kind of operate as one company, at least here in the US. We work very closely together. So I mentioned, some of our bigger clients don't like us mentioning their names. So we can only mention some of our customers, but we're doing a lot of these fishing tournaments, and they've tried other technology.

Jim Jachetta (00:48:41):

It just doesn't work in the middle of the lake, and the Haivision tech is able to get those shots. We're doing Bassmaster, Major League Fishing, PGA, Ryder Cup, a lot of live reality shows. We're going to do another webinar on this. This is more for news, right Neal? The Mission-centric workflow. My understanding of it is, it'd be great for a freelance photographer: he arrives in Detroit and he's like, "What am I shooting here today? Where am I going?" It'll appear there: okay, the cycling race, or the election coverage. He'll see his mission, the package he's supposed to capture that day. And then when the video feeds come into Master Control or into the news bureau, the metadata is there, so, "What is this video clip for?" "Oh, this is the bicycle race, or this is the election coverage." Right? I mean, you would know more about this than I would, coming from CNN and news, Neal. Maybe you could speak to that a little more.

Neal Metersky (00:49:48):

Yeah, I guess, we kind of jumped a little bit there from wrapping up the REMI production to-

Jim Jachetta (00:49:54):

Yeah.

Neal Metersky (00:49:55):

Other things that we do and what's coming out. But yeah, actually, as you said, in the US, clearly we did not take advantage of and jump on the news market seven, eight, nine years ago as it rolled up. And our focus has been in high-end production, and even low-end production with MoJo apps, and we want to [inaudible 00:50:26], right? But in Europe, we've been number one in news for a long time. And yeah, what this allows… This actually allows news organizations, on a daily basis, to really add a level of efficiency to the workflow. So when they're having the morning meetings and they're assigning resources and assigning stories, once the producer creates the story in their newsroom system, that story is assigned to the asset.

Neal Metersky (00:51:00):

So not just the field photographer who's going, but his encoder unit, his field unit, is assigned to that story. So when he fires up, he can select the mission, the story, and everything he shoots, the metadata is already ingested. Everything he or she shoots is created with that metadata and then forwarded back into the system. And we can even do things like rename that file as it's coming back in, so that it's then [inaudible 00:51:33], that it ends up. So it starts in the newsroom production system and it ends in the newsroom production system, ready for them to air it. An extremely powerful thing. And one of the things that I'm hoping we can get news customers, at least some key news users and customers in the US, to recognize.

Jim Jachetta (00:51:55):

Yeah. And we'd like to get feedback too. If a major news bureau is working with a certain brand of automation, we'd like to have a conversation about it; maybe we can add your preferred vendor of choice to the roadmap and cover that. I think today Dalet, which I think is a French company, is very popular in Europe. So it makes sense that Haivision integrates with that, and I'm sure a lot of the European news bureaus are using it. We're working with EVS, I believe, and others are in the pipeline, right?

Neal Metersky (00:52:33):

Well, the interfaces from and to most of the common ones, like Bitcentral, even some Ross products, are pretty straightforward and relatively simple. But yes, with Dalet, we've actually integrated with their digital asset management directly. But facilitating the workflows, regardless of the system, is an easy thing to do.

Jim Jachetta (00:53:01):

And I should mention too, Ronan and Samuel could probably speak more to this. Ronan, the CTO, Samuel, the Product Manager. Everything has a very sophisticated API. So some customers have programmers on staff, and they can integrate with any system, any workflow systems that they have. Or you can hire Haivision Professional Services to help with some of that integration. So a lot of that… Customization is available today. So, what I like about this slide the most is it shows the little priority graph there. So you see here on the Ethernet and WiFi, we have those set to high priority, and some of the cellular connections are set to low. There are only two settings, high or low. But I think that's a great visual, so you can set your cellular connections to a lower priority. And then, everything in the Haivision ecosystem has the same buttons.

Jim Jachetta (00:54:12):

So whether you're in the StreamHub remote-controlling an asset in the field, the play button, the record button, they're all the same. The soft buttons on the unit when you're in the field are exactly the same, right? There's consistency in the user interface across all assets, across all platforms. This slide is, I guess, a little redundant. It talks about the SST again; we kind of covered that. But maybe the takeaway from this slide is, we call the category bonded cellular, I think Tom Butts at TV Tech coined the phrase a while back when I was on a panel with him years ago, but it's really bonded IP. And what do we mean by that? Cellular is one of them, internet could be another. Some customers bond cellular, IP, and satellite all together, bond all of the above.

Jim Jachetta (00:55:08):

So it's not only a bonded cellular capability, it's a bonded IP capability. And many of our customers use it that way. So this slide shows the automatic live feature I mentioned too; I kind of verbally told you this already. You either press the power button, plug the unit in, or add a battery, and then boom, boom, boom, within seconds you're transmitting live, which is great. I really love that feature. And again, with low-tech operators, it's very important. So here's another slide showing, maybe your primary feed is the satellite and then the cellular is the backup. So you've got high priority, low priority; however you want to use it is up to you. Return video is really cool, in that you can have multiple return videos.

Jim Jachetta (00:56:03):

So maybe one crew needs to see the program feed, and another crew wants to see an ISO feed of their own shot, just for confidence: "Is my shot reaching Master Control? I'd like to see my shot." Or there might be two feeds, one being program. And then I should mention, we keep forgetting, Neal, the multi-viewer function: you can have a multi-viewer feed on a second return feed. I guess you can have virtually an unlimited number of return feeds, or probably limited to 16, right? Because you have 16 inputs, right Neal? I think QVC was doing a multi-view return of all the camera feeds, and then a second feed with program. So they had confidence, they could see all the cameras were reaching Master Control, and then they had the program feed to help cue the talent, right?

Neal Metersky (00:57:01):

Yes. Yeah. I mean, there are a lot of different ways to apply it. Any input can be used as a return source to feed it. So you could have multiple return inputs, or just utilize the incoming sources themselves; multi-viewer outputs can be used, and there's a lot of flexibility. People have deployed it everywhere, from obviously high-end productions to things like using return to feed large screens for events. And one of the things that I think both you and I found really interesting was remote directing of high-end film and scripted productions as well. And that was part of the new COVID world, right?

Neal Metersky (00:57:52):

Effectively, we remote the video assist monitor from the site to the director sitting at home. So yeah, a lot of flexibility. And it feeds in as part of the StreamHub ecosystem. To me, where we are and where we're going is more the center of an IP ecosystem. And this is one of the examples of how that can be applied and utilized, using any of the inputs to any of the outputs for return. One of the big differences that has actually enabled us to do things like feed large screens, high-resolution large screens at high-quality events, a concert, is that the return feed is… It's not scaled down. It's a full-res HD-

Jim Jachetta (00:58:40):

Right.

Neal Metersky (00:58:40):

Feed. That’s one of the differentiators is what-

Jim Jachetta (00:58:44):

Full frame rate, full bandwidth.

Neal Metersky (00:58:46):

Right.

Jim Jachetta (00:58:46):

You touched on it too, Neal. So one of the potential problems with REMI or at-home production: you go to a sporting venue and Game Creek or NEP will come in to do the production for NBC, so they set up their cameras on sticks in the venue. Then the in-house production sets up cameras right next to the broadcast cameras for the in-house replay and color; during commercial timeouts they've got to produce a show, keep the crowd engaged. Well, what if there is no in-house crew or in-house production? Take high school football. We've done some projects for high school football with some major broadcasters in the South. Friday night lights; football on Friday nights is more important than the NFL, depending upon where you live, right? So when the truck was on site, the replay feed for the fans in the stadium came from the truck; now the truck is gone. We're producing the show in another location, hundreds of miles away, thousands of miles away, so how do we put replay up on the screen in the venue? The return video is used for that.

Jim Jachetta (01:00:05):

So that's proven very important. Over here on the right, we show two Haivision RACK200s feeding the return feed. Many of our customers who already have an IP feed, or already have the program feed in IP format, can go to an input on the StreamHub as either SRT, RTMP, HLS, whatever format they might have, and use that as their return. And you don't have to have an encoder, a Haivision encoder, to do that. So, Neal mentioned the remote directing. These are productions that are not live, so they'll be shooting with cinema cameras, and they'll have a Haivision PRO on the camera. So for video assist, while they're filming, the live video comes to a StreamHub, and the director can look at each camera, or a multi-view of each camera, from home. And then if the director needs to speak to the talent, because they're not there, they need to give some notes.

Jim Jachetta (01:01:23):

They'll actually have the Haivision MoJoPRO app on a phone and hand the phone to the talent and say, "Hey, John. I love your performance, but that's really not what I'm looking for." So you can have this two-way conversation, all within the Haivision ecosystem. And the director is sitting in his living room in Malibu while he's directing a feature movie or production in New Zealand. So again, it comes back to our customers teaching us. They'll come to Neal and I and the VidOvation team and Haivision team: "Hey, how can I use this tech to do the remote directing?" And it's pretty amazing. So you can see here that… Neal kind of touched on this earlier: maybe I'm not going live with video directly on the PRO or the AIR, but I'm doing something with my phone, I'm streaming with Zoom, I need to connect a computer, maybe I've got Zoom or some other video platform.

Jim Jachetta (01:02:37):

I don't want to go live with the Haivision stream, I just need a solid internet connection, or I need to connect to my Master Control. So it's either a wired Ethernet connection or WiFi or both. You get a connection out in the field, and many customers are doing this. Here, this is an interesting slide. It kind of shows an overview of the ecosystem. On the left are all the types of inputs. It could be field encoders. It could be something coming from the cloud: vMix, Grass Valley AMP. We have SRT listed here. We should also add NDI, right? We're doing NDI ins and outs. We'll talk about that on an upcoming webinar, right? NDI inputs and outputs: many of our customers are integrating a TriCaster or a NewTek 3Play instant replay using the NDI hooks instead of SDI. Or using NDI cameras, etc., to bring them in and out of the ecosystem.

Jim Jachetta (01:03:52):

So the StreamHub can either be physically in your Master Control with SDI outputs; we haven't figured out a way to do SDI inputs in the cloud yet, right? Neal, I'm joking, of course. But for a cloud-based StreamHub, you can have, I think, upwards of 16 or more IP ins and outs on a StreamHub instance. And then if you do hit a limitation, you can cascade virtually an infinite number of StreamHubs together to go from one to hundreds of locations if need be, right?

Neal Metersky (01:04:30):

Absolutely, absolutely.
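
The cascading Jim asks about here boils down to relaying the same encoded stream on to several downstream receivers without touching the video. As a stand-in for the idea, and not anything StreamHub-specific, here is FFmpeg's tee muxer driven from Python, with invented example addresses; the stream is copied, never re-encoded, at the relay hop.

```python
"""Rough sketch of cascading/fan-out: take one incoming SRT feed and relay
it, stream-copied, to several downstream receivers. Not StreamHub code --
just FFmpeg (built with libsrt) and its tee muxer, with made-up addresses."""

import subprocess

INCOMING = "srt://0.0.0.0:9000?mode=listener"

# Hypothetical downstream hubs/decoders that should each get the same feed.
DOWNSTREAM = [
    "srt://hub-east.example.com:9001",
    "srt://hub-west.example.com:9001",
    "srt://backup.example.com:9001",
]

def relay():
    # The tee muxer duplicates one encoded stream to many outputs, so the
    # video is never decoded or re-encoded at this hop.
    tee_targets = "|".join(f"[f=mpegts]{url}" for url in DOWNSTREAM)
    cmd = [
        "ffmpeg", "-i", INCOMING,
        "-c", "copy",
        "-map", "0",
        "-f", "tee", tee_targets,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    relay()
```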

Jim Jachetta (01:04:33):

So we encourage you to let us know your workflow. If you're on AWS or Azure or whatever cloud provider you're working with, as long as you can spin up a Linux-based instance, we can help you spin up a virtual StreamHub instance in the cloud. This slide is a little dated, but we're doing SRT ins and outs, RTSP, RTMP, HLS. NDI has been shipping for a while now. WebRTC, I think… is that out yet, or is that coming? I mean-

Neal Metersky (01:05:19):

We have applications soon to be announced and rolled out that will expand our capability to emulate encoders and create acquisition points using WebRTC. I know that sounds like a marketing-complicated, engineering-technical way to describe it, but it's really neat stuff. And really, I could sum up the last five of these slides by saying, again, we've got an IP-based infrastructure with extremely reliable and powerful acquisition via bonded cellular and SRT, open to be able to acquire, and creating SRT from our encoders is coming soon as well.

Neal Metersky (01:06:13):

So, an interoperable ecosystem where we can acquire IP formats and distribute IP formats, effectively routing, distributing, transcoding, all in a nice, easy-to-deploy, really rugged and robust ecosystem. So multiple inputs, multiple outputs. 2110 is out there a little ways; 2110 for ingest, I don't know if that's ever really going to be a need. But, yes. And continuing forward with what other IP formats? IP audio, IP audio formats, having them available for ingest and output. And internally, whether we use them for comms or program feeds, and then routing, shifting the audio channels, all of that in a cohesive IP infrastructure.

Jim Jachetta (01:07:15):

Absolutely, absolutely. I see I've got a typo; I spelled SMPTE wrong. So S-M-P-T-E, 2110. That support is on the roadmap. I think we can give you more specific dates on things that are on the roadmap in an upcoming webinar Neal and I are going to do. This is a little redundant; it's just showing PTZ cameras. This is interesting. Neal mentioned this a couple of times: we're not just a streaming video company or bonded cellular. You can go live, you can move files, you can do PTZ cameras, you can use other streaming services, intercom; the SST combined with live video combined with the Data Bridge can facilitate all of this. One of the things we get asked, Rick, Neal and I and our sales teams at VidOvation and Haivision: who are some of Haivision's customers? Yes, the majority of these names you see here are in Europe or in Asia or Latin America.

Jim Jachetta (01:08:24):

We're growing in the US, but Neal and I are not going to bore you and read these to you. There are 100, 200 names here: people using the PRO, people using the AIR, people using the MoJoPRO app, a who's who. And I'm sure we've missed quite a few of our customers. If we've missed naming you, we apologize. So Haivision has been very busy. If you haven't looked at Haivision recently, please reach out. There have been a lot of new developments; I would say 80 or 90% of Haivision's budget is going into R&D. The funding they have raised is going primarily, all of it, into product development, hiring engineering staff, hiring support staff. Right, Neal? Haivision has been very, very busy. So you really need to look at what Haivision is doing today. They've made a lot of leaps and bounds in the last couple of years, and there's a lot of really cool, interesting stuff on the roadmap.

Jim Jachetta (01:09:41):

So that's a perfect segue. I think the next webinar Neal and I will do, we haven't set the date yet, but we'll probably do it next month, is the product roadmap. I think customers would really like to know where Haivision is today. What are the new features that are available today? But where are we going? We would encourage your feedback. If you want to stay in touch with me or VidOvation and be reminded of webinars, ask me questions, or receive an ebook on at-home production, text me at 949-755-8881. We'll talk about new developments in the flagship PRO3 series, the AIR, the RACK. We'll talk about the MoJoPRO app for both iOS and Android, and we'll talk about some of the new features and capabilities for receiving, decoding, transcoding and encoding with the StreamHub.

Jim Jachetta (01:10:46):

We have product presentations that Samuel has put together on the PRO3 and the AIR; we can go into those. The MoJoPRO app, the manager, how to set up redundancy. For many of our customers, their main bread and butter is the production, and the cloud is not perfect. Amazon can have an outage, and maybe you want two cloud instances with two different cloud providers. And using… We have an application presentation where we show we can make one-plus-one automatic failover, automatic redundancy, so that if something should go out, it'll switch over automatically without any user intervention. So that's been very important to some of our customers. We can do a deeper dive into video return and then the Mission-centric workflow. I'm really trying to encourage people to sign up for this private text messaging community I'm building, one-on-one with me and people that know me. Even in conversations with customers, it takes about five minutes for me to bring up food. Neal, you're a foodie. Neal actually is a very good cook. I've heard Neal is a very good cook. You've got to cook for us sometime, Neal. And he's always-

Neal Metersky (01:12:16):

I send pictures and it looks good.

Jim Jachetta (01:12:19):

Yes, he sends pictures, or when Neal comes to Orange County we usually let him pick the restaurant, because he's got the more discerning palate. So to kind of encourage people, I want to give something of value. So I'm not going to spam you. Once a month, maybe I'll invite you to a webinar, or I'll give you a recent ebook that's been of interest to customers, something of value depending upon who you are. And then as a little reward or an incentive, I think once a month we're going to do… There's a local Italian deli that I like; I'm going to put together a little gift basket.

Jim Jachetta (01:13:01):

I mean, if you're vegan, I can put things that are… I'll leave out the salami if you're vegan, but I'll put a little Italian gift basket together. Finding good Italian food in Orange County can sometimes be a challenge; I do miss Mott Street and Little Italy. So please sign up at 949-755-8881. And then, of course, the normal means of communication. When you engage with VidOvation, if we're talking about bonded cellular and Haivision, we bring Neal in early on in the application. So we're doing virtual discovery calls every day with customers. We've been using GoToMeeting and GoToWebinar since before Zoom even existed. I don't know, everyone uses Zoom now, Neal. I think at VidOvation, when our contract is up with GoToMeeting, maybe we've got to switch to Zoom or Teams. So is there anything maybe I forgot to mention? I'm sorry, Neal, I did most of the talking today. Is there anything you want to mention or ask our audience?

Neal Metersky (01:14:18):

No. Well, I'd like to sum it up just by saying you've covered a lot of different things in your way, and a lot of different stories of things that we've experienced and customers have experienced. What I would kind of like to sum up, as far as REMI production goes, is that most of us who've been on the production side will sit there at times over the years, and it's been, "Won't it be really cool when…" And so much of this world, the bonded cellular world, was always about that. When I first got into it years ago, one of the things I was reminded of was, even years before that, we had just put the first ever Avid system in at CNN, New York. It was the first one ever that CNN put in, and it was in New York.

Neal Metersky (01:15:18):

And me and one of my colleagues from Atlanta were having a beer kind of afterwards and over dinner, and we started talking about what-ifs, and Kevin goes, "And won't it be so cool when we can just slap something on the back of a camera and go live." And this was kind of around the time when Avid was showing that really big digital camera bag that was way ahead of its time. But now we see the implementation and the deployment of this, and it's everything from using your phone with the MoJoPRO app, up to getting closer and closer to being able to do full-fledged, high-quality remote productions. And 5G is one of the things that will kind of take us to that edge. But whether it's with the MoJo app, or whether you just need four cameras coming back and you don't need to paint them, you don't need to do that, you can do a quick switch remotely, or the next level is a level of remote control.

Neal Metersky (01:16:29):

And then 5G and lower latencies will soon open up the higher levels of remote control, where it's one thing for an operator to remotely set a pretty much static shot, or a bunch of static shots around the country even, as compared to trying to run video for a football game on what we used to call Holy Roller days, where the clouds are rolling in and out and you need instantaneous response. And that's the more challenging end. We're able to do more and more and more, and overcome the more challenging environments, with the technology that's available and soon coming. We're certainly on the upside of the hockey stick. And I think it's exciting times.

Jim Jachetta (01:17:23):

Yeah, no. Absolutely. Absolutely. Absolutely. Yeah. And then we'd love to hear about some of the successes, some of the challenges customers have faced in inventing new applications for the technology; we really want to have a dialogue and figure out a solution to some of the challenges that still exist. Integrating… For example, integrating the Cyanview with the Haivision; maybe there's some other integration VidOvation and Haivision can help with. So reach out to Neal or myself, reach out to the VidOvation team, reach out to the Haivision team. We'd love to hear from you. We'd love to engage with you. So it seems like things are getting back to normal the last couple of days. June 15th, they said we can remove our masks here in Orange County.

Jim Jachetta (01:18:26):

So I'm seeing things are slowly getting back to normal. I hope everyone's staying safe and healthy out there. Neal and I have been… Our teams have been traveling. So if you feel it's appropriate, we'd love to come and do an in-person demo. We can do things virtually. We can do a combination of both. So certainly reach out to us. And thank you everyone for joining today. Thank you so much, Neal. And in the coming weeks, we're going to try to do one of these a month. Right, Neal? So we'll put the schedule out. We've got like five or six topics planned over the next couple of months. And if you're not sick of seeing Neal and I, we'd love to see you. Hope you enjoyed it. Thank you so much, Neal.

Neal Metersky (01:19:20):

Thanks, Jim. Thanks, Jim.

Jim Jachetta (01:19:22):

Take care. Take care, everybody. Take care, Neal. Bye for now.

 
