
How Private 5G Technology Simplifies Live Remote Productions & Broadcast [STE Presentation & Audio Recording]

Here’s what’s in it for you:

  • Learn how broadcasters and production companies adapted to the changing workflows during the pandemic.
  • Learn how to save costs using 5G cellular and Public Internet connectivity.
  • Produce a live show with multiple untethered handheld cameras using the latest At-home Production and REMI technology.
  • Learn the techniques to maintain frame-accurate genlock and audio lip-sync across dozens of cameras.

 


 


Bill Hogan:

Let’s start our program. Jim Jachetta from VidOvation will talk to you about remote production and utilization of 5G for newsgathering and production and everything. And he’ll turn on his mic, and we’ll turn it up here in just a second.

Jim Jachetta:

Do you guys know who I am? Jim Jachetta. It’s been a while. There were a few years when my colleagues and I came on a pretty regular basis. I appreciate you guys, you have good food and good people. Hopefully, your brains won’t hurt too bad after I get through this. I geek out on this stuff. So, how many people in the room do some remote production, bringing feeds from the field into the studio? Got a few hands. Thanks, Glenn. There are numerous ways to do it. In the early days, you’d use a fiber or satellite link to bring those feeds in. We’re going to talk about 5G tonight.

We will talk about private 5G, public 5G, and bonded cellular, though the overarching theme is streaming video through the public internet. Let's dive into it. Why do we want to do this? Well, we call it at-home production or REMI production. VidOvation has been doing at-home or REMI production for years. REMI stands for remote integration. It's got the word remote in it, but integration? "At home," I think, is a better term. And what do we mean by at home? The old-school approach of doing a show was having a truck in the field. They do Monday Night Football.

They do the Super Bowl. There might be three trucks just devoted to graphics. One truck or several trucks are devoted to replay. There could be a dozen trucks for a significant event. Now, you have to fly all those people out to the event. You have to pay them per diem for food, for the hotel. It’s costly. And then these knowledge workers who know how to run an EVS or a Ross Carbonite production switcher, or whatever the task might be, can do one game a week if they’re lucky. They’ve got to fly out on a Wednesday for the Sunday game or on a Friday for the Monday night game.

If they’re lucky, they can do two games a week, squeeze in a Thursday night football and a Sunday game. With at-home production, we’re eliminating the truck. Some of our customers are truck providers, so they hate it when I say… They hate this slide. A vendor called me up and said, “You can’t use my truck on your website anymore. You’re making my truck look bad.” But it’s just to make a point. Putting the red circle through it is to make a point. The truck may just be moved to a different location, or the truck is smaller.

Some customers will do a hybrid. Maybe they shade the cameras locally and then home-run them back to master control. With at-home, we bring it back to our master control and produce the show back home. Now, during COVID, home literally became home. A lot of operators were doing it from their house. The word home became even more relevant.

This slide is just showing that by using this at-home or REMI technology, we can reduce the number of personnel we have to put in the field, deploy quicker, and save a lot of money. The net result is that second-tier, third-tier, fourth-tier sports and live events that you normally wouldn't see on TV are now getting coverage. ESPN2 is carrying bass fishing tournaments, and we're involved in that. I don't think they're going to be doing Monday Night Football strictly on bonded cellular any time soon. That wouldn't be the play.

It's all a reduced footprint. I'm going to talk about some of the bigger projects we've done. One of our current customers is the PGA. For the top-level tournaments, the Masters and such, I know they're using a more traditional workflow. They're using a fiber connection. Pebble Beach has a fiber connection from there back to St. Augustine to do the Pebble Beach tournaments. But we do the second- and third-tier PGA, the minor leagues of golf.

We do the Korn Ferry Tour. We're doing Q-School in December. Q-School, the qualifying school, is like the training camp of golfers. I don't know if you guys are big golfers, if you know all this stuff. If you see a Korn Ferry tournament on the Golf Channel, that's being carried over a bonded cellular solution provided by VidOvation. And are any of you guys fishermen? Do you like to fish? Glenn, you own a boat. Do you like bass fishing? You like tuna, good choice. There you go, there you go. I like your thinking, I like your thinking. Believe it or not, people like to watch other people fish on TV. It's a thing. We work with Major League Fishing, we work with Bassmasters, and a couple of others … we just got into this groove where we're doing a lot of bass fishing tournaments, and our competitors were struggling to have a good signal out on the lake, a brand that begins with the letter L. I won't say their name. Some of these tournaments were using this other brand and the signals were choppy. They weren't getting a good signal. The customer comes to us and says, "Hey, I hope your product works better because what I got ain't working."

The bonded cellular solution that we provide from Haivision fit that challenge. The competitor’s product was going zero bits, one meg, zero bits, one meg, struggling, and they put one of our units in a boat right next to it and we were at five meg solid.

And why is that? It’s a lot of little things, better antennas, better modems, better algorithm. But we encourage you to try it, try to break it. I love it when customers try to break our stuff.

So VidOvation and our partner Haivision, we facilitated the first live sporting event after the lockdown. In March of 2020 we all went into lockdown. Sports stopped, all major sports stopped. So we did a skins game with the PGA. It was a charity tournament, so it wasn't... well, there was a purse. No, actually, I take that back. I think the players donated money, and the PGA, the league, donated money. So they were playing for charity. They weren't playing for their own purse. They picked four star players, we had a bunch of cameras out on the course live, and the local county health department would not allow more than 50 people on the course.

They did it at Seminole Golf Club. One of the players' fathers is a member there, and it's not a PGA-approved course. It's a private club, so there's no fiber connection there. There's nowhere for the broadcaster to hook up. They could have brought satellite trucks in, but that was too expensive. The PGA was already using our tech for these minor league golf tournaments. So this went out prime time. This was kind of a big deal. It was a prime event and there were a lot of eyeballs because golf had stopped, all sports had stopped.

So here are the challenges. The PGA came to VidOvation and said, we're using this other brand, the L word, and we have a lip-sync problem and a genlock and video synchronization problem. You're doing a live show: camera one arrives early, camera two arrives late, the audio on one is late, the audio on two is early. You can imagine, you have no post-production to fix this mess, and microphones are bleeding into each other because the cameras are in close proximity. So there's got to be something better. And they came to us. At the time, the bonded cellular product from Haivision was still Aviwest; Haivision bought Aviwest about 18 months ago, so now it's part of the Haivision family. So they did some tests, and you can see here on this slide... actually, the tournaments we do now are much more complex.

This had six cameras and maybe twenty, twenty-five microphones. So you've got a parabolic microphone operator aiming at the tee box. Then you've got a photog filming what the mics are picking up. Whoosh, whoosh. "Fore! Fore!" All the sounds of golf. It was out of sync. There would be echo, there would be beating. It's unusable. So it's patented technology. What facilitates this is what Aviwest calls SST, Safe Streams Transport. It's funny, I was joking with someone before that Haivision invented SRT, right? You guys know that. So they bought a company whose protocol is SST; they kind of go together. SRT, SST. SST is used for the first mile, to get it over cellular. SRT is used for the long haul. So it kind of makes sense why Haivision bought Aviwest. It's this SST that makes this all possible, and they're doing it over an unmanaged network.

So it's not the same as Precision Time Protocol, which is what's used for synchronization in SMPTE ST 2110, but it's a similar mechanism. They're sending timing signals, timestamps, to and from the unit in the field, synchronizing with the unit in the studio. And as long as you feed genlock into the receiver, all the video and audio comes out of the box in perfect sync, or near perfect sync. And that's what the PGA wanted, and we met that challenge. So this is a slide of the SST. The bigger units are called PRO, that's their name, PRO. The smaller units are called AIR. So what's the difference? They're essentially the same: both have a really high-quality HEVC codec. I've learned this from working with different vendors: not all HEVC codecs are the same. Some are really good, some are not so good. The Haivision Aviwest implementation is second to none.
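
To make the timestamp idea above concrete, here is a minimal sketch in Python. It is not Haivision's SST implementation, just the general principle: every frame carries its capture timestamp, and the receiver releases each frame a fixed target latency after capture, so feeds that took different network paths still come out aligned.

```python
# Minimal sketch of timestamp-based alignment (illustration only, not SST):
# each feed's frames are released at capture_time + a common target latency,
# which must be identical on every encoder for genlock and lip-sync to hold.

import heapq

TARGET_LATENCY = 1.0  # seconds; the same value on every feed

class AlignedReceiver:
    def __init__(self, target_latency=TARGET_LATENCY):
        self.target_latency = target_latency
        self.queue = []  # (release_time, feed_id, frame)

    def on_packet(self, feed_id, capture_ts, frame):
        # Release every frame exactly target_latency after capture,
        # regardless of how long the network actually took.
        heapq.heappush(self.queue, (capture_ts + self.target_latency, feed_id, frame))

    def pop_due(self, now):
        # Return everything whose aligned release time has arrived.
        out = []
        while self.queue and self.queue[0][0] <= now:
            out.append(heapq.heappop(self.queue)[1:])
        return out

rx = AlignedReceiver()
rx.on_packet("cam1", capture_ts=10.0, frame="frame-A")  # arrived quickly
rx.on_packet("cam2", capture_ts=10.0, frame="frame-B")  # arrived late, same capture time
print(rx.pop_due(now=11.0))  # both released together at capture + 1.0 s
```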

So the bigger unit has more modems. The smaller unit has two modems. For the bigger unit, the 4G LTE model has four; the 5G model has six. Why six? The 5G modems are a little bigger. They couldn't fit eight of them in the chassis, so it's limited to six. The little guy has two modems. The PGA initially used the AIR for audio only. If a camera operator had a boom mic or a parabolic mic, he had the little unit on his hip; he didn't need video, so the two modems worked fine. But for workflow, and for having spares, the PGA said, you know what, I think it works better if we just have all the units the same. So they've since standardized on the PRO. So here's a little slide. What's missing here is another graphic showing the PRO 360, which is the 5G model with six modems.

And of course the 5G model also supports 4G; it's not 5G only. So this is showing how we go through the cellular network, the cellular network dumps to the public internet, and then we hit the public IP of the receiver in St. Augustine at PGA master control. Depending upon the quality of the network, we can adjust the latency.

Now, you obviously have to have the latency set the same on all your encoders to maintain that genlock and lip-sync. The connection was okay, good to average, so we ran this event at 1.4 seconds. It worked fine because they had no commentators on site. Sometimes there can be a challenge where there's talent in the studio and then there's talent on stage or in a tent at the venue. It's like, okay, we throw it back to the field or we throw it back to the booth. Talent that is familiar with working on satellite will count it off: one one-thousand. "Hey Eric, how's it going today?" One one-thousand. "Great, I'm doing fine, Jim, how are you?" One one-thousand. So you don't step on each other; you figure those things out.

Another big differentiator: the PGA didn't like the idea that most bonded cellular rigs you've got to wear as a backpack. Some operators don't like that for safety reasons too. There could be safety concerns: with cellular, you're really not supposed to be on your iPhone all day right near your brain. I try to use Bluetooth or speaker; the same thing applies when the unit is transmitting on your back. So the preference is to mount it on the camera. Aviwest, now Haivision, took that approach. They took a page from the microwave links that mount on cameras, where you can see in the top left here, they put the mobile encoder between the battery and the camera. The PGA really liked that. Operators will put the camera, the unit, and the battery in a Porta Brace bag. The comms are wired up, the intercom, the wireless mic for the talent.

Everything's all pre-wired and tested. Put it in the Porta Brace bag, ship it out to the field, take it out of the bag, turn it on, and you're ready to go. With a backpack, it's "Oh, I need to hook up the intercom wire," and "Oh, I put it in backwards," or "I put it in the wrong port." It's just fewer moving parts and it's just easier. And then you also get a better signal. The unit is up on the operator's shoulder, away from their body. If it's on my back and the tower is that way, now my body is between the tower and where the transmitter is trying to transmit. So it makes a lot of sense. But then, lower left corner, there are those cases where they're using a smaller camera. We've done live... we didn't invent the category, but we helped facilitate it.

I like to say we helped facilitate the category of live reality TV. We did the first generation of Live PD, and an ENG camera was too big. The photogs are in the backseat of a cop car. You can imagine this big ENG camera with a long lens: they would get in and out of the car, and once they come out, oh, they left the lens in the car, or getting into the car, the camera's just too big. So some shows, lower left, I believe that's the camera they use on Live PD. It's a lower-end pro camera, smaller, so they can hold it against their belly and get in and out of the car. Another reason not to have a backpack: have you ever tried to get into a vehicle wearing something on your back? You've got to take it off. I have the bad habit of getting ahead of myself; I have a slide for this, so I'll circle back to Live PD. Then on the right we have the AIR. The AIR goes in a little sling that you can wear over your shoulder, or there's a little belt clip where you can hang it on your belt. Lower right, you can see there's a 1/4-20 thread, so you can even put it on an accessory mount on the top of the camera if you're using a smaller camera.

So I kind of covered all this: it mounts on the camera. You can see a film crew there, bottom right; this was election night 2020. The film crew was from European TV, from Sweden, I believe. And you can see the unit there, in a Porta Brace bag. He's got headphones on, the mic is wireless, and she's got a wireless IFB in her ear. So he's an untethered camera operator. They're both hearing comms, listening to direction from master control 8,000 miles away and transmitting live. This was in Times Square on election night, so they're interviewing Americans: hey, what do you think about the election? What's happening?

So here's the thing, we all learn something; I love learning something from my customers. Have you ever seen in golf that red line that follows the ball? There's like a red streak. There was a red streak for a while in hockey that followed the hockey puck. That was when things were standard def and you couldn't see the puck, so they made the red line so you knew where the puck was. This is kind of similar. I had no idea how the technology works. It's called Toptracer, and the telemetry for it tracks the coordinates of where the ball is in the picture frame and sends that telemetry through an audio channel. So the Haivision product happened to have analog audio inputs, to the PGA's surprise. Like, oh, you've got analog inputs? We could put those to use. Otherwise they would have had to use an audio embedder to add a secondary or third audio channel.

That would have meant another box, and mounted on the camera that's not convenient. So they were able to pipe the Toptracer telemetry right into the analog audio inputs to get that red arc that you see when they're teeing off. These are some of the stats of the job; I won't read this to you, it's a little bit of an eye chart. Another big event, golf related again: we did the Ryder Cup. Aviwest is based in France, so the team is still in France, the mobile bonded cellular team of Haivision now, still operating in France. The French make fun of the Canadians: yeah, you sort of speak French, but it's a different accent. But they're sorting that all out. So the French were a little surprised there's a major American golfing tournament in Paris, the Ryder Cup.

So the Ryder Cup that year was in Paris, and the scope of work was 20 channels of ISO video from Paris back to Atlanta, this was with Turner Sports, and then 10 channels of return, because the commentators were on site. The commentators needed to see the program, or multiple program feeds, on the return path, and this all went through the public internet. So here's a testament to the agility of the Haivision solution. Do any of you guys know Tom Sahara? He was the president of SVG, the Sports Video Group, for a while. He was VP of engineering at Turner for, I don't know, 30, 40 years, something like that. He's retired now. So it wasn't the night before the game; it was the night before the dress rehearsal, like a day or two before the game, and I see his name on my caller ID at 11 o'clock at night.

Tom Sahara calling me at 11 o'clock California time, so it's 2:00 AM his time. Oh shit, this can't be good. So Tom calls me up: "Jim, Jim, Jim, I think we screwed something up. We've got a big, big fricking problem." "Tom, Tom, Tom, take a breath." And Tom's not an excitable guy; he's a very soft-spoken guy. So he was a little agitated, and I'm like, what's up, Tom? What's the problem? He goes, "The production trucks in Europe all run at a 50 hertz frame rate. We are, as you know, Turner and the US, 59.94. I don't have 20 frame synchronizers to convert from 50 hertz to 59.94." "No worry, Tom. Hold on a second." I went downstairs to my office, logged in remotely to the unit, and I fixed the problem. "What do you mean you fixed the problem? What'd you do?" Oh, I went in and changed the output.

So normally we make the output frame rate match the frame rate of what's coming in from the camera. The camera is 59.94, so the output is 59.94. But a customer might be streaming to the web and want 60, so they set their cameras for 60. The workflow might change, so we usually leave it on auto. All I had to do was go in and change the output to 59.94; the frame rate conversion is built in. These Haivision engineers, these French engineers, are like, well, of course we have that feature built in. I'm like, you guys are too modest. You need to brag about this stuff. So Tom's like, okay, I can go back to bed. I said the problem's already fixed. And then there's the subject of, well, not all frame rate converters are equal. Particularly with sports, when you convert from 50 to 59.94, a golf ball moving through the air, basketball movement, there can be temporal distortions.
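
As a quick aside on why that 50 to 59.94 conversion is non-trivial, the arithmetic below uses only the frame rates quoted above: the rates don't divide evenly, so the converter has to create extra frames by repeating or interpolating, which is where the temporal artifacts come from.

```python
# Back-of-the-envelope on 50 Hz -> 59.94 Hz conversion.
src_fps = 50.0
dst_fps = 60000 / 1001           # 59.94...
ratio = dst_fps / src_fps         # output frames per input frame
extra_per_second = dst_fps - src_fps

print(f"{ratio:.4f} output frames per input frame")       # ~1.1988
print(f"~{extra_per_second:.2f} frames must be created every second")
```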

Nobody noticed anything because the picture looked beautiful. Tom was like, okay, we solved the problem, but is the picture going to look lousy? So it's those little things that differentiate it. So here's the workflow. You see a bunch of videos coming from the venue back. For this workflow they had a smaller vehicle, you can see the pictures here, because they wanted to shade the cameras on site. Now there's a benefit to that: you shade on site, you can shade in real time, low latency. I'll show you on some slides coming up that we're able to shade through the cellular network, but it's typically a one-second latency, so you want to make subtle adjustments, morning-to-afternoon changes in light. For this event they wanted some presence; they had a smaller footprint than normal. They shaded the feeds, sent them back to Atlanta as ISOs, and then they switched the show in Atlanta. So it's kind of a hybrid at-home: we're going to do a little shading on site, but then we're going to bring it all home. So then there's the Live PD show. I was telling someone before, during the cocktail hour, I don't know, I'm just a fiend for solving crazy problems. Customers call at 11 o'clock at night on a Friday: oh, the sky is falling. So the Live PD show was a similar scenario.

A technical consultant was fired a month before the premiere. A&E was already running advertising for the premiere of the show: nine o'clock on Saturday, blah blah blah, tune in for Live PD. I mean, that would be embarrassing if you can't do the show, right? So a new tech consultant was hired, and someone in their infinite wisdom on the production company side asked: what is a live reality TV show a lot like? It's a lot like a live sporting event. So they brought in a director from NBC Sports, Johnny Gonzalez, and they brought in Larry Barbred Sulis, an engineer from CBS Sports. Larry was the one who called us up in a panic: "I took over this guy's job. I think they conned me. They said that everything was working and nothing's working. I'm going to be fired, you've got to help me. I'm at wit's end, I've tried everything."

So I go to Larry, I go, well, we have this genlock thing and lip-sync thing. And he goes, look, I've heard of you, I've heard your name, we've never worked together. I want to trust you, but I'll believe it when I see it. So I talked to the vendor and they're like, we did a similar TV show where we were able to keep things in genlock and lip-sync. Larry was very clever. He goes, let's try this out remotely. Do you have four cameras? I was like, no, I only have one camera. All right, take the one camera, feed it into a DA, and feed the outputs into four of your links. Then blast music and let the camera pick up the music. Then I said, well, how do we measure the video synchronization? Go buy an analog clock. I went to Target and got a clock that had a sweeping second hand, an $8 clock, and put the camera on that. I had already sent the receiver to Larry. So Larry opened up all four copies of the audio. It's actually eight audio feeds because it's stereo; I had the stereo mics on the camera open. So there are eight speakers. Now, if that audio was out of phase or out of lip-sync, you would have beating, canceling, non-canceling.

The music was perfect coming over those eight circuits. So he's like, unless you're doing some kind of voodoo that I'm not aware of, I think we've got something here. And just with the human eye he could see the second hand on all the clocks, on all four feeds, was roughly the same. So he says, okay, overnight me the four units. I'm going to get four cameras and drive up and down the West Side Highway to simulate this cop show and see if the cameras stay in sync. He calls me two days later: we've got a show. It's great. Okay, did I give you the air date of the show? No, I don't remember. The good news is you got the contract. The bad news is we need 40 units in less than two weeks. And Aviwest, Haivision, they delivered; they got the units delivered.

So for these photogs, Larry made them a special bulletproof vest that had a pouch on the front, so the unit would sit on their belly and the battery would be on the outside, so they could get to the battery and change it, and again so they could get in and out of the vehicle. They've got the camera in their lap and the unit on their belly, so they can get in and out of the cop car. There are funny things in the show too, like camera one somehow gets ahead of the perp; the camera operator is in better shape than the perp. The cop is like 50 yards behind, and I'm like, well, why doesn't the camera operator trip the perp to slow him down? No, no, they're not allowed to interfere. So there are funny things like that. The cops are huffing. Are any of you guys police officers? I'm out of shape, I'll use me as an example. If somebody came after me with a gun, I would just give up. I couldn't run. And these guys, they've got bulletproof vests on.

It is a lot of fun, a crazy, crazy show. And so all these feeds come in, and how do you treat a live show? Now, there were some legal stipulations on the show. The police agencies would only agree to be on the show if they had final cut approval. What I learned through doing this show is that the FCC allows you to call a show live with up to a 29-minute delay. I also learned another term: there's live, and then there's live-live. I'll explain. So these guys, Larry and his colleagues, came from sports. In sports, for replay, the go-to is usually an EVS system. So they dumped these 40 camera feeds into an EVS, and then you see top left, these interns, these young people; they do it in basketball too. When a basketball game is live, somebody's punching in metadata: when Shaq scores or smashes the backboard, or whatever LeBron is doing, there's metadata keyed into the video live.

Then another level of technicians or analysts will go back and add more metadata after the game is over, so all the stats of the game are baked into the video when it goes into archive or when it's reused. Same approach here. G means there's a gun present. D, drugs. There were certain codes on the keyboard, and then a line producer would be watching this: whoa, whoa, I see out of Tulsa there's a gun and drugs and blood. What's the story? What's the line in news? If it bleeds, it leads. It's a little ghoulish, but this is a cop show. So a line producer would decide, hey, this clip that happened a few minutes ago out of Tulsa is great. Then they would take that and package it. They'd say, from time mark 10:01 PM, blah, blah, blah, to whatever, put that into a package, and they play it back to the multi-viewer in front of the director and go, hey Johnny, be ready.

3, 2, 1, we've got a clip out of Tulsa. Okay? So he goes, take camera one, take camera two. It's being played back, but Johnny, the director, and the TD cut it like it's live, and as long as it's within that 29-minute window, you treat it like it's a live show. I was on set a couple of times, and Dan Abrams is the anchor. He goes, "Hello ladies and gentlemen, welcome to Live..." ah shit, an f-bomb. Oh, let's do it again. Do it again? What's he doing? It's a live show. No, it's being recorded into the 29-minute buffer. So stuff like that. Or there will be a 20-minute bust: they arrest the guy, they put him in cuffs, they pull him over, there's a scuffle, and then that PD says his Miranda rights were not read right, you've got to cut the whole thing. So the 29-minute buffer shrinks because they cut a big chunk of the show out. What would happen is they would run out of content; too much would be cut, too many f-bombs they'd have to cut.

So you'd hear in the studio: guys, no screw-ups, we're live-live. There's no safety net, we're at zero. When we break for commercial, we'll gain two minutes back. So every time there's a commercial, we get two or three minutes back. So you say, okay, we'll get some time back. So there's live and there's live-live. You can see my enthusiasm. This live reality show was invented around our technology, basically. Live PD took the show Cops and made it live. I mean, they pretended like they invented the idea of following cops around with cameras; they just put the live angle on it, to give the Cops show its due. These are some of the intricate features of the product. I won't go into this; I want to stay with the good stuff. So at-home production or REMI, this is kind of like I mentioned earlier.

On the left is more of the traditional at-home, where we're pushing the whole show through. Some shows might switch the show on site, or, in the case of the Ryder Cup, they did some camera shading locally, so you can see there's a switcher or some workflow happening on site. So there are different ways of doing it. It's very common now to produce the whole show in the cloud using Grass Valley AMPP, or you use vMix, or Paul's using OBS here today. There's a myriad of inexpensive tools, SimplyLive is another one, inexpensive or even free production tools that could be implemented.

Here's an interesting use case. I kind of touched on this a little bit before: shading cameras. The bonded cellular has signals going in the other direction. We have what we call a data bridge; basically it's like a VPN from the studio to the camera. So what can we use that for? We can shade cameras, we can control PTZ cameras, we can do lots of cool stuff. You can see here one of the Ethernet ports is used to bring that signal back to master control. The other Ethernet port can be used if you have a wired internet connection, because cellular has a cost; cellular is not free. If somebody gives you an internet connection that's relatively free or lower cost, you want to plug that into the unit. But one of the problems with controlling cameras over an unmanaged network is latency.

Now, typically the data bridge has 100 to 150 milliseconds of latency, and camera control units don't like latency. Glenn just stepped out, I wanted to ask him; Marshall has this problem. Camera control units are meant to be used in the studio. Paul is controlling this PTZ over here, and he's got an Ethernet connection. We're talking, what, two milliseconds if that, or even microseconds. Typical camera control systems can handle 10, 15 milliseconds; they're meant to be in the studio. Now we're extending that over the cellular network to about 150 milliseconds, and some cameras don't like it. Sometimes the CCU, the RCP or camera control unit, thinks it's lost connection with the camera, and a camera will do unpredictable things if it loses connection: it closes the shutter, it opens the shutter. So we partnered with another VidOvation partner called Cyanview, and I like to call them the Swiss Army knife of camera control.

Literally, we haven't found a camera system that is capable of control that they can't control. I mean Sony, Panasonic, JVC, [inaudible 00:38:55], BirdDog, you name it. And what it does is, the little RIO here, the R is for remote I/O, the RIO mimics the RCP. It spoofs the camera to say, hey, I'm your RCP right here, I'm your CCU. Nope, no changes to the iris, leave it at 34, leave it where it is. The black ped? Nope, it's the same. So the RIO and the Cyanview RCP work together; that's a typo on the slide, it says RIO there, it should say RCP right here. The RIO talks to the RCP: has the red changed, the green? Any changes in iris? Any changes in black ped? Nope, it's all the same, and if a change is made, it gets passed along.

So it smooths those bumps out and it works really, really well. And they're a Belgian company with great tech support. It's a very IT-intensive product; we've got to route the signals through the StreamHub. Cyanview is very familiar with the Aviwest Haivision product, so they know how to do it. You can see here the same RCP can control two different types of cameras. With the Cyanview RCP, one RCP can control up to 50 cameras. Now, if you are actively shading 50 cameras, that's going to be challenging; you probably want an RCP for each camera that you're actively shading. But it's fine if you're just shading them once for the event, doing a little touch-up if the sun changes or it's a long day event. Customers love this. We sell a ton of these and it really is cool.
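
To illustrate the smoothing idea described above, here is a hypothetical sketch in Python. It is not Cyanview's actual protocol, just the general pattern: a local proxy next to the camera answers the camera's frequent "any changes?" polls instantly from cached state, while real operator changes travel over the slower cellular data bridge and are applied when they arrive.

```python
# Illustrative sketch only, not Cyanview's protocol or firmware.

class ControlProxy:
    def __init__(self):
        # Last known paint settings, e.g. iris still at 34, black pedestal unchanged.
        self.state = {"iris": 34, "black_ped": 0, "red_gain": 0, "blue_gain": 0}

    def handle_camera_poll(self):
        # Answered locally in microseconds, so the camera never decides
        # it has lost its RCP and does something unpredictable.
        return dict(self.state)

    def apply_remote_change(self, param, value):
        # Called when an operator change finally arrives over the ~150 ms data bridge.
        if param in self.state:
            self.state[param] = value

proxy = ControlProxy()
print(proxy.handle_camera_poll())      # instant local answer: no changes
proxy.apply_remote_change("iris", 36)  # change forwarded from master control
print(proxy.handle_camera_poll())      # next poll reflects the update
```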

And then they make the interface cable for all of these cameras. It's always the cable that gets you, the cable between the little RIO box and the camera. So they make the legacy 8-pin Hirose connector cable for legacy Sony cameras. If it's an IP camera, they make a cable for that. So it's all in the details. It's the cables.

So I'm emphasizing cellular. I have some slides... I've been accused of turning webinars that were supposed to be 60 minutes into two hours. Have you guys come up with a hook, or is there a timer? Okay, well, there's more wine coming; Glenn went out to get some more wine out of the trunk of his car. So we all say bonded cellular, but we can also use satellite. Some of our customers will use Starlink. Starlink is pretty good, though at primetime, when everybody's streaming video, it can struggle. But during a daytime golf event, you'll get a pretty good connection on Starlink. We bond all these different types of connections together, and they all have different cost bases, and we can prioritize that. We can give a higher priority to the less expensive connection to save you money.
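
As a rough sketch of that cost-based prioritization, here is a minimal Python example; the link names, capacities, and cost rankings are made up for illustration, not the actual bonding algorithm.

```python
# Minimal sketch (assumed policy): fill demand from the cheapest usable link
# first, spilling over to more expensive links only when needed.

LINKS = [
    # (name, capacity_mbps, relative_cost: lower = cheaper)
    ("venue_ethernet", 50.0, 1),
    ("starlink",       15.0, 2),
    ("cellular",       20.0, 3),
]

def allocate(demand_mbps):
    plan = {}
    remaining = demand_mbps
    for name, capacity, _cost in sorted(LINKS, key=lambda link: link[2]):
        take = min(capacity, remaining)
        if take > 0:
            plan[name] = take
            remaining -= take
    return plan, remaining  # remaining > 0 means demand exceeds total capacity

print(allocate(60.0))  # cheapest links carry most of the load
```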

Then this is the setting. So cellular you've got to pay more for... actually, this is probably backwards; satellite probably costs more than cellular, so that should be flipped. But you get the idea. If somebody hands you a free internet connection to hook to your bonded cellular unit, you'd be a fool not to try to use it. And the unit's intelligent enough that if the connection's garbage, it'll ignore it; it just won't use it. This is another cool feature. Most bonded cellular does not have full frame rate, full resolution return video. They do have some sort of return, but the Haivision solution is full frame rate. So you can pick it: you want to send 1080p as the return, and you see there's a green path and a blue path, so you can have multiples. One could be program video I'm sending to the video village at my remote location.

Another could be prompter, teleprompter; a lot of our customers will do that. One unit can only do one return, so if you want to do program and prompter, you'd have to have two field bonded cellular encoders. But again, you've got to account for the latency. Typically, on average, the inbound feed is about a second, or eight tenths of a second. The newer Haivision units go down to half a second; the cellular networks are not quite good enough to handle half a second yet, but they're getting there. I have some slides on private 5G, where we're able to get down to 80 milliseconds. So we're getting there. But again, if the video coming into the facility is about a second and the return is 700 milliseconds, then if I'm looking at program or a teleprompter in the field, the teleprompter operator back in master control had better be a little early to compensate, or you give the talent or someone local control through the data pipe to advance it a little faster, because of the round-trip latency.
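
A quick back-of-the-envelope on that round trip, using the approximate figures quoted above:

```python
# Rough round-trip budget with the numbers mentioned in the talk.
inbound_s = 1.0   # field camera -> master control
return_s  = 0.7   # program / prompter return back to the field

round_trip_s = inbound_s + return_s
print(f"Prompter operator should run roughly {round_trip_s:.1f} s early")  # ~1.7 s
```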

People are clever, though. I mean, everyone in broadcast, on-air talent, production people, they're used to a few seconds of delay working with satellite. So it's that whole thing I did with Eric: "Hey Eric, how are you doing?" One one-thousand, two... you get used to it. Sometimes it takes a little bit of training. I mentioned bringing this VPN connection out into the field; that can be used as a hotspot or for internet for other purposes. At VidOvation we also... I was talking with Carlos before, there he is. We were talking about, maybe I don't need a bonded cellular video encoder; you're paying for the HEVC codec. I just need something like a Peplink. So we do bonded cellular internet connections too, because some customers' workflows dictate that. We're working with a broadcaster right now; they own half a million dollars' worth of Makitos streaming SRT.

We said, well, it would be better if you replaced those with bonded cellular Haivision units. No, I want to use my Makitos; I just need internet. So we sold him a ginormous Peplink. You'd love this box: it's got 24 modems in it, it's a 2RU box, the thing's a beast, and we can bond that all together. So there is a way to make it work, and I love a challenge. As I mentioned earlier, this slide shows some of the ins and outs. A big differentiator is IP in and out. The StreamHub, the receiver, is IP agnostic, vendor agnostic. What does that mean? Most bonded cellular receivers only receive their own brand of input.

Haivision does it differently. Because they invented SRT, they'll take SRT in, they'll take HLS in. What's interesting is that when you buy a bunch of Makitos now, Haivision actually recommends the bonded cellular receiver as the receiver of choice, just due to its agility. Another workflow I was talking with Carlos about: if you're streaming to social media or a CDN, whether that's a free CDN like YouTube or Facebook or a paid CDN, most of them do not support HEVC. They're afraid of licensing litigation from the MPEG-4 consortium, so they still use H.264. Our competitors will say, well, the solution for that is at the camera: you'll have to drop from HEVC to H.264 and send that through, since the final destination can't handle HEVC. Haivision doesn't do that; it's more sophisticated. We do HEVC to the unit, so if you have SDI outputs, we're not sacrificing quality on your primary SDI outputs. Then there are your secondary outputs going to the CDN.

Maybe it's for monitoring, even going out over IP. We transcode in the unit from HEVC to H.264 for just this purpose. And, I don't know, have any of you guys tried to stream to somewhere like Facebook? It's got to be constant bit rate, five megabits, main profile. And then YouTube wants something slightly different, and Ooyala, your CDN, wants, again, something slightly different. So you'll have three transcoder engines transcoding to the exact profile these different cloud destinations want, and no one else can touch that. No one else can do that. These French guys are really humble. This box is just a Swiss Army knife of video and IP.
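
To illustrate the per-destination transcoding idea in a generic way, here is a hedged sketch using ffmpeg rather than the StreamHub's internal transcoders. It assumes an ffmpeg build with SRT support, and the bitrates, profiles, and RTMP URLs are placeholders, not real endpoints or the actual requirements of any particular CDN.

```python
# Sketch only: one incoming HEVC contribution feed, and a separate H.264
# encode per destination, each matched to that destination's profile.

import subprocess

SOURCE = "srt://0.0.0.0:9000?mode=listener"   # assumed incoming HEVC feed

DESTINATIONS = {
    # destination: (video bitrate, H.264 profile, output URL -- all placeholders)
    "facebook": ("5M", "main", "rtmp://live.example.com/facebook/STREAM_KEY"),
    "youtube":  ("6M", "high", "rtmp://live.example.com/youtube/STREAM_KEY"),
}

def transcode_cmd(bitrate, profile, url):
    # Constant-ish bitrate H.264 with AAC audio, pushed over RTMP.
    return [
        "ffmpeg", "-i", SOURCE,
        "-c:v", "libx264", "-profile:v", profile,
        "-b:v", bitrate, "-maxrate", bitrate, "-bufsize", "10M",
        "-c:a", "aac", "-b:a", "128k",
        "-f", "flv", url,
    ]

if __name__ == "__main__":
    # One transcoder process per destination, all fed from the same source.
    procs = [subprocess.Popen(transcode_cmd(*cfg)) for cfg in DESTINATIONS.values()]
```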

Again, most of our customers use a physical receiver. Even though the cloud is all the rage now, people want to bring it into a master control. But we do have some customers where it's exclusive to the cloud, or they'll bring it into their master control and then use the IP outs to bring it to the cloud, kind of a parallel workflow. You can buy the Linux software and spin it up in the cloud; AWS is usually our vendor of choice, but you just need a Linux instance in Azure, Google, or AWS, or whatever cloud provider. Sometimes customers build their own private cloud; we'll sell them the software and they put it on their own VMware or VM machines. Then these are all the protocols we take in and put out.

What's missing here is the bonded cellular protocol, SST, the Safe Streams Transport; I see a typo right now. But these are all the third-party protocols. This slide is probably supposed to emphasize all the other protocols, but SRT, being a Haivision thing, has got to be in there. So: RIST transport over IP, RTMP, HLS, NDI. A lot of our customers will use something like a TriCaster. I mentioned production switchers earlier and I forgot TriCasters and NDI. If you buy a TriCaster, a lot of your scalability limitation is the number of SDI inputs, but you can buy licenses to bring in a virtually unlimited number of NDI feeds. So Haivision Aviwest did a great NDI integration. And then WebRTC, I don't know if you guys are familiar with that. It's used a lot in gaming. It's extremely low latency, high video quality, because when you're shooting the bad guy in your video game, if there's lag between when you pull the trigger and when the dude dies, that's bad, right?

So Haivision uses WebRTC for a feature they call LiveGuest. I could send one of you guys an invitation to your email because you're going to be on-air talent tonight for the basketball game. I send you an email, you open it up, your browser launches, and you're in standby mode. You'll have a timer for when you're going to go live, and you can see the program feed. It'll use whatever virtual camera you have loaded. That could be your webcam, or you could have a higher-end broadcast camera virtualized into your PC or Mac. Same thing with audio: you could have a higher-end audio setup or your generic webcasting setup. And when it's your time to talk, you get a countdown: hey, we're coming to you, Bob, give us your thoughts on the game tonight. And from your man cave, you're now on ESPN, and it works great.

So it's an alternative. It's popular to use Skype to integrate feeds from talent working remotely; this is like a higher-end version of that, and the WebRTC protocol is what's used for it. I kind of mentioned this already, so this is a little redundant. Bill had asked about this; he kept forwarding me links to stuff on Sports Video Group about private 5G. Private 5G is all the buzz. So what is private 5G? Another name for private 5G is CBRS, the Citizens Broadband Radio Service, and hopefully you can see that. It's a band that's typically used by the Navy, and unless you're doing Formula 1 right near a Navy base, you should be able to use the private 5G or CBRS band. The Navy doesn't use the band that often, but if the Navy suddenly needed it, they're going to get it. The beauty of this is that the use and deployment of private 5G is all managed by the FCC, automatically or semi-automatically.

You go into a database. You set up your private 5G gear, and the system will go out to the FCC database and ask, hey, can this customer, XYZ Corp or ABC Corp, light up 5G at the Honda Center in Anaheim? It looks in the database; it knows who's already there, who's using it. Or, ahead of time, you can go into the database and see. If a venue has 90 or 95% utilization, you might have trouble getting in there, so you can plan whether you're going to be able to use private 5G. And then once you're given a channel in the private 5G spectrum, if you leave your radios on, it's yours indefinitely. Well, until the FCC changes their laws, and they've never moved spectrum around before... I'm being rhetorical. So that's a fear: oh, I buy all this private 5G hardware.

What if the FCC decides they want that spectrum back? The fact that the Navy is partially using it maybe helps. It uses what we call, in the cellular world, band 48; that's the spectrum, and it's 150 megahertz of spectrum. But with complex modulation we can get much more throughput. You're given a small slice of that, and with the 5G technology you can push as much as 250 megabits per second, two or three hundred megabits, up and down, and you're on spectrum you own. So there's no interference like with Wi-Fi. A lot of video gear works on the 5 gigahertz Wi-Fi band, so you're coexisting with Wi-Fi access points. Private 5G really makes sense, and it's much lower latency than Wi-Fi.
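
For a rough sense of where a number like 250 Mbps comes from, here is a back-of-the-envelope calculation with assumed values for the slice width and spectral efficiency; the real figures depend on the deployment.

```python
# Back-of-the-envelope only, with assumed numbers: throughput of a CBRS
# (band 48) slice is roughly channel bandwidth times spectral efficiency.

slice_mhz = 40        # assumed slice of the 150 MHz CBRS band
bits_per_hz = 6.0     # assumed spectral efficiency with high-order modulation

throughput_mbps = slice_mhz * bits_per_hz
print(f"~{throughput_mbps:.0f} Mbps")   # in the ballpark of the figure quoted above
```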

You get like four times the speed of 4G and much more capacity. What do we mean by capacity? That's user devices. Some venues will use the private 5G for fans in a stadium, and believe it or not, most of our devices, our cell phones, Apple or Android, support the band. You could keep your primary Verizon phone; I could buy or activate an eSIM in here and, instead of jumping on Verizon, jump on the private 5G. So it gives you a slice. It is literally like running a wireless Ethernet connection through the air. You own that slice, and the FCC orchestrates it. One of the vendors we work with is called Celona, and coincidentally enough, their product's called the Orchestrator. The Orchestrator works with the FCC database to assign you a channel. What's cool about it is... IT departments don't like anything new, right? We've got IT guys here.

Are you okay if we put private 5G in your facility? Are you okay with that? No, he's shaking his head; he doesn't like it. Well, here's the cool part: it works just like a Wi-Fi access point. You see the smaller device here, this is the indoor version. You hook it to your network, and you could set up a VLAN. Actually, it's a good idea to provision a VLAN just for the private 5G, but you could be on the same network. You provision a VLAN, and the access points are powered off PoE, so you hook them to your existing infrastructure. In the middle, this server/switch needs to reside on that VLAN. So in some IDF closet or in your data center, you need to have this switch/server that is the physical aggregation point of all the private 5G connections. Then that server hands off to the public internet.

Now, you could keep the private 5G network as a closed network. In our world, because we're streaming video, we're transporting video, we want to hit the public internet. But there are use cases where you want to keep it private; customers will use this at a ballpark for point of sale.

Have you ever been at a ball game when the credit card machine stops working and it holds up the beer line? It's pretty frustrating. A lot of those credit card terminals are working off of cellular. The smart venues, because the beer line is revenue and you want to keep that moving, will put private 5G in and have a private slice. And if a terminal is not 5G or CBRS compatible, we have a little Peplink adapter to convert it to private 5G. So the Orchestrator is what goes out to the FCC and says, hey, ABC Corp wants to use a slice. Okay, we'll reserve it for them, for however long they want to use it, semi-indefinitely or just for a couple of days. So you buy how wide an area you want to cover, how much power you're going to use, and for how long.
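
Conceptually, the reservation the Orchestrator files looks something like the sketch below: coverage area, power, bandwidth, and duration. The field names here are illustrative only, not the real FCC/SAS API or Celona's interface.

```python
# Conceptual sketch of a CBRS spectrum grant request (illustrative fields only).

from dataclasses import dataclass

@dataclass
class CbrsGrantRequest:
    operator: str          # e.g. "ABC Corp"
    venue: str             # e.g. "Honda Center, Anaheim"
    bandwidth_mhz: int     # slice of the 150 MHz in band 48 (3.55-3.7 GHz)
    max_eirp_dbm: float    # transmit power being requested
    duration_days: int     # a couple of days, or effectively indefinite

request = CbrsGrantRequest("ABC Corp", "Honda Center, Anaheim", 20, 30.0, 3)
print(request)
```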

And then there are outdoor versions of the antennas. So you can see the benefit. In the bonded cellular play, all we need to do is provision special SIMs that look for the right private 5G channel for the modems in our device. Now, granted, it's the 5G versions of the device; the 4G versions don't have band 48, but the 5G models all support band 48. So we put a SIM in. It tells the modem what slice of the band to look for, and it also authenticates itself. The SIM talks to the access point, the access point talks to the edge, which talks to the cloud orchestrator and says, yes, this is a valid SIM, it's not somebody trying to steal bandwidth from our network. So it's all orchestrated together. You can see here, you've got the devices on the left. So people ask, well, who's actually using this private 5G?

Every NFL venue has private 5G. Have you noticed the coaches all have Microsoft Surface devices with the pen? They're getting instant replay on there, they're sketching out plays. All of their comms are going through private 5G because Wi-Fi didn't cut it and microwave was too complicated. I've got to be careful, I'm under NDA: the technology used by the NFL is provided by Verizon, but Verizon is OEMing it from a vendor that I may or may not know. So connect those dots. The technology has been deployed; the NFL has been using it, I think they're going on season number two. We're working with other sports leagues for something similar, the NHL, the NBA. They'll cluster a bunch of Wi-Fi access points near the coaches' table, near the bench, but then there's interference from the access points of the venue. This is on its own network.

And the 5G access points have much broader range, so you don't need as many of them. One access point could cover half a mile in some cases; again, it depends on the concrete and steel that might be in the way. With one access point, we're pretty certain we can cover the whole parquet floor of an NBA facility easily, if not beyond, so they can do replay, do comms, et cetera. It's pretty cool technology. Sometimes we run into a problem where we're at a venue using public 5G or public cellular and we've got 80,000 fans all uploading to Instagram. In our world, we need the upload leg. I don't know if you guys know cellular, but when Verizon tells you speeds, it's always a download speed. The upload speed sucks; it's asymmetrical. That's the beauty of private 5G.

It's symmetrical because we have control over it. Otherwise, we're all trying to use the upload leg of the tower; everyone is trying to push clips from the game to their Instagram account, and we're all fighting for that bandwidth. So what we've done with some customers is put one or two private 5G SIMs in our unit, and we get 5 megs, 10 megs, 20 megs out, no problem, and look at the public cellular maybe just as backup. The private 5G network virtually never fails, so we're really excited about that. I tried to put together a clever drawing here. I'm trying to depict that you would use private 5G in or near the venue: you'd put these access points in the venue, and maybe you want to do some tailgating, so you put some access points outside the venue. The public cellular is usually away from the facility, but that's not always the case; many times there are cellular repeaters in the venue for fans to upload to Instagram, but then we're fighting for that connectivity. On the public cellular networks, there's been talk of offering quality of service: Verizon will give, say, CBS Sports a slice of the public 5G. There's been some experimentation; yes, theoretically it can be done. This solves the problem: we put in a private 5G network, we connect that to the public internet, and we bypass all that congestion caused by the fans.

So here's a little bit about what we do, if you don't know VidOvation. You're not supposed to show your age, but I've been doing this for close to 40 years. Geez. Yeah, it's coming up ahead; you've got a few more years to go, Eric. I've become a big fan... we did the first generation of the in-net goal cam for the NHL, and the word fan is a derivative of fanatic, which is not necessarily a positive connotation. I became a super fan of hockey because we did a lot; I became obsessed working at the Honda Center and Staples Center. And believe it or not, I've become a fan of watching, I made fun of it earlier, bass fishing on TV. There are rules. One time when I'm watching, the angler gets a fish in the boat, his pole slipped, and the fish hit the deck of the boat: five-minute penalty.

Fishing tournaments are timed; it'll be like four hours, and they weigh the fish you catch in those four hours. So a five-minute penalty could be pretty significant. You could pull in a fish every 30 seconds if you're in a good spot, and now there's a penalty because the fish hit the deck. But I'm like, there's a hook in its face. We hurt the fish, we dropped it; is that more traumatic? And again, that's semantics. So the angler is pissed, and these bass boats are pretty small. You see, kind of in the corner of the shot: time out, five-minute penalty, and the angler's getting all up in the ref's face, and the camera operator is backing up and you can see the camera shake. He's at the edge of the boat; he can't go back any further. So we're like, is there going to be fisticuffs? And the guy is bleepity-bleeping. It made the whole show. It made the whole event, this one angler going off because he was penalized. So there's color in bass fishing.

So somebody asked me earlier, what does VidOvation primarily do? We're a systems integrator. We're very consultative in what we do, and we don't really charge for our consultation: systems integration, project management, support, and warranty. And we'd love to become obsessed with what you guys do. If there's something we could help with, we'd love to do that. Does anybody have any questions? Do your brains hurt? We need more wine. Yes.

Speaker 3:

Hello. So, a question, in the simplest terms: if you have 5G bonded cellular in your cell phone or something and it's going back to a TV station, what is the connection? Is there an affiliate? What is the receiver at the television station if somebody's using their phone out in the field?

Jim Jachetta:

Yeah, so I...

Speaker 3:

I mean, what is the hardware?

Jim Jachetta:

Yeah, I could describe the hardware. Haivision has an app they call MoJoPro; it's like "mobile journalism pro," a term taken from news. So with the mobile app: I have a Verizon phone. I could have an AT&T hotspot in my pocket hooked to the phone's Wi-Fi, so I could bond my Verizon cellular and the AT&T hotspot together and go back to the studio. It goes to the cellular tower; the cell towers are usually connected by some metro Ethernet or SONET connection. And then, depending on the carrier, they dump to the public internet at some strategic point on their network. Where that's dumped to the internet is what adds to the latency.

Speaker 3:

So you don't have a cellular receiver in...

Jim Jachetta:

No. No, no, no.

Bill Hogan:

It’s going to the internet.

Jim Jachetta:

Yeah. Actually, that's a common question. Sometimes people want to use the system backwards: oh, can I do cellular at the receiver? You need an anchor somewhere. The receiver is usually either in the cloud or in your master control, sitting behind a firewall with certain ports opened. We point the transmitter to a public IP that's associated with that receiver. Typically, the easiest workflow is for the customer to have one public IP per receiver. If they don't have an excess of IPs, we can do some port forwarding and trickery to make one IP go to dozens of receivers. But yes, you need to hit a public IP on the internet, and most broadcasters have a fiber or a solid internet connection. So we would call that the last mile.

The last mile, from the internet to the facility, is usually no problem. It's that first mile, the challenge from the tower. Although I say the first mile, sometimes the congestion is not on the tower itself; it's the backhaul, or the SONET ring, or somewhere else on Verizon's network where there could be congestion. That's why we have multiple carriers and multiple modems. I should mention, and we've been doing this for a couple of years now, we use what are called multi-carrier SIMs.

Speaker 3:

So my Pixel phone has the capability of two SIM cards, so you could have T-Mobile and Verizon. Will it switch automatically based on signal strength?

Jim Jachetta:

With a multi-carrier SIM, you don't need a second or third SIM; the single SIM can switch. So what happens when the unit boots up? Six modems is kind of the common denominator, so we like two modems to be Verizon as the primary, two AT&T, and two T-Mobile. If Verizon should fail, it fails over to AT&T, then fails over to T-Mobile. Why do we pick something as a default? It just boots up a little quicker.

So it boots up, tries Verizon, and usually all three networks are there in some capacity. But then things fail as you're moving around, like at the PGA: hey, on the 18th green, we noticed now we're all on AT&T. Well, who cares? The unit's working. Then we move over to the ninth hole: oh, it's a mixture of both. Oh, now it's all T-Mobile. In the old days we would overnight SIMs to the PGA. Hey, Pebble Beach has no Verizon, can you overnight 300 AT&T SIMs? And they were pulling out Verizon SIMs and putting the new ones in. Because the SIM is agile now, they don't have to do that anymore.
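
The failover behavior described here could be sketched like this; the logic below is an assumption for illustration, not the vendor's firmware.

```python
# Sketch of multi-carrier SIM failover: six modems start two per carrier,
# and any modem drops to the next carrier on the list if its network fails.

CARRIER_PRIORITY = ["Verizon", "AT&T", "T-Mobile"]
DEFAULT_ASSIGNMENT = ["Verizon", "Verizon", "AT&T", "AT&T", "T-Mobile", "T-Mobile"]

def reassign(current_carrier, available):
    """Keep the current carrier if it still works, else take the best available one."""
    if current_carrier in available:
        return current_carrier
    for carrier in CARRIER_PRIORITY:
        if carrier in available:
            return carrier
    return current_carrier  # nothing reachable; keep retrying the same carrier

# Example: on the 18th green Verizon drops out entirely.
available_here = {"AT&T", "T-Mobile"}
print([reassign(c, available_here) for c in DEFAULT_ASSIGNMENT])
```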

Bill Hogan:

Questions?

Jim Jachetta:

I think there was a question back there.

Bill Hogan:

Joe, let me get you the mic here. Hold on a second. You got a big question here.

Speaker 4:

Oh, big question. Yeah, thanks. I noticed you said that you had a strong relationship with AWS to haul back and I did not see Zixi in your numerous protocols there. And that seems to be the up and coming and I think it’s their protocol. Any comment on that?

Jim Jachetta:

Indirectly. On Zixi: AWS loves RIST and they love SRT; those are the two protocols. Most broadcasters have an account with AWS Elemental. You remember the Elemental group? AWS bought them out of Portland. You remember the boxes used to be green; now they glow orange. I used to joke that the green in the Elemental boxes was green for money, their boxes were so expensive.

Speaker 4:

I’ve operated a couple of them.

Jim Jachetta:

Yeah, yeah. It's good stuff. I think Amazon bought Elemental for the network that they built on the AWS cloud. And Elemental, when they were acquired by AWS, they're like, your egress fees are insane; no one's going to use this. What do I mean by that? AWS is meant for cloud computing. You can push data into the cloud practically for free.

You want to bring anything out of the cloud? Ha, they've got you. So, Elemental: if you use the MediaConnect network, the rates are like 70 or 80% less than what AWS normally charges, if you're a MediaConnect member. But you're right. Zixi is a proponent of RIST, so AWS supports Zixi indirectly. I don't think AWS will actually take a native Zixi stream; it would have to be a RIST stream generated by Zixi. And I asked the Haivision guys... I mean, RIST is growing in use. We all know SRT; it's a de facto standard, right? It's not a standards-committee-ratified standard. NDI is the same way; it's a vendor-created standard that's been open sourced. RIST, I believe, actually is a ratified standard. They each have their place; there are a lot of similarities between RIST and SRT. So does that answer your question? I'm 99% certain a native Zixi signal is not going to make it through AWS; Zixi would have to be in RIST mode to do that.

Speaker 3:

You’re not Zixi averse.

Jim Jachetta:

Yeah.

Bill Hogan:

Other questions?

There’s enough acronyms to go around I think.

Jim Jachetta:

Yes. Yeah. RIST, what is it? Reliable Internet Stream Transport, something like that. SRT is... I should know this. Geez. Secure Reliable Transport, there you go. SST is Safe Streams Transport. I was joking with Carlos earlier: when Haivision bought Aviwest, they're like, oh, your protocol's called SST, we do SRT, it kind of rhymes. They go together; we can work together. SST for the first mile, SRT for the long haul.

Bill Hogan:

Jim will be around and I think thank you very much.

Jim Jachetta:

Thank you for having me.

Bill Hogan:

And most interesting.

Jim Jachetta:

Thank you for dinner.

 

Get the Guide: Solving Big Production Challenges with the Right Solution for Every Job

At VidOvation, we love a challenge. Have you ever faced a problem like one of these? Live remote production in rural areas where cellular coverage drops regularly, so you miss important shots. Cameras are mounted in spots where they withstand damage almost daily,...