
Flexible Live Remote At-Home Production for the New Abnormal [Webinar Replay]

Learn how to maintain frame-accurate video genlock and lip-sync across multiple handheld cameras in your live remote field production

Learn about:

  • Learn how broadcasters and producers are finding more efficient ways to contribute and distribute live television
  • Learn to overcome the additional challenges in the new abnormal with travel restrictions and social distancing.
  • Learn how to maintain frame-accurate video genlock and lip-sync across multiple handheld cameras
    • Over Unmanaged Networks
      • Cellular
      • The Public Internet
    • Over Managed Networks
  • Learn how to mount the transmitting audio and video encoder on the camera or talent
  • Learn the importance of analog audio inputs on your field encoder
  • Learn how labor costs are reduced when television production specialists work at a centralized master control near where they live, eliminating travel and overtime expenses
  • Learn how the latest at-home production technology eliminates the need for expensive satellite, fiber and telecom connections
  • Learn how it also eliminates the need for on-site production trucks with dozens of on-site operators and personnel.
  • Learn how to have most of your crew work from home or a centralized master control near home.
  • Learn how this technology has been successfully implemented by the PGA, Turner Sports, Viacom, CNN, Fox, the Discovery Network, Major League Fishing, and more
At-home Production

 

Jim Jachetta, CTO at VidOvation, Engineer, Design Consultant, Integrator, Trainer, Teacher, Author & Speaker

With more than 25 years of experience in designing, integrating, and delivering video transmission and distribution systems, Jim Jachetta is the driving force behind VidOvation’s world-class technology that turns the impossible and the never-been-done-before into viable solutions within your daily business operations. Using modern, easy-to-support technology, Jim and the talented VidOvation team creatively design, implement, and support wireless, cellular, IPTV, video-over-IP, and fiber-optic installations that meet your organization’s business goals, at a price point that fits any size, scope, and budget.

Transcript:

Jim Jachetta (00:00:01):

Good morning, everyone. I’m Jim Jachetta, CTO and Co-founder of VidOvation. Today, we’re going to talk about flexible live remote at-home production for the new abnormal. So before we get started, I’d like to ask you folks a question, I’d like to do a little poll, just to kind of see where everyone is in their field production journey, where they’re at right now. It also gives time for any stragglers to log in. So let me launch this little question I have for you guys. Unfortunately, I can’t see if it’s on the screen or not, but you should have a poll on your screen right now.

Jim Jachetta (00:00:46):

Basically, we’d just like to see, are you using at-home or REMI technology right now? Or is it something you’re planning in the future, or you’re just curious today but have no real plans of implementing at-home production? Definition of terms is always slightly different. When VidOvation and our partners would talk about at-home production, we really mean multi-camera. We’re not talking about a news commentator going out with a single camera, either with a camera op or with a tripod, going out by themselves. Doing that type of a remote production is very commonplace.

Jim Jachetta (00:01:38):

Where it gets tricky, and what we’re going to talk about today, is some of the challenges that VidOvation and our partners have overcome when you’re doing multi-camera. We’ll talk about maintaining gen-lock and lip-sync. So it’s really the multi-camera aspect. Okay. So let me see. I always hit the wrong button here. So, manage the poll. Oh no, no. I see. I hit the wrong button. I need to close it. There we go. And share it. You would think the bigger button would be the one they want me to press. The big button is a button I shouldn’t press.

Jim Jachetta (00:02:16):

So, I believe you should see the poll results on your screen. Unfortunately, I can’t see it, but I can read it to you, in case you can’t see it. So 18% of you are already using some form of at-home or REMI production technology. It seems like 53% of you, the majority, are not using it but plan to soon. So hopefully, we can educate you and make you more familiar with the technology. And then it seems just 29% of you maybe are just curious today and have no future plans, but maybe we’ll enlighten you, maybe some ideas we share today will resonate with you.

Jim Jachetta (00:03:02):

Right. So let’s get started. So, broadcasters, sports networks, news networks, any kind of a television operation, they’re not charitable organizations. They’re for-profit, they’re always looking for ways to cut costs, to maximize advertising revenue, so any kind of a technology that can save in the contribution, distribution, and production of an event, news, sports, live reality, whatever it might be… We’re going to talk about how this technology can save money, can drop a significant amount of money to your bottom line.

Jim Jachetta (00:03:53):

And then what I alluded to before we started, we’re going to talk about how this tech can help, during these new abnormal times, with travel restrictions, personal distancing, social distancing. I think we’re all now working remotely in all walks of life. It’s not uncommon now, when you watch the evening news, one anchor is far left, the other anchor is far right, but the weather person is working from home. They have a green screen, they have a high speed internet connection, they’re probably using some form of at-home production technology to achieve that, keep everything in lip-sync.

Jim Jachetta (00:04:42):

So some of us may like to continue working from home when all of this is over. I also touched on this too, we’re going to discuss and learn more about how to maintain frame-accurate video gen-lock and lip-sync across multiple handheld cameras. There are solutions where you bring all your cameras to a rack or a box, in your truck, or in a central location, and then have a frame-accurate transport back to master control. But that’s wired, the cameras have to be tethered. This is completely untethered, the encoder’s mounted on the camera. I’ll discuss more about this. So it’s multi-camera, at-home production. Multi-camera field production at home, where there’s very few personnel.

Jim Jachetta (00:05:41):

In many of our applications, the only personnel on site are the camera operators. Everything is home-run: graphics, the production switcher, the producer, the technical director. All those people are back at master control in a centralized location. What is some of the infrastructure we use to transport these signals? Primarily, it’s over unmanaged networks. Unmanaged networks, in some cases, are low cost, or even free, or almost free, and we’re referring to cellular and the public internet.

Jim Jachetta (00:06:19):

So most of the applications we’re going to discuss today were done either through the public internet or cellular, or a combination of both. A common question we get, “Well, is cellular and the public internet as reliable as satellite?” Many of our customers can attest, from Turner Sports to the PGA, that in some cases, it’s more reliable than satellite. And then, of course, the tech can be used on a managed network if you have a Metro fiber optic ethernet link, or you’re working with Level 3, or our friends at The Switch, or working with your local telco provider, you have an MPLS network. Of course, this tech would work on that.

Jim Jachetta (00:07:04):

And then you could use the cellular or the public internet as a backup. So there’s always a place, there’s always… We encourage you to call VidOvation. Call our customer service team, they’ll engage you with our engineering team, and we’ll design a system that fits your application, and more importantly, your budget. So, we’ll talk more about the importance of mounting your field encoder, your encoding transmitter, on the camera or on the talent. We’re going to discuss why that’s so important and why that’s such a big benefit.

Jim Jachetta (00:07:42):

We’ll talk about how the tech reduces labor costs. Your specialized worker, your instant replay operator… We do a lot with sports and reality TV. Reality TV, surprisingly enough, is treated a lot like live sports. They’ll use an EVS instant replay system, like they do in sports, and those operators are expensive and there’s not that many of them. So some of these operators, they can only travel and do a few games or a few events a week. If they work from home or from a centralized location, they can do several events in a day. They could do a live reality show in the early afternoon and do a sporting event in the evening, or an early sporting event.

Jim Jachetta (00:08:37):

So they could be in California, do an afternoon ball game, and then do a local game in their own state in the evening, and do two or three events in one day, as opposed to two or three events in one week. So this eliminates the need for travel, need for overtime. You have to pay your employees per diem when they are traveling. People are telling me they can’t see my camera. Ah, look at that. I forgot to turn off the polling results. So the slides were just text, so you didn’t miss anything. And then you couldn’t see me, I’m sorry. But sorry about that.

Jim Jachetta (00:09:29):

Cindy, this is why I need you to be my producer, mate. Maybe it’s time where I have you help me produce these. So, I’m just talking over these bullets, but thank you, folks, for chiming in. So you have to put an employee on a plane, you got to pay for the plane ticket, you got to feed him or her, you got to put them in a hotel, then long hours onsite, where frankly, a lot of those hours, they’re not working. It’s preparing, it’s rehearsals. So this way, your EVS operator’s only on the clock maybe an hour before the game for rehearsal. The game is over, they’re done. They’re not still on the clock. You’re not paying them for their travel. So it really makes sense on so many levels to use this at-home technology.

Jim Jachetta (00:10:21):

You can also learn how the technology eliminates the need for satellite or an expensive fiber or telecom connection. The bonded cellular and at-home production technology makes the internet and cellular, or a combination of both, highly reliable. The tech is what smooths out the bumps that you might find in the internet or the cellular. Oh, let me see. Are the survey results still up? They’re still up, I guess. Sorry about that. Our friends at NEP and Game Creek… I have images on our website of a truck with a line through it, I’ll get to that.

Jim Jachetta (00:11:25):

But I don’t think trucks are going to go away; maybe the trucks get parked in a central location. They are mobile master controls, they’re portable control rooms. So maybe “they take the wheels off the truck,” and the trucks sit in a central location. They can do more than one game or one show in a day to help with costs. Or maybe there are more trucks, but smaller trucks. But work we’ve done with the PGA has eliminated or drastically reduced the size of the truck, and the associated personnel that go with the truck. That adds to the cost and to the logistics.

Jim Jachetta (00:12:14):

So you have your crew work from home or a centralized location, I already discussed that. So we’re going to talk about the technology that we’ve implemented with the PGA, Turner Sports, Viacom, CNN, Fox, Discovery Network, Major League Fishing, and more. So, let’s talk about the PGA. So we have some ongoing… PGA has engaged us for their events for the rest of the year. They’re using our at-home production technology for all of that. The event I’m going to talk about today was the first… Well, actually, it was the first event that we did during COVID with PGA. We had worked with PGA in the past.

Jim Jachetta (00:13:06):

So once COVID started, there were big restrictions. It was the skins game. Actually, I think it was the first sporting event to come back after the lockdown, or during the lockdown, after February. So it was a big deal. There was a restriction set at the Seminole Golf Club: the local health authority said you can’t have more than 50 people. I don’t know how they came up with that number, but that was determined, and it included the players, support staff, officiators, the TV crew. That was it, 50 people. And the Seminole Golf Club was very unique in that they’d never done a PGA event there. They’d never done a live event, they’d never televised there.

Jim Jachetta (00:14:05):

Somebody is telling me the survey screen is still up. The survey came back up. Okay. That’s weird. So why does that keep happening? All right. Hopefully, it doesn’t happen again. Sorry about that, folks. I must have clicked on a key. So you haven’t missed too much. So 50 people was the limit. So to do a production, you normally need one or two or three production trucks. One truck does graphics, one truck produces the show. So it’s not uncommon where you need one or several tractor trailers, and there could be 20 or 30 or 40 people associated with each trailer. So how do we stay under this 50 person limit?

Jim Jachetta (00:14:55):

So the game was at Seminole Golf Club. There’s no fiber connection, there’s no Level 3 fiber circuit, there’s no telco circuit, there’s no telecom. If they did this old school, they probably would have brought a satellite truck. And then if you’re going to bring a satellite truck, you might as well bring the production truck, too. So there was no fiber optic infrastructure in or out of this venue, so temporary satellite or cellular were really the only choices. Let’s see. I think it keeps popping up on me.

Jim Jachetta (00:15:38):

So the game was in Southern Florida, and PGA headquarters and master control is in St. Augustine. So there was some talent, some of the play-by-play was in St. Augustine. And then Mike Tirico lives up in Michigan. He didn’t want to travel. I think the story was he has a child with a compromised immune system, I believe. I apologize if I got that wrong. But there was some reason why he didn’t want to travel, he didn’t want to risk it. So using the at-home technology, we were able to tie this all together.

Jim Jachetta (00:16:11):

Now, you might ask, “Well, St. Augustine is pretty close to Seminole, so that’s probably pretty easy to do.” If you’re going through the public internet or cellular, whether you’re a thousand miles away or 10,000 miles away, it really doesn’t make a difference. When we did the Ryder Cup last year, with Turner Sports, it was a good 8,000 miles away. The event was in the Paris, France region, and we back-hauled 20 camera feeds through the public internet, with perfect gen-lock and perfect lip-sync, back to Atlanta. So the fact that this all took place in Florida really doesn’t matter.

Jim Jachetta (00:16:50):

So PGA had two camera operators in the tee-off box, and you can see one has a Toptracer. So what is that? That’s that red line that follows the ball after it’s been hit through the air, so you see the trace of the ball. I learned something new, I didn’t realize this: the Toptracer telemetry actually goes through an audio channel. So I’ll get into it in a little more detail, but all of the Haivision field units have analog audio inputs. Otherwise, you’d have to use an audio embedder, or feed the audio through the camera, to get it into the unit.

Jim Jachetta (00:17:36):

So for the PGA, this worked out really well. Having analog audio inputs, they just fed the Toptracer telemetry into that. I’m not an expert at Toptracer, but the telemetry then goes to a control unit back in the studio to put the graphic over the video. Then there were two more cameras on the fairway, and these are… The encoder is mounted on the camera. I’ll show you some pictures. Again, another Toptracer telemetry feed, and then two more cameras on the green. These cameras kind of rotate. As a group goes out from the tee box, they follow, and then different groups of golfers go out. But this was kind of like… I guess this is the starting position of the cameras.

Jim Jachetta (00:18:27):

Then there were two POV beauty shots. This club was near the beach, so there were beautiful shots of the beach and the sand dunes, and some other POV, another shot of the clubhouse. So the units on the fairway were the Haivision PRO380s with eight cellular modems. So we have two Verizon, two AT&T, two T-Mobile and two Sprint. That’s for video and audio transport. Eight modems give us the highest level of reliability. But these beauty shots worked great with two cellular modems. The AIR320, it’s an HEVC codec, just like the 380. A five megabit bit rate with HEVC is what PGA chose, so we ran… The AIR320 was able to achieve five megs with only two modems.
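
To make the carrier mix concrete, here is a minimal sketch in Python that describes a field kit like the one above as plain data. The field names and values are illustrative only; they are not Haivision's configuration API.

```python
# Hypothetical description of a PGA-style field kit; the keys are illustrative
# and are not Haivision's actual configuration API.
pga_field_kit = {
    "fairway_units": {
        "model": "PRO380",
        "modems": ["Verizon", "Verizon", "AT&T", "AT&T",
                   "T-Mobile", "T-Mobile", "Sprint", "Sprint"],  # eight modems, four carriers
        "codec": "HEVC",
        "video_bitrate_mbps": 5,   # the rate PGA chose for this event
    },
    "beauty_shot_units": {
        "model": "AIR320",
        "modems": ["Verizon", "AT&T"],   # two internal modems
        "codec": "HEVC",
        "video_bitrate_mbps": 5,
    },
}

# Quick sanity check: how many bonded cellular paths does each unit have?
for name, unit in pga_field_kit.items():
    print(name, "->", len(unit["modems"]), "cellular paths")
```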

Jim Jachetta (00:19:27):

So it’s a small… I have pictures of the actual units. Then they had two commentators walking the course with the players. So this was really cool. The commentators don’t carry a camera, they carry a microphone to interview the talent. I have some still shots of that. So, how do we get that microphone audio back to master control? Well, the commentator chose to use the shoulder-mount little pouch, a little sling. He had a microphone preamp on his belt, fed the analog audio into the Haivision AIR320 because of the analog audio inputs, and you could hear the interviews he did with the players while they were on the course.

Jim Jachetta (00:20:16):

It’s not uncommon that cameras were… The players were all wired with lav microphones and lav wireless transmitters, and the lav receiver is on one of the cameras on the fairway, or several of the cameras on the fairway, because they may rotate. So if one camera operator goes away and he comes into range of another camera operator, you might have multiple lav receivers on the cameras, in case one camera gets out of range. So you have some diversity that way. Again, that analog input on the Haivision 380 or AIR320 proved invaluable for these lav mics.

Jim Jachetta (00:21:02):

Here it is. So Dustin Johnson, Rory McIlroy, and the other players all had lavs on them. And then, since there are two analog audio inputs, an operator stayed nearby, holding an AIR320, so it’s wireless from the player to the AIR320, and then the four channels of analog audio were sent back to master control using two of the Haivision AIR320s. Then parabolic mics. So you see how many microphones are used in a sporting event? This is typical. Whether it’s football, baseball, basketball, you have different microphones, different angles. Crowd mics, parabolic mics to try to catch the conversation in the huddle. In this case, parabolic mics catch some of the ambient sounds on the course, et cetera. And then they used a traditional microwave link to get a camera feed from the plane overhead down to the ground, and then another PRO380 was used to home-run that video back to St. Augustine and master control.

Jim Jachetta (00:22:15):

So here are the two units. So on the top here, we have the PRO380, the bigger unit. It has four antennas on the top, four antennas on the bottom. The AIR320 has two antennas. It’s got kind of like these little… I’m calling them ears, these little ears off the back of the unit where the antennas are hidden, and two internal modems. My colleagues and I, and our friends and colleagues at Haivision, are asked all the time, why does this technology seem to work better? Other people do bonded cellular. Well, it starts with the antennas. Better quality antennas; they’re patented, high-gain antennas.

Jim Jachetta (00:22:58):

You can’t just amplify the RF. Cellular RF cannot be amplified; that’s not legal. But you can design an antenna with more gain, that’s allowed. So it starts with the antennas. And then the next magic is the modems. We’ve had customers like, “Well, I have brand X, and then I bring in the Haivision. Brand X has no cellular connection. What’s going on? And then your unit comes on, and we see signal.” That has to do with the modems. Higher sensitivity in the modems, better antennas. But the modems Haivision uses catch all the cellular bands, without exception, and this is globally.

Jim Jachetta (00:23:42):

Haivision puts a big investment in the modems; they’re three or four hundred dollars apiece. I jokingly tease the Haivision engineers that they over-engineer the product, but in a good way. This is a case of that: not skimping, over-engineering, putting the best possible modems in there. If you go into the administrative screen, or look inside the unit or on the receiving dashboard, you’ll see a little number like B12, B6, B14; that’s the band. We’ll often see one Verizon modem connect to band 6, the other connect to band 12, and the unit does that on purpose, for better diversity.
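
As a rough illustration of the band-diversity behavior described above (two modems on the same carrier landing on different bands, so one congested band cannot take both paths down), here is a small Python sketch. The selection logic is a simplification for illustration, not the actual modem firmware.

```python
# Simplified illustration of band diversity: when two modems see the same
# carrier, prefer putting them on different LTE bands (e.g., B6 and B12)
# so congestion on one band does not affect both connections.
def assign_bands(modems, available_bands):
    """modems: list of carrier names; available_bands: carrier -> list of band IDs."""
    used = {}   # carrier -> set of bands already taken by a sibling modem
    plan = []
    for carrier in modems:
        taken = used.setdefault(carrier, set())
        # pick the first band on this carrier that no sibling modem is using yet
        band = next((b for b in available_bands[carrier] if b not in taken),
                    available_bands[carrier][0])
        taken.add(band)
        plan.append((carrier, band))
    return plan

bands_seen = {"Verizon": ["B13", "B4", "B2"], "AT&T": ["B12", "B2"]}
print(assign_bands(["Verizon", "Verizon", "AT&T", "AT&T"], bands_seen))
# -> [('Verizon', 'B13'), ('Verizon', 'B4'), ('AT&T', 'B12'), ('AT&T', 'B2')]
```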

Jim Jachetta (00:24:28):

So not only is it connecting to the same network, it’s connecting to different bands on that network. So if one band should suddenly get congested… So it’s these little things that really set this tech apart. So as I discussed, the flagship unit, the PRO380, has eight modems; the AIR320 has two. So, why did this tech work so well for the PGA, for the Ryder Cup, for Live PD? It comes down to what Haivision calls Safe Streams Transport, or SST for short. So what is this?

Jim Jachetta (00:25:07):

Everyone has some sort of a protocol for transport. There’s SRT, there’s RIST, there’s different protocols. Haivision’s protocol has proven to be virtually bulletproof. Some of the techniques are used by others, but it’s all in the implementation. So there’s a layer of error correction, there’s automatic repeat request, or ARQ, and adaptive bit rate control. We don’t try to jam five megabits down a pipe that can only support two. The unit will adapt. But the most important feature for at-home production, for live reality TV, for sports, for multi-camera field production, is the ability to maintain frame-accurate gen-lock and lip-sync.
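
The ingredients Jim lists (forward error correction, ARQ, adaptive bit rate) are generic transport techniques. To give a feel for just the adaptive-bit-rate piece, here is a toy Python controller; it is an illustration of the concept only, not Haivision's SST implementation.

```python
# Toy adaptive bit rate controller, for illustration only (not Haivision SST).
# The encoder target tracks what the bonded links can actually deliver,
# instead of jamming five megabits down a pipe that can only support two.
def adapt_bitrate(current_kbps, delivered_kbps, min_kbps=500, max_kbps=5000):
    if delivered_kbps < 0.9 * current_kbps:
        # Path is struggling: drop quickly toward what is actually getting through.
        return max(min_kbps, int(delivered_kbps * 0.8))
    # Path is healthy: creep back up toward the configured maximum.
    return min(max_kbps, int(current_kbps * 1.05))

rate = 5000
for measured in (4900, 2100, 1800, 2500, 4000, 5200):
    rate = adapt_bitrate(rate, measured)
    print(f"measured {measured} kbps -> new encoder target {rate} kbps")
```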

Jim Jachetta (00:25:58):

I like to say they implemented their own version of Precision Time Protocol. Obviously, it would have to be different from PTP, because with PTP, you control the clocks in all your switches across the network, all your devices. But with Haivision, the receiver acts as the grandmaster clock, and reference signals are sent to the transmitters to get them all in sync. Remember also, the transmitters are hooked to GPS. If not GPS, they’re hooked to the time clock of cellular, which is probably also hooked to GPS. So the units are close, they’re within maybe a hundredth of a second right out of the chute from a time clock standpoint, not even a frame apart.
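
For readers who want to see the timing principle (the receiver acting as the reference clock and each transmitter measuring its offset against it), here is the textbook two-way time-transfer calculation that PTP- and NTP-style protocols use. This is the generic formula, not Haivision's proprietary implementation.

```python
# Classic two-way time-transfer offset estimate, as used by NTP/PTP-style
# protocols. This shows the generic principle, not Haivision's implementation.
# t1: request sent (transmitter clock)   t2: request received (receiver clock)
# t3: reply sent (receiver clock)        t4: reply received (transmitter clock)
def clock_offset(t1, t2, t3, t4):
    return ((t2 - t1) + (t3 - t4)) / 2.0   # transmitter clock error vs. receiver

def path_delay(t1, t2, t3, t4):
    return ((t4 - t1) - (t3 - t2)) / 2.0   # one-way delay estimate

# Example: the transmitter's clock is running 40 ms ahead of the receiver,
# with 50 ms of one-way network delay.
print(clock_offset(t1=100.000, t2=100.010, t3=100.011, t4=100.101))  # ~ -0.040 s
print(path_delay(t1=100.000, t2=100.010, t3=100.011, t4=100.101))    # ~  0.050 s
```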

Jim Jachetta (00:26:51):

So many of our competitors, they’re a couple of dozen frames off, and in live production, that really doesn’t work. If the audio is a couple of frames off, you’re going to get “hello, hello,” or “whoosh, whoosh,” when they strike the golf ball. It would drive the audio engineer crazy. They’d have to be muting channels that are out of sync and trying to find the audio that’s in sync.

Jim Jachetta (00:27:17):

Because you can imagine, on the slide before, there were two, four, six, eight… 10 or 12 cameras filming this group of four golfers. Each of those cameras has open microphones, you’ve got open mics on the parabolic microphones, you’ve got the lavs on the players. The show would be unwatchable if that audio was all out of sync. When you cut between cameras, you’d see whoosh, whoosh. Why are they hitting the ball again? They just hit the ball a second ago. But more importantly, you’d hear this audio out of whack, out of sync. So that’s what’s really important.

Jim Jachetta (00:27:56):

So here are some of the stats. PGA chose to shoot 1080i59.94. These units can do 1080p59.94, 720p. The unit can actually go up to 20 megabits per second. With HEVC, the high efficiency codec, 20 megs probably would be overkill. They chose five, which I think is respectable. Also, the Haivision encoder has proven to be about 30% more efficient than many of the other HEVC encoders out there, so you could say that this transmission was analogous to a seven or eight megabit transmission.
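
The "seven or eight megabit" comparison is just the quoted 30% efficiency figure applied to the 5 Mb/s rate; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the "seven or eight megabit" claim: an encoder
# that is ~30% more efficient needs ~30% fewer bits for the same quality, so
# 5 Mb/s behaves roughly like 5 / (1 - 0.30) Mb/s on a less efficient encoder.
chosen_rate_mbps = 5.0
efficiency_gain = 0.30
equivalent_rate_mbps = chosen_rate_mbps / (1 - efficiency_gain)
print(f"{equivalent_rate_mbps:.1f} Mb/s equivalent")   # ~7.1 Mb/s
```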

Jim Jachetta (00:28:39):

The video quality of the hardware encoder that Haivision uses is unbelievable. The picture quality is beautiful, and… Shooting golf is not easy. You got a lot of fine blades of grass, you got a lot of color gradients, different colors in the grass as the wind blows through it. So sports, in general, is not very forgiving if you don’t have a good HEVC codec, and Haivision certainly has that.

Jim Jachetta (00:29:13):

So let me keep going. So here’s what the units look like. So you see on the top picture here, most of our customers like to mount the unit on the camera. So Haivision wasn’t the first to come up with the idea of putting an accessory between a camera and a battery. It was probably first done with microwave gear, wireless RF gear, short range stuff. But they were the first to do it with cellular, and they started doing it about seven or eight years ago. Others have followed suit, but I think right now, today, all of the newer units out there want you to put them in a backpack… There’s a battery plate on one side, but then there’s nothing on the outside; you have nowhere to put the battery. So your only choice is to put it in a backpack.

Jim Jachetta (00:30:09):

Not that I’m against backpacks, but when I’ve been in the field, I see people have a tendency to put the backpack on the ground. They don’t want to carry it, they don’t want to have it on their back. If you have a wireless apparatus, do you want it below the crowd, below the people? Having it up high, on an operator’s shoulder, you’re going to get a better signal from the tower. So really, we feel this is the only way to go, to put it on the camera. But if you see in the bottom photo, you’ve got a smaller camera, a prosumer or a smaller broadcast cam, that doesn’t have an Anton Bauer or V-Lock plate.

Jim Jachetta (00:30:46):

So then we do make a very nice compact backpack. You can even hide a second battery in the lower compartment and feed that into the power input. It can run off of two batteries, where one battery is like a backup. Then on the right here, we have the AIR. So, again, with a smaller camera, you see the AIR on the top right here, it’s in its little pouch. You can wear that over your shoulder. We don’t have a photo of it, but Haivision makes a very nice belt clip. I think of it like a military or a law enforcement accessory. It actually clips into place, it locks, so it won’t fall off.

Jim Jachetta (00:31:27):

So if you’re doing live reality or news, and you’ve got to run after someone, the unit’s not going to fall off your belt onto the ground. Or you can mount the unit. It’s got a quarter-20 thread on the bottom, so you could put it on top of an accessory shoe on top of your camera rig that way, if you so choose. So here’s another shot, close up. So you mount it right on the camera, another shot showing it in action. Many operators, what they like to do is they’ll have their camera rig in a PortaBrace bag, and you get one a little bit longer, so it’s intended for accessories being on the back of the camera, whether it’s the battery or RF or cellular.

Jim Jachetta (00:32:16):

So you have the unit mounted on the camera, you got your intercom, the lapel receiver’s already wired for your talent. Everything’s ready to go, it’s all been tested. The video output of the camera’s connected to the video input of the PRO. So you take it out of the bag, you slap a battery on, or the battery is already on, you turn it on, 30 seconds, you’re transmitting. You don’t have to, “Oh, let me get the bag. Oh, let me unzip the bag, so I can turn it on. Oh, let me hook up the cable. Do I want to wear it? Do I want to have it on the floor?” Boom. I’ve seen operators. You take the camera right out of the bag, you’re shooting within 30 seconds. There’s no cabling, there’s no setup. You’re just ready to rock and roll.

Jim Jachetta (00:33:03):

So again, this was the first system to mount on the camera. V-Lock tends to be the battery of choice in Europe. Most of our VidOvation inventory here in the US tends to be Anton Bauer. I’d say 80, 85% of camera operators are using Anton Bauer here in the US, but we support both Anton Bauer and V-Lock. You can purchase or rent systems either way. Some of these things, I touched on. So there are smaller portable systems out there. None of the newer systems seem to be able to mount on the camera. They tend to be 50% bigger than the Haivision system. This is a very tight… very, very tight integration. You can see, it’s a very small package, very narrow package.

Jim Jachetta (00:34:03):

I’m not disparaging the backpack. If you prefer to wear the backpack, we have backpack options. And then again, we come to the analog audio inputs. I don’t want to… Yeah, here we go. So here are some closeups. So even on the smaller unit, you can see there are two mini XLRs. So you can purchase… If you want to make your own cabling, you can… It’s just a standard mini XLR, so you can wire up your own cables, or you can buy these adapters from us, mini XLR to full-size XLR. These are line-level inputs, so you will need a microphone preamp of some sort to feed into the unit. But that’s usually desirable. This way, the operator can adjust levels. Oh, what happened? Oh, no, that was it. The operator can adjust the levels. Looks like I repeated a slide there.

Jim Jachetta (00:35:07):

So that kind of ties up the PGA application. Let me touch on some other applications we’ve done. I alluded to Turner Sports. We’ve done two live events with Turner: one was the Ryder Cup, and then we did a football, or soccer, tournament. We’ve done work with the PGA, we’re doing live reality shows on A&E, Fox, Discovery, Live Rescue. Live PD has been put on hiatus, it should be coming back soon. First Responders Live. Some people say, “Oh, these live reality shows, that’s not real broadcast.” Well, I don’t think… Well, unless it’s NASCAR, we… Some of our encoders have been used in professional racing, motor racing, and we’ve had very good results, even at high speed.

Jim Jachetta (00:36:06):

But if you watch some of these reality TV shows, the police cars, the ambulances, the firetrucks, they’re going over a hundred miles an hour down the street, and we maintain video lock, even with a unit jumping from tower, to tower, to tower. So pretty much whatever your environment is, whether it’s a crowded stadium or a high speed chase, something about the Haivision technology will give you very reliable and robust results. In the crowded stadium, it’s having those better antennas, higher gain antennas, better modems, grabbing more bands… So the fact that the 380 and the AIR can grab more bands means not everyone in the stands is going to be flooding all those bands. So you can grab a band that’s not being over-utilized.

Jim Jachetta (00:37:00):

So here’s a little overview of what we did with Turner Sports at the Ryder Cup. So they had 16 ISO camera feeds from the course. They did have a truck or a smaller vehicle. They did camera shading onsite. So when I said earlier, maybe NEP, Game Creek, your mobile production guys… I think eventually, we are working… we are coming out later this year with some camera control, low-latency camera control capability, to run through the cellular and the public internet. So it’ll be there, where we won’t need to have a shader onsite, or need to have the cameras run with auto iris. So the technology is coming, we’re in beta on some of this. But for this particular event, they chose to have a small truck there, they did some shading locally, onsite, but then after that…

Jim Jachetta (00:38:00):

That feed was… 16 cameras were ISO’d and sent back to Atlanta, and then they wanted four program feeds coming back out, confidence feeds. There were commentators in Paris and commentators back in Atlanta, so they needed video back and forth, so we were able to achieve that, and we did it all through the public internet, and a single internet connection. I would’ve slept better the weekend of the event if they had two internet service providers. So they used the… I think we have some pictures of it. Oh, I don’t have a picture of it. But they used the Haivision HE4000. It’s a four-channel HD or a single-channel 4K appliance. It has two LAN connections, so they could have hooked the…

Jim Jachetta (00:38:52):

Oh, somebody said the poll graphic is back. What is going on? Or is that an old message? That would be an old message. So it has two LAN connections, so you can have internet service provider number one on LAN number one and an MPLS network on LAN number two. For the Ryder Cup, they just did one connection; they didn’t have the budget for it. If they had two, it would have bonded between the two. There also is an option for external cellular antennas and modems that connect via the USB port. So you can add cellular, and if the units are in your truck, these cellular antennas and modems, it’s called the quad [inaudible 00:39:36], can be put on the outside of your vehicle. But Turner chose not to use any of that. We didn’t drop a single packet during the live event, so it worked really, really well. Everyone was very pleased.

Jim Jachetta (00:39:50):

So here’s a picture of the control room. Yeah, it does look like they did have a truck onsite, so that they’re probably doing the shading in that truck. Then I talked about our live reality shows. Actually, me and my team, we get a lot of orders. We encourage customers to do a demo, and we have demo… At any given moment, we have several million dollars worth of Haivision gear in stock for rentals, for demos, et cetera. But often enough, we get… a customer’s like, “Hey, I watched Live PD this weekend, I saw your technology in action. I’m sold. Where do I buy? Where do I sign up?” So just watching Live PD is a great recurring demo to watch the tech.

Jim Jachetta (00:40:38):

You can see on Live PD, they’re using smaller cameras. With a bigger ENG-style camera, the unit mounted on the camera’s not practical. Getting in and out of the car quickly, they’d knock the lens off the front of the camera. So you see how they’ve got the camera out in front of them. Lower left, this operator chose to carry the unit in the backpack, and he’s wearing it on his chest, which also helps to get in and out of the vehicle. If it’s on your back, you’re going to get hung up. You can’t sit in the car. The photog in the top right, he’s wearing a bulletproof vest provided by Big Fish and A&E, and there’s a pocket that they made in the front of it, so he’s got the unit hidden in a pocket in the front of his bulletproof vest.

Jim Jachetta (00:41:37):

Every photog sets it up how they like. You can see the picture top left is kind of hard to see in the shadows. I can see the guy’s got the unit exposed, strapped to his chest. I see the antenna and the battery right there, so he’s got the unit just strapped right here on him. So whatever works. Your camera operators will figure out a way to mount it and use it comfortably. So very much like sports, the raw video comes in, but there is no live, live element to the Live PD show. In sports, there’s the live action, and then there’s EVS systems and EVS operators that do the replay. All of Live PD is some form of replay, delayed by a few minutes.

Jim Jachetta (00:42:35):

So all the content is dumped into EVS, and you can see there are operators working in each city. A lot of the content coming in is probably not show-worthy; it’s the cop driving to the call or waiting for a call, he’s on a call, he or she’s on a coffee break. So they’re looking for the action, so they’re putting metadata in, which… It’s very similar to sports. While the game is happening, people are putting metadata in. Same with news: live news is happening, and people are putting metadata in while the live event is happening.

Jim Jachetta (00:43:15):

So then a line producer is looking at this metadata and it’ll be like… My colleague from France, Florian, my colleague Rick at VidOvation, and I, we’ve all been in the studio several times with Live PD. A line producer will find an interesting clip, and then they’ll cut together a package, as they call it. Again, I’m not a production expert, but they’ll put together the clips for that event that just happened, then they’ll get on the comm and tell Johnny Gonzalez, the director, “Hey, we’re putting a package together. We got shots fired in Tulsa. There’s some activity. So when we break away to commercial, we’re going to put this package up.”

Jim Jachetta (00:44:06):

So then they put the cameras up, they play back the multiple cameras in front of the director, and the director directs now, like it’s live, but now it’s a few minutes old. The FCC actually allows you to call a show live, even if it’s delayed up to 29 minutes. So, we can’t be live, live, for officer safety. That wouldn’t be good. So legally, I think the show has to be delayed at least 15 minutes, so as not to put the officers in jeopardy. But then the director goes, “Okay, take camera one, take camera two,” while it’s playing back from the EVS. So he’s cutting the show like it’s live, but you have that little bit of delay.

Jim Jachetta (00:44:51):

If the show was truly live, it’d be a little bit boring. There’d be moments where nothing was happening. And then, of course, you know how it is, Murphy’s law; when something does happen, it happens in all cities at the same time. It’s almost comical. So while they’re playing back what happened a few minutes ago in Tulsa, well, something happens in Chicago or wherever, so they record that, and that’s how they put the show together. So it’s very much like sports, these live reality shows.

Jim Jachetta (00:45:25):

So here are some of the capabilities of the unit. I hope this doesn’t seem like a commercial, but I want to get into some of the differentiators. Other systems do this, other systems out there. We didn’t invent the idea of going live. So we can go live, we can record, we can forward, we can go live and simultaneously record, we can record and simultaneously forward. These last two, there’s a big differentiator. Other field encoders out there, some don’t have recording. The ones that do have recording, to my knowledge, all of them record the live transmission.

Jim Jachetta (00:46:08):

So when do you need a recording most? You need it when you’re in a dead spot, you need it when the transmission is struggling. I mentioned that the Haivision PRO and AIR have better antennas, better modems, and will grab a signal where other systems can’t, but we can’t materialize a connection that just isn’t there, so there will be those… You’re in a lead-lined basement trying to do an interview, you’re going to struggle to go live. There are dead spots in cellular in the world. Having eight modems on four different carriers, that helps a lot, the better antennas, the better modems.

Jim Jachetta (00:46:46):

But Haivision did this in their first products nine, 10, 12 years ago: they put in a second encoder. Now, these encoder chips are hundreds of dollars, so you don’t just arbitrarily throw a second HEVC encoder chip in there. It costs a significant amount of money. These chips run hot, you’ve got to cool them, ventilate them. You don’t just throw another chip in there. So there’s a second encoder. So when you’re in that fringe area…

Jim Jachetta (00:47:20):

So there was an episode of Live PD, when my colleague Florian and I were in the studio, and it was a big drug bust. They found some cocaine in the wheel of a truck, and the chase had lasted an hour. They’re out in the middle of a corn field, cellular wasn’t great. This was with our older product using H.264, so it got down to about 500K. The audio was clear, the video was soft. 500K video using H.264, it’s going to be a little soft. So Florian and I are there, we were guests in the control room, we were in the back, and somebody kind of mumbled under their breath, “Oh, too bad, we can’t get the recording out of the camera.”

Jim Jachetta (00:48:06):

Most productions, you record in-camera, right? That’s your safety net, so you can piece together a produced show after the live event. Everyone does that, whether you’re news, sports, film, production, cinema, you always record in-camera, even if you’re recording out-of-camera. So somebody said, “Oh, too bad, we can’t get the recorded video out of the camera.” Florian and I lean over to the tech consultant, we’re like, “You put the SD cards in these units?” “Oh yeah, sure. I’m recording in your unit. That’s our second set of… our second safety.” Florian and I go, “When you break away from the action, we can show you how to log into the unit and pull that high resolution recording through.” They didn’t know they could do that.

Jim Jachetta (00:48:58):

Now, obviously, you have to have, in your workflow, the ability to ingest the file. So we had to figure out… We pulled the file through once they had made the arrest… You can’t broadcast live and pull a high resolution file through at the same time. You would steal too much bandwidth. So when they had the guy in cuffs, the scene was… So we remoted into the unit, we stopped the live transmission, we pulled just the clip we needed. You could select the clip. Because Haivision records in a fragmented file, we could just take the clip we wanted. We didn’t need to pull everything.

Jim Jachetta (00:49:38):

We took the clip that we wanted. The anchor said, “Okay, we’re going to go. Things are wrapping up with that drug bust in Tulsa. We’re going to go to Chicago now.” When they were done with Chicago, they broke for commercial and they go… The director went on the comms, or the producer went on the comms, “Hey Dan Abrams, when we come back from commercial, we’ve got a better copy of the recording of that bust in Tulsa. Can you cue that up?” So it comes back, “Oh, ladies and gentlemen, our tech team was able to pull the recording from the field, so we have a cleaner version of that bust.”

Jim Jachetta (00:50:16):

Now, that wouldn’t have been possible with another system, because the recording would have been the same garbage as the live transmission. So because of that second encoder, we were able to get that clean recording on the air. Because of things like this, because the picture looks so good, or because we can pull that file through… In the beginning, people on Twitter were like, “I am a 30 year engineering veteran of RF and cellular, and I know a fake show when I see it.” And the Live PD guys were pretty upset. I’m like, “Hey, it’s a compliment to how good the show is and how good the tech is.”

Jim Jachetta (00:50:54):

Then another new feature I want to touch on is the data hotspot. Now, others offer this. You can buy an internet hotspot, like a Cradlepoint or something like that; they’ll have multiple modems, but it’s more failover. It’s not true bonding. They don’t work well for video. So not all hotspots work well for video. So what Haivision has done, they’ve taken it a step further. It’s not only an internet connection, it’s a secure VPN connection.

Jim Jachetta (00:51:32):

So why is that so important? It sets up a tunnel between the StreamHub receiver in your master control and the unit out in the field. And then you can connect assets, either hardwired via LAN or over WiFi, and have them on the same subnet. So this is really important for trying to do PTZ cameras. I think you’ve noticed in a lot of production now, that weather person doing the weather from home, they don’t want a camera operator in their home, they don’t want a technician coming into the house. They ship a PTZ camera on a tripod, they send our bonded cellular unit, the data hotspot bridge controls the PTZ, and the video transport side sends the live transmission.

Jim Jachetta (00:52:24):

Some people have even hardwired everything together, like in a big Pelican case that they open. The camera’s even in the Pelican case. Like, put this case on a table in front of you, and then master control will frame the camera remotely. All they have to do is power it on, and the unit will power up. The Haivision units even have a function where, when you apply power, they not only turn on, which other equipment, computers, do when you apply power, but they go live. When you apply AC power or put a battery on it, it turns on and goes live, which is very powerful. Down here bottom right, the auto live function.

Jim Jachetta (00:53:07):

5G: we’re starting to deliver it, if you want it. 5G is coming very soon. We don’t have any inventory just yet, but the inventory is coming very soon to the US. We have return video. A lot of people have been waiting for this. Other systems have return video, but it’s a very low frame rate, low bit rate, just for confidence. This is full resolution. If you want, this could be 20 megabits per second. Again, with HEVC, that might be overkill.

Jim Jachetta (00:53:43):

But why is that so important? So, we’re doing a high school football game, and we’re doing at-home production, we’re bringing everything back to the master control. Well, again, this is COVID now, but let’s pretend COVID is over for a second. There are some fans in the stadiums, how do we get replay for the monitor in the venue? We do replay from master control, because we don’t have a truck locally. Usually, the truck is what’s feeding the scoreboard, the master control. There’s a replay coming from there. So that has been a missing link. How do we get the replay up on the big screen in the venue? Or how do we feed a high resolution feed for the teleprompter?

Jim Jachetta (00:54:24):

With those low resolution feeds, low frame rate feeds, the text starts looking pretty fuzzy, the commentator can’t read it. So you want sharp, crisp teleprompter video, or you want really high frame rate, full resolution, full frame video for the replay for the scoreboard. Then Haivision has this mission-centric integration with EVS, [inaudible 00:54:49], other news automation. I’m not a news automation expert, but you can connect into the ecosystem of these mission-centric or newsroom automation systems, and basically when an operator… I think we have another webinar that we did, if you look in our webinar library. If not, we can do another webinar on the mission-centric workflow.

Jim Jachetta (00:55:16):

But the basic concept is an operator might get confused. I’m in New York, I’ve got to do three shoots today. What were they again? Oh, he turns the unit on, and they’re actually numbered. It comes up on his screen. Number one for today is… you shoot at city hall. He clicks on it. Then all the metadata, the video coming in, it’s tagged, it’s identified, it’s timestamped that those were the shots from city hall. Then they stop transmission. “Oh, shoot. Where do I go next? Oh, I’ve got to go to the courthouse.” Click on courthouse, go to the courthouse, everything gets tagged. So the field encoders become part of your news workflow, your mission-centric workflow.

Jim Jachetta (00:56:06):

Florian, I hope I did that explanation justice. So here’s a little bit more about it. So JSON files, mission titles, device names, journalists’ names, metadata. So the mission file, the metadata or the mission file, travels with the video. It all comes together, and you see here, in the picture at the bottom, you select your mission, so you know what you’re doing. And then everything seamlessly comes back to master control. So you see here, your assets on the right, you’ve got the video file, and that’s the president. You’ve got the video, and it’s the storm. You’ve got the video, and it’s the football. So the metadata file and the video file travel together to make everything seamless.
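
Since the slide mentions JSON files carrying mission titles, device names, and journalists' names, here is a hypothetical example of what such a metadata record might look like, written in Python. The field names are invented for illustration and are not Haivision's actual schema.

```python
import json

# Hypothetical mission metadata record; field names are invented for
# illustration and are not Haivision's actual JSON schema.
mission = {
    "mission_title": "City hall press conference",
    "device_name": "PRO380-unit-07",
    "journalist": "J. Smith",
    "start_time_utc": "2020-07-14T14:05:00Z",
    "clips": [
        {"file": "cityhall_cam1_001.mp4", "timecode_in": "00:03:12;00"},
    ],
}

# The sidecar travels with the video so the footage arrives already tagged.
print(json.dumps(mission, indent=2))
```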

Jim Jachetta (00:56:52):

I’m sure we’ve all been there, where you shoot something and you’re like, “This footage looks great, but I have no idea where it was shot, when it was shot, who the people are in the shot.” So this is a very powerful feature. Haivision has a screen on the side of all their devices. For a small screen, it’s very high resolution, so there’s a lot of information that can be seen on the little screen. Another new feature I should mention is… You can kind of see it here in the picture. Actually, you can see… You see modem number one? This unit looks like it’s connected in Europe, so you see [inaudible 00:57:31]. But do you see B20? It might be hard to see on the screen, but where it says 4G, that modem is connected to band 20.

Jim Jachetta (00:57:40):

And then you can see the signal strength, you can see the bit rate going through there. And then you see these little priority indicators, this little kind of graph? So it’s like, high priority, low priority. So, if you have an expensive connection, or maybe one you don’t trust, you set the priorities accordingly. We don’t recommend turning cellular modems off. You want them on, you want them connected, you want them on warm standby. So this feature is very nice. When you set the priority to low, it’ll send a little bit of data through them to keep them open, but the majority will go through your free WiFi or your free LAN connection, or your free internet connection.

Jim Jachetta (00:58:29):

But then if that suddenly hiccups, or somebody disconnects it or shuts you down, the cell modems will instantaneously take over. It will ramp instantly and send the majority of the traffic over cellular. And then when the LAN connection comes back, it’ll slowly go back, and it’ll constantly test the connection that hiccuped and try to reestablish. And it’s all automatic. You set these priorities once, you don’t need to go in there and play with it.
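
Here is a minimal sketch of the priority behavior just described: low-priority cellular links stay warm with a trickle of traffic and take over the moment the high-priority LAN or WiFi path fails. It illustrates the concept only; it is not the actual Haivision scheduler.

```python
# Toy traffic split for a high-priority LAN link plus warm-standby cellular.
# Low-priority links carry a trickle of keep-alive traffic so they stay
# connected, then take over instantly if the primary path fails.
KEEPALIVE_KBPS = 100   # illustrative trickle to keep standby modems "warm"

def split_traffic(total_kbps, links):
    """links: list of (name, priority, healthy); priority is 'high' or 'low'."""
    healthy_high = [l for l in links if l[1] == "high" and l[2]]
    healthy_low = [l for l in links if l[1] == "low" and l[2]]
    alloc = {}
    if healthy_high:
        # Primary path carries almost everything; standbys just stay warm.
        for name, _, _ in healthy_low:
            alloc[name] = KEEPALIVE_KBPS
        remaining = total_kbps - KEEPALIVE_KBPS * len(healthy_low)
        for name, _, _ in healthy_high:
            alloc[name] = remaining // len(healthy_high)
    else:
        # Primary is down: cellular modems take over immediately.
        for name, _, _ in healthy_low:
            alloc[name] = total_kbps // len(healthy_low)
    return alloc

links = [("venue LAN", "high", True), ("modem 1", "low", True), ("modem 2", "low", True)]
print(split_traffic(5000, links))          # the LAN carries the bulk of the traffic
links[0] = ("venue LAN", "high", False)    # someone unplugs the LAN
print(split_traffic(5000, links))          # cellular instantly takes over
```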

Jim Jachetta (00:59:00):

So many customers still… I don’t think satellite’s going anywhere soon. Elon Musk and SpaceX are launching more global internet satellites, so I think there will be some sort of satellite, some sort of IP. The Haivision tech will work with anything. It’ll work over WiFi, it’ll work over one or two LAN connections, and up to eight cellular modems, or a combination of all the above. A customer was just asking me yesterday, “Oh, if we use WiFi, will that mess up the cellular? Do we need to turn the cellular on or off or back on again or cajole it, enable, disable?” It’s all automatic. The unit doesn’t care. You hook something to the LAN connection, and it finds the path over the public internet to the StreamHub receiver. It’s looking for the IP address, that public IP address of the receiver. If it can find that, it will send packets, and it’s all automatic. It all just works like magic.

Jim Jachetta (01:00:06):

So I kind of touched on this, this is a little bit redundant, that the PRO380 has eight modems. There is a PRO340 with four modems. Maybe the four-modem solution becomes prevalent, or we’re going to go to 5G, which is six modems, so I don’t know. We don’t really sell the 340 in the US, it’s really… The flagship for us in the US is the PRO380 and the AIR320. 380 with eight modems and the AIR320 with two modems. So this is what I alluded to. So the PRO3-5G, or the 5G version, the 5G modems are a little bigger, so we can’t fit all eight cellular modems in there, but we can fit six.

Jim Jachetta (01:00:54):

So, fast forward, Sprint has merged with T-Mobile, so I imagine us configuring 5G units with two Verizon, two AT&T and two T-Mobile. It’s perfect. And then the AIR series will have two 5G modems in them. Oh, and I should mention, the AIR only has two internal modems, but it has two SIMs per modem, it’s dual SIM. So how we like to configure them: on modem number one and modem number two, the primary SIMs are Verizon and AT&T, and the secondary SIMs are T-Mobile and Sprint. So you turn on the unit, and for some reason Verizon’s not available, you’re in a dead spot, it’ll switch to the T-Mobile SIM.

Jim Jachetta (01:01:45):

You turn on the unit and AT&T is not available, it’ll fire up the Sprint SIM. And it’s all automatic, you don’t have to touch anything. So that’s a very cool feature, and we keep forgetting to mention it. It’s two modems. So it’s kind of like you’ve got the capability to connect to the four cellular networks, but only two at a time, if that makes sense.
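
The dual-SIM behavior Jim describes boils down to a simple per-modem failover: try the primary carrier, fall back to the secondary if it cannot connect. A hedged sketch of that logic (the carrier pairing is taken from the talk; everything else is illustrative):

```python
# Simplified per-modem dual-SIM failover, illustrating the behavior described
# above: each AIR modem tries its primary SIM and falls back to the secondary
# if that carrier is unreachable. Illustrative only, not the unit's firmware.
def pick_sim(primary, secondary, carrier_is_available):
    return primary if carrier_is_available(primary) else secondary

modem_sims = [("Verizon", "T-Mobile"), ("AT&T", "Sprint")]  # pairing from the talk
dead_carriers = {"AT&T"}                                    # pretend AT&T has no coverage here
available = lambda carrier: carrier not in dead_carriers

active_sims = [pick_sim(p, s, available) for p, s in modem_sims]
print(active_sims)   # ['Verizon', 'Sprint']: the Sprint SIM fires up, as in the example above
```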

Jim Jachetta (01:02:12):

Again, so here’s kind of a nice graph. Samuel at Haivision, he’s the… Samuel Fleischhacker is a product manager for the field encoders. He does some really nice graphics. A lot of these slides, he put together. So this kind of shows the steps. So you either press the power button on the unit, connect external power, or slap a battery on. If you apply power suddenly, whether externally or with battery, and you have it set in the auto-connect mode, it’ll just connect. Here’s a common error we see. Photog in the field, he’ll get on the comms, Live PD, “Oo, camera 57, I’m doing a battery change.” So the video engineer knows 57 is going to go down for a minute or two, gets his new battery, slaps it on.

Jim Jachetta (01:03:20):

Two, three, five minutes go by, he’s not coming back. “Hey, 57. What’s going on?” “Oh, shoot.” The camera came back on, because it’s got a permanent on/off switch, but the cellular did not come back up. So now, when the camera operator reapplies the battery, the unit just starts transmitting. So here are the steps. It boots up, it connects to the network, connects to the StreamHub, and starts waiting on the camera. So if the camera has no video, it’s just waiting on that. But when you change out the battery, the camera probably boots up a little faster than the Haivision, the live video comes up, and you’re live. So you avoid that hiccup where the camera operator forgot to hit the live button. Camera operators know how to operate their camera; maybe they’re not as familiar with this kind of tech. So it’s nice.
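
The boot sequence just walked through (power applied, connect to the network, register with the StreamHub, wait for camera video, go live) can be pictured as a simple state machine. The sketch below is only an illustration of those stages, not the unit's firmware.

```python
# Illustrative state machine for the auto-live behavior described above:
# power applied -> connect to the bonded network -> register with the
# StreamHub receiver -> wait for camera video -> transmit. Not the firmware.
def next_stage(stage, network_up, streamhub_ok, video_present):
    if stage == "booting":
        return "connecting_network"
    if stage == "connecting_network" and network_up:
        return "connecting_streamhub"
    if stage == "connecting_streamhub" and streamhub_ok:
        return "waiting_for_video"
    if stage == "waiting_for_video" and video_present:
        return "live"      # the battery goes back on the camera and it just transmits
    return stage           # otherwise keep waiting at the current stage

stage = "booting"
for _ in range(5):
    stage = next_stage(stage, network_up=True, streamhub_ok=True, video_present=True)
print(stage)   # 'live'
```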

Jim Jachetta (01:04:20):

Now, let’s just say you don’t have this mode enabled. Still, the video engineer at master control can remote into the unit. If it’s on, it can be seen from master control, and they can initiate the live transmission. But this is really nice. I really like this feature. It makes the field encoder, the Haivision PRO380, one less thing for the photog to worry about.

Jim Jachetta (01:04:49):

So this is about the video return. So the Haivision units have a video input and a video output. The unit has SDI in and SDI out, and the SDI out is a loop-through. That can be helpful if you’re using an external viewfinder on your unit. So you could loop the Haivision SDI out into your Atomos viewfinder, your little monitor. That could be helpful. HDMI in is for more of a prosumer camera. The HDMI out is for the return video. So, that could be used for a teleprompter, it could be used, as I said, for the replay on the large screen in the venue. Basically, the sky’s the limit.

Jim Jachetta (01:05:43):

You can even see, here in this picture, you can have more than one return, and you can route where it goes to. So the one in green, we want that return only to go to site number three, the blue one, we want it to go to site number one and two. In the management tools, you can decide where that return video… does everyone see it, or is it segregated in some way? The return video is done in less than a second, and it’s full bandwidth, as I mentioned earlier.

Jim Jachetta (01:06:14):

Good. So here’s this mobile hotspot. So, here’s a graphical picture of it. It’s not uncommon for news personnel, sports personnel, they’re in the field and producing a story, and they’re sending the script for tonight’s news events so someone can feed it into the teleprompter. Or it’s not uncommon where a producer in the field might be producing something, might be cutting together some B-roll, producing a short segment, a sizzle reel, a promo reel, and they need to send that back. How do they do that?

Jim Jachetta (01:06:58):

Well, they can connect via a LAN connection or WiFi to the PRO or the AIR and get an internet connection, and/or the secure VPN. They can also put the file on a thumb drive, feed that in, and just store and forward it, and there’s a media folder it will sit in on the receiver. So there are a myriad of ways of moving files across the connection. Or just get plain, simple internet connectivity. So this is common right now.

Jim Jachetta (01:07:35):

A second use case is at-home production using a cloud-based production switcher. We can connect at any point. We can feed cloud instances of production switching and graphics, take the output of something like that and distribute it, feed live cameras to the cloud and then feed the produced show to your master control, or push it to social media, to a CDN. You can see, on the input side, we can take virtually any protocol: SST, SRT, transport stream over IP, RTMP, HLS, RTSP. Then on the studio side, we can output all of those from the cloud. The only thing we can’t do in the cloud is SDI. That’s my joke: we can’t do an SDI output in the cloud. But otherwise, the software and the environment are identical, whether you have a cloud StreamHub instance or an instance in your master control. The only difference is you have SDI out in the master control.
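
If you want to experiment with one of those IP protocols yourself, here is a minimal sketch that pushes a source to an SRT listener, such as a cloud receiver, using ffmpeg driven from Python. The hostname, port, and source file are placeholders, and it assumes an ffmpeg build that includes libsrt; it illustrates the transport, not the StreamHub’s own internals.

```python
import subprocess

# Minimal sketch: push a test source to an SRT listener (e.g., a cloud-hosted
# receiver). The address/port are placeholders; requires ffmpeg built with libsrt.
SRT_URL = "srt://cloud.example.com:9000?mode=caller"  # hypothetical endpoint

subprocess.run([
    "ffmpeg",
    "-re",                    # read the input in real time, like a live feed
    "-i", "camera_feed.mp4",  # stand-in for a live camera source
    "-c:v", "libx264",        # H.264 video
    "-b:v", "5M",             # ~5 Mbps video
    "-c:a", "aac", "-b:a", "128k",
    "-f", "mpegts",           # SRT typically carries an MPEG transport stream
    SRT_URL,
], check=True)
```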

Jim Jachetta (01:08:52):

So, the StreamHub. Haivision came up with the name hub, or StreamHub, because it takes streams in and out. It’s encoding, decoding, transcoding, streaming, recording. It’s not just a decoder, it’s not just a receiver, so they didn’t want to call it a receiver. Sometimes, just for simplicity, we refer to it as the decoder or the receiver, but it’s way more than that. We can go IP to IP, or IP to SDI. As I mentioned, we can decode, we can transcode.

Jim Jachetta (01:09:29):

So, why is transcoding important? Well, it’s very common that Facebook and YouTube don’t accept an HEVC signal yet, or H.265. Or if they do take H.265, they want constant bit rate and they want five megabits, or they’re optimized for 1080p and don’t like 1080i, something like that. There’s always a list of guidelines. So whatever format we come in with from the field, whether it’s an older Haivision unit sending H.264 or a newer unit with HEVC H.265, many times we need to transcode if we’re going back out over IP.

Jim Jachetta (01:10:15):

You can have multiple transcoding engines; there’s a screen for it, and you set your profile. You pick what input you want to transcode, you set your transcoding parameters, say H.264 at five megabits per second, you pick your encoding profile, you pick whether you want 128 or 256 kbps AAC audio, et cetera, and then you decide what IP output you want to send it on. So you can have up to 16 IP outputs, 16 inputs coming in from the field, 16 encoders, 16 transcoders.
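
To show what that kind of transcode profile looks like in practice, here is a hedged sketch using ffmpeg from Python: take an incoming HEVC feed and re-encode it as roughly 5 Mbps constant-bit-rate H.264 with AAC audio for an RTMP destination. The input URL and stream key are placeholders; the StreamHub does this internally through its own UI, this just illustrates the idea.

```python
import subprocess

# Sketch of a transcode profile like the one described above: HEVC in,
# ~5 Mbps CBR H.264 + AAC out over RTMP. URLs/keys are hypothetical.
INPUT_URL = "srt://streamhub.example.com:9001?mode=caller"  # hypothetical source
RTMP_URL = "rtmp://live.example.com/app/STREAM_KEY"         # hypothetical CDN ingest

subprocess.run([
    "ffmpeg",
    "-i", INPUT_URL,
    "-c:v", "libx264",
    "-b:v", "5M", "-maxrate", "5M", "-bufsize", "10M",  # hold the rate near-constant
    "-c:a", "aac", "-b:a", "128k",
    "-f", "flv",                                        # RTMP carries FLV
    RTMP_URL,
], check=True)
```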

Jim Jachetta (01:10:56):

So the StreamHub can handle virtually any format, up to 4K UHD, and in any standard. You can have StreamHubs talk to each other, set up like a matrix or a grid. The IP output can use SST. Remember SST? I mentioned it in the beginning; that’s Haivision’s Safe Streams Transport. So SST can be an IP output, so StreamHubs can talk to each other and use that bulletproof, robust transport protocol I talked about. But we all work with other systems, so we do support SRT, which is an open-source industry standard. Haivision has NDI support on the roadmap, so the IP output could be NDI in your control room.

Jim Jachetta (01:11:54):

It’s a very powerful appliance. So this is a little overview of the hardware appliance. It’s a [inaudible 01:12:03] server that sits in your data center or your master control. It can decode HD and 4K, it does the transcoding, as I mentioned, it has a hard drive in it to do recording, and you can have up to four 12G SDI outputs and up to 16 IP outputs, as I mentioned. The cloud version is virtually the same: you have the transcoding capability, recording, multiple IP outs, you just don’t have the SDI out. And it’s exactly the same environment.

Jim Jachetta (01:12:41):

We have customers that will use a cloud and a physical environment in concert. The cloud is more for distribution to multiple destinations, and if a destination needs SDI out, they’ll have the hardware version of the StreamHub to get SDI out. So here’s a little overview. Again: SRT, RTSP, TS or transport stream over IP, RTMP, HLS, and coming soon, NDI, WebRTC, and SMPTE 2110. We all know SMPTE 2110; that’s the new IP workflow for broadcast. WebRTC comes up a lot. I’ve seen it recently with gaming; we’re doing a lot of work with the gaming industry, or what we now call esports, and they use WebRTC quite a bit. So NDI, WebRTC, and SMPTE 2110 are going to be very hot commodities very soon.

Jim Jachetta (01:13:44):

So I kind of alluded to this, and I apologize to anyone if you’re the owner of one of these vehicles; I’m just trying to make a point here. Again, I said I don’t think satellite is going away, and cellular and the internet will complement it. But the way I look at it, I think more events will happen. I don’t think you’re going to see, anytime soon, all the trucks eliminated for Monday Night Football. There’s just too much going on; they’re doing the graphics and the replay. But as 5G is rolled out, as internet connectivity proliferates, who knows what may happen? Or these trucks stay in the NEP parking lot, they’re put up on blocks, fiber optic circuits from the likes of Level 3 or The Switch connect them to a central location, and the trucks are used remotely. The Truckers Union may suffer, because they won’t have to drive the trucks around, but a given truck can do more than one production in a day.

Jim Jachetta (01:15:02):

We’re all in the middle of adapting our lifestyles, both personally and professionally. So this is just something to think about, and I apologize if these red lines over the truck are offensive, but I’m just trying to make a point: the public internet and cellular are very close to the reliability you get with satellite. Frankly, satellite has its own Achilles’ heel: rain fade. The public internet is not susceptible to rain fade, and cellular tends to be less susceptible to rain issues. So no technology is perfect.

Jim Jachetta (01:15:42):

So here’s the slide. I just wanted to show this, showing how we can have PTZ cameras. I think we’re going to see more of that. One of my buddies, my buddy Barry, was doing some consulting work for a local synagogue, and he put some photos up on LinkedIn that I noticed last night. He’s got a PTZ camera in the synagogue aimed at the cantor, I guess, during services, and someone in a remote location, maybe that’s Barry, will control that camera and doesn’t need to be on site. So this data bridge, this at-home production technology, combined with the secure VPN, where you extend your network, is very valuable.
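
For a rough sense of what remote PTZ control looks like at the protocol level, here is a Python sketch that sends a VISCA-over-IP pan command to a camera reachable across that VPN-extended network. The camera address, port, and framing are assumptions based on the widely published Sony-style format; check your camera’s documentation, and treat this as an illustration rather than the Haivision or VidOvation implementation.

```python
import socket

# Rough sketch: send a VISCA-over-IP "pan left" command to a PTZ camera on the
# remote LAN, reached over the VPN. Address, port, and framing are assumptions
# drawn from the commonly published Sony-style format; verify against your
# camera's documentation.
CAMERA_ADDR = ("10.0.0.50", 52381)  # hypothetical camera IP and port

# VISCA pan-tilt drive: 81 01 06 01 <pan speed> <tilt speed> <pan dir> <tilt dir> FF
visca_cmd = bytes([0x81, 0x01, 0x06, 0x01, 0x10, 0x10, 0x01, 0x03, 0xFF])  # pan left, tilt stop

# Sony-style IP framing: payload type, payload length, 4-byte sequence number, payload
header = bytes([0x01, 0x00]) + len(visca_cmd).to_bytes(2, "big") + (1).to_bytes(4, "big")

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.sendto(header + visca_cmd, CAMERA_ADDR)
```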

Jim Jachetta (01:16:37):

So here’s another slide. All of this goes through this magical SST, or Safe Streams Transport, this magical pipe. From your intercom to internet connectivity, live video, file-based workflows, return video, and PTZ control, all of these things can be done with the Haivision system. So here’s a little bit about VidOvation. I hope you folks got something of value out of our discussion today. I know we’re approaching 90 minutes, which is a little long, but I appreciate you staying tuned and staying attentive. For those of you who are not familiar with VidOvation, we’re a provider of video, audio, and data transmission, contribution, and distribution systems. We work with broadcast television, sports, news, production, corporate AV, first responders, and government agencies.

Jim Jachetta (01:17:40):

You can see a handful of logos of some of the people that we work with. We help clients integrate a custom solution into their existing infrastructure, and we do our best to work within your budget. I know not every budget is the same: for one customer, a million-dollar budget is not that big, and for another customer, a $50,000 budget is significant. So we try to apply the best technology for the project while staying within your budget.

Jim Jachetta (01:18:20):

So why engage with VidOvation? You’ll get consulting, design engineering, system integration services, project management, warranty, and support. We’re probably most known for our technical support. I have three technicians on staff, and we’re all on standby 24/7. I include myself in the tech support workflow, so if my guys are busy, sleeping, or not on shift, the buck stops with me, and I will answer tech support calls at any hour of the day, any day of the week. We pride ourselves on amazing customer service. That’s very important to us.

Jim Jachetta (01:19:08):

So here’s some contact information, how to reach me. You can drop us a line; my email is JimJ@VidOvation.com. Our main phone number is (949) 777-5435. Be sure to visit our website, VidOvation.com. We usually get the recording produced, transcribed, and uploaded to the website by Friday afternoon, but by Monday at the latest you will have the recording of this, with the transcription. We put the slides in the transcript, so if you’d rather read than listen or watch a video, we have that option.

Jim Jachetta (01:19:57):

We encourage you to call our team any time to discuss any of your active projects around video transmission, contribution, and distribution. At-home production is a hot area for us right now. We also do a lot of work with live television distribution on the corporate network or in the enterprise, so corporate or enterprise IPTV and digital signage systems. We’ve worked with Paramount Studios, Viacom, Nickelodeon, and the Big Ten Network, distributing cable, DirecTV, and their own internal video, all over the network to set-top boxes, TVs, et cetera.

Jim Jachetta (01:20:43):

So if any of this sounds interesting to you, please reach out to me or my team. Thank you all so much for tuning in today. We couldn’t do this, VidOvation wouldn’t exist, without you folks out there. We really thank you for your continued support and for listening today, and I hope all of you stay healthy and safe, and I hope to see you at some point out there on the road. The best part of my job is visiting a customer’s facility, being in the rack, in the data center, helping them get on the air. I live for that, I enjoy that. So I can’t wait to get back in the trenches and work elbow-to-elbow with you, with my mask on, of course. I look forward to seeing you soon. Thank you so much and have a great rest of your day. Thank you. Bye-bye.