Learn about:
- PGA TaylorMade Driving Relief At-home Production using Haivision
- More efficient ways to contribute and distribute live television and video
- VidOvation has been promoting At-home production for several years. Today it is more of a necessity during these uncertain times with social distancing.
- Address the challenge of maintaining frame-accurate video genlock and lip sync across multiple cameras for wireless at-home live production or REMI.
- How labor costs are reduced since the television production specialists work at a centralized master control near where they live, eliminating the need for travel and overtime expenses.
- SafeStreams Transport (SST) for video transport over unmanaged networks such as cellular and the public Internet
- New codec – SMPTE VC-6 ST 2117 P.Pro – a More Efficient Alternative to Uncompressed, JPEG XS and JPEG 2000 from V-Nova
- New low latency wireless technology for remote productions
Jim Jachetta, CTO at VidOvation, Engineer, Design Consultant, Integrator, Trainer, Teacher, Author & Speaker
With more than 25 years of experience designing, integrating, and delivering video transmission and distribution systems, Jim Jachetta is the driving force behind VidOvation’s world-class technology that turns the impossible and the never-before-done into viable solutions within your daily business operations. Using modern, easy-to-support technology, Jim and the talented VidOvation team creatively design, implement, and support wireless, cellular, IPTV, video-over-IP, and fiber-optic installations that meet your organization’s business goals, at a price point that fits any size, scope, and budget.
Transcript:
Jim Jachetta (00:01):
Okay. Good morning, everyone. I’m Jim Jachetta, CTO and co-founder of VidOvation. Today we’re going to talk about a few applications we’ve done recently for at-home production, in particular the PGA TaylorMade Driving Relief charity event a few weeks ago, which was a skins game. At-home production, REMI production, has been a very, very hot topic as of late. VidOvation and our partners such as Haivision have been promoting, successfully implementing and deploying at-home and REMI production for the better part of five years. We’ve done numerous sporting events, we’ve done numerous live reality television events. Some of our partners have created this new category of live unscripted television shows such as Live PD, Live Rescue and others. Today, we’re going to start by talking about this event with the PGA.
Jim Jachetta (01:12):
It was a very successful event. As we all know, we’re all on lockdown, we’re all maintaining social distancing. If we are closer than six feet, we’re supposed to wear masks to flatten the curve so we don’t spread COVID-19. This charity event was to help people on the front line, help nurses, support the CDC and others who are struggling with this pandemic. The event actually raised five and a half million dollars, way above projections, and it was a great game. I don’t know if any of you out there are golf fans, but it was fun watching. It was a charity event, but they’re still professional athletes, they’re still competing with each other. They did a little trash talking, the polite golf kind of trash talking of each other. It was great.
Jim Jachetta (02:20):
The event was put out over the usual … The rights holder was NBC, NBC Sports, Sky Sports overseas, and then it was streamed out OTT to other entities. We at VidOvation, and our partner Haivision, were very proud to be part of this landmark event. So what was the workflow? What were the logistics? The event was at the Seminole Golf Club in Juno Beach, Florida, and this is a private club. I believe one of the players, I don’t remember which, his dad is a member, so he played there as a kid. So he had a little bit of an advantage over the other three golfers. But this was a private club, a private golf course the public has never seen. You can imagine if it’s a course that’s never had a live event, there’s probably no fiber optic infrastructure.
Jim Jachetta (03:20):
The Switch or Level 3 probably doesn’t have a fiber connection into the venue. The only options might be satellite or some of your more traditional means of production and transport. So at-home production, or in particular bonded cellular at-home production, was perfect for this event. They didn’t have a fiber connection, and they wanted to minimize the footprint on the premises, on the course. I guess the state agencies came up with a number of 50 people maximum. So you had four golfers, 28 television crew and 18 officials. So we were able to keep it under that 50-person mark. A big part of this is not having one or two or three production trucks on site, and all the support personnel that involves. You have to have your instant replay people, your TD, your director, your producers, your audio people.
Jim Jachetta (04:25):
At minimum, that’s a tractor trailer full of equipment and people on site. But the benefit of at-home is you use your facility at home, and in this case, the at-home production or master control was PGA headquarters in St. Augustine. Now, you might ask, well, they were only going from Southern Florida to Northern Florida. Once we go on cellular, this could have been 10,000 miles away, and we’ve done applications 10,000, 8,000 miles away with Turner Sports for the Ryder Cup. I have a few slides about that. So once you put your video on cellular, on the public internet, whether you go a couple of hundred miles or thousands of miles, the distance becomes irrelevant at that point. The play-by-play announcers were at the studio, the production team was there, some of the analysts were there.
Jim Jachetta (05:24):
One analyst, Mike Tirico, did the commentary from his home in Michigan. So you can see the spread-out distances of this production. You see, this dotted line here is the first mile, or the first few miles, which went over cellular. Then the cellular networks dump to the public internet, and the public internet connects to the Haivision receivers in St. Augustine. Then the Haivision receivers output typically SDI, or IP if you have an IP facility, and you use your traditional production switcher. I’m certain the PGA used an EVS or something similar to capture some instant replays. It did help that there were only four players. It wasn’t like a full tournament where people are teeing off at different holes at different times. So instant replay, or capturing stuff in replay, was less important, but it was still a part of the workflow.
Jim Jachetta (06:33):
Then the particulars out on the course. One of the big benefits or big differentiators of Haivision is they’ve been doing this for more than five years, and they’re able to maintain frame-accurate genlock and lip-sync on portable units and on multiple cameras. They send a form of precision timing protocol. I mean, it’s an unmanaged network, so they don’t have access to the switches that are part of the public internet, but it’s a similar technique. They time-lock the receiver and the transmitters together so the video comes through synchronously. As you’ll be able to see here, there were two cameras in the tee box, and they used the Toptracer system that draws the line where the ball went, and draws that arc.
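The time-locking idea described here can be sketched in miniature. This is a toy model, not Haivision's actual SST internals: the shared capture clock, the fixed 1.4-second buffer, and the one-frame tolerance below are all illustrative assumptions.

```python
# Illustrative sketch: frames from multiple wireless cameras are stamped
# against a shared reference clock at capture, and the receiver releases
# each frame a fixed buffer later, regardless of which camera or cellular
# path it arrived on. Numbers here are assumptions for illustration.

FRAME_MS = 1000 / 29.97          # one frame period at 29.97 fps
BUFFER_MS = 1400                 # receiver buffer (the PGA ran 1.4 s latency)

def release_time(capture_ts_ms: float) -> float:
    """Release a frame a fixed buffer after its shared-clock capture time."""
    return capture_ts_ms + BUFFER_MS

def genlocked(frames: list[tuple[str, float]]) -> bool:
    """Frames of the same action from different cameras share (nearly) the
    same capture timestamp, so their release times agree within a frame."""
    releases = [release_time(ts) for _, ts in frames]
    return max(releases) - min(releases) < FRAME_MS

# Two tee-box cameras capture the same swing at almost the same instant:
print(genlocked([("tee_cam_1", 10_000.0), ("tee_cam_2", 10_001.5)]))  # True
```

The point of the sketch is that synchronization lives in the timestamps and the receiver buffer, not in the network path, which is why the cameras can roam untethered.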
Jim Jachetta (07:31):
One thing I learned about this project is the Toptracer telemetry actually goes through an audio channel. Because the Haivision PRO series and AIR series have two analog audio inputs on mini XLRs, they were able to feed the Toptracer telemetry into one of the Haivision audio channels and have that functionality. That was a nice surprise for the PGA, that they could use Toptracer while keeping such a low footprint with the at-home production using Haivision. Then there were two cameras on the fairway, each with a Haivision. The Haivision PRO, you can mount it on the camera, between the battery and the camera if you have a larger camera, either V-Lock or Anton Bauer. Or if you’re using a smaller camera (some of our live reality shows will use a smaller camera rather than a larger ENG or sports camera), then they’ll wear it in a backpack.
Jim Jachetta (08:38):
You have that option where you can mount the unit on the camera, and that was the case here with the PGA. Then again, they have the Toptracer on the fairway, and then they had another two cameras at the pin capturing the action. Then two POV beauty shots. They had a nice shot of the nearby ocean and the dunes. This course seems really nice; it abuts the beach. So they had a nice beauty shot of the beach and the dunes adjacent to the course. Then another beauty shot of the clubhouse, and for those they used the smaller Haivision unit, the AIR 320, which has two internal cellular modems and the option for two external modems. Using HEVC, two modems was more than enough. I should mention the PRO 380, the flagship Haivision product, has eight modems. We like using eight. Some people might say, “Well, do you really need eight?”
Jim Jachetta (09:50):
You probably could get away with fewer modems in certain circumstances, but we like to cover all contingencies, especially when there are crowds involved. For this event, there were no crowds, but we did a PGA event in the Caribbean where typical crowds were allowed on the course, all filming and uploading to Instagram, et cetera, and the Haivision 380 worked really well having eight modems. When we’re operating in the US, we’ll do two Verizon, two AT&T, two T-Mobile, two Sprint, and we like that diversity. Particularly in an area that gets overcrowded, Verizon and AT&T are the most popular networks. Those get over-utilized first, and then T-Mobile and Sprint in some cases will help pick up that slack. We’ve seen some instances where Sprint is the only network available to us, and we’re able to get a few megabits per second out.
Jim Jachetta (10:47):
Then it helps having a very efficient HEVC codec where you don’t need very many megabits per second for a good-looking picture. The PGA ran this show at five megs, which is equivalent to 10 or 12 megs with H.264. So they ran at five megs with HEVC. Then you see here, we also have … There were two commentators on the course with the players, and they had shotgun mics, interviewing mics, where they were interviewing the players, asking them questions. The players themselves were mic’ed as well. You can see here technicians had the AIR 320 nearby and they used wireless mics on the players, and you see here two Haivision units. We used the two audio connections on each of the two units to make up four analog microphone connections. So we were able to go wireless from the talent.
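The equivalence Jim quotes is the common rule of thumb that HEVC needs roughly half the bit rate of H.264 for comparable quality. As a back-of-envelope sketch (the 0.5 ratio is an assumption derived from the 5-versus-10-to-12 figure above, not a measured number):

```python
# Rule-of-thumb bitrate equivalence: HEVC at roughly half the H.264 rate
# for similar quality. The ratio is an assumption for illustration.
HEVC_VS_H264 = 0.5

def h264_equivalent_mbps(hevc_mbps: float) -> float:
    """Approximate H.264 bit rate needed to match a given HEVC bit rate."""
    return hevc_mbps / HEVC_VS_H264

print(h264_equivalent_mbps(5))  # 10.0 Mbps, the low end of the 10-12 range
```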
Jim Jachetta (11:51):
From the players to the Haivision, they had a small microphone receiver in the backpack with the AIR 320, and then those microphone channels were sent back to master control. So you can see here, there’s actually more to this. Then they had two parabolic mics following the action, catching the sound of the golf strikes, the golf swings, et cetera. Then in addition, they had a plane overhead. You could hear it buzzing around during the event. They used a traditional microwave link from the plane to the ground, but then used a Haivision 380 to get that microwave receiver signal back to master control. So you can see here, we’ve got two, four, six, eight, 10, 12, 14, 16 … 25, 30 microphones all open in close proximity. If we had the slightest bit of genlock misalignment, first you would notice it in the video of the different camera angles of the same action.
Jim Jachetta (13:09):
You’d see the golf strike twice if there was something out of sync when the director cut between cameras. There were a few shots that were out of sync, but the PGA explained during the Sports Video Group session that when they use the Toptracer, they lose a few frames of video. The TD is supposed to pick the cameras accordingly; he should be using the camera that’s carrying the Toptracer for the video when they’re using Toptracer. There was a little production glitch there. But aside from that, Haivision performed perfectly. There were no lip-sync issues. You can see here where, if the parabolic mics were open and the tee box mics were open, you’d hear “Hello, hello,” or whoosh, whoosh when they struck, if anything were off. There was none of that. Everything looked perfect, frame accurate.
Jim Jachetta (14:05):
Each of these cameras was autonomous. You don’t need a genlock input to each of these cameras. These camera operators can wander untethered. All the magic happens in the receiver, where everything is genlocked and corrected. What makes that possible is what Haivision calls SafeStreams Transport, or SST. It’s a very robust, very complicated transport mechanism, already in its third generation. You can see here the Emmy Award in the photo. They’ve won technical Emmys in 2018 and 2019. I don’t know. I know you’re listening, Florian, and I don’t know when you guys are going to be awarded your statue for 2019. NAB Vegas was canceled, as we all know, and I’m sure you’ve heard NAB New York is not happening. But it’s because of this Emmy Award-winning technology that everything is frame accurate. It makes this whole transport possible.
Jim Jachetta (15:17):
You can see here, I think they had one PRO and one AIR for backup, but they used seven or eight Haivision PRO 380s and eight or nine AIR 320s. The AIRs were typically used to transport the audio, so it was lower bit rates; we used two or four modems. Then the video and audio were transported using the PRO 380s. On SST, we’ve done other presentations. Samuel Fleischacker, Product Manager at Haivision, has done a webinar. You can find it on our blog, or look under the resources page, then the subcategory webinars, and you’ll find the webinar Samuel did at the end of last year. There are also a few webinars from Florian Kolmer, the biz dev and sales engineer for Haivision in the US. Florian and I have done quite a few demos.
Jim Jachetta (16:17):
We’ll go more into the nuts and bolts of SST, but a big part of it is forward error correction, and then ARQ, or packet retransmission, is a part of it. Not all transports are the same. Others do ARQ, others do FEC. The big thing is the bonding, the aggregation, and really managing the cellular modems. The cellular network is very unpredictable. The modems themselves, the cellular connection, the cell tower can just decide, “Whoop, your lease is up on that connection.” So one of the Verizon modems might decide to disconnect. Well, thank God we have a second Verizon modem that’s there until the first Verizon modem renegotiates with the tower and gets its connection back. So most of the time, all eight connections are rock solid, but there are those moments where things happen unexpectedly, and the Haivision algorithm, the SST technology, is what smooths that all out.
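The failover behavior described here can be sketched with a toy bonding model. This is not Haivision's SST scheduler (real bonding weighs per-link throughput, loss, and latency); it's a minimal sketch, assuming a naive round-robin across whichever modems are currently up:

```python
import itertools

# Toy model of bonded-modem failover: packets are spread across whichever
# modems are connected; when a tower drops one modem's lease, its traffic
# shifts to the surviving modems until it renegotiates.

class BondedUplink:
    def __init__(self, modems):
        self.up = {m: True for m in modems}   # connection state per modem
        self._rr = itertools.cycle(modems)    # naive round-robin scheduler

    def send(self, packet) -> str:
        # Try each modem at most once per packet; skip any that are down.
        for _ in range(len(self.up)):
            modem = next(self._rr)
            if self.up[modem]:
                return modem                  # packet goes out on this path
        raise ConnectionError("all modems down, nothing to bond")

link = BondedUplink(["verizon-1", "verizon-2", "att-1", "att-2"])
link.up["verizon-1"] = False                  # the tower drops one lease
paths = [link.send(f"pkt{i}") for i in range(6)]
assert "verizon-1" not in paths               # traffic rides the other three
```

The useful property is that a single lost lease changes which paths carry the packets but never stops the stream, which is the whole argument for carrying more modems than the bandwidth strictly requires.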
Jim Jachetta (17:28):
Here are the units. As I mentioned, the 380 has eight modems, and the AIR 320 has two internal. You can connect two external USB modems to the AIR if you want four. But you can see the footprint: the 380 can mount on the camera, it’s got the battery mount plates, or we can put it in a backpack. The AIR does have a 1/4-20 screw thread on the bottom. I’ve seen some operators put it on top, on the hot shoe or an accessory arm of a smaller camera. There is a small shoulder-mount sling, a little backpack for it. You could see in the PGA event the on-course commentator was wearing one. There is a little belt clip to mount it to your belt. There are a number of different configurations for all of this technology. Then, as I mentioned, here’s a picture.
Jim Jachetta (18:27):
You have all your cameras coming through. You’ve got your audio signals. It’s this multi-camera sync, frame-accurate lip-sync, frame-accurate genlock, that makes these shows possible. The Haivision technology can go down to half a second on a constant bit rate setup, or eight tenths of a second with variable bit rate. The PGA chose to run at 1.4 seconds. They felt their commentators, their talent, their analysts were familiar working with 1.4. Why not run at three seconds? Why not run at eight tenths of a second? Well, it’s a decision. The more aggressive you go with the latency, the less margin you have if there is a hiccup. Most people think when operating over cellular it’s a matter of bandwidth. Yes, bandwidth is important, but I really put latency first.
Jim Jachetta (19:25):
At Haivision and VidOvation, we’ll watch the StreamHub receiver display, and for the most part, the round-trip latency is about 50 or 60 milliseconds, which is very nice, very manageable. You might say, “Well, the rule of thumb is your transport latency is four or five times the round-trip latency.” There you go. That’s how you can get down to a half a second or less, or three tenths of a second, 300 milliseconds. The problem is you’ll see a cellular modem all of a sudden jump to 4,000 milliseconds, or four seconds. There’s just this unpredictability. There’s a choke point somewhere: in the cellular network at the cellular tower, in the SONET ring that connects the cellular network to the central office, or in the internet. So it’s that unpredictability where we have to build in a little bit of a buffer.
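The rule of thumb above reduces to simple arithmetic, sketched here with the numbers Jim cites (the 4-5x multiplier is the stated rule of thumb, not a measured property of any particular link):

```python
# Achievable end-to-end transport latency is roughly four to five times
# the measured round-trip time, per the rule of thumb quoted above.

def latency_budget_ms(rtt_ms: float, multiplier: float = 5.0) -> float:
    """Lower bound on a safe transport latency given a round-trip time."""
    return rtt_ms * multiplier

print(latency_budget_ms(60))     # 300 ms, the ~0.3 s floor mentioned above
print(latency_budget_ms(50, 4))  # 200 ms on a very clean connection
```

A single modem spiking to 4,000 ms RTT blows any such budget, which is why a fixed buffer well above this theoretical floor (the PGA chose 1.4 s) is held in practice.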
Jim Jachetta (20:24):
What I would like to see with the promise of 5G is lower latency, more predictable, more consistent latency. Yes, more bandwidth is great, but I think that’s coming, that’s going to be part of 5G. So we’ll be able to get down to a couple of hundred milliseconds if we can get that better quality of service out of cellular. Here are some of the functions of these products. In the case of the PGA, it was a live event, so we were using the live function. Another novel feature of Haivision is there are actually two encoders in there, two hardware-based encoder chips. For live transmission over cellular, we use a variable bit rate encoder. Then for recording, we use a constant bit rate encoder. Why do we do this?
Jim Jachetta (21:18):
Our competitors will record the live transmission, but when do you need the recording in your encoder most? When the cellular is struggling, and if the cellular is struggling, the bit rate will drop, and so does the bit rate of your recording. So what good is that, if the recording is going up and down with the available bandwidth of the cellular network? With Haivision, you send out your live transmission variable. If you hit a patchy part … Most productions will record in camera. That’s one recording, but you don’t have to ship that memory back to master control or your production company to produce the show. This is for live TV. We were doing a live reality show and they had a big drug bust out in the middle of a cornfield, and the bit rate dropped to about 500 kilobits per second.
Jim Jachetta (22:12):
The audio was crystal clear, but the video was starting to get a little soft at 500 K. Now, granted, at the time this was H.264. Today, 500 K would look much better, but still the producers were like, “Wow, this is like the bust of the show. This is the highlight reel of the show and the bit rate’s low.” I happened to be in the studio with Florian that day, and I go to the technical consultant and I’m like, “Hey, you’re recording in our unit, right?” “Oh yeah, we do that as a safety net. We record in camera and in the Haivision unit.” Of course, they’re also recording in the studio, dumping it into an EVS. I said, “Well, when you break away to another city or you break away to commercial, I’ll show you how to retrieve that file, that clip, from the unit.” You can’t pull the file through live, so we had to stop the live.
Jim Jachetta (23:12):
The bust was over, things were winding down. We remoted into the unit and we actually found the clip, and because the recording is done with fragmented MPEG, we were able to find just the clip we needed, pull the clip through, put it on a memory stick, and feed it to the production people. They were able to ingest it, and then 40 minutes, a half hour later, the anchor came back and said, “Ladies and gentlemen, that drug bust we showed at the top of the hour, we were able to retrieve some cleaner footage from the field,” and they played it back, and that was recorded at five or six megabits per second. So it was perfect. It is because of things like this that people accuse the show of being fake. Then, if you want to do a near-live scenario, you can record in the cameras at say five or six megabits per second, and as the file records, it streams to master control.
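Why did fragmented recording matter in that story? Because the file is written as a sequence of independent, timestamped fragments, a single clip can be located and pulled without transferring the whole recording. A minimal sketch, assuming fixed-length fragments (the six-second duration is an illustrative assumption, not the unit's actual setting):

```python
# Sketch: locating the fragments that cover one clip inside a fragmented
# recording, so only that span needs to be retrieved over the air.

FRAG_SECONDS = 6  # assumed fragment duration, for illustration

def fragments_for_clip(start_s: float, end_s: float) -> range:
    """Indices of the fragments that cover the half-open span [start_s, end_s)."""
    first = int(start_s // FRAG_SECONDS)
    last = int((end_s - 1e-9) // FRAG_SECONDS)
    return range(first, last + 1)

# Say the drug-bust clip ran from 12:30 to 14:10 into the recording: only
# fragments 125 through 141 need to come back, not the whole file.
clip = fragments_for_clip(12 * 60 + 30, 14 * 60 + 10)
print(clip.start, clip.stop - 1)  # 125 141
```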
Jim Jachetta (24:12):
If the cellular is good, the file will arrive in real time. If the cellular gets a little sketchy, it’ll slow down, but it will eventually catch up. So we’ve got live; record and forward; live and simultaneous record, which is the most common; and record and simultaneous forward, which is what I just mentioned. Haivision also has this data hotspot, this internet and VPN bridge. Others have this, others can do a hotspot out in the field, but Haivision takes it a step further. They create a secure VPN. So if you have PTZ cameras, intercom systems, other telemetry, or you want to give your field producers access to the server back at master control, or back at the studio, or back at corporate, we can extend the network out into the field and put the assets in the field on the same subnet as master control, which is amazing.
Jim Jachetta (25:18):
We’ve been using this a lot during COVID. There have been a lot of TV shows now doing Zoom meetings of famous people in Hollywood giving each other haircuts, giving dye jobs to their wives. Jerry O’Connell gave his wife a … He dyed her roots live on Zoom, and the locations they were at had a poor internet connection, so they used this hotspot and VPN bridge function to make that show possible. A hot folder, that would be … The Associated Press has used this, where they take photos on an SLR, dump them via WiFi, push them via FTP to a hot folder in the Haivision unit, and that hot folder is automatically forwarded to master control or corporate. Video return will be released very soon from Haivision. Mission-centric, this is using a Dalet or an EVS system where you plan the whole production of your show, where you call the … you pick up this mission.
Jim Jachetta (26:33):
So you turn on your unit and you’re like, “Oh, I’m shooting the protest rally today in Sacramento.” You find that mission, you click on it. This way, the metadata is embedded in the production, and the files are tagged as they’re ingested into the workflow. So Haivision is becoming the entire news production, sports production workflow. Then there’s this very cool feature, auto-live. There’s a little hidden switch on the side of the Haivision, and if you put it in auto-live mode, the second you apply power, the second you apply a battery, the unit will start transmitting. Because a common mistake a photog will make: the battery goes down, they put a new battery on, the camera comes up, and they forget to hit live on the Haivision transmitter.
Jim Jachetta (27:24):
So this alleviates that. Put the battery on, bam! The Haivision PRO or AIR is going live. Here are some of the particulars about this PGA event. They shot 1080i59.94. They could have done 1080p if they wanted. They chose a variable bit rate, five megabits per second. Now, the Haivision HEVC codec is very unique. It’s more efficient than most of the other codecs out there. So I would argue the PGA running at five megs is probably analogous to seven or eight megs with other HEVC codecs. I don’t know if you saw the game, and I might be biased, I will say that, but they had a beautiful day. The sun was out, blue, blue skies, the grass was green. The Haivision technology made beautiful pictures. I don’t think anyone would say that the production quality of this PGA event was limited in any way.
Jim Jachetta (28:29):
They were able to do their usual graphics, but instead of that happening on site in a trailer, it happened at master control. The production quality was there, and in some cases, I’d say it was even better than some traditional productions. The cameras they used have a 3G-SDI interface. They could have used smaller cameras with HDMI; the units have the ability for both. The audio is AAC. They used one stereo pair. The unit actually can support two stereo pairs, or four mono channels of audio. They chose to run at 128 kilobits per second. You could make this 256; I think you can make this as high as you want depending upon the quality. I think they chose 256 for that Toptracer telemetry channel. They wanted a little bit more fidelity to get the telemetry through.
Jim Jachetta (29:23):
So you can play with this: set your maximum bit rate, set the audio bit rate. Then again, here are the interfaces. When it comes to audio, it usually comes in embedded on the 3G-SDI, or the HDMI, or a combination of both. You could have embedded audio on the SDI on channels three and four, and then on channels one and two have some external microphones, a lapel mic on the talent coming into a microphone receiver on the camera rig, feeding this mini XLR input. On a lot of the reality shows we’ve done, and on this PGA event, this analog audio input, this mini XLR, has been invaluable. It’s a real lifesaver. Otherwise, they’d have to have some clunky audio embedder in the field, and then you’ve got to worry about how to power that. Here, you just feed in your analog input.
Jim Jachetta (30:20):
That’s it for the PGA Tour. Haivision and VidOvation, we work very closely with our customers. We’ve done other at-home production projects with Turner Sports. We did the Ryder Cup and then the UEFA Champions League. We’ve done some work with CNN. We’ve done these live reality shows on A&E and Fox and Discovery Channel: Live Rescue, Live PD, First Responders Live. The guys at Live PD, Big Fish Entertainment, in my mind invented a new category, the category of live reality TV. Bonded was intended for doing a static interview on the courthouse steps: camera on a tripod, reporter standing there on the steps, a single-camera type setup. Live PD is a testament to how robust the Haivision technology is at speed. We’re able to maintain frame-accurate genlock. In Live PD there are actually two cameras.
Jim Jachetta (31:40):
Well, actually, no, I take that back. Often enough, there are four cameras lit up inside the cop car simultaneously while going 80 or 120 miles an hour down the freeway, and as they cut between cameras and microphones, the genlock and lip-sync are maintained. So this is a real testament to the robustness of that SST, that SafeStreams technology. I alluded to this up front: in the PGA TaylorMade skins game, they only went a couple hundred miles from Southern Florida to Northern Florida. But in the case of the Ryder Cup, we had 20 cameras: 16 ISO cameras from Paris coming back to Atlanta, to Turner Sports, and four program feeds coming back for confidence. There were analysts on both sides, so the analysts could hear and see each other on either side; the analysts in Paris could see the studio in Atlanta and vice versa.
Jim Jachetta (32:53):
This was done with Haivision rack-mounted technology, the HE4000, a four-channel appliance in a half rack. Haivision now has single-channel appliances in a half rack called the RACK Series. The RACK200 is H.264 and the RACK300 is HEVC. The HE4000 does four channels of 3G HD-SDI, 1080p60, or a single channel of 4K. So Turner used the four-channel appliance. Here are the four HE4000s picking up the 16 1080p … I should say, 1080p50 signals. Two days before the event, they were doing some rehearsals, and Tom Sahara calls me up. Tom Sahara is not the type to freak out, but I could tell he was a little concerned. He says, “Jim, we’re shooting the event with a production team in Paris and none of us considered that they would be shooting … that their truck is a 50 Hertz frame rate. So how are we going to output 59.94? I don’t have 16 standards converters to convert from 1080p50 to 1080p59.94, or probably 720p59.94 for Turner.”
Jim Jachetta (34:29):
I’m not sure; somebody can correct me if I’m wrong on that. But you see the dilemma there. So I said, “Tom, don’t worry about it. Haivision has this covered. You just set it in the output profile of the Haivision StreamHub, and you can set it to anything you want.” So we set it to 720p59.94, and the Haivision receiver does the transcode for you. You might ask, “Oh, is that going to add a frame of latency? Is that going to delay things?” Not a problem. Everything was perfect. There were no lip-sync issues. So that really, really helped. Then they used a single HE4000 to bring the four program feeds back, or a multi-viewer, back. Haivision now does have a multi-viewer function built into the StreamHub. So this was a while back, already a year or two ago. Now, this workflow might have been able to be simplified a little bit.
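The 50-to-59.94 conversion described here has to produce 59.94 output frames from every 50 input frames. A simple nearest-frame mapping is sketched below as an assumption for illustration; a real standards converter like the one in the receiver may interpolate motion rather than repeat frames:

```python
# Sketch of a 50 fps -> 59.94 fps rate conversion by nearest-frame mapping.
IN_FPS = 50.0
OUT_FPS = 60000 / 1001           # 59.94...

def source_frame(out_idx: int) -> int:
    """Which 50 fps source frame feeds output frame out_idx."""
    return int(out_idx * IN_FPS / OUT_FPS)

# Roughly one source frame in five gets repeated to make up the rate:
repeats = sum(source_frame(i) == source_frame(i + 1) for i in range(60))
print(repeats)  # 10 repeated frames across one second of output
```

The arithmetic shows why no standards-converter rack was needed: the receiver only has to re-time frames against its output clock, which is a per-channel operation it already performs.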
Jim Jachetta (35:29):
We could have brought a multi-view signal through instead of bringing all four through. So there are some new developments from Haivision. Then here’s a picture of the control room, all the feeds coming into Atlanta. They did have a truck on site, but with less of a footprint. This was before COVID. But there are ways of shading cameras remotely. In the case of the PGA, they did use some cellular technology to shade the cameras remotely. The VPN bridge of Haivision could be used to facilitate that; that connection is lower latency. But on our website, some of the truck providers, your [inaudible 00:36:24], your Game Creeks, they’re like, “Why do you have a picture of a truck on your website with a line through it?” I’m like, “I’m just trying to make a point.”
Jim Jachetta (36:30):
I don’t think trucks are going away. I think they’ll probably have more of them, but smaller trucks. You might have a smaller truck with a video engineer just shading the cameras. You might have a truck where you go microwave from the course or from the field to the truck, and have a small production switcher, a TriCaster or something like that. Or you might do the production switching in the cloud, and then use bonded cellular from this smaller truck to get the feed out. So there’s home-running the whole feed, there’s home-running part of the production or all of the production, or a blend of the two. There are many different ways to do that. Here are some slides about the Live PD show that we’ve done. The police, I think, need all the help we can give them. Things are getting a little crazy out there.
Jim Jachetta (37:32):
I think the Live PD show has a lot of officers that are good at their job, and unfortunately, a few bad apples are making them all look bad, which is very unfortunate. On the Live PD show, there are two cameramen in the backseat of each police car, and more cameras besides: a camera out the front dash, a POV camera on the rearview mirror, one on the photog. One of the cameramen has a tendency to sit in the front seat and shoot out the windshield and at the driver. There are three or four cameras lit up inside the police car, and it’s this SST, this frame-accurate genlock and lip-sync, that makes a show like this possible. All the video comes back to the A&E master control in New York. They dump it into an EVS system. Line producers look at the content and they add metadata.
Jim Jachetta (38:31):
They see a gun, they say G for gun, D if there are drugs involved, and in this way they can scrape through the raw footage to find the segments, the clips, that they want. Big Fish Entertainment, in their infinite wisdom, when they pitched this live reality show to A&E, were smart enough to realize that this show was very much like sports, that live reality TV was going to be like sports, so they were going to need an EVS system and EVS operators. They were very clever in that regard. Here you can see some photos, this particular photog. It’s hard to see in this picture, but this guy in the front seat here, you see he’s got a little bit of a bulge there in his front. That’s not his belly.
Jim Jachetta (39:22):
He’s probably got the Haivision transmitter in a pouch on his belly with some batteries, and you see the smaller camera they’re using. With an ENG camera, they would be knocking the lens off the camera every time they got in and out of the cop car. This operator chose to wear the Haivision backpack, and he’s got it on his front so he can sit down in the cop car. You don’t want something on your back getting in and out of the car. So they basically jump out of the car shooting. They’ve got the camera rolling, and with this tiny little camera they can catch the action without catching their lens on the door jamb as they get in and out of the cop car. You can see here in the picture, there’s a … it looks like a GoPro. I think there’s a Marshall camera up in here as well, catching the POV shot of the officer, and then a camera shooting out the front seat.
Jim Jachetta (40:19):
Then here’s the control room. These four operators are each watching the live feeds coming in from a given city or a given police vehicle. They’re doing the first pass of metadata, adding data. Then this is a line producer here looking at clips as they come in. They’ll cut a rough package of a certain clip that they want and put that package up in front of the … I think he’s ex-NBC Sports, Gonzales, the director of the show. So they put the package up, and he calls shots like it’s live. Take camera one, take camera two. But they’re playing that back from EVS. So they dump it all into EVS. For the safety of the officers, there is at least a 15-minute delay, because a lot of the bad guys watch the Live PD show, and you’ve got to have some delay in the production to help with officer safety.
Jim Jachetta (41:23):
That dovetails into one of our other vendors, who’s going to speak next week: V-Nova. V-Nova has some technology that is ideal for IP video transport, at-home production, contribution, and distribution over a managed network. The PGA TaylorMade at-home production, the Ryder Cup production … the Ryder Cup was done over public internet. It was done over a single public internet connection. I would have slept better during that production if they had two connections, but they only had budget for one. We didn’t drop a single packet, but I would have preferred to see them bond multiple connections together. So the Ryder Cup was public internet and the PGA was cellular. Cellular eventually dumps to the public internet, so it is a combination of cellular and internet, but that’s all unmanaged.
Jim Jachetta (42:26):
Now we can get lower latencies if we use a managed network, and V-Nova basically has two approaches. One approach operates as an alternative to JPEG 2000. The other approach offers a secondary stream to enhance older codecs, older hardware encoders. What does that mean? There’s a new standard, MPEG-5 LCEVC. Try to say that fast: LCEVC, Low Complexity Enhancement Video Coding. So what is that? It’s a new standard that helps alleviate some of these challenges. Not everyone is ready to throw out all their H.264 infrastructure, but LCEVC can help bring some of the benefits of higher efficiency to the older hardware infrastructure, and for that matter software infrastructure, that you have out there.
Jim Jachetta (43:36):
What is it doing? It’s a data structure defined by two streams. There’s the base stream, or primary stream, which is decoded by your typical hardware decoder. This could be H.264, for example. So you have an older or generic H.264 encoder and a generic H.264 decoder. What V-Nova does is look at the video going into the encoder and the video coming out of the H.264 decoder, and look for the errors, the losses between the original and the encoded stream. It then formulates a secondary, supplemental stream filling in some of those missing pieces. As we all know, compression is not perfect. H.264 is good, and we’re not going to shut off H.264 everywhere overnight. We all see the benefits. The PGA TaylorMade event was a testament to how good and how beautiful Haivision’s HEVC looks.
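A toy sketch of that two-stream idea: here a crude quantizer stands in for the lossy base codec, and the enhancement stream is simply the residual between the source and the decoded base. This is illustrative only, not the actual LCEVC bitstream.

```python
# Two-stream enhancement, sketched on a 1-D signal of 8-bit samples.
# The "base codec" is simulated by coarse quantization (step of 16).

def base_encode(samples, step=16):
    return [s // step for s in samples]            # lossy base stream

def base_decode(coded, step=16):
    return [c * step + step // 2 for c in coded]   # generic decoder output

def make_enhancement(source, base_out):
    return [s - b for s, b in zip(source, base_out)]  # residual stream

source = [10, 57, 130, 200, 251]
base = base_encode(source)
base_out = base_decode(base)
enhancement = make_enhancement(source, base_out)

# An enhancement-aware decoder adds the residual back onto the base output.
restored = [b + e for b, e in zip(base_out, enhancement)]
assert restored == source
```

The residual values are small, which is why the supplemental stream can be cheap to carry compared with re-encoding everything in a newer codec.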
Jim Jachetta (44:47):
But this is a technique to recoup quality, or to serve as an intermediary transition while upgrading everything, and it stays compatible. This could even be implemented OTT, streaming to the home. I’ll just use your Roku or Apple TV as an example: my older Apple TV or Roku maybe can only support H.264, but the newer units have the ability to decode this secondary channel. If you have an older set-top box that only does H.264, it’ll just ignore that secondary channel. It won’t look at it, it’ll just throw it out. So you’re able to get the higher fidelity if your decoder has the software in it to implement this secondary stream, this secondary channel.
Jim Jachetta (45:50):
So we’ll learn more about that, and I apologize to my friends at V-Nova if I didn’t explain that clearly, but I think that’s the basic idea, and we’ll learn more next week. Someone might ask, “Is this only for on-demand?” The algorithms are fast enough, real time, that this can be done live as well as on-demand. We’ll also look at the new codec developed by V-Nova, which has now been ratified as a SMPTE standard, ST 2117 or VC-6. This is the standard I alluded to. It’s great for contribution, distribution, and remote production workflows. It’s great as a mezzanine codec: a more efficient alternative to JPEG 2000 and JPEG XS and, of course, far more efficient than uncompressed.
Jim Jachetta (46:53):
Uncompressed is full bandwidth, or SMPTE ST 2110 with something that’s lightly compressed. Why is this important? As with HD, I think sports are going to be driving 4K, and the sports leagues are already seeing … we’re in discussions with some of the sports networks … that with 4K, JPEG 2000 is not efficient enough. Using SMPTE ST 2117, with 4K we can see 70% or 75% savings, and with HD, 20% to 30% savings. It’s all about saving bandwidth, and it doesn’t lose any fidelity. Analyzing the input and the output, we can see it maintains the same fidelity at a 70% lower bit rate, and that’s quite powerful. Again, not to be a spoiler, but this technology has AI built into it. It has a library.
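As a quick worked example of those percentages: a codec that saves 70% needs only 30% of the reference bit rate. The reference rates below are assumed round numbers for illustration, not figures from the talk.

```python
# Worked arithmetic for the quoted savings: 70-75% at 4K, 20-30% at HD.

def rate_after_savings(reference_mbps, savings_fraction):
    """Bit rate remaining after a fractional savings against a reference."""
    return reference_mbps * (1.0 - savings_fraction)

jpeg2000_4k = 400.0                          # assumed 4K mezzanine rate, Mb/s
vc6_4k = rate_after_savings(jpeg2000_4k, 0.70)
print(f"4K at 70% savings: {vc6_4k:.0f} Mb/s")        # 120 Mb/s vs 400 Mb/s

hd_ref = 100.0                               # assumed HD mezzanine rate, Mb/s
hd_at20 = rate_after_savings(hd_ref, 0.20)
hd_at30 = rate_after_savings(hd_ref, 0.30)
print(f"HD at 20-30% savings: {hd_at30:.0f}-{hd_at20:.0f} Mb/s")  # 70-80 Mb/s
```

Cutting a 4K contribution feed to roughly a third of its mezzanine rate is what makes transport over managed IP circuits attractive at that resolution.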
Jim Jachetta (48:14):
If it sees the details of grass … take golf, for example. The detail of grass is hard to reproduce, or in basketball, the sea of faces with that orange ball going back and forth. Progressive downscaling and upscaling is the gist of the technology. Again, that’ll be explained next week, but it downscales, upscales, and looks at the error, so it constantly learns. It even has AI: okay, that grassy fringe over there, we lost some fidelity, so it learns how to handle grass, how to encode grass. Okay, we’ve seen this sharp edge before; it has it in its library, it knows how to process it, and it learns how to process it efficiently the next time it sees it. So the more content you feed the system, the more efficient it becomes.
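That progressive downscale/upscale gist can be shown in one dimension: keep a coarse half-resolution layer plus the residual needed to correct the upscaled prediction. This is a rough sketch of the idea, not the actual VC-6 hierarchy.

```python
# One level of a downscale/upscale residual pyramid on a 1-D signal.

def downscale(samples):
    """Average adjacent pairs to halve the resolution."""
    return [(samples[i] + samples[i + 1]) // 2 for i in range(0, len(samples), 2)]

def upscale(samples):
    """Nearest-neighbour upscale back to full resolution."""
    out = []
    for s in samples:
        out.extend([s, s])
    return out

source = [10, 12, 40, 44, 90, 96, 20, 18]
coarse = downscale(source)          # the low-resolution layer
predicted = upscale(coarse)         # what a decoder reconstructs from it alone
residual = [s - p for s, p in zip(source, predicted)]

# coarse layer + residual reproduces the source exactly at full resolution
restored = [p + r for p, r in zip(predicted, residual)]
assert restored == source
```

In practice the encoder repeats this over several levels and spends bits only where the residuals are large, which is where content like grass texture shows up.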
Jim Jachetta (49:13):
We’ll learn more about that next week. We’ll also learn about this rack-mount unit you see here, called the P.Link. It’s an eight-channel device: eight in, eight out, or any combination thereof. Seven in and one out, four and four, whatever you want to do, of the VC-6 or SMPTE ST 2117 transport. This is an appliance implementation. It’s been fully integrated with AWS MediaConnect and AWS Direct Connect. We’ll talk about that more next week. What are the benefits? I mentioned high-quality, low-latency media transport. It’s more cost-effective than having dark fiber or a telco connection, but it’s really meant for a managed network. Here’s a little bit about VidOvation.
Jim Jachetta (50:07):
I’m sure most of you know who we are, but in case you don’t, we’re a provider of video, audio, and data transmission systems for contribution and distribution, for broadcast, sports, production, corporate AV, first responders, and government agencies. We’ve been working a lot lately with first responders. We’re doing a lot of at-home production, a lot of the bonded cellular hotspot and VPN work. We’ve been maintaining our normal hours. Our New York, Arizona, and Southern California locations have all been open. Eastern time, we are open from 9:00 AM to 9:00 PM; Pacific time, from 6:00 AM to 6:00 PM. Our technical support has maintained operations 24/7, so we’ve been servicing our clients.
Jim Jachetta (50:59):
Actually, demand has increased a good 40% or 50% during COVID. We’ve been helping our customers. At VidOvation, we excel at integrating custom solutions into your existing infrastructure. Sometimes when a vendor comes in, they’re like, “Got to rip everything out, start everything over.” Sometimes that makes sense, but most of the time we’re able to integrate the new technology into your existing workflow, into your existing infrastructure. We have solutions that satisfy almost any application or budget, and we love to have discovery calls. Part of what we offer is engineering consultation, design and engineering services, systems integration, project management, and ongoing warranty and support, and we’d love to hear from you.
Jim Jachetta (51:52):
Does anybody have any questions? Well, not to pat myself on the back, but I think I did such an amazing job today that I don’t see any questions. Let me see. Just some thank-yous. A common question we get is: when will the recordings be up? We’ve been doing these webinars at least once a week now for the better part of eight weeks. We had two last week: our wireless partner AB Aero, and the fourth episode in the series of four with Haivision, where we talked about the StreamHub. It takes about a week. Sometimes we get it done sooner, but we transcribe the session. We obviously record it. We strip out the video recording and put it up online, and we make a separate audio recording.
Jim Jachetta (52:51):
We strip out the audio, and we actually have a video podcast feed and an audio-only podcast feed, so you can find us that way. I like to listen to books and technology content in the car, or while exercising or walking the dog in the morning. So I listen to a lot of stuff. I use Audible and different things like that, iTunes and Spotify for podcasts. Whatever your preferred method is, or call us up. Here’s my contact information, here’s my email address: jimj@vidovation.com. We would love to book an engineering discovery call with you: me, my colleagues Rick Anderson and David Robinson, and we can bring in Florian KOLMER, sales engineer from Haivision for the Americas, or any other vendor.
Jim Jachetta (53:49):
We can bring them into the conversation. So we’d love to hear from you and work with you. Have a great day. I hope you’re all staying safe and healthy out there. It looks like we won’t be seeing each other face to face at NAB New York, but hopefully this is good, connecting virtually this way. And if we do an engineering discovery call, make sure you set up a camera so we can see each other and have that personal connection. We look forward to that. Thank you so much and have a great day, everyone. Be safe. Bye-bye.
Jim Jachetta (01:12):
It was a very successful event. As we all know, we all on a lockdown, we’re all maintaining social distancing. If we are closer than six feet, we’re supposed to wear masks to lessen the curve or mute the curve so we don’t spread the COVID-19. So this actual charity event was to help people on the front line, help nurses, support the CDC and others who are struggling with this pandemic. The event actually raised five and a half million dollars, way above projections, and it was a great game. I don’t know if any of you out there are golf fans, but it was fun watching. It was a charity event, but they’re still professional athletes, they’re still competing with each other. They are a little trashed, with the polite golf trash talking of each other. It was great.
Jim Jachetta (02:20):
The event was put out over the usual … The rights holder was NBC, NBC Sports, Sky Sports overseas, and then it was streamed out OTT to other entities. We were very proud of VidOvation, and our partner, Haivision, were very proud to be part of this landmark event. So what was the workflow? What was the logistics? So the event was at the Seminole Golf Club in Juno Beach in Florida, and this is a private club. I believe one of the players, I don’t remember which, his dad is a member, so he played there as a kid. So he had a little bit of an advantage over the other three golfers. But this was a private club, a private golf course that no one has seen, the public has never seen. You can imagine if it’s a course that’s never had a live event, there’s probably no fiber optic infrastructure.
Jim Jachetta (03:20):
The switch or level three probably does not have a fiber connection into the venue. The only options might be satellite or some of your more traditional means of production and transport. So at-home production, or in particular bonded cellular at-home production, was perfect for this event. They didn’t have a fiber connection, they wanted … they minimized the footprint on the prem, on the course. So I guess the state agencies came up with a number of 50 people maximum. So you had four golfers, 28 television crew and 18 officials. So we were able to keep it under that 50 person mark. A big part of this is not having one or two or three production trucks on site, and all the support personnel that involved that. You have to have your instant replay people, your TD, your director, your producers, your audio people.
Jim Jachetta (04:25):
At minimum, a tractor trailer full of equipment and people on site. But the benefit of At-Home is you use your facility at home, and in this case, the At-Home or the production or master control was PGA headquarters in St. Augustine. Now, you might ask, well, they were going from Southern Florida to Northern Florida. Once we go on cellular, this could have been 10,000 miles away, and we’ve done applications 10,000, 8,000 miles away with Turner Sports for the Ryder Cup. I have a few slides about that. So once you put your video on cellular, on the public internet, whether you go a couple of 100 miles or thousands of miles, the distance becomes irrelevant then at that point. Play-by-play announcers were at the studio, production team was there, some of the analysts were there.
Jim Jachetta (05:24):
One analyst, Mike Tirico, did the commentation from his home in Michigan. So you can see the spread out distance of this production. You see, this dotted line here is the first mile or the first few miles, went over cellular. Then the cellular networks dumped to the public internet, and then the public internet connects to the Haivision receivers in St. Augustan. Then the output the Haivision receivers spit out typically SDI or IP, if you have an IP facility, and you use your traditional production switcher. I’m certain PGA use the EVS or something similar to that to the capture some instant replays. It did help. There were only four players. It wasn’t like you had a full tournament where people are teeing off at different holes at different times. So instant replay or capturing stuff in replay was less important, but that was still a part of the workflow.
Jim Jachetta (06:33):
Then the particulars out on the course. One of the big benefits or big differentiators of Haivision is they’ve been doing this for more than five years, where they’re able to maintain frame accurate genlock and lip-sync on portable units and on multiple cameras. So they send a form of precision timing protocol. I mean, it’s an unmanaged network so they don’t have access to the switches that are part of the public internet, but it’s a similar technique. They time lock the receiver and the transmitters together so the video comes through synchronously. As you’ll be able to see here, there were two cameras in the tee box, and they used the Toptracer system that draws the line where the ball went, and it draws that arc.
Jim Jachetta (07:31):
One thing I learned about this project is the Toptrace telemetry actually goes through an audio channel. So because the Haivision PRO series and AIR series have two analog audio inputs on many XLRs, they were able to feed the Toptrace telemetry into one of the Haivision audio channels and were able to have that functionality. That was a nice surprise for the PGA that they could use to Toptrace while keeping such a low footprint with the at-home production using Haivision. Then there were two cameras on the fairway, each with a Haivision that … The Haivision PRO, you can mount it on the camera, between the battery and the camera if you have a larger camera, either VLOC or Anton Bauer, or if you’re using a smaller camera. Some of our live reality shows will use a smaller camera than a larger ENG or larger sports camera. Then they’ll wear it in a backpack.
Jim Jachetta (08:38):
You have that option where you can mount the unit on the camera, and that was the case here with the PGA. Then again, they have the Toptrace on the fairway, and then they had another two cameras at the pin capturing the action. Then two POV beauty shots. They had a nice shot of the nearby ocean and the dunes. This course seems really nice. It abuts the beach. So they had a nice beauty shot of the beach and the dunes adjacent to the beach. I’m sorry, adjacent to the course. Then another beauty shot of the clubhouse, and they use the smaller Haivision unit, the AIR320 which has two internal cellular modems and the option for two external modems. Using HEVC, two modems was more than enough. I should mention the 380, the flagship Haivision product, has eight modems. We like using eight. Some people might say, “Well, do you really need eight?”
Jim Jachetta (09:50):
You probably could get away with less modems in certain circumstances, but we’d like to cover all contingency, especially when there’s crowds involved. For this event, there were no crowds, but we did a PGA event in the Caribbean and there were typical crowds allowed on the course and they’re all filming and uploading to Instagram, et cetera, and the Haivision 380 worked really well having eight modems. When we’re operating in the US, we’ll do two Verizon, two AT&T, two T-Mobile, two Sprint, and we like that diversity. Particularly in an area that gets overcrowded, Verizon and AT&T are the most popular networks. Those get over-utilized first, then T-Mobile and Sprint in some cases will help pick up that slack. We’ve seen some instances where Sprint is the only network available to us, and we’re able to get a few megabits per second out.
Jim Jachetta (10:47):
Then it helps having a very efficient HEVC codec where you don’t need very many megabits per second for a good looking picture. This show PGA ran it at five megs, which is equivalent to 10 or 12 megs with H.264. So they ran a five megs with HEVC. Then you see here, we also have … There were two commentators on the course with them, with the players, and they had Shotgun mics, interviewing mics, where they were interviewing the players asking them questions. The players themselves were miced as well. You can see here technicians had the AIR 320 nearby and they used wireless mics on the players, and you see here two Haivision units. We used the two audio connections on the two units to make up for analog microphone connections. So we were able to go wireless from the talent.
Jim Jachetta (11:51):
From the players to the Haivision, they had a small microphone receiver in the backpack with the AIR 320, and then those microphone channels were sent back to master control. So you can see here, there’s actually more to this. Then they had two parabolic mics following the action, catching the sound of the golf strikes, the golf swings, et cetera. Then in addition, they had a plane overhead. You could hear it buzzing around during the event. They use a traditional microwave link from the plane to the ground, but then used an Haivision 380 to get that microwave receiver signal back to master control. So you can see here, we got two, four, six, eight, 10, 12, 13, 14, 16 … 25, 30 microphones all open in close proximity. If we had the slightest bit of genlock misalignment, first you would notice it in the video of the different camera angles of the same action.
Jim Jachetta (13:09):
You’d see the golf strike twice if there was something out of sync when the director cut between cameras. There were a few shots that were out of sync, but the PGA explained that during the sports video group session, that when they use the Toptrace, they lose a few frames of video. The TD is supposed to pick the cameras accordingly. He should be using a camera that’s carrying the Toptrace for the video when they’re using the Toptrace. There was a little production glitch there. But aside from that, Haivision performed perfectly. There was no lip-sync issues. You can see here where if the parabolic mics were open and the tee box mics were open, you’d be like, “Hello, hello.” You hear whoosh, whoosh when they struck. There was none of that. Everything looked perfect, frame accurate.
Jim Jachetta (14:05):
Each of these cameras were autonomous. They don’t need a genlock input to each of these cameras. These camera operators can wander untethered. All the magic happens in the receiver where everything is genlocked and corrected. What makes that possible is what Haivision calls Safe Streams Transport, or SST. It’s a very robust, very complicated transport mechanism. It’s already their third generation. You can see here the Emmy Award in the photo here. They’ve one technical Emmy’s in 2018 and 2019. I don’t know. I know you’re listening Flory, and I don’t know when you guys are going to be awarded your statue for 2019. NAB Vegas was canceled, as we all know, and I’m sure you’ve heard NAB New York is not happening. But it’s because of this Emmy Award winning technology that everything is frame accurate. It makes this whole transport possible.
Jim Jachetta (15:17):
You can see here, I think they had one PRO and one AIR for backup, but they used seven or eight Haivision PRO380 and eight or nine AIR 320s. The AIRs were typically used to transport the audio, so it was lower bit rates. We used two or four modems. Then the video and audio was transported using the PRO 380s. SST, we’ve done other presentations. Samuel Fleischacker, Product Manager at Haivision has done a webinar. You can find it in our blog or look under the resources page, and then subcategory webinar, you’ll find a webinar Samuel did end of last year. There’s a few webinars that Florian KOLMER, the biz dev and sales engineer for Haivision for the US. Florian and I have done quite a few demos.
Jim Jachetta (16:17):
We’ll go more into the nuts and bolts of SST, but a big part of it is forward error correction, and then ARQ or packet retransmissions is a part of it. Not all transports are the same. Others do ARQ, others do FEC. The big thing is the bonding, the aggregation, and really managing the cellular modems. Cellular network is very unpredictable. The modems themselves, the cellular connection, the cell tower can just decide, “Whoop, your lease is up on that connection.” So one of the Verizon modems might decide to disconnect. Well, thank God we have a second Verizon modem that’s there until the first Verizon modem renegotiates with the tower and gets its connection back. So most of the time, all eight connections are rock solid, but there are those moments where things happen unexpectedly, and the Haivision algorithm technology and SST is what smooths that all out.
Jim Jachetta (17:28):
Here’s the units. As I mentioned, the 380 has eight modems in the AIR, 320 has two internal. You can connect two external USB modems to the AIR if you want to add four. But you can see the footprint, the 380 can Mount on the camera, it’s got the battery mount plates, or we can put it in a backpack. The AIR, it does have a quarter 20 screw thread on the bottom. I’ve seen some operators put it on top on the hot shoe or an accessory arm on a smaller camera. There is a small shoulder mount little backpack, a little slang for it. You could see in the PGA event the on-course commentator was wearing one. There is a little belt clip to mount it to your belt. There’s a number of different configurations for all of this technology. Then as I mentioned, here’s a picture.
Jim Jachetta (18:27):
You have all your cameras coming through. You’ve got your audio signals. It’s this multi-camera genlock or sync, frame-accurate lip-sync, frame-accurate genlock. That’s what makes these shows possible. The Haivision technology can go down to eight tenths of a second, a half a second on a constant bit rate set up, or eight tenths of a second with variable bit rate. PGA chose to run at 1.4. They felt their commentators, their talent, their analysts were familiar working with 1.4. Why not run it three seconds? Why not run at eight tenths of a second? Well, it’s a decision. The more aggressive you go with the latency, the lower you make the latency if there is a hiccup. Most people think when operating over cellular it’s a matter of bandwidth. Yes, bandwidth is important, but I really put latency first.
Jim Jachetta (19:25):
Haivision, VidOvation, we’ll watch the stream hub receiver display, and for the most part, the roundtrip latency is about 50 or 60 milliseconds, which is very nice, very manageable. You might say, “Well, the rule of thumb is you transport it four or five times around trip latency.” There you go. That’s how you can get down to a half a second or less, or three tenths of a second, a 300 milliseconds. Problem is you’ll see a cellular modem all of a sudden jump to 4,000 milliseconds, or four seconds. There’s just this unpredictability. There’s a choke point somewhere in the cellular network at the cellular tower. The SONET ring that is connecting the cellular network together to the central office of bottleneck and the internet. So it’s that unpredictability where we have to build in a little bit of a buffer.
Jim Jachetta (20:24):
What I would like to see with the promise of 5G is lower latency, more predictable, more consistent latency. Yes, more bandwidth is great, but I think that’s coming, that’s going to be part of 5G. So we’ll be able to get down to a couple of 100 milliseconds if we can get that better quality of service out of cellular. Here’s some of the functions of these products. In the case of the PGA, it was a live event, so we were using the live function. Another novel feature of Haivision is there’s actually two encoders in there. There’s two hardware-based encoder chips in there. For live transmission over cellular, we use a variable bit rate and code. Then for recording, we use a constant bit rate encoder. Why do we do this?
Jim Jachetta (21:18):
Our competitors will record the live transmission, but when do you need to record in your encoder most? When the cellular is struggling, and if the cellular is struggling, the bit rate will drop and so does the bit rate of your recording drop. So what good is that if the recording is going up and down with the available bandwidth of the cellular network. With Haivision, you send out your live transmission variable. If you hit a patchy part … Most productions, they’ll record in camera. That’s one recording, but you don’t have to ship that memory back to master control or your production company to produce the show. This is for live TV. We were doing a live reality show and they had a big drug bust out in the middle of a cornfield, and the bit rate dropped about 500 kilobits per second.
Jim Jachetta (22:12):
Audio was crystal clear, video was starting to get a little soft at 500 K. Now, this was granted at the time, this was H.264. Today, 500 K would look much better, but still the producers were like, “Wow, this is like the bust of the show. This is the highlight reel of the show and the bit rate’s low.” I happened to be in the studio with Florian that day, and we’re like … I go to the technical consultant and I’m like, “Hey, you’re recording in our unit, right?” “Oh yeah, we do that as a safety net.” We record in camera. In the Haivision unit, of course, they’re recording in the studio dumping it into an EVS. I said, “Well, when you break away to another city or your breakaway to commercial, I’ll show you how to retrieve that file, that clip from the unit.” So you can’t pull the file through live, so we had to stop the live.
Jim Jachetta (23:12):
The bust was over, things were winding down. We remoted into the unit and we actually found the clip, and because the recording is done with fragmented MPEG, we were able to find just the clip we needed, pulled the clip through, put it on a memory stick, fed it to the production people. They were able to ingest it, and then 40 minutes, a half hour later, we were able to go, “Oh.” The anchor came back, says, “Ladies and gentlemen, that drug bust we did at the top of the hour, we were able to retrieve some cleaner footage from the field and we played it back,” and that was recorded at five or six megabits per second. So it was perfect. It is because of things like this that people accuse this show of being fake. Then if you want to do a near live scenario where you can recording cameras say at five or six megabits per second, and as the file records, it’s streams to master control.
Jim Jachetta (24:12):
If cellular is good, the file will arrive in real time. If the cellular gets a little sketchy, it’ll slow down, but it will eventually catch up. So we got live, record, forward, live and simultaneous record, this is the most common, and then record and simultaneous forward. This is what I just mentioned. Haivision also has this data hotspot, this internet and VPN bridge. So others have this, others can do a hotspot out in the field, but Haivision takes it a step further. They create a secure VPN. So if you have PTZ cameras, intercom systems, other telemetry, you want to give your field producers access to the server back at master control or back at the studio, or back at corporate. We can extend the network out into the field, put the assets in the field on the same sub-net as the master control, which is amazing.
Jim Jachetta (25:18):
We’ve been using this a lot during COVID. There have been a lot of TV shows now doing Zoom meetings of famous people in Hollywood giving each other haircuts, giving dye jobs to their wives. Jerry O’Connell dyed his wife’s roots live on Zoom, and the locations they were at had a poor internet connection, so they used this hotspot and VPN bridge function to make that show possible. Then there’s the hot folder. Associated Press has used this: they take photos on an SLR, dump them via WiFi, push them via FTP to a hot folder in the Haivision unit, and that hot folder is automatically forwarded to master control or corporate. Video return will be released very soon from Haivision. Then there’s the mission-centric workflow, using a Dalet or an EVS system, where you plan the whole production of your show and you pick up a mission.
Jim Jachetta (26:33):
So you turn on your unit and you’re like, “Oh, I’m shooting the protest rally today in Sacramento.” You find that mission, you click on it. This way, the metadata is embedded in the production and the files are tagged as they’re ingested into the workflow. So Haivision is becoming the entire news production and sports production workflow. Then there’s a very cool feature, auto-live. There’s a little hidden switch on the side of the Haivision unit, and if you put it in auto-live mode, the second you apply power, the second you apply a battery, the unit will start transmitting. A common mistake a photog will make: the battery goes down, they put a fresh battery on, the camera comes up, and they forget to hit live on the Haivision transmitter.
Jim Jachetta (27:24):
So this alleviates that. Put the battery on, bam! The Haivision PRO or AIR is going live. Here are some of the particulars about this PGA event. They shot 1080i59.94. They could have done 1080p if they wanted. They chose a variable bit rate, five megabits per second. Now, Haivision’s HEVC codec is unique; it’s more efficient than most of the other codecs out there. So I would argue the PGA running at five megs is probably analogous to seven or eight megs with other HEVC codecs. I don’t know if you saw the game, and I might be biased, I will say that, but they had a beautiful day. The sun was out, blue, blue skies, the grass was green. The Haivision technology made beautiful pictures. I don’t think anyone would say that the production quality of this PGA event was limited in any way.
Jim Jachetta (28:29):
They were able to do their usual graphics, but instead of that happening on site in a trailer, it happened at master control. The production quality was there, and in some cases, I’d say it was even better than some traditional productions. The cameras they used have a 3G-SDI interface. They could have used smaller cameras with HDMI; the units have the ability for both. The audio is AAC. They used one stereo pair; the unit can actually support two stereo pairs or four mono channels of audio. They chose to run at 128 kilobits per second. You could make this 256, or I think as high as you want depending upon the quality. I think they chose 256 for the Toptracer telemetry channel; they wanted a little bit more fidelity to get the telemetry through.
Jim Jachetta (29:23):
So you can play with this: set your maximum bit rate, set the audio bit rate. Then again, here are the interfaces. When it comes to audio, it usually comes in embedded on the 3G-SDI or the HDMI, or a combination of both. You could have embedded audio on the SDI on channels three and four, and then on channels one and two have some external microphones, a lapel mic on the talent coming into a microphone receiver on the camera rig, feeding the mini XLR inputs. On a lot of the reality shows we’ve done, and on this PGA event, this analog audio input, this mini XLR, has been invaluable. It’s a real lifesaver. Otherwise, they’d have to have some clunky audio embedder in the field, and then you’ve got to worry about how to power that. This way, you just feed your analog input.
Jim Jachetta (30:20):
That’s it for the PGA Tour. Haivision and VidOvation work very closely with our customers. We’ve done other at-home production projects with Turner Sports. We did the Ryder Cup and the UEFA Champions League. We’ve done some work with CNN. We’ve done these live reality shows on A&E and Fox and Discovery Channel: Live Rescue, Live PD, First Responders Live. The guys at Live PD, Big Fish Entertainment, in my mind invented a new category, the category of live reality TV. Bonded cellular was intended for doing a static interview on the courthouse steps: camera on a tripod, reporter standing there on the steps, a single-camera setup. Live PD is a testament to how robust Haivision technology is at speed. We’re able to maintain a frame-accurate genlock. In Live PD there are actually two cameras.
Jim Jachetta (31:40):
Well, actually, no, I take that back. Often enough, there are four cameras lit up inside the cop car simultaneously while going 80 or 120 miles an hour down the freeway, and as they cut between cameras and microphones, the genlock and lip-sync is maintained. So this is a real testament to the robustness of that SST, that SafeStreams technology. I alluded to this up front: in the PGA TaylorMade skins game, they only went a couple hundred miles, from Southern Florida to Northern Florida. But in the case of the Ryder Cup, we had 20 cameras: 16 ISO cameras from Paris coming back to Atlanta, to Turner Sports, and four program feeds coming back for confidence. There were analysts on both sides, so the analysts could hear and see each other on either side; the analysts in Paris could see the studio in Atlanta and vice versa.
Jim Jachetta (32:53):
This was done with Haivision rack-mounted technology, the HE4000, and that’s a four-channel appliance in a half rack. Haivision now has single-channel appliances in a half rack called the RACK Series; the RACK200 is H.264 and the RACK300 is HEVC. The HE4000 does four channels of 3G HD-SDI, 1080p60, or a single channel of 4K. So Turner used the four-channel appliance. Here are the four HE4000s picking up the 16 1080p … I should say, 1080p50 signals. Two days before the event, they were doing some rehearsals, and Tom Sahara calls me up. Tom Sahara is not the type to freak out, but I could tell he was a little concerned. He says, “Jim, we’re shooting the event with a production team in Paris and none of us considered that their truck is a 50 Hertz frame rate. So how are we going to output 59.94? I don’t have 16 standards converters to convert from 1080p50 to 1080p59.94,” or Turner’s probably 720p59.94.
Jim Jachetta (34:29):
I’m not sure. I’m sure somebody can correct me if I’m wrong on that. But you see the dilemma there. So I said, “Tom, don’t worry about it. Haivision has this covered. You just set it in the output profile of the Haivision StreamHub, and you can set it to anything you want.” So we set it to 720p59.94, and the Haivision receiver does the transcode for you. You might ask, “Oh, is that going to add a frame of latency? Is that going to delay things?” Not a problem. Everything was perfect. There’s no lip-sync issue. So that really, really helped. Then they used a single HE4000 to bring four program feeds back, or a multi-viewer back. Haivision now has a multi-viewer function built into the StreamHub. This was a year or two ago, so today this workflow could be simplified a little bit.
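As a rough illustration of what a 1080p50-to-59.94 cross-conversion has to do, the sketch below maps each output frame time to the nearest source frame, so roughly one source frame in five gets shown twice to make up the higher rate. This nearest-frame strategy is assumed here for illustration only; the StreamHub’s actual conversion algorithm isn’t documented in this talk.

```python
# Sketch: mapping 50 fps source frames onto 59.94 fps output times by
# nearest-frame selection, one simple cross-conversion strategy.
from fractions import Fraction

SRC = Fraction(50)            # 1080p50 source frame rate
DST = Fraction(60000, 1001)   # 59.94 output frame rate

def source_frame_for(output_index):
    t = output_index / DST    # timestamp of this output frame
    return round(t * SRC)     # index of the nearest source frame

# Over 60 output frames, some source frames must repeat to fill the rate.
mapping = [source_frame_for(i) for i in range(60)]
repeats = len(mapping) - len(set(mapping))
```

Sixty output frames draw on only fifty source frames, so ten of them are repeats; a real converter hides this with motion-compensated interpolation rather than simple repetition.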
Jim Jachetta (35:29):
We could have brought a multi-view signal through instead of bringing all four through. So there are some new developments from Haivision. Then here’s a picture of the control room, all the feeds coming into Atlanta. They did have a truck on site, but with less of a footprint. This was before COVID. But there are ways of shading cameras remotely. In the case of the PGA, they did use some cellular technology to shade the cameras remotely; the VPN bridge from Haivision could be used to facilitate that, and that connection is lower latency. But on our website, some of the truck providers, your [inaudible 00:36:24], your Game Creeks, they’re like, “Why do you have a picture of a truck on your website with a line through it?” I’m like, “I’m just trying to make a point.”
Jim Jachetta (36:30):
I don’t think trucks are going away. I think they’ll probably have more of them, but smaller trucks. You might have a smaller truck with a video engineer just shading the cameras. You might have a truck where you go microwave from the course or from the field to the truck and have a small production switcher, a TriCaster or something like that. Or you might do the production switching in the cloud, and then use bonded cellular from this smaller truck to get the feed out. So there’s home-running the whole feed, there’s home-running part of the production or all of the production, or a blend of the two. There are many different ways to do that. Here are some slides about the Live PD show that we’ve done. The police, I think, need all the help we can give them. Things are getting a little crazy out there.
Jim Jachetta (37:32):
I think the Live PD show has a lot of officers that are good at their job, and unfortunately, a few bad apples are making them all look bad, which is very unfortunate. On the Live PD show, there are two cameramen in the backseat of each police car, plus two mounted cameras: one shooting out the front dash and a POV camera on the rearview mirror. One of the cameramen has a tendency to sit in the front seat and shoot out the windshield and at the driver. So there are three or four cameras lit up inside the police car, and it’s this SST, this frame-accurate genlock and lip-sync, that makes a show like this possible. All the video comes back to the A&E master control in New York. They dump it into an EVS system. Line producers look at the content and add metadata.
Jim Jachetta (38:31):
They see a gun, they say G for gun, D if there are drugs involved, and in this way they can scrape through the raw footage to find the segments, the clips, that they want. Big Fish Entertainment, in their infinite wisdom, when they pitched this live reality show to A&E, were smart enough to realize that this show was very much like sports, that live reality TV was going to be like sports, so they were going to need an EVS system and EVS operators. They were very clever in that regard. Here you can see some photos, this particular photog. It’s hard to see in this picture, but maybe you can see this guy in the front seat here has a little bit of a bulge in his front. That’s not his belly.
Jim Jachetta (39:22):
He’s probably got the Haivision transmitter in a pouch on his belly with some batteries, and you see the smaller camera they’re using. With an ENG camera, they would be knocking the lens off the camera every time they got in and out of the cop car. This operator chose to wear the Haivision backpack, and he’s got it on his front so he can sit down in the cop car. You don’t want something on your back getting in and out of the car. So they basically jump out of the car shooting. They’ve got the camera rolling, and with this tiny little camera, they can catch the action. They won’t catch their lens on the door jamb as they’re getting in and out of the cop car. You can see here in the picture, there’s what looks like a GoPro. I think there’s a Marshall camera up in here as well, catching the POV shot of the officer, and then a camera shooting out the front.
Jim Jachetta (40:19):
Then here’s the control room. These four operators are each watching the live feeds coming in from a given city or a given police vehicle. They’re doing the first pass of metadata. Then this is a line producer here looking at clips as they come in; they’ll cut a rough package of a certain clip that they want and then put that package up in front of the director. I think the director of the show, Gonzales, is ex-NBC Sports. So they put the package up, and then he calls shots like it’s live: take camera one, take camera two, but they’re playing that back from EVS. So they dump it all into EVS. For the safety of the officers, there is at least a 15-minute delay, because a lot of the bad guys watch the Live PD show, and you’ve got to have some delay in the production to help with officer safety.
Jim Jachetta (41:23):
That dovetails into one of our other vendors, who’s going to speak next week: V-Nova. V-Nova has some technology that is ideal for IP video transport, at-home production, contribution, and distribution over a managed network. Compare that with the PGA TaylorMade at-home production and the Ryder Cup production. The Ryder Cup was done over public internet, over a single public internet connection. I would have slept better during that production if they had two connections, but they only had budget for one. We didn’t drop a single packet, but I would have preferred to see them bond multiple connections together. So the Ryder Cup was public internet, and the PGA was cellular. Cellular eventually dumps to the public internet, so it’s a combination of cellular and internet, but that’s all unmanaged.
Jim Jachetta (42:26):
Now we can get lower latencies if we use a managed network, and V-Nova basically has two approaches. One approach is operating as an alternative to JPEG 2000. The other approach is offering a secondary stream to enhance older codecs, to enhance older hardware encoders. What does that mean? There’s a new standard, MPEG-5 LCEVC. Try to say that fast: LCEVC, Low Complexity Enhancement Video Coding. So what is that? It’s a new standard that helps alleviate some of the challenges. Not everyone is ready to throw out all their H.264 infrastructure. But LCEVC can help bring some of the benefits of higher efficiency to the legacy hardware infrastructure, and for that matter, software infrastructure, that you have out there.
Jim Jachetta (43:36):
What is it doing? It’s a data structure defined by two streams. There’s the base stream, or primary stream, which is decoded by your typical hardware decoder. This could be H.264, for example. So you have an older or generic H.264 encoder and a generic H.264 decoder. What V-Nova does is look at the video going into the encoder and the video coming out of the H.264 decoder, looking for the losses between the original and the encoded stream, and then it formulates a secondary, supplemental stream filling in some of those missing pieces. As we all know, compression is not perfect. H.264 is good, and we’re not going to shut off all our H.264 overnight. We all see the benefits. The PGA TaylorMade event was a testament to how good and how beautiful Haivision HEVC looks.
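The layered idea can be shown with a toy example. Below, coarse quantization stands in for the lossy H.264 base codec, and the “enhancement stream” is simply the residual between the original samples and the base reconstruction; a legacy decoder uses only the base, while an enhanced decoder adds the residual back. This illustrates the principle only, not V-Nova’s actual LCEVC algorithm, which compresses the residuals and does much else besides.

```python
# Toy sketch of the base-plus-enhancement layering principle. Coarse
# quantization stands in for the lossy base codec; the enhancement
# stream carries the residual between original and base reconstruction.

STEP = 16  # quantization step of the stand-in "base codec"

def base_encode(samples):
    return [round(s / STEP) for s in samples]

def base_decode(coded):
    return [q * STEP for q in coded]

def enhancement(original, base_reconstruction):
    # The residual a legacy decoder ignores and an enhanced decoder adds back.
    return [o - b for o, b in zip(original, base_reconstruction)]

original = [3, 100, 57, 200, 31, 255, 18, 77]   # one row of pixel values
base = base_decode(base_encode(original))       # what a legacy decoder shows
residual = enhancement(original, base)          # the secondary stream
enhanced = [b + r for b, r in zip(base, residual)]
```

A legacy decoder displays `base`, which is visibly lossy; an enhanced decoder adds `residual` and recovers the detail, while both consume the very same base stream.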
Jim Jachetta (44:47):
But this is a technique to recoup some of that loss, or to serve as an intermediate step in the transition to upgrading everything, while staying compatible. This could even be implemented OTT, streaming to the home. I’ll just use your Roku or Apple TV as an example: my Apple TV or my Roku may only support H.264, but the newer units have the ability to decode this secondary channel. If you have an older set-top box that only does H.264, it’ll just ignore that secondary channel. It won’t look at it, it’ll just throw it out. So you’re able to get the higher fidelity if your decoder has the software in it to implement this secondary stream, this secondary channel.
Jim Jachetta (45:50):
So we’ll learn more about that, and I apologize to my friends at V-Nova if I didn’t explain that clearly, but I think that’s the basic idea, and we’ll learn more about it next week. Someone might say, “Oh, well, is this only for on-demand?” The algorithms are fast enough, real time, that this can be done live as well as on-demand. We’ll also look at the new standard, the new codec developed by V-Nova, which has now been ratified as a SMPTE standard, ST 2117 or VC-6. This is the standard that I alluded to; it’s great for contribution, distribution, and remote production workflows. It’s great as a mezzanine codec. It’s a more efficient alternative to JPEG 2000 and JPEG XS, and of course, more efficient than uncompressed.
Jim Jachetta (46:53):
Uncompressed is full bandwidth, like SMPTE ST 2110, or something that’s lightly compressed. Why is this important? Well, just as with HD, I think sports are going to be driving 4K, and the sports leagues are already seeing, and we’re in discussions with some of the sports networks, that with 4K, JPEG 2000 is not efficient enough. Using SMPTE ST 2117, we can see 70% or 75% savings with 4K, and 20% to 30% savings with HD. It’s all about saving bandwidth, and it doesn’t lose any fidelity. We’re talking about like-for-like fidelity: analyzing the input and the output, we can see it maintaining the same fidelity at a 70% lower bit rate, and that’s quite powerful. Again, not to be a spoiler, but this technology has AI built into it. So it has a library.
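To put those percentages in concrete terms, here is the arithmetic on some illustrative reference rates. The JPEG 2000 figures below are assumptions for the sake of the example, not numbers from the talk.

```python
# Sketch: what a 70-75% bit-rate saving means against typical JPEG 2000
# contribution rates. The reference rates are illustrative assumptions.

def reduced_rate(reference_mbps, savings_fraction):
    """Bit rate remaining after the stated fractional saving."""
    return reference_mbps * (1.0 - savings_fraction)

# Assume a 4K JPEG 2000 contribution feed around 400 Mb/s and an HD feed
# around 100 Mb/s (both hypothetical round numbers).
uhd_vc6 = reduced_rate(400.0, 0.70)   # 4K at a 70% saving
hd_vc6 = reduced_rate(100.0, 0.25)    # HD at the midpoint of 20-30%
```

Under those assumptions, the 4K feed drops from 400 Mb/s to about 120 Mb/s and the HD feed from 100 Mb/s to 75 Mb/s, which is the difference between needing a managed 10-gig path and fitting comfortably on far less.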
Jim Jachetta (48:14):
If it sees the details of grass, let’s say, take golf for example. The detail of grass is hard to reproduce, or in basketball, the sea of faces with that orange ball going back and forth. Progressive downscaling and upscaling is the gist of the technology. Again, that’ll be explained next week, but it downscales and looks at the error, and it learns: okay, that grassy fringe over there, we lost some fidelity, so it learns how to handle grass, it learns how to encode grass. Okay, we’ve seen this sharp edge before; it has it in its library and knows how to process it, and it learns how to process it efficiently the next time it sees it. So the more content you feed the system, the more efficient it becomes.
Jim Jachetta (49:13):
We’ll learn more about that next week. We’ll also learn about this rack-mount unit you see here; it’s called the P.Link. It’s an eight-channel device: eight input, eight output, or any combination thereof. Seven in, one out; four and four; whatever you want to do, of the VC-6 or SMPTE ST 2117 transport. This is an appliance implementation. It’s been fully integrated with AWS MediaConnect and AWS Direct Connect. We’ll talk about that more next week. What are the benefits? I mentioned high-quality, low-latency media transport. It’s more cost-effective than having dark fiber or a telco connection, but it’s really meant for a managed network. Here’s a little bit about VidOvation.
Jim Jachetta (50:07):
I’m sure most of you know who we are, but in case you don’t, we’re a provider of video, audio, and data transmission systems for contribution and distribution, for broadcast, sports, production, corporate AV, first responders, and government agencies. We’ve been working a lot lately with first responders. We’re doing a lot of at-home production, a lot of these bonded cellular hotspots, the VPN. We’ve been maintaining our normal hours. Our New York, Arizona, and Southern California locations have all been open. Eastern time, we are open from 9:00 AM to 9:00 PM. Pacific time, we’re open from 6:00 AM to 6:00 PM. Our technical support has maintained operations 24/7, so we’ve been servicing our clients.
Jim Jachetta (50:59):
Actually, demand has increased a good 40% or 50% during COVID. We’ve been helping our customers. At VidOvation, we excel at integrating custom solutions into your existing infrastructure. Sometimes when a vendor comes in, they’re like, “Got to rip everything out, start everything over.” Sometimes that makes sense, but most of the time we’re able to integrate the new technology into your existing workflow, into your existing infrastructure. We have solutions that satisfy almost any application or budget. So we love to hear from you, we love to have discovery calls. Part of what we offer is engineering consultation, design and engineering services, systems integration, project management, and ongoing warranty and support, and we’d love to hear from you.
Jim Jachetta (51:52):
Does anybody have any questions? Well, not to pat myself on the back, but I think I did such an amazing job today, I don’t see any questions. Let me see. Just some thank-yous. A common question we get is when will the recordings be up? We’ve been doing these webinars at least once a week now for the better part of eight weeks. We had two last week: we had ABonAir, our wireless partner, and we had the fourth episode in the series of four with Haivision, where we talked about the StreamHub. It takes about a week. Sometimes we get it done sooner, but we transcribe the session. We obviously record it. We strip out the video recording and put it up online, and we make a separate audio recording.
Jim Jachetta (52:51):
We strip out the audio, and we actually have a video podcast feed and an audio-only podcast feed. So you can find us that way. I like to listen to books and technology content in the car, or while exercising or walking the dog in the morning. So I listen to a lot of stuff. I use Audible and different things like that, iTunes, Spotify for podcasts. Whatever your preferred method is, or call us up. Here’s my contact information, here’s my email address, jimj@vidovation.com. We would love to book an engineering discovery call with you: me, my colleagues Rick Anderson and David Robinson, and we can bring in Florian Kolmer, sales engineer from Haivision for the Americas, or any other vendor.
Jim Jachetta (53:49):
We can bring them into the conversation. So we’d love to hear from you and work with you. Have a great day. I hope you’re all staying safe and healthy out there. It looks like we won’t be seeing each other face to face at NAB New York, but hopefully this is good if we’re connecting virtually this way, and if we do an engineering discovery call, make sure you set up a camera so we can see each other and have that personal connection. We look forward to that. Thank you so much and have a great day, everyone. Be safe. Bye-bye.
Jim Jachetta (01:12):
It was a very successful event. As we all know, we’re all on lockdown, we’re all maintaining social distancing, and if we are closer than six feet, we’re supposed to wear masks to flatten the curve so we don’t spread COVID-19. This charity event was to help people on the front line, help nurses, and support the CDC and others who are struggling with this pandemic. The event actually raised five and a half million dollars, way above projections, and it was a great game. I don’t know if any of you out there are golf fans, but it was fun watching. It was a charity event, but they’re still professional athletes, they’re still competing with each other. There was a little trash talk, the polite golf kind of trash-talking each other. It was great.
Jim Jachetta (02:20):
The event was put out over the usual channels. The rights holder was NBC, NBC Sports, with Sky Sports overseas, and then it was streamed out OTT to other entities. We at VidOvation and our partner Haivision were very proud to be part of this landmark event. So what was the workflow? What were the logistics? The event was at the Seminole Golf Club in Juno Beach, Florida, and this is a private club. I believe one of the players, I don’t remember which, his dad is a member, so he played there as a kid. So he had a little bit of an advantage over the other three golfers. But this was a private club, a private golf course that the public has never seen. You can imagine, if it’s a course that’s never had a live event, there’s probably no fiber optic infrastructure.
Jim Jachetta (03:20):
Your Switch or Level 3 probably does not have a fiber connection into the venue. The only options might be satellite or some of your more traditional means of production and transport. So at-home production, or in particular bonded cellular at-home production, was perfect for this event. They didn’t have a fiber connection, and they wanted to minimize the footprint on the premises, on the course. I guess the state agencies came up with a number of 50 people maximum. So you had four golfers, 28 television crew, and 18 officials, and we were able to keep it under that 50-person mark. A big part of this is not having one or two or three production trucks on site, and all the support personnel that that involves. You have to have your instant replay people, your TD, your director, your producers, your audio people.
Jim Jachetta (04:25):
At minimum, that’s a tractor trailer full of equipment and people on site. But the benefit of at-home is you use your facility at home, and in this case, the production or master control was PGA headquarters in St. Augustine. Now, you might say, well, they were only going from Southern Florida to Northern Florida. But once we go on cellular, this could have been 10,000 miles away, and we’ve done applications 10,000, 8,000 miles away with Turner Sports for the Ryder Cup; I have a few slides about that. Once you put your video on cellular, on the public internet, whether you go a couple hundred miles or thousands of miles, the distance becomes irrelevant at that point. The play-by-play announcers were at the studio, the production team was there, and some of the analysts were there.
Jim Jachetta (05:24):
One analyst, Mike Tirico, did the commentary from his home in Michigan. So you can see the spread-out distances of this production. You see, this dotted line here is the first mile, or the first few miles, which went over cellular. Then the cellular networks dump to the public internet, and the public internet connects to the Haivision receivers in St. Augustine. Then the Haivision receivers output typically SDI, or IP if you have an IP facility, and you use your traditional production switcher. I’m certain the PGA used an EVS or something similar to capture some instant replays. It did help that there were only four players; it wasn’t like a full tournament where people are teeing off at different holes at different times. So capturing stuff in replay was less important, but it was still part of the workflow.
Jim Jachetta (06:33):
Then the particulars out on the course. One of the big benefits, the big differentiators, of Haivision is they’ve been doing this for more than five years, and they’re able to maintain frame-accurate genlock and lip-sync on portable units and on multiple cameras. They send a form of precision timing protocol. I mean, it’s an unmanaged network, so they don’t have access to the switches that are part of the public internet, but it’s a similar technique: they time-lock the receiver and the transmitters together so the video comes through synchronously. As you’ll be able to see here, there were two cameras in the tee box, and they used the Toptracer system that draws the line where the ball went, drawing that arc.
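The alignment principle can be sketched simply: if every camera stamps its frames against a shared clock and the receiver plays everything out at capture time plus one fixed target latency, then cuts between cameras stay frame-aligned no matter how different the network paths are. This is only the buffering idea, not Haivision’s SST protocol; the latency figure and the stream data below are made up for illustration.

```python
# Sketch of frame-accurate alignment across cameras: all streams are
# played out at capture_time + one shared target latency, sized to cover
# the worst network path, so cuts between cameras stay in sync.

TARGET_LATENCY = 1.5  # seconds of buffer (illustrative figure)

def playout_schedule(streams):
    """streams: {camera: [(capture_time, arrival_time), ...]}
    Returns {camera: [playout_time, ...]}; raises if a frame is too late."""
    schedule = {}
    for camera, frames in streams.items():
        times = []
        for capture_time, arrival_time in frames:
            playout = capture_time + TARGET_LATENCY
            if arrival_time > playout:
                raise ValueError(f"{camera}: frame missed its deadline")
            times.append(playout)
        schedule[camera] = times
    return schedule

streams = {
    "cam1": [(0.0, 0.3), (0.033, 0.4)],   # fast network path
    "cam2": [(0.0, 1.2), (0.033, 1.1)],   # slow path, still within budget
}
schedule = playout_schedule(streams)
```

Both cameras’ frames captured at the same instant play out at the same instant, even though one path delivered them four times slower; that shared deadline is what keeps genlock and lip-sync intact when the director cuts between sources.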
Jim Jachetta (07:31):
One thing I learned on this project is that the Toptracer telemetry actually goes through an audio channel. Because the Haivision PRO series and AIR series have two analog audio inputs on mini XLRs, they were able to feed the Toptracer telemetry into one of the Haivision audio channels and have that functionality. That was a nice surprise for the PGA, that they could use Toptracer while keeping such a low footprint with the at-home production using Haivision. Then there were two cameras on the fairway, each with a Haivision unit. The Haivision PRO you can mount on the camera, between the battery and the camera if you have a larger camera, either V-lock or Anton Bauer. Or, if you’re using a smaller camera (some of our live reality shows will use a smaller camera rather than a larger ENG or sports camera), they’ll wear it in a backpack.
Jim Jachetta (08:38):
You have that option where you can mount the unit on the camera, and that was the case here with the PGA. Then again, they have the Toptracer on the fairway, and then they had another two cameras at the pin capturing the action. Then two POV beauty shots. They had a nice shot of the nearby ocean and the dunes. This course seems really nice; it abuts the beach. So they had a nice beauty shot of the beach and the dunes adjacent to the course. Then another beauty shot of the clubhouse, and there they used the smaller Haivision unit, the AIR320, which has two internal cellular modems and the option for two external modems. Using HEVC, two modems were more than enough. I should mention the 380, the flagship Haivision product, has eight modems. We like using eight. Some people might say, “Well, do you really need eight?”
Jim Jachetta (09:50):
You probably could get away with fewer modems in certain circumstances, but we like to cover all contingencies, especially when there are crowds involved. For this event, there were no crowds, but we did a PGA event in the Caribbean where the typical crowds were allowed on the course, all filming and uploading to Instagram, et cetera, and the Haivision 380 worked really well having eight modems. When we’re operating in the US, we’ll do two Verizon, two AT&T, two T-Mobile, and two Sprint, and we like that diversity. Particularly in an area that gets overcrowded, Verizon and AT&T are the most popular networks, so those get over-utilized first; then T-Mobile and Sprint in some cases will help pick up that slack. We’ve seen some instances where Sprint was the only network available to us, and we were able to get a few megabits per second out.
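One simple way to picture bonding across modems is a scheduler that splits traffic in proportion to each modem’s measured throughput, so a congested carrier still contributes what it can. The sketch below is an illustrative scheduler with made-up throughput numbers, not Haivision’s actual bonding algorithm.

```python
# Sketch: distributing packets across bonded modems in proportion to each
# modem's measured throughput, so a weak carrier still carries its share.

def bond(packet_count, modem_throughputs):
    """Split packet_count across modems proportionally to throughput."""
    total = sum(modem_throughputs.values())
    shares = {}
    assigned = 0
    for name, rate in modem_throughputs.items():
        shares[name] = int(packet_count * rate / total)
        assigned += shares[name]
    # Hand any rounding remainder to the fastest modem.
    fastest = max(modem_throughputs, key=modem_throughputs.get)
    shares[fastest] += packet_count - assigned
    return shares

# Eight modems, two per carrier, with hypothetical throughputs in Mb/s
# reflecting one congested carrier.
modems = {"vzw1": 6, "vzw2": 5, "att1": 4, "att2": 4,
          "tmo1": 3, "tmo2": 3, "spr1": 2, "spr2": 1}
shares = bond(1000, modems)
```

Every packet gets a modem, the strong links carry most of the load, and the weakest link still moves something, which is the diversity argument for carrying all four carriers.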
Jim Jachetta (10:47):
Then it helps having a very efficient HEVC codec, where you don’t need very many megabits per second for a good-looking picture. This show, the PGA ran at five megs, which is equivalent to 10 or 12 megs with H.264. So they ran at five megs with HEVC. Then you see here, we also have …