Good morning everyone. This is Jim Jachetta. I am the CTO and Co-Founder of VidOvation. We're a video distribution and contribution company. We specialize in wireless, bonded cellular, IPTV, and fiber optics, amongst other things.
Today we’re gonna talk about the LIVE PD TV show. Hopefully most of you have heard of it. It’s the number one show on cable for the better part of the last two years. Live PD, it’s like the show Cops except it’s live. Today we’re gonna talk about some of the challenges and hurdles that needed to be overcome to make this show possible, and that’s a perfect segue into the title of today’s webinar, How LIVE PD became the top OTT, Video On Demand, DVR show in 2018. The show has won all kinds of statistical awards.
Believe it or not, it's the number one live cable TV show, but it's also the number one DVR'd show. I think if people don't catch it live, they don't want to miss it, so they make sure they DVR it. It broke all the records in video on demand: if people didn't DVR it and didn't watch it live, they would go into the A&E library and watch the show that way. Over the top, it's the number one streamed show; OTT refers to streaming.
Maybe I'll start with a question. A lot of what we do at VidOvation centers around live television, live production, whether that be news or sports or a live reality show like Live PD. Just curious, how many of you are doing a live production right now? Some sort of live broadcast? Whether it be news or sports; we're even doing a lot of high school sports, second- and third-tier college sports, university sports. There's more and more demand for live content and that fits right into our wheelhouse very nicely.
Let me see here. I think I can show you guys the results. No, maybe not. Well, I'll just tell it to you. Basically, 71% of you said that you do some sort of a live production and 27% of you do not. Maybe when we're done, we can help you. Those of you that are not doing something live, we can help you with that if you have aspirations for it. Let me close that. Yeah. You should be able to see my screen now.
What was the challenge? We were approached by the technical director at Big Fish Entertainment. Big Fish Entertainment is the production company that produces the Live PD show for A&E. It was quite a daunting
challenge. Basically in the beginning they wanted to do six or seven different cities simultaneously. Now they'll do seven or eight simultaneously. In each of those cities there's four cameras in each police vehicle in close proximity, so those four cameras need to be in perfect Gen-lock and have perfect audio lip sync, otherwise the show wouldn't be watchable. It would be like, "You're under arrest … You're under arrest", as they cut to different shots, a tight shot, a wide shot.
It gets even more complicated. In some cities, they have more than one police vehicle. In a lot of cities now, they have two police vehicles, and if it's a real juicy arrest, if there's something juicy happening or the cop needs backup, they'll send the second car, and now you have eight cameras all in close proximity. If this were a recorded, produced show, you could fix a lot of things in post-production. You could fix lip sync problems, and of course you would synchronize all of your video clips together, all of your different shots. There's a lot you can do in post, but in a live show, there's no time for that.
The producers of the show, Big Fish, needed a system that could do rock-solid Gen-lock and provide rock-solid lip sync over cellular, which is not so easy to do. Inside our facilities, the old-school approach is distributing black burst. Now with SMPTE 2110, synchronization is built into the fabric, but SMPTE 2110 can't go over an unmanaged network, can't go over a cellular network; those networks just aren't built for that. Big Fish and A&E really had nowhere to go, and when they came to VidOvation, we had a technology from a company by the name of Haivision, who had done a similar multi-camera live show in Europe. We did some initial testing and we were off to the races. The technology made the show possible.
Let me just dive into it a little bit. Some of you might be familiar with the show, so this might be a little redundant, but if you’ve seen the show or you haven’t, there’s basically two photographers that sit in the backseat, two photogs. They use a small camera from Canon. It’s a really nice camera. It’s broadcast quality but small. It does have infrared capability ’cause some of these situations are at night in the dark.
Inside the police vehicle they have a medium shot of the driver, a POV shot basically below the rearview mirror, and then they have a shot going out the front dash. These cameras are good. They're Marshall cameras, but sometimes they want a camera with a better lens, better optics, so sometimes the photog sits in the front seat instead of sitting in the back. There's two of 'em in the back, so maybe one will ride shotgun. There's some pictures I'll show you where he'll sit up front, just 'cause he's got a better lens, better optics to get a variety of shots on the show. Basically two photogs sit in the backseat.
Bonded cellular … Haivision was one of the first to put bonded cellular on the camera. You can see here, if you so choose, the unit would go between the battery plate on the camera and the Anton Bauer or V-lock battery. It would sandwich between the camera and the battery, and they were the first to do this six years ago with a bonded product. Sandwiching between the camera and the battery has been popular for the last 20 years or more with fiber optics or with more microwave-type wireless. Haivision six years ago said, "Hey, let's do it that way. That's a better way to package it."
The other alternative is to wear it in a backpack. Well, Live PD, because they’re getting in and out of a vehicle, they didn’t want a backpack on their back. They’d get hung up on the doorjamb getting in and out of the car or they couldn’t sit very well. These guys are dressed for battle. They’re wearing a bulletproof vest so there’s a little pocket on the belly of the bulletproof vest where they tuck the bonded cellular Haivision Pro 180 in so they can sit. It’s on their belly. Just works out better for them that way.
In the rear of the vehicle or in the trunk are two of the fixed Haivision units. The two Marshall cameras are feeding those two units in the trunk. You can see here they just put a big UPS battery that will run for a couple of hours in the trunk so they don't have to steal power from the police vehicle. The police vehicles have all kinds of radios. They don't want to put an unnecessary load on the alternator, so they just put a big honkin' battery in the trunk to power all of the transmission gear. Also, I should say inside the cab of the police vehicle are some ambient mikes, and those feed into the analog inputs on the bonded cellular units in the trunk. Of course, the photogs have microphones on their cameras that feed into the Haivision that way.
Here you can see the units, they’re hidden in a little case in the trunk. You can see here these up at the top here are probably like wireless Lav receivers, Lavalier mikes that the police officers wear. That’s coming into the unit as well as the ambient mikes inside the police vehicle cab. They have a little audio mixer and then the audio inputs are going to the analog audio inputs on the Haivision. You see in this photo there’s a little RF connector. Here are two connectors that plug into the Haivision for external antennas.
A common question I get all the time is like, “Hey, won’t the unit work better if I use the external antennas? Why don’t I just use them all the time?” You really don’t need them. The Haivision unit … Inside the top and bottom of the housing right here, top and bottom are four cellular antennas. Four on the top, four on the bottom. On top is also Wi-Fi and GPS. These work great as long as you’re outside of a vehicle or outdoors. The only time you really need the quad antennas is if you’re in a vehicle. Believe it or not, of course the body of the vehicle shields the cellular signal. There are two Haivision units in the trunk of this police vehicle. It’s like a Faraday cage. You’re not gonna get much signal out of there. It’s not gonna work very well.
It's like a ribbon cable that comes off these antennas. Haivision makes two types of antennas. The ones used by Live PD are passive. It just outboards the internal antennas to external antennas; they're high-gain passive antennas and there's a very strong magnet. They mount to the roof of the police vehicle. Here they just put them on top of the trunk of the vehicle and it's a flat ribbon cable that goes through the molding on the trunk, so it doesn't impede the trunk. The trunk closes fine and it's still waterproof, etc.
Somebody made a joke once on our website. It's like, "Hey, were the cops eating Chinese food?" They thought these looked like Chinese food takeout boxes. I thought that was pretty funny, but these are actually the antennas. Why are there four of them? They're quad antennas and the Haivision unit has eight cellular modems, so four times two makes eight. You need two of them per unit. There's two transmitters in here, hence the four quad antennas. Haivision does also make an active version of the quad antenna, where there's four modems in there, and that can be used with their HEVC rack-mounted encoder as an option.
Here’s some cool shots. You can see here … Well, this guy looks like he’s not wearing his bulletproof vest, but you can see some of them have vests on. Here’s a photog. You see … Here’s the Marshall camera pointing out the front. Somewhere in here is a camera pointing at the driver, but the photog, he decided to sit up front maybe to get a better shot. He’s got a bigger piece of glass on that camera compared to the Marshalls. The Marshalls do a good job, but you can understand why maybe they would want an alternate shot with a bigger camera.
You can see here someone with the battery and the Haivision transmitter on his belly, and he's got his bulletproof vest. You can see here's two photogs and what looks like two line producers or technicians with the photogs out in the field. The tech will be helping them swap out their batteries. The line producer's in communication with Master Control. Once in a while Dan Abrams will throw to one of the line producers, like if there's something going on and it's not clear what's happening. They'll go, "Hey, Becky, what's happening there in Tulsa? What's the situation?" Once in a while the line producer will speak, but more off camera. They don't get on camera. They're able to do that with the Haivision.
This is the Master Control side. In the A&E Master Control in New York, there's nine Haivision receivers, and Haivision calls their receiver a StreamHub. It's not only a receiver; it's also a decoder, a transcoder, and an encoder. It is very common for a lot of our customers to use the Haivision StreamHub receiver to output IP to social media, to other affiliates. The StreamHubs can speak to each other through the public internet very reliably.
I'll get into it more, but I'll give you a little foreshadowing. Haivision has what they call SafeStreams, and it's an extremely robust, proprietary, patented transport mechanism for video. It's how the PRO transmitters communicate with the StreamHub receivers, and it's how the StreamHub receivers all communicate with each other. It's all very rock solid.
There's nine of these receivers, and Big Fish was very clever in that they have three different internet service providers. They have three cities on each service provider, so with the nine cities they have three-way diversity. If the internet should go out with one of those service providers, they'd only lose three cities, not everything, so it's very clever that they have redundancy that way, and the Haivision product facilitated that.
What do they do with these live videos that are coming in? They first dump it into an instant replay system, an EVS or something similar to that. They have line producers watching the live feeds as they come in and they make an initial pass at metadata, certain keywords. Is there a gun? Are there drugs? Minor things are in green, some things of interest might be yellow, and then something really juicy's in red. There's operators at … I think I have a picture of it, actually. Let me get to the next slide. Here we go.
Yeah, so you see here these are four different cities coming in, or four different police vehicles. You see the four cameras, these are different police vehicles. These four operators here are keying in metadata from what they see on the screen above. Of course, they’re on the comms, and then there’s a line producer further downstream who is also watching and if he sees a lot of red coming in on a camera feed, wow, that’s great. There’s something juicy happening there. They’ll alert the director, say, “Okay, we got shots fired in Tulsa.” They’ll alert the director and the director will say, “When we’re done with the current segment, load that up.”
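Just to make that tagging workflow concrete, here's a tiny sketch of how those color-coded entries might be represented. The green/yellow/red buckets and the example keywords are the ones I just described; the data structure and function names are purely illustrative, not anything from the actual Live PD system.

```python
# A purely illustrative sketch of the color-coded metadata tagging described
# above. The severity buckets are from the talk; the structure is invented.
from datetime import datetime

SEVERITY_COLOR = {"minor": "green", "interesting": "yellow", "critical": "red"}

def tag_event(city: str, camera: int, keywords: list) -> dict:
    """Assign a severity bucket to a logged moment based on its keywords."""
    if any(k in keywords for k in ("gun", "shots fired")):
        level = "critical"
    elif any(k in keywords for k in ("drugs", "pursuit")):
        level = "interesting"
    else:
        level = "minor"
    return {"time": datetime.now().isoformat(), "city": city, "camera": camera,
            "keywords": keywords, "severity": level, "color": SEVERITY_COLOR[level]}

# Example: a feed operator logs "shots fired" on camera 2 in Tulsa -> flagged red.
print(tag_event("Tulsa", camera=2, keywords=["shots fired"]))
```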
They put it in front of the director. They'll put it together in a package, all the cameras associated with that incident. That one's a few minutes old, so it's just part of the process. We're not just dumping everything to air. There's as many as 41 cameras, so it's a few minutes before it actually gets QC'd. They put in the metadata, then they'll tell the director, "Boom." The director will cue them, "Okay, when we're done with this clip, put the package up and then we'll go to Tulsa." Or they'll go, "When the current clip is done, we'll go to commercial; let Abrams, the talent, know when we come back from commercial we're gonna go to Tulsa."
Everyone’s all on the comms. While they’re in commercial they’ll tell Abrams, “Okay, we have shots fired in Tulsa.” They put the package up and then the director here, Johnny Gonzalez is sitting in the middle. I think this is a producer next to him, and then here is the technical director. They’ll put the package up when they come out of commercial. Say, “Folks, shots fired in Tulsa”, and then the director goes, “Okay, take camera one. Take camera two.” He takes cameras like it’s live now, but they’re playing it back to him out of the instant replay system.
Here's some other stations. You can see the studio. This is a quality control station where they have all of the cameras up. It's a live show, and having at least two photogs with each police vehicle, they try to time it so they don't have to do a battery change at the same time. If something juicy happens, unfortunately they'd have to take a system down for a minute or so to swap batteries. The photogs don't have two batteries on their person where they could do a failover, but that seems to work. If somebody needs a battery, the other photog keeps going and they manage to stay on the air that way.
Here's a quote from Dan Cesareo, the Founder and President of Big Fish Entertainment. He was very pleased with the service and support that we've given them. The show was in jeopardy. Basically they approached us a month before the show was gonna air: "Hey, we need a solution." I talked to their technical consultant. I said, "Theoretically, I think we have what you need. We do do Gen-lock." We sent them a demo quickly and they saw that the Haivision system worked. We did a few dress rehearsals and the show went pretty smoothly, even the premiere. There was a learning curve, learning the equipment. Big Fish had to learn the workflow of this crazy type of a show, but now it's been the better part of two years. It's been a number one show. It's been a lot of fun.
Let me ask you guys another question. This particular show … I'm gonna get more into some of the technical stuff. I'm also gonna talk about a project we did with Haivision and Turner Sports for The Ryder Cup. That's coming up next. We're gonna kind of shift away from bonded cellular. This show uses bonded cellular and I'm just curious, how many of you guys are currently using bonded cellular right now? Is that something you guys are using? Or something you plan to use in the future? Just give me a yes or a no; if you're only planning to use it in the future, just say "no" for now.
It looks pretty good. I don't know if you guys see the results, but I'll just read them to you. 29% of you are already using some form of bonded and … Oh, it's changing now. It's moving on me. 37% are using it, yes, and 63% are not using bonded. For those of you that are not using bonded and have aspirations to do live, we'd love to help you. We'd love to help all of you with any of your live production needs. Oh, wait, I gotta end the poll. Hold on a second. Okay.
What was the result? In summary, with this Live PD show, the show's typically on Friday and Saturday nights. It's very draining on the photogs. They really don't go home to their families; they just stay on-site. I think they do like a six- to eight-week stint and then they take a couple-of-week hiatus. It's a very demanding show on the field personnel. On a given night, there'll be upwards of 36 and sometimes even close to 41 live camera feeds, and everything's in perfect lip sync. It really is irrelevant whether they're shooting in New York or somewhere on the East Coast or whether they're shooting in Arizona or California. The gear can be 3,000 miles apart.
We've done some projects with some of the European broadcasters that use Haivision. They'll do a multi-camera shoot of the red carpet for The Oscars, The Academy Awards, and send it live back to Sweden or Norway or wherever with less than a second of latency. The real challenge was maintaining that multi-camera synchronization. It's what some folks refer to as REMI, Remote Integration, or at-home production, whatever you might call that type of production where you don't have the production truck in the field, where you bring everything home. We're seeing more and more of that. I've talked to guys at NEP or Game Creek, and some of the marketing out there says, "The truck is gonna go away." I don't think trucks are gonna go away completely. I think the trucks will always be needed to some extent.
Eventually, we'll probably be doing camera shading across the network. I think there have been some applications, more on a managed network, of controlling the iris of cameras, but you're still gonna need photogs there on-site. You're gonna need some equipment on-site. I suspect NEP, your Game Creeks, and your other players out there may actually have more trucks in the future, but they'll be smaller trucks, and they'll be doing a wider diversity of shows and live productions, not only your Monday Night Football or your hockey or baseball games, those big-ticket events. We're gonna see some smaller stuff.
I don't want to turn this into a commercial, but here's some slides. I'll kind of go through this quickly. This is the unit. Big Fish is looking to upgrade; they're using the older Haivision unit. The new unit will have HEVC, but all of Haivision's units have eight internal modems, high-gain antennas, two LAN connections, Wi-Fi, and Anton Bauer or V-lock battery mounts. They were one of the first to go on camera. Here's a TV 2 Denmark crew in Times Square using the Haivision. The talent is untethered. The IFB and intercom are going through the unit. Haivision is very strong on the intercom elements of the system. U.S. operators are loving the Haivision as well.
The StreamHub, you can have up to 16 transmitters, 16 PROs received by a single receiver. In news, it’s more of a round robin. You might have 50 units out there, but they don’t all go live at the same time, so it’s more of a many inputs and a few outputs. When it’s live, typically we have four SDI outputs, so in the case of Live PD, they only have four inputs and four outputs.
Now, if they had an IP Master Control, we could actually do 16 inputs, 16 PROs received in one RU, which is very powerful. The box also outputs IP: RTMP, RTSP, HLS, and generic transport stream, as well as the proprietary Haivision SafeStreams. You might be using some of the other bonded cellular gear out there where you make a grid and hook all your affiliates together. This is the technology you would use from Haivision to do the same thing.
That kind of segues into The Ryder Cup. Again, Haivision has a cloud manager. They have a multi-viewer function, so if you want quality control … Like Live PD, with those 41 feeds coming in, you could designate one of the SDI or IP outputs as a multi-viewer output and put that up on a big display for quality control or for the transmission engineer to keep an eye on everything. There's some cool new features in the Haivision system that can help with your live transmission and your live workflow.
Now we'll segue into the project we did with Turner Sports for The Ryder Cup. Bob Baker and Tom Sahara brought the innovation into Turner, and Chris Brown was heavily involved in the project. Chris Brown has been kind of the face of this particular project, particularly with the Sports Video Group. I think Sports Video Group had a function this week in Arizona. Chris Brown's there speaking about what Turner Sports did with The Ryder Cup, and of course he spoke at Sports Video Group in December along with some other panel members, some of his sports colleagues from other networks.
What was the challenge for Turner? The big challenge was they weren't the primary rights holder. Their advertising revenue stream was not as big as the primary rights holder's, so they had a more finite budget for their production. They still wanted good quality. They still wanted to do a good show. They were already looking at some of the Haivision gear and demoing some of our stuff, and they said, "Hey guys, here's a crazy question. We've talked about a single feed here, a single feed there. Can you do 16 ISO camera feeds from Paris to Atlanta through the public internet? Can you do four return program feeds from Atlanta back to Paris? It's a live show. We don't want to be putting in frame syncs and fixing lip sync problems. When it gets to Atlanta, we want to dump it live to air, live to social media."
Again, this is the rack-mounted platform from Haivision called the HE4000. These are the units here. They're half a rack wide. There's a rack mount to put two of 'em in a rack. It's relatively portable. In the case of Turner, they put them in their flyaway kit. This device has two LAN connections, so you could have two internet service providers, two telecom providers. It also has two USB ports which can optionally connect to what Haivision calls the QUAD CellLink, and the QUAD CellLink is that active external antenna I alluded to.
Tom Sahara and the guys at Turner are like, "We like that idea. If we're at a basketball game, we'll show up a couple of hours before the game. We'll order a Metro Ethernet circuit for our feed, but then it's an afternoon game, we arrive Saturday morning, and the feed's not turned up. We call. The service provider says, 'Oh, we'll send somebody out on Tuesday.' Excuse me, the game's in a few hours. Tuesday's not gonna work for us." They liked the idea that they could connect bonded cellular to an encoder as a backup in case the telco circuit is out.
Now, in this case the show was in Paris. They chose not to do bonded cellular as a backup. They chose to use a single internet service provider. I probably would have slept better the first day of the show if they had two connections, a backup connection in case the ISP went down, because everything was going through that one provider, but everything worked flawlessly. They barely dropped a single packet. The Ryder Cup I think was a four-day shoot, so they had these 16 feeds coming out of Paris. They had not only the tournament, but they followed the talent around. I'm sure they pushed some content through the circuit that wasn't live. They went out around the campus and followed some of the talent around, did interviews.
There was a live connection for the better part of four days. 16 feeds coming in, four feeds coming out, and it just worked rock solid as if they had a fiber connection. It was that good. If you guys know Tom Sahara or Bob Baker or some of the guys over at Turner Sports or Chris Brown, they can attest to that.
A common question we get is, "Well, what's the magic? How does it work?" This is an HEVC encoder. It helps that it doesn't require as much bandwidth, because HEVC, H.265, is more efficient. Also, it uses what Haivision calls SafeStreams. What does that mean? Haivision has just rolled out their third version, SafeStreams 3. It has Dynamic Forward Error Correction. What does that mean? Forward Error Correction is nothing new for communication systems, whether it be satellite or microwave or ENG or encoders or IP. Forward Error Correction is a certain amount of overhead that's added to the stream to make it resilient to packet drops or packet loss.
Without getting into the details … I have a whole 'nother PowerPoint presentation I did years ago, it's on our website, on more of the nuts and bolts of how FEC works, but basically, let's say you set the FEC at 20%: you're adding a 20% overhead to the payload to offer redundancy. Statistically, if you add 20% overhead, you can sustain 20% loss and automatically recover any lost packets. Well, that 20% is overhead, so say you have a 20 meg pipe and 20% is going to FEC, now that 20 meg pipe is actually 16 megs of usable video bandwidth.
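Here's that overhead math as a quick back-of-the-envelope sketch. The 20% and 20-meg figures are just the example numbers above, not Haivision settings.

```python
# Back-of-the-envelope FEC overhead math from the example above.
# The 20% / 20 Mbps figures are illustrative, not Haivision parameters.

def usable_video_rate(pipe_mbps: float, fec_overhead: float) -> float:
    """Payload bandwidth left after reserving a fixed share of the pipe for FEC."""
    return pipe_mbps * (1.0 - fec_overhead)

pipe = 20.0       # total link capacity in Mbps
overhead = 0.20   # 20% of the pipe reserved for FEC repair packets

print(usable_video_rate(pipe, overhead))  # 16.0 Mbps left for actual video
```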
What Haivision does is very clever. I think they're the only ones doing this. They only turn the Forward Error Correction up if it's needed. If the link is passing all the packets and everything is going smoothly, maybe there's a little bit of FEC turned on, just the minimum amount. If all of a sudden things start going south, the circuit takes a hit and drops a lot of packets, then in the short term ARQ will kick in, Automatic Re-Request. Because the FEC is turned down, maybe we missed a few of the initial packet drops, so ARQ takes over. The decoder runs the dance, controls the show. The decoder will tell the encoder, "Whoa, I just lost packets 101 through 121. Can you resend those?" It resends those.
In parallel it says, "Maybe I better bring the FEC up a little bit because things are getting funky", to try to stop this barrage of packet loss. This is how the Ryder Cup production, through an unmanaged connection, through a public internet connection, essentially dropped no packets. I mean, I think they dropped a very small number of packets for the whole event. If a packet should not be recoverable, the whole system doesn't fall apart. It conceals it. If a packet in the middle of green grass is dropped, you're not gonna see a bright spot. It's gonna smooth it out and assume the missing packet is green. The unit has what Haivision calls a Concealment Mode.
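To make the idea tangible, here's a minimal sketch of a decoder-driven loop that requests retransmissions and nudges the FEC overhead up when losses appear, then backs it off when the link is clean. All the names, thresholds, and step sizes are assumptions for illustration; the real SafeStreams protocol is proprietary and certainly more sophisticated.

```python
# Minimal sketch of decoder-driven ARQ plus dynamic FEC, as described above.
# Thresholds, step sizes, and names are invented for illustration only.

class DecoderFeedback:
    def __init__(self):
        self.fec_overhead = 0.05   # start with just a little FEC turned on
        self.expected_seq = 0

    def on_packet(self, seq, send_retransmit_request, send_fec_update):
        if seq > self.expected_seq:
            # Gap detected: ask the encoder to resend the missing range (ARQ)...
            missing = list(range(self.expected_seq, seq))
            send_retransmit_request(missing)
            # ...and in parallel raise the FEC so future bursts are recoverable
            # without waiting on a round trip.
            self.fec_overhead = min(0.20, self.fec_overhead + 0.05)
            send_fec_update(self.fec_overhead)
        self.expected_seq = seq + 1

    def on_stable_period(self, send_fec_update):
        # Link is clean again: back the overhead off to free bandwidth for video.
        self.fec_overhead = max(0.05, self.fec_overhead - 0.01)
        send_fec_update(self.fec_overhead)

# Example: packets 101 through 121 never arrive, so the decoder asks for them
# back and bumps the FEC up a notch at the same time.
dec = DecoderFeedback()
dec.expected_seq = 101
dec.on_packet(122, send_retransmit_request=print, send_fec_update=print)
```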
Another important thing is … I mean, all bonded cellular does this, but I think it's a little bit more of a foreign concept when it comes to fixed encoders. Take a Fujitsu or an Ericsson or a TAMM, something like that; for the most part, most high-end encoders are constant bitrate. In bonded cellular, guys, we've all had to learn how to do adaptive bitrate because the cellular pipe was so up and down and unpredictable. Constant bitrate just didn't work. "Okay, we have a 10 megabit pipe. Let's try to send five megabits." Oh, then it dips down to three momentarily. You'd lose the feed, or the video would freeze. With bonded, the rate is constantly going up and down as the cellular bandwidth changes. Haivision has a lot of expertise in variable bitrate.
Turner, when they got on-site to do The Ryder Cup, put the Haivision units into constant bitrate mode. They had purchased a Gigabit Ethernet pipe and said, "We have no problem. We're gonna run these 16 feeds at 20 megabits each. Everything is gonna be fine." In constant bitrate, it started dropping packets. "Whoa, is something wrong? Is something wrong with the Haivision units?" It ended up being their Gigabit Ethernet pipe. Now, they might have had a service-level agreement from the venue in Paris to the NOC of the service provider, but I don't think they had a service-level agreement of one gigabit per second from Paris all the way to Atlanta. We figured this out during dress rehearsal.
The crew was out there 48 hours before the event. We said, "Look guys, we really recommend you run the Haivision gear at variable bitrate." They go, "Well, the Fujitsus we normally run, we run 'em at CBR, constant bitrate." Well, this is a different product and we don't know where the bandwidth ceiling is. We switched the units over to VBR and lo and behold, it seemed to settle out at 13 megabits per second. The available bandwidth was far less than the Gigabit Ethernet they thought they had. That's the beauty of variable bitrate.
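Here's a toy simulation of that difference, with made-up numbers: a fixed 20-megabit encode keeps overdriving a path that can only carry about 13 megabits, while a variable-bitrate encode backs off until the loss stops.

```python
# Toy illustration (made-up numbers) of CBR vs. VBR against a path that can
# only sustain about 13 Mbps end to end, as in the anecdote above.

def dropped_each_second(encode_rates_mbps, path_capacity_mbps=13.0):
    """Mbps of traffic that exceeds the path each second and would be lost."""
    return [round(max(0.0, rate - path_capacity_mbps), 1) for rate in encode_rates_mbps]

cbr = [20.0] * 5                       # constant 20 Mbps regardless of the path
vbr = [20.0, 16.0, 14.0, 13.0, 13.0]   # backs off until packet loss stops

print(dropped_each_second(cbr))  # [7.0, 7.0, 7.0, 7.0, 7.0] -> sustained loss
print(dropped_each_second(vbr))  # [7.0, 3.0, 1.0, 0.0, 0.0] -> settles where the path allows
```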
Here's a common question I get. A customer will tell me, "I have a decent constant bitrate encoder", pick a brand, any brand, "and I have this Cradlepoint router with four cellular modems that does some load balancing", or something like a Cradlepoint. "I'm pointing that to a decoder in master control. It works when the ballpark is empty, but when the ballpark is full it doesn't work." The problem is that it's not a closed loop.
With Haivision, it's a closed loop. The decoder is actually controlling the dance. The decoder is checking the quality of each of the eight cellular modem paths. The Haivision units can have up to 12 paths: two LAN, two Wi-Fi, eight cellular. It looks at all 12 of those paths. If circuit number one starts dropping packets, it tells the encoder or the transmitter to stop sending packets down that path. If the loss of that path affects the throughput, it tells the encoder to lower its bitrate.
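Here's a rough sketch of that closed loop: the receive side grades every path, tells the sender to stop using the lossy ones, and caps the encoder bitrate to what's left. The loss threshold, headroom factor, and path names are assumptions for illustration, not Haivision's actual logic.

```python
# Rough sketch of the closed-loop idea: grade every path, drop the lossy ones,
# cap the encoder to what remains. Thresholds and names are assumptions.

from dataclasses import dataclass

@dataclass
class Path:
    name: str
    loss_pct: float       # recent packet loss seen by the receiver on this path
    capacity_mbps: float  # recent measured throughput on this path
    enabled: bool = True

def control_loop(paths, target_bitrate_mbps):
    """Disable bad paths, then tell the encoder what bitrate it may use."""
    for p in paths:
        p.enabled = p.loss_pct < 5.0            # stop sending down lossy paths
    available = sum(p.capacity_mbps for p in paths if p.enabled)
    return min(target_bitrate_mbps, 0.8 * available)  # keep some headroom

paths = [Path("cell-1", 0.2, 6.0), Path("cell-2", 12.0, 4.0),
         Path("cell-3", 0.5, 5.0), Path("lan-1", 0.0, 8.0)]
print(control_loop(paths, target_bitrate_mbps=20.0))  # encoder is told to drop to ~15.2 Mbps
```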
That's why in the scenario I mentioned earlier, where you have a different vendor's encoder, there's no control over the bitrate. No one's analyzing the quality of the cellular path or the network. That's why you need a closed system. That's why you can't build your own bonded cellular, or your own bonded IP to be more general; cellular's just a piece of it. The simple answer, too, since we get asked all the time, "Why do the Haivision units seem to work better than a lot of the other guys out there?" We think it comes down to better antennas, better modems, and then the algorithm is the big thing. I think that's the big piece: how Haivision controls the transport stream. I'm gonna get into that in a couple of slides. I have a bad habit of jumping ahead, like three slides ahead, so that's a little foreshadowing.
They had four HE4000s in Paris to give you 16 feeds, and then they had one HE4000 in Atlanta feeding back to Paris. With each HE4000 is a companion StreamHub; a one-RU appliance gives you the four outputs, and the one-RU half-rack HE4000 gives you four HD signals or one 4K. Turner could have done this show in 4K if they wanted, but they did it in HD. What Turner has told us, and some of the other major networks … There's a network in New York that has an encoder lab. I'm not at liberty to mention names, but it's a three-letter network. There's an individual there who's really known for picking apart encoders. He says that this is the best HEVC encoder he's seen, and it's a hardware, chip-based, ASIC-based encoder. It's the best anyone's seen.
Everyone's different. If you're at ESPN and they evaluate an encoder, they like to look at the crawl. Does the crawl at the bottom stutter? If you're dropping a frame, the crawl will stutter. That's a good test of an encoder. Some sports customers like to look at a basketball moving over a sea of faces: does everything get blurred out? With Turner, it was quick zooming or quick panning on grass. They're like, "Hey, when we do a quick pan, quick zooms, we can still see the blades of grass. This Haivision box is really good."
Again, similar to the Live PD show, we were able to maintain Gen-lock and lip sync. Another big consideration was that because it was a European production, they shot the show at a 50 hertz frame rate. The Haivision box has the ability to do a standards conversion inside the box. I believe they set the output to 1080i or 720p 59.94, while the video came into the system at 50, so they didn't need 16 expensive standards converters on the output of the Haivision.
I always tease my Haivision counterparts. I go, "You guys are too modest, man. You've got a lot of really sophisticated, cost-saving features in your product and you don't brag about it enough." I guess that's my job. My job is to brag about how good the Haivision technology is.
I did a presentation at the Annual SMPTE Conference in Florida. Joe Addalia had me come in and speak on precision timing protocol and a little bit about SMPTE 2110. I also spoke about Live PD and the Ryder Cup and how Haivision maintains all this lip sync and Gen-lock. They use a form of precision timing protocol. The benefit is that Haivision, as a closed system, really doesn't need to meet any kind of industry standard, but on their IP outputs, they do have generic transport stream out. They also have outputs for SRT; they're part of the SRT Alliance, and the HE4000s will have that at some point in the near future. There's an SFP fiber optic port on the back of these HE4000s to take in SMPTE 2110, transport it, and then on the receive side give you SMPTE 2110 back. Haivision is forward looking in a lot of their technology.
Here’s some beauty shots of the event. Here’s the Control Room in Atlanta. Here’s The Ryder Cup. I think we sent Turner the rack mount brackets for the HE4000s, but you can see here it looks like they just put them on a shelf and it looks like there’s a bungie cord kind of holding them down. These could be rack mounted, two of them side by side. Here’s three of ’em just sitting here on the shelf. These are the boxes that got the video back to Atlanta.
I've known Tom Sahara for years and, for you guys that know Tom, he's probably one of the smartest guys I know. He's just so humble and soft-spoken and he's a pleasure of a guy to work with. My Dad started Multidyne some 40 years ago, and when I was still involved with Multidyne, Turner was always one of our best customers. After NBC, I think CNN and Turner were the second-biggest customers that helped launch Multidyne back in the days when Multidyne made test equipment. It was a pleasure working with Tom Sahara and the Turner team.
We briefly did a show called Border Live. A competing network was trying to do a similar show to Live PD; I don't want to say the word "copy", but I think it could have been a good show. I think the producers and the network didn't realize, "Hey, look, guys, it's gonna take a dozen or so shows to see that we really have a show." I just have this slide here as another testament that the technology worked. If anything, it was an even more hostile environment than Live PD. They were in border towns next to Mexico and the cellular was not that great near the border. The units were actually grabbing towers in Mexico to help transport the signal, so our units can roam internationally. For this show, we allowed the units to grab U.S. towers or Mexican towers or a combination of both to get a good signal.
Let me get into some of the technology. I have a habit of going on pretty long with these. We wanted to keep this under an hour, and I see we've got about 10 minutes left, so I'll get through some of this. Here's just a quick shot of the ecosystem. You can see here you have your capture devices, your encoders, your transmitters. It could be the PRO, either mounted on the camera or in a backpack. There are rack-mount versions of the PRO that are a full rack. The HE4000 is a newer platform; that's what Turner used. It does four channels of HD or one channel of 4K. The AIR is a really nice device and a very economical way to get your hands on what people are saying is the best HEVC encoder they've seen.
While the HE4000 is a four-channel device, some customers are like, "I don't need four channels. That's overkill." If you just need one, this AIR is really nice. It's portable. A photog can wear it on his belt. There's an H.264 version and an HEVC H.265 version. You can get 'em without any modems or with two internal modems, and then you can add two external modems via USB, the more consumer-type modems. That's a nice way to get the HEVC high-quality encoder economically. The apps work on iOS devices, Android devices, and macOS, and then there are other IP sources. The Haivision StreamHub can act as a generic IRD, so if you have other encoders, an Ericsson or a TAMM out there, the Haivision will receive a transport stream from other devices.
Here's an overview of the networks: 3G, 4G, 5G, Wi-Fi, satellite. Satellite is not an ugly word; I don't think satellite is going away anytime soon. We have customers that will bond the cellular with satellite, where the bonded cellular maybe is a backup to the satellite, and it will seamlessly bond the two or fail over to one or the other, or vice versa: cellular is the primary and the more expensive satellite is the backup. BGAN, LAN, you hook any kind of an IP or internet connection to any of these devices and if it finds the public internet, it will sniff out that bandwidth and use it.
For distribution, you can hit one StreamHub or any number of StreamHubs. They can all be hooked together in an interlocking grid. There are IP outputs, and SDI and HDMI inputs and outputs. Now we can get into some of the meat and potatoes. Here's the more technical stuff, and I'll try to keep track of time here.
SafeStreams 3. I mentioned that. It's Haivision's third generation of this very robust transport protocol. What does it support? It supports the H.265 HEVC and H.264 AVC codecs. It's a hybrid system. I kind of alluded to this before: it dynamically adjusts the Forward Error Correction, turning it up and down as it's needed. That works hand in hand with the retransmission, the ARQ or ACK. There are so many acronyms for this, but you guys get the idea: the Automatic Re-Request of packets. "I lost the packet. Please resend the packet."
The system manages the priorities. In the grand scheme of things, what packets are more important than others? Control packets obviously are important. ARQ or ACK packets are important. I have a diagram that shows the priority of that. Managing the links, setting the priorities of the transmission, all of that is the magic. I think that's the special sauce that really makes Haivision different: this very sophisticated, elaborate dance that's done to control the transport, to make the transport bulletproof.
IP data bridge and router modes. A common thing is maybe I want to use an extra Haivision PRO as a hotspot, and we do that all the time with our customers. We're agnostic. We typically do eight cellular modems across the carriers; that's what we recommend. Haivision was the first to do eight modems. Some of the other players have copied them. We're firm believers in having two Verizon, two AT&T, two T-Mobile, and two Sprint. A common question we get is, "Why do you put Sprint in there? Sprint's not that good." I apologize to any Sprint users or employees. Statistically, people say Sprint is not needed. We take the contrarian view. We see it on our monthly data usage. Our unit tries to flatten out the usage for diversity, so I'll give you an example.
Let's say Verizon has 20 megs available on a single modem. We don't like the idea of putting the whole payload on that one connection because, as we all know, in any kind of IP communications, you're only leasing that connection. The tower could decide, "Hey, subscriber, you're hogging the tower, you're hogging that connection. We have some congestion." They're gonna dump you. Now, if you had your whole payload going through that 20-meg connection, it'd be very hard for the system to recover from that seamlessly. There would be a hit, if not a good 60-second hit.
Haivision takes the approach of smoothing it out. We would rather see a meg or several megs spread across all eight modems. Just the way the Haivision product works, there may be more bandwidth available on one carrier statistically, but we see pretty even usage across Verizon, AT&T, and T-Mobile. That tells us that at least those networks are equally available. It doesn't necessarily mean Verizon didn't have more bandwidth, but we prefer to diversify, and I look at it as availability. Each of those averages 27% to 29%, somewhere in that range, and then Sprint handles the leftover, about 14 to 16%. We've seen that hold up in really challenging environments.
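Here's a small sketch of that "spread it thin" idea, using the rough usage percentages I just mentioned. The split and the two-modems-per-carrier assumption mirror the description above; it's illustrative, not how the unit actually allocates traffic.

```python
# Sketch of spreading one payload thinly across every modem instead of
# loading up a single carrier. The split mirrors the rough percentages
# mentioned above; it is illustrative, not a real allocation algorithm.

payload_mbps = 8.0
carrier_share = {"Verizon": 0.28, "AT&T": 0.28, "T-Mobile": 0.28, "Sprint": 0.16}

per_carrier = {c: round(payload_mbps * s, 2) for c, s in carrier_share.items()}
per_modem = {c: round(mbps / 2, 2) for c, mbps in per_carrier.items()}  # two modems per carrier

print(per_carrier)  # {'Verizon': 2.24, 'AT&T': 2.24, 'T-Mobile': 2.24, 'Sprint': 1.28}
print(per_modem)    # each modem only carries a megabit or so, so losing one tower hurts far less
```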
Here's another question we get. "I don't believe that the Haivision will work at MetLife Stadium with 80,000 fans in the stands during a football game." It's like, "Well, let's demo it." The customer will say, "I had your competitor's rig out here last Sunday." Ours just works; we get five megs with 80,000 fans. Well, what's happening there? Maybe the better antennas. Maybe the better modems. Maybe the better algorithm. I think a piece of it, too, though, is the majority of the fans are probably on AT&T, T-Mobile, and Verizon, and there's less competition to get onto Sprint. In some of those bad areas I think Sprint saves the day. We do need Sprint. We feel Sprint is an important piece of the puzzle.
I'll go through this pretty quick. This is more about SafeStreams. It's a full duplex uplink and video return. The new Haivision products all have video inputs and video outputs. If you buy a unit today, there's a video output. Today the firmware just gives you a loop-through, like a confidence out, but there will be firmware updates later this year that will give full-quality video return. That matters particularly with REMI production, especially some of these smaller games where they don't have an in-house crew shooting the game.
There'll be a single crew, and they're also doing replay and the in-house feed. If we move the truck from the venue back to Master Control, we have to send the high-quality feed back to the venue to put something up on the big screen, to play highlights or replays, that sort of thing. That's the Return Path. Everyone's been asking for this, and I think as we move forward and get into more of this REMI or at-home production, this Return Path is gonna become even more important. Let's see here.
I mentioned this, so what is SafeStreams? Why do we need it? Well, it's what gives us that resilience, that bulletproof transport, as I call it, over an unmanaged network. What does that mean, an unmanaged network? Well, the public internet is an unmanaged network. Cellular, for the most part, is an unmanaged network. Things like that. Now, if somebody's running this on their internal corporate network, that would be more of a managed network. Of course, the devices will work over a managed network, but that's easier. The unmanaged is more the Wild West.
This Forward Error Correction and ARQ, the ACK, these acknowledgements of reception of packets, is the special sauce. Haivision is delivering 300-millisecond glass-to-glass transport: encoding, transmission, and decoding in 300 milliseconds. Now, obviously, if we're going over a network that has a 600-millisecond roundtrip latency, we'll probably have to increase the latency on our transport to one second.
Basically, the transmission rule of thumb is that the lowest you can go on your transport latency is about three or four times the roundtrip latency of the network. Cellular modems, when they're behaving properly, are probably at 50 to 60 milliseconds in one direction, so we would guesstimate about a hundred milliseconds roundtrip. Multiply that hundred milliseconds by three or four, and 300 milliseconds under the right conditions is certainly theoretically possible by common transmission standards.
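That rule of thumb is simple enough to put in a few lines, using the ballpark figures from the talk:

```python
# Rule-of-thumb latency math from above: the practical floor on transport
# latency is roughly 3-4x the network round-trip time. Ballpark figures only.

one_way_ms = 50            # a well-behaved cellular modem, ~50-60 ms one way
rtt_ms = 2 * one_way_ms    # ~100 ms round trip, as guesstimated above

floor_ms = 3 * rtt_ms
comfortable_ms = 4 * rtt_ms

print(floor_ms, comfortable_ms)  # 300 400 -> 300 ms glass-to-glass is plausible on a good network
```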
The HE4000 goes up to 200 megabits per second; the smaller portable PRO units can go up to 20 megabits per second. The UHD is 4:2:2 10-bit, which is part of the high quality. It's not 4:2:0, it's not eight-bit. The color depth, the video quality, the bits are there to give you a superior picture.
I touched on this before. How is this Gen-lock maintained? This IEEE 1588 or what is better known as Precision Timing Protocol … Basically, timing signals are sent through the network so that all the appliances in the Haivision ecosystem are all locked to the same frame reference. What helps on some of the transmitters is they have a GPS receiver. What’s the world’s most accurate clock? A lot of times we synchronize our Master Control Clock to GPS. That’s the frame reference.
If we’re trying to lock to the studio, the transmitters a lot of times are pretty close, either through the time stamp from the cellular network or even better from the GPS network. We’re not completely in left field when a transmitter lights up and tries to talk to the StreamHub Receiver and Master Control. The timing is probably pretty close, but then we get it down to be frame accurate, lip sync accurate across that network.
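For anyone curious what a PTP-style exchange actually computes, here's the textbook IEEE 1588 offset-and-delay arithmetic in a few lines. This is the generic standard mechanism, shown only to illustrate how two boxes can agree on one clock; Haivision's own implementation inside SafeStreams is theirs and isn't shown here.

```python
# Textbook IEEE 1588 (PTP-style) offset/delay estimate, shown only to
# illustrate how devices agree on one clock. Haivision's own mechanism
# inside SafeStreams is proprietary and not shown here.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """
    t1: master sends Sync          (stamped by the master clock)
    t2: slave receives Sync        (stamped by the slave clock)
    t3: slave sends Delay_Req      (stamped by the slave clock)
    t4: master receives Delay_Req  (stamped by the master clock)
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # how far the slave clock is off the master
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # estimated one-way path delay
    return offset, delay

# Example: the slave clock runs 5 ms ahead and the path delay is 50 ms each way.
print(ptp_offset_and_delay(t1=1000.0, t2=1055.0, t3=1060.0, t4=1105.0))  # (5.0, 50.0)
```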
Here's a little breakdown. Of course, guys, if I'm going a little fast, I'll email everyone a copy of the PowerPoint when we're done. Here's a typical transmitter. This could be the HE, the rack-mount unit; it could be the portable unit, the small AIR unit, a laptop, whatever. Basically, here's the hierarchy. The most important thing is the Control Channel. If we lose control, the mechanism's gonna fall apart. That's the most important. We want to make sure those packets get through, that the decoder can talk to the encoder. That's at the top of the stack.
You see here, actually above the control, is the ARQ for the Control Channel, the retransmission of any control packets that get lost. Those two are at the top, just to show the priority of that. Then there's other data, payload buffering, moving files or statistics through. Then we get into audio, with its own ARQ for dropped audio packets along with audio buffering, and then … actually, video is last. That surprised me.
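Here's a toy scheduler that reflects that hierarchy: control and its retransmissions go first, then other data, then audio, with video last. The ordering comes from the slide I just described; the queue itself is purely illustrative.

```python
# Toy scheduler reflecting the priority hierarchy just described: control and
# its ARQ first, then data, then audio, with video last. Purely illustrative.

import heapq

PRIORITY = {"control_arq": 0, "control": 1, "data": 2, "audio_arq": 3, "audio": 4, "video": 5}

class PacketScheduler:
    def __init__(self):
        self._queue = []
        self._seq = 0   # tie-breaker keeps first-in-first-out order within a priority level

    def push(self, kind, payload):
        heapq.heappush(self._queue, (PRIORITY[kind], self._seq, kind, payload))
        self._seq += 1

    def pop(self):
        _, _, kind, payload = heapq.heappop(self._queue)
        return kind, payload

sched = PacketScheduler()
for kind in ["video", "audio", "control", "video", "control_arq"]:
    sched.push(kind, b"...")
print([sched.pop()[0] for _ in range(5)])
# -> ['control_arq', 'control', 'audio', 'video', 'video']
```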
Any of you guys out there that are audio guys will know this: the human ear is far less forgiving of flaws, pops, clicks, or dropped packets. The human eye, visually, we extrapolate a lot. There's a lot of integration going on. Our brain can't process all the pixels we see in our environment, so we interpret things or anticipate things. I think there's a lot of neuroscience behind this. I'm not surprised that video is actually last and audio is second to the Control Channel, because audio is one of those things we don't notice until something goes wrong, until there's a pop or a click. Audio is an important element.
In theatrical movies, sometimes the score says more than the visual. You know something spooky is about to happen from the soundtrack. That's kind of the logic behind this. All of that is load balanced and the packets are scheduled: which packets go out first, how do we synchronize those packets. You see network statistics; the decoder is telling the transmit mechanism which of these channels is behaving properly. A lot of times people think with communications, particularly cellular, that bandwidth is the problem. Usually the first problem is latency. On the dashboard on our units, you'll see a modem jump up to 3,000 milliseconds and then settle back down. Sometimes there's these fluctuations.
Latency is usually the first parameter or the first problem area, then of course bandwidth. If the pipe has gone away, then latency becomes irrelevant because you've lost the connection. The unit is doing a delicate dance across all these connections and it really doesn't care if the connections are cellular modems or satellite or Wi-Fi or just any form of IP; it seamlessly manages all of this.
Here's how the transport stream is made up. It's Real-time Transport Protocol, a UDP stream wrapped in RTP. The StreamHub receiver acts as the master clock. This gets into a little bit of the Precision Timing Protocol: one device on the network decides it's gonna be the master. In the case of Haivision, it will usually be one of the StreamHub receivers in your Master Control. If a StreamHub should be disconnected from the network for some reason, a second unit will say, "Whoa, where did the master clock go? I don't see it out there." It'll poll the network, "Where's the master clock?" If nobody answers, it says, "Hey guys, if it's okay, I'm gonna be the clock now", and it takes over.
The StreamHub, it kind of dovetails into what I said earlier, how the StreamHub controls everything. It also acts as the grand master clock for the Precision Timing Protocol. As I said earlier, it’s RTP over UDP. That’s the stream payload, and in that payload is the audio, the video, the data, and the control. The RTP, which is the whole envelope, the whole transport stream, that gets encapsulated in this proprietary SafeStreams. The video, the audio, the data, gets packetized as a UDP stream, which then gets put into an RTP transport, which then gets put into another encapsulation of SafeStreams.
It’s like UDP inside of RTP inside of SafeStreams. This is what makes it so bulletproof to go through unmanaged networks or cellular networks or the public internet or satellite or whatever. It’s the Forward Error Correction and the ARQ or the Automatic Re-Request across these 12 IP connections, that’s where the magic all happens.
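Purely as an illustration of that layering, here's a sketch that builds a media payload, wraps it with a standard 12-byte RTP header, and then wraps that again in a made-up outer envelope standing in for SafeStreams. The RTP header layout is the real RFC 3550 one; the outer header fields are invented, since the actual SafeStreams wire format is proprietary.

```python
# Illustrative layering only: media -> RTP -> outer transport envelope.
# The RTP header is the standard 12-byte layout; the outer envelope is a
# made-up stand-in, since SafeStreams' real wire format is proprietary.

import struct

def rtp_packet(payload: bytes, seq: int, timestamp: int, ssrc: int, pt: int = 96) -> bytes:
    header = struct.pack("!BBHII",
                         0x80,        # V=2, no padding/extension/CSRC
                         pt & 0x7F,   # payload type
                         seq & 0xFFFF,
                         timestamp & 0xFFFFFFFF,
                         ssrc)
    return header + payload

def outer_envelope(rtp: bytes, link_id: int, fec_group: int) -> bytes:
    # Hypothetical outer header carrying per-link and FEC-group info.
    return struct.pack("!BBH", link_id, fec_group, len(rtp)) + rtp

media = b"\x00" * 188   # one MPEG-TS cell worth of media, as a stand-in payload
wire = outer_envelope(rtp_packet(media, seq=7, timestamp=90000, ssrc=0x1234), link_id=3, fec_group=1)
print(len(wire))  # 4-byte outer header + 12-byte RTP header + 188-byte payload = 204
```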
That's it, guys. I know I went over an hour there. Hopefully you guys found this informative. If you guys have any questions, let me stop the PowerPoint; I want to put something up for you guys. I have some links here. We see a big part of our business as consulting, so if you guys have an active project, of course we do bonded cellular and encoding, but we also do enterprise IPTV systems with digital signage, and fiber optics. I'm sure you guys are familiar with our website.
The link I just sent to you guys takes you to my bio page on the VidOvation website. On the right-hand side is a mechanism to book a meeting with me, if you guys have a project that you want to discuss or if you have any deep questions about what we discussed today. I don't see any questions coming up. Everyone who registered will get a copy of the PowerPoint, and if you guys have any questions or comments, certainly reply back.
Here's another link. If you guys want to demo or learn more about the Haivision ecosystem, at this link you can make a request for a demo. No pressure; I'm not turning this into a sales pitch. I'm a firm believer in marketing through education, so hopefully you guys found this session today educational. If there's any questions, I can stay on the line for a few more minutes. I guess that's it, guys. If you guys think of something in the future, certainly reach out to me. Our sales team, my colleague Rick Anderson, our VP of Sales, he's very knowledgeable.
You can call the Sales Department if you need pricing information. Certainly feel free to reach out to me for more of the technical questions and I would love to do a 30-minute engineering consultation if you’d like.
Thank you all very much and thank you for tuning in today. I hope you all have a great weekend and let me know what you thought of the presentation today. I appreciate your feedback. Thank you all so much and have a great day and great weekend. Bye-bye.