Signal Transport: IP, 5G, RF, and Bonded Cellular

PRO380 Bonded Cellular Field Encoder

There are more options than ever for getting a signal from point A to point B (and even C, D, and E) and transport technologies continue to evolve rapidly. Signal transport service and technology providers discuss some of the latest developments that can make a difference to your production.

Speakers:

Janel Fleming, LiveU, Director of Sales and Sales Group Manager, Sports
Jim Jachetta, VidOvation, CTO and Co-Founder
Peter Larsson, BSI, President

Moderator:

Ken Kerschbaumer, SVG, Co-Executive Director, Editorial Services

Video Recording:

Learn more about At-Home Production (REMI) using Cellular & the Public Internet

Transcript:

Ken Kerschbaumer (00:00):
Peter, Jim, Janel, good to see you all. How are you all doing? How are you holding up?

Janel Fleming (00:04):
Good.

Peter Larsson (00:05): We’re good.

Ken Kerschbaumer (00:05):
Okay. I think it’s day 654 of the COVID pandemic. We laugh now, but it may well keep going that way for another three months, five months. Anyway, good to see you all. Our goal here is to talk about some of the trends in transport technologies. Peter, I’m going to start with you, because you are a hall of famer when it comes to transport technologies, with the work you’ve done with wireless cameras over the years. And I want to start with an overview of RF and then what you see as some of the challenges, because obviously the spectrum situation isn’t getting any better out there. So give us an update on where things stand right now.

Peter Larsson (00:43):
Well, the way I’ve always looked at the RF environment we work in is that there are effectively three different levels. There’s a single RF camera that can be used for a news story or a sporting event. Then there could be two or three cameras. And then you get to some of the bigger car races or golf, where you have 40 systems. And they each take different technologies. A lot of the LiveU or bonded cellular systems are incredible for the single-camera situations, specifically for news and some of the sporting events. It’s when we start getting up into those 40-plus cameras for golf and car racing that we start running into issues. And unfortunately, although we think our marketplace is the most important and the biggest one in the world, the biggest one is actually the cell phone systems, which dwarf us.

Peter Larsson (01:40):
They have all the money, and they’re coming in and buying up all the spectrum. So, as we try to find more spectrum to do what we need to do, we’re finding that we have to go higher and higher in spectrum. And one of the issues with that is that where you might’ve gotten away with one or two receive sites down in the lower spectrum, as you get higher, you have to put more and more receive sites out there. So, the available money is going down, but the cost of operating is going up. And that is about the biggest issue we face at the moment. One of the bigger ones also is wireless microphones. With the repacking of the TV stations, a lot of the UHF spectrum is going away. I know SVG is working with the FCC as well to try and come up with solutions. But again, we are a small little player in that marketplace as opposed to the cell phone companies.

Ken Kerschbaumer (02:41):
Yeah. Yeah, I know that there have been some recent filings from Shure and Sennheiser on the wireless microphone front, and we will be getting involved to help support those as best we can.

Peter Larsson (02:51):
It’s all good.

Ken Kerschbaumer (02:53):
So then what’s the long-term implication? Is it going to just get more complicated and more difficult? Is there a bigger threat here at all, potentially long term?

Peter Larsson (03:03):
There are ways out of it, but they all cost money, which is the problem. So, yeah, as the technology is becoming more important, or as digital technology is maturing, everyone is going after systems with multiple inputs and outputs, which allows the signal to be two-way. The encoders and decoders are becoming more efficient, with better quality, and that’s reducing the bandwidth of the video that you require. So there are ways out, but as you’ve heard it described, nothing happens all of a sudden. It’s just a very slow transition, and then we fall off the cliff and a brand-new technology comes out that helps everything move along.

Ken Kerschbaumer (03:51):
Right, right. Excellent. So, Janel, I wanted to bring you in here to discuss the Super Bowl. Peter mentioned some of the smaller systems and the role they play for those smaller demands, and obviously I think the LU800 is stepping in there, right? With multi-input needs. So, can you talk through some of the workflows we saw at the Super Bowl, and then maybe some of the other events coming in the spring in Florida, potentially?

Janel Fleming (04:15):
Mm-hmm (affirmative), yeah. So, we’re seeing obviously a large demand for REMI production, because the need to reduce on-site staff is increasing the need to do more from a central location. We deployed the LU800 late last year; we had it in development before COVID hit. So we were already thinking about it in terms of customers going into venues where the cameras are wired into an I/O panel, and this is an easy solution: just roll up with a few encoders and be able to produce a game remotely. So, at the Super Bowl we had a ton of use cases, obviously not the game itself, but all of the stuff leading up to the game and throughout the weekend, where they couldn’t roll their traditional production truck. They had limited personnel who were allowed to travel, because they had to quarantine two weeks prior to that.

Janel Fleming (05:05):
And that displaced a lot of key people who do other things in their network. So between Fox Sports and Fox Deportes, they did REMI productions of shows on site. CBS Sports Interactive exclusively used LiveU for all of their digital programming throughout the weekend. And we saw the LU800 used in one location more than we’ve seen it anywhere else. So, it was a rewarding thing for us to see. We knew this need for a multi-input, single-device unit would be huge, especially having bonded cellular, because connectivity was very much in question. Where are you going to be located? Where do you do your studio show? So there was confidence that the network would be there when they rolled up. It was very exciting. And then, of course, we’ve got some big news come spring training. As many people know, we have done a lot for baseball throughout the season, but spring training is going to be really exciting for us this year, because we’re going to be used more than we’ve ever been used in the past. So-

Ken Kerschbaumer (06:07):
Excellent.

Janel Fleming (06:08):
… more to come.

Ken Kerschbaumer (06:09):
Great. That’s great. So, Jim, I want to bring you in here on at-home productions. Obviously, they’ve grown in popularity, no doubt, over the last year or two. And there are issues when it comes to things like frame [inaudible 00:06:23] or genlock and lip-sync, especially because these productions are getting more complex. So, can you discuss that for a little bit? Peter and Janel, you can chime in as well, but Jim, from your perspective first.

At-home Production Multi-Venue

Jim Jachetta (06:33):
Yeah, I mean, to Peter’s point about bonded cellular, and I’m sure Janel would agree, in the early days the technology was really meant for news: a single camera, either news talent and a cameraman, or sometimes the talent was the camera operator. We’ve come a long way from that. And to Janel’s point, bonded cellular now works really well for multi-camera at-home production, REMI production. We do have devices that have multiple inputs, up to four cameras in one appliance. But where our customers are using our technology is with camera-mounted units all over the field. They could be all over the golf course; the PGA, Golf Channel, and NBC are using our tech. And the cameras are not tethered to a single box. The field encoder mounts on the camera.

Jim Jachetta (07:33):
And we’re able to maintain frame-accurate genlock on mobile, portable cameras, similar to an RF workflow that BSI would employ. Or some of our customers use a hybrid approach: they might use BSI for the RF to the truck, switch the show in the truck, and then use bonded cellular out of the truck. But we’ve been promoting, for more than five years, even before COVID, the idea of fully autonomous, fully portable cameras. We’re able to do frame-accurate genlock. So, the PGA will deploy up to 12 cameras on some events, and we can keep them all in perfect genlock. They could have 30 microphones open at the same time, and if they were out of sync with each other, the show would be horrible. In a live event, as we all know, you can’t fix timing issues in post production, because there is no post production. So, VidOvation and our partner Haivision use the Safe Streams Transport protocol, where we do a form of precision timing protocol over an unmanaged network to keep all the field encoders, and all the cameras, in sync.
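Haivision’s SST timing mechanism is proprietary, so the snippet below is only a rough sketch of the general idea described here: each field encoder stamps frames against a shared reference clock, and the receiver buffers everything behind a fixed alignment delay and releases frames in capture order. The Frame structure, class names, and 500 ms delay are invented for illustration.

```python
# Conceptual sketch only: the real SST timing protocol is proprietary. This shows
# the general idea of frame alignment at a receiver, assuming each field encoder
# stamps frames against a shared reference clock (e.g. GPS/NTP-disciplined).

import heapq
import time
from dataclasses import dataclass, field

ALIGNMENT_DELAY = 0.500   # seconds of buffer to absorb cellular jitter (assumed)

@dataclass(order=True)
class Frame:
    capture_ts: float            # timestamp applied at the camera/encoder
    camera_id: str = field(compare=False)
    payload: bytes = field(compare=False, repr=False)

class AlignedReceiver:
    """Buffers frames from many encoders and releases them in capture order."""

    def __init__(self) -> None:
        self._queue: list[Frame] = []

    def ingest(self, frame: Frame) -> None:
        heapq.heappush(self._queue, frame)

    def release_ready(self, now: float) -> list[Frame]:
        """Emit every frame whose capture time is at least ALIGNMENT_DELAY old.

        Frames from different cameras captured at the same instant come out
        together, which is what keeps switched cuts and audio in step.
        """
        ready = []
        while self._queue and now - self._queue[0].capture_ts >= ALIGNMENT_DELAY:
            ready.append(heapq.heappop(self._queue))
        return ready

# Example: frames from two cameras arrive out of order but leave aligned.
rx = AlignedReceiver()
t0 = time.time() - 1.0
rx.ingest(Frame(t0 + 0.033, "cam-2", b"..."))
rx.ingest(Frame(t0, "cam-1", b"..."))
rx.ingest(Frame(t0, "cam-2", b"..."))
for f in rx.release_ready(time.time()):
    print(f.camera_id, round(f.capture_ts - t0, 3))
```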

No Production Trucks

Eliminate or reduce the need for Multiple Production Trucks

Jim Jachetta (08:53):
And Peter said 40 cameras; we have one customer that deploys 50 cameras. Now, they’re not all in the same location, so frame accuracy becomes less important if you’re in different venues, different cities. But within a given venue, there could be dozens of cameras all shooting the same scene, and 30 or 40, 60, 100 microphones all on the same event. And they all have to be in perfect genlock. And we help our customers achieve that.

Ken Kerschbaumer (09:26):
That’s so great. Peter, Janel, any thoughts on that whole need for timing and getting things looking perfect?

Janel Fleming (09:33):
Yeah. I mean, you can’t do a production without the camera signals being in sync. You can’t cut from one camera to the other if they’re out of step. So, LiveU has had our wireless at-home production for six-plus years. We cover most of the American Athletic Conference games. We’ve done NESN spring training games, we’ve done off-road rallies and motorsport events, and a lot of tier-two, tier-three sports, polo for ChukkerTV. And the beauty of the wireless solutions, somewhat like what Jim said, is that you don’t have to wire cameras to a centralized place, because cabling can oftentimes be the most expensive part. So, having that ability to be married to a camera operator, whether it’s mounted on the camera or in a backpack or whatever way the configuration works, having these synced across the board, and being able to easily deploy from any location the sporting event is at, is critical. And of course, bandwidth is very important.

Peter Larsson (10:33):
Yeah, I agree that REMI has been gaining traction. And then when COVID came along, it just really, really took off. And whether it’s done by compressing the video and sending it off on fiber or using the cell phone system, it certainly is the wave of the future for some shows.

Ken Kerschbaumer (10:55):
Mm-hmm (affirmative), mm-hmm (affirmative). So, Peter, I want to have you kick off the next section, which is about the impact of cloud-based production, right? People are figuring out how to get signals directly from your systems into the cloud and have the production done there. There are still lots of issues with that happening, and that’s even more timing issues, I’m sure. But have you investigated cloud-based production as far as where your systems come in? And then [crosstalk 00:11:23].

Peter Larsson (11:23):
To a certain extent. I wouldn’t actually call it true cloud production, but with a lot of car racing and a lot of golf now, because we use a much more regular microwave system, you have to compress the video down, versus going into the cell phone systems. Compress the video down, and then we decompress in a trailer. But we’re also sending it out over the internet back to wherever the REMI is being produced, and then we send decoders out there as well. So, it’s not truly cloud, but it is sent via the internet to aid in the REMI productions.

Ken Kerschbaumer (12:03):
Right, right. [inaudible 00:12:04] rumblings of things like let’s do a NASCAR race in four years on the cloud? Is that on the horizon potentially?

Peter Larsson (12:13):
I think everyone is always looking for that. And again, this is just purely my view on what’s going on. The whole REMI system has to hit that tipping point, because at the moment, you don’t have enough of the staff that it takes to operate the REMI in one specific city. So, if you’re, say, using Pittsburgh as your REMI source, you still might have to travel in the [TDs 00:12:44], the audio guys, the production people, et cetera. So, you’re getting closer, but you aren’t quite hitting that financial saving by doing REMI. The big thing right now is COVID, and COVID has really pushed the whole REMI thing forward, because it’s cheaper or safer to move people to that one location. You can practice social distancing, et cetera, get your show done and do it in a safe manner.

Ken Kerschbaumer (13:13):
Sure.

Peter Larsson (13:13):
And the goal will be to do it in a cheaper manner.

Ken Kerschbaumer (13:16):
Yes. Indeed, indeed. Yeah, Janel, what’s your take on cloud-based switching at LiveU and-

Janel Fleming (13:21):
Yeah, I mean, this has been a recurring theme. This is something our customers have certainly been inquiring about the most, because as soon as the pandemic hit, it was: how do I leverage what we have that we can’t access right now? And what are the alternatives? What’s out there? So, all of the developments in the cloud, I think, got accelerated a lot. And so, our response was really building our solutions to be agnostic and able to hit a physical server, a software switcher, or something in the cloud hosted by us. And then recently, we’ve launched selling essentially our MMH software in the virtual private cloud of somebody’s Amazon or Azure stack. That way, you can utilize tools like [VRT 00:14:09] Vector and SRT, or anything within your AWS stack or your system. You can lower the latency and leverage NDI in the cloud, which is really, I think, the key element here: you’re going to want extremely high quality and extremely low latency, and not have to sacrifice too much by transitioning into a cloud production environment.

Janel Fleming (14:32):
And whatever way we can reduce the latency, that was our goal. So, we built out a software solution to build into your cloud infrastructure. We also enabled our cloud servers and physical servers to stream RTMP with a timestamp, to help make sure there’s frame accuracy on any cloud switching platform, because there are many, right? There’s the lower-tier sport that’s going to look at a more budget-conscious environment, and then the bigger-budget companies are going to look for Grass Valley AMPP, something that’s a little bit more robust and turnkey, for whatever sporting event they’re going to be leveraging. So, it’s been our goal recently to make sure that when you’re using LiveU, it’s a seamless integration into whatever cloud environment you’re trying to get to.

Ken Kerschbaumer (15:18):
Sure, sure. So, Jim, what’s your take on the cloud and how your company is evolving around this [crosstalk 00:15:24]?

Jim Jachetta (15:23):
I mean, to Janel’s point, we’re doing virtually identical things. One question we get asked a lot: “Well, why can’t I hook my field encoder directly to vMix or directly into GVM?” LiveU’s the same way. You need to hit either a physical server or a software server in the cloud to put the video back together. With bonded cellular, as we all know, with our solution we take the video stream and spread it over eight cellular modems, plus LAN connections and WiFi. The LAN connections could be satellite, they could be fiber, they could be public internet. So, LiveU has to do the same thing, put all those bits back together. We do the same thing. The Haivision software is available on a physical server with SDI outputs. But a lot of our customers are not doing traditional broadcast. Major League Fishing, for instance, is strictly a cloud solution.
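Neither LiveU’s protocol nor Haivision’s SST is publicly documented, so the following is only a conceptual sketch of the bonding idea described here: slice the encoded stream into sequence-numbered packets, spread them across whatever links are up (cellular modems, LAN, WiFi), and let the receiving server put the bits back together by sequence number. The link names, packet size, and data are made up.

```python
# Minimal sketch of the bonding concept; the real LiveU / Haivision protocols
# are proprietary. Link names and the chunk size below are illustrative only.

from itertools import cycle

LINKS = ["modem-1", "modem-2", "modem-3", "modem-4",
         "modem-5", "modem-6", "modem-7", "modem-8",
         "lan-fiber", "wifi"]

def split_stream(data: bytes, chunk: int = 1316):
    """Slice an encoded video stream into sequence-numbered packets and
    round-robin them across every available link."""
    link = cycle(LINKS)
    for seq, offset in enumerate(range(0, len(data), chunk)):
        yield {"seq": seq, "link": next(link), "payload": data[offset:offset + chunk]}

def reassemble(packets):
    """The receiving server (physical or cloud) puts the stream back together
    by sequence number, regardless of which link carried each packet."""
    ordered = sorted(packets, key=lambda p: p["seq"])
    return b"".join(p["payload"] for p in ordered)

stream = bytes(10_000)                    # stand-in for encoded video
packets = list(split_stream(stream))
packets.reverse()                         # simulate out-of-order arrival
assert reassemble(packets) == stream
print(f"{len(packets)} packets spread over {len(LINKS)} links and reassembled")
```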

Jim Jachetta (16:31):
So, the show is actually switched with a physical production truck, but then it’s piped over bonded cellular from the [bass 00:16:42] fishing boat to a truck on land, and then from the truck, using bonded cellular and public internet, up to the cloud. So the possible combinations of how the workflow rolls out: it could be fully at-home production, it could be all cloud, it could be a hybrid. It really depends on the application. And to Janel’s point, the transport from the field encoder to the decoder or decoding software is usually proprietary. In the case of Haivision, it’s the Safe Streams Transport, SST; LiveU has their own protocol. But in order to hook to the rest of the world, we support NDI, we support SRT, we can do RTMP. One important thing in our software is transcoding. Not every platform supports H.265, or HEVC. So, we may come into our receiving software, our physical receiver, in HEVC, but Facebook doesn’t want HEVC right now. They still want H.264, or they want a certain bit rate, or they want constant bit rate, not variable bit rate. So we transcode. Those little extra things, transcoding and changing the transport protocol, make it compatible with all the different flavors of cloud vendors and maximize interoperability.
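The transcoding described here happens inside the receiving software; purely as an illustration of that step, the sketch below shells out to ffmpeg (which must be installed, and built with SRT support for the example input) to turn an HEVC contribution feed into constant-bit-rate H.264 pushed over RTMP. The SRT listener address and RTMP ingest URL are placeholders, not real endpoints.

```python
# Illustration only: the receiver's built-in transcoder is not shown here. This
# sketch performs the same step with ffmpeg; URLs and the stream key are fake.

import subprocess

def transcode_hevc_to_h264_rtmp(src_url: str, rtmp_url: str, kbps: int = 6000) -> None:
    cmd = [
        "ffmpeg",
        "-i", src_url,                 # incoming HEVC contribution feed (e.g. SRT)
        "-c:v", "libx264",             # platform wants H.264, not HEVC
        "-b:v", f"{kbps}k",            # target bit rate
        "-minrate", f"{kbps}k",        # pin min/max to approximate constant bit rate
        "-maxrate", f"{kbps}k",
        "-bufsize", f"{2 * kbps}k",
        "-c:a", "aac", "-b:a", "128k", # re-encode audio for the FLV container
        "-f", "flv", rtmp_url,         # RTMP ingest expects FLV
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_hevc_to_h264_rtmp(
        "srt://0.0.0.0:9000?mode=listener",           # placeholder SRT listener
        "rtmp://live-api.example.com/app/STREAM_KEY"  # placeholder ingest URL
    )
```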

Ken Kerschbaumer (18:17):
Awesome. Great. So, Peter, I wanted to bring you in to kick off the next segment, [inaudible 00:18:22] control of signals, the camera parameters, and the developments over the years in the level of control you have of these remote camera systems. That’s been job one for you, clearly. So, what’s going on as far as the evolution of making sure there is more remote control of these systems, so that people back at the head end can dial in color and make sure video shading is proper? All that kind of stuff.

Peter Larsson (18:52):
As I mentioned a bit earlier, the bi-directionality of the microwave systems is now really starting to take off, which is great. One of the big things all along has been to get the cameraman program video. Especially with the graphics-heavy shows that are out there now, they need to be able to see where all the graphics are so they can frame the shot properly. So with the bi-directionality now, you can get program video back very easily. You can even get the intercom back if you want to add that path. But more importantly, you can do full CCU control of the camera, which is very critical with the way these shows are put together now. In the [inaudible 00:19:36] UHF signal for both the communications and the data. And just by putting in a specific code per camera, you could run 20, 30, 40 cameras off a single UHF control channel. But with the bi-directionality, all of that is becoming a fairly simple system now.
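The actual UHF control protocols are vendor-specific, so this is only a sketch of the addressing idea Peter describes: every camera hears the same control channel but acts only on commands carrying its own code. The message fields and parameter names are invented.

```python
# Sketch of the shared-control-channel idea: every camera hears the same
# channel, but only acts on commands tagged with its own code. The message
# format and command names here are invented for illustration.

from dataclasses import dataclass

@dataclass
class CCUCommand:
    camera_code: int      # the "specific code per camera"
    parameter: str        # e.g. "iris", "black_level", "white_balance"
    value: float

class Camera:
    def __init__(self, code: int) -> None:
        self.code = code
        self.settings: dict[str, float] = {}

    def on_control_channel(self, cmd: CCUCommand) -> None:
        """Every camera receives every command; only the addressed one reacts."""
        if cmd.camera_code == self.code:
            self.settings[cmd.parameter] = cmd.value

# 40 cameras listening to one UHF control channel
cameras = [Camera(code) for code in range(1, 41)]

def broadcast(cmd: CCUCommand) -> None:
    for cam in cameras:       # shared channel: everyone hears it
        cam.on_control_channel(cmd)

broadcast(CCUCommand(camera_code=7, parameter="iris", value=2.8))
broadcast(CCUCommand(camera_code=23, parameter="black_level", value=0.02))
print(cameras[6].settings)    # {'iris': 2.8}
print(cameras[22].settings)   # {'black_level': 0.02}
print(cameras[0].settings)    # {} -- untouched, command wasn't addressed to it
```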

Ken Kerschbaumer (19:57):
Excellent, great. Janel, from LiveU’s perspective? Camera control and all that stuff?

Janel Fleming (20:03):
Yeah. So, basically, we’ve seen this ask a lot recently. Our units have always been transmitting video back to a control room environment; that was the traditional use case. But now, with the deployment of PTZ cameras and golf looking to do auto iris and camera shading, there’s a need for a secondary point-to-point data connection to control some of this IP equipment that’s out in the field. So, we rolled out a new feature called IP Pipe, which is essentially an added layer to the connection between your field devices and your server. And the intention is really to be able to control cameras remotely and not necessarily need to have someone there. So, with a PTZ camera, you can set it and forget it. You don’t have to go through the [inaudible 00:20:49] complications of working with whatever customer network is available at the time. To be able to control that camera, you hook it into the unit alongside the SDI path and control it. And the same goes for other IP infrastructure, like low-bandwidth comms and that sort of thing. So, we’re seeing it used a lot in golf especially, because camera shading is a big need for some of our customers: there are sunrises, and they need to be able to fix the lighting and the coloring. So, it’s been a huge help for customers, being able to solve for that need.
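LiveU has not published how IP Pipe works internally; the sketch below only illustrates the general point-to-point relay pattern described here, where a forwarder on the field side accepts control traffic arriving over the bonded link and relays it, in both directions, to a PTZ camera on the venue LAN. The addresses and port numbers are placeholders.

```python
# Conceptual sketch only: IP Pipe's internals aren't public. This shows the
# general pattern of a point-to-point relay that forwards remote camera-control
# traffic to a PTZ camera on the venue LAN. Addresses are placeholders.

import socket
import threading

PTZ_ADDR = ("192.168.1.50", 5678)   # placeholder: PTZ camera's control port
LISTEN_ADDR = ("0.0.0.0", 5678)     # where the field unit accepts control traffic

def pump(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until either side closes."""
    try:
        while data := src.recv(4096):
            dst.sendall(data)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def serve() -> None:
    with socket.create_server(LISTEN_ADDR) as listener:
        while True:
            remote, _ = listener.accept()                # control-room side
            camera = socket.create_connection(PTZ_ADDR)  # venue-LAN side
            # Full-duplex relay: commands down, status/tally back up.
            threading.Thread(target=pump, args=(remote, camera), daemon=True).start()
            threading.Thread(target=pump, args=(camera, remote), daemon=True).start()

if __name__ == "__main__":
    serve()
```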

Ken Kerschbaumer (21:25):
Excellent. And, Jim?

Jim Jachetta (21:28):
Yeah. To Janel and Peter’s point, that is a big challenge. Some of our customers may deploy a smaller truck and choose to do shading on site. In those applications, we might use our partner ABonAir. They have a microwave solution that does high-res video, 4K video now, camera control, intercom, and seven milliseconds of latency. So, it’s half a frame, sub-frame latency. LiveU, Haivision, half a second is the lowest latency we can do. I’ve done a little amateur video engineering myself; I volunteer at the local church. And even the slightest bit of latency makes it very hard to shade a camera. If the video arriving in the truck or master control or the cloud, whatever the workflow is, is already half a second old, the camera may have already moved to a shadow, and now you’re opening the iris, so you’re chasing it.

Jim Jachetta (22:42):

Haivision CyanView Camera Control

So, there can be challenges. We have a microwave solution that addresses that, but that’s meant more for having a truck onsite. With the bonded cellular approach, we have a data bridge function in the Haivision technology, similar to what Janel was speaking about. We’ve had that for quite some time. But some camera control systems don’t like more than 10 or 20 milliseconds of latency. They time out on ping hiccups, even a few dropped packets, let’s say. So we’ve partnered with a company that specializes in IP camera control, particularly for specialty cameras, but they work with the big cameras too: your Grass Valleys, your Panasonics, and your Sonys. And the company is CyanView. What CyanView does is, you have your CCU, your controller, in the control room, but then there’s a little box on the camera that emulates the controller.

Jim Jachetta (23:44):
So, if the connection is interrupted or there’s too much latency between the real physical controller that the video engineer is operating and this pseudo controller on the camera, it keeps the camera happy. The camera thinks it’s connected all the time. The real physical controller talks to the little box and keeps everything happy, so if there is a little bit of a hiccup, it smooths that out. And the tech works really well over an unmanaged network, like the public internet or cellular. Or you could use it over an RF link: if you don’t have camera control integrated in your RF link, you could do a data RF connection and use it that way. So, that’s some of what we’re rolling out to our customers, helping them with this camera control and shading issue.
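CyanView’s actual protocol is proprietary and the code below is not it; it only illustrates the proxy pattern Jim describes, where a small on-camera agent caches the last state received from the real panel and keeps answering the camera’s polls during link hiccups. The 250 ms staleness threshold and parameter names are assumptions.

```python
# Sketch of the on-camera proxy pattern; not the real CyanView protocol. The
# idea: the box on the camera always answers the camera, even when the link
# back to the real control panel hiccups.

import time

class OnCameraProxy:
    """Emulates the controller locally so the camera never sees a dropout."""

    LINK_TIMEOUT = 0.250   # assumed: treat the WAN link as stale after 250 ms

    def __init__(self) -> None:
        self.last_state = {"iris": 4.0, "black_level": 0.0}  # last known good
        self.last_heard_from_panel = time.monotonic()

    def update_from_panel(self, state: dict) -> None:
        """Called whenever a packet from the real controller survives the WAN."""
        self.last_state.update(state)
        self.last_heard_from_panel = time.monotonic()

    def answer_camera_poll(self) -> dict:
        """The camera polls constantly; always reply, stale link or not."""
        stale = time.monotonic() - self.last_heard_from_panel > self.LINK_TIMEOUT
        # During a hiccup we simply repeat the last commanded state, which is
        # what keeps the camera "happy" instead of timing out.
        return {"state": dict(self.last_state), "holding_last_value": stale}

proxy = OnCameraProxy()
proxy.update_from_panel({"iris": 2.8})      # video engineer opens the iris
print(proxy.answer_camera_poll())           # fresh value from the panel
time.sleep(0.3)                             # simulate a WAN hiccup
print(proxy.answer_camera_poll())           # same value, served locally
```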

Ken Kerschbaumer (24:34):
Excellent. So, Jim, I know you wanted to talk about analog audio inputs and how they fit into the bonded cellular encoder. So, [crosstalk 00:24:43] that issue.

Jim Jachetta (24:43):
Yes. So, our partner, Haivision, has always had the feature of analog audio input. Most field encoders will take audio from the camera, whether it’s two audios, four audios, et cetera. What Haivision does is allow you to take, say, audio one and two from the camera, and then feed analog audio inputs into three and four. If the camera has analog audio inputs on it, of course, you can do it that way. So, if an operator has a bigger camera that has maybe internal microphones and two audio inputs, maybe one for a shotgun mic, then you can plug the lapel mic from your talent into the third or fourth audio channel. If you have a smaller camera, well, there are a lot of productions now shooting with SLRs, so you don’t have audio inputs.

Jim Jachetta (25:38):
So, you have the audio inputs on the field encoder. I mean, I think of myself as a video guy, but we all know that for every camera video feed, there could be 4, 8, 12, 16 channels of audio; there’s far more audio than there is video. So what some of our customers are doing, like for the PGA Tour, is they’ll use the flagship unit and mount it on the camera, and then they bring comms and mics in on the analog audio inputs. The commentator is on the course, so they’ll have a lapel receiver for the commentator going into channel three. Then one of the golfers has a microphone on; that’ll be going into channel four.

Jim Jachetta (26:32):
Some customers are very savvy, because the camera operators will rotate, they’ll move around. So what happens if the camera operator moves away from the commentator? You can have multiple receivers for each lapel mic on the course or on the production, and the microphone will jump to whichever receiver is closest. Like Live PD: the police officer is driving the car, with lapel receivers in the back seat or in the trunk. But then he gets out of the car, and now his lapel jumps to the photographer following him. Then we all learn from our customers, right? We learned the importance of analog audio from the PGA: apparently with the top trace, the red line that follows the ball through the air, that telemetry goes through an audio channel. I didn’t know this until the PGA.

Jim Jachetta (27:23):
So they’re like, “Oh, wow, that’s really cool. We just plug the top trace telemetry in here and it goes through the analog channel.” Then the PGA was like, “Well, we could rent more of these, but we don’t need video. We just need audio.” So it’s kind of overkill, and the PGA rents the smaller unit instead. Instead of eight modems, it has two modems. And they use the analog audio inputs on it for parabolic microphones, to get shot audio from afar. So it’s a combination of the technology having those analog inputs. I’m sure any customers listening are like, “Hey, I just thought of a new way to use those audio inputs for something.” So, hopefully somebody learned something new, a new application.

Ken Kerschbaumer (28:13):
Oh, excellent. So, Peter, it’s funny, because when you look at the evolution of some of the shows you’ve done for NASCAR, the coolest thing was when the audio got in, right? When you could call in and talk to the drivers and things like that. So, it’s interesting: as much as we love the video and the video images, that communication turns out to be one of the cooler things when it happens. So, what’s your sense of what Jim was just talking about, as far as audio and the future of audio within RF and what you’re working on?

Peter Larsson (28:44):
Well, specifically, as we were saying earlier, with the repacking of the UHF TV transmitters, the actual spectrum that was useful for [inaudible 00:28:56] mics is diminishing. And unfortunately, it’s becoming more and more critical for the show, because we always joke that it’s very easy to get the video out, but the audio is always the hardest part and requires the most amount of work. So whatever you can do to simplify that process and get the audio out quickly is critical. And it’s becoming more and more difficult as time goes by, just because when you look at the number of wireless mics that are out there, between all the local churches and the conference rooms and all that sort of stuff, there’s just a tremendous amount of equipment. And the spectrum, depending on the city that you’re in, is becoming less and less.

Ken Kerschbaumer (29:39):
Sure, sure. So, Janel, I know the LU800 has a bunch of audio channels on it, I guess, to meet these needs as well.

Janel Fleming (29:45):
Yeah. Yeah, when I came on three years ago and started the sports segment of LiveU, it was, “We need to support more embedded audio channels.” We’re being used, and being asked to be used, as a backup to satellite, sometimes as the primary, on linear broadcast channels, and audio channel support is critical to that. So, our LU800 can now support up to 16 embedded audio channels; obviously, double up with more LU800s and you can support more audio channels. So, it depends on the sport and what you’re doing, what the need really is. Golf is very unique, but most sports are looking to say, “Hey, are we mixing onsite? Are we mixing remote? And what can your encoders do to support that?” So I think we’re sort of unique in the market in that. With the REMI productions, we really see every different kind of sport, from in-car cameras with the NASCAR stuff and the G1 Series with the LU300, because you can get that in there, loop in a camera, and get that in-car audio into your production environment. That gives fans… they get excited with the noise. There’s no NASCAR without noise.

Ken Kerschbaumer (30:49):
Right. That’s right.

Janel Fleming (30:51):
So that was a big one. And then, obviously, regional sports networks are producing more at-home productions of high school sporting events in their given region. So, the ability to take as much of that in as a REMI, including doing audio mixing and wiring that into your at-home production set, is critical. So, it’s cool to see us up our audio game at LiveU, and all the bonded cellular companies starting to follow suit.

Ken Kerschbaumer (31:18):
Right. Excellent. Great. So, we can’t have a conversation about signal transport without talking about 5G, just because everybody thinks it’s going to solve every problem in the whole entire world. We heard it mentioned at the Super Bowl yesterday. So, Peter, what’s your take? All three of you can chime in on what your sense is of 5G, its role in the industry, myth versus reality.

Peter Larsson (31:41):
I’ll start off with this: 5G is really a set of rules, and it’s how you implement that set of rules that gets you the product. But people generically just say, “Oh, 5G,” and you’re right, that’s going to solve absolutely everything. The way I’ve always looked at this goes back to the days of NTSC. Now, NTSC could be a VHS tape that your kids have watched 100 times. It could be a three-quarter-inch machine, it could be a two-inch machine, it could be all the way up to a DigiBeta. They all put out NTSC. But look at the range of qualities you’re talking about, from the VHS up to the DigiBeta. And it’s the same with 5G: it depends on the frequency of the implementation and the width of the channel you can actually have available, which determines the number of zeroes and ones that get through.

Peter Larsson (32:33):
So, when everyone hears about 5G, it’s like, “Oh, my God, you can do megabits of data, gigabits of data.” That is true if you’re up around 70 gig. And the downside of the 70 gig is that now, instead of having a cell tower every few miles, you’re going to have to have one every block. So it is economically good to put that in New York City; it’s not economically good to put it in Martinsville, Virginia. So, there has to be a sorting out of what spectrum they’re going to be implementing 5G in. And at the moment, everyone’s just, “Oh, it’s going to change the sporting experience.” And yes, it is. It’s going to change the sporting experience in the download speed. So if you’re sitting there in the stands, you can get out your phone and look at a game being played somewhere else, or whatever you want to do.
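As a back-of-envelope illustration of Peter’s point about going higher in spectrum, the snippet below computes free-space path loss at a few example frequencies. Real urban propagation is considerably worse than free space, and the frequencies and distance are only illustrative.

```python
# Back-of-envelope illustration: the higher you go in frequency, the more
# signal you lose over the same distance, so you need far more receive sites
# or towers. Free-space numbers only; real-world propagation is worse.

import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.45 dB."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

for label, freq_mhz in [("600 MHz low-band", 600),
                        ("3.5 GHz sub-6 5G", 3_500),
                        ("70 GHz millimeter wave", 70_000)]:
    print(f"{label:24s} loss over 1 km: {fspl_db(1.0, freq_mhz):6.1f} dB")

# Every extra 20 dB of loss is a 100x weaker signal, which in practice means
# shrinking the cell: a tower every few miles at low band versus every block
# (or less) at millimeter wave.
```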

Peter Larsson (33:25):
What they’re not really throwing the money into at the moment is the upload speed. And in our specific world, you’re going to need more and more upload speed to do what you need to do. Now, I know sports venues are getting a lot more interesting with their WiFi: they have very narrow antennas that point down and only hit up to five rows of seats at a time, et cetera, et cetera. But one of the issues we run into whenever we use a bonded cellular style system, and again, it’ll look great in 99% of the venues you’re going to go to, but you go down to somewhere like Charlotte Motor Speedway, you get 190,000 people in the stands, and they all have one of these things. And it really causes a lot of issues with spectrum availability. So, as you’re doing these more and more critical shows, you need to have control of the spectrum that you’re using to make sure you can get the video through.

Peter Larsson (34:26):
So, yeah. We have this constant battle at work where people are saying, “In 10 years’ time, we’ll just be writing T-Mobile a check and we’ll be using all their spectrum.” We may get to that point. But as I said earlier, in our industry, we don’t have the financial backing to go and fight the big cell phone companies. So if it gets to that level, that’s what we’re going to have to do. But for the foreseeable future, we do like having control over the spectrum that we have.

Ken Kerschbaumer (34:58):
Great. Janel, from your perspective, I’m sure you must get this question all the time. Right?

Janel Fleming (35:03):
Oh, yeah. I mean, we’re following the carriers closely with their 5G network rollouts. Specifically, what our units will be compatible with right out of the gate is all the sub-6 5G networks that are continuing to be rolled out, which means less of a need for the small cell tower locations, but there’s still a need, right? So, I think the 5G upload speed is a great point that Peter made. It’s not something that gets talked about; the carriers are so focused on download speeds because it’s a consumer-driven deployment, not a business-to-business-driven deployment. So we’re definitely going to see that. And I think having bonded cellular as how you’re using the 5G network is even more critical, because you’re going to get better latency, and you’re going to get a little bit better upload speeds.

Janel Fleming (35:54):
And it’s going to allow you to do more. Even with LTE and 4G, you saw people start using their cellular bonding units more frequently, especially when HEVC rolled out; it’s about 60% of the bandwidth needed for the same video quality. So, now you’re going to see customers using us with HDR, with 4K, higher-quality stuff. And they’re going to be riding more on this connection, right? Because really, right now you’re looking at a video transmission, but you’re also looking at a data connection alongside it. So you’re going to see more tools being utilized remotely over IP and the public internet. So I think we’re going to have less concern over the bandwidth conditions of a given location and think more about what software we can now deploy, and what kind of tools we can start building that leverage and take advantage of this new connectivity that we have in the air.
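As a quick sketch of the arithmetic behind the roughly-60%-of-the-bandwidth point, with assumed H.264 reference bit rates (real encoder settings vary widely):

```python
# Quick arithmetic behind "HEVC is ~60% of the bandwidth for the same quality."
# The H.264 reference bit rates below are assumed for illustration only.

HEVC_FACTOR = 0.60          # per the discussion: ~60% of the H.264 bandwidth
MODEMS = 8                  # typical bonded unit described earlier

h264_reference_mbps = {
    "1080p60 contribution": 20.0,
    "1080p60 HDR":          25.0,
    "2160p60 (4K) HDR":     60.0,
}

for fmt, h264 in h264_reference_mbps.items():
    hevc = h264 * HEVC_FACTOR
    per_modem = hevc / MODEMS
    print(f"{fmt:22s} H.264 ~{h264:5.1f} Mbps -> HEVC ~{hevc:5.1f} Mbps "
          f"(~{per_modem:4.1f} Mbps per modem across {MODEMS} modems)")
```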

Ken Kerschbaumer (36:46):
Excellent. Excellent. Jim, what’s your take on 5G?

Jim Jachetta (36:49):

New PRO3 5G and AIR 5G

Yeah. To Janel’s point, what’s out there today is sub-6 gigahertz. So, everything Peter and Janel have said: they’re concentrating on the download, and we all need the upload for production. We’re working right now with several of the carriers; we’re under NDA doing some tests with their millimeter wave 5G connectivity. And I’m sure Janel gets asked this every day: “Well, why do I need bonded cellular? 5G’s going to solve all our problems.” Same thing Peter said: “I don’t need you. I don’t need you.” But 5G is either going to be more limited in range or just as congested. I live in Southern California. They are always adding lanes to the highway, and the second they open a lane up, it’s just as congested. So just because they’re opening up more lanes, more bandwidth on the cellular, it’s going to be gobbled up by consumers, and then we’re going to be left with the crumbs. But the carriers are talking about giving us broadcasters a service level agreement, or dedicated bandwidth, or a slice of the bandwidth.

Jim Jachetta (38:04):
And then to Peter’s point, the millimeter wave stuff, the ultra-short wavelengths: we designed and fabricated the early in-the-goal cameras for the NHL using 60-gigahertz technology. So, we know it very well here at VidOvation. And it’s extremely, extremely directional. Now, there are techniques in millimeter wave, phased arrays, where there’s beam steering, so the bandwidth can be targeted. You mentioned, Peter, WiFi targeting a couple of rows in the venue; there’s even tech where you can get it to the person with beam steering. So there’s a lot of cool stuff coming down the pike.

Jim Jachetta (38:50):
But today, one of the big differentiators with Haivision is that they make sure that when a new band comes out, when a carrier cuts a deal with the FCC or buys spectrum, we can see those bands. I don’t know about that specific NASCAR racing situation that you talked about, Peter. I’m not sure. But if there’s cellular available, even a whiff of it, the Haivision tech has a tendency to grab it. But we can’t make a connection out of thin air. We don’t have a magic wand. Janel, do you have a magic wand where you can make a cellular tower appear in the middle of a football stadium, or…

Janel Fleming (39:40):
I’d like to pretend I do.

Jim Jachetta (39:43):
Yeah. But I don’t think we’re out of a job quite yet, Janel and Peter. Knock on wood, I think we’re okay. Our customers are still going to need our help.

Ken Kerschbaumer (39:57):
Sure, sure. So the last question, looking to the future, hopefully we’re going to have more 4K, more HDR. Peter, what’s your sense as far as the capabilities and the available bandwidth and spectrum for a really massive 4K, HDR show? Are you feeling in a good space? Will it be a challenge?

Peter Larsson (40:16):
It’s certainly doable. As we said earlier, it’s going to cost a lot of money to do it. We have to go much higher in frequencies, and we have to put in more receive sites, so the operational expense is going to be considerably different. Either that, or we write AT&T a huge check to use their systems. So, it’s not going to get cheaper; it’s just going to cost more money. But the capabilities are out there. We just need the marketplace to push us in that direction and say, “This is what we need.”

Ken Kerschbaumer (40:49):
Right. Got you. Janel, you mentioned 4K [inaudible 00:40:51] earlier.

Janel Fleming (40:52):
Yeah. So our units have been able to support 4K for a number of years. And the LU800 specifically supports 10-bit HDR, which has become, I think, more of the demand recently than 4K, especially if you’re considering colorimetry and how you’re producing the event. If you’re producing in HDR, having a transmission solution that supports 10-bit HDR is critical. And we’ve already seen it out in the marketplace for a few months now, and I only imagine it increasing. I mean, we’ve got a really cool use case coming up for the Red Sox. With their sharing of a feed, they’re going to have a LiveU unit, an LU800, that’s going to bring in 4K for their own game production, in addition to what the local producer is producing. So, you can certainly see it in the video quality with HDR. 4K, I mean, it depends. I think we were all looking at the 8K camera at the Super Bowl, trying to figure out what the difference was. But I think HDR is going to be the real critical element.

Ken Kerschbaumer (41:56):
Excellent. And, Jim, take us home. Let’s [crosstalk 00:41:58]-

Jim Jachetta (41:58):

Haivision rack series of HEVC H.265 and H.264 field encoders

Yeah. So, our partner, Haivision, in their HE Series, has a 4K 4:2:2 10-bit HDR solution. We like to think, and our customers tell us, that our flavor of HEVC makes beautiful pictures. Some of our customers are seeing a 30% savings due to extra efficiencies in our HEVC codec. On a fiber connection, is a 30% data savings a big deal? Probably not; your bandwidth may not be metered. But in the cellular world, every kilobit, every megabit, every gigabit has a cost associated with it. So, if you can make beautiful pictures in HEVC 4K, but also be efficient at it, that can be important.

Ken Kerschbaumer (42:57):
Excellent. Great. Well, thank you so much for your time today. Really appreciate it. Good to see you all. I’m glad everyone’s doing well. So, hopefully we’re all vaccinated by the time we next see each other. So that’s the key.

Peter Larsson (43:06):
Exactly.

Janel Fleming (43:07):
Hopefully. Thanks, Ken.

Jim Jachetta (43:09):
Thanks, Ken.

Ken Kerschbaumer (43:10):
[crosstalk 00:43:10]. Bye.

Jim Jachetta (43:10):
Thanks for having us.

Janel Fleming (43:11):
Thanks, everyone.

Jim Jachetta (43:11):
Thanks, everybody.

Peter Larsson (43:11):
Thank you.

Janel Fleming (43:12):
Take care.

Jim Jachetta (43:12):
Bye-bye. Stay safe.