How 5G Technology Simplifies Live Remote Broadcast
Join AVIWEST as we discuss the importance of Remote Production.
Topics of this live webinar will include:
- The Challenges Facing Broadcasters
- The Benefits of At-Home Production
- How 5G Technology Simplifies Live Remote Broadcast
Neal Metersky (00:04):
Hi, I'm Neal Metersky, USA Managing Director for AVIWEST, and my counterpart here is Samuel Fleischhacker. He's in France this morning, I'm in Las Vegas, and whatever time zone you're in, either a good morning or a good afternoon. We hope this will be the first in a regular series of short, high-level presentations to our audience, our customers, and the marketplace about the different things we have going on. Things that are applicable to the bonded cellular arena and the REMI arena, whether it's a technology update or use cases, just to be more proactive and engaged with our audience and our customers. So thanks for joining me, Samuel.
Samuel Fleischhacker (00:57):
Thank you. And good morning, good afternoon to everyone.
Neal Metersky (01:00):
Yes. Great. And our goal again is to be brief and relatively high level, but to give some real information as well. So today we want to talk about remote production and 5G, and how the two are influencing each other. I'm going to start with a very brief little story. Years ago, when I was young, I was an EIC on a production truck, 24 years old, and we were an auxiliary truck for ABC Monday Night Football. This was back when Monday Night Football was it, the epitome of sports broadcasting. One of the engineers took me under his wing; I guess he liked me. He was talking to me and he said, "Hey kid, do you know what a remote is? Do you know what the definition of a remote is?"
And I took it seriously; I didn't want to seem like a stupid kid. So I said, "Well, yeah. It's when you generate a show from a remote facility, not within studio confines." He said, "Yeah. But the real definition of a remote is that there's a remote chance it might all work." Over the years, I'm sure all of you who work in remote production have found that to be true to one extent or another. At that time we were still using tube cameras, so the chance of everything working was much [inaudible 00:02:25]. But one of the points I want to make, besides just nostalgia, is that remote production, especially the types of remote production that we are doing today and looking to do more of and better, requires robust connectivity, reliability, and performance.
And we're at the point where those can come together with the technology that we use, where REMI production at home is a reality. Some people on the lower end are literally doing it at home. But for the most part, at-home REMI in our world means a centralized control room. All sources, or as many sources as possible, are simply brought in as remotes, rather than doing a full production on-site. Everything comes in as remotes and it's produced in a central location.
And what are the benefits? Why? For almost the entire time that I've been associated with the broadcast industry, there has been a constant push, harder at times than others: how can we save money? How can we do more with less and do it just as well? So some of the benefits are fewer people on-site, lower travel costs, smaller crews, and shorter setup time. As we will show you, some of the applications are very easy and quick to deploy; things can be set up quickly and you can be up and going in a much, much shorter time. And of course, both of those, combined with other factors, save money. We've been bottom-line driven for a long time and we'll always be bottom-line driven. The more efficiently we can produce content, the better off we will be, and hopefully the more things we will be able to do. Next slide.
So, two basic architectures. These are very basic, very high level, and there are multiple ways of deploying REMI solutions. One is full REMI, where you're producing, switching, and editing at the studio, and everything comes back as a remote. Obviously, to do this, all sources need to be synchronized, with lip-sync, genlock, and timing, which we handle at the encoder end. Everything comes in fully synchronized, easy to use, ready to go, so your sources come in and you can switch your show remotely. A hybrid approach splits it: deploying a smaller truck or trailer, or even a full production truck, bringing all of the sources back to the truck as you would traditionally, and then backhauling it using 5G, with remote-control pipes, tying the two together over a 5G pipeline. Next slide.
And one of the things that I find exciting, having been in sports, news, and other TV businesses: it used to be a totally different animal to deploy six news cameras to a breaking story, bring those back, and find microwave paths and other ways. It was shot and switched as always, and the newscast would switch it in the control room. That really is the most basic, traditional type of REMI. But now, for a sporting event, or multiple events at multiple venues, it's the same thing. You can simply deploy camera crews with our encoders to a single venue or to multiple venues. And again, as long as everything is locked, synchronized, lip-synced, and genlocked, which we do at the field encoders, everything comes in ready to switch. So, different applications. And Samuel, I'd love for you to go a little bit deeper. We show the 5G cloud in the middle here; share a little bit.
Samuel Fleischhacker (07:04):
Yeah. In fact, in all you introduced, Neal, there is one important point. The beauty of these solutions is that they rely on what we call unmanaged IP networks, and especially on cellular networks, so on already existing infrastructure. Today, we already provide solutions for 4G networks, but 5G, and I will elaborate a little more later, opens many more doors thanks to a lot of benefits that I'm going to describe. Let's say that thanks to 5G networks, with our solution you can first deliver the video stream from the field, from the venue, to the studio. You can also remotely control your cameras from the studio, as you described. And on top of that, you can have intercommunication, which is not described here: you can give command orders from the studio to the field and inversely.
And with our solution, you can also transmit a video return from the studio to the field, for instance for teleprompting, or for delivering the on-air signal for confidence monitoring. So everything is possible thanks to 5G and our solutions. It relies, of course, on the cellular networks, but not only, because we need to add our own [inaudible 00:08:46] technology on top of it, named SST, which stands for Safe Streams Transport technology. It offers many features, such as what we call network aggregation, or network bonding. It means that all the data, video, audio, video return, and remote control, is not transmitted over just one cellular network, but over multiple cellular networks at the same time. Our system measures and monitors in real time the quality of the different cellular networks, and optimizes and spreads the transmission over all of them.
If one is going down, another one is used, and inversely. So it guarantees a high level of quality of service for all types of streams. We manage, of course, packet retransmission and error correction, we balance the bit rate as I described, and we also manage priorities between the different networks. For instance, in France, as in any country, there are multiple operators, and some TV stations have special agreements with specific carriers. They will say that they want to use this carrier first, as a priority, because they don't pay as much for the data, and the other ones are used as backup. Our system is able to manage this directly. Now let's look a little more in depth at our solution. So I give you the floor, Neal, to describe our transmitter solutions.
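The priority-and-bonding behavior Samuel describes can be sketched roughly like this. This is illustrative Python only, not AVIWEST's actual SST code; the class, field names, and numbers are all hypothetical:

```python
# Hypothetical sketch of priority-aware cellular bonding: spread a target
# bitrate over the healthy links of the preferred carrier tier, falling
# back to backup carriers when the tier is down or saturated.
from dataclasses import dataclass

@dataclass
class CellularLink:
    carrier: str
    priority: int        # 0 = preferred carrier, 1 = backup
    est_kbps: float      # bandwidth estimate from real-time monitoring
    alive: bool = True

def split_bitrate(links, total_kbps):
    """Return {carrier: kbps} shares, proportional to measured capacity."""
    healthy = [l for l in links if l.alive and l.est_kbps > 0]
    if not healthy:
        return {}
    best = min(l.priority for l in healthy)
    tier = [l for l in healthy if l.priority == best]
    cap = sum(l.est_kbps for l in tier)
    if cap < total_kbps:               # preferred tier saturated: use backups too
        tier = healthy
        cap = sum(l.est_kbps for l in tier)
    share = min(total_kbps, cap)
    return {l.carrier: round(share * l.est_kbps / cap, 1) for l in tier}
```

For example, with two preferred carriers and one backup, the backup only carries traffic once a preferred link fails or the preferred tier runs out of capacity, matching the "used as backup" behavior described above.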
Neal Metersky (10:27):
Yes. And the robustness of SST and the transmitters is, as I said, one of the key things that facilitates REMI. You couldn't really do a high-quality REMI production if you can't rely on the link. What we're showing here are our PRO3 and our AIR, both of which are fully capable of 3G, LTE, and 5G. The 360 that we're showing has six modems, and the AIR, which is the AIR 320 and 220, is a two-modem device that we can extend up to, I believe it's 11 or 12 total external modems with antenna pods, so they're expandable as well. One of the key things, though, that we deliver with the six 5G modems, and a differentiator, is that all six of the modems are 5G.
So all six of those modems can connect to either a 5G signal, an LTE signal, or, if you have to, a 3G band, and we can select which bands to connect to, so there's a lot of flexibility and a lot of power. On the form factor, AVIWEST has always been pretty much the first, or one of the first: everything with the initial design was camera-back, for a production and potentially news environment. A lot of people in some markets, such as news, especially like to use backpacks, so we can be configured that way. It's a very robust unit. So it's not just the connectivity where we have the robustness; it's in the antennas and the modems and the unit design. It's a sturdy box.
It can be deployed on the camera back or in a backpack. The AIR can be deployed in a pouch; it's got a 1/4-20 connector, so it can go on tripods, many different applications. And coming soon will be our PRO4 series, which will have all of the same capabilities, plus the additional capability of either a UHD signal or four HD signals. On the 5G rollout in the US, a lot of people will say, "Oh, it's just marketing." Samuel has a lot of experience worldwide with the rollout, which is a bit different than in the US. The marketing part is somewhat true, but there are real 5G deployments being rolled out in the US. They're called non-standalone, where 5G is used for the data path, while the phone call, the actual connection, is managed through the LTE infrastructure. All of the carriers have different flavors within the sub-6 band. There will also be what's called standalone within the sub-6.
And that is a pure 5G deployment, where the connectivity, the phone call, is established and the data connection and the data flow all go through 5G. But one other area that I'll let Samuel expand on a little bit, which is both a challenge and, in the long term, I think one of the real benefits to the REMI production world, is the millimeter wave band, which we show here at 24 to 30 GHz. That is where some of the US carriers, and I'm not sure about overseas, but there are some very interesting things going on at the Olympics, I know, Samuel, I'm not sure how much you'll be able to talk about it. That is a different type of deployment. The sub-6 is more similar to LTE, where it's everywhere. The millimeter wave will require different antennas and different modems, and you won't be able to share across the band. It's very high frequency with limited distance, but it's almost like hotspot connectivity with extreme bandwidth and extremely low latency.
Samuel Fleischhacker (14:42):
Yes. You summarized it very well, Neal. I just need to add that today our solutions are, of course, sub-6 GHz compliant, for non-standalone and standalone networks. So they are already fully compliant. And thanks to our solution, you can let the system manage the selection of the generation, 3G, 4G, or 5G, in automatic mode. Or you can tell it directly, "I want to be in 5G and only in 5G," of course in standalone mode only. So you can already configure that. For the millimeter waves, yes, our expectation is that millimeter wave will probably be the technology that will equip stadiums, Formula 1 arenas, and so on, with fixed installations to allow very high bit rates and ultra-low latency. And we're working on those topics for the next generation of solutions, especially for dedicated external cameras.
Now, if we look at the benefits of 5G, it is sometimes summarized as more bit rate, lower latency, and better quality. Of course, but what is behind that, and what is the interest for video activities and remote production? Better bit rate: typically it's 10 times better than 4G. Better bit rate means that you can transmit many more feeds, or you can transmit Ultra HD where you could not do it safely right now. Better density means that you can have many more 5G devices in the same area. We can have everything in 5G: our transmitters, but also the microphones, the headsets, everything, so it is possible with 5G. Better latency: of course, when you are running a real-time system, you want the lowest latency the system can offer.
On the network side it's one millisecond. That will never be the glass-to-glass latency, because we have to add on top of that the encoding time, the decoding time, and so on, but the latency of the network itself is significantly reduced. The next point is what we call Mobile Edge Computing. It is the capability to host, inside the core network of the carrier, the video processing that today you do on your premises, in your studios. So you move your studio inside the carrier's network to do the job. And thanks to that, to the radio part with its low latency plus mobile edge computing, you can have ultra-low reactivity and latency for your systems. The last point is better quality. With 5G you can define different levels of quality with the quality-of-service layers, and also what we call Network Slicing.
For network slicing, you have to imagine that it's like a highway. Typically, today with 4G, you have a highway that is used by everyone. With 5G, you will have dedicated lanes, which are named slices; you can rent a slice and be the only one to use it. It means that when you are performing live in the stadium, you will not share the same bandwidth with any consumer with a smartphone. You will have a dedicated road for your live, which will have no artifacts, no impairments, thanks to that. So these are the technical benefits of the 5G network. And at AVIWEST, we have made several deployments.
I'm going to leave France in two days for Tokyo, and I will deploy some 5G systems over there. But we have already done testing, proofs of concept, and deployments: with Orange at the French tennis tournament; with Telefónica in Spain for the basketball cup final; in China; in Italy, with Telecom Italia; with Bouygues Telecom in France for our remote control system; and with TDC in the Nordic countries in Europe. So the system is fully operational and works pretty well. Now let's look at the different types of streams involved in remote production, and Neal will explain this to you.
Neal Metersky (19:35):
Well, the pieces that need to come together are, first and foremost, the live transmission and the robust, low-latency connectivity of SST over 5G. Next, once we've established that connection, utilizing our data bridge we establish a bi-directional data path back to the studio layer, and we can [inaudible 00:20:07] the remote control, again in a robust environment. We've got to get that robust connectivity out of SST, and it gives us live transmission, camera remote control, and video return, as well as intercom applications. With the video return, all three pieces come together: we've got live, control, and video return, and we can provide communications, so interfaces for or to intercom. One of the things with return video is that we can create an infrastructure in the studio with up to 16 discrete return video sources.
So if you have more than one location, or more than one talent location in the stadium, or if you want to feed cameras different returns, you actually have the flexibility of feeding 16 sources across multiple StreamHubs, back out to multiple field encoders. So there's a lot of flexibility there. You can stay there, that's fine. All three of those are the elements that need to come together to successfully deploy a true REMI production. The setup is easy. Go ahead and bring everything in. The field side is very simple: an Ethernet connection comes in. We have two Ethernet ports out of the encoders; you can plug that directly into an IP-controlled camera. Set up a static IP address on…
I'll touch a little later on how it's really just an extension of your LAN. Video in via SDI or HDMI, and return video out by [inaudible 00:22:04]. Select the data bridge, turn on the data bridge, hit live, and away you go. It connects and you have full remote control. Very easy, simple, and straightforward.
As I've mentioned twice now, I believe, it could be three times, I talk a lot. But what is different about our data bridge, and the design approach that was taken, which is ideal for remote control, is that we don't peel off two of our cellular connections and use those as Wi-Fi over the public network. We are extending your studio LAN, whether it's the production LAN or whichever LAN, out to the devices in the field, through our SST. So it's just a subnet on your network. It's very easy to deploy, very easy to set up the subnets, connecting an agnostic IP system, and away you go.
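The "just a subnet on your network" idea can be illustrated with Python's standard ipaddress module. The address ranges below are invented examples, not AVIWEST defaults; the actual addressing plan is up to the broadcaster:

```python
# Example only: a field subnet carved out of a studio LAN.
import ipaddress

studio_lan = ipaddress.ip_network("10.20.0.0/16")     # existing production LAN
field_subnet = ipaddress.ip_network("10.20.50.0/24")  # extended to the field over SST

# The field units simply look like another subnet of the studio network,
# so any IP-agnostic control system can reach them.
assert field_subnet.subnet_of(studio_lan)

camera = ipaddress.ip_address("10.20.50.11")          # static IP set on a camera
assert camera in field_subnet
```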
One of the things that we also do, because again it's part of our overall pipe, is carve out the data bridge for remote control over our existing SST pipe. And during live, to both protect the live transmission and ensure that we have enough bandwidth available for remote control, we reduce the data bridge throughput to a level that is still more than enough to handle remote control and any IP audio devices that may be connected, whether for intercom or future IP devices. So more than enough data is maintained for all of those ancillary services, but we protect the bandwidth of the live transmission. And when you're not live, it goes back. In the overall solution here, one of our partners is CyanView.
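The "carve out during live" behavior can be sketched as a simple budget function. This is a hypothetical illustration of the idea, not AVIWEST's actual logic; the function name, cap, and floor values are made up:

```python
# Hypothetical sketch: while live, throttle the data bridge so remote
# control and IP audio keep working, but the live video transmission
# keeps the lion's share of the bonded pipe.
def data_bridge_budget(total_kbps, live, bridge_cap_kbps=2000, live_floor_kbps=3000):
    """Return (video_kbps, bridge_kbps) for the current link capacity."""
    if not live:
        return 0, total_kbps            # idle: the bridge may use the whole pipe
    # Cap the bridge, but never starve the live feed below its floor.
    bridge = min(bridge_cap_kbps, max(0, total_kbps - live_floor_kbps))
    return total_kbps - bridge, bridge
```

With a healthy 10 Mbps pipe the bridge gets its full (hypothetical) 2 Mbps cap; as the pipe shrinks, the bridge is squeezed first so the live transmission is protected, and off-air the restriction is lifted, mirroring the description above.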
And one of the challenges: although we extend that LAN, the systems that traditionally operate on it, camera control and things like that, are used to being within 100 meters or less over a wired connection. When we're extending it out beyond the LAN, there are some potential additional challenges in those protocols: the handshakes, timeouts, things like that. CyanView smooths all of that out; that's one of the vendors we work with. In a deployment like this, we use CyanView with BirdDog cameras, directly RJ45 in and out of those, fully IP. On the camera side, in this case, it's a non-IP camera, and CyanView provides a small, cigarette-pack-size converter that we plug into the Ethernet, and it brings out [inaudible 00:25:19] control. So we're now agnostic to any type of system, because we've created that extended [inaudible 00:25:31] connection, and it can be utilized by pretty much anything. Some will work better than others because of the inherent challenges of extending a local area network beyond 300 feet.
Go ahead, I'm sorry, you can change. And again, just a little bit deeper look. This is technically just tally, but it shows the pieces a little more clearly. It's fully tally capable: it can do old-fashioned contact closures, GPIO, the TSL protocol, and pretty much any existing tally-over-IP protocols that are out there as well. It can be facilitated both by feeding the tally into the camera for the IP cameras, and to the front light and a light on the real box as well. So tally is really easy to implement. And in summary, it can control multiple cameras, as many as you want, really. The CyanView system, I believe, is expandable to, well, I don't want to name a number, but extremely large camera control systems with multiple control points.
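As a concrete example of the tally protocols mentioned, a classic TSL UMD v3.1 display message is an 18-byte packet: an address byte, a control byte carrying the tally bits and brightness, and 16 characters of display text. A minimal sketch (a real deployment would use a vendor library, and the CyanView integration details are not shown here):

```python
# Minimal TSL UMD v3.1 packet builder: address byte (display address + 0x80),
# control byte (tally 1/2 in bits 0-1, brightness in bits 4-5), then 16 ASCII
# characters of under-monitor display text.
def tsl31_packet(address, text, tally1=False, tally2=False, brightness=3):
    assert 0 <= address <= 126
    control = (tally1 << 0) | (tally2 << 1) | ((brightness & 0x3) << 4)
    body = text.upper()[:16].ljust(16).encode("ascii")
    return bytes([address + 0x80, control]) + body
```

The resulting bytes would then be sent over serial or UDP to the tally display or camera; the genuinely hard part in a REMI deployment is the transport, which is exactly what the extended LAN described above provides.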
Multiple cameras, and it doesn't matter what camera; we can do IP cameras, and it's very simple and fast to set up and establish. Another company that we've integrated with specifically is XD Motion: very interesting robotic cameras, and both tethered and untethered drones. The next slide shows the type of implementation through their system. They have a small controller, and again, we're providing the LAN connection for their controller and their devices, or your controller and cameras ready to be controlled. In this instance, both are robotic cameras. And what I find really interesting, though we can't do this in the US right now, is the aerial drone, where the AIR is actually mounted on the drone and flown. They also have the tethered drone, where the unit is on the ground. Unfortunately, it's actually not the FAA but the FCC that prevents it; that would be an airborne cellular transmission, and we can't do that. But it's a very interesting application that [San Roland 00:28:11] team have done previously.
Samuel Fleischhacker (28:16):
Okay. Thank you, Neal. Another example. We wanted to show you the principle and the benefits of 5G remote production and so on, but we also wanted to give you some examples of concrete deployments, to make sure that it's fully understood, and to show you what we have already done in the field. So here is another example with another type of device, BirdDog equipment. It's an event that took place one month ago in Greece, for the cup final in the Olympic stadium. As you can see here, with the AVIWEST solution, the production team was controlling BirdDog cameras during the event through the 5G network. It was on the 24th of May, during the 79th edition of the final, between the Piraeus and Thessaloniki teams, which are very famous in Greece, the two main teams.
The event took place in the Athens Olympic stadium, but the production was [inaudible 00:29:29] in COSMOTE TV's studios, and the transmission was managed by the COSMOTE 5G network. So it looks like this, nothing new compared to what we explained. There were two PTZ cameras at the two opposite sides of the stadium. Of course, this system was complementary to the existing infrastructure, the traditional cameras and traditional systems, and they wanted to see whether remote production over 5G would work well.
At the beginning, it was just a trial, but it worked so well that during the live they decided to switch to the PTZ cameras at certain moments, because there were no artifacts, no problems; it was very smooth. The quality was very high: full HD video the whole time, without any problems. So they switched to it during the game. You can see here the positions of the PTZs, and on the bottom some captures from the event. So that was a first example. The next example took place in the US during the COVID time. And I'll leave the floor to my colleague, because he's probably more of an expert on golf than I am.
Neal Metersky (30:51):
A little bit, probably a little bit more, but you're probably more of an expert on soccer than I am. Yeah, this was actually a very interesting event. I believe, it's my understanding, that this was the first live sporting event after COVID hit. It was a little over a year ago, back in May, soon after everything shut down, and the PGA tour did the first post-COVID production. It was broadcast both over broadcast TV and over the top, digitally online. And, yeah, go ahead and bring it in and reveal the entire slide. It was very restricted, because it was the very beginning. It was a private course, and they limited it to 50 people on the course: between the officials, golfers, everybody.
That left 28 people for the TV crew to cover a golf event. It was a shootout match with four golfers, and they decided to follow the four golfers around the course with a number of cameras. So there was the production crew with two field reporters at the location, shooting everything back to the control room in St. Augustine, where they had play-by-play, a production team, and two analysts, and it was switched there. And then one wrinkle: one of the color commentators, Mike Tirico, was not able to travel. So via return and REMI, he actually provided the color commentary from his house in Michigan. Next.
[inaudible 00:32:47] Okay. So basically, in a nutshell, as I said, they followed them around the course. There were eight PRO3 [inaudible 00:32:56]: two cameras at the green, two on the fairway, two at the tee box, and then two others that were used for tracking. One thing that was interesting about the tracking shot we discovered: one of the features we have that is unique is the capability of accepting analog audio inputs, in addition to just embedded SDI. And evidently the tracking telemetry, I forget the name of the system, is just an analog audio line. So we were able to facilitate that using the encoders as well. There was a lot on one hole: there were the eight cameras, plus a drone using our 320s, and a couple of POVs.
Three POV cameras using our AIR 320, and two sound guys equipped, strictly the sound guys, with an AIR on their belt, following the talent. I believe they might have had parabolas as well, but the sound guys were facilitated by the analog audio input. And everything going back, all sources going back totally synchronized, genlocked, lip-synced, in time, back to the control room, easily facilitated a real live production with that many cameras off just one hole. They were successful and pretty happy with the deployment.
Samuel Fleischhacker (34:28):
Okay. Thank you, Neal. So here is the last slide before collecting your questions, if any, to summarize the architectures that we are promoting at AVIWEST to make your life easier when producing an event remotely, at the venue, but from home, in your studio: [inaudible 00:34:50] controlling any device, cameras, CCUs, switchers and so on, from the studio. Basically, delivering the video and audio streams from the field to the studio in a perfectly synchronized way, to make sure that your switching in the studio is clean, and offering additional services like video return, intercom, and so on. I don't know if you have something to add on that, Neal, or some comment?
Neal Metersky (35:21):
No, just to summarize a little bit on that: you mentioned the intercom, and one of the things that I expect may become more and more prevalent is utilizing IP audio devices. People are utilizing Unity for intercom, that's very basic, and other intercom manufacturers have apps. The connectivity for that is there, but who knows where IP audio capabilities can go and what people will want to be able to use, and that's all facilitated as well. And the improvement in the networks, especially the lower latency, is why I eventually see the carriers, with a millimeter-wave-type deployment, being able to replace fiber: replacing the fiber lines that are now at a facility, with glass-to-glass latencies low enough to do this for tier one events without a hitch.
Samuel Fleischhacker (36:33):
Excellent. So I don't know if our audience or attendees have some questions; I'll let you write them in the Q&A panel.
Neal Metersky (36:45):
Haven't seen any yet; everybody's been pretty quiet. Now's the time, we have a few minutes. I'm glad we kept it relatively short. There's a lot more detail we can go into in each of these areas, and we would welcome the opportunity to discuss that with you later if you don't want to ask questions now. As I said, Samuel and I, or Samuel may tell me to start using some of his counterparts, really look forward to doing this on a monthly basis, with updates and more case studies, to be available, to educate, and to convey where we are and where we feel the technology's going, and potentially have solutions that are of interest and can make a real impact on the market. So if there aren't… Oh, we've got one question here.
Samuel Fleischhacker (37:51):
Oh: can you talk about the differences between SST and SRT? So maybe I can elaborate on that.
So the main difference between SST and SRT is that SRT does not manage aggregation of networks. SRT is fine because it's an open standard, so it is compatible between equipment from different vendors. But it's only for, let's say, standard IP networks; it does not work on cellular networks at all, because it does not manage aggregation of different carriers, and it does not manage the specificities of cellular networks, with latencies that can vary from a few milliseconds to several hundred milliseconds. That is the big, main difference. There are also minor differences, but this is the major one.
Note that at AVIWEST we also support SRT, not directly on our transmitters, but the transmitter, which is close to the camera, is ultimately connected to a receiver, located in the cloud or inside the studio. And from this transceiver, you can deliver your stream as, let's say, an uncompressed signal on an SDI output, as well as on IP outputs, and among all the supported standards on these IP outputs there is SRT. We also support RTMP, HLS, transport stream, but SRT is also available. Which means that our ecosystem is ultimately also compliant with the SRT ecosystem.
Neal Metersky (39:37):
As well as inputs: you talked about SRT outputs, but we can also set up all of those IP sources as inputs. And I'm glad you used the term ecosystem, because that's what I like to use. The StreamHub transceiver capability really is an ecosystem; it's not just a receiver, so we can receive and deploy IP of multiple flavors. And I don't know if you want to talk about the coming-soon capability of SRT generation as well.
Samuel Fleischhacker (40:13):
Yes. We also plan, in our product, our transmitter, to have SRT output by the end of the year. So you will be able to select SRT output if you are not using the equipment on cellular networks, or SST output if you want all the services that I have described in the presentation. You will have the choice of both on our transmitters in the near future.
Neal Metersky (40:40):
Because the goal is, we see the need for interoperability in IP. Bonded cellular has traditionally been an area where there is no interoperability, so we're trying to be as interoperable as possible, and things like being able to deliver SRT directly from the encoder will open up additional paths. Are we good?
Samuel Fleischhacker (41:09):
Okay. [crosstalk 00:41:14].
Neal Metersky (41:15):
Go ahead. I’m sorry.
Samuel Fleischhacker (41:16):
No, I think there are no more questions.
Neal Metersky (41:19):
Okay. That's what I wanted to confirm. For some reason I cannot open the questions; I was able to earlier. Actually, there is one, another one just came in. [crosstalk 00:41:27].
Samuel Fleischhacker (41:27):
What is the max number of inputs and outputs for the StreamHub? You want me to answer this? Yes, that’s good.
Neal Metersky (41:38):
Yes. You’re the product manager, go for it. You’re the expert.
Samuel Fleischhacker (41:42):
So for the StreamHub, for people who don't know what it is: the StreamHub is our transceiver. It is the equipment located on the studio side or in the cloud. For the version running on appliances, on servers, we support up to 16 inputs, up to eight SDI outputs, and 64 IP outputs. For the version running in the cloud, there is no real limitation; it depends on the dimensioning of your system, because you can provision instances of different sizes. But basically, keep in mind that it's 16 inputs, so 16 PRO3s, AIRs, or other AVIWEST transmitters that you can connect to a StreamHub.
Neal Metersky (42:40):
Or four AVIWEST units and three SRT feeds and two RTMP feeds. And yes, we're now up to eight SDI outputs and, as you said, up to 64 IP outputs on our biggest devices, and those are in addition to the SDI outputs. So if you have an eight-SDI-output box, you can still do 16-plus IP outputs in addition.
Samuel Fleischhacker (43:10):
Yes. And again, the range of transport formats on the IP outputs is very broad on our solutions: it can be [inaudible 00:43:22], it can be RTMP, RTSP, transport stream, SRT. It's very broad.
Neal Metersky (43:30):
Facebook. We can feed directly to Facebook and CDNs and [crosstalk 00:43:33].
Samuel Fleischhacker (43:32):
Exactly. And for instance, one feed coming from one PRO3 or one SRT input can be split to multiple outputs, so it's a kind of splitter as well. It can behave as a splitter.
Neal Metersky (43:53):
Right. A better term that I would use, having a broadcast background, would be a router, because to a degree it is. To a degree it has a scaling routing capability of inputs to outputs, transcode capability between formats, up and down conversion, so it's really an IP acquisition ecosystem. Okay.
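The router/splitter behavior described here, one input fanned out to many outputs, can be sketched in a few lines. This is a toy illustration of the routing idea only; the StreamHub's real routing and transcoding engine is of course far richer, and the names below are invented:

```python
# Toy sketch of input-to-output routing: one source patched to any
# number of outputs, like a router crosspoint table.
from collections import defaultdict

class Router:
    def __init__(self):
        self.routes = defaultdict(list)   # source name -> list of outputs

    def patch(self, source, output):
        """Add a crosspoint from a source to an output."""
        self.routes[source].append(output)

    def fan_out(self, source, frame):
        """Deliver one frame from a source to every patched output."""
        return [(out, frame) for out in self.routes[source]]
```

Patching one transmitter feed to both an SDI output and an RTMP destination then delivers every frame to both, which is exactly the splitter behavior Samuel mentions.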
Well, we're just about at the 45-minute length, unless we have another question come in here real quick. If not, and if anybody has any additional questions, please feel free. I believe the next slide has our contact info. All right, Samuel, next slide?
Neal Metersky (44:42):
[inaudible 00:44:42]. You can reach us at either of those addresses; it will come to both me and Samuel following the webinar. Thank you very much for your interest, attendance, and questions. And thank you so much, Samuel; I probably couldn't have done it nearly as well without your expertise.
Neal Metersky (45:10):
And everybody have a great day.