
How to Simplify Your Live Production Workflow — On-Site, Remote, and in the Cloud [Webinar Recording]

Last updated Jun 29, 2023 | Published on Sep 16, 2021 | At-home Production - Live Remote Production, News, Podcast, Users Guides - Education, Webinars


Presented by Greg Macchia of Simplylive and Jim Jachetta of VidOvation

Learn how to produce more of the live content viewers demand without scaling up the cost and size of your live production workflow. Find out how cloud-enabled, software-based production architecture can support and simplify live productions of any size, and hear about how leading broadcasters are doing just that!

Discover how to:

  • Leverage production infrastructure — on-site, remote, or in the cloud — to realize unprecedented workflow flexibility
  • Simplify key production tasks so that a single operator can run a full production, on-site or from afar
  • Adapt your workflow to address health safety measures
  • Create a micro control room dedicated to special events or to mitigate pressure on larger control rooms
  • Use bonded cellular and wireless video transport to deliver feeds from even the most challenging locations
Register to watch Webinar Recording, Download Presentation & Read Transcript
 
 

Make Live TV the Way You Picture It.

When you create a live show, you pull together many sources, make choices, react to changes, and generate an experience in real-time.

The classical way to achieve this is to put bodies to boxes: one human for every hunk of hardware. That’s how it’s always been done — and it’s why live production has historically been so expensive.
Today’s producers know there’s a better way.

Imagine building your show the way you really see it. Just reach out and tap the shot you want to cut to. Cue and scrub slowmo replays with a fingertip. Run your pre-made segments, switch cameras and mix graphics, and step through your rundown — all with a simple, intuitive, easy-to-learn touchscreen interface.

Simplylive

This is what Simplylive enables you to do — with our ViBox platform.

ViBox clears away the minefield of physical buttons, sliders, and faders, so the only hardware you need to touch is your screen. You create your shows intuitively and responsively. We eliminate the tech hurdles — and much of the cost.

When the hardware gets out of your way, you can tell your story more easily. That’s what makes great television: your talent, unrestrained.

As live production moves into the all-digital universe, some tech providers make their tools so feature-rich that they overwhelm the machines they run on. That’s what you don’t want in live TV. And it’s not what we do.

The minds at Simplylive are not new to the live production industry. Our founders created products supporting the most demanding televised events, like the World Cup, Super Bowl, and Olympics. We understand television. And we know that simplicity and reliability go hand in hand in telling a great story.

Our job is to give you the capabilities you need, with technology that is there when you need it. That’s why major players like ESPN, IMG, and Canal+ are using our systems right now in the real world.

Transcript:

Jim Jachetta (00:00:23):

Awesome. Good morning, everyone. This is Jim Jachetta, CTO and co-founder of VidOvation. Today on our Wednesday webinar, we have a very special guest, Greg Macchia from Simplylive (Multi-camera Production, Reimagined). He's the VP of sales and operations for the Americas. VidOvation has recently signed as a partner with Simplylive, and we're very eager to integrate this technology with our live production capabilities, our at-home/REMI production capabilities, particularly for contribution over bonded cellular, the public internet, and other means. So Greg, it's great to have you. Let me give you control. Why don't you introduce yourself, Greg?

Greg Macchia (00:01:17):

Yeah, Jim, I appreciate it. Very kind introduction. Happy to be here, and happy to have VidOvation partnered with us. We can combine our expertise and show what we have to offer, and we're looking forward to building this relationship and doing many good things.

Jim Jachetta (00:01:39):

Awesome. Awesome. You should have control.

Greg Macchia (00:01:42):

Yeah. So we will go. Hopefully we have the presentation up and ready to go. I just want to talk a little bit about, for those who do not know, who Simplylive is, what we do, and what our advantage is. As many probably know, we look at what the industry is like right now for production. It's really about more and more content. You used to have your traditional linear channels; now, with OTT, second and third screens, and all of the different platforms and streaming services, it's about more and more content, but at the same time, the budgets get smaller. So there's always a challenge there.

Greg Macchia (00:02:33):

So when you look at, let's say, what I would call the traditional models of a production truck or control room, there's typically a lot of hardware and a lot of cost there, so it's not always the most efficient way to get that extra content. Simplylive looked to come up with a solution that delivers savings across the board. It starts with smaller infrastructure and costs. We do have a more traditional hardware model, but even that is all-in-one: a powerful system, but very small. And the people needed to operate the system, we've designed it to be very intuitive and very easy to use. The training is very short, and you have a lot of power right in front of you. Because we have video in our UI, and we'll get more into that, it allows fewer people to do a lot more.

Greg Macchia (00:03:28):

And as part of that, the system is easy to set up, locally or remotely, and again, the training time is short. A lot of times it's the director or the producer, the people that typically would have operators there with them doing the production, and we make our interfaces such that they can be the ones doing it, so it's very powerful in that way. But most important, and I think key for us, because there are a lot of good solutions out there, certainly at a very low cost, is the quality. Coming from the industry, we want that quality to be the best quality.

Greg Macchia (00:04:12):

So the users of our system have no problem putting it on their linear channel, because of the quality that our system brings; all the core pieces are there. And I think the last point, which is important, is the expertise of the team. I've been in the industry for a long time, over 20 years, and the founders of the company come from the industry. We know the pain points, and we're, again, trying to bring solutions. And we know the parts as you get into cloud and all of these different things: security, infrastructure, all of those pieces. We're very involved and going really deep into the technology part of it, because it's very important.

Jim Jachetta (00:04:55):

Right. One thing too, Greg: many of our customers, when it comes to replay, I know you come from EVS, a lot of your team are former EVS people, and not everyone can afford an EVS. And because it's a complicated system, you need a highly trained operator, and there's a shortage of operators. So like you said, you don't have to have a TD and an EVS operator. The producer can produce the show themselves. It's that simple, right?

Greg Macchia (00:05:27):

Absolutely. And in a lot of what we've been doing, that's been the scenario. There are TDs that operate this system, and there are certainly EVS operators that can use it. We have a remote controller for when we talk about replay, but at its core it is designed to work on a touchscreen. So you'll see in the video that we show later, I'm using the touchscreen. I have the video in front of me, it's all there, it's all intuitive, and it's touchscreen. So I call it the next generation.

Greg Macchia (00:05:59):

Today's generation is used to touchscreens, touching and swiping. So we've built that in, and for somebody that doesn't know controllers and isn't used to a big switcher panel, the screen is in front of them, they see it, they touch, and it allows them to very easily produce. And the way we've designed it, it's not meant to be like a switcher. It is one, of course, but a big switcher can be intimidating. For me, I would never have tried to switch a show, but with our system, it's certainly less intimidating and very easy to do. And actually, early on, I did do that, which I would never have thought about with even a small switcher; I wouldn't even go near it.

Jim Jachetta (00:06:41):

Right. Right. Well, then you don't need the big multi-viewer in the front of the room or in the truck, or maybe you don't have a truck, you don't have room for a multi-viewer, and you don't have the personnel or the budget for a traditional push-button production switcher. So it's kind of like a switcher and multi-viewer all in one, right? All with a touchscreen.

Greg Macchia (00:07:03):

Exactly. It's there in front of you, so you don't need that. But you mention multi-viewer, and we'll get into that: the multi-viewer is also a component of what we do, and it's actually web based. We'll get into more details, but because of this remote capability, you might have a replay producer or part of the production team that isn't next to the operators. We offer a web multi-viewer that they can access from a web browser to see what's going on. In a video later, you'll see that as the director is switching, what they're doing can be followed on that web multi-viewer. For the operator, it's right there in front of them, so they don't need it, but we do have that tool available for those who want a more traditional multi-viewer, but again, over the web, and really geared towards remote or cloud workflows.

Jim Jachetta (00:08:00):

Very good. Very good. We have a video. It's a few minutes long, but I think it's really good for giving you an overview of where Simplylive is coming from and their business model. Anyone that has any questions, you're welcome to pop a comment in the question box or the chat box. I also have an SMS text message number available; that only comes to me. If you have any questions now, or a suggestion for an upcoming webinar, or questions about VidOvation or Simplylive, you can text 949-755-8881. That's 949-755-8881. All right, let me switch over, and I will play the nice video. Here we go.

Speaker 1 (00:09:17):

We are Simplylive. We're the people who make live production more intuitive, more responsive, and less expensive. With our software, you can create live multi-camera telecasts the way you see it, by simply touching what you need. Functions that used to be locked in hardware are now at your fingertips. Tap that shot to take it. Cue and scrub your slow-mo replays. Set up and run your graphics and transitions. It's all there in an easy-to-learn touchscreen interface. It's called ViBox, and it lets you work the way you want. The ViBox platform is entirely software driven. It comes in different configurations for different tasks. The various application layers connect with a common software backend, so the same HUD screen can handle live switching, slow-mo replay for referee review, or master record, or all of them at once. When the hardware gets out of your way, you can create your show intuitively and dynamically. That's what makes great television: your talent, unrestrained.

Speaker 1 (00:10:37):

ViBox is powerful enough to tackle the toughest jobs in broadcasting without dropping the ball. The big players know that, and that's why they're using our systems right now. A modest event can be mixed and telecast by one person on one touchscreen with a fast internet connection, but for larger events, you can network multiple ViBox servers and scale up to dozens of slow-mo channels. But what we've achieved until now is only the foundation of where our vision will take us. The jobs of live TV are the same, but the people doing them are no longer [inaudible 00:11:17]. They won't need to sit together in expensive control centers loaded with proprietary equipment. In fact, they won't really need to be in the same town. Right now, ViBox software runs on commodity hardware that you can locate at the venue or the production center. But as television moves deeper into the IP universe, the physical location of the talent and the technology fades in relevance. In the future, your talent could be anywhere you want them, and the technology in the cloud.

Speaker 1 (00:11:56):

ViBox is the natural outcome of the move to IP-based television. Flexible, lightweight, less expensive production can deliver more diverse, high-quality content at a lower cost. Today's sports viewer might be watching on a big-screen TV in the living room, or on their phone in a snack bar. However they receive it, they've come to expect more choice. The audience isn't getting bigger; the demand for more content is. Whoever delivers what the viewers want wins. That's why what we're doing is so important. The minds at Simplylive believe in offering solutions that are fully future-imagined. Ready now, ready for what's next.

Jim Jachetta (00:12:44):

So I guess, Greg, that says everything. We can wrap it up.

Greg Macchia (00:13:03):

Exactly. It makes my job easy, right? That covers it.

Jim Jachetta (00:13:08):

I should also say that GoToWebinar uses an extremely low bit rate and very low frame rates. So any kind of stuttering you saw in that video is the GoToMeeting tool, not anything that Greg or I are doing. Just want to throw that out there. You should have control back, Greg.

Greg Macchia (00:13:31):

Okay. So yeah, just a little bit more. The video hopefully gives a good basis, and it is on our website, so certainly feel free to watch it. Our approach, we've touched on it a bit, but I would call it disruptive, game-changing. We want to bring a change, a new way of doing things, and that's really been the way that we've approached it. What's very important is that it's software, a software architecture. So when we talk about these remote workflows and we talk about the cloud, for us it's not a dream; it's been designed that way. Today we have installations, and we have real tests that we're doing in the cloud, on Amazon and Google. We're able to do that because it's software.

Greg Macchia (00:14:22):

You have to architect the instances correctly, but certainly it's very easy to do. As part of that, we're able to work with all of the formats: SDI and NDI of course, but then for streaming and remote workflows, SRT, RTP, and UDP. All of these are built in as I/Os for our system. Of course, as I said, we're migrating to the cloud, and again, the interface is designed to be easy to use and collaborative, in the sense that, as you see the different user interfaces that we offer, it can be a single user doing the full production, or it can span to multiple users working together, your TD, director, and replay operators, or review. And we'll get into some of those details.
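As a rough illustration of what one of those streaming inputs can look like in practice, here is a generic sketch of receiving an SRT contribution feed and recording it locally. It uses ffmpeg driven from Python; the listener port, latency value, and output file are assumptions for illustration, not Simplylive's actual ingest API.

```python
# Generic SRT ingest sketch (not ViBox code): listen for an incoming
# bonded-cellular/SRT contribution feed and write it to a local TS file.
import subprocess

SRT_URL = "srt://0.0.0.0:9000?mode=listener&latency=200000"  # 200 ms latency buffer

subprocess.run([
    "ffmpeg",
    "-i", SRT_URL,      # wait for the remote encoder to connect as caller
    "-c", "copy",       # pass the compressed stream through untouched
    "-f", "mpegts",
    "capture.ts",       # local record; a production backend would ingest this
], check=True)
```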

Greg Macchia (00:15:22):

So now, getting a little bit into the architecture, how the system fits together. First, we start with the back end. That's your server engine, your I/O, which manages the core of the system. As for where that runs, it can run on more traditional hardware; we can offer it as a hardware option that scales as needed based on channels and users, and I'll get into a little more detail. But more importantly, we can run it on an instance in a private data center. We have partners like Tata that have their own data center infrastructure, and we can simply run our software in that scenario. And, as I already mentioned, in the cloud. Because it's software based, it can be available in all of those different places.

Greg Macchia (00:16:27):

So that's the core, that's your engine. Then on the front, for the user, we have the different software user interfaces that control that server. We started with the ViBox all-in-one, where a single operator can do all of the live switching, the replays, the keying of graphics, the mixing, and the audio integration. All of that can be done by a single user. Then we have the replay, the ViBox replay, which can be a standalone just for replay in a large replay infrastructure, or it can be secondary users in that all-in-one, where more than one user breaks up the workflow. Then we have a user interface specifically called RefBox, for referee or official review or coaching. Again, the UI is designed specifically for that application: pinch and zoom.

Greg Macchia (00:17:23):

So it's designed again around the analysis. Then we have the BMR, which is ISO recording: taking your ISOs and being able to stream them off to storage, a NAS or removable storage, for post-production. And for the I/O, again, we talked about being flexible there: if you've got hardware, doing 3G or 12G for your SDI I/O, able to do NDI, or the streaming options. For example, if we talk about your bonded cellular solutions that you might have on site, you send those streams back, and that could land on hardware, in the cloud, or in your data center.

Greg Macchia (00:18:05):

And then, we stick to the core of what we do and give easy integration to the other pieces. For graphics, there are a lot of great graphics solutions out there, so we allow you to integrate your graphics engine over NDI. We're not tying up those precious I/O channels; we have specific connectivity over NDI to bring a graphics engine in. And for audio, same thing. We have embedded audio in and out of the system, but we can integrate over standard Dante or AES67 with an external mixer to do full audio mixing and get the best of the full setup.

Greg Macchia (00:18:50):

So now I'm going a little bit into the hardware, again just showing the scalability of what we can offer. We started with, call it affordability, an entry level as far as cost. Just keep in mind that the software that runs on this platform, which we started with the Mini, is the same software. This is designed to be portable, designed for a single user, and priced at an entry level for an all-in-one. High schools or universities, because of budget constraints, might use this hardware platform, but again, the software is the same; we just limit it by channels. It starts at the Micro, which would be one in, one out, all the way up to eight channels. So even in the Mini, you can do six in, two out, replay, a single user, and that would be comparable to the full system capabilities. It's just about the piece of hardware that it's running on in this instance.

Greg Macchia (00:19:53):

And then we scale up to the next level, where we have a 4U server, and here we get into multiple users, networking, RAID storage, and redundant power supplies. The cost goes up a bit, but it's still very affordable considering what we're offering, with the ability to do up to 1080p, multi-channel, all the way up to 16 channels in this configuration. And then we have UHD, of course; there, we can go up to 12 channels in UHD. So we have two platforms there: it could be a standalone single server that's up to eight channels, or we can scale all the way up to 12 channels in a 2U chassis. That's showing it in a more traditional, hardware sense, but keep in mind, if you look at this 4U server, that's the server, that's the engine, that's your interface for your I/O channels, and that's where you can connect your graphics in. We're not talking about a lot of kit, a lot of hardware, to do the things that we're talking about.

Greg Macchia (00:21:10):

So now we're going to jump into the applications and what we do on the user side. We talked about the all-in-one; this is an example of the user interface. Just after this, we'll have a video where I'll go into a little more detail, to show the operational side of it. But here, this is the all-in-one. This would be the screen where a single operator is able to do the live cutting, the replays, and the keying of graphics. Maybe it looks a little busy, but it shows the design: we've got colors and a very intuitive interface to operate, if you want to do all of the control.

Greg Macchia (00:21:59):

And then, just to show again that we change the UI based on the role, here we call it ViBox Live. Again, it would be built around a production, but here we simplify it for cases where replays aren't necessary, where you just want to focus on your live sources, cut live sources, and maybe play in pre-produced content for a studio show or something corporate. Or this could be the UI that I use as the TD/director when I'm working with a second or third operator doing the replay.

Greg Macchia (00:22:37):

And this just shows a little more detail on the different areas, but I won't get into too much here, because we actually have the video. We've prerecorded it so it's a bit easier for the webinar, but I'll walk through the operation of the all-in-one, this particular interface, so you get the idea of how our system operates. I'm the operator, and I'm going to be doing the live production; it allows me to do all of it. I'll show the live cutting, the replays, and how I can key on the graphics, and give everybody a good idea of how that works.

Jim Jachetta (00:23:25):

Okay. I should have a video. Yeah. So this is prerecorded; Greg will talk over the video. And again, apologies if it's a little choppy. We're at a single-digit frame rate here with GoToWebinar.

Greg Macchia (00:23:38):

So here you'll see the program. I've got my program and preview, and then I've got my live and replay multi-viewers. I use the live one for switching live sources, and then I have the same inputs for replay. I have my asset management up in the top left; clips and graphics that I've pre-imported, that's where I go for those. Then I have a playlist area. As I'm going through the live production and I have clips, if I want to build a rollout or a highlight package, I'm able to do that. I have some audio controls; if I'm connected to a smaller audio mixer, I can trigger snapshots or control what audio I'm listening to. And then I have the graphics area. There, I have three keyable layers of graphics. Those could be pre-produced graphics that I've imported, or, more importantly, I could have up to two NDI engines there.

Greg Macchia (00:24:32):

So now you'll see we've switched over, and I'm going to show you how I operate. You should see that this is a four-camera setup, and there, maybe it's hard to see, but under the red, I just tap the video that I see, and I'm cutting it live to program there in the middle left. So you'll see, I tap and I cut the camera angle. Under the green, I can tap and it puts the source in preview, and then I have a manual transition, so I can do a dissolve, for example, or different kinds of wipes that I bring in. Otherwise, I tap on the red and I can switch the camera.
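To make the red/green convention Greg describes concrete, here is a minimal sketch of a program/preview bus model. It is a generic illustration, not ViBox code; the class, method names, and camera labels are invented for the example.

```python
# Minimal program/preview ("red"/"green") switcher model, for illustration only.
class Switcher:
    def __init__(self, sources):
        self.sources = list(sources)
        self.program = self.sources[0]   # "red" bus: what is on air
        self.preview = self.sources[1]   # "green" bus: what is cued next

    def cut(self, source):
        """Tap under the red: take the source straight to program."""
        self.program = source

    def cue(self, source):
        """Tap under the green: put the source in preview."""
        self.preview = source

    def take(self, transition="dissolve"):
        """Manual transition: swap preview into program (dissolve, wipe, ...)."""
        self.program, self.preview = self.preview, self.program
        return transition

sw = Switcher(["CAM 1", "CAM 2", "CAM 3", "CAM 4"])
sw.cut("CAM 3")        # hard cut to camera 3
sw.cue("CAM 2")        # cue camera 2 in preview
sw.take("dissolve")    # dissolve camera 2 to program
print(sw.program, "/", sw.preview)
```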

Greg Macchia (00:25:06):

Same idea for graphics. You see there on the graphics, if I tap under the red, it keys that graphic on and off; I can tap on the green, it comes to me in preview, and then I can effect over with that graphic. So that's normal operation there: I'm keying it on and off with the same idea. Red always represents program, and green represents preview, and that's what I'm doing there on the screen. Here, in the top right, you'll see the external graphics engine. This is showing how we integrate over NDI. This could be a separate graphics operator on a [Viz Chyron 00:25:46] AJT system. They are controlling that separate graphics system, and because I have it keyed on, you see that the multi-layer NDI is available to me there.

Greg Macchia (00:25:57):

Top left, I'm showing how I have access to the assets. This could be a pre-produced promo that I've imported into the system. Obviously, we can manage clips on import and output. This is just showing how easily I select: I tap on green and it brings the asset to preview, and then I can decide when to bring it on air. And then, again, I can tap my live cameras and I'm back to the live production.

Greg Macchia (00:26:26):

Now you see that bottom row there? Same idea for cameras, but now you see in the bottom right, I have touch control, and you'll see that I'm controlling those sources. Those sources are being loop-recorded, and when I'm ready, I can tap a replay. You see it automatically triggered a replay wipe, and now I can control the speed. Maybe it's a bit hard to see there. Now I effect over, and I went to a second angle. Again, I tap under the red, I tap another one under the red, and you'll see that I quickly triggered three replays. Now I'll tap live, I get my live replay wipe, and now I'm back to live. So you can see how quick and easy that…

Greg Macchia (00:27:03):

And now I'm back to live. So you can see how quick and easy that is. Here, I'm also showing that when I have replays, I can cue at different points. Let's say it was a big shot. You'll see in preview, I actually put up a replay angle of the reaction. I trigger the first replay. While that replay is going, I can still control my sources, so I can go to a different timecode for the replay. Then from preview, I tap that reaction, and I come back to live. So that shows the speed and how I can do all of those actions quickly within the UI. Here, I'm just showing something else you might need in a live production: picture-in-picture. We can create these PiPs, give you a toolbox there, and let you switch in those boxes.

Greg Macchia (00:27:54):

Now, looking there, you see again in the library area, those are clips. As I'm going along, I can be creating highlights of the different actions that happened. Here, I'm showing that I quickly selected multiple clips. Maybe I want to build a highlight package. I select them, I enter them into a playlist, and I can build that playlist. Now I have it ready to go. These are maybe six or seven clips between breaks. I'm able to edit and move them around. And now, just showing, I've built the playlist and I'm rolling through it. It's hard to see here, but within my UI, I would see, okay, this is clip three or four. I've got a countdown, so I can really watch what I'm doing, and when I'm ready, before I get to the end of the playlist, I simply go back to live, and I've gone through and rolled the playlist. So I think that's all of those features.
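For readers who think in data structures, here is a generic sketch of the clip-and-playlist idea Greg walks through, including the countdown an operator watches before cutting back to live. The field names, clips, and durations are made up for illustration; this is not the ViBox data model.

```python
# Illustrative clip/playlist model with a "time remaining" countdown.
from dataclasses import dataclass, field

@dataclass
class Clip:
    name: str
    source: str      # which ISO/camera the clip was marked from
    tc_in: float     # in-point, seconds on the record train
    tc_out: float    # out-point

    @property
    def duration(self) -> float:
        return self.tc_out - self.tc_in

@dataclass
class Playlist:
    clips: list = field(default_factory=list)

    def add(self, clip: Clip) -> None:
        self.clips.append(clip)

    def remaining(self, current_index: int, elapsed_in_clip: float) -> float:
        """Seconds left in the package from the current playback position."""
        rest = self.clips[current_index].duration - elapsed_in_clip
        rest += sum(c.duration for c in self.clips[current_index + 1:])
        return rest

highlights = Playlist()
highlights.add(Clip("Goal 1", "CAM 2", 125.0, 133.5))
highlights.add(Clip("Save",   "CAM 4", 410.2, 416.0))
print(round(highlights.remaining(0, 3.0), 1), "seconds left in the package")
```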

Greg Macchia (00:28:52):

So I know that was quick, but hopefully it at least gives you a sense of how I'm able to go through all of the basic requirements of a live production: cutting the cameras, doing different effects between them, doing the replays, keying the graphics, and building highlight packages. These are all the main things that you need to do. Hopefully, that was…

Jim Jachetta (00:29:25):

Yeah. You should have control, Greg, I think.

Greg Macchia (00:29:27):

All right. Let me back up. That was the all-in-one again. Now, moving through the other users: this is for replays. Showing here the UI, it's a bit different. I've got my sources up top, I have a multi-viewer area, and then I've got my program outputs. I still have access to clips and playlists, but it's designed differently; it's designed around the operation that would be expected of a replay operator.

Greg Macchia (00:30:01):

Here, this shows a little more detail of what we have. You can see on the top, those are my replay sources. Those would be the ISOs that are being recorded. And similar to what you saw in the all-in-one, I'm able to control these all in sync, which I think is a great feature of how our replay works. I obviously have control; I have outputs one and two if I'm doing a traditional replay playback to, let's say, a switcher, but I'm not only controlling those outputs. I have access to all of my input sources to scroll back, review, and then decide what I cue on my outputs to play back.

Greg Macchia (00:30:45):

The other thing I have is layouts. So when we get into a more complicated replay setup, maybe I might have 24 or 36 sources across multiple servers. The way that we design our system is you can build your layouts. So I can have layouts of all different cameras. For golf, it might be layouts based on holes. Or for sports, it might be ISOs or certain areas of the field. So I can build multiple layouts in my UI and as an operator select which camera angles I’m looking at to have available to me to cue up and obviously do replays.

Greg Macchia (00:31:22):

So again, intuitive layout. Replay sources, I select, tap under red and green to decide which area I’m going to cue it to. I have playback capabilities here, marking of clips, my playlist areas and clips. So again, you start to get a sense of our design and how it looks and how it works.

Greg Macchia (00:31:52):

Then next, we have the ref review that we talked about. Here again, you see a slightly different UI, and this is for review. It's, again, meant to be very simple. This could be the official themselves controlling it to do the review. Here, we can go into how that layout works. You'll see eight angles; actually, for review, we can go up to 16 sources. So if you had 16 sources coming in for a football game or basketball, I have the ability in the UI to see those sources and, again, control them in sync. In any of these, I can determine, "Okay, these four angles," select those, and go to a different view. I'm able to [inaudible 00:32:42] zoom within any of these, because again, this is for analysis. When did the ball leave the hand? Et cetera.

Greg Macchia (00:32:50):

Again, very intuitive controls as far as jogging back and forth with a wheel or in the blue area. As I showed in the all-in-one, you're able to play back at preset speeds. Again, source clips and a clip-and-cue library to access clips that you've made along the way. Especially for review, the picture-in-picture here we would use for a clock, for example. In basketball, when the shot clock goes to zero, you can select one or multiple angles along with the shot clock in the view to decide, "Hey, did that shot get off in time?" So again, features specific to the application, and again, tied to the same backend I/O server.

Greg Macchia (00:33:40):

Here, we talked about the BMR. Again, the UI for this is pretty simple and straightforward: it shows you the sources coming into the backend engine recording, and then you decide, setting up destinations, if you want to stream off to a NAS, for example, and have that content accessible for post-production. Essentially, our server acts as a buffer. Because we're doing loop recording, I can take that hi-res content, which could be DNx 145, or H.264 that we transcode on the way out, and move it off so you have access to full ISO recordings.
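As a generic illustration of the "transcode on the way out to a NAS" idea, here is a sketch of pulling a clip out of a hi-res ISO recording and writing an H.264 file to network storage using ffmpeg. The paths, in/out points, and encode settings are assumptions for the example, not Simplylive defaults.

```python
# Illustrative export: trim a clip from a hi-res ISO and transcode to H.264 on a NAS.
import subprocess

src = "/media/record/iso_cam01.mxf"              # hypothetical hi-res ISO recording
dst = "/mnt/nas/exports/iso_cam01_clip_h264.mp4" # hypothetical NAS destination

subprocess.run([
    "ffmpeg",
    "-ss", "00:12:05", "-t", "15",               # clip in-point and duration
    "-i", src,
    "-c:v", "libx264", "-preset", "fast", "-b:v", "20M",
    "-c:a", "aac", "-b:a", "256k",
    dst,
], check=True)
```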

Greg Macchia (00:34:20):

When we talk about replay, or the workflow where you're clipping, we have export. As a replay operator, I'm marking a bunch of clips, I'm doing the highlights, and of course that becomes sort of your melt reel of the action of the event. We use export; it's just another application that connects to the system, and you can set up auto backup and archiving so that the operator can choose the shots they want. They're connected to some removable storage or a USB drive connected to the server; they select the shots they want, archive those, and walk away, or build week-after-week or game-after-game archives that I can always import back into my system for my production.

Greg Macchia (00:35:14):

Now, looking in a little more detail at how the pieces fit together: this would be a more traditional on-site LAN architecture. Again, this shows the pieces. On the left, you have your hardware, specific to your requirement. That's the backend software running, managing your I/O. Then we connect over the LAN, in this case for the UI. So if this is on a campus, the server could potentially be at the arena or the stadium, and the operator could be across the campus in a control room elsewhere.

Greg Macchia (00:35:57):

And again, these proxy sources are generated by the server and fed over the connection for the operator to do the job, as in the example I gave in the demo of how they're able to manipulate the production. In this local setup, our proxies use Motion JPEG, and it could be anywhere from 100 to 150 to 200, which in this setup is not a problem. And when we talk about that, this gives an example of the scalability of the system. This could be a Mini, a nice little small setup, on the left. I'm integrating over USB to a little audio mixer, I've got my little laptop for graphics, a single operator, and I've got my UI here where I can control faders for the mixer. I'm doing the production as an all-in-one here. It could be a two- or four-camera soccer match or whatever it might be. And on the right is the same idea built around replay. So again, very powerful. It could be six in, two out, and it could be all touchscreen-driven or a combination with a remote.

Greg Macchia (00:37:13):

So here, you'll see a more traditional remote. Again, for those types of operators that are used to that and want to use it, they have the ability to use both the touchscreen and the remote controller.

Jim Jachetta (00:37:26):

Very good.

Greg Macchia (00:37:30):

Here, we look at scaling that up. I already talked a bit about the different applications. Here, we go to the larger server; this would be the 16-channel server. It's a more technical drawing, but it shows a 12-camera production. I've got my program clean and dirty out. Here, I'm integrating with a larger audio mixer over Dante. And then you see I've got multiple users, again set up in the LAN architecture, but each user has their own mini PC that gives them access to their job. The first one would be that live UI, where I'm mainly focusing on the live switching, and then I have the replay UIs. So here I'm showing a 12-camera production, two replay operators, and of course the graphics as well, doing a larger-scale production with the multi-user collaborative workflow.

Greg Macchia (00:38:38):

Here, when we talk about multiple users, those multiple users could also be spread across those different applications. Here, I'm showing that you might have a traditional switcher: you're using us for replay into that traditional switcher, but you also have the official review aspect of it. So again, we're taking advantage of the core I/O engine, and I'm showing, for example, eight sources. Those eight sources are available to the replay operators doing the live production, but those same sources could also be available to a RefBox user doing official review, with separate control of the same sources. So you're not recording again for the separate application; you can work with multiple users on the same box in different applications.

Greg Macchia (00:39:39):

Now, we look at the network architecture. Again, scaling and growing. Especially when we talk about standalone replay, you need to scale to some of these larger productions, and we're able to network servers. It could be 10 gig, or we could put larger [inaudible 00:39:55] and cards in for a larger network, but over 10 gig we can manage multiple servers. So here we're expanding, with more inputs for replay, to do larger-scale replay.
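As a back-of-envelope sanity check on why a 10 GbE link covers a few dozen shared sources but not an unbounded number, here is a small capacity calculation. The per-stream bitrate and headroom factor are assumptions for illustration; the actual codecs and rates Simplylive uses between servers may differ.

```python
# Rough 10 GbE capacity check with assumed per-stream bitrate and headroom.
link_gbps = 10.0
usable_fraction = 0.7       # leave headroom for protocol overhead and bursts (assumed)
stream_mbps = 150.0         # assumed mezzanine-class stream bitrate per source

usable_mbps = link_gbps * 1000 * usable_fraction
max_streams = int(usable_mbps // stream_mbps)
print(max_streams, "concurrent ~150 Mb/s streams fit with headroom")  # -> 46
# A few dozen shared sources fit comfortably; much larger systems would need
# more links, faster links, or more servers, as Greg notes below.
```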

Greg Macchia (00:40:10):

I always like to give real examples. This is many servers. This is a real application that we did for the America's Cup, where we were providing all of the replay. We were recording 42 sources, and we had seven replay operators with access to those sources for doing replays and then playing back into the live productions.

Greg Macchia (00:40:35):

At the same time, we were streaming off to NAS storage because post-production was being done: the highlights were available to them, they were able to send clips back, and these operators had access to all of that content. So, as I briefly mentioned, the way our replay works is that a replay operator is tied to specific channels on a server, but beyond that, they have full access to all of the inputs across the servers. When I'm operator one, I have maybe my two outputs, but it doesn't matter to me where the sources are coming from. I create my layouts based on what I need to be doing, the replays I need to be doing, and everything that I create is available to everybody across the network. So it's very powerful.

Greg Macchia (00:41:25):

In this particular setup, we got up to 20,000 clips and almost 350 highlight packages, just to show the scale of what we're doing. So again, entry-level, simple replay, or scaling up to this level of replay.

Jim Jachetta (00:41:42):

Is there a limit to the scalability? I mean, you could just add more servers. If you have a fat enough network pipe between the servers, the sky's the limit.

Greg Macchia (00:41:56):

Hopefully, it's a high enough sky. Obviously, we wouldn't put 100 servers on a 10-gig network; if everybody's trying to go to the same content, there are some limitations there. But it's nothing that isn't obvious, and we would scale accordingly. You would add more servers if you need them, because you don't want too many users all going to the same server. So the point is, we can scale it accordingly. Again, we can make that network larger, and we've done everything in the design of our software to optimize and limit, let's say, the bandwidth and network issues that you could run into.

Jim Jachetta (00:42:41):

Very cool.

Greg Macchia (00:42:45):

Now, of course, talking about remote: I showed the initial setup, which was local. As the pandemic made remote productions even more critical, we added an extra little piece to optimize what the operator sees on their UI. When we talk about having an operator distant from the kit, you still have the hardware that you need; the idea is that it would be sitting on site at the venue, in the production truck. We add one extra 1U appliance. What it does is integrate with the server and create optimized H.264 proxies, so that when I connect over a WAN or VPN connection, and this could be me at home on my home internet, the video in my UI is optimized. I can really work over distant connectivity and have essentially a very good experience.

Greg Macchia (00:43:59):

Because, especially when you're talking about replay, having that feel and not suffering from extra lag and latency is very critical to the setup. That shows it with a single user, and it can be expanded; here's a larger server and multiple users. The UI Gateway is able to give access to multiple users. The other thing it does is the web multi-viewer. I mentioned that, and we'll get into a little more detail, but because this extra piece of hardware and software manages, or has access to, all of the sources, both live inputs and replay, we can now serve that as a web multi-viewer, which could be for those remote operators, or for another production person located somewhere else.

Greg Macchia (00:44:57):

So, a little more about that particular appliance. It's a UI Gateway, it's Linux-based, and it creates H.264 over WebRTC, so it's very reliable, with low latency. Because you never know what bandwidth you'll have, especially depending on where the venue is, you have controls within it to determine what each source is getting, so you can go anywhere from 5 to 50 megabits per second to really optimize what you see based on the connectivity. And again, it's able to support multiple servers and many sources.
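For planning purposes, the per-operator bandwidth is roughly tiles-on-screen times per-tile proxy rate. The numbers in the sketch below are assumptions chosen for illustration (a low-ish proxy rate for a constrained home connection), not recommended UI Gateway settings.

```python
# Rough downstream bandwidth estimate for a remote operator's proxy UI.
visible_sources = 12        # tiles the operator actually has on screen (assumed)
per_source_mbps = 4.0       # assumed proxy bitrate per tile for a home connection
overhead_mbps = 1.0         # monitoring audio plus signalling, assumed

downlink_needed = visible_sources * per_source_mbps + overhead_mbps
print(f"~{downlink_needed:.0f} Mb/s downstream at the operator's home")  # -> ~49 Mb/s
# Dialing each source down, or showing fewer tiles, is how a production fits
# into a typical residential connection.
```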

Greg Macchia (00:45:39):

And here, we talk about plus or minus 42 sources, but if you need more, it's simple to just add another UI Gateway and expand what you can do there. This is an example of what that web multi-viewer looks like; it's showing that it runs in a web browser, and we actually have a video where I'll go into a little more detail, with tallies. The other very important thing is audio monitoring: the audio is also available. Within any of these sources, whether it's a replay or a live source, I'm able to listen to the different audio within the architecture, to monitor all the audio as well.

Greg Macchia (00:46:28):

This is just showing, now that we add that piece, a graphical representation of your hardware kit, let's say, that could be sitting in the production truck at the site. The server, plus the UI Gateway; you could have nearline storage; you've got your audio mixer and your graphics, so all of those engines, let's say, are living together. And then, remotely, you have your different operators doing their specific roles, attached to the system.

Greg Macchia (00:47:02):

We've done a lot of this work, real work; there are a lot of good articles in SVG. Certainly early in the pandemic, with ESPN's whole Live From Home initiative, operators were at home, connected into a server on the East Coast, doing live-from-home productions using the ViBox in this type of setup. Next is the web multi-viewer; I want to go into a little more detail because I think it's very powerful. Again, when we talk about remote, we're talking about people being in different places, and again, ease of use. This is as simple as sending somebody a web link; they connect, and then they have the web multi-viewer interface in front of them to see what's happening.

Jim Jachetta (00:48:07):

Do you want me to roll the video?

Greg Macchia (00:48:09):

Yep. We can roll it. It’d be great.

Jim Jachetta (00:48:14):

Okay. There we go.

Greg Macchia (00:48:25):

Here we go. So here, again, it's web-based, so you see I'm connecting. This would be me, let's say, as the administrator, the engineer setting up the system. We have security built in, we know that's needed, so there's password protection to get into the system. This is me logging in, and now you see a web configuration page. This is where I would go to set up the system. Here again, you see the password, so you can determine what that is. And for added security, knowing what the requirements might be, there's a whitelist and a blacklist, so you can determine who connects and who doesn't.

Greg Macchia (00:49:04):

Then we add another layer, which is a waiting room. For example, when I create a layout, I can determine: do viewers automatically go in, or do I have to let them in? This is just showing some monitoring. Again, for me as the engineer, if I want to see any of the sources that are available, I can quickly click, and you see, okay, the sources are available and ready.

Greg Macchia (00:49:28):

And then, in the last tab there on the left, I can go to the layouts. Here you see examples; I can build multiple layouts. Once I have them, it's simply a matter of selecting the web link, the browser link, and sharing it with the user. So here, just showing quickly how to build one: I'm going to build a new layout. I have the option, is it a waiting room or not? So I can determine whether I let them in. Here, I'm saying, "Okay, I'm going to build a layout for a replay producer."

Greg Macchia (00:50:02):

You can see there, we have predetermined layouts. You can select from a one-box up to a four-box, two big boxes, eight little boxes. I can select those. And then you see, in each individual tile, I can determine, okay, here are all my sources available to me. So I can be very flexible in how many and what types of layouts I build. Once I save it, you see it's available to me there, and then I can share it with users.
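To give a feel for what such a layout definition amounts to, here is a hypothetical sketch of a "replay producer" layout as plain data. The schema, field names, grid code, and URL are invented for illustration; the real UI Gateway configuration format is not shown in this webinar.

```python
# Hypothetical layout definition for a web multi-viewer (schema invented here).
replay_producer_layout = {
    "name": "Replay producer",
    "waiting_room": True,                 # admin must let each viewer in
    "grid": "2x6+2",                      # two program tiles plus two rows of six
    "tiles": [
        {"slot": 0, "source": "PGM 1"},
        {"slot": 1, "source": "PGM 2"},
        # one row of live cameras, one row of replay channels
        *[{"slot": 2 + i, "source": f"CAM {i + 1}"} for i in range(6)],
        *[{"slot": 8 + i, "source": f"REPLAY {i + 1}"} for i in range(6)],
    ],
    "share_url": "https://gateway.example/mv/replay-producer",  # placeholder link
}

for tile in replay_producer_layout["tiles"]:
    print(tile["slot"], tile["source"])
```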

Greg Macchia (00:50:31):

So here, I’m just going to show an example. Let’s say there’s a remote talent, so I’m going to be sending them the program. So here, you’ll see, they try to connect and they’re getting the spinning wheel because they’re sitting in the waiting room because I’ve determined that. Now, for me, as the administrator, I can go in and you’ll see here on my monitoring tool, okay, there they are. They’re in the waiting room. I see that they’re there and available. Now, I can say, “Okay, let them in.”

Greg Macchia (00:50:59):

Now, they're going to be able to access it. You'll see when I go back, they've been put into the connection, and the system wakes up. Here you go: on the talent side, in my web viewer, I'm going to go full screen, and I'm watching the program output here. And you see there that I'm also monitoring audio. Within the system, I have access to all of the embedded channels. So here, I'm just showing an example: if I want to listen to different audio, because I have up to 16 embedded audio channels, I can select any of those and choose which pair I'm listening to.

Greg Macchia (00:51:37):

And then we even also have custom audio. Especially if you’re working with a mixer, you have some mix-minus, a talent is going to have particular things that they want to listen to. They can select that. Now, I’m listening to that specific custom audio. So you see very flexible in the power and what we can do.

Greg Macchia (00:51:57):

Now, I just want to show how the interaction works. Here, I'm going to show the example where I'm the director, the live director, and I'm going to show the live UI. The screen on the left will be the actual operator screen, and on the right, I'm going to show the web multi-viewer. This could be next to that director at his home, or maybe it's somebody else. You'll see on the left, I have my UI, and I'm going to be cutting around, and you'll notice on the right, which is my web multi-viewer, that the red tally is moving around based on what I'm cutting. In the top left, you'll see my program output switching there as well, and as I change or key graphics, you'll see them on the right too. That could be a web multi-viewer for a remote production person watching and able to see everything that's going on as I operate on the left-hand side.

Greg Macchia (00:52:55):

That was for the live. Now, again, just to show the different layouts, this is now the scenario where I could be a remote replay operator. So this could be the replay operator. Again, you’ll see their UI on the left and on the right is the web multiviewer. That could be a distant replay producer. So there, the layout has the two programs on the top. And then on the bottom, you’ll see there’s two rows of six. One row is the live cameras and the other row is the replay. So you’ll see the replay operator on the left is controlling, that bottom row moves along with the replay operator, but the top stays live. So as a replay producer, I can see exactly what they’re doing. When they roll the replay in the top program, one and two, I can watch what’s going on. As they control, I can see everything that’s going on. And again, as they’re controlling those replay sources, I still see the live. So I have a lot of flexibility in what I can see.

Greg Macchia (00:54:02):

One other feature: because we have picture-in-picture, I'll show an example where, if I'm recording the clock, on a replay I can throw the clock up there as a picture-in-picture, directly from the replay operator. So again, as the remote producer, I can see exactly what's going on. There in the top left, I see that they've got the distinct replay. And again, that shows the full interactivity of the web multi-viewer.

Jim Jachetta (00:54:36):

Very good. Let me give you back control. There you go.

Greg Macchia (00:54:49):

Here we are, working in our collaborative workflow, right? We don't have any questions so far? Nobody in our audience out there?

Jim Jachetta (00:55:03):

I think maybe everyone’s overwhelmed with so much good information. We’re still digesting it. So, no questions yet.

Greg Macchia (00:55:12):

Okay. Yes. I know it's a lot, and maybe I'm going through it quickly, but hopefully it shows the different pieces that we have. We have a lot of products, a lot of different applications. For us, it's the building blocks: the building blocks coming together for live production, whether I'm on site or at home, whether it's cloud, data center, or control room, all of these bricks come together to build our system. They can be used together, or they can be moved around as needed. So hopefully that shows our flexibility.

Greg Macchia (00:55:55):

The web multi-viewer is part of the UI Gateway; I was showing that product as integrated in our system. But that product in itself is a very powerful tool, so we'll be releasing a standalone web multi-viewer. If you've got a whole bunch of sources, which could be streaming sources or physical sources and hardware, we're building the web multi-viewer as a standalone platform to offer that. We build it on multiple pieces of software, microservices. Per system, let's say, we have 16 channels of video that we can manage, along with those 16 channels of embedded audio, and then we can also take in more audio over Dante or AES. Again, we're doing H.264 over WebRTC, and then you have the web server, which manages the distribution based on how many users are going to be using those multi-viewers.

Greg Macchia (00:56:59):

And our first iteration is going to be very powerful. We can put a bunch of 16-channel systems together to get all the way up to 256 video channels of multi-viewer. It's a big number, and it's a lot of audio: 5,120 audio channels. So, as part of the multi-viewer, over a web browser, being able to monitor the audio as well is a very powerful part of what we're offering. And the number of multi-viewer outputs you have will depend on the architecture of the system you build around the web server.
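Working backwards from the quoted totals: 16 systems of 16 video channels gives the 256 video channels, and dividing the quoted audio total by the video total implies 20 audio channels per video channel. The 16-system grouping follows from the talk; the audio split is inferred from the numbers, not a published spec.

```python
# Reconstructing the quoted channel counts (the audio split is inferred, not official).
video_per_system = 16
systems = 16
total_video = video_per_system * systems      # 256 video channels
total_audio = 5_120                           # quoted total
audio_per_video = total_audio / total_video   # 20.0 audio channels per video channel
print(total_video, audio_per_video)
```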

Greg Macchia (00:57:44):

So, remote production workflows. I like to get into actual applications that we've done. This is the idea of having your operators at a different location from the system. For us, a perfect example is when you're doing a REMI workflow and you want to add some higher-frame-rate cameras, so 3x, 4x, 6x, 8x SuperMo. That's a lot of feeds to send back in a traditional REMI for just one camera. So we implemented this workflow where we put our hardware on site, added that one additional UI Gateway, and recorded all of those extra sources on site into our server. Then we just sent back the one or two outputs of that server. In this instance, in a real workflow that we did, we actually sent it back as SRT, so we weren't tying up any of those satellite channels. The replay operators were back at the production center, or they could be at home, controlling that server and those SuperMo replays. Here's a similar idea for golf. Here, at the top, this is at the venue; this is a truck on site at the golf course, and it's multiple servers: we networked the replay servers together and recorded 36 ISO sources from the golf course. The operators were remote; in this case, they were actually in Europe. So in Europe, the remote replay operator was controlling servers that were sitting at the venue in the US. What was going back to their control room was a typical world feed. They were getting the feed and some sources, and they actually had a replay box there that was part of their main production.
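The return path in that REMI setup can be sketched generically: push one program output from the on-site server back over the public internet as SRT in caller mode. As with the earlier ingest sketch, ffmpeg stands in for the actual on-site encoder, and the source file, address, and port are placeholders, not a real endpoint.

```python
# Generic SRT contribution sketch: push a program output to a remote receiver.
import subprocess

subprocess.run([
    "ffmpeg",
    "-re",                                  # read the source in real time
    "-i", "program_out.ts",                 # hypothetical program output
    "-c", "copy",                           # no re-encode on the way out
    "-f", "mpegts",
    "srt://receiver.example.com:7001?mode=caller&latency=300000",  # 300 ms buffer
], check=True)
```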

Greg Macchia (00:59:56):

But the key was these additional outputs coming from the venue with these additional sources. The operator was in Europe, controlling all of those sources back on site and basically supplementing the golf coverage: using those layout changes on the fly, covering certain players or groups that were important to them, and adding a huge benefit to their traditional world feed coverage. Here's another real example, a different application, this time for the RefBox, so review. We've been working with the International Ice Hockey Federation, and for certain women's, juniors, and men's tournaments, they supply the review for the video goal judge, medical, and the teams. We put this large infrastructure at the venue. The video goal judge is there, in case any review needs to happen; there's a medical component, if somebody gets hurt; and then the teams. What's big about the teams in this setup is that it's about full transparency: they're all connected into the same server. So those 24 sources that the video goal judge might be using, the teams have access to as well, and they can view them on a tablet sitting at the benches. It's a very powerful setup.

Greg Macchia (01:01:34):

But the other remote piece is that, at the same time, the IIHF has judiciary and disciplinary review, other rule infractions that they monitor, and they do this remotely. They connect in from their Zurich office into the system at the venue, wherever the tournament is taking place.

Greg Macchia (01:01:59):

And one more here: I mentioned Tata. We work with them, and again, because we're software based, we actually run our software on their private cloud, in their data center. They have their infrastructure where they receive feeds from on site. This is also a review scenario, where there were multiple judges for sailing events who couldn't be on site. It's a slightly different version: here, they're running with Teradici. They run those UIs next to the server instance in their infrastructure, and then we provide the remote interfaces. These judges are all in different places, each with their own control, and they can work in tandem doing reviews for the events happening on site.

Greg Macchia (01:02:58):

And then we talk about the cloud. Because our core design and system are software, we're able to move those components to the cloud. Here, you see a drawing similar to what I showed earlier. We have our application layers, but instead of connecting to hardware or the data center, we're connecting to the cloud. We run the required instances in Amazon or Google, for example, and you have your ingest and play-out coming from whatever sources. It might be, and something we hopefully will be doing more of, your bonded cellular, where you provide those feeds to us in the cloud. We have our system there, sized to the production requirement, and then we offer the playback.
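As a minimal sketch of what "running the required instances in Amazon" can look like at the infrastructure level, here is a boto3 call that launches a single compute instance. The AMI, instance type, region, and key name are placeholders; Simplylive's actual cloud images and sizing are not described in this webinar.

```python
# Illustrative only: launch one cloud host for a software production backend (AWS/boto3).
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")   # assumed region

resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="g4dn.4xlarge",       # assumed GPU-class instance for video work
    KeyName="production-key",          # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)
print(resp["Instances"][0]["InstanceId"])
```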

Jim Jachetta (01:03:46):

Right. Yeah. Our partner Haivision, with their cloud streaming, has physical units with SDI out and IP out, but they also have cloud offerings. The common denominator for interoperability is SRT, right? So we can sit adjacent to the Simplylive cloud and hand off live camera feeds as SRT that way.

Greg Macchia (01:04:10):

Exactly. So you see here, again, a graphical representation. In the cloud we have our key pieces: our backend server, a software audio mixer, the UI Gateway to provide the multiviewer or optimized UIs, and then your graphics. All of that is integrated together in the cloud. And as you mentioned, SRT. We have a product, the Venue Gateway: if you have, let’s say, physical SDI cameras on site, we have the utility to take those and bring them in as SRT. Or, like you said, Haivision could provide those to us. And then you have all your individual users, with whatever requirements they have, connecting in, including remote talent, to do a full live production in the cloud. This is not fake, this is not pretend. This is core to our vision and what we do, and we’re able to do it.

Greg Macchia (01:05:06):

And here, again, just another example. This one is just around replay. There are other providers out there; for example, we’re also partnered with Panasonic on their solution, their Kairos solution. So you may want to use that on the switcher side? No problem. We can just launch this as replay, and here, again, your SRT or UDP feeds in and out. We do that for replay. If you want to work with our UI Gateway, we call it optimized because we control the whole thing and it’s a very full experience. But again, if you wanted to use Teradici or NICE DCV, we could also run the UI next to the server in the cloud instance and just do it as a thin-client scenario. We’re able to do both.

Jim Jachetta (01:05:58):

Okay. I should also add that, in addition to SRT, we can interop with NDI as well. I see on all your slides you have transport: UDP, SRT, NDI. Our cloud solutions can support all of that. So whatever your IP preference is, most likely we support it.

Greg Macchia (01:06:24):

Exactly. And there, too, JPEG XS, that’s another one that’s on our list. So as that grows, as that changes. For us, our design is that all of the work, everything we do, is in the server, in our software, in that backend engine. So for us it’s: let’s make it as simple as possible, no matter what your format is, to get it into that engine. Then we process it; once they’re in, they’re essentially the same. So it’s relatively easy for us to add more I/O capability.

Greg Macchia (01:06:59):

One last thing to talk about, again on live production, is eSlomo, so AI Super-Mo. When we talk about efficiency and cost savings, this is the ability to take a standard camera and, using AI software that we designed, create 2x, 3x, 4x AI-generated Super-Mo for your clip. The first iteration we have today takes a clip that you’ve made, maybe a 3, 4, 5 second clip, sends it through the engine, and it automatically comes back to the server, and you can play it out with that smoother, generated Super-Mo replay look. From there, we’re working toward having that engine on the network to be able to do it live. The idea is that a replay operator will be able to scroll back in the record train, go into this AI Super-Mo, maybe wait a very short period of time, a second or so, and once the AI is ready, load it. You can take that standard camera and play it back on the fly as a higher-frame-rate replay. And that is my spiel.
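
Simplylive’s AI engine isn’t public, but as a rough illustration of the underlying idea, synthesizing intermediate frames so a normal-rate clip plays back as smoother slow motion, ffmpeg’s motion-compensated `minterpolate` filter can be used as a stand-in. This is only an analogous off-the-shelf technique, not the product’s model; the filenames and frame rates below are arbitrary examples.

```python
# Rough illustration only: motion-compensated frame interpolation with ffmpeg's
# minterpolate filter, as a stand-in for the AI Super-Mo concept described above.
# NOT Simplylive's AI engine; filenames and rates are placeholders.
import subprocess

SOURCE_CLIP = "clip_2997.mov"      # hypothetical 29.97 fps clip pulled from the replay server
INTERPOLATED = "clip_11988.mov"    # 4x frame rate; played at normal speed it looks like 4x SuperMo

cmd = [
    "ffmpeg",
    "-i", SOURCE_CLIP,
    # mci = motion-compensated interpolation, aobmc = adaptive overlapped block motion compensation
    "-vf", "minterpolate=fps=119.88:mi_mode=mci:mc_mode=aobmc:vsbmc=1",
    "-c:v", "prores_ks", "-profile:v", "3",   # ProRes 422 HQ mezzanine for the replay/edit chain
    INTERPOLATED,
]
subprocess.run(cmd, check=True)
```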

Jim Jachetta (01:08:23):

Well, thank you.

Greg Macchia (01:08:25):

Hopefully everybody didn’t get tired of hearing my voice, but I certainly appreciate the opportunity. A lot of stuff packed in there. And as you can see behind me, we have the studio here, so I’m happy to get into more detail with you and with end users who would like to go a little deeper on the many different things we talked about.

Jim Jachetta (01:08:48):

Absolutely. Absolutely. Yeah. I wish I had more questions, but we only signed a couple of weeks ago, and I wanted to get Greg on the webinar. I was listening, checking Facebook, and TDing here while on camera, so I’m learning like the rest of you. If you folks have any questions, let me put this slide up. Let me take back control and show my screen. Here we go. Yeah, you can call VidOvation at (949) 777-5435. You can email me at jimj@vidovation.com. And we’d welcome an opportunity, as Greg mentioned, to set up a demo. We can do a virtual demo, we can do an in-person demo. We’re very excited about the synergy.

Jim Jachetta (01:09:51):

I think you can help a lot of our customers, Greg. Not everyone can afford a high-end production switcher. Not everyone can afford high-end replay. So you tick a lot of boxes. You work in the cloud, you work with physical hardware, you can do touchscreen. And for those old-school operators who insist on having a button, you can facilitate that as well. So I think you can help a lot of our mutual customers produce more content, and that’s what it’s all about.

Greg Macchia (01:10:30):

Yeah, exactly. And for us, we try to show how high we can go and how far we can go. When you talk about replay, our replay is very affordable but powerful, just as powerful as the high end. So when I show those larger-scale setups, it’s to say: hey, Simplylive offers a solution at the high end, but, by the way, it’s also available at an entry level. And it’s the same software, the same powerful capability. So a high school that’s using our replay, with the UI that operator is using, if they go to a full-scale production using Simplylive replay, it’s the same. It’s the same experience. They’re going to have more sources, they may have to switch layouts, there will be more demand on them. But they will have learned on that mini system what they need to go all the way up. That’s what we’re here to do.

Greg Macchia (01:11:36):

And we have a lot of high-end users, so we can show the level that we get to, because for us it’s about that quality. And then being able to say: hey, if I’m a high school or a university looking to do my own productions, or a conference that just signed on to broadcast all of this content on ESPN+, how do we afford all of that kit, how do we set it up, how do we maintain it? No problem. We come with an easy solution that ESPN themselves uses. So certainly, for a conference or a school to be able to do that, absolutely. We can do that, we can train them, we can make them feel comfortable.

Greg Macchia (01:12:19):

And then when you talk about the cloud: as people adapt to it, you have even less of an on-site requirement. You say, all of a sudden, I need another control room, I want to get another four-camera production going. Fine. You just spin up that instance, and the requirements for the operator are minimal, a web multiviewer. Boom, I can spin that up and have it ready for you very quickly.
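
As a hedged sketch of what “just spin up that instance” can look like in practice, the snippet below uses boto3 to launch a single EC2 instance from a prebuilt machine image for a pop-up control room. The AMI ID, instance type, key pair, and security group are placeholders, not Simplylive’s or VidOvation’s actual deployment recipe.

```python
# Hedged sketch of spinning up a pop-up control room host on EC2 (not vendor tooling).
# AMI, instance type, key pair, and security group are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",            # placeholder: image with the production software preinstalled
    InstanceType="g4dn.2xlarge",                 # placeholder: GPU class commonly used for video workloads
    KeyName="demo-keypair",                      # placeholder SSH key pair
    SecurityGroupIds=["sg-0123456789abcdef0"],   # placeholder: must allow the SRT/UI ports you need
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Name", "Value": "popup-control-room"}],
    }],
)
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id}; terminate it after the show to stop paying for it.")
```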

Jim Jachetta (01:12:46):

Right. I don’t have the budget for a truck. I don’t have a budget for the people. I don’t have a budget for the travel expenses, et cetera, et cetera.

Greg Macchia (01:12:55):

That’s a huge component. Exactly. The pandemic certainly pushed that, and now I think the gray area, the question mark, will be how much of it sticks, because that’s a big part of it: those operational costs, traveling people, the number of people on site, all of those things. Look, for me as an operator, our technology literally allows you to do it from home. So I don’t have to travel. I can set up my system and, hey, guess what? I can do two or three productions in a day. If I were traveling, I couldn’t be in New York and California on the same day to do a production. This way, I connect to the venue, or to the small truck that might still be on site for one production, and then later on I connect to the other, and I do both from the same location.

Jim Jachetta (01:13:49):

Yeah. You’d be hard pressed to do two or three in one week. You’d be lucky if you could do one, between traveling and going to the venue, and the truck can’t be at two different venues on the same day. I think, too, with some of your NEPs, your Game Creeks, when we have frank conversations, I don’t think the trucks will go away, but they’ll be smaller trucks, and there’ll be more of them. Or for a lot of customers there’ll be a sound stage without a control room in it, and the truck is parked outside in the street. So I just think the assets will be shifted around. They’ll be moved, and we’re here to adapt as our customers’ needs change.

Jim Jachetta (01:14:37):

Simplylive had popped up on our radar quite a few times, but what finally got me to connect with you, Greg, was some of the work with the PGA. The final rounds of golf tournaments get covered in the prime weekend slots, your Saturday and Sunday afternoon broadcasts. But whether it’s the PGA or Turner Sports or other sports entities, there are practices, practice rounds, and the rounds leading up to the final tournament. We’re also doing a lot with fishing. Believe it or not, people enjoy watching bass tournaments on television, and that’s coverage that normally wouldn’t happen: lower-level tournaments, qualifying, everything leading up to the final rounds. They don’t have the budget for a truck. Cellular and Simplylive production tools make this type of coverage possible and affordable.

Greg Macchia (01:15:49):

Absolutely. That goes back to more content and smaller budgets. And that’s what we hear, because sometimes the argument is, oh, well, you’re taking jobs away, fewer people. No, no, no. The idea is more, more.

Jim Jachetta (01:16:01):

More hours, right?

Greg Macchia (01:16:03):

More hours, more events. Exactly. You talk about more: again, SVG had a great article about ESPN and what they did at the US Open tennis. The outer-court coverage, again, was not traditionally covered. With ViBox, they originally started with nine outer courts, and in the article that just came out, they went all the way up to 14 courts. So 14 outer courts that would never have been covered in a traditional sense, because of the cost and the number of people, and now they’re doing it. Essentially the entire tournament is covered, from the qualifiers all the way to the end. And it was certainly interesting this year, because it was a qualifier, if I’m not mistaken, who won the tournament. All of that is being covered, and they’re able to do it in a small facility.

Greg Macchia (01:16:49):

And again, the quality of the production that we allow them to do really shows. It’s a real experience and a real story about what we’re doing. Also the Olympics: again, SVG had a little write-up about NBC and the smaller control rooms they built to add to the live coverage they were doing, because there are so many outlets and feeds around something as huge as the Olympics. Again: more coverage, more content.

Jim Jachetta (01:17:23):

Right. You touched on it too, Greg: operators, whether you’re union or non-union, you don’t get paid while you’re on the airplane, you don’t get paid while you’re traveling. You’re paid while you’re on site. So if you can do two or three events a day instead of one or two events a week, you’re going to make more money. So I think this is the new normal, and then the industry won’t have a shortage of personnel. We have so many outlets for content now, Hulu, YouTube. Everyone has a live streaming platform of some sort.

Jim Jachetta (01:18:13):

A lot of these bass tournaments that we’re doing, some of it will appear on ESPN, but a lot of it just goes to websites. People either plop down a subscription to gain access, or there are commercials that run; the CDN streaming to the website inserts commercials like a regular broadcast, and that’s how the revenue is generated. So linear television is not the same anymore. I can’t get my kids to sit down on the couch and watch television with me anymore. They’d rather be in their bed, watching TV on their little screen. So the way we watch is different. It’s like, don’t you want the big screen and the surround sound? Nope, I’m good, Dad. I watch this way.

Greg Macchia (01:19:01):

It’s great. But even for us, I want to watch on the big screen, and sometimes I’m like, oh, wait a second, it’s not on linear. Wait, which streaming service is that on? Okay, I’ve got to go there to be able to watch it. But that’s it. My kids are off in college now, they’re playing, and I want to be able to watch their feed. I would be willing to pay for it, five dollars, eight, whatever it might be. It might be small, but there’s that part of it for the industry to learn: hey, how do we make some money on this? Now I have an easy way to put a feed out there, and I know that alumni or parents are willing to pay. It’s not going to be a million viewers, but there might be a thousand here, five hundred there from the different events, and that starts to add up. And you attach commercials, like you say, to those different things.

Jim Jachetta (01:19:59):

Right. We’re helping some small production companies with just that. Obviously, if you’re in the Bible Belt, Friday night lights, high school football, is more important than professional or college football. These high school kids are already like semi-pros. So for getting that onto the local TV station, they don’t have a huge budget and satellite is too costly, so cellular combined with Simplylive is a great solution. In Southern California, believe it or not, high school football is very important, and some of that makes it onto secondary broadcast channels. Turner Sports does some interesting things, too, where they may not be the primary rights holder for a PGA tournament, but they have the rights for the warmup. They interview the players while they’re warming up, and they’ll stream that to Facebook. And Procter & Gamble will pay-

Jim Jachetta (01:21:03):

Procter & Gamble will pay for advertising because they get half a million to a million eyeballs watching the warmup of some PGA tournament. So whether you’re a golf fan or any kind of fan, whatever your content or your sport is, you’re peeking behind the curtain, and in that alternative content we find some gems. VidOvation and our partner AviWest helped the PGA facilitate bringing back the first professional event in May 2020, during the lockdown. It wasn’t a sanctioned tournament; it was a charity skins game. I believe the PGA switched that show in-house with their own control room, which, because the pandemic was going on, was probably available.

Jim Jachetta (01:22:05):

We’ve done some PGA tournaments where the main control room was busy and they used Simplylive either remotely or in an ancillary area, and they were able to cover minor tournaments at the same time as a major. So Simplylive was able to give that extra capability to a customer on short notice. Let me see, I think I see a little question mark here. Did somebody have a question? Oh, this is a great question. Russ asks what file format is being recorded and whether it’s accessible to other editors, say Avid or Premiere.

Greg Macchia (01:22:49):

So our default codec is DNxHD, DNxHD 145. We have ProRes now on the export side, so I can export as ProRes or DNxHD, depending on what’s needed, but typically the servers are running DNxHD 145. When I export, I can keep that, but I can also transcode to H.264 or even XDCAM. Typically in those workflows I’d be running DNxHD 145 and export that out. I can send that to the editor, the editor edits it, and then they can drop it into a watch folder. The server is connected to the watch folder, so it shows back up, and then I have it in my clip database or in a particular bin that I set up, and I can bring it in. So we pretty much have the full workflow. Of course, on the Avid side we’re doing some additional development for check-in and things like that, which would make an Avid workflow specifically even more integrated.
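
To illustrate the watch-folder half of that round trip, here is a minimal sketch, not Simplylive’s implementation, of a Python poller that watches a folder and transcodes any new mezzanine export (DNxHD or ProRes) to H.264 with ffmpeg. The paths are placeholders, and it assumes ffmpeg is on the PATH.

```python
# Minimal watch-folder sketch (not Simplylive's implementation): poll a folder and
# transcode any new DNxHD/ProRes export to H.264 for web delivery with ffmpeg.
# Paths are placeholders; assumes ffmpeg is installed and on the PATH.
import subprocess
import time
from pathlib import Path

WATCH_DIR = Path("/media/exports")    # hypothetical folder the editor drops finished pieces into
OUT_DIR = Path("/media/web_ready")

def transcode(src: Path) -> None:
    dst = OUT_DIR / (src.stem + ".mp4")
    subprocess.run([
        "ffmpeg", "-i", str(src),
        "-c:v", "libx264", "-preset", "fast", "-crf", "18",  # visually clean H.264
        "-c:a", "aac", "-b:a", "192k",
        str(dst),
    ], check=True)

if __name__ == "__main__":
    OUT_DIR.mkdir(parents=True, exist_ok=True)
    seen = set()
    while True:
        for clip in WATCH_DIR.glob("*.mov"):
            if clip not in seen:
                seen.add(clip)
                transcode(clip)
        time.sleep(5)  # simple polling; a real system would also verify the file finished copying
```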

Jim Jachetta (01:23:50):

Very good. What I like is the proxies for remote. WebRTC is the protocol primarily used for esports, right, for low latency and high quality. I’m not a gamer, so I won’t pretend I understand everything about it, but if you’re collaborating and gaming with someone else around the world, latency obviously would be a problem. And WebRTC, you touched on that, Greg: it’s robust, the latency is low enough, and it makes it possible to operate remotely. I learned a new term recently from one of our customers. They said, we’re doing remote directing. So the director may not be pressing the buttons, but because of COVID or health concerns, or just not having the time, the director may want to stay in their Malibu house.

Jim Jachetta (01:24:51):

And they’re calling the cameras over comms to a TD who is either physically on site, or the TD is also remote. So, remote TD and remote directing, and they’re even doing this in non-live production. Have you dabbled in that? We’re getting requests where the director of a scripted or cinematic project wants to see live proxies out of your ARRI or your RED camera, so they can confirm they’ve got the shots and do some remote directing. There could be a live mixed element, or it could be live just for video assist, for seeing what’s going on set, or even for movie executives: an executive producer wants to make sure they’re actually filming, not everyone sitting around drinking coffee on the remote set, because we’re burning $200,000 an hour having the set lit. What’s happening, folks, are we shooting today? Have you gotten into that space? Have there been requests for that kind of bridging?

Greg Macchia (01:26:06):

Kind of the live-to-tape thing, although it’s not tape. I mean, yes. We were doing talk show work where they were using us to cut. The other thing, in addition to the ISOs, is we can record the program. So you’re basically doing a rough cut; that’s recorded, and along with the ISOs it’s pulled off the system and goes to post-production, and then they’re able to edit and do the final show. So absolutely, that’s available. We had actually done a demo early on for one of these, I don’t know if it was a cooking show or some kind of show. It didn’t get too far, but certainly, yes.

Greg Macchia (01:26:47):

So those are interesting applications to get into, and I think we obviously can do that. Again, we’re a small company, a small team, but we want to be that way. Feedback I get goes directly to R&D, directly to the teams, to say: hey, can we do this? Can we make a UI specific to a certain workflow? Can we bring a data feed in and work that in? These are all things we’re open to, and we hope to keep getting that feedback. I’m going to say our product is good, but we’re never satisfied. We always want to make it better and develop it based on the feedback of the [inaudible].

Jim Jachetta (01:27:34):

Yes, yes. Also, you said you’re a small company, but you’re also working with some of the largest players in the industry. And when VidOvation partners with a vendor, we do our due diligence, too. So I had some pretty lengthy talks with the PGA: hey, should we sign with these guys at Simplylive? Is this a good product? And ESPN, NBC, PGA, I hope I’m not breaking any NDAs right now, but I said it, Greg, so it’s on me.

Jim Jachetta (01:28:09):

They all said it’s a rock-solid solution. You guys really have developed a robust product that’s affordable. And it’s a little bit of a different workflow; I would say a simplified, easier workflow. You were telling us during our initial training that you really don’t need much training. I guess the thing to get used to is that two-thirds of the screen is the red area, so tapping there is an immediate take. If you want to preview your takes first, you hit the right one-third of the screen and it pops up to the preview area; then you tap the preview and it goes live. So you go pop, pop, if the director likes to ready camera two first, okay.

Jim Jachetta (01:29:01):

The TD hits the green area to bring it up in preview, then, wait, oh, he changed his mind, take three, boom, you hit the red area for three. I’ve done some amateur video engineering, never TD’d, but I volunteered at Saddleback Church as a video engineer. Because I knew what a vectorscope was, and a waveform scope, I knew what that was, they said, you’re hired. Well, I’m a volunteer. And I’m always convinced the director is going to take the camera I haven’t shaded properly. He’ll be like, ready camera two, take four. Where’s the ready for four? You skipped a step. So the TD has to be ready to take four.

Jim Jachetta (01:29:45):

Because we’re not previewing four. But at Saddleback Church, a lot of times the director was the TD, so he knows what he wants. Sometimes that was a negative, though. There would be a good shot, but the director is busy hunting for a macro button instead of watching the multiviewer, and the good shot was missed. So if you’re short on staff, with this you can afford to have a director watching the multiviewer, making sure they’re getting all the shots. An operator can multitask, but it’s not necessary, right?

Greg Macchia (01:30:28):

Look, it’s all right there in front of you. On the training part, we train people the day of; most of them haven’t seen it before. We show them, and it’s like, look, if you don’t like the 70/30 split and you want to make it 60/40, you have that option, but it’s-

Jim Jachetta (01:30:47):

Oh, I know that. Okay.

Greg Macchia (01:30:47):

It’s right there in front of you. You see the shot, you’re pointing at it, just touch the screen. Oh yeah, there it is. It’s so intuitive and easy to use. I mean, we spend an hour training somebody, and even within that hour, less than an hour, look, in five minutes, like the little reel that I ran, I’ve given you the main things. Going through that, the basics are there.

Greg Macchia (01:31:11):

They’re covered. You’re able to cut cameras, okay. I’ve got a key for graphics, okay. Now it would be nice to do a replay. You start with one replay, then as you get comfortable, I’ve got a second replay; now I’m going to get that crowd reaction in there. So it builds, but the base training is very short. And yes, it’s the director and the producer who are actually doing it, and for a lot of them it’s fun and exciting: hey, this is cool, I get to do it, I actually like doing this. It’s been a great experience. And you brought up the switcher and “where is it?” For me, the interesting part is training the next generation of production people. We take away that physical intimidation, saying, no-

Jim Jachetta (01:32:00):

Yes, you’re not looking down at buttons, because otherwise you don’t know where your fingers are. There’s like a little bump, so, okay, five is in the middle, and I’ve got ten cameras. You’re feeling by braille where you are, and you have to look down and cheat. This way, my screen is my multiviewer and I don’t miss anything. So if your production doesn’t have the personnel for a dedicated director and a dedicated TD, you won’t miss any of those shots.

Greg Macchia (01:32:33):

You won’t miss it. And you’re learning. Again, if it’s new people, you’re learning the other nuances of telling that story: what cameras to cut, when to cut, being able to say, hey, camera two, zoom in a little bit, I want a certain shot. It comes down to those little details. Replays: when do I do a replay, and all of those things. So now we allow you to focus on what story you’re trying to tell. It’s all there, it’s all in front of me, and I’m learning that. And then, yes, you can eventually be a director or a TD on a larger production with a larger, physical switcher. Part of it for us, too, is training people. But honestly, the hardest for us are probably TDs, and nothing against TDs, but they’re used to a certain way. So we’ll wait.

Jim Jachetta (01:33:27):

Yes, they’ve been punching buttons: now what do I do? But if they really need to have the buttons, you can facilitate that; you have a variety of control surfaces if that’s what’s warranted. One thing, too, you and I had talked about this, and I just want to let customers know. The AviWest StreamHub is the brains; it’s not just the receiver. I think it’s a great name: it’s a streaming hub. There are ins and outs, SRT in, IP in, IP out, and if it’s a physical device, there are SDI ins and outs. What I’m envisioning is spinning up an instance, and we typically use AWS, so spinning up a demo instance of Simplylive and connecting it to the AviWest cloud. The VidOvation team will get some training, and then we can invite customers.

Jim Jachetta (01:34:31):

I love the whole waiting room idea. So we could have a demo with an XYZ network next week, and we would send them credentials: here’s the server, you log in. Then we could take an SRT output and hit an IRD in their master control, so they could see the program out of the fake show we’re going to switch during the demo. We can even transport over AWS Elemental MediaConnect. We’re doing a lot of that with the PGA, where we either give them an entitlement to the MediaConnect SRT feed, or they make a request and we hand off to them.
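
For the handoff Jim describes, the receiving end can be as simple as an SRT listener that records, or decodes, the incoming program feed. Here is a minimal sketch using ffmpeg with libsrt; the port, passphrase, and output filename are placeholders, and this is not a description of any specific IRD or MediaConnect feature.

```python
# Minimal sketch of the receiving end of an SRT handoff, e.g. into a master-control
# recorder or monitor. Not a product feature; port, passphrase, and filename are placeholders.
import subprocess

LISTEN_URL = "srt://0.0.0.0:9000?mode=listener&passphrase=change-me-please"  # wait for the caller
RECORDING = "program_feed.ts"

cmd = [
    "ffmpeg",
    "-i", LISTEN_URL,
    "-c", "copy",      # keep the contributed encoding as-is
    "-f", "mpegts",
    RECORDING,
]
subprocess.run(cmd, check=True)
```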

Jim Jachetta (01:35:18):

So AWS and the MediaConnect cloud are a big part of our business. We also work with Google and others. You mentioned Tata; Tata does own some AviWest gear. Maybe we can talk more about that, to help proliferate Simplylive and other VidOvation offerings, and maybe we can help each other with some introductions. We need the cloud providers to make it all happen. VidOvation is not quite big enough yet to have its own cloud. I don’t know about you guys, Greg, but-

Greg Macchia (01:36:01):

Yes. The cloud for us is something we’re still working through, but right now a lot of the testing is with the knowledgeable ones, the networks. So we’re doing some testing with them, and there it’s like: look, we own our AWS instances, what do we need? Okay, we’re just licensing software to them and working with them to do that. But then there’s the model that says, well, what about conferences or universities or high schools? They want to cover their football, they want to get it out there, but they don’t know how. Fine. They would have to send the cameras out, and we would say, you want to do a four-camera production?

Greg Macchia (01:36:51):

No problem. We would potentially say: here’s a Simplylive cloud instance, here are the specs, what do you need? Send the feeds here. Where are your operators going to be? Here’s the connection for them to control and do the live switching and replays, and the output goes out. It’s that model we’re working on. That’s the idea for us: we’ll have the software, or provide kind of the full kit, and probably partner with others because, like you said, we’re not big enough to do all of that ourselves. But we’ll still be the engineering and backend management of it, because there’s security, there are other things, and we’re working through all of that.

Greg Macchia (01:37:44):

Look, we’re engineers at heart. So we really work hard on it, working, testing, making sure, because we know this is live. We come with a lot of experience, and I think that’s a big part of Simplylive, the people behind it. That’s really, really important, because we know live: if something happens to the program, nothing’s on the air. So we know the stress, we know what it’s supposed to look like, the pains, the good and the bad. We’re not saying we know everything, but we certainly come with that understanding. Hopefully we answer 80% of the questions, they feel it’s pretty much covered, and then there are the little details specific to what we do. We’re able to deploy quickly and easily and hopefully push that technology. Right.

Jim Jachetta (01:38:46):

Right. Well, I think the important thing, too, is that I’m not aware of any of our customers that are fully, a hundred percent cloud. It’s a hybrid: there can be cloud elements and there can be physical elements. When we did the Ryder Cup with Turner, they did have some trucks on site and they did the shading locally, because when you stream over bonded cellular or the public internet, there can be a half second or a few hundred milliseconds of latency. And sometimes, shading a golf ball going through the air, okay, it’s in the woods, it’s against the trees, now it’s in the sky, now it’s on the green, shading through that whole arc can be challenging. So, like you said, Greg, we’ll design a system to whatever you need: fully in the cloud, partially in the cloud, or a mixture of the above.

Jim Jachetta (01:39:46):

Well, I think we’ve gone more than 90 minutes, and this is all great stuff, Greg. I can talk shop all day with someone like you. So thank you so much for doing this and for the great presentation. What I may do in post-production is have my team edit the real video in so that the choppy video is not in the recording. So we’ll fix up the recorded content; it takes us about a week to get it ready. That’s a common thing: we’ll get a flood of emails later today, oh, darn it, I missed the event. So we try to put it up within a week, and it’ll be on the website. You’ll see more Simplylive content on the VidOvation website, or you’re certainly welcome to go to Simplylive at simplylive.tv. Correct? That is your URL?

Greg Macchia (01:40:42):

Yes, certainly. For now, we have a video section, so you can see a more detailed demo, kind of like I showed here, but getting a little deeper into the all-in-one, the replay, the RefBox. We have all of those on our website, so you can have at least a first look; the video will be smooth and a lot easier to understand. And then, certainly, we’re here, along with Jim, to work together, take the next step, and talk about it.

Greg Macchia (01:41:15):

Yes. And whether it’s deploying a kit or testing the cloud aspect of it: look, we’re on the East Coast, Jim is on the West, and we have kit here. So we can show you, look, there are cameras in the office there sending to us, the server is sitting here, and the operator is somewhere else. You have the operator there in your office and you can say, look, here are the cameras, here’s the operator switching, and the servers are over in Pennsylvania. Showing that experience, it’s real, it works. Right?

Jim Jachetta (01:41:49):

Right. We can show you a real-world demo where some of the pieces are in Southern California, et cetera. We typically spin up AWS instances close to where the customer is working, but most of our demo stuff is in AWS in Oregon on the West Coast. But yeah, we might be on the East Coast in AWS, or even in Google somewhere else in the world, and we can have the SRT or NDI streams going back and forth. So we can show that it’s not just theoretical; it’s real-world, and we really do know how to collaborate. Well, thanks so much, Greg. Thank you for joining us. Thank you, everyone, for tuning in. One last thing: Bob said he appreciated the shout-out for Kairos. I’m sorry, Russ, Russ appreciated the shout-out. No, it wasn’t Russ, it was Bob Hawkinson. Do you know him? Is he with-

Greg Macchia (01:42:54):

Yes, yes, absolutely. Look, again, that’s Simplylive working with Panasonic; they obviously do great things. So we show what we do with the all-in-one, but we’re also able to integrate with fantastic partners, because there are a lot of good pieces out there. Fitting those together, and being flexible and open enough to do that, is very, very simple for us.

Jim Jachetta (01:43:28):

Yes. I should also mention we’re a Panasonic partner as well, so maybe we can broaden the integration. That’s part of what VidOvation does. I mean, we’re a reseller and distributor, but primarily I think of us as a systems integrator or a technology integrator. Not every customer has the technical know-how on staff to put bonded cellular in the cloud, integrate it with Simplylive, install it on AWS, or connect to MediaConnect. We smooth out all those bumps. We make all that easy, and we have the support of vendors like Simplylive to help us facilitate that and make it easier for our mutual customers. So, all right, Greg. Well, thank you so much. Thanks, everyone. We’ll have the recording up in about a week, and if you have any questions, please call (949) 777-5435, that’s the main VidOvation number, or email sales@vidovation.com. That’s V-I-D-O-V-A-T-I-O-N dot com. Have a great day. Thanks again, Greg. We’ll talk to you soon. Take care. Bye-bye.

 
