At-Home Production (REMI) using Cellular & the Public Internet
Maintain Frame-accurate Genlock and Lip Sync across Multiple Handheld Cameras for Your Live Remote Production
Ken Kerschbaumer (00:05):
Hi everyone. And welcome to our first panel discussion of the day. Obviously our goal with this event is
talking about remote production. But we want to take a little bit of a different angle and talk more about
what life is like for people at home, and then recommendations for people who have been working in new
ways over the last seven months. So it’s great to be joined by Jim Jachetta, VidOvation’s CTO and co-founder.
Jim, how are you?
Jim Jachetta (00:24):
Good. How are you, Ken? Thanks for having us.
Ken Kerschbaumer (00:27):
Glad to have you. And James Japhet from Hawk-Eye North America, managing director. How are you,
James?
James Japhet (00:32):
I’m pretty well, Ken. Yourself?
Ken Kerschbaumer (00:33):
I’m doing all right. So we must start with James. You are in the Appalachians, I believe, which I’m
amazed you even have wifi and connectivity down there. It’s great.
James Japhet (00:43):
Yeah, it’s a plus. Yeah, no, it’s a welcome change from the city. So yeah, no complaints from my end.
Ken Kerschbaumer (00:51):
Yeah, sure. And I guess that gets to one of the key challenges we want to talk about today, which is,
people are working from new places, right? And places where you maybe wouldn’t think you would be
able to do a video production. So from your standpoint, because I know Hawk-Eye, Josh was on the
previous panel discussion discussing his working from home. So give us a sense of what the last seven
months meant for Hawk-Eye. Because I know you guys have been key for a lot of the PGA golf events.
What’s it been like for you and your team as far as this new normal, if you will?
James Japhet (01:20):
Yeah. I mean, the first half was very quiet and the second half has been pretty busy. It’s been an
interesting time. It actually threatened to tip over just before COVID hit, with the work we were doing
with the PGA at The Players, with their Every Shot Live project. But it’s been interesting. We’ve taken a very
different approach to most with regards to remote production and the replay side of it, choosing to
leave all of the high-res on site and simply creating, almost, a portal into it. So yeah, it’s a very
low-bandwidth solution, which means, as you say, even in the Appalachians I’ve got half a chance.
Ken Kerschbaumer (02:04):
Right, right, right. Well actually, I do want to start with The Players. Because that was, without a doubt…
I got to watch about five or six hours of the very cool every shot, every hole live coverage. Walk me
through it real quick, because your operators were based in the UK. So maybe that’s a perfect example
of this new workflow. How was that accomplished? How many people, was it 20, 18, 22
operators? How many were in the UK? I’m quizzing you now.
James Japhet (02:32):
Yeah. Close. We had 28 actually. But yeah, effectively we had all our servers on site in Florida capturing
every single angle on the course, so 120-plus angles coming into the system. And effectively we were
producing 25 concurrent streams, one for every group that was on the course. It actually crept up to 27
or 28 at one point, just because of a bit of slow play. But yeah, effectively we were curating a stream for
[inaudible 00:03:04] groups, cutting in graphics and TrackMan integration for the PGA’s nonlinear
broadcasters around the globe.
Ken Kerschbaumer (03:14):
Yeah. It was amazing. It was really good. I wish we had the whole tournament obviously, because-
James Japhet (03:19):
So do we. We learned a lot of lessons on day one.
Ken Kerschbaumer (03:22):
Don’t say anything, right?
James Japhet (03:23):
Well, I mean, we were very much looking forward to day two, come the end of day one. But yeah, we
thought we were going to sneak it in, when the PGA hadn’t called it off at the end of the day. But-
Ken Kerschbaumer (03:36):
I was gutted, as you like to say. Gutted, is that the expression?
James Japhet (03:39):
Yep, gutted. That is indeed the phrase. Yeah, very good.
Ken Kerschbaumer (03:42):
Right. So Jim, from your standpoint, what’s the last seven months… You guys had some interesting golf
also. I guess when golf [crosstalk 00:03:47]-
Jim Jachetta (03:47):
Yeah. Yeah. We thank God every day that we’ve been busy pretty consistently since COVID started. We
are working maybe with different clients in different ways. We had done a couple of tournaments with
the PGA, a tournament in the Caribbean, some other tournaments with them, experimenting: can we do
at-home production, or a REMI production, over an unmanaged network like cellular? And one of the big
challenges in doing any kind of REMI production at home, and I’m sure it’s important to James as well,
is maintaining genlock and audio lip sync. Bonded cellular was really meant for a single camera going
to the courthouse steps to interview someone. Sometimes there was a camera operator, sometimes
not; they’d put it on a tripod, and the operator would be their own camera person.
Jim Jachetta (04:53):
So it really wasn’t meant for multi-camera. So VidOvation, with our partner Haivision, we’ve found this
sweet spot, now going on the better part of five years, doing as many as 50 cameras, all maintaining
synchronous, frame-accurate genlock and audio lip sync. And it started with this crazy live TV show with Big
Fish Entertainment. And even though it’s a reality TV show, a lot of the workflow, a lot of the tech, is like
live sports. They dump all the raw video into an EVS. They play it back to the director. The director picks
their shots. So there is up to a 15-minute delay in that production. And then with the PGA, they tried it
out, had a couple of early events that were successful. And then in April, May, there was the first golf
event that came back, the Skins event.
Jim Jachetta (05:58):
They did it at a private club where there was no fiber; Switch, Level 3, no one had a connection
into that venue. The only alternative maybe would have been satellite. So the PGA trusted us. We came in
and provided them the eight-modem PRO380 bonded cellular units. They mount on the camera, so the
operators can move around. There are some systems that will do multi-channel, but they have to be
tethered, there are cables. So they had two photogs in the tee box, two photogs on the fairway, and then
two on the green. And they kind of rotate as groups come through.
Jim Jachetta (06:47):
And then a big requirement is handling analog audio. Now some cameras have analog audio inputs on
the camera, but if you’re using a smaller camera, that’s not necessarily the case; you’re stuck with the
internal microphones of the camera. I think of myself as more of a video guy than an audio guy, but there
are anywhere from four to eight audio channels for every video channel. There’s never enough audio
channels. So each of our field encoders does four today; it’s on the roadmap to do eight. I think
customers would like 32 channels of audio. But we have a way around that. I learned about Toptracer,
the tech that draws the red line. This may compete with you, James. But-
James Japhet (07:48):
No, no, no. You go right ahead.
Jim Jachetta (07:49):
We all know how Hawk-Eye has revolutionized tennis. If we’d only had it when McEnroe was around,
there would have been fewer tantrums. But Toptracer, tracing the ball. I didn’t know this: that
telemetry is actually sent to the graphics equipment back in master control via an audio channel. So
having analog audio available on the camera made Toptracer very viable. And then the PGA, they have
operators with parabolic mics picking up sound from a distance, following the crowd. They had lapel
microphones on the talent. There were commentators walking the course with the players. And for audio
only, they used the small PRO320. It has two modems. We just put them on their internal video test signal,
ran it at a lower bit rate, and then used the analog audio inputs. So for the commentator following the
talent, his microphone would go into analog audio one.
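As an aside, sending data over an audio channel like this is a long-standing broadcast trick. Here is a minimal sketch, assuming a simple frequency-shift-keying scheme; Toptracer’s actual encoding is not public, so the baud rate and tone frequencies below are purely hypothetical:

```python
# Purely illustrative FSK modulator: telemetry bits become one of two audio
# tones, which can then ride a spare analog audio input on the camera.
# This is NOT Toptracer's scheme; rates and frequencies are invented.
import math

RATE = 48_000        # audio sample rate, Hz
BAUD = 1200          # telemetry bits per second
F0, F1 = 1200, 2200  # tone for a 0 bit, tone for a 1 bit

def fsk_modulate(bits):
    samples, phase = [], 0.0
    samples_per_bit = RATE // BAUD
    for bit in bits:
        freq = F1 if bit else F0
        for _ in range(samples_per_bit):
            phase += 2 * math.pi * freq / RATE
            samples.append(math.sin(phase))  # float samples in [-1, 1]
    return samples

tone = fsk_modulate([1, 0, 1, 1, 0, 0, 1, 0])  # one telemetry byte
```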
Jim Jachetta (09:04):
Then he’d have a lav receiver for one of the players he’s near going into audio two. And then for
diversity, there might be another lav receiver on another camera, just in case cameras drift away. So
you can have redundancy in your audio paths. And so in the case of the PGA, there were seven or eight… no,
there were nine simultaneous camera feeds, some tight, some wide. And then dozens of microphones
open all at the same time. You can imagine the audio guy would go crazy. If anything is off by a bit, when
they tee off you get whoosh, whoosh, whoosh, all this echo. And with a live show, you can’t fix this in post.
So you really have to have frame-accurate, perfect genlock. That’s really the piece that you need, that
frame-accurate genlock.
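The frame-accurate genlock Jim keeps returning to boils down to one idea: stamp every frame at capture, then hold every feed to one fixed play-out delay, so all cameras emerge in lockstep regardless of per-path network jitter. A minimal sketch of that idea, using hypothetical structures rather than Haivision’s implementation:

```python
# Toy aligned play-out: frames from any camera are held until a fixed
# delay after their capture timestamp, so every feed stays in lockstep
# no matter how much jitter its cellular path added along the way.
import heapq

PLAYOUT_DELAY = 1.0  # seconds; the latency picked at the start of the show

class AlignedPlayout:
    def __init__(self):
        self.pending = []  # min-heap ordered by capture timestamp

    def ingest(self, capture_ts, camera_id, frame):
        heapq.heappush(self.pending, (capture_ts, camera_id, frame))

    def release(self, now):
        """Emit every frame whose capture time is PLAYOUT_DELAY behind now."""
        due = []
        while self.pending and self.pending[0][0] <= now - PLAYOUT_DELAY:
            due.append(heapq.heappop(self.pending))
        return due
```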
Ken Kerschbaumer (09:59):
Right. Right, right. Yeah. So James, is that an issue for you? I mean, lip sync. Let’s take the
PGA, because you guys have been a part of almost every tournament, right? I think that’s the case.
James Japhet (10:09):
We have. Yep. So far yeah.
Ken Kerschbaumer (10:11):
Walk us through where you fit into those regular productions, if you will. And also the
championships too, right? The Masters. Sorry, not the Masters [crosstalk 00:10:20], the Championship, the US
Open.
James Japhet (10:22):
The Masters is coming up next month. Yeah, we’ll be a part of that as well. But yeah, I mean, just going back
to the lip sync: we saw that as an issue, or a potential issue, for us. So again, by leaving
everything we had on site and only offering the operator a portal into those machines, we actually
avoided that being a problem for us on the golf. But yeah, I mean, the migration from Every Shot Live to
your regular tournaments and the deeper events, be it the championships we’ve had thus far: we’ve had replay
operators for the network operating from their living rooms with a 15 to 20 Meg pipe, just a standard internet
pipe, able to access the servers on site and operate them as if they were sat in front of them.
James Japhet (11:16):
So we’ve taken a very different approach to most. We felt that, to Jim’s point, there isn’t always
connectivity everywhere. There isn’t always a lovely Gig-plus pipe that’s available. We focused on
just making sure that people had access. So for the control device, the T-bar and jog wheel, et cetera,
we basically created our own low-latency, almost remote desktop, which was designed specifically for
the purpose of video transport.
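A hedged sketch of the pattern James is describing: the heavy media never leaves the venue, and only tiny control events, T-bar position and jog ticks, travel over the operator’s home connection. The message format and endpoint below are invented for illustration; Hawk-Eye’s actual protocol is proprietary:

```python
# Send lightweight replay-control events to an on-site server over UDP.
# Each event is tens of bytes, so a 15-20 Meg home pipe is ample; the
# high-res video stays on site, as described above.
import json, socket, time

SERVER = ("replay-onsite.example.com", 9100)  # hypothetical endpoint
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_event(kind, value):
    event = {"ts": time.time(), "kind": kind, "value": value}
    sock.sendto(json.dumps(event).encode(), SERVER)

send_event("tbar", 0.35)  # run playback at 35% speed
send_event("jog", 1)      # step one frame forward
```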
Ken Kerschbaumer (11:54):
Right, right, right. So what was the challenge for the operators as far as getting set up? And were you
nervous at all that this was going to go off? Were you pretty confident this would work?
James Japhet (12:06):
I mean, a little bit of both. I think-
Jim Jachetta (12:10):
Did you sleep at night?
James Japhet (12:13):
I never have any problems sleeping, Jim. That’s fine. But we were certainly a little bit… I mean, look,
whenever you introduce something new, and especially when you introduce it on the stage that is golf,
when there was as little sport going on as there was at the time, there’s always going to be a slight
nervousness to what you’re doing. We were fortunate in the fact that we’ve done remote video for
years and years in the officiating space. Jim, we transport all your cameras back for the NHL to Toronto and
New York every game. So we’ve got a fair familiarity with your tech and the remote video side, and
transporting that video. So that alleviated some fears, certainly. But again, we put it through its paces
internally. We put it through its paces behind closed doors externally. But as everyone knows, when the
golfer approaches the first tee, there’s very little you can do. Thankfully the tech stood up and
continues to do so now across multiple sports.
Ken Kerschbaumer (13:19):
So for both of you: people in our industry have worked a certain way for decades,
right? They have a rhythm to the way they do their shows, the way they show up on site and do their
productions. What’s your advice for people who are going to be working in new ways? For
example, James, if I was working in my living room, what do I do for backup in case the wifi gets cut
off? In the winter time, I’m always curious, because we’ll have power outages and things like that. So
what’s your advice for people who are looking to work this way, on how they can be comfortable working
this way?
James Japhet (13:53):
I mean, it really doesn’t change very much whether you’re sat in the truck or whether you’re sat at
home; you still have an engineering support network there to help you out should something go awry.
Okay, it’s live TV, there is stuff that does go awry, whether you’re sat in a truck or otherwise. But I think,
if you’re open to it and you’re happy to just ride the bumps that may come along, wherever you
may be sat, it’s not as scary as all that.
Ken Kerschbaumer (14:24):
Right. Right. Jim?
Jim Jachetta (14:26):
Yeah. To that point, with any kind of change in workflow there’s a learning curve. Cellular today,
particularly Haivision cellular, is just about as reliable as satellite. But even satellite’s not perfect; there’s
rain fade. So a director and a TD have to work together: hey, take camera one, but if something
happens, dump to three, a wide shot. There can be a little hiccup. In a live show like Live PD, they like
some of the edginess that sometimes happens, but they’re going 120 miles an hour in cop cars.
Jim Jachetta (15:15):
Or, if you’re in a little bit of a dead spot, increasing the latency tends to smooth that out. This weekend
the PGA was at a little bit more of a challenging course. And one of the things with cellular: people
always think bandwidth is the evil thing. Yes, you do need bandwidth, but first it’s latency. Latency
that’s varying. Latency that exceeds a couple hundred milliseconds. That’s the first evil of video
transport. We don’t have a variable buffer, because if the latency was changing during a show, that would
be horrible. You can’t produce a show where cameras are changing in latency. So you have to pick a
latency at the start of the show. In our systems, if you run into problems, it’s a good idea to have profiles
in your unit for one second, two seconds, three seconds.
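The profile idea is simple enough to show concretely. A minimal sketch, with hypothetical names and values; the point is that the latency figure is chosen once, before the show, and never varies afterward:

```python
# Fixed-latency profiles: pick one at the top of the show and hold it,
# because a buffer that adapted mid-show would let cameras drift apart.
PROFILES = {
    "clean":       {"latency_s": 1.0, "max_video_kbps": 8000},
    "normal":      {"latency_s": 2.0, "max_video_kbps": 5000},
    "challenging": {"latency_s": 3.0, "max_video_kbps": 3000},
}

def start_show(name):
    profile = PROFILES[name]
    # Every camera is buffered to the same figure for the whole show.
    return profile["latency_s"], profile["max_video_kbps"]

latency_s, kbps = start_show("challenging")  # e.g. a congested course
```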
Jim Jachetta (16:11):
So on Saturday there was a little… or Friday, there was a little bit of choppiness in one of the PGA
events, and the techs, we conferred: yeah, the latency on the cellular connections is not great on all of
them. But they’re like, but we’ve got five bars. It’s congestion in the backhaul from the tower to the
telephone central office. There’s congestion in the SONET network connecting all the telco central offices
together. So there’s a lot of telephone-company infrastructure between the encoder and the
public internet. Actually, once it hits the public internet, then we’re home free. The public internet now
tends to be more reliable. And the PGA probably has a 100 Gig pipe, or redundant pipes, into their facility.
Jim Jachetta (17:07):
So it’s the first mile, between the tower and hitting the public internet, that’s where… So there
was some congestion, unexplainable. So they’re like, why are we seeing less-than-perfect video? So we
upped the latency to three seconds. Now, there are challenges with that. How do you direct a show? Like, hey,
camera one, zoom in. Well, that already happened three seconds ago; he’s three seconds late. So you
have to trust your photogs. Or, just using golf as an example, leading into the green: okay, I want you to
go wide on Tiger, and then zoom in. But these photogs know how to shoot. Camera one, you follow
Tiger. Camera two, you follow Phil Mickelson. And they know what they need to do. The other challenge
is, how do we paint cameras remotely? For some of our customers, the video engineer is sitting in master
control 10,000 miles away, or he’s at home. It really doesn’t make a difference.
Jim Jachetta (18:17):
All the tools, like James and his technologies, we can remote into them from home. Maybe the server is in
the cloud in AWS or something similar, or it’s in master control. It really doesn’t matter. The challenge is,
camera control systems don’t like latency. So our data connections tend to be a couple hundred
milliseconds. For the video we need more buffering, so anywhere from one second to three seconds; it
varies in that range. The technology is heading toward half a second, but you’d have to have a really
clean network to get down there. The tech could do it. Maybe with 5G, we’re being promised lower
latency, so then maybe we can get down to half a second. So how do I control a camera? You hook up
a Panasonic, Sony, Ikegami, whatever kind of camera controller.
Jim Jachetta (19:17):
It kind of falls apart with a couple hundred milliseconds. So VidOvation and our partner Haivision are
working with CyanView. A lot of sports production teams are already working with CyanView. We’re
working with the Haivision data bridge. And what a lot of these systems do is spoof the controller. So
what happens is, you can’t control the latency in between, so the unit on the camera mimics the CCU so
the camera doesn’t time out. And then the controller in master control sends commands to it. So it keeps the
camera happy and keeps the controller happy. So there’s some trickery done there to keep
everything happy so it doesn’t fall apart. Yes, as a camera operator, you may want to be a little slower
on your controls, because there will be a little bit more latency than you would see in the studio. That’s
unavoidable. So don’t overcompensate on shading the camera; just ease into it. Again,
these are part of the workflows, part of the learning curve.
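A toy version of the “spoofing” pattern Jim describes: a proxy beside the camera answers the control-link heartbeat locally so nothing times out, and forwards real paint commands whenever they arrive from master control a few hundred milliseconds later. Ports, addresses, and message framing are invented for illustration; CyanView and the Haivision data bridge use their own protocols.

```python
# Keep the camera convinced a CCU is attached, even while real commands
# are still in flight over the high-latency WAN.
import socket
import threading
import time

CAMERA = ("192.168.0.20", 7000)    # hypothetical camera control port
MASTER_LISTEN = ("0.0.0.0", 7001)  # where WAN commands arrive

cam_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def heartbeat_loop():
    # Answer locally at a steady cadence so the control link never times out.
    while True:
        cam_sock.sendto(b"PING", CAMERA)
        time.sleep(0.1)

def forward_loop():
    wan = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    wan.bind(MASTER_LISTEN)
    while True:
        cmd, _ = wan.recvfrom(1024)   # e.g. an iris or paint adjustment
        cam_sock.sendto(cmd, CAMERA)  # relay to the camera unchanged

threading.Thread(target=heartbeat_loop, daemon=True).start()
forward_loop()
```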
Ken Kerschbaumer (20:27):
Sure, sure. So James, and for both of you, what’s your sense on the impact on the future of
the remote production compound, if you will? Right? You have people… Josh, he was fine. I think people
miss being in the commissary, although not now. So what’s your sense on the new normal as it relates to
the broadcast production compound, where your teams usually would be hanging out? Is it a good
thing that you don’t have to be on site, do you think? Does this open up more opportunities for you as a
company, and for those operators to get more jobs?
James Japhet (21:05):
I mean, I think that the number of jobs shouldn’t really be impacted by it. I think an operator will be an
operator whether they’re sat at home or sat in the broadcast compound. I mean, it is interesting. I think
COVID’s effectively acted as an accelerator for what was probably likely to happen over the next three to
five years anyhow, with more remote operations coming into being and more cloud-based operation coming
into being. I think, yeah, I think it has accelerated things. I think it’ll be very interesting to see just how
much equipment actually needs to go on site moving forward, and how much actually moves into the
cloud. And also the number of people that need to travel. Because I mean, there are benefits. Josh could
probably vouch for this: the fact that he can actually do multiple gigs in a day if he’s not having to
travel for things. So there are benefits for the operators as well. But yeah, I mean, I think it was on the
cards; I just think a fast-forward button’s been hit, as it were.
Ken Kerschbaumer (22:12):
Yeah. Yeah, exactly. So, Jim, what’s your take on the future of the broadcast compound and how things
may evolve?
Jim Jachetta (22:18):
Well, yeah. That’s a great question, Ken. So even before COVID, the dialogue was: you have your
knowledge worker, your experienced worker, whether it’s EVS or Hawk-Eye, whatever system you’re
using, your video engineer. They fly to LA to do a hockey final tournament. They can’t do another event
that same afternoon; they’re stuck in LA. Same with the truck or trucks that are doing the event in LA. I
talked to some of my friends at NEP, at Game Creek. We have a picture on our website of a
production truck with a red circle and a line through it. I’m just trying to make a marketing point; I
don’t think the trucks are going to go away. Here in Hollywood it’s very common for a stage being used as a
television studio to have no control room in that stage.
Jim Jachetta (23:15):
The control room is an NEP truck parked in the alley. So the trucks will just be in a different place. They
may stay in Pittsburgh. They may stay in a central location. Metaphorically, we may take the wheels off
the truck. So a truck could be doing an early-afternoon game out of New York, all the cameras fibered
in, or bonded-cellulared in, and then do a game in LA that afternoon. And if you get very
efficient, that same truck maybe could do three events in one day. And maybe some of
the operators are near the truck, but then some of the operators could be working from home. So this
saves cost. I know, talking to many of our customers, there’s always a shortage of EVS talent. I don’t
know. Maybe we need to get them to switch to your tech, James, so we can outsource it to the UK.
Sorry, EVS. I just try to be brand agnostic.
Ken Kerschbaumer (24:20):
Replay operators.
James Japhet (24:21):
Appreciate it, Jim.
Jim Jachetta (24:21):
Yeah, yeah, yeah. So with the PGA, they’re hiring camera operators that can drive in, so you don’t have
to get on a plane. So their exposure is limited. They’re driving in their own personal car. They’re cutting
down. I feel the biggest chance of getting sick is not breathing the air on the plane, depending upon
what you believe. The airlines tell you the air is actually very clean; they’ve always used ultraviolet
cleaning of the air. It’s the doorknob to the bathroom. And in the public world, using public bathrooms,
it’s touching something dirty and then touching our face. Even having a mask on, if we rub our eyes, we’re
done for. So people stay at home. Or you don’t have to pay for travel; you don’t have to pay per diem for
food.
Jim Jachetta (25:17):
You don’t have to pay for a flight or rent them a car. They go to the work that they’re familiar with. They
already have the shielding up in the control room. There’s someone wiping everything down every
couple of hours. So they’re either at home or going to the job, the master control they always go to. So
the PGA, in particular, is very excited at how much money they’re saving. This is saving them tons of
money. And they’re able to do tournaments they would normally not have covered. Not covered live.
Ken Kerschbaumer (25:52):
Yeah, it’s hard because the cost of all the COVID testing and all the protocols is making all those savings
vanish. But I think eventually they will say, okay, this is a new way to hire people and have them
working. And I guess, James, I want to ask you, because Hawk-Eye is so big on the cloud and
the future of cloud-based production. One of the themes for the rest of this event is all about
virtualized hardware, and this whole new opportunity to transform your product development, right?
Everything from product development to… if I was a Hawk-Eye operator, could I band together with
five other Hawk-Eye operators and create our own little band of brothers, if you will, for replay?
Create a company without having to spend $3 million buying replay servers. Let’s start with your own
internal product development, for both of you. James, how does this virtualized world improve
your product development? Is it making things faster? Are we no longer waiting for NAB in April and
then IBC in September, but you can improve products almost monthly and weekly?
James Japhet (27:04):
Yeah. I mean, Ken, we’ve never been one to wait for IBCs and the like. But it definitely provides
you more flexibility and freedom around what you’re actually producing and how things are moving
forward. I mean, a lot of what we do… we consider ourselves almost a software company that
uses hardware as a necessity. And we are cloud-capable, and we use AWS as that platform. And we are
doing the odd show actually in the cloud, and I expect that to pick up come springtime, and the level of
show on the cloud to go up significantly, accordingly.
James Japhet (27:47):
But I do think a lot of the time we’re waiting on hardware to catch up. When you’re talking about
hardware, there is a longer lead time on things, in terms of the planning stages and what you’re actually
going to go with as your box, if you will: what feature sets it’s going to have, what chipsets it’s going to
have, what ingest cards it’s going to have. And the fact that you can shift all of that out of the way, and
effectively stick some encoders into a truck that gets all the vision onto the cloud, means life is far
easier. And the product development is far, far quicker.
Ken Kerschbaumer (28:27):
Right. Right. Yeah. Jim, now obviously your stuff’s a little more hardware-centric, I believe. But what’s
your sense on the cloud?
Jim Jachetta (28:33):
Yes and no. On the encoder side, Haivision has always taken the approach that silicon, particularly with
HEVC, a silicon encoder, usually has better performance, better video quality. So they use a very efficient
HEVC codec across their family of products. And efficiency is particularly important when you’re paying
for bandwidth, like you are with cellular. So typically the rule of thumb was that migrating from H.264 to
HEVC was a 50% savings. Haivision saves maybe another 30, 35%, so it’s about a 65% savings going from
264 to 265. And then comparing Haivision HEVC, 265, to others, we see 30 or 35% more efficiency. So you
could run an encoder at two megabits per second and get the quality of 3.5 megabits per second. Or run
at five and get… I can’t do math in my head anymore. Five, and the equivalent of eight or nine megabits
per second.
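To make the arithmetic Jim is gesturing at explicit, using the rule-of-thumb figures as quoted in the conversation rather than measured benchmarks: if one encoder is 30 to 35% more efficient, the other needs roughly 1/(1 - 0.35), about one and a half times, the bitrate for comparable quality:

```python
# Rough bitrate equivalence under a quoted efficiency edge. Jim's on-air
# equivalents (2 -> 3.5 Mbps, 5 -> "eight or nine") round a bit more
# generously than the strict arithmetic below.
def equivalent_bitrate_mbps(actual_mbps, extra_efficiency=0.35):
    """Bitrate a less efficient encoder would need for similar quality."""
    return actual_mbps / (1.0 - extra_efficiency)

for mbps in (2, 5):
    print(f"{mbps} Mbps ~ {equivalent_bitrate_mbps(mbps):.1f} Mbps elsewhere")
# 2 Mbps ~ 3.1 Mbps, 5 Mbps ~ 7.7 Mbps
```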
Jim Jachetta (29:45):
With sports, too, you usually run a little bit higher bit rate. So that’s the hardware side. And
there are a number of factors that make Haivision bulletproof, as we like to say. The first factor is the
antennas. Not all antennas are equal. They put a lot of R&D into it; a lot of it is patented. Then the
cellular modems they use, in their category, grab more bands. They grab all the available bands. So
we’ll have eight modems connected where another vendor’s box won’t see any cellular. And people ask,
why is Haivision locking onto something? Are you faking something, Jim? What are you doing? Do you
have a fiber going up the operator’s leg or something? Jokes like that.
Jim Jachetta (30:37):
Or, is there a guy with a Yagi antenna following him, and you’re really doing microwave and cheating
somehow? But it’s not perfect; it can drop out, and you’ve got to prepare for that. But back to the
cloud. So, the receiver: with bonding, people are always like, “Oh, can I get your encoder and go straight
to Facebook, straight to YouTube, straight to the cloud?” You need an appliance, whether it’s virtualized
or physical, to put all the packets back together. Because we take the stream, which is a variable-bit-rate
stream based on available bandwidth, and we split it into eight or nine, or up to 11, paths. As you know,
James, you’ve got to put the packets in the right order to make video. Right?
James Japhet (31:25):
It helps.
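What “putting the packets in the right order” looks like in miniature: a toy reorder buffer that collects packets arriving out of order across several bonded paths and releases them strictly by sequence number. Purely illustrative, not Haivision’s implementation:

```python
# Packets from 8-11 paths arrive out of order; release them by sequence
# number, stalling on a gap until the missing packet fills it.
import heapq

class ReorderBuffer:
    def __init__(self):
        self.heap = []
        self.next_seq = 0

    def push(self, seq, payload):
        heapq.heappush(self.heap, (seq, payload))

    def pop_ready(self):
        """Yield payloads in sequence order."""
        while self.heap and self.heap[0][0] == self.next_seq:
            _, payload = heapq.heappop(self.heap)
            self.next_seq += 1
            yield payload

buf = ReorderBuffer()
for seq in (2, 0, 3, 1):      # arrival order across the bonded paths
    buf.push(seq, f"pkt{seq}")
print(list(buf.pop_ready()))  # ['pkt0', 'pkt1', 'pkt2', 'pkt3']
```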
Jim Jachetta (31:27):
And that happens in the buffer; that magic orders all the packets before they play out. So the Haivision
StreamHub software is Linux-based. And they call it a hub because it’s more than just a decoder: it decodes,
it does transcoding, encoding. The transcoding capability is very helpful, because not all cloud resources
support HEVC yet. To my knowledge, Facebook still will only take H.264. So you don’t have to leave the
Haivision ecosystem to reach Facebook; you can transcode right in the unit. And we can do that on a
physical Supermicro 1RU server in someone’s master control or data center, or on a virtualized instance in
AWS, Webconnect, et cetera. And how does Haivision connect with other systems? So if a customer is using…
on our last panel it seemed a common denominator that a lot of people were using vMix. The NHL, Major
League were using vMix. So how does Haivision get in and out of a vMix environment?
Jim Jachetta (32:40):
Well, I have a slide that shows Haivision now has SRT input and output capability in their StreamHub
software. So we could accept a feed from a vMix, or we could send a feed to switch and produce
a show via vMix. But we also support transport stream over IP, RTMP, HLS, RTSP. NDI is coming later this
year, as is SMPTE 2110. So those ins and outs… and I’m sure you do similar things, James. That interoperability.
It seems most hardware now supports SRT, so we’re banking on that being the hooks to get in and out,
to operate fully in the cloud moving forward.
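SRT as the common “hook” is easy to demonstrate with stock tools. A sketch using ffmpeg built with libsrt to push a synthetic test stream to an SRT listener such as a vMix input; the address, port, and settings are hypothetical, so check your receiver’s documentation for its expected SRT mode:

```python
# Push an H.264 test stream over SRT in caller mode. ffmpeg must be
# compiled with libsrt; in ffmpeg's srt protocol the latency option is
# given in microseconds (2000000 = 2 seconds).
import subprocess

subprocess.run([
    "ffmpeg",
    "-re", "-f", "lavfi", "-i", "testsrc2=size=1280x720:rate=30",
    "-c:v", "libx264", "-b:v", "3M",  # H.264 for broad compatibility
    "-f", "mpegts",                   # SRT commonly carries MPEG-TS
    "srt://203.0.113.10:9000?mode=caller&latency=2000000",
])
```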
Ken Kerschbaumer (33:33):
Gotcha. [crosstalk 00:33:33]-
James Japhet (33:34):
Jim, just to pick up on your points around transport streams: I think that’s actually going to be
fascinating. The move from 264 to 265, and people are already talking about 267 not being all that far
away; these generations are actually coming far quicker than they have previously. I think that’s going to
be a real big game changer for the cloud and cloud production generally. Because if you’re only having
to swap out encoders in order to get a far more efficient compression, it’s a far more attractive
proposition for people. If you’re not having to swap out all of the server hardware and everything that
goes with it, I think it enables you to actually progress far, far quicker as an industry, really.
Ken Kerschbaumer (34:19):
Sure. Sure. So, last question. You mentioned, James… I looked at what you guys did for The
Players. It really proved, I think, that large-scale production can be done remotely. Right? A big,
massive… I mean, a golf show is always the kind of thing where you would say you’re always going to need a
lot of trucks on site, because there are just so many cameras and so many inputs and replay channels and
what have you. And then for the regular season you’ve got four operators who are involved. And I want to
have your advice, from both of you, for a lot of the freelancers out there who are freaked out, right?
They’re being told this is the new normal. But they’re like, okay, so I can’t travel just because of the
COVID situation. So they’re like, well, I’m stuck at home. What am I going to do, right?
Ken Kerschbaumer (35:08):
So in general, it seems as though there will be opportunities, if you’re a good operator, to embrace this
new way of working. Yeah, you may lose some frequent flyer miles, as people like to say, and the Hyatt
status may drop off a little bit. In general, what is your advice for people who want to get engaged in
this new world?
James Japhet (35:31):
Yeah. I mean, we’ve seen plenty of hands go up with the networks we’ve been working with. There’s no
shortage of operators putting their hands up to say, actually, I’m okay forgoing those air miles and hotel
points. Mainly because they don’t get the reduced pay on the travel days either side, and can actually earn
full whack on all three days of a one-day gig. So yeah, I mean, we’ve certainly not seen anyone being shy
in terms of putting hands up. And I think I’d just encourage people to continue embracing it and actually
volunteer themselves to have a go, rather than necessarily sitting back, waiting and expecting to be
asked.
Ken Kerschbaumer (36:12):
Right. Great. Good advice. Jim?
Jim Jachetta (36:14):
Well, yeah, I believe there are services now, freelance marketplaces, where people can put their
capabilities up. A broadcaster could post: next Saturday we need three EVS operators, we
need three Hawk-Eye operators. And match the talent with the need. One thing, to your last point, Ken:
we’ve been talking about operators working remotely. What about talent? I watched the evening KTLA
News, and the two anchors are 12 feet apart at the desk. They’ve got a real wide shot on them. But the
weather person is at home; the sports commentator is at home. And I have found here in the LA area,
just because you live in a $3 million mansion in Calabasas, it doesn’t mean you’ve got good internet. We did
a live show from Jerry O’Connell’s house, and his wife was cutting his hair.
Jim Jachetta (37:30):
I can never… Rebecca, I can’t remember her last name. She was the original blue woman in-
Ken Kerschbaumer (37:35):
Rebecca Romijn Stamos.
Jim Jachetta (37:36):
… Hunger Games.
Ken Kerschbaumer (37:36):
[inaudible 00:37:36]. Rebecca Romijn, she’s not a Stamos [crosstalk 00:37:38]-
Jim Jachetta (37:40):
Yes. Yes. So I spent the afternoon in their house, the nicest people. We all had masks on. But they had
sketchy internet. So you can take a bonded cellular encoder, and our larger encoders have two LAN
connections. So if somebody had two internet connections, one could be fiber, one could be satellite,
one could be traditional internet, and you supplement that with cellular. A lot of our customers, because
the talent doesn’t want a tech coming into their house, cleverly take a big Pelican case, strap the
bonded cellular encoder to it, mount a PTZ camera on it, and have one big red switch to turn it on. It’s
like: plug in, power on. That’s it. The Haivision unit will come on, automatically connect, automatically
transmit. And now we’re able to do PTZ camera control through the cellular. And then a video engineer,
who also is probably working from home, frames the shot, zooms in, gets on the talent. And now the
cellular is supplementing their sketchy internet, making it reliable. So I think that’s an important factor.
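The “supplement sketchy internet with cellular” idea can be sketched as packet duplication across two uplinks, with the receiver keeping the first copy of each sequence number it sees. A minimal Linux-only illustration; the interface names, destination, and framing are hypothetical, and real bonded-cellular systems are far more sophisticated:

```python
# Duplicate each packet across the house internet and a cellular modem;
# the far end dedupes by sequence number, so whichever path is healthy
# "wins". SO_BINDTODEVICE is Linux-specific and usually needs privileges.
import socket

DEST = ("receiver.example.com", 9000)  # hypothetical receive appliance

def open_uplink(ifname):
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BINDTODEVICE, ifname.encode())
    return s

links = [open_uplink("eth0"), open_uplink("wwan0")]  # LAN + cellular

def send(seq, payload):
    packet = seq.to_bytes(4, "big") + payload
    for link in links:  # same packet down both paths
        link.sendto(packet, DEST)
```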
Ken Kerschbaumer (39:00):
Yeah, yeah. I mean, look, I think that eventually, when all the events get going, and it’s going to
happen eventually, this is when the job opportunities, to your point James, will open up. I mean, right
now they are limited because of just what’s going on. But I think it’s going to be an interesting couple of
years, that’s for sure. At the least. So I really appreciate both of you joining us today. Stick around for
the Q&A at the end of the day. And we’ll see you then.
Jim Jachetta (39:23):
Thanks Ken.
James Japhet (39:24):
Thanks Ken.
Jim Jachetta (39:24):
Thanks James.
James Japhet (39:26):
Thanks Jim.
Jim Jachetta (39:26):
Cheers. Bye.