Apply tracking data to replacement sky in VR

Hi there,

Sorry for this basic question (just getting started in Mocha VR), but I wondered if someone could help me work out how to track x and y rotation and roll for a sky replacement on equirectangular (monoscopic) drone footage?

I’ve done the horizon track and reorient to stabilise, but would like to know if it’s possible to re-use that tracking data on the sky plate to match the movement in the original shot.

Many thanks in advance.


Yes, you should be able to match the movement in the original shot, but not the reoriented shot; it all depends on what you need to do. You'd have to retrack the reoriented shot for best results. I would try to add effects after the reorient, unless you're doing a rig removal, in which case I would remove the rig before using reorient, to keep the rig more stable.
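For anyone curious what "matching the movement" means mathematically: applying pan, tilt and roll tracking data to an equirectangular plate is just a spherical rotation remap. Here is a rough NumPy sketch, outside any host app, with nearest-neighbour sampling and an assumed yaw-pitch-roll order; this is an illustration of the idea, not how Mocha applies its data internally:

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Combined rotation (radians): yaw about Y (pan), pitch about X (tilt), roll about Z."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return Ry @ Rx @ Rz

def rotate_equirect(img, yaw, pitch, roll):
    """Inverse-map an equirectangular image through a camera rotation (nearest-neighbour)."""
    h, w = img.shape[:2]
    # Pixel grid -> spherical angles (lon in [-pi, pi], lat in [-pi/2, pi/2]).
    x = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi
    y = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi
    lon, lat = np.meshgrid(x, y)
    # Angles -> unit direction vectors on the sphere.
    vx = np.cos(lat) * np.sin(lon)
    vy = np.sin(lat)
    vz = np.cos(lat) * np.cos(lon)
    v = np.stack([vx, vy, vz], axis=-1)
    # Inverse rotation: where did each output direction come from in the source?
    R = rotation_matrix(yaw, pitch, roll)
    src = v @ R  # v @ R applies R^-1 (R is orthonormal, so R^-1 = R.T)
    # Direction vectors -> source pixel coordinates.
    src_lon = np.arctan2(src[..., 0], src[..., 2])
    src_lat = np.arcsin(np.clip(src[..., 1], -1, 1))
    sx = ((src_lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    sy = ((np.pi / 2 - src_lat) / np.pi * h).astype(int).clip(0, h - 1)
    return img[sy, sx]
```

Per-frame, you would feed the tracked pan/tilt/roll values in and remap the sky plate before compositing. A production version would use bilinear or Lanczos sampling rather than nearest-neighbour.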

Many thanks for the reply Mary.

Rather than paint out the drone I think I’d rather do a full sky replacement, so will follow your advice and re-track the stabilised and reoriented comp and add the new sky to that.

Thanks again,


Hi mate

I’m struggling with the same issue. Could you please explain how you solved it. Thanks

Sorry mate, I never did…

Still trying though :frowning:

If you never did, can you send me the shot and I will take a look?

Thanks Mary, that would be amazing.

I'm aware, though, that this runs somewhat counter to the point of Mocha, in that with a rock-solid stabilised shot you shouldn't need to track the pan, tilt and roll of the sky replacement to the original footage.

Even so, it would be a nice technique to know how to achieve, in the case of shots which aren’t intended to be static, or when 100% stabilisation is really difficult, such as drone shots over water.

In any case I’ll upload a shot to send over.

Thanks again

Here we go. Not the same material I was working on the first time round, but similar issue.



I have the same problem.
I followed the tutorial and erased the drone using a CleanPlate.
However, the CleanPlate in the drone area moves differently from the nearby clouds.
I posted my working procedure on YouTube, and the finished video as well.
What's wrong with my workflow?

How I erased the drone with Mocha


This would be a really useful technique to know, just for when rock-solid stabilization isn’t possible or desired.

I believe it’s possible and fairly straightforward using the Dashwood tools, for example here 360° Video Stabilization Tutorial - YouTube

Should be possible to do the same thing in Mocha too right?

Thanks for any guidance

Your technique will not create the best results. Generally, zenith replacements are better suited to a clone & patch technique (which does not require Mocha), because there is not much planar data to track in the sky. To get a good Mocha remove, you need a solid planar track on the background (in this case the clouds). If I were doing this job, I would create a simple patch offset in AE or Premiere with some feathered masks on either side of the patch area. You could use Mocha's Lens module to help here, or the built-in Adobe VR tools, to flatten the zenith view > do the comp > undistort back to 360. Look up the Mocha VR "lens workflow" for this.
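For reference, the "flatten the zenith view" step is a rectilinear (gnomonic) render of the pole. A rough NumPy sketch of the projection, assuming a simple pinhole camera looking straight up and nearest-neighbour sampling (an illustration only, not the actual Mocha or Adobe implementation):

```python
import numpy as np

def flatten_zenith(equi, out_size=512, fov_deg=90.0):
    """Render a rectilinear (gnomonic) view looking straight up at the zenith
    of an equirectangular image, so the pole patch can be painted flat."""
    h, w = equi.shape[:2]
    # Pinhole focal length for the requested field of view.
    f = (out_size / 2) / np.tan(np.radians(fov_deg) / 2)
    u = np.arange(out_size) - out_size / 2 + 0.5
    xx, yy = np.meshgrid(u, u)
    # Camera looks along +Y (up); the image plane spans X (right) and Z.
    vx, vy, vz = xx, np.full_like(xx, f), -yy
    norm = np.sqrt(vx**2 + vy**2 + vz**2)
    lat = np.arcsin(vy / norm)        # near +pi/2 at the centre of the view
    lon = np.arctan2(vx, vz)
    # Spherical angles -> source pixel coordinates in the equirect frame.
    sx = ((lon + np.pi) / (2 * np.pi) * w).astype(int) % w
    sy = ((np.pi / 2 - lat) / np.pi * h).astype(int).clip(0, h - 1)
    return equi[sy, sx]
```

The inverse of the same mapping takes the patched flat view back into the 360 frame, which is the "undistort back to 360" step.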

If you really wanted to use Mocha for this task, you should display the zenith view and try to track the clouds just below the black hole. Since the black spot does not move, you need to avoid it to get a track on the movement of the cloud pixels. Once this is tracked, use the Uberkey and make the shape layer cover the black hole; you might need two keyframes. The last step is to make a second mask layer around the patch hole. Without following this type of technique, the Remove module will not work on shots like this.

Another option is to flatten the zenith view in AE again, precompose, then use the new Adobe Content-Aware Fill tool.
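The "simple patch offset with feathered masks" idea mentioned above can also be sketched outside any host app. This is an illustrative NumPy version with hypothetical parameters (a single clone offset and a linear feather ramp), not what the AE or Premiere tools actually do:

```python
import numpy as np

def feathered_patch(img, hole_center, hole_radius, offset, feather=8):
    """Clone pixels from `offset` (dy, dx) away into a circular hole,
    blending through a soft-edged (feathered) mask to hide the seam."""
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = hole_center
    dist = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    # Alpha = 1 inside the hole, ramping to 0 over `feather` pixels outside it.
    alpha = np.clip((hole_radius + feather - dist) / feather, 0.0, 1.0)
    # Source pixels: the same grid shifted by the clone offset (clamped to bounds).
    oy, ox = offset
    src = img[np.clip(yy + oy, 0, h - 1), np.clip(xx + ox, 0, w - 1)]
    a = alpha[..., None] if img.ndim == 3 else alpha
    return img * (1 - a) + src * a
```

In practice you would pick the offset so the source region is clean cloud, and widen the feather until the blend is invisible.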

Best of luck and hope this helps.

You could certainly use Mocha Pro to support the Dashwood (or AE VR/Mettle) tools. That is the older-style workflow: you use Mocha only to track in the flattened (non-360) mode and then link the track data.

As mentioned in the earlier response, sky/zenith patching is typically better suited to a clone & patch technique unless there is very clear tracking detail in the sky (very distinct clouds, etc.).

Hi Ross-san
Unfortunately, I only understand about half of your explanation.
Do you mean filling in the zenith with copy and paste?

Hi There,

He means using rotopaint to fill the zenith with paint strokes sampled from other areas of the footage.