I noticed that if I want to use the reverse-stabilization mesh warp workflow as demonstrated here: Mocha Pro 2021: Intro to PowerMesh Warp Tracking - YouTube, and the footage is trimmed in After Effects, then after I precomp, the duplicated reverse-stabilization effect does not have the proper offset.
To reproduce: take footage longer than necessary, so Mocha is using a trimmed part of it. Do the Mocha mesh warp and set it to unwarp; it's stabilized nice and pretty. Then precomp it. Now copy-paste the effect, set it to "warp", and see that it does not sync with the unwarp. The workaround is to first precomp the parts you need to work with and then use the workflow, or, after the fact, find a way to make them sync by precomping with "leave all attributes" for both precomps. It's error-prone, with too much chance of user error. This workflow should work seamlessly, but I guess it requires the offset to happen internally (since there are no keyframes). What do you think, Team?
BTW - After Effects' Warp Stabilizer reversible-stabilization workflow suffers from the same gotcha.
This is because of a bewildering problem when precomping in After Effects.
If you trim a clip in a composition, we take that trim and apply it as a project range in the Mocha UI, just like After Effects does, so that the source frames match exactly.
However, if you precomp that trimmed clip, the precomp layer is now assumed not just to be trimmed, but to actually start at the frame it was comped on, i.e. it assumes there are no more frames in the clip.
In text form, this looks like this, where ||| = frames outside of the trim range and XXX = frames inside it.

A trimmed clip in AE:
|||||XXXXXXXXXX|||||

A precomped trim in AE:
XXXXXXXXXX
This means that when a copy of an existing Mocha effect is applied to the precomp, there are suddenly fewer frames at the head of the track, and everything is moved forward.
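The frame-index mismatch described above can be sketched in a few lines of plain Python. This is purely illustrative, not an AE or Mocha API; the frame numbers and variable names are hypothetical:

```python
# Illustrative sketch of why pasted tracking data shifts after precomping
# a trimmed layer. All numbers are hypothetical, not real project values.

TRIM_START = 24        # hypothetical: clip trimmed to start at source frame 24
TRACKED_FRAMES = 50    # hypothetical: number of frames Mocha tracked

# Before precomping, the tracking data is aligned to SOURCE frames,
# so the tracked range begins at the trim-in point of the source clip.
trimmed_clip_frames = list(range(TRIM_START, TRIM_START + TRACKED_FRAMES))

# After precomping, the precomp layer is assumed to start at frame 0:
# the frames before the trim-in point no longer exist.
precomp_frames = list(range(0, TRACKED_FRAMES))

# The same internal tracking data now lands TRIM_START frames too early,
# so the pasted "warp" pass no longer lines up with the "unwarp" pass.
offset = trimmed_clip_frames[0] - precomp_frames[0]
print(offset)  # frames out of sync
```

With these made-up numbers the two passes are 24 frames out of sync, which is why precomping before tracking (so both passes see frame 0 as the start) keeps them aligned.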
At present, the only way around this is to precomp your trimmed layer BEFORE you track.
We are looking into a way to avoid this automatically by detecting when the footage layer is a precomp, but we're exploring other solutions as well.
Thanks, Martin. Yes, I think if the data were keyframes, then this could be resolved, because the reverse-stabilization workflow for PowerPin does work on trimmed footage. When you see the pasted data, it's also easy to see how it aligns. Because here the data is internal, you can't offset it…
So automatic would be great, but if not, maybe there's a way to create a keyframe for it. Then it could be easier. Maybe.
There are a couple of new features coming which should help this workflow, but we’re still trying to address the initial problem.
When are the new features you speak of coming? Can you comment on what they are?