In an ideal world you'd be tracking something at or near the horizon, like a cloud or a very distant object near infinity, because parallax in moving video will throw off your stabilization.
I’m currently trying to figure out a solution for the same question.
Here’s what I’m working with and what I’ve tried; hopefully someone can come save the day for us.
My footage is a Steadicam shot in a tight canyon with some wobble on the Y axis that I’m trying to remove so the VR experience is clean and isn’t going to make anyone woozy.
It’s stereoscopic 180, rendered to 360 with black on the sides (since Mocha doesn’t have a 180 option).
I’ve tried a hand-off method like what you’re describing (track until a feature is almost out of frame, then overlap 10–20 frames on a new layer in a similar spot) and have had only a modicum of success reorienting and stabilizing, nothing I’d call good enough.
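For what it's worth, the hand-off itself is just a cross-fade between the two layers' per-frame track values over the overlap frames. Here's a minimal sketch of that idea in Python, purely to illustrate the math outside Mocha; `blend_handoff` and the array layout are my own invention, not anything from Mocha's toolset:

```python
import numpy as np

def blend_handoff(path_a, path_b, overlap):
    """Join two per-frame track paths (e.g. yaw values) with a
    linear cross-fade over `overlap` frames.

    path_b is assumed to start `overlap` frames before path_a ends,
    so the two layers were tracking the same motion during the overlap.
    """
    w = np.linspace(0.0, 1.0, overlap)       # ramp weight from a to b
    tail_a = path_a[-overlap:]
    head_b = path_b[:overlap]
    blended = (1.0 - w) * tail_a + w * head_b
    return np.concatenate([path_a[:-overlap], blended, path_b[overlap:]])

# toy example: one layer reads 0.0, the next reads 1.0, 10-frame overlap
joined = blend_handoff(np.zeros(30), np.ones(30), overlap=10)
```

The point of the ramp is that any disagreement between the two tracks gets spread across the overlap instead of landing as a single-frame pop at the cut.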
I then tried tracking the furthest object I could see, about 200 ft from camera, but at that distance (even at 7680x3840 per eye) there just isn’t enough detail to get a clean track.
My most recent attempt, two planar tracks on that same distant object set to translate/scale/rotate only, was really solid at removing the wobble but introduced a lot of high-frequency jitter.
These shots run between 40 and 90 seconds and have to be done as full takes, so it’s really hard to hold a solid stabilization track over 1800+ frames while the camera moves through an area with no visible horizon, or even clouds.
From what I understand it should be possible to overlay the tracks like you’re describing, and that’s definitely the proper method for near-object tracking.
What I need to know (piggybacking on this question) is how to get decent stabilization that isn’t trying to “lock” onto those tracked points, but instead just uses them to smooth out the motion before reorienting.
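Conceptually, "smooth instead of lock" means low-pass filtering the tracked path and only removing the difference between the raw path and the smoothed one, so the intended camera move survives and only the wobble gets cancelled. A sketch of that math in Python (assuming you can export a per-frame rotation value from the track; `stabilization_offsets` is my own hypothetical helper, not a Mocha function):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def stabilization_offsets(yaw_deg, sigma=15.0):
    """Per-frame corrective offsets for one rotation axis.

    Rather than pinning the camera to the tracked point (which
    re-introduces the track's own jitter), low-pass the path and
    subtract only the high-frequency residual. `sigma` is in frames;
    ~15 is about half a second at 30 fps.
    """
    yaw = np.asarray(yaw_deg, dtype=float)
    smoothed = gaussian_filter1d(yaw, sigma=sigma, mode="nearest")
    return smoothed - yaw  # add this to each frame's orientation

# toy example: a slow intentional pan plus high-frequency Y wobble
t = np.arange(300)
intended = 0.05 * t                 # deliberate camera motion (kept)
wobble = 0.8 * np.sin(t * 2.1)      # shake we want gone
offsets = stabilization_offsets(intended + wobble, sigma=12)
corrected = intended + wobble + offsets
```

After applying the offsets, `corrected` follows the slow pan while the shake is largely filtered out; you'd do the same independently per axis (yaw/pitch/roll) before the final reorient.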