The online demo (https://sam2.metademolab.com/demo) works well, but my local demo, deployed from the open-source GitHub code, behaves much worse.
Steps to reproduce:
1) Add some prompt points on the first frame, e.g., include the palm and exclude the arm.
2) On a later frame where the mask is wrong, add some corrective points. The model respects the input points and updates that frame's result, which looks good.
3) Press the "Track objects" button to propagate through the video. In the open-source demo, propagation overwrites the result on the frame where I just added corrective points with a worse mask.
Does anyone know how to fix this? It makes the local demo almost unusable.
Left: local demo, Right: online demo

