
I have tracked my camera and recreated the scene mesh onto which I am projecting the frames of the image sequence. (Image: the projection, using the UVProject modifier, and the tracking.)

If I step back a few frames, the camera "sees" something different: a different area of the mesh is textured. How can I automatically merge these different projections into one, so that I get a continuously textured model? (Image: the projection on a different frame.)

I found this tutorial by CGMatter explaining the process, but for a scene where multiple textures are needed it is quite a labour-intensive job. Is there any way to automate or speed up that process? I am aware of this StackExchange question, but I do not really understand it, and it is quite old too. Thank you for your help, I appreciate it a lot. :)
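Conceptually, the merge being asked about amounts to baking each frame's camera projection into the same UV layout and then compositing the per-frame results. A minimal pure-Python sketch of that idea is below; it is illustrative only (the names `project_uv` and `merge_bakes` are not Blender API), and it assumes a simple pinhole camera looking down -Z:

```python
def project_uv(point, cam_pos, focal=1.0):
    """Project a 3D world point through a pinhole camera at cam_pos
    looking down -Z. Returns (u, v) if the point lies in the frustum,
    else None (that texel gets no colour from this frame)."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z >= 0:                      # behind the camera this frame
        return None
    u = 0.5 + focal * x / -z
    v = 0.5 + focal * y / -z
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        return (u, v)
    return None                     # outside the image

def merge_bakes(bakes):
    """Merge per-frame bakes, each a dict mapping texel -> colour tuple.
    Texels covered by several frames are averaged; texels covered by
    no frame stay empty. In Blender terms: bake every projection to
    the same UV layout, then composite the baked images."""
    merged = {}
    for bake in bakes:
        for texel, colour in bake.items():
            merged.setdefault(texel, []).append(colour)
    return {t: tuple(sum(c) / len(c) for c in zip(*cols))
            for t, cols in merged.items()}
```

In Blender itself, the equivalent would be baking each frame's projection (Emit bake of the projected texture) into one image per frame and blending them, which a short `bpy` script could loop over the frames to automate.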

Phönix 64
  • I would suggest using a shader for the projection instead of the modifier, which generates distortions. Merging several projections also sounds easier to do in a shader. BTW ... next time, plan a bit if possible and spend a minute capturing a spherical 360° environment texture ... it makes the scene easy to reconstruct. – vklidu Sep 07 '21 at 18:17
  • Yeah, I usually take an HDRI, but this was a quick shot in the train station. Thanks for the shader idea, though, I did not think of that. – Phönix 64 Sep 07 '21 at 18:28

0 Answers