Recently I have been working out how to do virtual production with Blender, and I already have live camera tracking set up. The virtual world is in one scene, and a plane is in a second scene. The cameras in both scenes have linked data, so they share the same location and rotation.

To get the parallax correct on my projector, I need to UV-project the active camera's view onto the plane in the second scene; from there I go into the side view and put that onto the screen. Everything works, except that I can't figure out how to get the live camera view as an image texture. The only method I found uses the Render Result as the texture, which means I can't move the camera during a shot.

Is it possible to use a live viewport camera feed as a texture like this? If so, how can I do it? (This isn't just for a render; it's for real-time virtual production.)
Viewed 489 times
- Possibly related: https://blender.stackexchange.com/questions/48132/how-can-i-view-the-camera-through-an-object-as-a-material-in-cycles https://blender.stackexchange.com/questions/112586/how-to-make-working-security-cameras https://blender.stackexchange.com/questions/111632/workflow-for-multi-subject-multi-angle-compositions/ – Duarte Farrajota Ramos Feb 02 '22 at 22:25
- I can't render an image sequence and use that for the texture because the camera location is constantly changing. It's a tracked handheld camera. – Logan S. Colon Feb 02 '22 at 22:40
- It's not possible to do that in Blender. – Gorgious Feb 03 '22 at 08:17