I have a mesh of the interior of a room, of known dimensions.
I also have a number of equirectangular 360° images of the interior of the room, taken from several positions.
How may I project the image data onto the mesh to use as a single texture? I’m keen to use the best-quality image data from each position, which implies using parts of each image for the final texture.
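To be concrete about what I mean by "projecting": as I understand it, for any point on the mesh you take the direction from a capture position to that point and convert it into pixel coordinates in the corresponding equirectangular image. A minimal Python sketch of that mapping is below (one common convention; the signs and 0.5 offsets would depend on how the panoramas were oriented when they were captured):

```python
import math

def equirect_uv(point, cam_pos):
    """Map a world-space point to (u, v) in an equirectangular image
    captured from cam_pos. Assumes Z-up; the sign conventions and
    offsets may need adjusting for a particular capture rig."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Longitude maps to u, latitude maps to v.
    u = 0.5 - math.atan2(dy, dx) / (2.0 * math.pi)
    v = 0.5 + math.asin(dz / r) / math.pi
    return u, v
```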
I suspect the answer may involve multiple UV maps, but I'm not sure how to handle that, or how to indicate the various points to project from. I'm wondering if it is possible to set up several camera positions, Project from View onto a different UV map for each one, then somehow bake the textures, export them all separately, and combine them in external software (see the rough sketch below). Only I don't know how that could be done! Or can it be handled with a complex node setup for the Image Textures?
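In case it helps to show what I've been experimenting with, here is a rough bpy sketch of the multiple-UV-map idea. The object and camera names are only placeholders, the Project from View step I've so far only done by hand in the viewport, and the final bake would still need the usual Cycles bake setup with an active image texture node:

```python
import bpy

obj = bpy.data.objects["RoomMesh"]    # placeholder name for the room mesh
mesh = obj.data

# One UV map per capture position, plus one for the final combined texture.
capture_names = ("Cam1", "Cam2", "Cam3")   # placeholder capture positions
for name in capture_names:
    layer = "UV_" + name
    if layer not in mesh.uv_layers:
        mesh.uv_layers.new(name=layer)
if "UV_Final" not in mesh.uv_layers:
    mesh.uv_layers.new(name="UV_Final")

# An image to bake the combined result into.
bake_image = bpy.data.images.new("RoomBake", width=4096, height=4096)

# For each capture position I would then:
#   1. make the matching UV map active (mesh.uv_layers.active_index = ...),
#   2. align the view to a camera placed at that capture position,
#   3. run Project from View (bpy.ops.uv.project_from_view) in Edit Mode,
#   4. feed the corresponding 360 image through that UV map in the material.
```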
I’m working in Cycles.
All help is appreciated.
Thanks!