I have a mesh of the interior of a room, of known dimensions.

I also have a number of equirectangular 360° images of the interior of the room, taken from several positions.

How may I project the image data onto the mesh to use as a single texture? I’m keen to use the best-quality image data from each position, which implies using parts of each image for the final texture.

I suspect the answer may involve multiple UV meshes, but I’m not sure how to handle that or how to indicate the various points to project from. I’m wondering if it is possible to set up several camera positions, Project from View onto different UV maps, then somehow bake the textures, export them all separately, and combine them in external software. Only I don’t know how that could be done! Or can it be handled with a complex Node set up for the Image Textures?
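The core of any of these approaches is mapping a point on the mesh to a pixel in one of the panoramas. For an equirectangular image that mapping is just the direction from the camera position to the point, converted to longitude/latitude. A minimal sketch (the axis convention — Z up, longitude measured from +X — is an assumption; adjust it to match how your panoramas were captured):

```python
import math

def equirect_uv(cam_pos, point):
    """Map a world-space point to (u, v) in [0, 1] of an equirectangular
    image captured from cam_pos. Convention assumed: Z is up, longitude
    is measured from the +X axis; flip/offset as your images require."""
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    r = math.sqrt(dx * dx + dy * dy + dz * dz)
    lon = math.atan2(dy, dx)             # -pi .. pi around the horizon
    lat = math.asin(dz / r)              # -pi/2 .. pi/2, up/down
    u = (lon + math.pi) / (2 * math.pi)  # 0 .. 1 across the image width
    v = lat / math.pi + 0.5              # 0 .. 1 bottom to top
    return u, v
```

The same formula can be built from Vector Math and Arctan2 nodes in Cycles, driven by the camera position as a vector input, which is what a node-based projection setup would boil down to.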

I’m working in Cycles.

All help is appreciated.

Thanks!

  • Something like https://blender.stackexchange.com/a/71334/29586 might help - but you’d need to enhance it for the equirectangular projection (currently only a flat projection). For multiple images you could use Maximum to combine multiple projections together to effectively blend them together or combine them in another way. Then bake the result if necessary (although it should be quite efficient without baking). – Rich Sedman Feb 16 '18 at 20:05
  • I think the difficulty is merging the several projections. I’m not familiar with Maximum - will investigate it. Thanks! – solidbronze Feb 16 '18 at 21:50
  • Possibly 'blend' them based on distance might work - this way the 'closer' projection (which is likely more accurate for those 'points') would have more of an influence. I agree that the merging is going to be tricky. It would be useful if you could provide some sample files if possible, along with any other necessary detail (eg, the relative points where each camera was situated) - to allow people to play around with the setup to try and get convincing results. – Rich Sedman Feb 16 '18 at 21:55
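The distance-based blending suggested in the last comment can be sketched as inverse-distance weighting: each camera's contribution at a surface point falls off with its distance to that point, and the weights are normalised to sum to 1. This is an illustrative helper, not Blender API code; `power` (an assumed parameter) controls how sharply the nearest camera dominates:

```python
import math

def blend_weights(point, cam_positions, power=2.0):
    """Return one normalised weight per camera for a surface point.
    Closer cameras get larger weights; higher power sharpens the
    preference for the nearest camera."""
    dists = [math.dist(point, c) for c in cam_positions]
    inv = [1.0 / (d ** power + 1e-9) for d in dists]  # epsilon avoids div-by-zero
    total = sum(inv)
    return [w / total for w in inv]
```

In a node setup the equivalent would be per-camera distance computations fed through a Math node chain, with the results used as factors in Mix Shader or MixRGB nodes before baking the combined result to a single texture.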
