
I have a simple scene with a reflective sphere which reflects the HDR map of its surroundings. See the example image below.

[example image: reflective sphere reflecting the HDR environment]

My intention

As the sphere moves around the scene, the image reflected onto the sphere changes as expected. I want to take the exact (or as near as possible) image currently reflected by the sphere and apply it to another object, preferably by saving it first. In essence, the image would then appear to be 'painted' on top of the other object.

See this short video example. In the first 5 seconds of the video the object has a typical reflection and as the object moves, the reflection changes depending on the scene. In the last 5 seconds of the video, the reflection no longer updates/changes and instead simply moves with the object making the object appear to be matte instead of reflective.

Things I tried

I have tried a few promising methods suggested to me in a previous question, but regretfully none of them led to the desired result.

I tried to bake the current texture, which returns something that looks quite similar to, but not identical to, the original reflection. See the example image below. The right sphere is the reflective sphere (the same as the one shown in the image above), while the left sphere has the baked texture of the right sphere applied to it as an image texture.

[comparison image: left sphere with the baked texture applied vs. right sphere with the real reflection]

As can be seen, the right sphere has the (more or less) physically correct distortion associated with reflections on spherical objects (more distortion towards the edges, etc.), while the left object does not.
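(For reference, a scripted equivalent of this bake step would look roughly like the sketch below; the object name, image size, and output path are placeholders rather than the actual scene values.)

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'  # baking requires Cycles

    # "Sphere" is a placeholder name; the object also needs a UV map.
    obj = bpy.data.objects["Sphere"]
    bpy.context.view_layer.objects.active = obj
    obj.select_set(True)

    # Cycles bakes into the active Image Texture node of the object's material.
    img = bpy.data.images.new("ReflectionBake", width=2048, height=2048)
    mat = obj.active_material
    mat.use_nodes = True
    bake_node = mat.node_tree.nodes.new('ShaderNodeTexImage')
    bake_node.image = img
    mat.node_tree.nodes.active = bake_node

    # 'COMBINED' captures the full shading, including the glossy reflection.
    bpy.ops.object.bake(type='COMBINED')

    # Save the baked image next to the .blend file (placeholder path).
    img.filepath_raw = "//reflection_bake.png"
    img.file_format = 'PNG'
    img.save()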

As suggested by user Emir, I tried a workaround of rendering the scene using a panoramic camera placed at the origin of the right sphere and applying that render as an image texture on the left sphere, but regretfully this resulted in problems similar to the other attempt mentioned above.

Edit: As per Emir's answer, my previous approaches were missing an emission node and were using an image texture instead of an environment texture.

1 Answer


I'm not sure how you rendered your panoramic image, nor how you applied it to the mesh.

In order to get the result you want, you first need to render your scene using a panoramic camera with the equirectangular type. This option is only available when you use the Cycles render engine.

[screenshot: camera lens settings with Panoramic type and Equirectangular projection in Cycles]
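(If you prefer to set this up from Python, a minimal sketch is below; it assumes the pre-4.0 Cycles camera API, and "Sphere" is a placeholder name for the reflective object.)

    import bpy

    scene = bpy.context.scene
    scene.render.engine = 'CYCLES'

    # Panoramic, equirectangular camera placed at the sphere's location.
    cam_data = bpy.data.cameras.new("PanoCam")
    cam_data.type = 'PANO'
    cam_data.cycles.panorama_type = 'EQUIRECTANGULAR'  # Cycles property (pre-4.0 API)

    cam_obj = bpy.data.objects.new("PanoCam", cam_data)
    scene.collection.objects.link(cam_obj)
    cam_obj.location = bpy.data.objects["Sphere"].location  # placeholder object name
    scene.camera = cam_obj

    # Render the environment as an equirectangular image (2:1 aspect).
    scene.render.resolution_x = 2048
    scene.render.resolution_y = 1024
    scene.render.filepath = "//pano_render.png"
    bpy.ops.render.render(write_still=True)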

Once you have rendered your scene, you need to apply that render to your mesh as follows:

Texture Coordinate node > Environment Texture node > Emission or Glossy node.
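(As a scripted sketch of that node chain: the image path below is a placeholder for the panoramic render, and the Reflection output of the Texture Coordinate node is an assumption; use whichever output the node setup shown here actually wires in.)

    import bpy

    # Build the material: Texture Coordinate > Environment Texture > Emission.
    mat = bpy.data.materials.new("FakeReflection")
    mat.use_nodes = True
    nodes = mat.node_tree.nodes
    links = mat.node_tree.links
    nodes.clear()

    tex_coord = nodes.new('ShaderNodeTexCoord')
    env_tex = nodes.new('ShaderNodeTexEnvironment')
    env_tex.image = bpy.data.images.load("//pano_render.png")  # the panoramic render
    emission = nodes.new('ShaderNodeEmission')
    output = nodes.new('ShaderNodeOutputMaterial')

    # 'Reflection' is an assumed socket choice; an Emission shader keeps the
    # result unaffected by scene lighting, a Glossy shader keeps it shiny.
    links.new(tex_coord.outputs['Reflection'], env_tex.inputs['Vector'])
    links.new(env_tex.outputs['Color'], emission.inputs['Color'])
    links.new(emission.outputs['Emission'], output.inputs['Surface'])

Assign this material to the target mesh and the panoramic render is mapped onto it as described.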

No matter how you move the object, it should give the effect you are after. Here is a comparison between a real reflection and a "fake" reflection made with the material nodes.

[comparison image: real reflection vs. "fake" reflection made with the material nodes]

Emir
  • +1. Any idea why an ordinary bake doesn't produce the same perspective? – Robin Betts Nov 03 '21 at 12:32
  • 2
    @RobinBetts To be honest, i haven't try to use bake, but if as described from the question "The result is close to the original". Then i could guess is because of the enviroment node, the OP is using an image texture node instead. But i'm not sure – Emir Nov 03 '21 at 12:39
  • Sure. I just can't work out why the image texture shouldn't capture the reflections on a mapped sphere in exactly the same way as the 'reflection' coordinate system captures the environment. Maybe a bit more pencil-and-paper required :) – Robin Betts Nov 03 '21 at 12:44
  • 1
    Probably UV distortion, the UV's not taking the whole canvas and the fact that the baked textures are inside a perfect square canvas, while the panoramic is more like a rectangle. But also just guessing. – Emir Nov 03 '21 at 12:56
  • I cannot check it today, but this appears to be exactly what I need. The only difference (that I can see) is the emission node. Just to check: if the reflective sphere is at coords x, y, z, should the panoramic camera also be at coords x, y, z? Might the lens settings have an effect? – Mitchell van Zuylen Nov 03 '21 at 12:58
  • I'm not sure what you mean by "lens settings", because the panoramic and equirectangular options are part of the lens settings, and I also didn't get the x, y, z part; 3D is all based on X, Y and Z coordinates. My only recommendation is to make sure to point the camera in a straight line, or at least at 90° rotations, to avoid distortions. – Emir Nov 03 '21 at 13:11
  • 2
    @RobinBetts about the why, from the cam perspective you can see reflections that are "far behind" the visible part of the sphere (so reflected things are smaller). Or another way to say that: you see only half of the sphere, but nearly 75% of the environment is reflected in it. At the opposite, when baked, a cast from each face normal only is done (so this is half sphere for half sphere). – lemon Nov 03 '21 at 13:39
  • 1
    @lemon Ahh! Clicked! (Pencil and paper checks out). It's the difference between bisection of the incident and viewing angle, as opposed to mere normal-view. – Robin Betts Nov 03 '21 at 13:59
  • This indeed perfectly captures the reflection of the panoramic image on the sphere. When moving it around, the reflection also stays in place. However, when moving the camera around or rotating the texture coordinates of the environment texture with a vector rotate, the reflection will update so that the point closest to the viewpoint is least distorted. This deviates from the intended behavior of making the object appear matte, as in the example video. – Mitchell van Zuylen Nov 04 '21 at 01:22