I am making a procedural bump shader that needs to interact with other directional data in real time within the material (as opposed to external lighting). Therefore I need to derive the normals from this bump/height data, and cannot use the baking method to do it.
I adopted the node setup from this answer, and I believe I'm doing it basically correctly; however, a couple of things seem "off", and I'm wondering if the result can be improved.
Above: The center image is a plane that uses a spherical bump map as its source (left), to derive the normal vectors. When compared with spherical geometry that has a normal-colored shadeless material (right), we can see the discrepancy in accuracy.
In comparison to the reference on the right, the center plane that derives its normals from a bump map appears less spherical. It is more flattened and the shading produces crescent shapes on all four sides, so the gradient is not completely accurate. This is what I am hoping to improve.
In the material's node tree there is a Mapping node with values X: 1.4 and Y: -1.4. These values are totally arbitrary; I arrived at them by moving the sliders until the result looked best to me. If there are "correct", more precise values, I don't know what their basis would be.
In my result there is a moiré artifact that can be seen when zoomed in closely. It becomes less noticeable as the source bump texture's render resolution is increased. I'm not concerned with it because I will ultimately be using procedural textures that don't rely on pixel resolution; I'm pointing it out only to make clear that it is not what this question is about.
It's worth noting that the bump map was derived from the same hemisphere as the reference. It has a value of pure black at its base, and pure white at its summit. In other words, a range of 0.0 to 1.0.
Finally, I'm not ruling out the possibility that obtaining accurate normals from bump data alone might have its limitations. Maybe this node setup is already the best that can be done. But hopefully there's a way to improve the accuracy. Seeing the result of the bake technique is encouraging. If that result is accurate, the math to achieve that should, in theory, be possible using material nodes too.
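For reference, the math the node setup is trying to approximate is well established: for a height field h(x, y), the surface normal is the normalized vector (-∂h/∂x, -∂h/∂y, 1). Below is a minimal NumPy sketch (not Blender nodes) that applies this to a hemisphere height field like the one described above; the grid resolution and the comparison against the analytic sphere normal are my own illustrative assumptions, not part of the original setup.

```python
# Sketch: derive normals from a height field via N = normalize(-dh/dx, -dh/dy, 1).
# The hemisphere test field and resolution are illustrative assumptions.
import numpy as np

n = 256                         # texture resolution (assumption)
xs = np.linspace(-1, 1, n)
x, y = np.meshgrid(xs, xs)      # x varies along columns, y along rows
r2 = np.clip(1.0 - x**2 - y**2, 0.0, None)
h = np.sqrt(r2)                 # hemisphere height, range 0.0 (base) to 1.0 (summit)

# Finite-difference gradients. The physical step size matters here - if the
# derivative step doesn't match the texture's spatial scale, you end up
# compensating with arbitrary-looking Mapping-node multipliers.
dy, dx = np.gradient(h, xs, xs)

nrm = np.dstack((-dx, -dy, np.ones_like(h)))
nrm /= np.linalg.norm(nrm, axis=2, keepdims=True)

# For a true unit hemisphere the analytic normal is simply (x, y, h),
# so the finite-difference result can be checked against it away from the rim.
```

Away from the rim (where the gradient of a hemisphere diverges), the finite-difference normals match the analytic sphere normals closely, which supports the idea that accurate normals from height data alone are achievable in principle.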
Edit:
Below is an example of how 3D angles can be isolated in a 2D image when there is normal data available.
↑ The above GIF is not a 3D mesh, but an image plane using a normal map of Suzanne. ↓
While in this example a normal map image texture already exists, my question is mainly about procedural textures that have only height data and did not originate as 3D geometry.
I know this is possible (refer to the first two screenshots), however I was wondering if there is a way to improve the accuracy.
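The angle-isolation effect in the GIF above boils down to a per-pixel dot product: pixels are selected when their normal points within some angle of a chosen direction. Here is a hedged sketch of that idea in NumPy; the function name, direction, and threshold are illustrative assumptions, not anything from the original setup.

```python
# Sketch: isolate pixels by normal direction using a thresholded dot product.
# Function name, direction, and threshold are illustrative assumptions.
import numpy as np

def facing_mask(normals, direction, threshold=0.9):
    """Return a boolean mask of pixels whose (unit) normal lies within
    arccos(threshold) of `direction`."""
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    dots = np.einsum('ijk,k->ij', normals, d)   # per-pixel dot product
    return dots >= threshold

# Example: a flat normal field facing +Z is fully selected by a +Z direction
# and not at all by a +X direction.
flat = np.zeros((4, 4, 3))
flat[..., 2] = 1.0
mask_z = facing_mask(flat, (0.0, 0.0, 1.0))
mask_x = facing_mask(flat, (1.0, 0.0, 0.0))
```

In the material this corresponds to a Dot Product vector-math node followed by a threshold, which is why the accuracy of the derived normals matters so much for the final effect.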




Multiply Add). Furthermore, I noticed that after adding a Vector Math node to Multiply it by itself, the luminosity darkened to almost the exact same color as when Color Management's View transform is set to Raw. The essential nodes: https://i.stack.imgur.com/4Yi3m.png Thank you for your help. Please post as an answer if you feel inclined to. – Mentalist Apr 03 '23 at 10:56