
I'm trying to convert a material's normals to an image texture, a simple thing! It's not even a case of taking a more complex model and baking it down to a low-poly one... But for some reason, every time I bake, the resulting normals seem to darken the model's surface!

This is the model I made, just trying to make simple concrete:

[screenshot: the concrete material on the model]

But when I bake the normal map with these settings:

[screenshot: the bake settings]
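Roughly, those settings amount to this, expressed through Blender's Python API (just a sketch for clarity; the margin value and render engine line are assumptions, not visible in the screenshot):

```python
import bpy

# Rough equivalent of the bake setup above, via Blender's Python API.
scene = bpy.context.scene
scene.render.engine = 'CYCLES'               # baking requires Cycles
scene.cycles.bake_type = 'NORMAL'            # bake type: Normal
scene.render.bake.normal_space = 'OBJECT'    # the space I had selected
scene.render.bake.margin = 16                # assumed margin in pixels

# With the object selected and the target image node active in its material:
bpy.ops.object.bake(type='NORMAL')
```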

This is what comes out in the baked texture for the object:

[screenshot: the baked texture applied to the object]

It looks right when I get close with the camera, but from farther away it looks like the normals are obscuring the model! I'm baking in object space; I've also tried tangent space and the same thing happens, and I've tried both CPU and GPU computation with no difference... What's happening?

Update

There's no secret to what I'm doing; this was just a test to see if I could quickly generate a normal map in Blender for a new texture. So I created a new texture myself and painted it with a diffuse image of a concrete material that I found on the internet...

[screenshot: the painted concrete diffuse texture]

Then I simply used the Bump node together with a color ramp to simulate a normal map for the model, like this:

[screenshot: node setup with the Bump node and color ramp]
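In script form, the node setup is roughly this (a sketch; the material and image names are placeholders, not the actual ones from my file):

```python
import bpy

# Rough reconstruction of the node setup (material/image names are
# placeholders): concrete diffuse -> color ramp -> Bump -> BSDF Normal.
mat = bpy.data.materials["Concrete"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images["concrete_diffuse.png"]   # the painted diffuse

ramp = nodes.new('ShaderNodeValToRGB')   # the color ramp
bump = nodes.new('ShaderNodeBump')
bsdf = nodes["Principled BSDF"]

links.new(tex.outputs['Color'], ramp.inputs['Fac'])
links.new(ramp.outputs['Color'], bump.inputs['Height'])
links.new(bump.outputs['Normal'], bsdf.inputs['Normal'])
```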

Then I baked that normal output to an image texture, connected the baked normal image to a Normal Map node, and then... why did it get dark?

[screenshot: the darkened result with the baked normal map connected]
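Again as a rough sketch (placeholder names), this is how the baked image was hooked back up:

```python
import bpy

# Baked normal image -> Normal Map node -> BSDF Normal.
mat = bpy.data.materials["Concrete"]
nodes = mat.node_tree.nodes
links = mat.node_tree.links

baked = nodes.new('ShaderNodeTexImage')
baked.image = bpy.data.images["baked_normal.png"]   # the baked result

nmap = nodes.new('ShaderNodeNormalMap')
bsdf = nodes["Principled BSDF"]

links.new(baked.outputs['Color'], nmap.inputs['Color'])
links.new(nmap.outputs['Normal'], bsdf.inputs['Normal'])
```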

Update 2

When I bake using tangent space instead, things get completely bizarre!

[screenshot: the tangent-space bake result]

1 Answer


Normal map image texture nodes should be set to either the "Non-Color" or "Linear" color space. An sRGB color space will never work for a correctly baked normal map.

Also, the Normal Map node should be set to use "Tangent Space" for the type of Normal Map that you're baking here.

So, set your normal map image texture's color space to "Non-Color" and your Normal Map node to "Tangent Space", and you should be good to go.
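If you prefer to do it from the Python console, those two changes amount to roughly this (a sketch with assumed material/image names; adjust them to your own file):

```python
import bpy

baked = bpy.data.images["baked_normal.png"]
baked.colorspace_settings.name = 'Non-Color'   # 1) don't interpret as sRGB

mat = bpy.data.materials["Concrete"]
nmap = next(n for n in mat.node_tree.nodes if n.type == 'NORMAL_MAP')
nmap.space = 'TANGENT'                         # 2) match the bake space

# And bake in tangent space as well:
bpy.context.scene.render.bake.normal_space = 'TANGENT'
```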

gcs_dev