8

Is there any way to bake the Object vector that you get from the Cycles texture coordinate node to UVs so that it can be used for image textures? This would be for Dynamic Paint, or just a quicker or more convenient way to get a clean UV unwrap.

Object texture coordinates do the job well for most of my procedural texturing needs. There are times when I want to use image textures, but getting clean UV unwraps is always a nightmare.

Ascalon
    You can certainly bake any color information you can generate in the node editor by using an emission shader. What I don't get is how you intend to use it (and why you need to bake it). Could you be more precise or set up a quick example? – Carlo Feb 28 '16 at 17:23
    Dynamic paint bakes to either vertex colors or an image texture using UVs. It needs clean UVs. The quick UV options that are available either cause bad seams that bleed on the texture, or are very distorted on complex objects. I use Object and/or Generated for procedural textures because they are cleaner. It would save a lot of time setting up Dynamic paint if I could use it there as well. – Ascalon Feb 28 '16 at 21:28
  • Could you add more detail to your question? Pictures, for instance. Thanks – lemon Aug 16 '16 at 17:29
  • This question already has two answers here: http://blender.stackexchange.com/questions/58932/apply-object-texture-coordinates/58945#58945 – aliasguru Aug 17 '16 at 06:52
  • @aliasguru Thanks. It looks like that does answer it, but ends with the saved coordinates being an image texture, which I assume will eat RAM on render? Is there any way to get this to be a UV map? If you want to make a proper answer to this, I can award you the bounty. – Ascalon Aug 17 '16 at 07:08
  • Ahhh now I see where you want to take it. I was about to test the memory consumption during rendering using the posted method, but actually you want an unwrap that replicates what Object Coordinates would give you. Need to think about this in the evening... – aliasguru Aug 17 '16 at 12:41
  • After a few tries I have to admit I don't find a solution. The already posted answer (see link above) works, but it does indeed add a memory overhead when rendering. Testing with a 1K baked UV texture, there is additional 13MB showing up in the Viewport render diagnostics. Which isn't too bad, given the fact it is a 32bit tex. I was thinking about then baking those colors into the vertices (which can only be done with Blender Internal), so I can read out the color values, and then place the UVs via Python according to this. But by then, you'll have a clean unwrap anyways, it's kinda pointless. – aliasguru Aug 18 '16 at 19:22
  • @aliasguru Thank you for trying! I don't know the proper protocol, but I will need to award this bounty somehow. If you want to make a proper answer saying that, I can give it to you. – Ascalon Aug 18 '16 at 19:47
  • I've posted an answer, but please wait the four days; it's possible that someone finally comes up with a working solution. There are some very talented people around here on this site. AFAIK you can also choose not to award the bounty; then only half of the points are given to an answer, and only if it was upvoted at least twice. See 'how is a bounty awarded' here: http://blender.stackexchange.com/help/bounty – aliasguru Aug 18 '16 at 19:54
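The emission-bake workaround discussed in these comments boils down to encoding each shading point's object-space position as an RGB color and baking that into an image. As a minimal illustration of the encoding outside Blender (plain Python; the function name and bounding-box convention are hypothetical, not part of any Blender API):

```python
def coords_to_color(p, box_min, box_max):
    """Encode an object-space position as an RGB triple in [0, 1].

    This mirrors what plugging Object coordinates into an Emission
    shader and baking the Emit pass effectively stores per texel.
    """
    return tuple(
        (p[i] - box_min[i]) / (box_max[i] - box_min[i])
        for i in range(3)
    )

# A point at the center of a 2x2x2 box centered on the origin
# encodes as mid-gray:
color = coords_to_color((0.0, 0.0, 0.0), (-1.0, -1.0, -1.0), (1.0, 1.0, 1.0))
# -> (0.5, 0.5, 0.5)
```

An 8-bit texture would visibly quantize these values, which is why the linked answer bakes to a 32-bit float texture.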

2 Answers

4

The bottom line is:

This is theoretically impossible

If you think about it, Object, Generated, Reflection (and any other automatically generated texture coordinates) can be as detailed as necessary; they do not depend on the detail of the mesh. UV maps, on the other hand, are tied directly to the detail and design of the mesh object. This is often called comparing apples and oranges :)

That being said, the only way to fake it would be to use a texture (to get much more detail than a mesh, though less than auto-generated coordinates), as mentioned here. If it were easy to convert clean Object or Generated coordinates into UV maps, game developers would be using it every day, but it isn't. So for now we just need to stick with smart unwrapping and our UV seams ; )

Finally, if you're still not convinced, take a look at this mess of auto-generated coordinates...

[Image: a cube with a checker texture heavily warped by mixed texture coordinates]

Here I am mixing three different texture coordinates (Generated, Window, and Reflection) and feeding the result into the checker texture's Vector input. The material is applied to a cube, and no matter how you manipulate the UVs, you will never be able to mimic this mapping, because these texture coordinates act like textures themselves: the massive warping of the checker pattern comes entirely from the strange texture coordinates. And speaking of texture coordinates acting like textures themselves: have you ever tried to use a texture as the model itself (I mean a diffuse/albedo/color texture, not a displacement map)?

Yet another reason this would not be feasible, or even desirable, is that all the texture coordinates except UVs are volumetric (three-dimensional), while UVs are two-dimensional. As soon as you "converted" to UVs, you would lose the most important benefit that the Object texture coordinate gives you: flexibility.
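To see why the dimensionality argument bites, note that any map from 3D coordinates down to 2D UVs must collapse some direction, so distinct shading points become indistinguishable. A toy sketch in plain Python (the planar projection here is a hypothetical, deliberately simple "unwrap", not anything Blender does):

```python
def planar_project(p):
    """Drop the Z axis: the simplest possible 3D -> 2D 'unwrap'."""
    x, y, _z = p
    return (x, y)

a = (0.25, 0.75, 0.0)
b = (0.25, 0.75, 1.0)  # same X/Y, different Z
# Both points land on the same UV coordinate, so a texture sampled
# through these UVs cannot vary along Z the way a 3D procedural can.
```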

I hope this helps clear things up.

JakeD
1

After a few tries I have to admit I couldn't find a solution. The already posted answer here, Apply Object Texture Coordinates, works, but it does indeed add memory overhead when rendering. Testing with a 1K baked UV texture, an additional 13 MB shows up in the viewport render diagnostics. That isn't too bad, given that it is a 32-bit texture.

I then thought about baking those colors into the vertices (which can only be done with Blender Internal), so I could read out the color values and then place the UVs via Python accordingly. But by that point you'd have a clean unwrap anyway, so it's kind of pointless. Maybe someone else has a better idea?
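For what it's worth, the vertex-color idea above would amount to something like the following, sketched in plain Python rather than bpy (the function names and the planar UV-placement rule are hypothetical stand-ins; inside Blender you would iterate over the mesh's loops and write into a UV layer instead):

```python
def color_to_coords(rgb, box_min, box_max):
    """Decode a baked vertex color back into an object-space position,
    assuming the color was normalized from the given bounding box."""
    return tuple(
        rgb[i] * (box_max[i] - box_min[i]) + box_min[i]
        for i in range(3)
    )

def uv_from_color(rgb, box_min, box_max):
    """Place a UV from the decoded position; here just a planar XY
    projection, the simplest possible placement rule."""
    x, y, _z = color_to_coords(rgb, box_min, box_max)
    return (x, y)
```

As the answer notes, by the time you have per-vertex data clean enough to drive this, you effectively already have what a clean unwrap would give you.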

aliasguru