I wrote a Python script to generate several image textures for use in Image Texture nodes. All the calculations yield values with many decimal places (some being on the order of 1e-7).
Printing the results of the calculations themselves to the console shows the values in full (or at least sufficient) precision. Printing the RGBA values of the generated image pixels used by the Image Texture node, however, shows only one decimal place, with the minuscule values reduced to 0.0.
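To make this concrete, here is a minimal sketch of the kind of thing my script does (the real script is longer; the image name, size, and value below are just illustrative):

```python
import bpy

# Create a small generated image to use in an Image Texture node
width, height = 4, 4
img = bpy.data.images.new("generated_tex", width=width, height=height, alpha=True)

# One of the calculated values, on the order of 1e-7
value = 2.5e-7
print("calculated value:", value)  # prints the value in full precision

# Write the tiny value into every pixel's RGB, alpha = 1.0
flat = [value, value, value, 1.0] * (width * height)
img.pixels[:] = flat

# Reading the pixels back shows 0.0 instead of the tiny value
print("stored RGBA of first pixel:", list(img.pixels[0:4]))
```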
For this particular Image Texture node, the loss of precision is not just a matter of how the values are displayed in the console; the stored RGBA values really do seem to be 0.0. I say this because scaling the node's output by 1e7 has no effect on the color strength it is meant to control, yet adding 1e-7 to the output and then scaling is enough to get a nonzero color strength.
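For reference, this is roughly the node test expressed as Python (I actually wired it up by hand in the shader editor, so the material and node names here are just placeholders):

```python
import bpy

mat = bpy.data.materials["TestMat"]           # placeholder material with use_nodes enabled
nodes, links = mat.node_tree.nodes, mat.node_tree.links

tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images["generated_tex"]  # the image written by the script above

scale = nodes.new("ShaderNodeMath")           # multiply the texture value by 1e7
scale.operation = 'MULTIPLY'
scale.inputs[1].default_value = 1e7

emit = nodes.new("ShaderNodeEmission")        # strength is the "color strength" being driven
out = nodes["Material Output"]

links.new(tex.outputs["Color"], scale.inputs[0])
links.new(scale.outputs["Value"], emit.inputs["Strength"])
links.new(emit.outputs["Emission"], out.inputs["Surface"])

# With only the multiply, the strength stays at 0. Inserting an ADD Math node
# (value 1e-7) before the multiply gives a nonzero strength, which is why I
# think the stored pixel values really are 0.0.
```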
My guess is that something happens between the calculations themselves and the storing of the results in the generated image's pixels. That seems to be the case: printing the RGBA values of the pixel entries in the generated image shows 0.0 instead of the small numbers I saw when printing the calculation results.
I'll post more detail if needed, but does anyone know why this might be happening? For what it's worth, this rounding doesn't seem to happen for a different image texture: its values also have many decimal places, but they are all sizable fractions between 0 and 1 rather than being on the order of 1e-7.
Thanks in advance!