I am rendering depth maps of many meshes from different angles by placing my cameras on a sphere of fixed radius. The meshes are all normalized (each fits inside a sphere of radius 0.5) and centered at (0, 0, 0). After rendering, I load the depth maps in Python with the imageio package or scipy.ndimage.imread, and I noticed that each individual depth map always has a maximum value of 255.

What I actually expected were depth maps whose values are interpretable and comparable: a distance of 40 in a depth map rendered from mesh X should mean the same as a distance of 40 in a depth map rendered from mesh Y. That is not the case at the moment, not even among renderings of the same mesh. To make it concrete: for a very small mesh whose furthest vertex lies very close to (0, 0, 0), the rendered depth maps still contain values of 255, and so do the depth maps of another mesh that is much larger than the first. The same issue persists if I normalize the output of the Z pass: every rendering then has a maximum value of 1.

I suspect the issue is related to the PNG saving pipeline doing something odd to the render results, as discussed here. Am I doing something wrong? Any thoughts/solutions would be greatly appreciated.
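For context, the camera placement amounts to sampling points on a sphere of fixed radius, all looking at the origin. This is only a minimal sketch of that setup (the radius and the angle grid are placeholder values, not my exact script):

```python
import numpy as np

def camera_positions(radius, n_azimuth, n_elevation):
    """Sample camera locations on a sphere of the given radius,
    surrounding the normalized mesh centered at the origin."""
    azimuths = np.linspace(0, 2 * np.pi, n_azimuth, endpoint=False)
    elevations = np.linspace(-np.pi / 3, np.pi / 3, n_elevation)
    positions = []
    for el in elevations:
        for az in azimuths:
            positions.append((
                radius * np.cos(el) * np.cos(az),
                radius * np.cos(el) * np.sin(az),
                radius * np.sin(el),
            ))
    return np.array(positions)

# Placeholder values: 8 azimuths x 3 elevations on a sphere of radius 2.
cams = camera_positions(radius=2.0, n_azimuth=8, n_elevation=3)
print(np.allclose(np.linalg.norm(cams, axis=1), 2.0))  # True: all on the sphere
```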
FYI, here's the code I use for rendering:
import bpy

scene = bpy.context.scene
scene.render.image_settings.color_depth = '16'
scene.display_settings.display_device = 'sRGB'
scene.view_settings.view_transform = 'Raw'
scene.sequencer_colorspace_settings.name = 'Raw'

# Rebuild the compositor node tree from scratch.
scene.use_nodes = True
for node in scene.node_tree.nodes:
    scene.node_tree.nodes.remove(node)

# Pipe the render layer's Z pass into a 16-bit PNG file output.
renderNode = scene.node_tree.nodes.new('CompositorNodeRLayers')
depthOutputNode = scene.node_tree.nodes.new('CompositorNodeOutputFile')
depthOutputNode.format.file_format = 'PNG'
depthOutputNode.format.color_depth = '16'
depthOutputNode.format.color_mode = 'RGB'
depthOutputNode.base_path = 'somePath/'
depthOutputNode.file_slots[0].path = 'fileNameDepth#'
scene.node_tree.links.new(renderNode.outputs[2], depthOutputNode.inputs[0])

bpy.ops.render.render(write_still=True)
Update: I just ran file depthImg.png in Ubuntu and confirmed that the generated depth maps are indeed 16-bit. It seems likely that the image packages I tried in Python do not support reading PNG files with more than 8 bits per channel. I tried a couple of methods and none of them works. Any solutions for this?

Does bpy.data.images[0].pixels work? I think Float Buffer has to be enabled to get non-normalized values. – Omar Emara Mar 31 '18 at 20:02