I am following this blog post to test out rendering, ideally without writing to a file: https://ammous88.wordpress.com/2015/01/16/blender-access-render-results-pixels-directly-from-python-2/
I've tried this procedure on a custom scene, but noticed the lighting was significantly different. Trying it on a public demo scene, the Race Spaceship, I also observe a difference between the image saved to file by render() (img_file) and the image from the Viewer node data (img_view), computed as np.abs(img_file - img_view):
Any chance someone knows the root cause of this?
Here's the script I'm using to create this image (Blender 2.80, Ubuntu 18.04): https://github.com/EricCousineau-TRI/blender_server/blob/625f30f191d315fb741b88c817094ecd104eeaa6/view_rendering_test.py
Possibly related: could it be that the Viewer node data is using sRGB, whereas I want plain RGB?
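If the mismatch really is a color-space issue, one rough check (my assumption, not something the blog post states) is to apply the standard sRGB transfer function to one buffer and see whether the difference shrinks. A minimal numpy sketch of the encode/decode pair:

```python
import numpy as np

def linear_to_srgb(c):
    """Standard sRGB OETF: linear values -> display-encoded values."""
    c = np.asarray(c, dtype=np.float64)
    return np.where(c <= 0.0031308,
                    12.92 * c,
                    1.055 * np.power(np.clip(c, 0.0031308, None), 1 / 2.4) - 0.055)

def srgb_to_linear(s):
    """Inverse transform: sRGB-encoded values -> linear."""
    s = np.asarray(s, dtype=np.float64)
    return np.where(s <= 0.04045,
                    s / 12.92,
                    np.power((s + 0.055) / 1.055, 2.4))

# A linear mid-gray encodes to a noticeably brighter value (~0.735),
# which is the right magnitude to explain a large img_file - img_view gap.
print(linear_to_srgb(0.5))
print(srgb_to_linear(linear_to_srgb(0.25)))  # round-trips back to ~0.25
```

If applying linear_to_srgb to img_view makes it match img_file, that would point at the Viewer node buffer being linear while the saved file is sRGB-encoded.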
Possibly related: Blender Cycles render -> different result between screen image and saved JPG file (I'll see if I can tinker with settings to affect this).

In the Scene > Post Processing section, the Dither value is set to 0 and Compositing is checked. However, if I manually render via the GUI (after running the script) and save both the Render Result and Viewer Result images, I do get the same output, so it seems like it may be a difference in file I/O, though it's still unclear what exactly. If I then open the Scripting panel and execute just the image extraction, I still need the np.clip, since some of the RGB channels fall outside the range [0..1]. – Eric Cousineau Oct 18 '19 at 20:01
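For concreteness, here is a small sketch of that clipping step on made-up data (the real values would come from the Viewer node buffer); out-of-range channels like these are expected for scene-referred linear data, where highlights exceed 1.0 and filtering can produce slightly negative values:

```python
import numpy as np

# Made-up stand-in for raw Viewer node channel values: linear data can
# exceed 1.0 (bright highlights) or dip slightly below 0.0.
raw = np.array([-0.02, 0.0, 0.4, 1.0, 1.8, 3.5])

# Count how many samples fall outside the displayable [0, 1] range.
out_of_range = np.count_nonzero((raw < 0.0) | (raw > 1.0))
print(out_of_range)  # 3 of the 6 samples are out of range

# Clip before treating the buffer as a display-ready image.
clipped = np.clip(raw, 0.0, 1.0)
print(clipped)
```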