So I've been dinking around with the NetRender function in Blender, using an add-on I found that enables GPU NetRender support. I'm looking to render single images across multiple slaves.
My first idea was to set a render border and send two jobs with different borders to two different machines, but NetRender doesn't support borders.
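For context, this is roughly what I meant by a per-machine border, done through the normal bpy render settings rather than anything NetRender-specific (a minimal sketch; the 0.0–0.5 split is just an example for the "left half" machine):

```python
import bpy

scene = bpy.context.scene
scene.render.use_border = True           # only render the region defined below
scene.render.use_crop_to_border = False  # keep the full frame size so the halves line up

# Left half of the frame for machine A; machine B would use 0.5 - 1.0 instead.
scene.render.border_min_x = 0.0
scene.render.border_max_x = 0.5
scene.render.border_min_y = 0.0
scene.render.border_max_y = 1.0
```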
So I fell back on image stacking: rendering the same shot twice with different seeds. But, due to my idiocy, I can't find a way to merge two pictures rendered with different sample seeds. I have both rendered shots, but can't for the life of me find a way to combine them, either in Blender or in my photo editor of choice, GIMP.
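To be concrete, this is the kind of 50/50 average I'm after, sketched as a compositor node setup driven from Python (the file names are just placeholders for my two renders, and I'm not sure this is actually the right approach):

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Placeholder paths for the two renders done with different seeds.
img_a = bpy.data.images.load("//render_seed_1.png")
img_b = bpy.data.images.load("//render_seed_2.png")

node_a = tree.nodes.new("CompositorNodeImage")
node_a.image = img_a
node_b = tree.nodes.new("CompositorNodeImage")
node_b.image = img_b

# Mix the two renders 50/50 so the noise from each seed averages out.
mix = tree.nodes.new("CompositorNodeMixRGB")
mix.blend_type = 'MIX'
mix.inputs['Fac'].default_value = 0.5

composite = tree.nodes.new("CompositorNodeComposite")

tree.links.new(node_a.outputs['Image'], mix.inputs[1])
tree.links.new(node_b.outputs['Image'], mix.inputs[2])
tree.links.new(mix.outputs['Image'], composite.inputs['Image'])
```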
What would be the best way to go about this?
The focus, though, is more on split network rendering than on the image merging itself.