
I'm using Python scripting to randomly place a few hundred copies of 10 assets into a scene, along with individual modifications to those objects.

The assets originate from a separate Blender file, and each asset is in its own collection.

Each of those source assets could be as simple as a single mesh, or a complicated set of objects, geometry nodes and materials. It's arbitrary.

The individual modifications include changing shape key values, but can also be anything else, including arbitrary vertex edits.

This is running in headless Blender, v3.2.1.

The approach I'm using now is:

  1. Make an empty scene
  2. Append the source collections from our separate file containing the source assets
  3. For each of our hundred objects
    1. Duplicate the collection
    2. Modify the duplicate
  4. Hide the original source assets from viewport and rendering
  5. Render the image
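For concreteness, steps 2 and 4 can be sketched directly with `bpy.data`. The blend path is a placeholder, and linking the appended collections under the scene collection is an assumption about the setup:

```python
def append_source_collections(blend_path):
    """Step 2: append every collection from the separate asset file."""
    import bpy  # only available inside Blender

    with bpy.data.libraries.load(blend_path, link=False) as (data_from, data_to):
        data_to.collections = list(data_from.collections)
    # Link the appended collections under the scene so they are usable.
    for coll in data_to.collections:
        bpy.context.scene.collection.children.link(coll)
    return data_to.collections


def hide_source_collections(collections):
    """Step 4: keep the pristine source assets out of viewport and render."""
    for coll in collections:
        coll.hide_viewport = True
        coll.hide_render = True
```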

The way we duplicate the collection is


def copy_collection_using_duplicate_objects(dest_collection_parent, src_collection, name=None):
    dest_collection = bpy.data.collections.new(src_collection.name)
    dest_collection_parent.children.link(dest_collection)
    # Instead of copying the objects individually, make use of the
    # UI operations to ensure the objects all remain linked
    bpy.ops.object.select_all(action='DESELECT')
    for obj in src_collection.all_objects:
        obj.hide_select = False
        obj.select_set(True)

    bpy.ops.object.duplicate(linked=False)

    # Move objects into the new collection
    objs = bpy.context.selected_objects
    for ob in objs:
        for coll in ob.users_collection:
            coll.objects.unlink(ob)

        dest_collection.objects.link(ob)

    if name:
        dest_collection.name = f'{src_collection.name}-{name}'

    return dest_collection
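For context, a sketch of how the loop in step 3 drives this function. The copy count, the random placement, and the shape-key tweak are illustrative only, not my actual modification logic:

```python
import random


def populate_scene(scene_collection, source_collections, count=200):
    """Step 3: duplicate a random source asset, then tweak the copy."""
    for i in range(count):
        src = random.choice(source_collections)
        dup = copy_collection_using_duplicate_objects(
            scene_collection, src, name=str(i))
        for obj in dup.all_objects:
            # Hypothetical per-copy modifications: move the object and
            # nudge its shape keys, if the object data has any.
            obj.location.x += random.uniform(-10.0, 10.0)
            keys = getattr(obj.data, "shape_keys", None)
            if keys:
                for kb in keys.key_blocks:
                    kb.value = random.random()
```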

The reason for implementing the code this way is that, through trial and error, it was the only way I could find to copy all the objects while retaining all the links between objects, materials, modifiers, nodes and everything else.

What I've measured is that it can take 100-200 ms to copy even a blank cube. With one or two hundred collections to copy, the time adds up and becomes as long as, or longer than, the rendering time.

I've attempted to rewrite copy_collection_using_duplicate_objects() using only bpy.data, but mimicking all the functionality of bpy.ops.object.duplicate() is a daunting task. I have read through the Blender source code, and it does a lot of things I can't find an easy parallel for in Python.
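The bpy.data-only direction I've been attempting looks roughly like the sketch below. It covers plain object/data copies and a few common internal references (parents, modifier and constraint targets), but not drivers, geometry-node inputs, or the other cases bpy.ops.object.duplicate() handles:

```python
def copy_collection_with_bpy_data(dest_parent, src_collection, name=None):
    """Partial bpy.data-only duplicate; known to miss drivers, node inputs, etc."""
    import bpy  # only available inside Blender

    dest = bpy.data.collections.new(src_collection.name)
    dest_parent.children.link(dest)

    mapping = {}  # original object -> copy
    for obj in src_collection.all_objects:
        dup = obj.copy()
        if obj.data is not None:
            dup.data = obj.data.copy()  # own mesh, so vertex edits stay local
        dest.objects.link(dup)
        mapping[obj] = dup

    # Re-point references between the copies so they form a self-contained set.
    for dup in mapping.values():
        if dup.parent in mapping:
            dup.parent = mapping[dup.parent]
        for mod in dup.modifiers:
            if getattr(mod, "object", None) in mapping:
                mod.object = mapping[mod.object]
        for con in dup.constraints:
            if getattr(con, "target", None) in mapping:
                con.target = mapping[con.target]

    if name:
        dest.name = f'{src_collection.name}-{name}'
    return dest
```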

My goal is to get this to take as little time as possible.

  • Have you tried https://blender.stackexchange.com/a/243063? – scurest Feb 28 '23 at 19:03
  • I think you may want to use Instance collections, instance them where you want, use bpy.ops.object.duplicates_make_real() and then tweak the objects. – Gorgious Feb 28 '23 at 19:27
  • Good suggestions - thank you. I'll try turning the view_layer updates off and see if there are side effects further on.

    And it will be interesting to test whether there is any difference in time taken between bpy.ops.object.duplicate() and bpy.ops.object.duplicates_make_real().

    – Phil Martin Feb 28 '23 at 20:00
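For anyone trying Gorgious' suggestion, a rough, untested sketch of the instancing route: create a collection-instance empty, then realize it into editable objects. The context handling is an assumption for headless use:

```python
def instance_and_realize(scene_collection, src_collection, location):
    """Instance a collection at a location, then make the instance real."""
    import bpy  # only available inside Blender

    # A lightweight empty that instances the whole source collection.
    inst = bpy.data.objects.new(src_collection.name, None)
    inst.instance_type = 'COLLECTION'
    inst.instance_collection = src_collection
    inst.location = location
    scene_collection.objects.link(inst)

    # Realize the instance; the realized objects end up selected.
    bpy.ops.object.select_all(action='DESELECT')
    inst.select_set(True)
    bpy.context.view_layer.objects.active = inst
    bpy.ops.object.duplicates_make_real()
    return list(bpy.context.selected_objects)
```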
