I have a script that loads multiple .obj files of room models from disk. Each .obj goes into a sub-collection depending on whether it is a room of the same house or not. At the beginning of the script I try to clear the previous collections. I assumed that doing so would also delete the corresponding embedded data blocks and thus free the memory they were holding. Instead, I've noticed that this is not happening, and at some point the script runs out of memory; I learned that the hard way. What I see is that I end up with a lot of orphan data blocks (meshes, materials, and so on).
My question, therefore, is whether there is a way to do this without resetting to factory settings, closing/re-opening the software, or looping through all the objects, materials, meshes, etc. in each collection as suggested in the links below, considering that I might have quite a few collections and sub-collections loaded each time.
The following links:
How to completely remove all loaded data from Blender?
Python: How to completely remove an object
Blender 2.80: Delete Collection or clear the intial scene in scripting mode
are what I've found from previous questions, but one way or another they either do not work well enough or do not work at all. So I would appreciate some feedback if possible, thanks.
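For reference, the looping approach from those answers walks every data collection and removes each zero-user block. The bookkeeping can be sketched without Blender, using a plain dict as a stand-in for bpy.data (the block names and user counts here are made up):

```python
# Stand-in for bpy.data: each key maps to a list of (name, users) blocks.
data = {
    "objects":   [("room_0", 1), ("room_1", 0)],
    "meshes":    [("room_0_mesh", 1), ("stale_mesh", 0)],
    "materials": [("wall", 0)],
}

def remove_zero_user_blocks(data):
    """Drop every block with no users, mimicking bpy.data.<coll>.remove()."""
    removed = 0
    for kind, blocks in data.items():
        keep = [(name, users) for (name, users) in blocks if users > 0]
        removed += len(blocks) - len(keep)
        data[kind] = keep
    return removed

print(remove_zero_user_blocks(data))  # 3 (room_1, stale_mesh, wall had no users)
```

In real bpy the outer loop would be over collections such as bpy.data.objects, bpy.data.meshes, and bpy.data.materials, which is exactly the per-block iteration I would like to avoid.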
The script that I am using, which clears the collections but not their data blocks, is the following:
import bpy
import glob
import pathlib
import numpy as np
from mathutils import Vector

def clear_collections():
    # unlink all top-level collections from the scene
    for c in list(bpy.context.scene.collection.children):
        bpy.context.scene.collection.children.unlink(c)
    # remove collections that no longer have any users
    # (iterate over a copy, since we mutate bpy.data.collections)
    for c in list(bpy.data.collections):
        if not c.users:
            bpy.data.collections.remove(c)
def load_scenes(path):
    scenes_list = glob.glob(path)
    for i in range(len(scenes_list)):
        # find all the .obj files
        obj_files = glob.glob(scenes_list[i] + "*.obj")
        # create house collections
        scene_name = str(pathlib.PurePath(scenes_list[i]).name)
        house_coll = bpy.data.collections.new(name=scene_name)  # create new collection in data
        bpy.context.scene.collection.children.link(house_coll)  # add new collection to the scene
        for j in range(len(obj_files)):
            print(obj_files[j])
            # get scene name and room and rename inline in blender
            room_name = str(pathlib.Path(obj_files[j]).stem)
            # create room collections
            if house_coll:
                room_coll = bpy.data.collections.new(name=room_name)  # create new sub-collection in data
                house_coll.children.link(room_coll)  # add new sub-collection to the scene
            # import scene with all objects not merged
            bpy.ops.import_scene.obj(filepath=obj_files[j], split_mode="ON",
                                     use_split_objects=True, use_split_groups=True)
            obs = [o for o in bpy.context.selected_objects if o.type == 'MESH']
            # put them in the center of the plane
            # TODO: find why model location is not changing as well
            coords = []
            for o in obs:
                coords.extend(o.matrix_world @ Vector(b) for b in o.bound_box)
            x, y, z = np.array(coords).reshape((-1, 3)).T
            global_xy_trans = Vector(
                (
                    (x.min() + x.max()) / 2,
                    (y.min() + y.max()) / 2,
                )
            )
            for o in obs:
                if o.parent in obs:
                    continue
                o.matrix_world.translation.xy -= global_xy_trans
            # link objects to the room collection and unlink them from the root collection
            for ob in obs:
                room_coll.objects.link(ob)
                bpy.context.scene.collection.objects.unlink(ob)
        # exclude from view layer
        bpy.context.layer_collection.children[scene_name].exclude = True
        toggle_expand_collapse(2)
def toggle_expand_collapse(type=2):
    # expand/collapse collections in outliner
    bpy.ops.wm.redraw_timer(type='DRAW_WIN', iterations=1)
    toggle_expand(bpy.context, type)  # 1 for expand, 2 for collapse (helper defined elsewhere in my script)
if __name__ == "__main__":
    # clear previous collections and their data
    clear_collections()
    # path to find scenes
    path = "/home/ttsesm/datasets/houses/*/"
    load_scenes(path)
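As an aside, the centering step above just computes the midpoint of the combined XY bounding box of all imported meshes and subtracts it from each root object. The arithmetic can be checked outside Blender with plain NumPy (the corner coordinates below are made up, standing in for the world-space bound_box corners):

```python
import numpy as np

# Hypothetical world-space bounding-box corners of two meshes (8 corners each
# in real bpy; 4 per mesh are enough to illustrate the math).
coords = np.array([
    [0.0, 0.0, 0.0], [2.0, 0.0, 0.0], [2.0, 4.0, 0.0], [0.0, 4.0, 0.0],
    [6.0, 2.0, 0.0], [8.0, 2.0, 0.0], [8.0, 6.0, 0.0], [6.0, 6.0, 0.0],
])
x, y, z = coords.reshape((-1, 3)).T
# midpoint of the combined bounding box in XY, as in the script above
global_xy_trans = np.array([(x.min() + x.max()) / 2, (y.min() + y.max()) / 2])
print(global_xy_trans)  # [4. 3.]
# after subtraction, the combined box is centered on the origin in XY
shifted = coords[:, :2] - global_xy_trans
```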
Comments:

Have you tried bpy.ops.outliner.orphans_purge()? – HikariTW Sep 29 '20 at 16:05

... a while loop where it checks whether there are any data blocks remaining and, if yes, purges them. It seems to work as long as there are not any data blocks marked as retained (which it should be like that). Apparently, by accident I had marked some brushes as retained from my previous projects. Thanks. – ttsesm Oct 01 '20 at 09:24
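The while-loop idea from the comments can be sketched without bpy: removing a block may orphan the blocks it referenced (deleting an object orphans its mesh, which in turn orphans its material), so a single pass is not enough and the purge must repeat until nothing is removed. In Blender the loop body would be a call like bpy.ops.outliner.orphans_purge(); here a small made-up dependency graph stands in for bpy.data:

```python
# Each block lists the blocks it uses; a block is orphaned when nothing uses it.
# Hypothetical chain: object -> mesh -> material, plus a pinned "scene" root.
uses = {
    "scene": [],
    "object": ["mesh"],
    "mesh": ["material"],
    "material": [],
}
roots = {"scene"}  # blocks that are never purged (e.g. still linked to a scene)

def purge_orphans(uses, roots):
    """Repeatedly drop zero-user blocks, like calling orphans_purge() in a loop."""
    passes = 0
    while True:
        used = {b for deps in uses.values() for b in deps}
        orphans = [b for b in uses if b not in used and b not in roots]
        if not orphans:
            return passes
        for b in orphans:
            del uses[b]
        passes += 1

print(purge_orphans(uses, roots))  # 3: object first, then mesh, then material
```

This is why the single unlink-and-remove pass in clear_collections() above leaves data behind: each removal only exposes the next layer of orphans.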