
I'm working on a project that loads a batch of 3D models (around 100) into a scene, exports a render, deletes all of the models, loads 100 fresh models into a new scene, dumps those, and so on.

I have a script that needs to do this a few thousand times, but my computer's memory is exhausted after roughly 100 iterations. I think Blender is keeping all of the deleted models in memory. Closing Blender and reopening it to run the script again works, but I don't want to do that every 100 iterations.

There must be a way to permanently delete models from a scene, right?

Any suggestions would be highly appreciated. Thanks!
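For reference, here is a minimal sketch of the kind of cleanup I have in mind: removing every object from the scene and then purging the data-blocks (meshes, materials, images) that are left with zero users. This is only an illustration, not my actual script, and it assumes the models come in as ordinary objects in `bpy.data.objects`:

```python
import bpy

def clear_scene():
    """Remove all objects, then purge data-blocks left with zero users."""
    # Unlink and remove the objects themselves.
    for obj in list(bpy.data.objects):
        bpy.data.objects.remove(obj, do_unlink=True)

    # Removing an object does not free its mesh/material/image data;
    # those stay behind as orphaned data-blocks and keep using memory,
    # so remove anything that no longer has any users.
    for block_collection in (bpy.data.meshes, bpy.data.materials,
                             bpy.data.images, bpy.data.textures):
        for block in list(block_collection):
            if block.users == 0:
                block_collection.remove(block)
```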

  • Does it matter if you add a bit of overhead from opening Blender on every run of the script? If not, I would suggest using Python's subprocess module to run Blender in headless mode with your script, and modifying the script to accept command-line arguments specifying which models to load and where to save your renders (see the sketch after these comments). https://docs.python.org/3/library/subprocess.html See also: https://blender.stackexchange.com/questions/1365/how-can-i-run-blender-from-command-line-or-a-python-script-without-opening-a-gui – TLousky Jan 14 '19 at 15:59
  • If anyone is having this problem, I actually found the answer here: https://blender.stackexchange.com/questions/48836/purge-unused-data-e-g-particle-system-or-groups-by-script/80199 Amir's answer seems to be working fine! – BoobooRoose Jan 24 '19 at 05:19
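Expanding on TLousky's suggestion above, here is a rough sketch of a wrapper that runs each batch in a fresh headless Blender process, so all memory is released to the OS when the process exits. The `render_batch.py` script and its `--start`/`--count` arguments are hypothetical; everything after `--` on the command line is passed through to the script and can be read from `sys.argv`:

```python
import subprocess

# Render each batch of 100 models in a separate background Blender process.
for start in range(0, 5000, 100):
    subprocess.run([
        "blender", "--background",
        "--python", "render_batch.py",
        "--", "--start", str(start), "--count", "100",
    ], check=True)  # raise if a batch fails instead of silently continuing
```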

0 Answers