
I wrote a procedural generator that saves its mesh output to a text file. I then wrote a Python 3/Blender script that parses that text file and creates the meshes from it. The parsing itself is not fast, but it finishes in a reasonable time.

However, when I run Blender with the script (blender --python parsetext.py) on more than ~12000 polygons, Blender turns grey on startup and stops responding (my computer goes quiet, and Blender seems to stop doing anything). When I run the script with slightly fewer polygons it works just fine, and rendering and in-program navigation work flawlessly.

It seems like there is some kind of limit on how many objects Blender can hold, or that Blender adds objects in an inefficient way (recursively, perhaps hitting a recursion depth limit?).

Has anybody else had this problem? It really seems like Blender should be able to handle more than ~12000 polygons.

Otherwise, can anyone think of a way to save/load my meshes from the script more efficiently? (I can't find anything useful on the .blend file format, though.)

Edit

I tried following some advice about merging objects. Before that, each polygon was created with createpolygon(), which takes a list of coordinates and a texture, and the resulting object was then linked to the scene.

Now createbuilding() is called instead; it calls createpolygon() a few times and then tries to join the resulting objects. However, I now get a segmentation fault.

import bpy

def createpolygon(coords, texture, length):
    # Create one object containing a single n-gon built from the coordinate list.
    me = bpy.data.meshes.new('mesh')
    ob = bpy.data.objects.new('mesh', me)
    me.from_pydata(coords, [], [range(length)])
    me.update(calc_edges=True)
    try:
        me.materials.append(materialslist[texture])
    except:
        # Texture not found in materialslist; leave the mesh without a material.
        pass
    return ob

def createbuilding(obj, texture):
    # obj is a list of polygons, each a list of coordinates.
    meshes = []

    for me in obj:
        ob = createpolygon(me, texture, len(me))
        ob.select = True
        meshes.append(ob)
        bpy.context.scene.objects.link(ob)

    bpy.context.scene.objects.active = meshes[0]

    bpy.ops.object.join()  # the segmentation fault happens here
Johnny S
  • Would be useful if you could show some of your code as there may be existing questions which may answer your issue. See: http://blender.stackexchange.com/questions/14814/object-creation-slows-over-time, http://blender.stackexchange.com/questions/7581/blender-gets-very-slow-to-draw-a-scene-having-large-number-of-plane and https://blender.stackexchange.com/questions/7358/python-performance-with-blender-operators – Ray Mairlot Jun 28 '15 at 13:10
  • from_pydata() is really not supposed to be called for every single polygon you wanna create. You should pass it the entire vertex and face info in one go, see e.g. https://blender.stackexchange.com/questions/2407/how-to-create-a-mesh-programmatically-without-bmesh For low-level mesh creation see http://blenderartists.org/forum/showthread.php?301499-Create-Default-Cube-(Exactly)-from-python-code – CodeManX Aug 28 '15 at 21:41

1 Answer


If you are trying to add one object per polygon, then yes, that is a quite bad idea with current Blender. Adding objects can indeed be O(n²), because each time you add one, Blender has to check all (or at least a large subset) of the existing objects for name collisions. I would suggest adding your 12000 polygons as a single object…
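A minimal sketch of that approach, assuming the Blender 2.7x API used in the question; the names build_single_object and polygons are made up for illustration, and each polygon is taken to be a list of (x, y, z) tuples:

import bpy

def build_single_object(polygons, name='building'):
    # Flatten every polygon into one shared vertex list and one face list,
    # so the whole building becomes a single mesh created by a single
    # from_pydata() call instead of one object per polygon.
    verts = []
    faces = []
    for poly in polygons:
        start = len(verts)
        verts.extend(poly)
        faces.append(list(range(start, start + len(poly))))

    me = bpy.data.meshes.new(name)
    me.from_pydata(verts, [], faces)
    me.update(calc_edges=True)

    ob = bpy.data.objects.new(name, me)
    bpy.context.scene.objects.link(ob)  # 2.7x-style scene linking
    return ob

This avoids both the per-object name-collision cost described above and the bpy.ops.object.join() call. Vertices shared between adjacent polygons end up duplicated, but they could be merged afterwards with a remove-doubles pass if that matters.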

Beyond that, it's pretty much impossible to help you without access to the code.

mont29