
I am making a terrain (ground + vegetation mesh) for an old game engine, which needs all the geometry separated into chunks of roughly 25k vertices max.

So far I use my "separate by radius" script, which enters Edit mode, selects the faces within a given radius, separates them using bpy.ops.mesh.separate(type='SELECTED') and repeats until the whole mesh is separated.

This is the original script:

import bpy
import bmesh
import time

class SeparateByRadius(bpy.types.Operator):
    bl_idname = "object.separate_by_radius"
    bl_label = "Separate by Radius"
    bl_options = {'REGISTER', 'UNDO'}

    max_diameter = bpy.props.IntProperty(name="Radius", default=0)

    def execute(self, context):

        if self.max_diameter == 0:
            return {'FINISHED'}

        context = bpy.context
        scene = context.scene

        bpy.ops.object.mode_set(mode="EDIT")

        obj = bpy.context.edit_object
        me = obj.data

        bm = bmesh.from_edit_mesh(me)

        time_start_updating = time.time()

        def separateNextChunk():

            bm.faces.ensure_lookup_table()

            if len(bm.faces) == 0:
                bpy.ops.object.mode_set(mode="OBJECT")
                bpy.ops.object.select_all(action='DESELECT')
                context.active_object.select = True
                bpy.ops.object.delete()
            else:
                active_median = bm.faces[-1].calc_center_median()

                continue_next = False

                for f in bm.faces:
                    f.select = False
                    if (f.calc_center_median()-active_median).length <= self.max_diameter / 2:
                        f.select = True
                        continue_next = True


                bpy.ops.mesh.separate(type='SELECTED')
                # Show the updates in the viewport
                # and recalculate n-gon tessellation.
                bmesh.update_edit_mesh(me, True)

                if continue_next == True:
                    separateNextChunk()

        separateNextChunk()

        print("Separated in " + str(time.time() - time_start_updating))

        return {'FINISHED'}

def register():
    bpy.utils.register_class(SeparateByRadius)


def unregister():
    bpy.utils.unregister_class(SeparateByRadius)

if __name__ == "__main__":
    register()

While it does the job, it is EXTREMELY slow: the splitting often results in ~2000 objects, and separating them this way takes hours. I suspect the reason is that every single separation modifies the original (large) mesh.

I have tried to find alternative methods that would generate the chunks without modifying the original mesh, but as I am not a Python programmer, it quickly became too complex for me.
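
For example, this is roughly what I imagined for copying one chunk's faces into a new object without ever entering Edit mode on the big mesh. It is only a rough, untested sketch: the helper name copy_faces_to_new_object is my own, and the calls are written against the Blender 2.80 Python API (string context for bmesh.ops.delete, collection linking), while my script above still uses the 2.79 API.

import bpy
import bmesh

def copy_faces_to_new_object(src_obj, face_indices, name="Chunk"):
    # Load the whole source mesh into a temporary bmesh; this also brings
    # along UV layers, vertex color layers and per-face material indices.
    bm = bmesh.new()
    bm.from_mesh(src_obj.data)
    bm.faces.ensure_lookup_table()

    # Delete every face that is NOT part of this chunk, so only the
    # chunk's faces (and the vertices/edges they use) remain.
    keep = set(face_indices)
    doomed = [f for f in bm.faces if f.index not in keep]
    bmesh.ops.delete(bm, geom=doomed, context='FACES')

    # Write the remaining geometry into a brand-new mesh datablock.
    new_me = bpy.data.meshes.new(name)
    bm.to_mesh(new_me)
    bm.free()

    # Copy the material slots so the per-face material indices stay valid.
    for mat in src_obj.data.materials:
        new_me.materials.append(mat)

    new_obj = bpy.data.objects.new(name, new_me)
    bpy.context.scene.collection.objects.link(new_obj)
    return new_obj

Even if this is correct, I am not sure it would be much faster, because it still copies the whole mesh into a bmesh for every chunk, so maybe there is a smarter way.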

I would appreciate it very much if anyone could share specific ideas on how to make the script noticeably faster while producing the same output. The requirements are (a rough sketch of the chunking step I have in mind follows the list):

  • separating into chunks of a given max vertex count or given area
  • splitting must use the existing mesh (no cutting)
  • UVs, materials and vertex colors must be preserved perfectly
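
To illustrate the first requirement, this is roughly the chunk-collection step I have in mind (again only an untested sketch with guessed values: collect_chunks, CELL_SIZE and MAX_VERTS are my own names, and 50 is an arbitrary cell size). The idea is to bucket faces by a coarse grid over their centers instead of doing distance checks against every face, then split any cell that would exceed the vertex limit. Each resulting list of face indices could then be passed to something like the copy function above.

import bmesh
from collections import defaultdict

MAX_VERTS = 25000   # chunk size limit required by the engine
CELL_SIZE = 50.0    # grid cell size in Blender units; just a guessed value

def collect_chunks(me, cell_size=CELL_SIZE, max_verts=MAX_VERTS):
    # Return lists of face indices; each list uses at most max_verts vertices.
    bm = bmesh.new()
    bm.from_mesh(me)
    bm.faces.ensure_lookup_table()

    # Bucket face indices by the grid cell their center falls into.
    cells = defaultdict(list)
    for f in bm.faces:
        c = f.calc_center_median()
        cells[(int(c.x // cell_size), int(c.y // cell_size))].append(f.index)

    # Split a cell further whenever its faces would reference too many vertices.
    chunks = []
    for face_ids in cells.values():
        chunk, verts = [], set()
        for fi in face_ids:
            f_verts = {v.index for v in bm.faces[fi].verts}
            if chunk and len(verts | f_verts) > max_verts:
                chunks.append(chunk)
                chunk, verts = [], set()
            chunk.append(fi)
            verts |= f_verts
        if chunk:
            chunks.append(chunk)

    bm.free()
    return chunks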


A .blend with the resulting mesh is attached; just Join all the pieces into a single mesh for the purpose of script testing.

  • You should add a sample blend file, I think, to allow people to test eventual improvements. – lemon Jul 12 '19 at 17:41
  • 1. split your mesh into a grid with a rule per face for which object it gets copied into, not circle-ish parts which need a lot of L2 distance checks. 2. duplicate parts of the original mesh directly in Python to avoid editing the big mesh. 3. you could maybe use the separate-loose-parts function. – HenrikD Jul 12 '19 at 17:44
  • @lemon I have attached it, thx. – Jan Kadeřábek Jul 12 '19 at 17:54
  • @HenrikD Split into a grid? I am not sure I understand you. It must produce pieces of around 25k vertices. I would like to duplicate it into chunks via Python, but I struggled especially with preserving UVs, Vertex Colors and Materials; unfortunately my skills are quite limited. Separate by Loose Parts can't be used, as it must work with larger continuous meshes (like the ground) but at the same time it shouldn't produce too many small chunks (in case of individual trees or grass clusters). – Jan Kadeřábek Jul 12 '19 at 17:57
  • the file you've uploaded is already cut, right? so joining it again will make the base mesh? – lemon Jul 12 '19 at 18:07
  • I've joined it again (see my previous comment). I think you should use a dichotomy to separate the parts: cut in half, cut the half in half, etc. The more the mesh is divided, the faster it gets (doing it from the Blender GUI). – lemon Jul 12 '19 at 18:10
  • @lemon It must be automated, no hand cutting; even switching into Edit Mode is very slow in Blender with such a large mesh. The part in the uploaded .blend is just a very small portion of the overall mesh (which is 40 km of road plus surroundings covered by vegetation). – Jan Kadeřábek Jul 12 '19 at 18:24
  • Consider employing this method https://blender.stackexchange.com/a/133136/15543 – batFINGER Jul 13 '19 at 05:25
  • @batFINGER Thanks, but one of my requirements is to explode the mesh using its existing topology, not by creating extra cuts, as this would create a lot of unnecessary extra vertices. It should basically select an area of 25k vertices, copy it into a new object and continue with the next part. – Jan Kadeřábek Jul 13 '19 at 08:15
  • I have modified this script and converted it into an addon, not very good code, but it works really great - thank you: https://github.com/jendabek/blender-export-x-rbr – Jan Kadeřábek Jul 15 '19 at 00:03