I have been using Blender for synthetic data generation for over a month now. I was hoping to achieve the same computation in less time than using a BVH tree for collision detection, but no luck so far.
I was using the bounding box from the bpy module, like

import bpy

cube = bpy.data.objects['Cube']
cube.bound_box  # returns an iterable of the 8 corner points in 3D space (local coordinates)
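Outside Blender, the corner handling can be sketched in plain Python. The `corners` list below is a hypothetical stand-in for the iterable that `cube.bound_box` returns, and the helper shows the usual way those 8 points get collapsed into an axis-aligned min/max pair:

```python
# Eight corners of a unit cube centered at the origin, standing in
# for the iterable that cube.bound_box returns (hypothetical data).
corners = [
    (x, y, z)
    for x in (-0.5, 0.5)
    for y in (-0.5, 0.5)
    for z in (-0.5, 0.5)
]

def aabb_min_max(points):
    """Collapse corner points into per-axis (min, max) tuples."""
    xs, ys, zs = zip(*points)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

lo, hi = aabb_min_max(corners)
print(lo, hi)  # → (-0.5, -0.5, -0.5) (0.5, 0.5, 0.5)
```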
There are two problems that I faced:
The bound box won't update on its own after the object is transformed (rotation, location, scale). For that I had to use

bpy.ops.object.transform_apply(location=True, rotation=True, scale=True)

When I did force an update that way, I noticed that the bound box Blender gives isn't as ideal as the documents suggest; it can be confirmed in the example below.
The surface of the cube should itself be its bound box, as it is a cube after all, but cube.bound_box gives the bound box shown in the picture, which is nowhere near ideal.
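A quick numeric check (pure Python, no bpy) shows why the box can look "nowhere near ideal": once a rotation is baked into the mesh with transform_apply, the recomputed local bound box is the axis-aligned box around the rotated corners, which is strictly larger than the cube itself. A hypothetical 45-degree spin around Z illustrates the growth:

```python
import math

# Corners of a unit cube centered at the origin.
corners = [
    (x, y, z)
    for x in (-0.5, 0.5)
    for y in (-0.5, 0.5)
    for z in (-0.5, 0.5)
]

def rotate_z(p, angle):
    """Rotate a point around the Z axis by `angle` radians."""
    x, y, z = p
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y, s * x + c * y, z)

# Bake a 45-degree rotation into the geometry, as transform_apply would.
rotated = [rotate_z(p, math.pi / 4) for p in corners]

# The axis-aligned extent along X grows from 1.0 to sqrt(2).
width = max(p[0] for p in rotated) - min(p[0] for p in rotated)
print(round(width, 6))  # → 1.414214
```

So the reported box isn't wrong so much as axis-aligned: it hugs the world axes, not the cube's faces.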
Note: If anyone finds something wrong in this post, please do comment, as I am in the middle of developing a framework that is basically an automation of this scene creation. It would really help me a lot if you could suggest or provide a solution to the issues I mentioned above.
[ob.matrix_world @ Vector(b) for b in ob.bound_box] will give the transformed coordinates of the "updated bbox" and will be both axis- and object-aligned as it was in the first place. IMO this is a local vs global coordinates issue, not how to program an OOBB. Anyway we'll see. – batFINGER Nov 30 '20 at 13:01
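The comment's one-liner can be mimicked without bpy: `matrix_world` is a 4x4 matrix, and `matrix_world @ Vector(b)` is a matrix-vector product with an implicit homogeneous coordinate of 1. A minimal sketch, assuming a hypothetical world matrix that only translates by (2, 0, 0):

```python
def transform(mat, p):
    """Apply a 4x4 matrix to a 3D point (implicit w = 1),
    mimicking Blender's `matrix_world @ Vector(b)`."""
    x, y, z = p
    return tuple(
        mat[r][0] * x + mat[r][1] * y + mat[r][2] * z + mat[r][3]
        for r in range(3)
    )

# Hypothetical world matrix: translate by (2, 0, 0), no rotation or scale.
matrix_world = [
    [1, 0, 0, 2],
    [0, 1, 0, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 1],
]

# Local-space corners of a unit cube, standing in for ob.bound_box.
corners = [
    (x, y, z)
    for x in (-0.5, 0.5)
    for y in (-0.5, 0.5)
    for z in (-0.5, 0.5)
]

world_corners = [transform(matrix_world, b) for b in corners]
print(world_corners[0])  # → (1.5, -0.5, -0.5)
```

Because no rotation is baked into the matrix here, the transformed corners stay axis-aligned; with a rotating matrix_world they would form an oriented box in world space, which is exactly the local-vs-global distinction the comment points at.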