Over on this question I found out about io_blend_utils/blend/blendfile.py, a one-file module for reading a .blend file without actually loading it into Blender. I need to read the elements of a CollectionProperty, but so far I am only able to do this for the first element.
I fear this is a very specific and advanced question, so I am hoping @ideasman42 may be able to help.
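For context, this is the basic pattern I am using to open a file and inspect its blocks without starting Blender (just a minimal sketch; the file path is a placeholder and I import the module directly as blendfile):

import blendfile  # io_blend_utils/blend/blendfile.py, placed on my path

# open the .blend directly, no Blender process involved
with blendfile.open_blend("/path/to/example.blend") as blend:
    for block in blend.blocks:
        # every data block exposes a short code, e.g. b'SC' for scenes
        print(block.code)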
My use case
I need to crawl through a bunch of .blend files to find all the texture atlases (created with the uv_texture_atlas plugin) contained in them.
Since the .blend files are both numerous and large, I do not want the overhead of loading every one of them into Blender.
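(In the code below, blends is simply a list of .blend file paths; I collect it roughly like this, where the root directory is a placeholder:)

import pathlib

# gather every .blend file under the project root (placeholder path)
blends = sorted(str(p) for p in pathlib.Path("/path/to/projects").rglob("*.blend"))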
My code so far
import collections
import os

import blendfile  # io_blend_utils/blend/blendfile.py


def group_iterator(block):
    """
    Iterate over an IDPropertyGroup BlendFileBlock.

    :param block: block to iterate over
    :return: all IDProperty blocks in the group or list
    """
    i = block.get_pointer((b'data', b'group', b'first'))
    while i is not None:
        yield i
        i = i.get_pointer(b'next')
atlases = collections.OrderedDict()
for blend_file_path in blends:
    with blendfile.open_blend(blend_file_path) as blend:
        scenes = [block for block in blend.blocks if block.code == b'SC']
        for scene in scenes:
            # get custom properties
            properties = scene.get_pointer((b'id', b'properties'))
            if properties is None:
                continue
            # iterate through all the property groups
            for itor in group_iterator(properties):
                if itor.get(b'name') == 'ms_lightmap_groups':
                    group = itor.get_pointer((b'data', b'pointer'))
                    if group is not None:
                        # iterate over all the properties
                        for prop in group_iterator(group):
                            name = prop.get(b'name')
                            if name == 'name':
                                # the string value lives in a separate block; seek to it and read it
                                offset = prop.get((b'data', b'pointer'))
                                length = prop.get(b'len')
                                blk = blend.block_from_offset.get(offset)
                                blend.handle.seek(blk.file_offset, os.SEEK_SET)
                                atlas_name = blendfile.DNA_IO.read_string0(blend.handle, length)
                            elif name == 'resolutionX':
                                resX = prop.get((b'data', b'val'))
                            elif name == 'resolutionY':
                                resY = prop.get((b'data', b'val'))
                        atlases[atlas_name] = (2 ** (resX + 8), 2 ** (resY + 8))
                        print("Found atlas %s" % atlas_name)
Thank you so much in advance (I have been trying to figure this out for over a day now)!
Regards, Jonathan.