I'm trying to simulate IMU sensors in a Blender animation in order to generate data.
Specifically, I've parented mesh cubes (using 3-vertex parenting) to a human body mesh that I've animated with IK bones. The cubes represent the IMU sensors. I've made them active rigid bodies, but they're currently driven by the animation. Here are a couple of example screenshots; the red box on the left thigh is the "sensor":
I would like to get the orientation and acceleration of these cubes for every frame.
I'm already getting the orientation with the object.matrix_world.to_quaternion() API call. I'd like to know if there's a similar API call, or another approach, for getting or deriving the acceleration.
I've already read this post, as well as this one, but I'd like to know if there's a more accurate approach, or even an idiomatic or conventional way this is done in Blender.
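As an aside on what "acceleration" means for an IMU: a real accelerometer reports specific force in the sensor frame, i.e. R^T (a_world − g), not raw world-space acceleration. A minimal sketch of that conversion, assuming you already have the sensor's world-space linear acceleration and its rotation matrix per frame (the function name simulate_accelerometer is my own, not a Blender API):

```python
import numpy as np

def simulate_accelerometer(a_world, R_world, g=(0.0, 0.0, -9.81)):
    """Specific force an ideal accelerometer would report.

    a_world: (3,) world-space linear acceleration of the sensor.
    R_world: (3, 3) rotation matrix of the sensor (sensor frame -> world frame).
    g: world-space gravity vector (Blender's Z-up convention assumed here).
    Returns the (3,) reading in the sensor frame: R^T (a_world - g).
    """
    a = np.asarray(a_world, dtype=float)
    g = np.asarray(g, dtype=float)
    R = np.asarray(R_world, dtype=float)
    return R.T @ (a - g)

# A sensor at rest, aligned with the world axes, reads +9.81 on Z:
# it measures the support force opposing gravity, not zero.
reading = simulate_accelerometer([0.0, 0.0, 0.0], np.eye(3))
print(np.allclose(reading, [0.0, 0.0, 9.81]))  # → True
```

In Blender, the rotation matrix could come from object.matrix_world.to_3x3() for each frame, alongside the quaternion you're already extracting.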


Sample matrix_world.to_translation() every frame and then calculate the derivatives from that data as a post-process. You can also get acceleration like this: A way to get Rate of Change in translation or rotation into a shader node. Using the Animation Nodes plugin to calculate this is also a very good way. – Jaroslav Jerryno Novotny Oct 22 '18 at 10:15
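The comment's post-processing idea can be sketched as follows. The Blender sampling loop is shown in comments (the object name "Sensor" is hypothetical); the derivative step itself is plain numpy, using a central second difference to turn per-frame positions into acceleration:

```python
# In Blender, the positions would be collected per frame, e.g.:
#
#   import bpy
#   scene = bpy.context.scene
#   obj = bpy.data.objects["Sensor"]   # hypothetical object name
#   positions = []
#   for f in range(scene.frame_start, scene.frame_end + 1):
#       scene.frame_set(f)
#       positions.append(tuple(obj.matrix_world.to_translation()))
#
import numpy as np

def acceleration_from_positions(positions, fps):
    """Second derivative of position via central finite differences.

    positions: (N, 3) world-space positions, one per frame.
    fps: frames per second of the animation.
    Returns an (N, 3) array; the endpoint values are copied from
    their nearest interior neighbours, since the central difference
    is undefined at the first and last frame.
    """
    p = np.asarray(positions, dtype=float)
    a = np.empty_like(p)
    # central second difference: (p[i+1] - 2*p[i] + p[i-1]) * fps**2
    a[1:-1] = (p[2:] - 2.0 * p[1:-1] + p[:-2]) * fps**2
    a[0], a[-1] = a[1], a[-2]
    return a

# Sanity check: constant acceleration of 9.81 m/s^2 along Z is
# recovered exactly by the central difference (it is exact on quadratics).
fps = 24.0
t = np.arange(10) / fps
pos = np.stack([np.zeros_like(t), np.zeros_like(t), 0.5 * 9.81 * t**2], axis=1)
acc = acceleration_from_positions(pos, fps)
print(np.allclose(acc[1:-1], [0.0, 0.0, 9.81]))  # → True
```

Note that because the cubes follow a baked animation rather than physics, differentiating the sampled curve like this is as accurate as the frame rate allows; subsampling with scene.frame_set on subframes (or a higher fps) reduces the discretization error.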