Pretty new to Blender - I'm having difficulty wrapping my head around how Blender caches image sequences. I have a 355 MB JPEG sequence that I'm trying to prefetch and camera track. Somehow Blender turns this 355 MB sequence into over 20 GB of cache when I prefetch the frames. There is nothing else in my outliner.
I've got 32 GB of RAM on my machine, and I've set the Memory Cache Limit to 16 GB in the System preferences under Video Sequencer.
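In case it helps, here's the back-of-envelope math I tried, guessing that the cache stores frames fully decompressed rather than as JPEGs. The resolution and frame count below are made-up placeholders, not my actual footage:

```python
# Rough estimate of uncompressed frame-cache size.
# Assumptions (hypothetical numbers, not taken from my real clip):
#   - cached frames are stored decompressed, 4 channels per pixel (RGBA)
#   - either 1 byte per channel (8-bit buffers) or 4 bytes (float buffers)

width, height = 3840, 2160   # hypothetical 4K frames
frame_count = 300            # hypothetical clip length

for label, bytes_per_pixel in (("8-bit RGBA", 4 * 1), ("float RGBA", 4 * 4)):
    total_bytes = width * height * bytes_per_pixel * frame_count
    print(f"{label}: {total_bytes / 2**30:.1f} GiB")

# For these made-up numbers this prints roughly 9.3 GiB (8-bit) and
# 37.1 GiB (float), which is at least in the same ballpark as the
# 20 GB of cache I'm seeing.
```

So if the cache really does hold raw frames, the size might not be a bug, but I'd like to understand whether that's actually what's happening.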
Why is Blender taking up so much RAM for such a small sequence? Is there a setting somewhere I'm missing?
Thanks for any advice you have!