I've been trying to train an RNN on data that I pre-processed externally and saved to disk. The dataset is too large to fit in memory, and the individual examples additionally vary in shape.
While looking for a way to do out-of-core training under these constraints, the closest thing I've found is this question; however, it's not quite what I need.
So, my question is whether I can create a custom NetEncoder (i.e. something I can pass into my NetGraph), or whether this is a feature currently being worked on.
Otherwise, is my only option to load subsets of my data and train on them one chunk at a time, along the lines of the sketch below?
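For context, this is roughly the workaround I have in mind; the directory, file format, and chunk size are just illustrative, and I'm assuming each file holds a list of `input -> target` rules in whatever form I saved them:

    (* workaround sketch: train on one chunk of files at a time,
       feeding the partially trained net back into NetTrain *)
    files = FileNames["*.wl", "path/to/preprocessed/data"];  (* illustrative path *)
    chunks = Partition[files, UpTo[500]];                     (* illustrative chunk size *)

    trained = net;  (* net is my (uninitialized or initialized) NetGraph *)
    Do[
      data = Import /@ chunk;   (* depends on how the examples were exported *)
      trained = NetTrain[trained, Flatten[data, 1], MaxTrainingRounds -> 1],
      {chunk, chunks}
    ]

This works in the sense that the weights carry over between NetTrain calls, but it feels clunky, and I'm not sure the optimizer's internal state (momentum, learning-rate schedule, etc.) survives across calls, which is part of why I'd prefer a proper encoder-based solution.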