Meta-learning in Julia

Meta-learning is a branch of machine learning that deals with learning to learn. An example is few-shot learning, where an algorithm must learn quickly from a small dataset, given examples of many datasets that are similar in some way (for example, each dataset is a classification task, but the classes present in each dataset differ).

I recently stumbled upon pytorch-meta, which includes data-loading functionality for the most common datasets used in meta-learning. I feel that something similar would be a good contribution to get people to use Flux and Julia for meta-learning.

Is this of interest to anyone else out there? I'd be happy to work together on a package like this for Julia!


Have you looked at the Flux model zoo? There are some meta-learning examples there.


If you make a PR to add that data loader to MLDatasets.jl I will review it.

Here is the link.


Thanks for the response @Zach_Christensen @oxinabox!

MLDatasets.jl is good for standard machine learning, but for meta-learning we need to create many small datasets from a larger one (or sample explicitly from a predefined distribution over distributions; see, for example, the sine-wave example in the original MAML paper). Materializing all of these datasets up front is unwieldy, since the resulting dataset-of-datasets would be very large; creating them on the fly from the original large dataset seems like a better solution.
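To make the on-the-fly idea concrete, here is a rough sketch of sampling regression tasks lazily, modeled on the sine-wave setup from the MAML paper: each task is a sinusoid with a randomly drawn amplitude and phase, and labeled points are generated per task rather than stored. The function names here are just illustrative, not from any existing package.

```julia
using Random

# One "task" from the distribution over tasks: a sinusoid
# parameterized by amplitude and phase.
struct SineTask
    amplitude::Float64
    phase::Float64
end

# Draw a task; amplitude in [0.1, 5.0], phase in [0, π]
# (the ranges used in the MAML sine-wave experiment).
sample_task(rng=Random.default_rng()) =
    SineTask(0.1 + 4.9 * rand(rng), π * rand(rng))

# Draw k labeled points from one task, with x uniform on [-5, 5].
function sample_points(task::SineTask, k::Int; rng=Random.default_rng())
    x = 10 .* rand(rng, k) .- 5
    y = task.amplitude .* sin.(x .+ task.phase)
    return x, y
end

task = sample_task()
x, y = sample_points(task, 10)
```

Nothing is precomputed here: the "dataset-of-datasets" exists only implicitly as the task distribution, and each call generates a fresh small dataset.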

For the framework, I was thinking of something more along the lines of Reinforce.jl, but for meta-learning.
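As a purely hypothetical sketch of what such an interface could look like (all names here are invented for illustration), a package might expose a task-distribution type that lazily builds N-way k-shot tasks, with support/query splits, from an underlying labeled dataset:

```julia
using Random

abstract type TaskDistribution end

# A few-shot classification task distribution over a labeled dataset,
# e.g. a Dict mapping each class label to a vector of samples.
struct FewShotTasks{D} <: TaskDistribution
    dataset::D
    n_way::Int
    k_shot::Int
end

# Lazily build one task: pick n_way classes, then split each class's
# samples into a support set (for adaptation) and a query set
# (for evaluation). No dataset-of-datasets is ever materialized.
function sample_task(t::FewShotTasks; rng=Random.default_rng())
    classes = shuffle(rng, collect(keys(t.dataset)))[1:t.n_way]
    support, query = Tuple{Any,Any}[], Tuple{Any,Any}[]
    for c in classes
        xs = shuffle(rng, t.dataset[c])
        append!(support, [(x, c) for x in xs[1:t.k_shot]])
        append!(query,   [(x, c) for x in xs[t.k_shot+1:end]])
    end
    return support, query
end

# Toy example: 5 classes with 10 scalar samples each, 3-way 2-shot.
tasks = FewShotTasks(Dict(c => rand(10) for c in 1:5), 3, 2)
support, query = sample_task(tasks)
```

The appeal of this shape is the analogy to Reinforce.jl: just as an environment is something you can repeatedly step through, a task distribution is something you can repeatedly sample small datasets from.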