Writing PyTorch layers in Julia

Hi All,

From time to time, I am pushed to rewrite Mill.jl (a multi-instance learning library) in PyTorch. While that might be possible, I would like to avoid it, as we make heavy use of custom-written gradients to help us deal with missing values and with the flexibility of the input data, which in general can be JSON files. I wonder if I can create a wrapper in PyTorch that would behave like a normal layer, so that people could combine Mill models with other PyTorch modules. I expect it to be possible, but I would like to ask first whether someone has tried something like this (a rough sketch of what I have in mind is below). I know some people have done it the other way around, using PyTorch from Julia, but that is not what I want.
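To make the question concrete, something along the lines of the sketch below is what I imagine. This is only a minimal sketch, assuming the `juliacall` package (from PythonCall.jl) on the Python side, and a toy Julia function standing in for a real Mill.jl model; `toy_forward`/`toy_backward` are placeholder names I made up, and a real version would call Mill's forward pass and its custom gradients instead.

```python
import numpy as np
import torch
from juliacall import Main as jl

# Toy Julia-side forward and backward; stand-ins for a Mill.jl model
# and its custom gradient rules.
jl.seval("toy_forward(x) = 2.0 .* x")
jl.seval("toy_backward(dy) = 2.0 .* dy")

class JuliaFunction(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Hand the data to Julia as a NumPy array and bring the result back as a tensor.
        y = jl.toy_forward(x.detach().cpu().numpy())
        return torch.as_tensor(np.asarray(y), dtype=x.dtype)

    @staticmethod
    def backward(ctx, grad_output):
        # Ask Julia for the gradient with respect to the input.
        g = jl.toy_backward(grad_output.detach().cpu().numpy())
        return torch.as_tensor(np.asarray(g), dtype=grad_output.dtype)

class JuliaLayer(torch.nn.Module):
    """Wraps the Julia call so it composes like any other PyTorch module."""
    def forward(self, x):
        return JuliaFunction.apply(x)

# Quick check: under the toy rule above, the gradient should be all 2.0.
layer = JuliaLayer()
x = torch.randn(3, requires_grad=True)
layer(x).sum().backward()
print(x.grad)
```

The part I am unsure about is whether this kind of wrapper is practical once the Julia side holds a stateful model with its own parameters and the inputs are the tree-structured data Mill works with, rather than plain arrays.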

Thanks in advance for any answers. Explanations of possible roadblocks are welcome.