[ANN] Lilith.jl - yet another deep learning library

For the last several months I've been working on a thing called Lilith.jl - a deep learning library with a PyTorch-like API, high backward compatibility, and an emphasis on performance. It's far from complete yet, but it has started to pay off, so perhaps it's time to share it with the community. To give you a quick view of what Lilith is, here's an example of a simple model definition:

using Lilith


mutable struct Net
    conv1::Conv2d
    conv2::Conv2d
    fc1::Linear
    fc2::Linear
end


Net() = Net(
    Conv2d(1, 20, 5),
    Conv2d(20, 50, 5),
    Linear(4 * 4 * 50, 500),
    Linear(500, 10)
)

function (m::Net)(x::AbstractArray)
    # shape comments assume 28×28 grayscale input (e.g. MNIST)
    x = maxpool2d(relu.(m.conv1(x)), (2, 2))   # 28×28×1 -> 12×12×20
    x = maxpool2d(relu.(m.conv2(x)), (2, 2))   # 12×12×20 -> 4×4×50
    x = reshape(x, 4*4*50, :)                  # flatten to 800×batch_size
    x = relu.(m.fc1(x))
    x = logsoftmax(m.fc2(x))
    return x
end
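
For instance, with MNIST-sized input the model can be instantiated and run directly. This is a quick smoke test, assuming the usual Julia W×H×C×N array layout (which the reshape above implies):

m = Net()
x = rand(Float32, 28, 28, 1, 10)   # batch of 10 random 28×28 grayscale images
y = m(x)                           # 10×10 matrix of log-probabilities, one column per image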

If it looks interesting, please read on.

Anticipated questions

Why don’t you just use Flux?

I described my frustration with existing DL frameworks in Julia here.

Why PyTorch-like API?

PyTorch is one of the most widely used frameworks, with tens of thousands of models already implemented for it. By using a similar API, Lilith allows you to port these models to Julia easily. As an example, compare the implementations of ResNet in PyTorch and in Lilith.
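
To make the correspondence concrete, here is a minimal sketch of a ResNet-style basic block written in the same style as the model above. It is only an illustration, not code from the Lilith repository: the padding keyword argument is assumed by analogy with PyTorch's Conv2d and may differ in Lilith, and a real port would also include batch normalization.

mutable struct BasicBlock
    conv1::Conv2d
    conv2::Conv2d
end

# `padding=1` keeps spatial dimensions so the skip connection lines up;
# the keyword name is assumed from PyTorch and may differ in Lilith
BasicBlock(channels::Int) = BasicBlock(
    Conv2d(channels, channels, 3; padding=1),
    Conv2d(channels, channels, 3; padding=1),
)

function (b::BasicBlock)(x::AbstractArray)
    y = relu.(b.conv1(x))
    y = b.conv2(y)
    return relu.(y .+ x)   # identity skip connection
end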

Note that PyTorch and Lilith are similar only in their neural network API; internally they are quite different, with the former being more flexible and the latter being faster.

What are the next steps?

The highest-priority issues are summarized here.
