Spiking Neural Networks

What is the state of the art for modeling spiking neural networks in Julia?

I haven’t found much recent work, and nothing that uses recent training algorithms. Anyone working in this area? Did I overlook any projects?


I’m working on a new theory of spiking neural networks in Julia; it’s still a work in progress, not released yet.
The problem with SNNs is that the field has not reached a consensus the way conventional NNs have, so there is no standard framework for people to use…

As far as I know, people who work with SNNs differ widely in their datasets, network structures, training algorithms, computational models, and goals.

Some are trying to build neural simulators that are as realistic as possible, to mimic the patterns that occur in the brain.
Some fit an NN trained with traditional backpropagation and then transform it into an SNN.

In terms of computation, there are event-based SNNs, time-step-based models, and rate-based models, and there are also people doing it by solving ODEs. Most use CPU computation only, and most frameworks can only simulate a very small number of neurons on a single computer.

In terms of learning algorithms, some experiment with variations of STDP or Hebbian rules, on either spike trains recorded from devices or biological data from the brain. Others do STDP learning on DL datasets like MNIST, and still others are trying to get BP to work on SNNs.
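For reference, the pair-based rule that most STDP variations start from fits in a few lines (a generic textbook sketch; the constants are illustrative):

A_plus, A_minus = 0.01, 0.012     # learning amplitudes
tau_plus, tau_minus = 20.0, 20.0  # time constants (ms)

# dt = t_post - t_pre for one spike pair; potentiate if pre precedes post
stdp_dw(dt) = dt > 0 ? A_plus * exp(-dt / tau_plus) :
                      -A_minus * exp(dt / tau_minus)

w = 0.5
w += stdp_dw(5.0)   # pre fired 5 ms before post → potentiation
w += stdp_dw(-3.0)  # post fired 3 ms before pre → depression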

My personal goal for this project is a general learning theory for SNNs, implemented with CUDA.jl (GPU acceleration). It’s a time-step-based model, so it’s not as realistic as the event-based or ODE-based models and has its limitations. But it’s fast, and it lets me run experiments at a larger scale of neurons; I believe only networks at scale will show the more interesting phenomena.
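Concretely, a single time step in this kind of model boils down to something like the following (a simplified sketch, not my actual code; all parameters are made up):

using CUDA

n = 10_000
v = CUDA.zeros(Float32, n)               # membrane potentials
s = CUDA.zeros(Float32, n)               # spikes (0/1) from the previous step
w = 0.01f0 .* CUDA.randn(Float32, n, n)  # dense weight matrix

# one simulation step ≈ 1 ms: leak, integrate input, threshold, reset
function step!(v, s, w; leak = 0.9f0, thresh = 1.0f0)
    v .= leak .* v .+ w * s          # the matvec runs on the GPU via CUBLAS
    fired = v .>= thresh
    s .= fired                       # Bool broadcast into the Float32 buffer
    v .= ifelse.(fired, 0.0f0, v)    # reset neurons that spiked
    return s
end

for t in 1:100
    step!(v, s, w)
end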

But as an independent researcher, I’m making rather slow progress…


I don’t think we have anything close to NEURON, NEST, or Brian in Julia. But I think the current Julia ecosystem has a lot to offer for writing a competitor, e.g. DifferentialEquations, Unitful, or automatic differentiation (Zygote and friends).
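For example, a leaky integrate-and-fire neuron is a one-line ODE in DifferentialEquations, with the spike-and-reset handled by a callback (a minimal sketch; all parameter values are illustrative, nothing like this exists as a package):

using OrdinaryDiffEq

# leaky integrate-and-fire: τ dv/dt = -(v - v_rest) + R*I
f(v, p, t) = (-(v - p.v_rest) + p.R * p.I) / p.τ

p = (v_rest = -65.0, R = 10.0, I = 2.5, τ = 10.0, v_th = -50.0)

# fire-and-reset via a callback: when v crosses threshold, reset it
condition(v, t, integrator) = v - integrator.p.v_th
affect!(integrator) = (integrator.u = integrator.p.v_rest)
cb = ContinuousCallback(condition, affect!)

prob = ODEProblem(f, p.v_rest, (0.0, 200.0), p)
sol = solve(prob, Tsit5(), callback = cb)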

For my own research I usually use custom code, like this event-based integrator, or some recent work (not public) with stochastic binary neurons, where I use automatic differentiation with the straight-through estimator.
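In case it’s useful, the straight-through estimator boils down to a custom reverse rule that treats the sampling step as the identity; a minimal sketch (my own illustration, not code from any package):

using Zygote, ChainRulesCore

# stochastic binary neuron: fires (1) with probability p, elementwise
binary_sample(p) = Float32.(rand(Float32, size(p)) .< p)

# straight-through estimator: the backward pass pretends the sampling
# step is the identity, so the incoming gradient passes through unchanged
function ChainRulesCore.rrule(::typeof(binary_sample), p)
    pullback(y) = (NoTangent(), y)
    return binary_sample(p), pullback
end

sigmoid(x) = 1f0 ./ (1f0 .+ exp.(-x))
u = randn(Float32, 5)
g = Zygote.gradient(v -> sum(binary_sample(sigmoid(v))), u)[1]  # nonzero thanks to the STE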

What kind of spiking neural network models are you interested in?

Some time ago I wrote GitHub - Datseris/SpikeSynchrony.jl: Julia implementations for measuring distances, synchrony and correlation between spike trains

If you can use it in any way, please go ahead. I’m happy to transfer it to a different org as well.

That was exactly my thought; I am surprised that not much work has been done with AD and the ODE libraries.
I am interested in playing with recent learning algorithms like e-prop and surrogate gradients (SG) and with different neuron models. Ideally something based on OrdinaryDiffEq, but I am not sure how big a network that could simulate.

Sounds interesting. Looking forward to hearing more about this in the future.

For the simple (forward Euler) integration used by Bellec et al. in the e-prop paper you don’t necessarily need OrdinaryDiffEq, but I think you could use it (with Euler) without any performance penalty. The pseudo-derivatives of Bellec et al. are easy to implement with Zygote, e.g.

using Zygote, ChainRulesCore

# Heaviside spike nonlinearity
spike(u) = u .> 0

# pseudo-derivative of Bellec et al.: 0.3 * max(0, 1 - |u|) in place of the
# step function's zero-almost-everywhere derivative
function ChainRulesCore.rrule(::typeof(spike), u)
    output = spike(u)
    function spike_pullback(y)
        return NoTangent(), @.(y * 0.3 * max(0, 1 - abs(u)))
    end
    return output, spike_pullback
end
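With that rule in place, gradients flow through the hard threshold, e.g.

u = [-2.0, -0.5, 0.5, 2.0]
g = Zygote.gradient(v -> sum(spike(v)), u)[1]
# g ≈ [0.0, 0.15, 0.15, 0.0]: nonzero only inside the |u| < 1 window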

I think you just don’t need an SNN library for what you want in Julia. If you use the CPU only, you just need an array of your structs and to apply the algorithm to them (see the toy sketch below); an SNN doesn’t take many lines of code once the ODE part is handled well.
If you use CUDA, it’s still super easy to write kernels with CUDA.jl.
If you use an FPGA or some spike-based hardware, I can’t help there…
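A toy sketch of that array-of-structs approach (all parameters made up, nothing framework-specific):

mutable struct LIF
    v::Float64        # membrane potential
    thresh::Float64   # firing threshold
end

# advance one neuron by one time step given its summed input current
function step!(n::LIF, input; leak = 0.9)
    n.v = leak * n.v + input
    fired = n.v >= n.thresh
    fired && (n.v = 0.0)   # reset on spike
    return fired
end

neurons = [LIF(0.0, 1.0) for _ in 1:100]
spikes = [step!(n, 0.2 * rand()) for n in neurons]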

I wrote WaspNet.jl, but it’s not something I use for work anymore. There is a partially implemented DifferentialEquations.jl branch for the integrator, but I never got it merged into master. I gave a small talk on the project at JuliaCon 2020, and Chris Rackauckas helped me get DiffEq up and running afterwards. With WaspNet.jl we were trying to support spiking neurons, non-spiking neurons, and an idea about “wrapped neurons” with functions applied to the spiking outputs, so DiffEq.jl was a bit harder to work with in the latter cases (or I was just using it wrong!).

I’ve been thinking about testing the DiffEq branch and merging it into master; if there’s interest, I could see about doing that in the near future.

The DiffEq branch is here: GitHub - leaflabs/WaspNet.jl at diffeq-integration


I am interested in playing with different types of neurons, and it would be very natural to capture their behavior with ODEs. I’m not saying it has to be done this way, and I don’t know at what point this would run into performance issues. Right now I’m just surveying what’s available.

In my framework, I define each type of neuron by its output effect (and all neurons are affected the same way by every kind of neuron). With Julia’s multiple dispatch, I can implement this as an apply() method dispatched on the neuron type.
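Roughly like this (a much simplified sketch of the idea, not my actual code):

abstract type Neuron end
struct Excitatory <: Neuron end
struct Inhibitory <: Neuron end

# a neuron type is defined purely by its output effect; multiple dispatch
# selects the right effect from the presynaptic neuron's type
apply(::Excitatory, v, w) = v + w   # depolarize the target potential
apply(::Inhibitory, v, w) = v - w   # hyperpolarize the target potential

v = 0.0
v = apply(Excitatory(), v, 0.5)
v = apply(Inhibitory(), v, 0.2)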
For my current work I care more about computational function than biological accuracy, so I chose to ignore some temporal features of real neurons and use a time-step-based model. The obvious benefit is that I can easily run networks of at least 10,000 neurons on a single 2080 Ti GPU, taking about 1–2 ms for all neurons to run a single step. If you treat one step as 1 ms, that is almost real-time speed, so maybe in the future I can plug in a live camera for it to see the world. (I haven’t really tried to scale to the limit, so I don’t know how big it can get.)

For me, large-scale computational function is more interesting than running accurate simulations of hundreds of neurons…
