Neuroscientists using DifferentialEquations.jl ecosystem?

I’ve been contributing to SpikingNeuralNetworks.jl, which has a few different spiking neuron types available.


That’s really impressive! A lot of these formulas are just outside the reach of macro-scale analysis with EEG (which is more along the lines of what I do). However, it would be interesting to see how well certain frequency domains could be characterized by these. Theoretically, some of these domains are supposed to represent specific cell types.

That looks really solid! I hope I find an excuse to play with it : )

Indeed, the implementation of the equations isn’t such a big deal (although it can get more elaborate once you start adding cell geometry and networks).

I’m curious whether anyone has attempted using Hodgkin-Huxley or related models as an input layer to DiffEqFlux, and subsequently training on real data to learn the coefficients. If someone has tried it or has some experience, it would be great to hear how it went for them. If not, I may just give it a go by my lonesome :nerd_face:
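For concreteness, the sort of workflow I’m imagining looks roughly like this, with a toy one-state model standing in for Hodgkin-Huxley (a hedged sketch using Zygote plus the SciML sensitivity machinery that DiffEqFlux builds on; the model and coefficients are purely illustrative):

```julia
using OrdinaryDiffEq, SciMLSensitivity, Zygote

# Toy one-state stand-in for a conductance model; p holds the
# coefficients we would like to learn from data.
f!(du, u, p, t) = (du[1] = p[1] * u[1] - p[2] * u[1]^3)
ts = 0.0:0.1:5.0

# Synthetic "recorded" data generated from known coefficients
true_prob = ODEProblem(f!, [0.5], (0.0, 5.0), [1.5, 2.0])
data = Array(solve(true_prob, Tsit5(); saveat = ts))

# L2 loss of candidate parameters against the data
prob = ODEProblem(f!, [0.5], (0.0, 5.0), [1.0, 1.0])
loss(p) = sum(abs2, Array(solve(prob, Tsit5(); p = p, saveat = ts)) .- data)

grad = Zygote.gradient(loss, [1.0, 1.0])[1]  # feed this to any optimiser
```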

Also, thanks for the link @jpsamaroo interesting indeed!


Anything new on this front? I am interested in a Julia alternative to the Brian simulator.

Nothing that I am aware of, and I am from this field. It should not be too difficult to write a Julia replacement, though. Same for Neuron. I know a couple of people who wrote a grant about writing a Julia analog of Neuron; not sure they got funded.

What is it that makes you turn away from Brian?

I love Brian actually. However, we are modeling a compartmentalized system with astrocytes, using a numerical technique that makes it difficult to work within the Synapse/NeuronGroup framework Brian provides. I had to create an artificial synaptic group and control the order of execution just to implement the numerical approach. This was unnatural, to say the least, and unmaintainable. So I keep searching. Also, I’ve fallen in love with Julia lately. :slight_smile:


SpikingNeuralNetworks is a really good start on this. The thing I found missing when I briefly played with it was easily composing networks. I have a heap of code that binds LightGraphs to a metadata structure that is more suitable for this sort of thing. It’s littered with notes and untested right now, though. I was hoping to have something out weeks ago (nothing official, just to get some conversation going), but some events in New York state have disrupted … everything.

Edit: by “more suitable for this sort of thing” I just mean composing networks, not the actual neuron attributes that SpikingNeuralNetworks does.


The thing that worries me about SpikingNeuralNetworks (and it should be a fairly simple addition) is that the time stepper is not that great. It should perhaps offer the choice of using OrdinaryDiffEq.jl. For example, the Izhikevich neuron blows up in finite time…

Hence the precision of the spike times is not great.
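For what it’s worth, handling the blow-up with OrdinaryDiffEq.jl is straightforward: integrate the Izhikevich equations with an adaptive solver and do the spike reset in a callback, so the solver never steps through the divergence. A sketch with standard regular-spiking parameters (this is not SNN.jl’s API, just plain OrdinaryDiffEq):

```julia
using OrdinaryDiffEq

# Izhikevich model: dv/dt = 0.04v^2 + 5v + 140 - w + I, dw/dt = a(bv - w)
function izh!(du, u, p, t)
    v, w = u
    a, b, I = p
    du[1] = 0.04v^2 + 5v + 140 - w + I
    du[2] = a * (b * v - w)
end

# Spike detection and reset via a continuous callback, so the solver
# locates the threshold crossing instead of integrating past it
condition(u, t, integrator) = u[1] - 30.0
function affect!(integrator)
    integrator.u[1] = -65.0   # c: reset potential
    integrator.u[2] += 8.0    # d: recovery increment
end
cb = ContinuousCallback(condition, affect!)

prob = ODEProblem(izh!, [-65.0, -13.0], (0.0, 200.0), (0.02, 0.2, 10.0))
sol = solve(prob, Tsit5(); callback = cb)
```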


Agreed, I think it’d be ideal to convert SNN (SpikingNeuralNetworks.jl) to use OrdinaryDiffEq as the integrator, especially since that would potentially net us great GPU support. I have somewhat limited time right now, but if someone feels like putting together a PR to convert some or all of the models to use OrdinaryDiffEq, I’d be happy to review and merge.


Would you rate this as a “beginner” friendly task?

I think that it’s a pretty simple task from my standpoint, since the current API for each neuron model’s integrate! call looks reasonably similar to a DiffEq solve call. I can help you debug any issues you run into while doing this, of course.

I am going to give it a try!
Hopefully, I can start to work on it tomorrow.


I’m doing exactly this for my final year uni project! It’s largely motivated by a different design principle from Brian. Brian wants the user to express their neuron through a series of differential equations as strings, which makes sense for a mathematician but feels very wrong to a programmer, and somewhat loses the representation of the biology behind a neuron. My goal is to build some package(?) in which the user creates an H-H style neuron as a struct, adding channels and gating variables with the necessary parameters; the resulting function is then built from the structure and parameters and passed to an ODEProblem call. In this way the biological meaning is there, and from a programming perspective having the differential equation as a function built from the structure makes a lot more sense.
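Very roughly, the struct-plus-builder design I have in mind looks something like this (purely illustrative names, not an existing package API; only a leak channel is shown):

```julia
# Hypothetical channel description: conductance, reversal potential,
# and a function (V, gates) -> membrane current for that channel
struct Channel
    g::Float64
    E::Float64
    current::Function
end

# A neuron is a capacitance plus a collection of channels
struct Neuron
    C::Float64
    channels::Vector{Channel}
end

# Build the membrane equation dV/dt from the structure: sum the channel
# currents and divide by capacitance. The returned closure is what
# would eventually be handed to an ODEProblem.
function build_rhs(n::Neuron)
    (V, gates, I_ext) ->
        (I_ext - sum(ch.current(V, gates) for ch in n.channels)) / n.C
end

# Leak channel with Hodgkin-Huxley-style parameters (g_L, E_L)
leak = Channel(0.3, -54.4, (V, _) -> 0.3 * (V - (-54.4)))
cell = Neuron(1.0, [leak])
rhs = build_rhs(cell)
```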
I had hoped to integrate it with Unitful.jl in order to have the same unit-checking behaviour as Brian, and to include some other model types. However, progress has been significantly hampered by the whole global pandemic thing, so that may be overly optimistic. I certainly intend to continue work on it after I hand in whatever mess I manage to create in time for the end of the academic year, though.
If I manage to not totally fail my degree I might have something worth showing in a couple of months here.


ModelingToolkit might be the thing to target now, since it has fancy component support and unit checking stuff. It’s not complete yet, but it’ll keep growing. I think @asinghvi17 has been doing some things along those lines.

I was using ModelingToolkit, and had a system set up, but for some reason, even in a simple example, the results from the hand-spun system and the MTK one were pretty different, and the solving was also pretty slow (understandably, since the constraints caused the final equation to be a DAE rather than an ODE).

After my exams end, my plan is to look into constructing the system at the expression level, by appending coupling terms to the equations. Since the metadata of the system (Jacobian, Wfact, etc.) does not seem to be constructed eagerly, I think it should be possible to construct the ODE systems and then append to them.

I’m using the FitzHugh-Nagumo model, by the way, which is simpler than most of the more commonly used ones but still reasonably accurate.
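For reference, FitzHugh-Nagumo is only two equations, so the hand-spun version is a few lines with OrdinaryDiffEq (standard textbook parameters shown here):

```julia
using OrdinaryDiffEq

# FitzHugh-Nagumo: a two-variable reduction of Hodgkin-Huxley.
# v is the fast membrane variable, w the slow recovery variable.
function fhn!(du, u, p, t)
    v, w = u
    a, b, τ, I = p
    du[1] = v - v^3 / 3 - w + I
    du[2] = (v + a - b * w) / τ
end

# a = 0.7, b = 0.8, τ = 12.5, I = 0.5: a commonly used oscillatory regime
prob = ODEProblem(fhn!, [-1.0, 1.0], (0.0, 100.0), (0.7, 0.8, 12.5, 0.5))
sol = solve(prob, Tsit5())
```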

Are you referring to unit support with that comment, or just that it’s generally not ready for prime-time use?

Not complete yet and not ready for prime-time use are quite different things. It’s ready for people to use, and a lot of infrastructure is built on it (ParameterizedFunctions.jl is one use, and Pumas is commercial software that utilizes it under the hood). That said, there are still a lot of things to add, like automating the algebraic simplifications, automatic GPU support, etc., so there is still quite a gap between where we are and where I want it to be. The current interface is stable, though, and there are ways to do things like unit support; it’s just not the easiest to tell MTK what your units are (exposing type definitions at the higher level is one thing we plan to add fairly soon).

There’s some discussion going on about specifying systems using LightGraphs here.


I’ve done this before as a coding exercise. You could use DiffEqFlux to get gradients of the model wrt some loss function. Or, more simply, any of the techniques here.
DiffEqParamEstim.jl is also a nice package for this.
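As a hedged sketch of what that fitting loop looks like, hand-rolled with Optim.jl on a toy model standing in for a conductance model (DiffEqParamEstim’s `build_loss_objective` packages up essentially this same pattern):

```julia
using OrdinaryDiffEq, Optim

# Toy one-state model; the two entries of p are the coefficients
# we want to recover from the data.
f!(du, u, p, t) = (du[1] = p[1] * u[1] - p[2] * u[1]^3)
ts = 0.0:0.1:5.0

# Synthetic data from known coefficients [1.5, 2.0]
true_prob = ODEProblem(f!, [0.5], (0.0, 5.0), [1.5, 2.0])
data = Array(solve(true_prob, Tsit5(); saveat = ts))

# L2 objective over the saved time points (see the caveat below about
# L2 losses on genuinely spiking data)
prob = ODEProblem(f!, [0.5], (0.0, 5.0), [1.0, 1.0])
loss(p) = sum(abs2, Array(solve(prob, Tsit5(); p = p, saveat = ts)) .- data)

# Gradient-free fit starting from an initial guess
res = Optim.optimize(loss, [1.0, 1.0], NelderMead())
```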

One tip: don’t use an L2 loss on your data. A slight phase lag in a spike results in a huge L2 difference vs the nominal spike. So the loss function is badly conditioned, and optimisers converge more slowly.

A well-conditioned loss for spiking models is: calculate the current you would have to inject at any point in time to make $\dot{V}(t)$ of the parameterised model match that of the nominal model, i.e.

$$I_{inj}(t) = \dot{V}(t) - \dot{V}^*(t)$$

Then take the L2 integral of this current:

$$\text{Cost} = \int_0^T I_{inj}(t)^2 \, dt$$

i.e. how hard would I have to push charge in/out of the membrane to make the model dynamics match the nominal dynamics.
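On sampled derivatives, this cost is a few lines of plain Julia (trapezoidal integration over the sample times; variable names are just illustrative):

```julia
# dVdt_model, dVdt_nominal: the candidate and nominal dV/dt sampled at
# times ts. Returns the trapezoidal approximation of ∫ I_inj(t)^2 dt.
function injected_current_cost(dVdt_model, dVdt_nominal, ts)
    I_inj = dVdt_model .- dVdt_nominal
    sq = I_inj .^ 2
    sum(0.5 .* (sq[1:end-1] .+ sq[2:end]) .* diff(ts))
end
```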

Scientifically speaking, it might be useful to make your model behave like the data. However, I don’t think there is any point in trying to infer biophysical parameters (e.g. conductances) by model fitting. It’s a hugely ill-conditioned problem for these conductance-based models. You give me a model and a set of parameters that fit the data; I will give you a completely different set of parameters (orders of magnitude apart) that also fit the data. Hence the parameter values themselves are meaningless (although the capability of the model to fit the data may be meaningful). (This might be a bit of a generalisation.)

Good luck!