Using Flux for a neural net solution to differential equations

I have recently started using Flux to construct a physics-informed neural network. To do this, I start with a model:

u = Chain(Dense(1, 10), Dense(10, 15), Dense(15, 1))

Then I construct a function to impose the differential condition y'(t) - y(t) = 0:

function f(t)
    y = sum(u(t))                              # network output at t
    dy = gradient(() -> u(t)[1], params(t))    # derivative of u with respect to t
    return dy[t][1] - y                        # residual y'(t) - y(t)
end

with the cost function

cost(t) = abs(f(t))^2

I would like to minimize this cost with respect to the parameters of u, but when I call

grad = gradient(() -> cost(t), params(u))

I get a long stacktrace starting with "ERROR: Can't differentiate foreigncall expression". The full stacktrace is in this gist.
I am having trouble figuring out what is causing the error. I want to train a network u(x) to respect the condition u'(x) - u(x) = 0, but I don't know how to specify this with Flux. If anyone has any suggestions, I would greatly appreciate them.

I believe the error comes from the Dict access on the last line of f(): dy[t], which looks up the derivative with respect to t in the returned Grads object, is effectively a dictionary lookup, and Zygote cannot differentiate through that. I am surprised this doesn't work, but I don't know how to fix it either.
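One way to sidestep it, though, is to avoid the nested gradient altogether. Purely as a sketch under assumptions of my own (the tanh activations, the step size ε, and the u(0) = 1 pinning term are not from the original post), you could approximate u'(t) with a central finite difference, so the residual only involves plain forward passes that Zygote differentiates without trouble:

using Flux

u = Chain(Dense(1, 10, tanh), Dense(10, 15, tanh), Dense(15, 1))

# Central finite difference in place of the nested gradient call.
const ε = 1f-3
residual(t) = (sum(u([t + ε])) - sum(u([t - ε]))) / (2ε) - sum(u([t]))

# Penalize the residual at collocation points; the u(0) = 1 term pins the
# boundary so the trivial solution u ≡ 0 is excluded.
cost(ts) = sum(abs2, residual.(ts)) + abs2(sum(u([0f0])) - 1)

ts = rand(Float32, 100)                                 # points in [0, 1]
grads = gradient(() -> cost(ts), params(u))             # no foreigncall error
Flux.Optimise.update!(ADAM(1f-2), params(u), grads)     # one training step

Note that a chain of purely linear Dense layers, as in the original model, is affine and can only satisfy y' - y = 0 trivially, which is why activations are added here.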

Rather than imposing a derivative constraint, you might try an integral constraint with a differential equation solver. The DiffEqFlux.jl package was written for exactly this kind of exploration and might give you some ideas.
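For a concrete picture of that pattern, here is a rough sketch (the layer sizes, time grid, and training loop are my own illustrative choices, and the NeuralODE/Tsit5 API may differ across package versions): the network plays the right-hand side of the ODE, the solver integrates it, and the loss constrains the resulting trajectory instead of a pointwise derivative.

using DiffEqFlux, OrdinaryDiffEq, Flux

# The network defines du/dt; the solver produces the trajectory.
dudt = Chain(Dense(1, 16, tanh), Dense(16, 1))
ts = 0.0f0:0.1f0:1.0f0
node = NeuralODE(dudt, (0.0f0, 1.0f0), Tsit5(), saveat = ts)

u0 = Float32[1.0]                      # initial condition y(0) = 1
target = exp.(ts)                      # exact solution of y' = y
loss() = sum(abs2, vec(Array(node(u0))) .- target)

# Depending on the package version, training goes through Flux.train!
# over params(node) or through DiffEqFlux.sciml_train.
Flux.train!(loss, Flux.params(node), Iterators.repeated((), 200), ADAM(0.05))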

Alright, I will check out DiffEqFlux.jl, thank you!

Look at NeuralNetDiffEq.jl, which is a repository of physics-informed neural network implementations and deep BSDE methods.
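For a flavor of its interface, here is a sketch in the style of the package README for solving y' = y directly (the NNODE constructor and the solve keywords are quoted from memory and may have changed since):

using NeuralNetDiffEq, DiffEqBase, Flux

# Solve y'(t) = y(t), y(0) = 1 with a small network as the trial solution.
rhs(u, p, t) = u
prob = ODEProblem(rhs, 1.0f0, (0.0f0, 1.0f0))

chain = Chain(Dense(1, 5, σ), Dense(5, 1))
sol = solve(prob, NeuralNetDiffEq.NNODE(chain, ADAM(0.1)),
            dt = 1/20f0, maxiters = 200)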


Hi all, I also ran into this issue. I opened an issue on Flux and it was solved; see here for my Discourse question and here for the answered Flux issue.