Lux.jl demo with Lotka-Volterra with UODE

Is there a tutorial or example of a universal differential equation, one with a mixture of deterministic terms and neural networks on the RHS, something like the Lotka-Volterra equations where the nonlinear terms are modeled with neural networks? There is such a demo using Flux.jl, and I have run it. Given that the SciML ecosystem is moving towards Lux.jl, I would like to see a working example of the Lotka-Volterra equations solved using Lux.jl, something like:

dx/dt = -x + NN[1]
dy/dt =  y + NN[2]

where NN is a neural network with two inputs and two outputs. Lux.jl is sufficiently different from Flux.jl that such an example would be useful. If such an example already exists, could you please email a link? Thanks.
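For concreteness, here is the kind of code I am imagining, a hedged sketch based on my understanding of the Lux API; the network size, initial conditions, and solver choice are placeholders, not anything from an official tutorial:

```julia
using Lux, Random, ComponentArrays, OrdinaryDiffEq

rng = Random.default_rng()

# Two-input, two-output network standing in for the unknown interaction terms
NN = Lux.Chain(Lux.Dense(2, 16, tanh), Lux.Dense(16, 2))
ps, st = Lux.setup(rng, NN)   # Lux keeps parameters and state explicit
ps = ComponentArray(ps)       # flat parameter vector, convenient for optimizers

function ude!(du, u, p, t)
    nn = NN(u, p, st)[1]      # Lux models return (output, state)
    du[1] = -u[1] + nn[1]
    du[2] =  u[2] + nn[2]
end

prob = ODEProblem(ude!, [1.0, 1.0], (0.0, 1.0), ps)
sol = solve(prob, Vern7(), abstol = 1e-6, reltol = 1e-6)
```

The point of the explicit `ps`/`st` is that the same parameter vector can then be handed to Optimization.jl for training.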

Also, what is the recommended use of Flux.jl versus Lux.jl in Scientific Machine Learning at this time?

Basically: remember, we are in the middle of launching our new documentation. This new tutorial is already written, but it’s having deployment issues, so hopefully it will be online tomorrow. You’ll be able to find it here

where right now it’s a link to an older version before the explanation was written in. A much more complete version (SciMLDocs/ at main · SciML/SciMLDocs · GitHub) should go online tomorrow if we fix the keys issue.

Extra Details You May Not Need

I’m trying not to give too many details because I’m instead trying to get this all done and have the complete new documentation launched by the end of the year, but basically you can see we are in the last stages.

With all of that almost completed, we just set the old links to redirect to the new websites (DifferentialEquations.jl: Scientific Machine Learning (SciML) Enabled Simulation and Estimation · DifferentialEquations.jl) on Monday. We hit a bit of a snag because we had to move the big CPU and GPU tutorial runs to dedicated hardware, and those are hitting some deploy key issues, so they aren’t deploying. That should all be figured out by tomorrow, and by Wednesday we should be able to declare that all tutorials are being actively live-generated and tested.

Then the last pieces are to:

  • write a few of the remaining tutorials (I know there are a few blank pages in the getting started section…)
  • restructure a few according to the ModelingToolkit PR’s specifications
  • move the UDE example code and set up a section in the SciMLSensitivity docs with them
  • set up the downstream doctesting system

and then we are good to go. I’d call that “SciML v1.0”, and I want it done by the end of the month. With that all together, no code change should ever merge that unknowingly breaks any example or tutorial, and if one changes, it would require an immediate update. This has been a huge undertaking, and requires a huge amount of compute for a system this large, but it’s so close…

So anyways, tl;dr: that page exists, but we need to fix some deployment keys. We will put out a blog post when our documentation overhaul is complete.


Thanks. I can wait a few days, obviously. As I am sure you realize, there is no way for me to know what state the transition is in. Cheers, Gordon.

Yup no worries. It’s hard to communicate it, especially as we just set the links to start redirecting this week. But it’ll take a bit for the new sites to roll out. When they are out, this will all have very clear answers, and so I’m just going to hold off until that’s done.

Great docs, Chris! How many people are helping out? The sheer magnitude of the effort is mind-boggling. I like the Lux + diffeq example; it is how I implemented mine. Next week I hope to find out more about a strange error I get when embedding functionality within my own functions. But I recognize your lack of time!


You pointed out the other day the issue with having too low a tolerance when integrating ODEs together with saveat. Here is a line from your own code that you referred me to:
(URL: Lux.jl demo with Lotka-Voltera with UODE - #4 by ChrisRackauckas)

solution = solve(prob, Vern7(), abstol=1e-12, reltol=1e-12, saveat = 0.1)

I realize this is not the official release. Where should I report these types of issues, if not here? Thanks.

For now just here, or GitHub - SciML/SciMLDocs: Global documentation for the Julia SciML Scientific Machine Learning Organization when it releases. I haven’t been able to see the full results of that tutorial yet because the deployment isn’t working, but I think the DataDrivenDiffEq part isn’t completely updated to DDQ v1.0 yet.

Again, I am just reporting what I see, Chris. I assumed there was an issue, but I figured that the more information you get, the easier it “might” be to debug.



Who wrote the Lux.jl demo with the Lotka-Volterra equations? I almost got it working, but then I decided to learn more about package and module management, and found that AutoZygote could not be found. After some research, I think I found the reason. The code you referred me to has the following statements near the top:

# SciML Tools
using OrdinaryDiffEq, ModelingToolkit, DataDrivenDiffEq, SciMLSensitivity, DataDrivenSparse
using Optimization, OptimizationOptimisers, OptimizationOptimJL

# Standard Libraries
using LinearAlgebra, Statistics, Random

# External Libraries
using ComponentArrays, Lux, Plots

All looks good, but consider Optimization.jl: in that package, one can find an __init__ function with the following:

function __init__()
    # AD backends
    @require FiniteDiff="6a86dc24-6348-571c-b903-95158fe2bd41" include("function/finitediff.jl")
    @require ForwardDiff="f6369f11-7733-5829-9624-2563aa707210" include("function/forwarddiff.jl")
    @require ReverseDiff="37e2e3b7-166d-5795-8a7a-e32c996b4267" include("function/reversediff.jl")
    @require Tracker="9f7883ad-71c0-57eb-9f7f-b5c9e6d3789c" include("function/tracker.jl")
    @require Zygote="e88e6eb3-aa80-5325-afca-941959d7151f" include("function/zygote.jl")
    @require ModelingToolkit="961ee093-0014-501f-94e3-6117800e7a78" include("function/mtk.jl")
end

This function gets executed after Optimization.jl is loaded. The line with @require Zygote ... requires the package Zygote to be loaded as well (as far as I understand it), which it is not in the code you referred me to. So I added using Zygote to my Julia code, restarted Julia, and all worked as expected.
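As I understand it, the mechanism can be sketched like this; `DemoPkg` and the `zygote_loaded` flag are made up for illustration, only the `@require` pattern itself comes from Optimization.jl:

```julia
module DemoPkg
    using Requires

    const zygote_loaded = Ref(false)

    function __init__()
        # This body runs only if the surrounding session loads Zygote;
        # DemoPkg itself never lists Zygote in its Project.toml.
        @require Zygote="e88e6eb3-aa80-5325-afca-941959d7151f" (zygote_loaded[] = true)
    end
end

DemoPkg.zygote_loaded[]   # stays false until someone runs `using Zygote`
```

So the conditional code is simply never activated unless the *user's* environment pulls Zygote in.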

So my question is: why wasn’t Zygote required in the code you sent me? Probably there is more that I do not understand.

Thanks for any insight!


It was a mixture of @Julius_Martensen and me. The last piece is in a PR:

Interesting that you added “using Zygote.” Was it for the reason I identified? This raises the question: normally, users would not go to the source code and find the __init__ function. How would they be able to figure out that Zygote must be included? At some level, Julia has become too “cool”: so much gets hidden behind slick interfaces that when something goes wrong, it is hard to figure out. Isn’t that partly why you are transitioning to Lux from Flux, in order to have more control over the parameters and what happens to them? At least that is part of the stated motivation for Lux.jl.

When I use Python, I rarely do “from module import *”. Rather, I do “import xxxxx as yy”, and use the yy prefix on all my calls. I find that this makes it easier to track what is going on in complex code. That is not the practice in the Julia community. Why is that? Thanks.
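For what it’s worth, the qualified style does work in Julia (`import ... as` needs Julia 1.6+); it just isn’t the community default:

```julia
import LinearAlgebra as LA   # analogous to Python's `import numpy as np`
import Statistics as St

v = [3.0, 4.0]
LA.norm(v)   # → 5.0
St.mean(v)   # → 3.5
```

With `import` (rather than `using`), nothing is brought into scope unqualified, so every call site shows which package it comes from.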

I think the short answer is that perfect is the enemy of good, and for better or worse Julia tends to be willing to forgo common features if it cannot come to consensus on the cleanest / most consistent API by which to implement them.

I will make no comment on what is the right way to handle module and namespace inclusion, but you may be interested to see that this question is actually one of the oldest open issues (#4600), and there was similarly heated discussion here.

Zygote is only used internally by Optimization.jl; it’s not exposed to the user in this example at all. You aren’t calling anything from Zygote, so there is nothing to see from Zygote. The things in Zygote, like Zygote.gradient, just aren’t in this example at all, so there’s no mention of Zygote.
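For reference, direct use is what would make `using Zygote` appear in a script; a minimal example:

```julia
using Zygote   # only needed because we call Zygote.gradient directly

# Zygote differentiates ordinary Julia functions; d/dx (3x^2 + 2x) at x = 5 is 32
Zygote.gradient(x -> 3x^2 + 2x, 5.0)   # (32.0,)
```

Nothing like that appears in the tutorial, which is why Zygote isn’t mentioned there.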

Yes, I agree that Zygote is not directly exposed to the user. However, it seems to be more complicated than that. I am referring to the code you referred me to.

You include the following modules:

# SciML Tools
using OrdinaryDiffEq, ModelingToolkit, DataDrivenDiffEq, SciMLSensitivity, DataDrivenSparse
using Optimization, OptimizationOptimisers, OptimizationOptimJL

# Standard Libraries
using LinearAlgebra, Statistics, Random

# External Libraries
using ComponentArrays, Lux, Plots

Lower in the code, there is the line:

# First train with ADAM for better convergence -> move the parameters into a
# favourable starting position for BFGS
adtype = Optimization.AutoZygote()

What happens when Optimization.AutoZygote() is called? One gets an error (at least I get one now, and I understand it): AutoZygote requires Optimization.jl to have access to Zygote.jl. I say this because I forked Optimization.jl and searched for all references to Zygote, and found the following:

➜  Optimization.jl git:(master) findh '*.jl' Zygote
2:using ForwardDiff, Zygote, ReverseDiff, FiniteDiff, Tracker
109:optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
110:optprob = Optimization.instantiate_function(optf, x0, Optimization.AutoZygote(), nothing)
58:                              Optimization.AutoZygote())
2:using FiniteDiff, ForwardDiff, ModelingToolkit, ReverseDiff, Tracker, Zygote
12:             ForwardDiff, ModelingToolkit, ReverseDiff, Tracker, Zygote],
1:using OptimizationOptimJL, OptimizationOptimJL.Optim, Optimization, ForwardDiff, Zygote,
75:    optprob = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
86:    optprob = OptimizationFunction((x, p) -> -rosenbrock(x, p), Optimization.AutoZygote())
99:    optprob = OptimizationFunction((x, p) -> -rosenbrock(x, p), Optimization.AutoZygote(),
1:using OptimizationNLopt, Optimization, Zygote
10:    optprob = OptimizationFunction((x, p) -> -rosenbrock(x, p), Optimization.AutoZygote())
15:    optprob = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
1:using OptimizationMOI, Optimization, Ipopt, NLopt, Zygote, ModelingToolkit
37:    optprob = OptimizationFunction((x, p) -> -rosenbrock(x, p), Optimization.AutoZygote())
43:    optprob = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
1:using OptimizationNonconvex, Optimization, Zygote, Pkg
9:    optprob = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
3:using Zygote
27:    optprob = OptimizationFunction(sumfunc, Optimization.AutoZygote())
2:AutoZygote <: AbstractADType
11:This uses the [Zygote.jl]( package.
14:forward-over-reverse mixing ForwardDiff.jl with Zygote.jl
23:Hessian is not defined via Zygote.
25:struct AutoZygote <: AbstractADType end
27:function instantiate_function(f, x, adtype::AutoZygote, p, num_cons = 0)
28:    num_cons != 0 && error("AutoZygote does not currently support constraints")
34:                                    res .= Zygote.gradient(x -> _f(x, args...), θ)[1]
42:                Zygote.gradient(x -> _f(x, args...), θ)[1]
27:    @require Zygote="e88e6eb3-aa80-5325-afca-941959d7151f" include("function/zygote.jl")

The last line,

     @require Zygote="e88e6eb3-aa80-5325-afca-941959d7151f" include("function/zygote.jl")

is the key. While the user is not exposed to Zygote.jl, Optimization.jl is exposed to it, in the __init__ function. But Zygote.jl cannot be loaded unless Zygote is in either my Project.toml or the Project.toml of Optimization.jl. Looking at the Optimization module, we find that its Project.toml does not include Zygote.jl:

name = "Optimization"
uuid = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
version = "3.10.0"

[deps]
ArrayInterface = "4fba245c-0d91-5ea0-9b3e-6abc04ee57a9"
ArrayInterfaceCore = "30b0a656-2188-435a-8636-2ec0e6a096e2"
ConsoleProgressMonitor = "88cd18e8-d9cc-4ea6-8889-5259c0d15c8b"
DocStringExtensions = "ffbed154-4ef7-542d-bbb7-c09d3a79fcae"
Logging = "56ddb016-857b-54e1-b83d-db4d58db5568"
LoggingExtras = "e6f89c97-d47a-5376-807f-9c37f3926c36"
Pkg = "44cfe95a-1eb2-52ea-b672-e2afdf69b78f"
Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
ProgressLogging = "33c8b6b6-d38a-422a-b730-caa89a2f386c"
Reexport = "189a3867-3050-52da-a836-e630ba90ab69"
Requires = "ae029012-a4dd-5104-9daa-d747884805df"
SciMLBase = "0bca4576-84f4-4d90-8ffe-ffa030f20462"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
TerminalLoggers = "5d786b92-1e48-4d6f-9151-6b4477ca9bed"

[compat]
ArrayInterface = "6"
ArrayInterfaceCore = "0.1.1"
ConsoleProgressMonitor = "0.1"
DocStringExtensions = "0.8, 0.9"
LoggingExtras = "0.4, 0.5, 1"
ProgressLogging = "0.1"
Reexport = "0.2, 1.0"
Requires = "1.0"
SciMLBase = "1.79.0"
TerminalLoggers = "0.1"
julia = "1.6"

Therefore, my question is: why should the code you sent me work, unless I specifically add Zygote.jl to my own Project.toml to address the omission in Optimization.jl?

I’d be interested to know if I made an error in reasoning. Thanks!

That was already addressed in the source BTW: SciMLDocs/ at main · SciML/SciMLDocs · GitHub

Thanks. I notice in the source link you sent (which is different from the link you sent me earlier, the one I checked before writing my message) that you now include Zygote in the modules to be loaded. Is that what you mean by “addressed in the source”?

But my point still stands. How would the user know to include Zygote.jl if the module is not used directly? The user could simply assume that Zygote was included in the Optimization.jl Project.toml file. I am thinking from the user’s perspective. It would be as if you required the user to add Sundials to the toml file when using DifferentialEquations.jl. You do not require this; rather, you add the package to the toml file. Shouldn’t the optimization package do the same? There would be no memory penalty. It seems to me that you make it harder to debug user code by not including it (I refer back to Zygote.jl).


It’s not.

It’s required if you use AutoZygote

It does the same.

Again, I mentioned in the first post that this is unreleased documentation; I was just pointing you to it early if you wanted a preview. I shared it with you because I thought you wanted to see what’s in progress. What’s in progress is not indicative of what the end result looks like, nor is it necessarily correct. Of course it would not run without adding using Zygote and adding it to the .toml. Please do not use unreleased pieces of documentation that have giant warnings of “this is not released yet” as a sign of the final design before the tutorials even run!
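Once that is fixed, the working pattern is the usual one, something like the following; the Rosenbrock objective and BFGS are chosen just for illustration, they are not the tutorial’s actual setup:

```julia
using Optimization, OptimizationOptimJL
using Zygote   # needed in your environment: AutoZygote() differentiates with Zygote

rosenbrock(x, p) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(optf, zeros(2))
sol = solve(prob, BFGS())   # Zygote-computed gradients drive BFGS
```

Drop the `using Zygote` line (and the Project.toml entry) and `AutoZygote()` has nothing to dispatch to, which is exactly the error being discussed.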

Note that the MIT machines went down this week so while we fixed the deploy keys, the deployment likely won’t work until this weekend.

Ok, thanks. Note that I was not commenting on the tutorial per se, but on the structure of the Optimization package. But I have wasted too much of your time already. I appreciate the help; it has significantly improved my understanding of Julia. I will catch up with you once the software is released, as I still have some questions. Enough for now, though.

Great work!

I will say that if it does work without sticking using Zygote in there, it’s because some other package in the dependency tree is using Zygote. But I would never rely on that, and if you’re using Zygote directly you should depend on it. This is one of the things being addressed by Julia v1.9’s new conditional dependency system.
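Under that system (Julia v1.9’s weak dependencies and package extensions), a package can declare something like the following in its Project.toml; the extension module name here is hypothetical:

```toml
[weakdeps]
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

# The extension is loaded automatically only when the user's
# environment also loads Zygote; no Requires.jl needed.
[extensions]
OptimizationZygoteExt = "Zygote"
```

That makes the conditional dependency explicit and visible to Pkg, instead of hidden inside an `__init__` function.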

Anyways, the complete showcase is “done”, as in it’s doing its build now.

It “should be good”, but since no one will have seen the full build we can only guess. Most likely I’ll need to clean up a few bugs over the weekend after we see what comes out. When this is completed, SciMLDocs will get a v1.0 tag, this will be set to stable, and a blog post will come out. It’s just so on the precipice of being done…