DiffEqFlux.sciml_train giving UndefVarError: ProgressLogging not defined

When running the Optimization of Ordinary Differential Equations tutorial from the DiffEqFlux documentation, I get the following error:

ERROR: LoadError: UndefVarError: ProgressLogging not defined
Stacktrace:
 [1] default_logger(::Atom.Progress.JunoProgressLogger) at C:\Users\jmurp\.julia\packages\DiffEqBase\VoY3t\src\utils.jl:262
 [2] sciml_train(::Function, ::Array{Float64,1}, ::ADAM, ::Base.Iterators.Cycle{Tuple{DiffEqFlux.NullData}}; cb::Function, maxiters::Int64, progress::Bool, save_best::Bool) at C:\Users\jmurp\.julia\packages\DiffEqFlux\FZMwP\src\train.jl:42
 [3] top-level scope at C:\Users\jmurp\OneDrive\Desktop\Research\AstroML\Julia\DiffEqFluxTutorials.jl:45
 [4] include_string(::Function, ::Module, ::String, ::String) at .\loading.jl:1088
in expression starting at C:\Users\jmurp\OneDrive\Desktop\Research\AstroML\Julia\DiffEqFluxTutorials.jl:45

My code is identical to the tutorial:

using DifferentialEquations, Flux, Optim, DiffEqFlux, DiffEqSensitivity, Plots

function lotka_volterra!(du, u, p, t)
    x, y = u
    α, β, δ, γ = p
    du[1] = dx = α * x - β * x * y
    du[2] = dy = -δ * y + γ * x * y
end

# Initial Condition
u0 = [1.0, 1.0]

# Simulation Interval and Intermediary Points
tspan = (0.0, 10.0)
tsteps = 0.0:0.1:10.0

# LV parameter p = [α, β, δ, γ]
p = [1.5, 1.0, 3.0, 1.0]

# Setup and Solve the ODE Problem
prob = ODEProblem(lotka_volterra!, u0, tspan, p)
sol = solve(prob, Tsit5())

# Plot the Solution
plot(sol)
savefig("LV_ode.png")

function loss(p)
    sol = solve(prob, Tsit5(), p = p, saveat = tsteps)
    loss = sum(abs2, sol .- 1)
    return loss, sol
end

callback = function (p, l, pred)
    display(l)
    plt = plot(pred, ylim = (0, 6))
    display(plt)
    # Tell sciml_train not to halt the optimization. If the callback returns
    # true, the optimization stops.
    return false
end

result_ode = DiffEqFlux.sciml_train(loss, p,
                                    ADAM(0.1),
                                    cb = callback,
                                    maxiters = 100)

My code is running on Julia 1.5.0, and my package statuses are:
DifferentialEquations v6.15.0
Flux v0.10.4
Optim v0.21.0
DiffEqFlux v1.17.0
DiffEqSensitivity v6.31.1
Plots v1.6.0

I’ve tried restarting Julia and Atom and building ProgressLogging locally.

This was a bug in the latest release. Sorry, I fixed it up in the release of DiffEqFlux v1.21.0.
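If you are still on the older release, updating the package should pull in the fix. A minimal sketch (the exact version the resolver picks depends on your environment):

```julia
using Pkg
Pkg.update("DiffEqFlux")   # should resolve to v1.21.0 or later, where the fix landed
Pkg.status("DiffEqFlux")   # confirm which version was actually resolved
```

If the resolver refuses to move past the broken version, the output of `Pkg.status()` can show which other dependency is holding it back.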

I got the same error. When I try to add the latest version of DiffEqFlux, I get this compatibility issue:

(@JuliaPro_v1.5.0-1) pkg> add https://github.com/SciML/DiffEqFlux.jl
Updating git-repo https://github.com/SciML/DiffEqFlux.jl
Resolving package versions…
ERROR: Unsatisfiable requirements detected for package Adapt [79e6a3ab]:
Adapt [79e6a3ab] log:
├─possible versions are: [0.3.0-0.3.1, 0.4.0-0.4.2, 1.0.0-1.0.1, 1.1.0, 2.0.0-2.0.2] or uninstalled
├─restricted to versions 1-2 by DiffEqFlux [aae7a2af], leaving only versions [1.0.0-1.0.1, 1.1.0, 2.0.0-2.0.2]
│ └─DiffEqFlux [aae7a2af] log:
│ ├─possible versions are: 1.21.0 or uninstalled
│ └─DiffEqFlux [aae7a2af] is fixed to version 1.21.0
├─restricted by compatibility requirements with CuArrays [3a865a2d] to versions: [1.0.0-1.0.1, 1.1.0]
│ └─CuArrays [3a865a2d] log:
│ ├─possible versions are: [0.2.1, 0.3.0, 0.4.0, 0.5.0, 0.6.0-0.6.2, 0.7.0-0.7.3, 0.8.0-0.8.1, 0.9.0-0.9.1, 1.0.0-1.0.2, 1.1.0, 1.2.0-1.2.1, 1.3.0, 1.4.0-1.4.7, 1.5.0, 1.6.0, 1.7.0-1.7.3, 2.0.0-2.0.1, 2.1.0, 2.2.0-2.2.2] or uninstalled
│ ├─restricted to versions * by an explicit requirement, leaving only versions [0.2.1, 0.3.0, 0.4.0, 0.5.0, 0.6.0-0.6.2, 0.7.0-0.7.3, 0.8.0-0.8.1, 0.9.0-0.9.1, 1.0.0-1.0.2, 1.1.0, 1.2.0-1.2.1, 1.3.0, 1.4.0-1.4.7, 1.5.0, 1.6.0, 1.7.0-1.7.3, 2.0.0-2.0.1, 2.1.0, 2.2.0-2.2.2]
│ ├─restricted by compatibility requirements with GPUArrays [0c68f7d7] to versions: [0.2.1, 2.0.0-2.0.1, 2.1.0, 2.2.0-2.2.2] or uninstalled, leaving only versions: [0.2.1, 2.0.0-2.0.1, 2.1.0, 2.2.0-2.2.2]
│ │ └─GPUArrays [0c68f7d7] log:
│ │ ├─possible versions are: [0.3.0-0.3.4, 0.4.0-0.4.2, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.2, 1.0.0-1.0.4, 2.0.0-2.0.1, 3.0.0-3.0.1, 3.1.0, 3.2.0, 3.3.0, 3.4.0-3.4.1, 4.0.0-4.0.1, 5.0.0, 5.1.0] or uninstalled
│ │ ├─restricted to versions * by an explicit requirement, leaving only versions [0.3.0-0.3.4, 0.4.0-0.4.2, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.2, 1.0.0-1.0.4, 2.0.0-2.0.1, 3.0.0-3.0.1, 3.1.0, 3.2.0, 3.3.0, 3.4.0-3.4.1, 4.0.0-4.0.1, 5.0.0, 5.1.0]
│ │ ├─restricted by compatibility requirements with CUDA [052768ef] to versions: [3.4.0-3.4.1, 4.0.0-4.0.1, 5.0.0, 5.1.0]
│ │ │ └─CUDA [052768ef] log:
│ │ │ ├─possible versions are: [0.1.0, 1.0.0-1.0.2, 1.1.0, 1.2.0-1.2.1, 1.3.0-1.3.3] or uninstalled
│ │ │ ├─restricted to versions * by an explicit requirement, leaving only versions [0.1.0, 1.0.0-1.0.2, 1.1.0, 1.2.0-1.2.1, 1.3.0-1.3.3]
│ │ │ └─restricted by compatibility requirements with Flux [587475ba] to versions: [1.0.0-1.0.2, 1.1.0, 1.2.0-1.2.1, 1.3.0-1.3.3]
│ │ │ └─Flux [587475ba] log:
│ │ │ ├─possible versions are: [0.4.1, 0.5.0-0.5.4, 0.6.0-0.6.10, 0.7.0-0.7.3, 0.8.0-0.8.3, 0.9.0, 0.10.0-0.10.4, 0.11.0-0.11.1] or uninstalled
│ │ │ └─restricted to versions 0.11 by DiffEqFlux [aae7a2af], leaving only versions 0.11.0-0.11.1
│ │ │ └─DiffEqFlux [aae7a2af] log: see above
│ │ └─restricted by compatibility requirements with CuArrays [3a865a2d] to versions: [3.1.0, 3.2.0, 3.3.0, 3.4.0-3.4.1], leaving only versions: 3.4.0-3.4.1
│ │ └─CuArrays [3a865a2d] log: see above
│ └─restricted by compatibility requirements with CUDAdrv [c5f51814] to versions: [0.4.0, 0.5.0, 0.6.0-0.6.2, 0.7.0-0.7.3, 0.8.0-0.8.1, 1.0.2, 1.1.0, 1.2.0-1.2.1, 1.3.0, 1.4.0-1.4.7, 1.5.0, 1.6.0, 1.7.0-1.7.3, 2.0.0-2.0.1, 2.1.0, 2.2.0-2.2.2] or uninstalled, leaving only versions: [2.0.0-2.0.1, 2.1.0, 2.2.0-2.2.2]
│ └─CUDAdrv [c5f51814] log:
│ ├─possible versions are: [0.8.0-0.8.6, 0.9.0, 1.0.0-1.0.1, 2.0.0, 3.0.0-3.0.1, 3.1.0, 4.0.0-4.0.4, 5.0.0-5.0.1, 5.1.0, 6.0.0-6.0.1, 6.1.0, 6.2.0-6.2.3, 6.3.0] or uninstalled
│ ├─restricted to versions * by an explicit requirement, leaving only versions [0.8.0-0.8.6, 0.9.0, 1.0.0-1.0.1, 2.0.0, 3.0.0-3.0.1, 3.1.0, 4.0.0-4.0.4, 5.0.0-5.0.1, 5.1.0, 6.0.0-6.0.1, 6.1.0, 6.2.0-6.2.3, 6.3.0]
│ ├─restricted by compatibility requirements with CUDAnative [be33ccc6] to versions: [0.8.0-0.8.6, 3.0.0-3.0.1, 3.1.0, 4.0.1-4.0.4, 5.0.0-5.0.1, 5.1.0, 6.0.0-6.0.1, 6.1.0, 6.2.0-6.2.3, 6.3.0]
│ │ └─CUDAnative [be33ccc6] log:
│ │ ├─possible versions are: [0.7.0, 0.8.0-0.8.10, 0.9.0-0.9.1, 0.10.0-0.10.1, 1.0.0-1.0.1, 2.0.0-2.0.1, 2.1.0-2.1.3, 2.2.0-2.2.1, 2.3.0-2.3.1, 2.4.0, 2.5.0-2.5.5, 2.6.0, 2.7.0, 2.8.0-2.8.1, 2.9.0-2.9.1, 2.10.0-2.10.2, 3.0.0-3.0.4, 3.1.0, 3.2.0] or uninstalled
│ │ ├─restricted to versions * by an explicit requirement, leaving only versions [0.7.0, 0.8.0-0.8.10, 0.9.0-0.9.1, 0.10.0-0.10.1, 1.0.0-1.0.1, 2.0.0-2.0.1, 2.1.0-2.1.3, 2.2.0-2.2.1, 2.3.0-2.3.1, 2.4.0, 2.5.0-2.5.5, 2.6.0, 2.7.0, 2.8.0-2.8.1, 2.9.0-2.9.1, 2.10.0-2.10.2, 3.0.0-3.0.4, 3.1.0, 3.2.0]
│ │ ├─restricted by compatibility requirements with Adapt [79e6a3ab] to versions: [0.7.0, 0.8.0-0.8.10, 0.9.0-0.9.1, 2.2.1, 2.3.0-2.3.1, 2.4.0, 2.5.0-2.5.5, 2.6.0, 2.7.0, 2.8.0-2.8.1, 2.9.0-2.9.1, 2.10.0-2.10.2, 3.0.0-3.0.4, 3.1.0, 3.2.0] or uninstalled, leaving only versions: [0.7.0, 0.8.0-0.8.10, 0.9.0-0.9.1, 2.2.1, 2.3.0-2.3.1, 2.4.0, 2.5.0-2.5.5, 2.6.0, 2.7.0, 2.8.0-2.8.1, 2.9.0-2.9.1, 2.10.0-2.10.2, 3.0.0-3.0.4, 3.1.0, 3.2.0]
│ │ │ └─Adapt [79e6a3ab] log: see above
│ │ └─restricted by compatibility requirements with CuArrays [3a865a2d] to versions: [3.0.0-3.0.4, 3.1.0, 3.2.0]
│ │ └─CuArrays [3a865a2d] log: see above
│ ├─restricted by compatibility requirements with CUDAapi [3895d2a7] to versions: [0.8.2-0.8.6, 0.9.0, 1.0.0-1.0.1, 2.0.0, 3.0.0-3.0.1, 3.1.0, 4.0.0-4.0.4, 5.0.0-5.0.1, 5.1.0, 6.0.0-6.0.1, 6.1.0, 6.2.0-6.2.3, 6.3.0] or uninstalled, leaving only versions: [0.8.2-0.8.6, 3.0.0-3.0.1, 3.1.0, 4.0.1-4.0.4, 5.0.0-5.0.1, 5.1.0, 6.0.0-6.0.1, 6.1.0, 6.2.0-6.2.3, 6.3.0]
│ │ └─CUDAapi [3895d2a7] log:
│ │ ├─possible versions are: [0.5.0-0.5.4, 0.6.0-0.6.3, 1.0.0-1.0.1, 1.1.0, 1.2.0, 2.0.0, 2.1.0, 3.0.0, 3.1.0, 4.0.0] or uninstalled
│ │ ├─restricted to versions * by an explicit requirement, leaving only versions [0.5.0-0.5.4, 0.6.0-0.6.3, 1.0.0-1.0.1, 1.1.0, 1.2.0, 2.0.0, 2.1.0, 3.0.0, 3.1.0, 4.0.0]
│ │ └─restricted by compatibility requirements with CuArrays [3a865a2d] to versions: [3.0.0, 3.1.0, 4.0.0]
│ │ └─CuArrays [3a865a2d] log: see above
│ └─restricted by compatibility requirements with CuArrays [3a865a2d] to versions: [6.0.1, 6.1.0, 6.2.0-6.2.3, 6.3.0]
│ └─CuArrays [3a865a2d] log: see above
└─restricted by compatibility requirements with CUDA [052768ef] to versions: 2.0.0-2.0.2 — no versions left
└─CUDA [052768ef] log: see above

That’s because CuArrays was replaced by CUDA.jl, and the old CuArrays stack in your environment is pinning the resolver to incompatible versions of Adapt and Flux.
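Since the resolver log shows the deprecated CuArrays/CUDAnative/CUDAdrv stack constraining Adapt and Flux, one way out is to drop those packages so the newer CUDA.jl-based versions can be resolved. A sketch, assuming they are direct dependencies of your active environment:

```julia
using Pkg
# Remove the deprecated GPU stack that pins Adapt/Flux to old versions
for pkg in ["CuArrays", "CUDAnative", "CUDAdrv", "CUDAapi"]
    try
        Pkg.rm(pkg)
    catch
        # ignore packages that are not direct dependencies of this environment
    end
end
Pkg.update()   # re-resolve; DiffEqFlux should now be able to reach the fixed release
```
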

Hello, I’m using Juno’s debugger in DiffEqFlux.sciml_train and I get the same error:

debug> ERROR: UndefVarError: ProgressLogging not defined
Stacktrace:
[1] evaluate_call_recurse!(::Any, ::JuliaInterpreter.Frame, ::Expr; enter_generated::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:212
[2] evaluate_call_recurse! at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:201 [inlined]
[3] eval_rhs(::Any, ::JuliaInterpreter.Frame, ::Expr) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:388
[4] step_expr!(::Any, ::JuliaInterpreter.Frame, ::Any, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:531
[5] step_expr!(::Any, ::JuliaInterpreter.Frame, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:581
[6] finish!(::Any, ::JuliaInterpreter.Frame, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\commands.jl:14
[7] finish_and_return! at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\commands.jl:29 [inlined]
[8] evaluate_call_recurse!(::Any, ::JuliaInterpreter.Frame, ::Expr; enter_generated::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:239
… (the last 7 lines are repeated 5 more times)
[44] evaluate_call_recurse! at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:201 [inlined]
[45] eval_rhs(::Any, ::JuliaInterpreter.Frame, ::Expr) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:388
[46] step_expr!(::Any, ::JuliaInterpreter.Frame, ::Any, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:526
[47] step_expr!(::Any, ::JuliaInterpreter.Frame, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\interpret.jl:581
[48] finish!(::Any, ::JuliaInterpreter.Frame, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\commands.jl:14
[49] finish_and_return!(::Any, ::JuliaInterpreter.Frame, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\commands.jl:29
[50] finish_stack!(::Any, ::JuliaInterpreter.Frame, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\commands.jl:59
[51] debug_command(::Any, ::JuliaInterpreter.Frame, ::Symbol, ::Bool; line::Nothing) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\commands.jl:473
[52] debug_command(::Any, ::JuliaInterpreter.Frame, ::Symbol, ::Bool) at C:\Users\Sebas\.julia\packages\JuliaInterpreter\kDn3E\src\commands.jl:415
[53] (::Atom.JunoDebugger.var"#54#56"{Bool,Bool,Bool})() at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\debugger\stepper.jl:159
[54] evalscope(::Atom.JunoDebugger.var"#54#56"{Bool,Bool,Bool}) at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\debugger\stepper.jl:392
[55] startdebugging(::JuliaInterpreter.Frame, ::Bool; istoplevel::Bool, toggle_ui::Bool) at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\debugger\stepper.jl:157
[56] (::Base.var"#inner#2"{Base.Iterators.Pairs{Symbol,Bool,Tuple{Symbol,Symbol},NamedTuple{(:istoplevel, :toggle_ui),Tuple{Bool,Bool}}},typeof(Atom.JunoDebugger.startdebugging),Tuple{JuliaInterpreter.Frame,Bool}})() at .\essentials.jl:713
[57] #invokelatest#1 at .\essentials.jl:714 [inlined]
[58] (::Atom.JunoDebugger.var"#48#51"{String,String,Bool,Int64})() at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\debugger\stepper.jl:119
[59] hideprompt(::Atom.JunoDebugger.var"#48#51"{String,String,Bool,Int64}) at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\repl.jl:127
[60] #47 at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\debugger\stepper.jl:84 [inlined]
[61] task_local_storage(::Atom.JunoDebugger.var"#47#50"{String,String,Bool,Int64}, ::Symbol, ::String) at .\task.jl:226
[62] debug_file(::String, ::String, ::String, ::Bool, ::Int64) at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\debugger\stepper.jl:83
[63] debug_file(::String, ::String, ::String, ::Bool) at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\debugger\stepper.jl:81
[64] handlemsg(::Dict{String,Any}, ::String, ::Vararg{Any,N} where N) at C:\Users\Sebas\.julia\packages\Atom\MxsDb\src\comm.jl:169

The version of DiffEqFlux is 1.24.0

Does the debugger have any limitations with DiffEqFlux? Or should it work?

Thanks!!

@pfitzseb is there some weird interaction between the debugger and ProgressLogging?

Not as far as I’m aware. Do you have a MWE I can try, @junsebas97?

For example, in the following code:


using Flux, DiffEqFlux, DifferentialEquations

# Data: Lotka-Volterra simulation
function lotka_volterra(du, u, p, t)
    x, y       = u
    α, β, δ, γ = p
    du[1] = dx =  α*x - β*x*y
    du[2] = dy = -δ*y + γ*x*y
end

u0    = [1.0, 1.0]               # initial condition
tspan = (0.0, 15.0)              # integration interval
Δt    = 0.01
p     = [1.5, 1.0, 3.0, 1.0]     # system parameters
prob   = ODEProblem(lotka_volterra, u0, tspan, p)
sol    = solve(prob, Tsit5(), saveat = Δt)
y_real = sol[2, :]    # label


# ML model:
ANN    = FastChain(FastDense(2,  5, tanh), FastDense(5,  2))
params = initial_params(ANN)
function model(du, u, p, t)
    du[1] = dx = ANN(u, p)[1]
    du[2] = dy = ANN(u, p)[2]
end

function prediction(θ)
    problem = ODEProblem(model, u0, tspan, θ)
    solve(problem, Tsit5(), saveat = Δt)
end

function loss(θ)
    pred   = prediction(θ)
    y_pred = pred[2, :]
    sum(abs2, y_real - y_pred)
end
loss(params)

function callback(p, loss_val)
    println(loss_val)
    false
end

sol = DiffEqFlux.sciml_train(loss, params, ADAM(0.05), cb = callback, maxiters = 50)

When I put a breakpoint inside the loss function (at `y_pred = pred[2, :]`) and it is hit while the function is called by DiffEqFlux.sciml_train, I get the ProgressLogging error.
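One workaround that may help while debugging: the `sciml_train` signature in the first stacktrace above shows a `progress::Bool` keyword, so disabling it might bypass the ProgressLogging code path entirely. An untested sketch based on that signature:

```julia
sol = DiffEqFlux.sciml_train(loss, params, ADAM(0.05),
                             cb = callback, maxiters = 50,
                             progress = false)   # skip the progress logger under the debugger
```
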