Weird scoping issue/misunderstanding in JuMP.jl

I spent the last few hours hunting a bug and boiled it down to a behaviour which I do not understand at all. Here is an MWE which basically reflects the core of the issue I had with my package:

using JuMP
using Ipopt

function wtf()
    f(x) = (x+2)^2
    
    model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
    register(model, :f, 1, f, autodiff=true)
    @variable(model, x)
    @NLobjective(model, Min, f(x))
    optimize!(model)

    retval = value(x)
    
    if false
        f(x) = (x-10)^2
    end
    
    return retval
end

This returns 10 instead of the expected -2.

So what kind of magic is going on there? Is value re-evaluating f(x), which somehow gets redefined by the compiler even though the body of the if-block is never reached? Whaaat?

Edit: to avoid any confusion, I originally wanted to rerun optimize! with an updated quality function (f(x) in the example) only when some conditions are met, but it obviously always gets overwritten. So any advice on how to rerun an optimisation procedure without all the boilerplate (register, @NLobjective etc.) and only with a new function is really welcome!

This is disallow methods of local functions in different blocks · Issue #15602 · JuliaLang/julia · GitHub.
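Here is the same behaviour stripped down, without any JuMP involved (a toy sketch of mine, the name demo is made up): both definitions become methods of one and the same local function when the enclosing function is lowered, and since the signatures are identical, the second definition wins even though the branch never runs.

function demo()
    f(x) = (x + 2)^2
    y = f(1)  # 81, not 9: the definition in the dead branch below has already replaced this one
    if false
        f(x) = (x - 10)^2
    end
    return y
end

demo()  # 81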

The solution is to use an anonymous function: f = x -> (x - 10)^2.
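For your MWE that would look roughly like this (untested sketch): binding an anonymous function is an ordinary runtime assignment, not a method definition, so a branch that never runs really does nothing.

f = x -> (x + 2)^2  # ordinary runtime assignment, not a method definition

# register(model, :f, 1, f, autodiff=true), @NLobjective, optimize! as before ...

if false
    f = x -> (x - 10)^2  # plain rebinding; only takes effect if this line actually executes
end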

And yes, it's pretty bad.


Uff, using an anonymous function is not easy in my case. I do things like this to avoid ugly closures:

struct SingleDUMinimiser <: Function
    z_positions::Vector{Float64}
    times::Vector{Float64}
    pmt_directions::Vector{Direction}
    multiplicities::Vector{Int}
end

and then

function SingleDUMinimiser(hits::Vector{CalibratedHit}, triggered_hits::Vector{CalibratedHit})
    n = length(hits)
    n_triggered = length(triggered_hits)
    z_positions = Vector{Float64}()
    times = Vector{Float64}()
    multiplicities = Vector{Int}()
    pmt_directions = Vector{Direction}()
    sizehint!(z_positions, n)
    sizehint!(times, n)
    sizehint!(multiplicities, n)
    sizehint!(pmt_directions, n_triggered)
    for i ∈ 1:n
        hit = hits[i]
        push!(z_positions, hit.pos.z)
        push!(times, hit.t)
        push!(multiplicities, hit.multiplicity.count)
    end
    for i ∈ 1:n_triggered
        push!(pmt_directions, triggered_hits[i].dir)
    end
    SingleDUMinimiser(z_positions, times, pmt_directions, multiplicities)
end


function (s::SingleDUMinimiser)(d_closest, t_closest, z_closest, dir_z, ϕ, t₀)
    n = length(s.times)

    d_γ, ccalc = make_cherenkov_calculator(d_closest, t_closest, z_closest, dir_z, t₀)
    expected_times = ccalc.(s.z_positions)

    max_multiplicity = maximum(s.multiplicities)
    Q = 0.0
    for i ∈ 1:n
        t = s.times[i]
        z = s.z_positions[i]
        m = s.multiplicities[i]
        t_exp = ccalc(z)
        Δt = abs(t - t_exp)
        Q += Δt^2 * m / max_multiplicity
    end

    Δts = abs.(s.times - expected_times)
    # Δϕs = filter(!isnan, azimuth.(s.pmt_directions)) .- ϕ

    # return sum(Δts .^ 2) + sum(Δϕs .^ 2) / length(Δϕs)
    # return sum(Δts .^ 2)
    return Q
end

Long story short, I don't see how this could be done using anonymous functions :confused:

You define those functions conditionally inside other functions?

I define an initial quality function inside the function which performs the minimisation, like this:

qfunc = SingleDUMinimiser(shits, filter(h->h.triggered, du_hits))

and then the whole JuMP stuff with model, register, @NLobjective and @variable etc.

Then I call optimize!(model) and, depending on some conditions, I need to adjust qfunc and run optimize!(model) again.

So yes, I need to redefine the function inside that wrapper function conditionally…

That shouldn't be a problem. Can you repro your issue using something similar to what you now describe?

Sure, let me hack…

OK Kristoffer, you are right, I probably boiled it down too much; that is not the root of my optimisation problem:

using JuMP
using Ipopt


struct FMinimiser <: Function
    a
end


function (f::FMinimiser)(x)
    (x + f.a)^2
end


function optimisation_procedure(reoptimize::Bool)
    f = FMinimiser(2)
    
    model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
    register(model, :f, 1, f, autodiff=true)
    @variable(model, x)
    @NLobjective(model, Min, f(x))
    optimize!(model)
    
    if reoptimize
        f = FMinimiser(10)
        optimize!(model)
    end
    
    value(x)
end

This gives

julia> optimisation_procedure(false)
-2.0

However, it also returns -2.0 for optimisation_procedure(true).

Is calling register and @NLobjective again the only way to make JuMP pick up the new function, like this?

function optimisation_procedure(reoptimize::Bool)
    f = FMinimiser(2)
    
    model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
    register(model, :f, 1, f, autodiff=true)
    @variable(model, x)
    @NLobjective(model, Min, f(x))
    optimize!(model)
    
    if reoptimize
        f2 = FMinimiser(10)
        register(model, :f2, 1, f2, autodiff=true)
        @NLobjective(model, Min, f2(x))
        optimize!(model)
    end
    
    value(x)
end

There is no way that rebinding f can have any influence on what value(x) returns. You are not passing the variable f to register; you are passing the object that f "points" or "maps" to, which is the first instance of FMinimiser. Rebinding f does absolutely nothing to that object.
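A plain-Julia sketch of the same thing (toy example of mine): register holds on to the object exactly the way h does here.

g = sin
h = g       # h now refers to the same function object as g
g = cos     # rebinding g changes only the binding, not the object h holds
h(0.0)      # 0.0, i.e. still sin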


Yes, that I understand. So I need to re-register the function and define the objective again…

What's a bit annoying is that I need a new name (symbol) every time I register a new one. I thought there was something to replace or re-evaluate the registered function. I guess I have to ask some JuMP users.

JuMP.register creates a mapping between a symbol and a Julia function. At least in the global scope, you can change the Julia function without needing to re-register or reset the objective:

julia> model = Model(with_optimizer(Ipopt.Optimizer, print_level = 0))
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: Ipopt

julia> @variable(model, x >= 0)
x

julia> f(x) = (x - 2)^2
f (generic function with 1 method)

julia> JuMP.register(model, :foo, 1, f, autodiff=true)  # No need to name :f

julia> @NLobjective(model, Min, f(x))  # Using the Julia function f directly doesn't work
ERROR: Unrecognized function "f" used in nonlinear expression.
Stacktrace:
 [1] error(::String) at ./error.jl:33
 [2] top-level scope at /Users/oscar/.julia/packages/JuMP/iGamg/src/parse_nlp.jl:73
 [3] top-level scope at /Users/oscar/.julia/packages/JuMP/iGamg/src/macros.jl:1497

julia> @NLobjective(model, Min, foo(x))  # Registered symbol, not Julia function

julia> optimize!(model)

******************************************************************************
This program contains Ipopt, a library for large-scale nonlinear optimization.
 Ipopt is released as open source code under the Eclipse Public License (EPL).
         For more information visit http://projects.coin-or.org/Ipopt
******************************************************************************


julia> value(x)
2.00000000062657

julia> f(x) = (x + 10)^2
f (generic function with 1 method)

julia> optimize!(model)

julia> value(x)
0.0

julia> objective_value(model)
99.99999980250593

You would have to experiment to see what the interaction is between JuMP and the Julia "feature" in function scopes.


Yes exactly, it only works if I redefine the function, and only in the global scope.
But I am using a functor and assigning it to a "variable", so I no longer have access to that connection (it seems).
If I were using function definitions, I would be back at the original problem disallow methods of local functions in different blocks · Issue #15602 · JuliaLang/julia · GitHub, where it simply does not work :frowning:
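One idea I might try (an untested sketch, not something confirmed in this thread; MutableMinimiser is a made-up name): make the functor mutable and mutate its field in place instead of rebinding the variable. Since register captured the functor object itself, a field change should be visible the next time the objective is evaluated, though, as noted above, one would have to experiment with how JuMP caches evaluations.

using JuMP
using Ipopt

mutable struct MutableMinimiser <: Function
    a::Float64
end

(f::MutableMinimiser)(x) = (x + f.a)^2

function optimisation_procedure(reoptimize::Bool)
    f = MutableMinimiser(2.0)

    model = Model(with_optimizer(Ipopt.Optimizer, print_level=0))
    register(model, :f, 1, f, autodiff=true)
    @variable(model, x)
    @NLobjective(model, Min, f(x))
    optimize!(model)

    if reoptimize
        f.a = 10.0   # mutate the registered object; no rebinding, no re-registering
        optimize!(model)
    end

    value(x)
end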