Nonlinear Optimization with Many Constraints + Autodifferentiation: Which Julia Solution?

Thanks, I finally found a mention of it buried deep in https://jump.dev/JuMP.jl/stable/moi/submodules/Nonlinear/overview/#ReverseAD. Do you think it deserves a more visible spot, somewhere in https://jump.dev/JuMP.jl/stable/manual/nonlinear/? I would open a PR myself, but I'm unsure how to phrase it.

It doesn't have a prominent mention because almost all users never need to know about or understand the details. The AD system is very JuMP-specific, so you can't just swap in Enzyme or similar.

Alright. I was only asking because until now I was not aware of the different treatment of "algebraic expressions" (aka function tracing?) versus custom functions. But granted, I haven't done a lot of nonlinear modeling so far.
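
For anyone who finds this thread later, here is my current understanding of the two code paths, as a toy sketch. The function name my_f and the choice of Ipopt are just placeholders, not anything taken from the docs:

using JuMP, Ipopt

# A stand-in for an arbitrary "custom" Julia function (hypothetical example).
my_f(x, y) = (x - 1)^2 + (y - 2)^2

model = Model(Ipopt.Optimizer)
@variable(model, x)
@variable(model, y)

# (1) Algebraic expression: JuMP traces the expression itself and differentiates
#     it with its built-in sparse reverse-mode AD (the ReverseAD submodule linked above).
@objective(model, Min, (x - 1)^2 + (y - 2)^2)

# (2) User-defined operator: the function is treated as a black box, and its
#     gradients come from ForwardDiff unless you supply derivative functions yourself.
@operator(model, op_my_f, 2, my_f)
@constraint(model, op_my_f(x, y) <= 4)

optimize!(model)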

@odow thanks again for your help. OptimalTransportNetworks.jl is under development. I have another question regarding the specification and updating of the kappa_ex vector (containing exogenous transport frictions). As you suggested, I am using a parameter as follows, for example in model_fixed.jl:

kappa_ex_init = auxdata[:kappa_ex]

# ... some code

# Parameters: to be updated between solves
@variable(model, kappa_ex[i = 1:graph.ndeg] in Parameter(kappa_ex_init[i]))

Then, in optimal_network.jl, there is an updating line as follows:

set_parameter_value.(model.obj_dict[:kappa_ex], kappa_ex_updated)

I wanted to know if this is the recommended way of doing so. Is it also possible to generate a vector parameter, which would simplify the syntax and updating? (Note that kappa_ex can be large for large networks.) The reason I am asking this in particular is that I seem to be getting a failure when building the library on GitHub because of this line. Many thanks!
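
For reference, here is a minimal, self-contained sketch of the round trip I have in mind, with placeholder data and Ipopt standing in for the actual model (so not the real OptimalTransportNetworks.jl code):

using JuMP, Ipopt

# Placeholder data standing in for auxdata[:kappa_ex] (hypothetical values).
kappa_ex_init = [1.0, 2.0, 3.0]
N = length(kappa_ex_init)

model = Model(Ipopt.Optimizer)
set_silent(model)

# One Parameter-set variable per entry; as far as I can tell there is no
# separate vector-valued Parameter set, so a container of scalar parameters
# is what I am using.
@variable(model, kappa_ex[i = 1:N] in Parameter(kappa_ex_init[i]))

@variable(model, q[1:N] >= 0)
@objective(model, Min, sum((q[i] - kappa_ex[i])^2 for i in 1:N))

optimize!(model)

# Update all parameter values in one broadcast and re-solve.
# model[:kappa_ex] refers to the same container as model.obj_dict[:kappa_ex].
kappa_ex_updated = [1.5, 2.5, 3.5]
set_parameter_value.(model[:kappa_ex], kappa_ex_updated)
optimize!(model)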