Hello everyone,
I have been a Julia user for some years now and I really love JuMP. However, I am finding it particularly difficult to set up an `@NLobjective` in a mixed-integer program with linear constraints that corresponds to a dynamic program. The part of the code causing trouble is the following:
```julia
using LinearAlgebra

function ApproximateValueH(State, BasisFunctions, BasicHorizon_ind)
    n = length(BasisFunctions[1])
    f(y, u, g, z) = y + dot(g, z - u)
    Linear = []
    for i in 1:n
        push!(Linear, f(BasisFunctions[BasicHorizon_ind][i][2],
                        BasisFunctions[BasicHorizon_ind][i][1][:, 2],
                        BasisFunctions[BasicHorizon_ind][i][1][:, 1],
                        State))
    end
    return maximum(Linear)
end

function ApproximateValue(Weights, BasisFunctions, Horizons, State)
    Approximations = []
    for H in Horizons
        Hix = findall(Horizons .== H)[1]
        push!(Approximations, ApproximateValueH(State, BasisFunctions, Hix))
    end
    return dot(Weights, Approximations)
end

Horizons = [2, 5, 15, 30, 45]
Weights = 1 / length(Horizons) * ones(length(Horizons))
BasisFunctions = [[[rand(28 * 33, 2), rand()] for j in 1:2] for i in 1:length(Horizons)] # Made up for the MWE

using JuMP
import KNITRO

model = Model(KNITRO.Optimizer)
@variable(model, NewState[l = 1:28*33] >= 0, Int)
# Other variables and linear constraints here
NextStateValue(X...) = ApproximateValue(Weights, BasisFunctions, Horizons, X...)
register(model, :NextValueS, length(NewState), NextStateValue; autodiff = true)
```
The error message is very long, but the key info reads:
```
Unable to register the function :NextValueS because it does not support differentiation via ForwardDiff.
Common reasons for this include:
# explanation about how to debug ForwardDiff
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:33
 [2] _validate_register_assumptions(f::typeof(NextStateValue), name::Symbol, dimension::Int64)
   @ JuMP ~/.julia/packages/JuMP/0C6kd/src/nlp.jl:1979
 [3] register(m::Model, s::Symbol, dimension::Int64, f::Function; autodiff::Bool)
   @ JuMP ~/.julia/packages/JuMP/0C6kd/src/nlp.jl:2052
 [4] top-level scope
   @ REPL[25]:1

caused by: MethodError: no method matching ApproximateValue(::Vector{Float64}, ::Vector{Vector{Vector{Any}}}, ::Vector{Int64}, ::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#137#138"{typeof(NextStateValue)}, Float64}, Float64, 12},...
```
However, running

```julia
ForwardDiff.gradient(NextStateValue, rand(28 * 33))
```

returns an answer. I would appreciate any thoughts on what is causing the trouble. I suspect it has to do with the fact that `ApproximateValueH` takes a max over a vector, but I am not sure. How does JuMP work when one registers a function that calls other functions? Should I register all of them? Thank you very much, and I apologize for the long post!
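One thing I noticed while debugging, which may or may not be relevant (I am not certain how JuMP actually calls registered functions, so treat this as an assumption): a function defined with a splatted argument behaves very differently depending on whether it receives a single vector or individual scalars. Here is a toy `g` with the same splatting pattern as my `NextStateValue`:

```julia
# Toy function mimicking NextStateValue: the arguments are collected
# into the tuple X, then splatted into the inner call.
g(X...) = sum(X...)

# Called with one Vector (as ForwardDiff.gradient does), X is a
# 1-tuple holding the vector, so sum(X...) == sum(vector) and it works:
g(rand(3))

# Called with individual scalars (which, as far as I understand, is how
# JuMP invokes a registered multivariate function), the splat forwards
# all scalars to the inner function and triggers a MethodError:
try
    g(1.0, 2.0, 3.0)  # sum(1.0, 2.0, 3.0): no matching method
catch e
    println(typeof(e))
end
```

This would be consistent with the `MethodError: no method matching ApproximateValue(...)` in my stacktrace, but I may be misreading it.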