User-defined objective function from an external pkl file

Hello,
I have a complex function stored in a pkl file that I want to use to define my objective function. Here is a small example of what I am trying to do and the problem I run into:

using JuMP
using PyCall

py"""
import pickle

def add(x):
    return x[0] + x[1] - x[2] * x[3]

def save_pickle(fpath):
    with open(fpath, 'wb') as f:
        pickle.dump(add, f)

def load_pickle(fpath):
    with open(fpath, "rb") as f:
        data = pickle.load(f)
    return data
"""

save_pickle = py"save_pickle"
load_pickle = py"load_pickle"

save_pickle("function.pkl")
func = load_pickle("function.pkl")

func((1, 2, 3, 4))

model = Model()
register(model, :func, 4, func, autodiff=true)
@variable(model, x[1:2, 1:2])
@NLobjective(model, Min, func(Tuple(1.0reshape(x, 4))))

The register call fails with:

MethodError: no method matching register(::Model, ::Symbol, ::Int64, ::PyObject; autodiff=true)
Closest candidates are:
  register(::Model, ::Symbol, ::Integer, ::Function; autodiff) at ~/.julia/packages/JuMP/Y4piv/src/nlp.jl:2043
  register(::Model, ::Symbol, ::Integer, ::Function, ::Function; autodiff) at ~/.julia/packages/JuMP/Y4piv/src/nlp.jl:2129
  register(::Model, ::Symbol, ::Integer, ::Function, ::Function, ::Function) at ~/.julia/packages/JuMP/Y4piv/src/nlp.jl:2207 got unsupported keyword argument "autodiff"

Stacktrace:
 [1] top-level scope
   @ In[45]:2
 [2] eval
   @ ./boot.jl:373 [inlined]
 [3] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
   @ Base ./loading.jl:1196

You can convert a PyObject to a Julia function yourself with something like

func1(x) = func(tuple(x...))             # takes a single vector-like argument
func2(x, y, z, w) = func((x, y, z, w))   # takes four scalar arguments

but then you'll just get an error because ForwardDiff can't differentiate your Python function:

caused by: MethodError: no method matching Float64(::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4})
Closest candidates are:
  (::Type{T})(::Real, ::RoundingMode) where T<:AbstractFloat at /usr/local/julia-1.7.3/share/julia/base/rounding.jl:200
  (::Type{T})(::T) where T<:Number at /usr/local/julia-1.7.3/share/julia/base/boot.jl:770
  (::Type{T})(::AbstractChar) where T<:Union{AbstractChar, Number} at /usr/local/julia-1.7.3/share/julia/base/char.jl:50
  ...
Stacktrace:
  [1] convert(#unused#::Type{Float64}, x::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4})
    @ Base ./number.jl:7
  [2] cconvert(T::Type, x::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4})
    @ Base ./essentials.jl:417
  [3] macro expansion
    @ ~/.julia/packages/PyCall/7a7w0/src/exception.jl:95 [inlined]
  [4] PyObject(r::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4})
    @ PyCall ~/.julia/packages/PyCall/7a7w0/src/conversions.jl:23
  [5] PyObject(t::NTuple{4, ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4}})
    @ PyCall ~/.julia/packages/PyCall/7a7w0/src/conversions.jl:196
  [6] _pycall!(ret::PyObject, o::PyObject, args::Tuple{NTuple{4, ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4}}}, nargs::Int64, kw::Ptr{Nothing})
    @ PyCall ~/.julia/packages/PyCall/7a7w0/src/pyfncall.jl:24
  [7] _pycall!
    @ ~/.julia/packages/PyCall/7a7w0/src/pyfncall.jl:11 [inlined]
  [8] #_#114
    @ ~/.julia/packages/PyCall/7a7w0/src/pyfncall.jl:86 [inlined]
  [9] (::PyObject)(args::NTuple{4, ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4}})
    @ PyCall ~/.julia/packages/PyCall/7a7w0/src/pyfncall.jl:86
 [10] func2(x::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4}, y::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4}, z::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4}, w::ForwardDiff.Dual{ForwardDiff.Tag{JuMP.var"#142#143"{typeof(func2)}, Float64}, Float64, 4})

That's because register tries to differentiate your function with ForwardDiff, and ForwardDiff requires a generic Julia function. Earlier thread: Does ForwardDiff work with Python functions? It's not clear to me whether there is any way to automatically differentiate a Python function from Julia, so it may be easier to implement the gradient yourself (even in Python, perhaps with a Python autodiff tool) and pass it to register separately.
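
For the toy example, the gradient is easy to write by hand, so here is a rough sketch of the "pass the gradient separately" route, reusing func2 from above. I'm assuming Ipopt as the solver and adding bounds so the toy problem is bounded; grad_func2 is just an illustrative name. For multivariate functions, register expects a gradient callback that fills its first argument in place:

using JuMP, Ipopt

# Hand-written gradient of x + y - z*w; the callback fills g in place.
function grad_func2(g::AbstractVector, x, y, z, w)
    g[1] = 1.0
    g[2] = 1.0
    g[3] = -w
    g[4] = -z
    return
end

model = Model(Ipopt.Optimizer)
@variable(model, -1 <= x[1:2, 1:2] <= 1)       # illustrative bounds
register(model, :func2, 4, func2, grad_func2)  # no autodiff needed now
@NLobjective(model, Min, func2(x[1, 1], x[1, 2], x[2, 1], x[2, 2]))
optimize!(model)

Since Hessians are not available for multivariate user-defined functions, the solver only gets gradient information for the registered function.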

The user-defined function comes from a surrogate model (a random forest, etc.). In that case, how can I use it with the optimization algorithm? Do I always have to give Julia the definition of the gradient?

You'd need to match your choice of optimizer in JuMP to what you have available in your Python code. For example, can you use a derivative-free optimization algorithm? Or can you calculate gradients in Python? Which algorithms can you use, and what assumptions do they make?
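
If you can get a gradient (or a finite-difference approximation) out of the Python side, you can wrap it the same way as the objective and hand it to register instead of relying on autodiff. A hypothetical sketch in the style of the original post, where add_grad stands in for whatever gradient your Python code can actually provide:

py"""
def add_grad(x):
    # gradient of x[0] + x[1] - x[2] * x[3]
    return [1.0, 1.0, -x[3], -x[2]]
"""

add_grad = py"add_grad"

# Adapt the Python gradient to the in-place callback that register expects.
function grad_func2(g::AbstractVector, x, y, z, w)
    g .= add_grad((x, y, z, w))
    return
end

register(model, :func2, 4, func2, grad_func2)

For a random forest surrogate, though, the prediction is piecewise constant, so its gradient is zero almost everywhere and a derivative-free method is usually the better fit.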

JuMP is the wrong tool for the job: Should I use JuMP? · JuMP

If everything else is in Python, you might be better off using an optimizer in Python. There is a range of packages for derivative-free optimization.

As one example, Py-BOBYQA: Derivative-Free Optimizer for Bound-Constrained Minimization — Py-BOBYQA v1.3 documentation
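
If you still want to drive everything from Julia, you can also call a Python derivative-free solver on the wrapped function directly. A rough sketch of what that could look like with Py-BOBYQA through PyCall, assuming the pybobyqa package is installed in the Python environment PyCall uses, with an illustrative starting point and bounds:

using PyCall

pybobyqa = pyimport("pybobyqa")      # assumes pybobyqa is installed

func1(x) = func(tuple(x...))         # wrapped pickled objective, takes a vector

x0    = zeros(4)                     # illustrative starting point
lower = fill(-1.0, 4)                # illustrative bounds
upper = fill(1.0, 4)

soln = pybobyqa.solve(func1, x0, bounds=(lower, upper))
println(soln.x)                      # approximate minimizer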