Hello, for a project on optimal approximation of functions, I need the expression of the gradient of the function I want to approximate, so that I can use it in the constraints of a JuMP model (with Ipopt as the solver). For now, I have to give my program the explicit expression of the gradient by hand, but I would like to automate this with symbolic calculus, and I have managed to make it work to some extent. With Symbolics.jl I can retrieve the gradient expression I want, but if it contains a function from the NaNMath package (log, sin, cos, etc.) it crashes, because these functions do not accept arguments of type NonlinearExpr, AffExpr, etc.
So I was wondering whether I can still make this approach work, or whether I have to change my method, and how?
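The underlying incompatibility can be reproduced in isolation, independently of Symbolics (a minimal sketch, assuming only JuMP and NaNMath are loaded):

using JuMP, NaNMath

m = Model()
@variable(m, x)
NaNMath.log(1 + x)  # 1 + x is an AffExpr => MethodError, same as in the full example below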
The code below is a simplified version of what I am working on. It contains the function that builds the gradient automatically (gradient()), another one that shows the expressions built by Symbolics.build_function() (gradient_show()), and the model to which I apply the constraints involving the gradient.
using JuMP, Ipopt, Symbolics

# Function to approximate
f2(x) = (1 + x[1])^x[2]

# Automatic gradient
function gradient(f, d)
    Symbolics.@variables x[1:d]
    # Build one callable per component of the symbolic gradient and evaluate it at y
    grad_f(y) = [Symbolics.build_function(Symbolics.gradient(f(x), [x...])[i], x; expression=Val{false})(y) for i in 1:d]
    return grad_f
end
# This function only shows the expressions returned by Symbolics.build_function() (see the third block of the post).
function gradient_show(f, d)
    Symbolics.@variables x[1:d]
    return [Symbolics.build_function(Symbolics.gradient(f(x), [x...])[i], x; expression=Val{false}) for i in 1:d]
end
# Hard-coded gradient
grad_f2(x) = [((1 + x[1])^(x[2] - 1))*x[2], ((1 + x[1])^x[2])*log(1 + x[1])]

m = Model(Ipopt.Optimizer)
@variable(m, x[1:2])
grad_f = gradient(f2, length(x))
@objective(m, Max, f2(x))
# Stationarity conditions: every component of the gradient must vanish
@constraint(m, δ[i=1:length(x)], grad_f(x)[i] == 0.0)
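For comparison, the same constraint with the hard-coded gradient is accepted without any problem (δ2 is just a fresh name for this second set of constraints):

@constraint(m, δ2[i=1:length(x)], grad_f2(x)[i] == 0.0)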
Trying to run the model gives this error (it is thrown when the constraint is built, i.e. when the generated function is called on the JuMP variables):
ERROR: LoadError: MethodError: no method matching log(::AffExpr)
You may have intended to import Base.log
Closest candidates are:
log(::Num)
@ Symbolics ~/.julia/packages/SymbolicUtils/ZlDJZ/src/methods.jl:87
log(::Float32)
@ NaNMath ~/.julia/packages/NaNMath/ceWIc/src/NaNMath.jl:10
log(::Float64)
@ NaNMath ~/.julia/packages/NaNMath/ceWIc/src/NaNMath.jl:9
...
The functions built by Symbolics.build_function() from the symbolic expressions, i.e. what gradient() evaluates inside my constraints, look like this (note the call to NaNMath.log in the second expression):
julia> gradient_show(f2,2)
2-element Vector{RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:x,), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", id, Expr} where id}:
RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:x,), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x002aed2f, 0xa47b1b83, 0x559f9b2f, 0x522a4e58, 0x9f4895b8), Expr}(quote
#= /home/bdl/.julia/packages/SymbolicUtils/ZlDJZ/src/code.jl:373 =#
#= /home/bdl/.julia/packages/SymbolicUtils/ZlDJZ/src/code.jl:374 =#
#= /home/bdl/.julia/packages/SymbolicUtils/ZlDJZ/src/code.jl:375 =#
(*)((getindex)(x, 2), (^)((+)(1, (getindex)(x, 1)), (+)(-1, (getindex)(x, 2))))
end)
RuntimeGeneratedFunctions.RuntimeGeneratedFunction{(:x,), Symbolics.var"#_RGF_ModTag", Symbolics.var"#_RGF_ModTag", (0x19dff194, 0x82a6974b, 0xfe3f57df, 0xef781ba9, 0x40da3917), Expr}(quote
#= /home/bdl/.julia/packages/SymbolicUtils/ZlDJZ/src/code.jl:373 =#
#= /home/bdl/.julia/packages/SymbolicUtils/ZlDJZ/src/code.jl:374 =#
#= /home/bdl/.julia/packages/SymbolicUtils/ZlDJZ/src/code.jl:375 =#
(*)(NaNMath.log((+)(1, (getindex)(x, 1))), (^)((+)(1, (getindex)(x, 1)), (getindex)(x, 2)))
end)
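One direction I have been experimenting with, in case it helps frame the question: since only the NaNMath methods are missing, the Expr returned by Symbolics.build_function (its default, with expression=Val{true}) could be post-processed to call the Base functions instead, and then eval'd. This is only a sketch; replace_nanmath and gradient_base are names I made up for this post, and I am not sure it covers every form the generated code can take:

import NaNMath

# Map NaNMath functions to their Base counterparts (extend as needed).
const nanmath_to_base = Dict(NaNMath.log => log, NaNMath.sin => sin,
                             NaNMath.cos => cos, NaNMath.sqrt => sqrt)

# Leaves: swap an embedded NaNMath function object for its Base equivalent, keep anything else.
replace_nanmath(x) = get(nanmath_to_base, x, x)
# Expressions: also catch the dotted form NaNMath.log, then recurse into the arguments.
function replace_nanmath(e::Expr)
    if e.head === :. && e.args[1] === :NaNMath && e.args[2] isa QuoteNode
        return e.args[2].value  # :(NaNMath.log) -> :log, resolved in the caller's scope
    end
    return Expr(e.head, map(replace_nanmath, e.args)...)
end

# Variant of gradient() that strips NaNMath from the generated code before eval'ing it.
function gradient_base(f, d)
    Symbolics.@variables x[1:d]
    exprs = [Symbolics.build_function(Symbolics.gradient(f(x), [x...])[i], x) for i in 1:d]
    fns = [eval(replace_nanmath(e)) for e in exprs]
    # invokelatest avoids world-age problems with the freshly eval'd functions
    return y -> [Base.invokelatest(fn, y) for fn in fns]
end

With that, the log call would fall back to Base.log, which JuMP's nonlinear interface does support, but I do not know whether this is the intended way to combine the two packages.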