Does a JuMP.jl user-defined function have to return a scalar?

I have a question related to what was asked here some time ago. I want to use JuMP and need to differentiate through a function from the BasisMatrices.jl package, and I am having trouble making that work. I also don't understand why the JuMP docs say that the package uses ForwardDiff.jl, when my error contains calls to ReverseDiffSparse instead. The direct call to ForwardDiff.jl works. Here is an artificial example:

module trying

using JuMP
using Ipopt
using BasisMatrices
using ForwardDiff

function t1()
    chpar = ChebParams(10, 0.2, 2.0)
    Ψprime(k) = evalbase(chpar, k)  # evalbase returns a (1,10) matrix
    f(x) = ForwardDiff.jacobian(Ψprime, x)
    f([1.0])
end

function t2()
    chpar = ChebParams(10, 0.2, 2.0)
    m = Model(solver=IpoptSolver())
    @variable(m, x[1:3])

    function myfun(x)  # evalbase returns a (1,10) matrix
        return BasisMatrices.evalbase(chpar, x)
    end

    JuMP.register(m, :myfun, 1, myfun, autodiff=true)

    @NLconstraint(m, sum(myfun(x[i]) for i = 1:3) <= 0)
    @NLobjective(m, Min, 1)
    solve(m)
end

end

julia> trying.t1()
10×1 Array{Float64,2}:
  0.0     
  1.11111 
 -0.493827
 -3.16872 
  1.92654 
  4.74606 
 -4.15542 
 -5.56715 
  6.95444 
  5.41916 

julia> trying.t2()
ERROR: TypeError: #forward_eval#7: in typeassert, expected Float64, got Array{Float64,2}
Stacktrace:
 [1] #forward_eval#7(::ReverseDiffSparse.UserOperatorRegistry, ::Function, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{ReverseDiffSparse.NodeData,1}, ::SparseMatrixCSC{Bool,Int64}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}) at /Users/74097/.julia/v0.6/ReverseDiffSparse/src/forward.jl:149
 [2] (::ReverseDiffSparse.#kw##forward_eval)(::Array{Any,1}, ::ReverseDiffSparse.#forward_eval, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{ReverseDiffSparse.NodeData,1}, ::SparseMatrixCSC{Bool,Int64}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}) at ./<missing>:0
 [3] forward_eval_all(::JuMP.NLPEvaluator, ::Array{Float64,1}) at /Users/74097/.julia/v0.6/JuMP/src/nlp.jl:450
 [4] eval_hesslag(::JuMP.NLPEvaluator, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64, ::Array{Float64,1}) at /Users/74097/.julia/v0.6/JuMP/src/nlp.jl:754
 [5] initialize(::JuMP.NLPEvaluator, ::Array{Symbol,1}) at /Users/74097/.julia/v0.6/JuMP/src/nlp.jl:399
 [6] loadproblem!(::Ipopt.IpoptMathProgModel, ::Int64, ::Int64, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Array{Float64,1}, ::Symbol, ::JuMP.NLPEvaluator) at /Users/74097/.julia/v0.6/Ipopt/src/IpoptSolverInterface.jl:44
 [7] _buildInternalModel_nlp(::JuMP.Model, ::JuMP.ProblemTraits) at /Users/74097/.julia/v0.6/JuMP/src/nlp.jl:1248
 [8] #build#119(::Bool, ::Bool, ::JuMP.ProblemTraits, ::Function, ::JuMP.Model) at /Users/74097/.julia/v0.6/JuMP/src/solvers.jl:300
 [9] (::JuMP.#kw##build)(::Array{Any,1}, ::JuMP.#build, ::JuMP.Model) at ./<missing>:0
 [10] #solve#116(::Bool, ::Bool, ::Bool, ::Array{Any,1}, ::Function, ::JuMP.Model) at /Users/74097/.julia/v0.6/JuMP/src/solvers.jl:168
 [11] t2() at /Users/74097/git/HedonicRentals/julia/test2.jl:30

Thanks for any help!

I believe that is correct: a user-defined function registered with JuMP has to return a scalar.
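For reference, here is a minimal sketch of the shape JuMP expects (same JuMP 0.6-era syntax as in the question; mysq is a made-up placeholder function): the registered function maps scalars to a scalar, and any aggregation happens inside the macro.

using JuMP, Ipopt

m = Model(solver=IpoptSolver())
@variable(m, x[1:3])

mysq(x) = x^2  # scalar in, scalar out: the shape JuMP.register expects

JuMP.register(m, :mysq, 1, mysq, autodiff=true)

@NLconstraint(m, sum(mysq(x[i]) for i = 1:3) <= 1)
@NLobjective(m, Min, sum(x[i] for i = 1:3))
solve(m)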

If I understand correctly, you are trying to create 10 constraints (that the element-wise sum of myfun(x[i]) for i=1:3 is non-positive in every component). Such vectorized constraints only work with linear and quadratic constraints (Link).

Finally, note that this feature is not currently supported directly in nonlinear expressions; for example, a matrix–vector product will not work inside a call to the @NLconstraint macro.

You can still create such constraints element-wise, as sketched below, but it will involve making 10× more calls to myfun than you were hoping for.
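Concretely, assuming you can write a scalar version myfun_n(x, n) that returns just the n-th entry of evalbase (the my_cheb function below is one way to get that; myfun_n itself is hypothetical), the element-wise formulation would look roughly like this, using the model and variables from t2 above:

# myfun_n(x, n) is assumed to return the n-th basis value at x as a scalar
JuMP.register(m, :myfun_n, 2, myfun_n, autodiff=true)

for n in 1:10
    # one scalar constraint per basis dimension; n is spliced in as a number
    @NLconstraint(m, sum(myfun_n(x[i], n) for i = 1:3) <= 0)
end

One caveat: with autodiff=true, JuMP treats n as just another scalar argument of the registered function, including when differentiating, so this is a sketch rather than a drop-in recipe.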

My workaround back then was to code up a function that evaluates the nth-order Chebyshev polynomial at x on the given domain:

function my_cheb(x, n)
    # domain endpoints, scaled by the steady-state capital stock
    k_stst = 4.628988089138438
    a = 0.2 * k_stst
    b = 2 * k_stst

    # map x from [a, b] to z in [-1, 1]
    z = (2 / (b - a)) * (x - (a + b) / 2)

    if n == 1
        return 1.0
    elseif n == 2
        return z
    else
        # three-term recurrence: T_n(z) = 2z*T_{n-1}(z) - T_{n-2}(z)
        out1 = 1.0
        out2 = z
        out3 = 0.0

        for j in 3:n
            out3 = 2.0 * z * out2 - out1
            out1 = out2
            out2 = out3
        end

        return out3
    end
end
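As a quick sanity check (not part of the original workflow; chpar is rebuilt here with the same scaled domain), the recurrence should reproduce the row that evalbase returns:

using BasisMatrices

k_stst = 4.628988089138438
chpar = ChebParams(10, 0.2 * k_stst, 2 * k_stst)

evalbase(chpar, 1.0)              # 1×10 matrix of basis values at x = 1.0
[my_cheb(1.0, n) for n = 1:10]    # should match the row above element-wise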

This, of course, involves repeatedly computing numbers that are already known, and I am uncertain about the type stability of the function when JuMP calls it.

It is a bit better to hand the Chebyshev coefficients to the user-defined function, like this:

function Ψprime(k, θ...)
    # evaluate the Chebyshev basis at k and contract with the coefficients θ
    (BasisMatrix(Basis(ChebParams(10, 0.2 * 4.628988089138438, 2 * 4.628988089138438)),
        Expanded(), [k]).vals[1] * collect(θ))[1]
end

This returns a scalar too, but it saves some computation compared to the solution above. Inside an @NLconstraint, the function call is then:

Ψprime(Kprime[i], θ[1], θ[2], θ[3], θ[4], θ[5], θ[6], θ[7], θ[8], θ[9], θ[10])
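For completeness, a sketch of the registration and constraint that go with it (same 0.6-era API; Kprime and θ are assumed to be the model's variables). If I remember correctly, JuMP only supplies gradients, not exact Hessians, for multivariate user-defined functions, so Ipopt ends up using a Hessian approximation:

# register the 11-argument function: 1 state variable plus 10 coefficients
JuMP.register(m, :Ψprime, 11, Ψprime, autodiff=true)

# one scalar constraint per element of Kprime
@NLconstraint(m, [i = 1:3],
    Ψprime(Kprime[i], θ[1], θ[2], θ[3], θ[4], θ[5],
           θ[6], θ[7], θ[8], θ[9], θ[10]) <= 0)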