# Jump using `max(x...)` where x is a matrix?

I need to revive a previously asked question. I have a problem that involves evaluating the `exp` function over an array of variables, and I want to avoid huge values by centering those variables before applying `exp`. I get a bounds error:

```julia
using JuMP
using Ipopt

m = Model(Ipopt.Optimizer)
fmax(y...) = max(y...)
JuMP.register(m, :fmax, N, fmax, autodiff=true)
@variable(m,x[i=1:2, j=1:3])
@NLexpression(m, xbar[i=1:2, j=1:3], x[i,j] - fmax(x...))  # normalize by largest value in x
@NLexpression(m, p[i=1:2, j=1:3], exp(xbar[i,j]) / sum(xbar[k,l] for k=1:2, l=1:3))
@NLobjective(m, Max, sum(p[i,j] for i=1:2, j=1:3))
optimize!(m)

ERROR: BoundsError: attempt to access 2-element Vector{Float64} at index [1:6]
```

It is strange, because it seems the subexpression is built correctly:

```julia
julia> m[:xbar]
2×3 Matrix{NonlinearExpression}:
subexpression: x[1,1] - fmax(x[1,1], x[2,1], x[1,2], x[2,2], x[1,3], x[2,3])  …  subexpression: x[1,3] - fmax(x[1,1], x[2,1], x[1,2], x[2,2], x[1,3], x[2,3])
subexpression: x[2,1] - fmax(x[1,1], x[2,1], x[1,2], x[2,2], x[1,3], x[2,3])     subexpression: x[2,3] - fmax(x[1,1], x[2,1], x[1,2], x[2,2], x[1,3], x[2,3])
```

Or, at least to me, that looks like exactly what I wanted to achieve. What 2-element vector are we talking about here?
Thanks!

When I tried to run this, it became clear that the value of `N` is not defined.
I think you have `N = 2` set somewhere.
Setting `N = 3` gives 3 in place of 2 here:
`ERROR: BoundsError: attempt to access 3-element Vector{Float64} at index [1:6]`
while setting `N = 6` gets past this error but ultimately returns from Ipopt:
`EXIT: Invalid number in NLP function or derivative detected.`
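The registered arity has to match the number of scalar arguments the splat produces. A quick sanity check in plain Julia (no JuMP needed; `x` here is just a numeric stand-in for the variable matrix):

```julia
# Sketch: splatting a 2×3 matrix passes one scalar argument per entry,
# so the arity to register for fmax is length(x) == 6, not 2 or 3.
x = rand(2, 3)
fmax(y...) = max(y...)

length(x)              # number of scalar arguments after splatting: 6
fmax(x...) == maximum(x)  # max over all six entries
```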


I don’t have my laptop right now, but it seems to me `maximum(x)` is more appropriate than `max(x...)`.


Oh! How dumb, `N = 6` in my case!
And no, it’s precisely because `maximum` can’t be used directly that we have to splat the vector into `max`.

Are you sure that `fmax(y...) = maximum(y)` won’t work?

Sure:

```julia
using JuMP
using Ipopt

m = Model(Ipopt.Optimizer)
N = 6
fmax(y...) = maximum(y...)
JuMP.register(m, :fmax, N, fmax, autodiff=true)
@variable(m,x[i=1:2, j=1:3])
@NLexpression(m, xbar[i=1:2, j=1:3], x[i,j] - fmax(x...))  # normalize by largest value in x
@NLexpression(m, p[i=1:2, j=1:3], exp(xbar[i,j]) / sum(xbar[k,l] for k=1:2, l=1:3))
@NLobjective(m, Max, sum(p[i,j] for i=1:2, j=1:3))
optimize!(m)

ERROR: Unable to register the function :fmax because it does not support differentiation via ForwardDiff.

Common reasons for this include:

* the function assumes `Float64` will be passed as input, it must work for any
generic `Real` type.
* the function allocates temporary storage using `zeros(3)` or similar. This
defaults to `Float64`, so use `zeros(T, 3)` instead.
```
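For what it’s worth, one possible reading of that failure (an assumption on my part, not something the error message states): `fmax(y...) = maximum(y...)` splats the tuple a *second* time, calling `maximum(x1, x2, ..., x6)`, and no such method exists; `maximum(y)`, which reduces over the tuple itself, behaves differently. A sketch in plain Julia:

```julia
# Sketch: maximum(y...) vs maximum(y) inside a varargs function.
fmax_splat(y...) = maximum(y...)  # re-splats: maximum(x1, ..., x6) has no matching method
fmax_tuple(y...) = maximum(y)     # reduces over the tuple `y`

fmax_tuple(3.0, 1.0, 2.0)  # → 3.0
# fmax_splat(3.0, 1.0, 2.0) would throw an error instead
```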

I am wondering: is it possible to use the max relaxation in this context?

Something like this:

```julia
@variable(m, xmax)
@variable(m, x[i=1:2, j=1:3])
for x_var in x
    @constraint(m, x_var <= xmax)
end
```
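Putting that idea into the original model gives a sketch like the following, with no registered function at all. Two assumptions on my part: I put `exp` in the denominator on the guess that a softmax was intended (drop it to match the original exactly), and whether `xmax` is tight at `max(x)` at the optimum depends on the objective pressing it downward.

```julia
# Sketch: epigraph formulation replacing the registered fmax.
using JuMP
using Ipopt

m = Model(Ipopt.Optimizer)
@variable(m, x[i=1:2, j=1:3])
@variable(m, xmax)
@constraint(m, [i=1:2, j=1:3], x[i, j] <= xmax)        # xmax >= max(x)
@NLexpression(m, xbar[i=1:2, j=1:3], x[i, j] - xmax)   # centered variables
@NLexpression(m, p[i=1:2, j=1:3],
    exp(xbar[i, j]) / sum(exp(xbar[k, l]) for k = 1:2, l = 1:3))
@NLobjective(m, Max, sum(p[i, j] for i = 1:2, j = 1:3))
optimize!(m)
```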