Help with JuMP 0.19 upgrade and performance hits

jump

#1

Hi all,

I’m having some issues with Ipopt failing to find a solution in a script I’m updating to Julia 1.1 / JuMP 0.19 from a working one on Julia 0.6.4 / JuMP 0.18.

Unsure exactly where the problem is, so I’ve put the script in a gist so you can see the changes I’ve made.

Functioning version (Julia 0.6.4, JuMP 0.18)
Runtime on my machine:
5.514864 seconds (5.72 M allocations: 172.761 MiB, 2.19% gc time)

Attempt at update to Julia 1.1, JuMP 0.19
On the third optimize!, this version seems to hit a convex region and just spins forever.

iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls
0 -0.0000000e+00 1.10e+04 1.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0
250r -1.2028687e+12 9.60e+07 7.38e-13 -7.2 1.11e-04 - 1.00e+00 1.00e+00h 1
500r -1.2028687e+12 9.60e+07 7.38e-13 -7.2 1.11e-04 - 1.00e+00 1.00e+00h 1
750r -1.2028687e+12 9.60e+07 7.38e-13 -7.2 1.11e-04 - 1.00e+00 1.00e+00h 1
1000r -1.2028687e+12 9.60e+07 7.38e-13 -7.2 1.11e-04 - 1.00e+00 1.00e+00h 1
1250r -1.2028687e+12 9.60e+07 7.38e-13 -7.2 1.11e-04 - 1.00e+00 1.00e+00h 1
1500r -1.2028687e+12 9.60e+07 7.38e-13 -7.2 1.11e-04 - 1.00e+00 1.00e+00h 1

Diff between the two versions
As far as I can tell, these are the only changes that need to be made: the arrays were updated to conform to the v1.1 syntax, and the model was rewritten in the new JuMP syntax; nothing else was altered.

Is there something obvious that I’ve missed? Are there some pointers you could give me to help solve this problem?


#2

What is the problem exactly?

Why three calls to optimize!? Do you not find a solution after the first?

As a side comment (probably not the problem), you have a lot of global variables. Consider declaring them const, for example const N = 100.
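To illustrate (the names here are just examples, not from the gist):

```julia
# Non-const globals are type-unstable in Julia: any function that reads them
# must assume the type could change at any time, which hurts performance.
const N = 100   # const fixes the type, so functions using N compile tightly
const T = 60

# Better still, pass the values as function arguments and avoid globals:
function build_and_solve(N, T)
    # ... build and solve the JuMP model here using N and T ...
end
```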


#3

I’m expecting the optimization to converge quite quickly (as you can see, the old version takes about 5 seconds), but instead it hits this -1.2028687e+12 objective value and runs on forever with no change in the output. I’ve waited for over an hour with no change, so somehow my solution has become infeasible. This suggests to me either an oversight in my implementation, or that I’ve hit some edge case in the new JuMP version.

No, the problem changes. The initial run sets the ψ₂ parameter to 0, and the parameter is then altered based on the result. Strictly speaking this will ultimately be two runs rather than three; the initial call has more to do with another portion of the script (see below).

Yep, this was just to try and drop down the complexity of the example. This isn’t an issue in the real codebase.


#4

Can you post the full Ipopt log?

Also, try simplifying the problem as much as possible while it still demonstrates the failure. It could fail for any number of reasons, although it is probably a problem with how the ψ₂ parameter is handled.


#5

I think this is the right line of reasoning.

I’ve been cutting down the job to not run multiple times, in an attempt to simplify the example for you. If I hard-code the result of CPRICE from the first run and call optimize! only once, I get the correct solution in half the time of the v0.6.4/v0.18 version.

Thanks for your help so far. I’ll work on it a bit more and see if I can identify why the parameter isn’t doing what I expect, then come back to you if I get stumped.


#6

OK, so it’s definitely the parameter that’s causing issues.

Here are two different versions. Simplified a little from the original files.

JuMP 0.19 using pre-calculated photel value
This works without issue and identifies the correct objective.

Same implementation with photel calculated in the same run via parameter change
This fails as described above.

Diff between files
Literally the only change is calling optimize! twice, with the parameter updated as an intermediate step.
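For reference, the pattern boils down to something like this (a stripped-down placeholder model, not my real one):

```julia
using JuMP, Ipopt

model = Model(with_optimizer(Ipopt.Optimizer))
@variable(model, x >= 0, start = 1.0)
@NLparameter(model, ψ2 == 0.0)                 # first solve with ψ₂ = 0
@NLobjective(model, Min, (x - 2)^2 + ψ2 * x)

optimize!(model)                 # run 1: solves fine
set_value(ψ2, value(x))          # update the parameter from the first result
optimize!(model)                 # run 2: fails as described above
```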

Any idea why this is the failure point?


#7

It’s probably a bug in JuMP. Please create a minimal working example (i.e., with as few variables, constraints, and special constants as possible) and open an issue at https://github.com/JuliaOpt/JuMP.jl/issues


#8

OK, thanks. I’ll take a look there.