Hi all. I am using Julia through Juno on a Mac, and got the following error today when calling the optimize! function from JuMP:
using JuMP, Ipopt
m_test = Model(with_optimizer(Ipopt.Optimizer, linear_solver = "ma57"))
@variable(m_test, x[i in 1:10], lower_bound = 0.0, upper_bound = 1.0)
@constraint(m_test, sum(x) == 1.0)
@objective(m_test, Min, sum(i * x[i] for i in 1:10))
optimize!(m_test)
I can run my code through the terminal though. I am not sure what’s going on here. Any suggestions? Thanks.
Can you try this code outside of Juno? As @mbauman says, it’s possible that Juno is the culprit here, but so far I’ve only seen segfaults in Juno with Julia 1.0.x during Pkg operations (and a different stacktrace).
I think that’s what this means:
Yes. I am able to run my script through the terminal, which also calls JuMP and Ipopt.
Can you make sure you’re running the same version of Julia and all (relevant) packages in both environments?
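One quick way to compare the two environments is to print the Julia and package versions from each side. This is a minimal sketch; run it once in the Juno REPL and once in the terminal session, then diff the output:

```julia
# Compare this output between the Juno REPL and the terminal session.
using Pkg, InteractiveUtils

versioninfo()                    # Julia version, OS, and build details
Pkg.status()                     # package versions in the active environment
println(Base.active_project())   # path to the Project.toml actually in use
```

If `Base.active_project()` differs between the two, Juno and the terminal are resolving different environments, which could mean different JuMP/Ipopt versions.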
FWIW, your code works just fine for me in Juno, but I don’t have access to a macOS system.
Yes. I am using Julia 1.3.1 in both environments (Juno and the terminal), and I believe they are using the same package versions.
Could you please paste the code in text, and not in a screenshot, so that other people can try to reproduce it?
Just tried on Juno with Julia 1.3.0 on a macOS and it works for me.
Sorry. Just wanted to show that the problem happens when calling the optimize! function.
No need to apologize! It just makes it easier for people who want to help to reproduce it.
I switched to Julia 1.1 by changing the Julia path in Juno, and no longer see this segfault. I am fine with Julia 1.1 for now. Still, I am able to execute the code with Julia 1.3 through the terminal (start a Julia session and include the script). Am I missing anything when setting up Juno with Julia 1.3?
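Since the same code works from the terminal, one thing worth checking is whether Juno actually launches the same Julia binary and depot as the terminal does. A minimal sketch to compare from both sides:

```julia
# Print which Julia binary and depot this session is using;
# compare the output from Juno with the output from the terminal.
println(Base.julia_cmd())   # full command, including the path to the julia executable
println(VERSION)            # e.g. v"1.3.1"
println(DEPOT_PATH[1])      # primary package depot (usually ~/.julia)
```

If the executable paths or depots differ, the two sessions may be loading different builds of Ipopt or its linear-solver libraries, which could explain a segfault appearing in only one of them.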