It’s a question of tolerances; see Results NLP problem - #2 by blob
As for a workaround: Ipopt always solves to finite tolerances, so small constraint violations are always possible. If things are crashing, your objective function probably requires stricter feasibility than Ipopt's tolerances guarantee. A value like -7.761358324221462e-9 should really be treated as zero in most applications.
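For example, you can clamp tiny bound violations before handing the solution to code that needs exact feasibility. This is a minimal sketch; the variable name, bounds, and objective here are placeholders, not the model from the linked thread:

```julia
using JuMP, Ipopt

# Placeholder model standing in for the one from the linked thread.
model = Model(Ipopt.Optimizer)
@variable(model, 0.0 <= ξ[1:6] <= 1.0)
@objective(model, Min, sum((ξ[i] - 0.5)^2 for i in 1:6))
optimize!(model)

# Ipopt only enforces bounds up to its tolerances, so a component can
# come back as, say, -7.761358324221462e-9. Clamp such violations to
# the declared bounds before downstream use:
ξ_val = clamp.(value.(ξ), 0.0, 1.0)
```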
Maybe the performance will be better if you get rid of the division. Note that `@variable` needs a plain variable on the left, so you can't write `0.6*ξ[1:6]` there; instead, fold the coefficient into the bounds (0 <= 0.6ξ <= 1 is equivalent to 0 <= ξ <= 1/0.6):

@variable(model, 0.0 <= ξ[1:6] <= 1.0 / 0.6)
Alternatively, if the constraints must be satisfied strictly, you can tighten the bounds by a small margin:

@variable(model, 0.00001 <= ξ[1:6] <= 1.0 / 0.6 - 0.00001)  # pick any small positive margin instead of 0.00001
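If you would rather keep the original bounds, Ipopt also has a `bound_relax_factor` option (default 1e-8), which controls how far it internally relaxes variable bounds and is exactly the scale of the -7.76e-9 you are seeing. A sketch of setting it through JuMP (whether this is acceptable depends on your problem, since it can cost some solver robustness):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
# bound_relax_factor controls how much Ipopt relaxes variable bounds
# internally (default 1e-8). Setting it to 0.0 keeps the returned
# solution within the declared bounds, possibly at some cost in
# solver robustness.
set_optimizer_attribute(model, "bound_relax_factor", 0.0)
```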