I’m having some trouble with the function “log” in an optimization problem.

For instance:
I create a variable, and then the following constraint:

@NLconstraint(ModelName, log(x) <= 100). I use Ipopt to solve it (a feasible problem) and it returns x = 0 (which I believe is infeasible), but if I instead write @NLconstraint(ModelName, x <= exp(100)), it returns a feasible solution. This is a simplified version of my issue…

The thing is… I really need to use log in some constraints and in the objective function of the optimization problem. How can I solve this?

for i in 1:n
    @NLconstraint(m, x[i] <= 1)
    @NLconstraint(m, x[i] >= exp(-gamma[i]))
end
@NLconstraint(m, -sum{log(x[i])/gamma[i], i = 1:n} >= -mm)

The last constraint is where the infeasibility I described above appears. I’m using the change of variables x[i] = exp(-y[i]*gamma[i]), because it transforms the problem into a convex one.
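For what it’s worth, here is a minimal sketch of that change of variables, assuming gamma[i] > 0 and using illustrative values for n, gamma, and mm (none of which are given in the original post). With x[i] = exp(-y[i]*gamma[i]) we get log(x[i]) = -y[i]*gamma[i], so every constraint becomes linear in y and log never sees a nonpositive argument:

```julia
using JuMP, Ipopt  # JuMP 0.18-era syntax, matching the thread

# Illustrative data -- not from the original post
n = 3
gamma = [1.0, 2.0, 0.5]
mm = 1.0

m = Model(solver = IpoptSolver())
# x[i] <= 1 becomes y[i] >= 0; x[i] >= exp(-gamma[i]) becomes y[i] <= 1
@variable(m, 0 <= y[1:n] <= 1)
# -sum(log(x[i])/gamma[i]) >= -mm becomes sum(y[i]) >= -mm
@constraint(m, sum(y[i] for i in 1:n) >= -mm)
solve(m)

# Recover the original variables from the solution in y
x = exp.(-getvalue(y) .* gamma)
```

Since all constraints in y are linear, the log-domain problem disappears entirely.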

The last constraint is probably an issue when x[i] = 0. Try adding bounds on x directly rather than via @NLconstraint. You probably want to set the start keyword as well, since it defaults to 0.0.

using JuMP, Ipopt

model = Model(solver = IpoptSolver())
@variable(model, 0.01 <= x[i = 1:n] <= 1, start = 1)
@NLconstraint(model, sum(log(x[i]) for i in 1:n) >= -1)

P.S. For future reference, take a look at the post I linked to. It demonstrates how to format code nicely. I can’t copy and paste your example because gamma, n, and mm aren’t defined.

IPOPT is not a feasible method, i.e., its intermediate iterates need not satisfy your constraints. The first thing it does is add slack variables to turn your constraint into an equality: log(x) + s = 100 with s ≥ 0. IPOPT only ensures that s > 0 during the iterations, but the equality constraint is only guaranteed to be satisfied in the limit. The problem you have is that your constraint function isn’t defined everywhere, and IPOPT is likely to venture outside of its domain. If you have access to KNITRO, it has a “feasible” option to ensure that nonlinear inequality constraints remain strictly satisfied. If you don’t have access to KNITRO, your best bet is to rework your model.
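One common rework, sketched below under the same assumptions as the simplified example earlier in the thread: since variable bounds (unlike nonlinear constraints) are respected by Ipopt at every iterate, move the domain restriction into the bounds and keep log out of the constraint entirely. The bound value 1e-8 and the objective here are illustrative, not from the original post:

```julia
using JuMP, Ipopt

m = Model(solver = IpoptSolver())
# log(x) <= 100 is equivalent to x <= exp(100); the strictly positive
# lower bound keeps log's argument in-domain at every Ipopt iterate.
@variable(m, 1e-8 <= x <= exp(100), start = 1.0)
# log can now appear safely in the objective (or in other constraints).
@NLobjective(m, Max, log(x))
solve(m)
```

The same idea applies to the indexed model above: encode x[i] ∈ [exp(-gamma[i]), 1] as bounds in @variable rather than as @NLconstraint rows.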