I apologize for the long answer. Below are 3 experiments.
Here’s a solution based on solving a feasibility problem: minimize a constant objective subject to your system as constraints. This formulation also lets you include bounds on the variables. However, your problem looks over-constrained: 6 variables but 7 equality constraints if you include w_1 + w_2 = 1. I left that last constraint out, and it appears to be (approximately) satisfied at the solution found by IPOPT.
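Concretely, the model built below is the feasibility problem

$$
\min_{x \in \mathbb{R}^6} \; 0 \quad \text{subject to} \quad F(x) = 0, \quad x \ge 0,
$$

where F stacks the six equations of your system and the bounds come from lvar=zeros(6):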
(@v1.4) pkg> activate test-feas
(test-feas) pkg> add NLPModels NLPModelsIpopt
julia> using NLPModels, NLPModelsIpopt
julia> function c(x)
           # right-hand-side values M_1, ..., M_6 of your system
           M = [2.250, 9.675, 57.263, 427.219, 3836.109, 40234.852]
           F = [M[1] - x[5]*x[3]*x[1] - x[6]*x[4]*x[2];
                M[2] - x[5]*x[3]^2*(x[1]+1)*x[1] - x[6]*x[4]^2*(x[2]+1)*x[2];
                M[3] - x[5]*x[3]^3*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^3*(x[2]+2)*(x[2]+1)*x[2];
                M[4] - x[5]*x[3]^4*(x[1]+3)*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^4*(x[2]+3)*(x[2]+2)*(x[2]+1)*x[2];
                M[5] - x[5]*x[3]^5*(x[1]+4)*(x[1]+3)*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^5*(x[2]+4)*(x[2]+3)*(x[2]+2)*(x[2]+1)*x[2];
                M[6] - x[5]*x[3]^6*(x[1]+5)*(x[1]+4)*(x[1]+3)*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^6*(x[2]+5)*(x[2]+4)*(x[2]+3)*(x[2]+2)*(x[2]+1)*x[2]]
                # x[5] + x[6] - 1]  # the w_1 + w_2 = 1 constraint, left out here
           return F
       end
julia> x0 = [0.5; 0.5; 0.5; 0.5; 0.5; 0.5];
julia> # define a model with derivatives computed by ForwardDiff
julia> # bounds on the variables are included
julia> model = ADNLPModel(x -> 0.0, x0; c=c, lcon=zeros(6), ucon=zeros(6), lvar=zeros(6))
julia> stats = ipopt(model) # solution found after 481 iterations
julia> x = stats.solution
6-element Array{Float64,1}:
2.0064087361563248
0.46019054128577414
1.4990976942166683
1.0902146315580712
0.6949587215544324
0.3183161594144738
julia> x[5] + x[6]
1.0132748809689063
Choosing a different starting point might give you the solution you’re looking for.
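For example, here is a minimal (untested) sketch of trying a few random starting points with the same formulation; it reuses c and the bounds above, and simply reports how close w_1 + w_2 comes to 1 at each solution found:

using Random

# hypothetical multistart loop, not part of the original answer;
# `c` is the residual function defined above
for seed in 1:5
    Random.seed!(seed)
    x0r = rand(6)   # random starting point in [0, 1]^6
    m = ADNLPModel(x -> 0.0, x0r; c=c, lcon=zeros(6), ucon=zeros(6), lvar=zeros(6))
    s = ipopt(m, print_level=0)   # print_level=0 silences Ipopt's output
    println("seed = ", seed, ":  w1 + w2 = ", s.solution[5] + s.solution[6])
end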
As an experiment, I also tried formulating your problem as an over-determined nonlinear least-squares problem with bounds, including the term w_1 + w_2 - 1 in the residual. IPOPT doesn’t manage to solve it, but KNITRO does (KNITRO is a commercial solver, but you can download a demo version from the Artelys website):
(test-feas) pkg> add NLPModelsKnitro
julia> using NLPModelsKnitro
julia> function c(x)
           M = [2.250, 9.675, 57.263, 427.219, 3836.109, 40234.852]
           # same residual as before, plus the w_1 + w_2 = 1 term as a 7th component
           F = [M[1] - x[5]*x[3]*x[1] - x[6]*x[4]*x[2];
                M[2] - x[5]*x[3]^2*(x[1]+1)*x[1] - x[6]*x[4]^2*(x[2]+1)*x[2];
                M[3] - x[5]*x[3]^3*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^3*(x[2]+2)*(x[2]+1)*x[2];
                M[4] - x[5]*x[3]^4*(x[1]+3)*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^4*(x[2]+3)*(x[2]+2)*(x[2]+1)*x[2];
                M[5] - x[5]*x[3]^5*(x[1]+4)*(x[1]+3)*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^5*(x[2]+4)*(x[2]+3)*(x[2]+2)*(x[2]+1)*x[2];
                M[6] - x[5]*x[3]^6*(x[1]+5)*(x[1]+4)*(x[1]+3)*(x[1]+2)*(x[1]+1)*x[1] - x[6]*x[4]^6*(x[2]+5)*(x[2]+4)*(x[2]+3)*(x[2]+2)*(x[2]+1)*x[2];
                x[5] + x[6] - 1]
           return F
       end
julia> model = ADNLSModel(c, x0, 7, lvar=zeros(6))  # NLS = nonlinear least squares; 7 residual components
julia> stats = knitro(model) # solved in 154 iterations
julia> x = stats.solution
6-element Array{Float64,1}:
2.0014732680262566
0.4965607369186837
1.4997796318745336
1.0165903327466042
0.6988962139789188
0.30110695603014176
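As a quick sanity check (not part of the solver output above), you can look at the residual norm and the weight sum at this solution:

julia> using LinearAlgebra
julia> norm(c(x))    # overall least-squares residual, including the w_1 + w_2 - 1 term
julia> x[5] + x[6]   # should be close to 1 if that last residual component is small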
Finally, you can try one of our pure Julia solvers that handle bound constraints:
(test-feas) pkg> add JSOSolvers
julia> using JSOSolvers
julia> stats = tron(model) # solved in 1006 iterations
julia> stats.solution
6-element Array{Float64,1}:
1.9441520958770582
1.9441520958770582
1.5104044760203716
1.5104044760203716
0.36776275007834164
0.36776275007834164
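This solution looks qualitatively different (each pair of components coincides), so it is worth running the same check here to see which local solution fits the data better:

julia> using LinearAlgebra                      # if not already loaded
julia> norm(c(stats.solution))                  # residual of the least-squares model at TRON's solution
julia> stats.solution[5] + stats.solution[6]    # how far w_1 + w_2 is from 1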