Optim.jl ArgumentError: Value and slope at step length = 0 must be finite

Hello,

I’m minimizing the analytic-center objective for a set of inequalities (see problem 9.30 in Convex Optimization, Boyd and Vandenberghe).
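
For reference, the objective being minimized, written out to match the code below, is

f(x) = − Σᵢ log(1 − aᵢᵀx) − Σⱼ log(1 − xⱼ²),   i = 1..m, j = 1..n,

whose open domain is {x : aᵢᵀx < 1 for all i, |xⱼ| < 1 for all j}.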

I generate a random instance as follows:

n = 8
m = 7
A = rand(m, n)

Then I create the necessary functions:

f(x) = - sum(log(1-dot(A[i,:],x)) for i=1:m) - sum(log(1 - x[i]^2) for i=1:n)

function ∇f!(g, x)
    g = sum(A[i,:]/(1-dot(A[i,:],x)) for i=1:m) + 2*x./(1-x.^2)
end

function Δf!(h, x)
    h = sum(A[i,:]*transpose(A[i,:]) / (1-dot(A[i,:],x))^2 for i=1:m) +
        diagm( (2 + 2*x.^2) ./ (1-x.^2).^2 )
end
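
(Aside, a standalone sketch of my own rather than part of the original post: hand-written derivatives like these can be sanity-checked against central finite differences at a strictly feasible point, using only Base Julia.)

x0 = 0.01*ones(n)                  # strictly feasible: A*x0 .< 1 and abs.(x0) .< 1
g_analytic = similar(x0)
∇f!(g_analytic, x0)
h = 1e-6
g_fd = map(1:n) do i               # central finite differences
    e = zeros(n); e[i] = 1.0
    (f(x0 + h*e) - f(x0 - h*e)) / (2h)
end
maximum(abs.(g_analytic .- g_fd))  # ≈ 0 only if ∇f! really fills g in place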

If I optimize f(x) without using the gradient and Hessian information, I get:

res = optimize(f, zeros(n), NelderMead())

Results of Optimization Algorithm
 * Algorithm: Nelder-Mead
 * Starting Point: [0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0]
 * Minimizer: [-0.45985687074938164,-0.4699442909210658, ...]
 * Minimum: -5.254794e+00
 * Iterations: 374
 * Convergence: true
   *  √(Σ(yᵢ-ȳ)²)/n < 1.0e-08: true
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 639

However, if I try to use this information (with GradientDescent(), Newton(), or NewtonTrustRegion()):

optimize(f, ∇f!, zeros(n), GradientDescent())

ArgumentError: Value and slope at step length = 0 must be finite.
in optimize at Optim/src/multivariate/optimize/interface.jl:119
in #optimize#132 at Optim/src/multivariate/optimize/interface.jl:121
in optimize at Optim/src/multivariate/optimize/optimize.jl:49
in update_state! at Optim/src/multivariate/solvers/first_order/gradient_descent.jl:74
in perform_linesearch! at Optim/src/utilities/perform_linesearch.jl:40
in  at LineSearches/src/hagerzhang.jl:101
in  at LineSearches/src/hagerzhang.jl:116
optimize(f, ∇f!, Δf!, zeros(n), Newton())

DomainError()
in optimize at Optim/src/multivariate/optimize/interface.jl:126
in optimize at Optim/src/multivariate/optimize/optimize.jl:49
in update_state! at Optim/src/multivariate/solvers/second_order/newton.jl:83
in perform_linesearch! at Optim/src/utilities/perform_linesearch.jl:40
in  at LineSearches/src/hagerzhang.jl:101
in  at LineSearches/src/hagerzhang.jl:215
in  at LineSearches/src/LineSearches.jl:87
in value_gradient! at NLSolversBase/src/interface.jl:75
in value_gradient!! at NLSolversBase/src/interface.jl:88
in  at NLSolversBase/src/objective_types/abstract.jl:14
in f at example_convex.jl:144
in mapfoldl at base/reduce.jl:71
in  at base/<missing>
in log at base/math.jl:419 
in nan_dom_err at base/math.jl:300
optimize(f, ∇f!, Δf!, zeros(n), NewtonTrustRegion())

BoundsError: attempt to access 0-element Array{Float64,1} at index [1]
in optimize at Optim/src/multivariate/optimize/interface.jl:126
in optimize at Optim/src/multivariate/optimize/optimize.jl:49
in update_state! at Optim/src/multivariate/solvers/second_order/newton_trust_region.jl:266
in #solve_tr_subproblem!#60 at Optim/src/multivariate/solvers/second_order/newton_trust_region.jl:85
in getindex at base/array.jl:554

The source of these errors is not evident to me.
Thank you!

I get a negative argument inside a log when running your code. Also, your gradient function is wrong:

function ∇f!(g, x)
    g = sum(A[i,:]/(1-dot(A[i,:],x)) for i=1:m) + 2*x./(1-x.^2)
end

does not do what you think it does: it creates a new local variable g that shadows the argument, so the caller's array is never updated. Use .= or copy! instead. @pkofod possibly we could have a clearer error message here?
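
To spell out the distinction (a minimal standalone sketch, not from the original reply):

function wrong!(g, x)
    g = 2 .* x    # rebinds the local name g; the caller's array is untouched
end

function right!(g, x)
    g .= 2 .* x   # writes element-wise into the array the caller passed in
end

g = zeros(3)
wrong!(g, [1.0, 2.0, 3.0]); g   # still [0.0, 0.0, 0.0]
right!(g, [1.0, 2.0, 3.0]); g   # now [2.0, 4.0, 6.0]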

Great!
Now NewtonTrustRegion() works:

Results of Optimization Algorithm
 * Algorithm: Newton's Method (Trust Region)
 * Starting Point: [0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0]
 * Minimizer: [-0.4642088906377006,-0.5205478310783768, ...]
 * Minimum: -5.958903e+00
 * Iterations: 11
 * Convergence: true
   * |x - x'| ≤ 0.0e+00: false 
     |x - x'| = 1.34e-08 
   * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
     |f(x) - f(x')| = -2.98e-16 |f(x)|
   * |g(x)| ≤ 1.0e-08: true 
     |g(x)| = 8.97e-09 
   * Stopped by an increasing objective: false
   * Reached Maximum Number of Iterations: false
 * Objective Calls: 12
 * Gradient Calls: 12
 * Hessian Calls: 11

It seems you are right about the negative argument inside the logarithm. So I have changed f(x) to its extended-value extension, which returns Inf outside the domain:

f(x) = any(A*x .>= 1) ? Inf : ( - sum(log(1-dot(A[i,:],x)) for i=1:m) - sum(log(1 - x[i]^2) for i=1:n) )

function ∇f!(g, x)
    g .= sum(A[i,:]/(1-dot(A[i,:],x)) for i=1:m) + 2*x./(1-x.^2)
end

function Δf!(h, x)
    h .= sum(A[i,:]*transpose(A[i,:]) / (1-dot(A[i,:],x))^2 for i=1:m) +
        diagm( (2 + 2*x.^2) ./ (1-x.^2).^2 )   # d²/dx² of -log(1-x²) is (2+2x²)/(1-x²)²
end
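
A quick behavioral check of the extended-value version (my own sketch): f now returns Inf instead of throwing once the linear constraints are violated.

f(zeros(n))     # 0.0, strictly feasible
f(10*ones(n))   # Inf, since A has nonnegative entries and some A*x .>= 1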

Now GradientDescent() and Newton() both throw the same error:

DomainError()
in optimize at Optim/src/multivariate/optimize/interface.jl:119
in #optimize#132 at Optim/src/multivariate/optimize/interface.jl:121
in optimize at Optim/src/multivariate/optimize/optimize.jl:49
in update_state! at Optim/src/multivariate/solvers/first_order/gradient_descent.jl:74
in perform_linesearch! at Optim/src/utilities/perform_linesearch.jl:40
in  at LineSearches/src/hagerzhang.jl:101
in  at LineSearches/src/hagerzhang.jl:136
in  at LineSearches/src/LineSearches.jl:87
in value_gradient! at Optim/src/Manifolds.jl:47 
in value_gradient! at NLSolversBase/src/interface.jl:75
in value_gradient!! at NLSolversBase/src/interface.jl:88
in  at NLSolversBase/src/objective_types/abstract.jl:14
in f at example_convex.jl:145
in mapfoldl at base/reduce.jl:71
in  at base/<missing>
in log at base/math.jl:419 
in nan_dom_err at base/math.jl:300 

These solvers are not designed to handle functions that return infinities. Fix your initialization to get a feasible point.
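
A quick feasibility check for a candidate starting point, as a sketch:

x0 = zeros(n)
all(A*x0 .< 1) && all(abs.(x0) .< 1)   # true means x0 is strictly feasible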

Yes, x0 = zeros(n) is always a feasible starting point; at x = zeros(n):

- sum(log(1-dot(A[i,:],x)) for i=1:m) - sum(log(1 - x[i]^2) for i=1:n) = 0

Still, it throws the same error.

You’re missing one part: the extended-value function also has to guard the |x[i]| < 1 constraints. This works:

f(x) = (any(A*x .>= 1) || any(abs.(x) .>= 1)) ? Inf : ( - sum(log(1-dot(A[i,:],x)) for i=1:m) - sum(log(1 - x[i]^2) for i=1:n) )
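
For completeness, here is everything put together as I understand it, as a sketch in current Julia syntax (on Julia 1.x, dot and diagm live in LinearAlgebra, and diagm takes a 0 => v pair):

using Optim, LinearAlgebra

n, m = 8, 7
A = rand(m, n)

# extended-value objective: Inf outside the domain, so line searches can backtrack
f(x) = (any(A*x .>= 1) || any(abs.(x) .>= 1)) ? Inf :
       -sum(log(1 - dot(A[i,:], x)) for i = 1:m) - sum(log(1 - x[i]^2) for i = 1:n)

function ∇f!(g, x)
    # .= writes into the caller's array instead of rebinding the local name
    g .= sum(A[i,:] / (1 - dot(A[i,:], x)) for i = 1:m) .+ 2 .* x ./ (1 .- x.^2)
end

function Δf!(h, x)
    h .= sum(A[i,:] * transpose(A[i,:]) / (1 - dot(A[i,:], x))^2 for i = 1:m) .+
         diagm(0 => (2 .+ 2 .* x.^2) ./ (1 .- x.^2).^2)
end

res = optimize(f, ∇f!, Δf!, zeros(n), Newton())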

Thank you very much! It works perfectly!

Yes, the line search could/should inform users of the likely reasons instead of just stating that the value at a zero step length must be finite 🙂
