How can I require specific convergence measures in Optim.jl?

I’m using this code:

using Optim

# obj is my objective; lo, hi are the box bounds; θ0 is the starting point
results = Optim.optimize(
    obj, Optim.TwiceDifferentiableConstraints(lo, hi), θ0,
    Optim.IPNewton()
)

And getting this output:

 * Status: success

 * Candidate solution
    Final objective value:     -3.375589e+01

 * Found with
    Algorithm:     Interior Point Newton

 * Convergence measures
    |x - x'|               = 0.00e+00 ≤ 0.0e+00
    |x - x'|/|x'|          = 0.00e+00 ≤ 0.0e+00
    |f(x) - f(x')|         = 1.02e+00 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 3.01e-02 ≰ 0.0e+00
    |g(x)|                 = 7.72e+04 ≰ 1.0e-08

 * Work counters
    Seconds run:   7  (vs limit Inf)
    Iterations:    5
    f(x) calls:    94
    ∇f(x) calls:   94

As you can see, only the |x - x'| measures are satisfied, i.e. the ones that say the algorithm looks stuck in the x domain, while |f(x) - f(x')| and |f(x) - f(x')|/|f(x')| are nowhere near their tolerances. (I think a large gradient norm, |g(x)| = 7.72e+04, can be fine in a constrained problem, since the gradient of the objective need not vanish at a solution on the boundary?)
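For reference, I believe the same flags can be read off programmatically with the result accessors (assuming I'm using them correctly); per the report above, the x-based measure is the only individual one that comes back true:

# Per-measure convergence flags for `results` from the call above
Optim.converged(results)    # overall success flag
Optim.x_converged(results)  # the |x - x'| checks
Optim.f_converged(results)  # the |f(x) - f(x')| checks
Optim.g_converged(results)  # the |g(x)| check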

Is it possible to make convergence depend on only a subset of these measures? Say, I want the optimizer to stop only once |f(x) - f(x')|/|f(x')| ≤ 0.0e+00 is satisfied, regardless of whether the x-based measures already are. How do I do that?
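What I've been experimenting with is passing per-measure tolerances through Optim.Options, roughly as in the sketch below (the keyword names and the extra options argument are my reading of the docs, so I may have this wrong). As far as I can tell this only changes the thresholds, not which measures actually gate convergence:

# Sketch: set per-measure tolerances via Optim.Options.
# The f tolerances below are already the defaults, so this doesn't yet express
# "stop only on the f-based measure"; as I understand it, Optim declares
# convergence as soon as *any* of the x-, f-, or g-based checks passes.
opts = Optim.Options(
    f_reltol = 0.0,   # |f(x) - f(x')|/|f(x')|, the measure I care about
    f_abstol = 0.0,   # |f(x) - f(x')|
    g_abstol = 1e-8   # keep the default gradient tolerance
)

results = Optim.optimize(
    obj, Optim.TwiceDifferentiableConstraints(lo, hi), θ0,
    Optim.IPNewton(), opts
)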