How can I check if a function is always positive?

Suppose I have a user-defined mathematical function, say

f(x::Real) = x*exp(x)

I would like to check if f(x) > 0 for all x \in [x_l, x_u] \subset \mathbb{R}.

The best ideas I have are either

x = xₗ:0.01:xᵤ
if any(f.(x) .≤ 0)
    throw(ErrorException("Input function must be positive for x ∈ [$xₗ, $xᵤ]."))
end

or this option, which only checks dynamically at call time, when I would prefer the check to happen statically (if possible):

function f_internal(x)
    val = f(x)
    if val ≤ 0
        throw(ErrorException("Input function must be positive for x ∈ [$xₗ, $xᵤ]."))
    end
    return val
end

but is there a better/Julian way to do this?

You should check out IntervalArithmetic.jl.


Oh that’s perfect. Thanks!

The one warning is that the basic interval methods can be slow if your function is complicated or badly behaved in some technical ways. If IntervalArithmetic doesn’t work well, you might want to try TaylorModels, which is a more sophisticated (but higher-overhead) technique.


Oh thanks! If I have issues with IntervalArithmetic, then yes I’ll check out TaylorModels.

Here’s an example using IntervalArithmetic.jl.

The main method / goal of interval arithmetic is to calculate the range of a function f over an input set X, i.e. the set \mathcal{R}(f; X) := \{f(x): x \in X \}. Calculating this exactly is hard (impossible in general, equivalent to global optimisation of the function), but interval arithmetic gives a cheap way of calculating an enclosure Y of this range, i.e. an interval that is guaranteed to contain it: \mathcal{R}(f; X) \subseteq Y.

The idea is to split up a complicated function f into simple pieces and make sure you bound the range for each piece. Putting the pieces together gives you an enclosure for the complicated function (the “fundamental theorem of interval arithmetic”), but which is in general an over-estimate.

julia> using IntervalArithmetic

julia> f(x) = x * exp(x)
f (generic function with 1 method)

julia> X = interval(1, 2)
[1, 2]

julia> f(X)
[2.71828, 14.7782]

julia> f(X).lo > 0
true

This constitutes a rigorous proof that f(x) > 0 for all x \in X, where X = [1, 2] is the interval of all real numbers between 1 and 2.
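As a sketch of how you might package this into the kind of guard from the original question (`assert_positive` is a hypothetical name I'm making up; note that interval arithmetic can fail to prove positivity even when it holds, so a thrown error here means "inconclusive", not "disproved"):

```julia
using IntervalArithmetic

# Hypothetical guard: error unless interval arithmetic proves f > 0 on [xₗ, xᵤ].
function assert_positive(f, xₗ, xᵤ)
    Y = f(interval(xₗ, xᵤ))  # enclosure of the range of f over [xₗ, xᵤ]
    if Y.lo ≤ 0
        throw(ErrorException("Could not prove f > 0 for x ∈ [$xₗ, $xᵤ]."))
    end
    return nothing
end

assert_positive(x -> x * exp(x), 1, 2)  # passes: the enclosure's lower bound is > 0
```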

In fact you can also do things like

julia> f(0..∞)
[0, ∞]

to prove this for any x \ge 0!

But this technique has “some” (many) limitations, where it can’t prove things that you know to be true, for example

julia> g(x) = x^2 - 2x

julia> g(100..∞)
[-∞, ∞]

This just says that the true range of the function over that input set is contained in (i.e. is a subset of) the given result.

In this case we (humans) actually know that x^2 is much bigger than 2x in that interval, and hence the function is positive there. In this particular case you could deduce that by calculating the derivative, e.g. using ForwardDiff.jl (which works out of the box with intervals!). But in general I believe it is a hard problem. (But I am very willing to be proved wrong here!)


The concept is awesome, and I’m loving the implementation!

The example with g(x) = x^2 - 2x scares me though, because g is a composition of elementary functions and IntervalArithmetic doesn’t process that it’s positive for all x \in [100, \infty).

So, I’ll try and be careful with the implementation I have in my code. Luckily, I just need this functionality for checking output range rather than complicated and involved transformations on the input range.

Thanks @dpsanders!


In general the over-estimation of the range comes when a variable is repeated more than once in an expression, as is the case for x^2 - 2x. This is due to the definition of subtraction of two intervals as

X - Y := \{x - y: x \in X, y \in Y \}

So that if e.g. X = [0, 1] then X - X = [-1, 1] instead of 0. This is known as the dependency effect. It is difficult to get rid of, unfortunately.
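You can see this directly at the REPL (a sketch; exact printing varies by IntervalArithmetic.jl version). Since `X^2` is evaluated as a single power operation, rewriting x^2 - 2x as (x - 1)^2 - 1, where x occurs only once, gives the exact range:

```julia
using IntervalArithmetic

X = interval(0, 1)

X - X          # [-1, 1] rather than [0, 0]: the dependency effect
X^2 - 2X       # [-2, 1]: overestimates the true range [-1, 0]
(X - 1)^2 - 1  # [-1, 0]: x occurs once, so this is exact
```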

In the case of g(x) = x^2 - 2x you can do the following:

julia> using ForwardDiff

julia> X = 100..∞
[100, ∞]

julia> g(X.lo)
9800.0

julia> ForwardDiff.derivative(g, X)
[198, ∞]

Since the derivative is positive everywhere in X, the function is increasing. Since it starts out positive, it remains positive.

But then

julia> h(x) = x^3 - 3x^2
h (generic function with 1 method)

julia> ForwardDiff.derivative(h, X)
[-∞, ∞]

so in that case you would need to go to the second derivative.
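To spell that out, the argument chains: prove h″ > 0 on the interval, conclude h′ is increasing there, then check h′ and h at the left endpoint. A sketch (nesting `ForwardDiff.derivative` like this should work with intervals, but I haven't benchmarked it):

```julia
using IntervalArithmetic, ForwardDiff

h(x) = x^3 - 3x^2
h′(x) = ForwardDiff.derivative(h, x)
h″(x) = ForwardDiff.derivative(h′, x)

X = 100..∞

h″(X).lo > 0  # h″ is provably positive on X, so h′ is increasing there
h′(X.lo) > 0  # h′(100) = 29400 > 0, so h′ > 0 on X and h is increasing
h(X.lo) > 0   # h(100) = 970000 > 0, hence h > 0 on all of X
```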

For a complicated transcendental function I’m not sure what you would do.


Ah, that makes sense. Utilize the derivatives to enable checking of the range, in a sense, semi-manually constructing the range of the function in question.

@dpsanders or anyone, how well does IntervalArithmetic play with interpolated functions such as solutions to DifferentialEquations?

Good question. I’m guessing they’re just polynomials, so you’re probably OK?
But the output of a solver from DifferentialEquations is never rigorous.

If you want guaranteed correct (rigorous) bounds for solutions of ODEs you need TaylorModels.jl.


Btw, @dpsanders made a really nice hands-on tutorial on this topic:


It also matters how you write it:

julia> f1(x) = x^2 - 2x
f1 (generic function with 1 method)

julia> f1(100..Inf)
[-∞, ∞]

julia> f2(x) = x*(x - 2)     # mathematically identical to f1
f2 (generic function with 1 method)

julia> f2(100..Inf)
[9800, ∞]
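Another option when a rewrite isn't obvious is to subdivide the input interval and take the hull of the piecewise enclosures; the overestimate shrinks as the pieces get smaller, though this only works over finite intervals. A sketch, assuming `mince` (IntervalArithmetic.jl's uniform-subdivision helper) is available in your version:

```julia
using IntervalArithmetic

f1(x) = x^2 - 2x

X = interval(100, 200)

f1(X)                         # direct evaluation: [9600, 39800], an overestimate
reduce(∪, f1.(mince(X, 64)))  # hull of 64 sub-enclosures: much closer to the
                              # true range [9800, 39600]
```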

Why not use BlackBoxOptim.jl to find the global minimum, and compare with 0?


One alternative here is to use ApproxFun.jl, e.g.

julia> using ApproxFun

julia> checkpos(f, xₗ, xᵤ) = minimum(Fun(f, xₗ..xᵤ)) > 0
checkpos (generic function with 1 method)

julia> checkpos(x -> x*exp(x), 3, 4)
true

julia> checkpos(x -> x*exp(x), -2, 2)
false

julia> checkpos(x -> x^2 - 2x, 100, 200)
true

This works by constructing a polynomial approximant to the function, to nearly machine precision, and then finding the minimum of that polynomial.

Unlike interval arithmetic, it doesn’t have a dependency problem where it drastically overestimates the range. On the other hand, it might give a false positive/negative if roundoff errors cause the function approximation to slightly cross the axis, and it only allows you to check finite (and reasonably small) intervals.
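One way to guard against that roundoff edge case is to demand a small positive margin instead of strict positivity (the tolerance below is an arbitrary illustrative choice, not an ApproxFun recommendation):

```julia
using ApproxFun

# Same idea as checkpos above, but require the minimum to clear a safety
# margin; tol = 1e-8 is an arbitrary choice for illustration.
checkpos_tol(f, xₗ, xᵤ; tol = 1e-8) = minimum(Fun(f, xₗ..xᵤ)) > tol

checkpos_tol(x -> x * exp(x), 3, 4)  # true: the minimum 3e³ ≈ 60.26 clears the margin
```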


That’s a good point, you can also use IntervalOptimisation.jl and IntervalConstraintProgramming.jl to do this, at least over finite intervals.