How do I find `m` and `b` such that `some_function((x) -> m*x + b)` is maximized?

I have the following Julia function, which computes the expected profit of a model. I want to find `m` and `b` such that `EBuy(4, (x) -> m*x + b)` is maximized.

using Distributions

function EBuy(λ, buy; limDefault=10^2)

    function sell1(d1, m0)
        min(d1, buy(m0) + m0)
    end

    function trash1(d1, m0)
        max(m0 - d1, 0)
    end

    function remaining1(d1, m0)
        max(0, buy(m0) - max(0, d1 - m0))
    end

    function sell2(d1, m0, d2)
        r1 = remaining1(d1, m0)
        min(d2, buy(r1) + r1)
    end

    function trash2(d1, m0, d2)
        max(0, remaining1(d1, m0) - d2)
    end

    function profit(d1, m0, d2)
        profit_price = 20
        waste_price = -100

        profit_price * (sell1(d1, m0) + sell2(d1, m0, d2)) + waste_price * (trash1(d1, m0) + trash2(d1, m0, d2))
    end

    # Poisson pmfs used as weights in the expectation below
    D1Rng = Poisson(λ)
    D2Rng = D1Rng
    function D1(x)
        pdf(D1Rng, x)
    end
    D2 = D1
    M0Rng = Poisson(λ - 2)
    function M0(x)
        pdf(M0Rng, x)
    end

    function EProfit(limI=limDefault, limJ=limDefault, limK=limDefault)
        # triple sum over the truncated supports of M0 (i), D1 (j) and D2 (k)
        sum = 0
        for i in 0:limI
            Sj = 0

            for j in 0:limJ
                Sk = 0

                for k in 0:limK
                    Sk += D2(k) * profit(j, i, k)
                end

                Sj += D1(j) * Sk
            end
            if i == 0
                println("With m0=0, EProfit = $(Sj)")
            end
            sum += M0(i) * Sj
        end

        return sum
    end

    EProfit()
end

I tried this, but it did not lead me anywhere:

using ModelingToolkit
@parameters p_a p_b
sys1 = OptimizationSystem(EBuy(4, (meat) -> max(0, p_b - p_a * meat); limDefault=10^2), [], [p_a, p_b])

Update: I found a somewhat working solution (though I think a better one should be possible):

using JuMP, Ipopt
m = Model(optimizer_with_attributes(Ipopt.Optimizer, "max_cpu_time" => 60.0))
@variable(m, x1)
@variable(m, x2)
# note: the `m` argument here is the slope of the buy function, not the JuMP model above
function f(m, b)
    EBuy(4, (meat) -> max(0, b + m * meat), limDefault=10^2)
end
JuMP.register(m, :f, 2, f, autodiff=true)
@NLobjective(m, Max, f(x1, x2))
optimize!(m)
println("x1=$(value(x1)), x2=$(value(x2)), objective=$(objective_value(m)), direct objective=$(f(value(x1),value(x2)))")

I don’t really understand your problem. Is `x` fixed? AFAICT `(x) -> m*x + b` is a function, so it’s not clear to me what maximizing it means. I would have thought you want almost the opposite: you start with a function that maps `(x, m, b) -> m*x + b`, then construct a closure that holds `x` fixed and gives you a function of only `m` and `b`. Then you maximize with respect to `m` and `b`.
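
To make the closure idea concrete, something like this (the names and the fixed value are made up, just for illustration):

g(x, m, b) = m * x + b           # the three-argument function
x_fixed = 2.0                    # some fixed value of x
h(m, b) = g(x_fixed, m, b)       # closure over x_fixed: a function of m and b only

It is `h` that you would then hand to an optimizer over `m` and `b`.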


See my update; I found a partial solution. Do you understand my question based on that solution, or should I explain it more clearly?

This looks like something I’d use Optim.jl on rather than JuMP.jl.

You need to form your objective function so that it depends only on the parameters `m` and `b`:

objfun(m, b) = EBuy(4, (x) -> m*x + b)
result = optimize(p -> -objfun(p[1], p[2]), p_guess, solverchoice)

where `p_guess` holds your starting values for the parameters and `solverchoice` is one of the Optim.jl solvers. Note the sign flip, because the convention is to minimize rather than maximize.
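
Concretely, with `objfun` as above, that might look something like this (the starting values and the NelderMead choice are just placeholders, not a recommendation):

using Optim

p_guess = [1.0, 0.0]                                            # made-up starting values for m and b
result = optimize(p -> -objfun(p[1], p[2]), p_guess, NelderMead())
m_opt, b_opt = Optim.minimizer(result)                          # recovered m and b
println("m=$(m_opt), b=$(b_opt), objective=$(objfun(m_opt, b_opt))")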

I’d also say you might want to consider a global solver like particle swarm, because with all of those `min` and `max` it looks like your objective function could take a weird shape, but that’s for you to explore.
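
A rough sketch of the particle-swarm variant, reusing `objfun` and `p_guess` from above (the bounds and particle count are made up):

lower = [-10.0, -10.0]                                          # assumed search box for (m, b)
upper = [10.0, 10.0]
swarm = ParticleSwarm(lower = lower, upper = upper, n_particles = 20)
result = optimize(p -> -objfun(p[1], p[2]), p_guess, swarm, Optim.Options(iterations = 500))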
