Help with Performance in Large Mixed Complementarity Problem

With a bit of re-arranging, I’ve simplified your problem down to:

using JuMP
import BitOperations
import PATHSolver

function possible_shock_combinations(J)
    return [BitOperations.bget(j, i) for j in 0:(2^J-1), i in 0:(J-1)]
end

function shock_probabilities(sequences, probs)
    @assert size(sequences, 2) == length(probs)
    return map(1:size(sequences, 1)) do i
        return prod(
            s_i ? p_i : (1 - p_i) for (s_i, p_i) in zip(sequences[i, :], probs)
        )
    end
end
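As a quick sanity check (a minimal sketch that uses bit-shifts in place of BitOperations, with made-up marginal probabilities), the 2^J rows enumerate every shock outcome, so the probabilities should sum to one:

```julia
J = 3
p = [0.2, 0.5, 0.9]  # example marginal shock probabilities (made up)
# Same enumeration as possible_shock_combinations, via a bit-shift:
sequences = [isodd(j >> i) for j in 0:(2^J-1), i in 0:(J-1)]
probs = map(1:2^J) do r
    prod(s ? p_i : 1 - p_i for (s, p_i) in zip(sequences[r, :], p))
end
sum(probs)  # ≈ 1.0: the 2^J outcomes form a probability distribution
```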

function solve_complementarity(param1, param2, shock, probs, w, i)
    J = size(shock, 2)  # take J from the inputs rather than global scope
    model = Model(PATHSolver.Optimizer)
    set_silent(model)
    @variable(model, x[1:J] >= 0, start = 0.5)
    A = 1 .- shock .* (1 - param2)
    b = -(1 - param1) * w[i]
    @constraint(model, c[j in 1:J],
        b * sum(probs[s] * A[s, j] / (A[s, :]' * x) for s in 1:2^J) + w[j] ⟂ x[j]
    )
    optimize!(model)
    return value.(x)
end

J = 10
param1 = 0.5
param2 = 0.1
prob = fill(0.5, J)
shock = possible_shock_combinations(J)
probs = shock_probabilities(shock, prob)
w = ones(J)
i = 1
x = solve_complementarity(param1, param2, shock, probs, w, i)

The problematic bit is

sum(probs[s] * A[s, j] / (A[s, :]' * x) for s in 1:2^J)

For J = 30, this is a sum with 2^30 ≈ 1 billion elements. Each element has the form a / (b' * x), so the full constraint is an expression with something like (2 * J) * 2^J nodes in its expression graph. And even once you have built it, JuMP still has to compute derivatives of it. JuMP isn’t really built for expressions of this size.
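To put rough numbers on that (back-of-envelope, assuming about 2 * J nodes per summand):

```julia
J = 30
terms = 2^J          # 1_073_741_824 summands per constraint
nodes = 2 * J * 2^J  # ≈ 6.4e10 expression-graph nodes, before derivatives
```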

The usual workaround is to write a user-defined function, but PATHSolver doesn’t support them yet (the nonlinear interface is still quite new). I’ve opened an issue: Support MOI.UserDefinedFunction · Issue #93 · chkwon/PATHSolver.jl · GitHub
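For reference, once support lands, the workaround would look roughly like this. This is a sketch only: the toy `A` and `probs` stand in for the real data, and `f` and `op_f` are hypothetical names. The idea is that the solver then sees a single operator call per constraint instead of ~(2 * J) * 2^J expression-graph nodes:

```julia
using JuMP

# Toy data standing in for the real problem (hypothetical values).
J = 2
A = [1.0 0.9; 0.8 1.0; 1.0 1.0; 0.7 0.6]
probs = [0.25, 0.25, 0.25, 0.25]

# Collapse the expensive sum into one scalar Julia function.
function f(j, x...)
    xv = collect(x)
    return sum(probs[s] * A[s, j] / (A[s, :]' * xv) for s in 1:2^J)
end

model = Model()
@variable(model, x[1:J] >= 0, start = 0.5)
# Register f as a user-defined operator taking 1 + J scalar arguments.
@operator(model, op_f, 1 + J, f)
@expression(model, expr[j in 1:J], op_f(j, x...))
```

The complementarity constraint would then be written against `expr[j]`, e.g. `b * expr[j] + w[j] ⟂ x[j]`, and JuMP would differentiate `f` automatically instead of building the giant expression graph.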