Unfortunately I cannot practically create an MWE for this issue. I build a rather large MINLP model: 33,331 variables (2,218 binary) and 157,260 constraints, of which 7,818 are nonlinear. When optimising, after 10 minutes I get this:

```
[2022-12-01 13:24:42] Info bb_strategies.jl L326: Breaking out of strong branching as the time limit of 600.0 seconds got reached.
ERROR: OutOfMemoryError()
Stacktrace:
[1] sizehint!
@ ./array.jl:1267 [inlined]
[2] filter(f::Juniper.var"#78#79"{Vector{Int64}}, a::UnitRange{Int64})
@ Base ./array.jl:2559
[3] upd_gains_step!(tree::Juniper.BnBTreeObj, step_obj::Juniper.StepObj)
@ Juniper ~/Git_repos/Juniper.jl/src/bb_gains.jl:99
[4] upd_tree_obj!
@ ~/Git_repos/Juniper.jl/src/BnBTree.jl:424 [inlined]
[5] solve_sequential(tree::Juniper.BnBTreeObj, last_table_arr::Vector{Any}, time_bnb_solve_start::Float64, fields::Vector{Symbol}, field_chars::Vector{Int64}, time_obj::Juniper.TimeObj)
@ Juniper ~/Git_repos/Juniper.jl/src/BnBTree.jl:491
[6] solvemip(tree::Juniper.BnBTreeObj)
@ Juniper ~/Git_repos/Juniper.jl/src/BnBTree.jl:743
[7] optimize!(model::Juniper.Optimizer)
@ Juniper ~/Git_repos/Juniper.jl/src/MOI_wrapper/MOI_wrapper.jl:358
[8] optimize!
@ ~/.julia/packages/MathOptInterface/a4tKm/src/Bridges/bridge_optimizer.jl:376 [inlined]
[9] optimize!
@ ~/.julia/packages/MathOptInterface/a4tKm/src/MathOptInterface.jl:87 [inlined]
[10] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Juniper.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/a4tKm/src/Utilities/cachingoptimizer.jl:316
[11] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ JuMP ~/.julia/packages/JuMP/Z1pVn/src/optimizer_interface.jl:185
[12] optimize!(model::Model)
@ JuMP ~/.julia/packages/JuMP/Z1pVn/src/optimizer_interface.jl:155
```

Now I appreciate that the size of this model could cause the OOM error, but when I check my memory pressure (I have 64GB on an M1 Max) it remains minimal, and I still have ~20GB of memory free. Additionally, I can solve models of twice this size as an NLP with Ipopt, or as an MILP with a polyhedral relaxation and Gurobi.
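For what it's worth, the trace shows the error thrown inside `sizehint!` during a `filter` over a `UnitRange{Int64}`: in the Julia version shown in the trace, `filter` on an array pre-reserves space for `length(a)` elements before keeping any of them, so a single very large range can request one allocation bigger than the allocator will grant, even while overall memory usage is low. A minimal sketch of that failure mode (the range size here is made up purely for illustration and has nothing to do with Juniper's actual indices):

```julia
# Sketch: filter over an AbstractArray reserves length(a) slots via
# sizehint! up front, so the peak allocation request is proportional
# to the range length, not to the number of matching elements.
huge = 1:10^13                 # ~80 TB if reserved as Int64s
try
    filter(iszero, huge)       # the sizehint! reservation can fail
catch e                        # before a single element is kept
    println(typeof(e))
end
```

If something similar is happening in `upd_gains_step!`, the allocation request itself would be rejected in one shot, which would explain seeing `OutOfMemoryError` while system memory pressure stays low.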

These are the settings I'm using:

```julia
:Gurobi => Dict(
    "crossover" => 0, "presolve" => 0, "numeric_focus" => 0
),
:Ipopt => Dict(
    "linear_solver" => "ma57",
    "ma57_automatic_scaling" => "yes",
    "mu_strategy" => "adaptive",
    "ma57_pivot_order" => 2,
    "file_print_level" => 5,
    "max_iter" => 1_000_000,
    "timing_statistics" => "yes",
    "hsllib" => _get_dl_load_path() * "/libhsl.dylib"
),
:Juniper => Dict(
    "branch_strategy" => :StrongPseudoCost,
    "strong_branching_time_limit" => 600,
    "processors" => nworkers(),
    "log_levels" => [:Table, :Info],
    "traverse_strategy" => "DBFS"
),
```

Why am I getting an OOM error when my memory is nowhere near full?