A short while ago, I shared https://github.com/focusenergy/LinearFractional.jl for solving linear-fractional programs with JuMP. I’m now looking into a big refactor to take advantage of the new goodies in JuMP 0.19, MOI, and Julia 0.7. I see that there is now a nice example extension in https://github.com/JuliaOpt/JuMP.jl/blob/master/test/JuMPExtension.jl, and it takes a totally different approach than the one I’ve used so far.

We use the Charnes-Cooper transformation (https://en.wikipedia.org/wiki/Linear-fractional_programming). In the existing code, I wrap the JuMP.Model inside a LinearFractionalModel that holds the transformation variable t. Then, each time a variable or constraint is added, I rewrite the underlying expression, multiplying its constant term by t, before adding it to the wrapped model. The objective is handled the same way.
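For anyone not familiar with the transformation, here is the standard statement (notation from the Wikipedia article above, not identifiers from my package), which shows why every constant term picks up a factor of t:

```latex
% Linear-fractional program, assuming d^T x + \beta > 0 on the feasible set:
\max_{x} \; \frac{c^{\top} x + \alpha}{d^{\top} x + \beta}
\quad \text{s.t.} \quad A x \le b.

% Charnes-Cooper substitution:
% y = x / (d^T x + \beta), \quad t = 1 / (d^T x + \beta)
% yields the equivalent linear program:
\max_{y,\,t} \; c^{\top} y + \alpha t
\quad \text{s.t.} \quad
A y \le b t, \qquad
d^{\top} y + \beta t = 1, \qquad
t \ge 0.
```

Note that the constants \alpha, b, and \beta all become coefficients of t in the transformed problem, which is exactly the rewriting my wrapper performs on each expression as it is added.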

I’m wondering if the experts here think it’s wiser to instead follow the JuMPExtension.jl pattern and create a new model type that subtypes JuMP.AbstractModel. Then perhaps there is machinery in MOI that would let me multiply the constant terms by the transformation variable t just prior to solving, instead of needing to modify each expression on the way in. That seems like it might be a much simpler approach, but I’m new to MOI and the 0.19 changes. Maybe there’s a better or simpler way that I’m not yet aware of, too.

I’d much appreciate any advice from the JuMP experts here. Thank you!