A short while ago, I shared GitHub - nexteraanalytics/LinearFractional.jl: Linear fractional programming with Julia and JuMP, a package for solving linear fractional programs with JuMP. I'm now looking into a big refactor to take advantage of the new goodies in JuMP 0.19, MOI, and Julia 0.7. I see that there is now a nice example extension in JuMP.jl/JuMPExtension.jl at master · jump-dev/JuMP.jl · GitHub, which takes a totally different approach from what I've used so far.
We use the Charnes-Cooper transformation (Linear-fractional programming - Wikipedia). In the existing code, I wrap the JuMP.Model inside a LinearFractionalModel that carries a transformation variable t. Each time a new variable is added, I modify it by multiplying the constant term by the t variable before adding it to the containing model, and I handle constraints in a similar way.
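For concreteness, here is a minimal sketch of what that transformation looks like on a toy problem, written directly in JuMP 0.19 syntax. The example numbers and the names `y` and `t` are just for illustration and are not taken from the package:

```julia
using JuMP

# Toy problem: maximize (2x + 3) / (x + 4) subject to 0 <= x <= 10.
# Charnes-Cooper introduces t >= 0, substitutes y = t * x, normalizes the
# denominator to 1, and multiplies every constant term by t.
model = Model()

@variable(model, y >= 0)          # y stands in for t * x
@variable(model, t >= 0)          # the transformation variable

@constraint(model, y <= 10t)      # original bound x <= 10, constant scaled by t
@constraint(model, y + 4t == 1)   # denominator x + 4, scaled by t and pinned to 1
@objective(model, Max, 2y + 3t)   # numerator 2x + 3, constant scaled by t

# After attaching any LP solver and calling optimize!, the original solution
# is recovered as x = value(y) / value(t).
```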
I'm wondering whether the experts here think it would be wiser to instead follow JuMPExtension.jl and create a new model type that subtypes AbstractModel. Perhaps there is machinery in MOI that would let me simply multiply the constant terms by the transformation variable t just prior to solving, instead of modifying each variable and constraint on the way in (see the sketch below). This seems like it could be a much simpler approach, but I'm new to MOI and the 0.19 changes. Maybe there's a better or simpler way that I'm not yet aware of.
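To make the question concrete, here is the kind of skeleton I have in mind. This is purely a sketch: the struct layout and names are placeholders of mine, and none of the forwarding methods that JuMPExtension.jl defines are written.

```julia
using JuMP

# Hypothetical skeleton only: a model type that subtypes JuMP.AbstractModel,
# in the spirit of JuMPExtension.jl. The fields and constructor are my own
# placeholders, and none of the methods JuMPExtension.jl implements
# (add_variable, add_constraint, set_objective, ...) are defined here.
struct LinearFractionalModel <: JuMP.AbstractModel
    inner::Model          # the plain LP that would ultimately be solved
    t::VariableRef        # the Charnes-Cooper transformation variable
end

function LinearFractionalModel()
    inner = Model()
    t = @variable(inner, lower_bound = 0.0, base_name = "t")
    return LinearFractionalModel(inner, t)
end

# The open question is whether the constant-times-t rewriting could be done
# once, just before optimize!, via MOI's modification interface, instead of
# intercepting every variable and constraint as it is added.
```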
I’d much appreciate any advice from the JuMP experts here. Thank you!