A short while ago, I shared LinearFractional.jl (GitHub - nexteraanalytics/LinearFractional.jl) for solving linear fractional programs with JuMP. I’m now looking into a big refactor to take advantage of the new goodies in JuMP 0.19, MOI, and Julia 0.7. I see that there is now a nice example extension (JuMP.jl/JuMPExtension.jl at master · jump-dev/JuMP.jl · GitHub), which takes a totally different approach from what I have used so far.

We use the Charnes-Cooper transformation (Linear-fractional programming - Wikipedia). In the existing code, I wrap the JuMP.Model inside a LinearFractionalModel that holds a transformation variable t. Then, each time a new variable is added, I transform it by multiplying its constant term by the t-variable before adding it to the containing model. I use a similar approach for the constraints.
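For context, the Charnes-Cooper transformation can be written out as follows (the standard form from the Wikipedia article; the symbol names here are mine):

```latex
% Original linear fractional program:
%   max (c^T x + alpha) / (d^T x + beta)   s.t.   A x <= b
% Charnes-Cooper substitutes y = t x with t = 1 / (d^T x + beta):
\begin{aligned}
\max_{y,\,t} \quad & c^{\top} y + \alpha t \\
\text{s.t.} \quad  & A y \le b\, t \\
                   & d^{\top} y + \beta t = 1 \\
                   & t \ge 0
\end{aligned}
% The original solution is recovered as x = y / t.
```

Note how every constant term in the original constraints picks up a factor of t, which is exactly the per-constraint modification described above.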

I’m wondering if the experts here think it’s wiser to instead follow JuMPExtension.jl and create a new model type that inherits from AbstractModel. Then perhaps there is easy machinery in MOI that would let me simply multiply the constant terms by the transformation t-variable just prior to solving, instead of needing to modify each variable on the way in. This seems like it might be a much simpler approach, but I’m new to all of this MOI stuff and the recent changes. Maybe there’s a better or simpler way that I’m not yet aware of.

I’d much appreciate any advice from the JuMP experts here. Thank you!

The easiest way to extend JuMP is by writing new methods for `[parse|build|add][variable|constraint]`. The JuMPExtension approach might be overkill for some extensions, but it may be needed; it is the approach used by StructJuMP.

In your case, it seems you could multiply the constant by `t` in the `addconstraint` call. To define a new method for `addconstraint`, you need new types. Here your `AbstractConstraint`s will be the classical ones, so you need a new type for the model.
You could define

```julia
struct LinearFractionalModel <: JuMP.AbstractModel
    model::JuMP.Model
    t::JuMP.VariableRef
end
```

then define all the methods defined in `JuMPExtension` for `LinearFractionalModel`. However, here, instead of creating `MyVariableRef` and `MyConstraintRef`, you just redirect all calls to the inner `model` field, except `addconstraint`, for which you modify the function by multiplying the constant term by `t` before redirecting it to the inner `model` field.
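To make that concrete, here is a rough, untested sketch of what such an override could look like (using the JuMP-0.19-style names `add_constraint`, `jump_function`, and `moi_set`; the exact method signature is an assumption on my part):

```julia
using JuMP

# Sketch: redirect constraint addition to the inner model, replacing the
# constant term b of the affine function a'x + b by b * t (Charnes-Cooper).
function JuMP.add_constraint(m::LinearFractionalModel,
                             con::JuMP.ScalarConstraint{F, S},
                             name::String = "") where {F <: JuMP.GenericAffExpr, S}
    f = JuMP.jump_function(con)   # the affine expression a'x + b
    b = f.constant
    g = f + b * m.t - b           # same expression with b replaced by b * t
    return JuMP.add_constraint(m.model,
                               JuMP.ScalarConstraint(g, JuMP.moi_set(con)),
                               name)
end
```

All the other methods (variable addition, objective setting, etc.) would simply forward to `m.model` unchanged.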

Let me know if that’s clear or if you have any questions.

Note: this should be documented here; your post helps us see what the use cases are and how the doc could be structured.

EDIT: An alternative would be to do this at the MOI level by writing an MOI layer. You would write a

```julia
struct LinearFractionalModel <: MOI.ModelLike
    model::MOI.ModelLike
    t::MOI.VariableIndex
end
```

then you redirect all calls to the inner `model` except `MOI.addconstraint!`, in which you modify the constraint, and

```julia
MOI.set(model::LinearFractionalModel, ::MOI.ObjectiveFunction{MOI.ScalarNonlinearFunction}, f::MOI.ScalarNonlinearFunction)
```

in which you error if the nonlinear function is not a fraction; otherwise you grab the numerator and denominator and then call

```julia
MOI.set(model.model, MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}(), new_obj)
```

You would then need to modify the results, e.g. `MOI.VariablePrimal`, to make the transformation transparent, e.g. by dividing `y` by `t` to get `x` because the variable primal values returned by the inner `model` field would be the values of `y` and not `x`.
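For the result transformation, I would imagine something along these lines (a hypothetical sketch only; the field names match the struct above):

```julia
# Make the transformation transparent to the user: the inner model stores
# y = t * x, so divide each primal value by the value of t to recover x.
function MOI.get(model::LinearFractionalModel, attr::MOI.VariablePrimal,
                 vi::MOI.VariableIndex)
    y = MOI.get(model.model, attr, vi)
    t = MOI.get(model.model, attr, model.t)
    return y / t
end
```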
The user will then be able to do `@NLobjective(model, Max, (x + 1) / (x - 2))`.
However, that approach is currently not feasible, as the nonlinear interface does not currently permit you to look into the expression graph (you need it to grab the numerator, denominator, …), but that will be possible once the nonlinear interface is rewritten with Cassette.


Thank you for the detailed response, @blegat, and for the offer of assistance!

Your first suggestion seems very similar to my current version, which also wraps an inner transformed model and holds the t-variable. An interesting difference is that you suggest defining `t` as a `JuMP.VariableRef` field. In my current version, it’s actually a full `JuMP.Variable`. I can do some research to understand the difference.

It seems like I will get to skip defining a bunch of methods in JuMPExtension.jl if I don’t define a new `JuMP.AbstractVariableRef` and `MyConstraintRef` type, so that’s nice!

I will experiment with your suggestion and report back if I have more questions or success. Thank you!

P.S.: Thanks for the second (MOI) approach. I think I need to study MOI/JuMP a bit more to get to this level.

I can do some research to understand the difference.

JuMP v0.18’s `JuMP.Variable` has been renamed to `JuMP.VariableRef` on JuMP master, so it’s the same thing.

Ah, just renaming! Thanks for clarifying that.

Incidentally, I started working on this, but I think I might be JuMPing the gun a bit. I got stuck just searching for basic but full working examples of solving simple problems with Julia 0.7/JuMP master/MOI with open source solvers. I understand that tests were just made to pass a few days ago, so I’ll be patient and come back to this in a bit.

Not to worry though, I’m not blocked by this yet, just trying to stay ahead of it. Thanks for your excellent work on this amazing open source product!

I have started working on updating LinearFractional to JuMP 0.19 using @blegat’s first suggested approach. It all seems to be working great except for one thing: I haven’t figured out how to specialize `JuMP.value` for `LinearFractionalModel`s. The problem is that `JuMP.value` dispatches on the variable type and we don’t have a new variable type for this extension (post-upgrade – we did in the old version). So when a user does `value(x)`, of course this returns the solved value in the transformed space, whereas we’d want the back-transformed result `value(x)/value(t)`.

It would be great if we could avoid creating a specialized variable type. Doing so would require us to write/copy code for operators, affine expressions, and other downstream methods, which is what we had in the old version.

The upgrade WIP branch is in https://github.com/focusenergy/LinearFractional.jl/tree/jump0.19. I appreciate any suggestions.

Edit: Added a PR at https://github.com/focusenergy/LinearFractional.jl/pull/7 as it might make the conversation simpler to comment on code over there. https://github.com/focusenergy/LinearFractional.jl/pull/7/files#r295872689 has a bit more on thinking about the `value` problem.

It would be great if we could avoid creating a specialized variable type. Doing so would require us to write/copy code for operators, affine expressions, and other downstream methods, which is what we had in the old version.

We made an effort with JuMP v0.19 to have a well-defined API for `AbstractVariableRef` and to have all the operators, affine expressions, etc. work on top of this API.
We have an example implementation in the tests:
https://github.com/JuliaOpt/JuMP.jl/blob/36dc0863b5fa6d73bea904f0824dfb0e25cb68d4/test/JuMPExtension.jl#L43-L200
All the operator tests, etc. are run with this example implementation to guarantee we don’t break things for custom implementations of `AbstractVariableRef` in the future.
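For anyone landing here later, the pattern boils down to something like this (a minimal, hypothetical sketch; the type name and the back-transform in `value` are specific to the linear fractional case, not part of the JuMP API):

```julia
using JuMP

# A wrapper variable that remembers the shared Charnes-Cooper t-variable.
struct LinearFractionalVariableRef <: JuMP.AbstractVariableRef
    vref::JuMP.VariableRef  # variable in the transformed (y) space
    t::JuMP.VariableRef     # the transformation variable t
end

# Operators and affine expressions work through the AbstractVariableRef API,
# so only the accessors need specializing. Back-transform on query: x = y / t.
JuMP.value(v::LinearFractionalVariableRef) = JuMP.value(v.vref) / JuMP.value(v.t)
```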

That’s awesome! Thank you @blegat. I just pushed that change to my PR and now have the first test working. Onto the rest!
