I am trying to serialize JuMP models to the .mps, .lp, and .rew formats via the `write_to_file` method, but I get the same error with all three formats, while the .nl conversion succeeds:

```
MathOptInterface.UnsupportedConstraint{MathOptInterface.ScalarNonlinearFunction, MathOptInterface.EqualTo{Float64}}: MathOptInterface.ScalarNonlinearFunction-in-MathOptInterface.EqualTo{Float64} constraint is not supported by the model.
```

The models I am using are PGLib models instantiated as JuMP models.
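For context, a toy model with a single nonlinear equality constraint appears to reproduce the same behavior (this is my own minimal sketch, not the actual PGLib model; the constraint and file names are made up):

```julia
using JuMP

model = Model()
@variable(model, x >= 0)
@variable(model, y >= 0)
# A nonlinear equality, analogous to the power-flow equations in PGLib models.
# This becomes a ScalarNonlinearFunction-in-EqualTo constraint in MOI.
@constraint(model, x * sin(y) == 1)
@objective(model, Min, x + y)

write_to_file(model, "toy.nl")   # succeeds: the NL format supports nonlinear expressions
write_to_file(model, "toy.mps") # throws MOI.UnsupportedConstraint, as above
```

So my understanding is that MPS/LP/REW can only represent linear (or quadratic) constraints, which is why only the .nl export works, but I would like to confirm this.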

My main objective is to solve these models with Gurobi, but solving the JuMP model directly with `Gurobi.Optimizer` raises the same error.
Is there any way to bypass this error by working with the Gurobi C API through `direct_model()`?
I tried:

```julia
model = direct_model(Gurobi.Optimizer())
instantiate_model(filepath, model)  # to load the model
JuMP.optimize!(model)
```

It runs, and the log indicates that Gurobi is being used; the termination status is 1, but the objective value is zero.

```
Barrier solved model in 0 iterations and 0.01 seconds (0.00 work units)
Optimal objective 0.00000000e+00.
```
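Since a 0-iteration solve with a zero objective makes me suspect the model was never actually loaded into the Gurobi backend, I also tried inspecting it after `optimize!` with the standard JuMP query functions (the exact output below is from my run of the snippet above):

```julia
using JuMP

# If these are zero, the nonlinear constraints never made it into the
# direct model, which would explain the trivial 0-iteration solve.
@show num_variables(model)
@show num_constraints(model; count_variable_in_set_constraints = false)

@show termination_status(model)  # MOI status code
@show primal_status(model)       # was a feasible point actually found?
if result_count(model) > 0
    @show objective_value(model)
end
```

Is this the right way to check whether the model content was dropped during loading?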

The model works perfectly well with Ipopt and other solvers.