Lagrangian Function

Hello everyone,

I am trying to solve an optimization problem using the Lagrangian function. Which package is best suited for this task? In particular, I need the equality-constraint multipliers to be returned, as they represent a physical quantity that I need to measure.

Best regards

1 Like

Not an expert in this area, but maybe have a look at JuMP; there’s a discussion on Lagrange multipliers here: Lagrange multipliers (duals) in JuMP 0.19

2 Likes

Hey there

Do you mean that you want to solve a problem in mechanics by numerically minimizing the action as a function of the trajectory? Or are you solving some other kind of constrained optimization problem by the method of Lagrange multipliers? I’m going to guess the latter; please say if that’s wrong.

I don’t know of a numerical method that uses Lagrange multipliers the way you would with pen and paper. However, if you know the minimizer, you should be able to find the gradients of the cost and the constraint functions using ForwardDiff. The Lagrange multipliers are just the coefficients of the cost gradient expanded over the constraint gradients, right?
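
For instance, here’s a minimal sketch of that idea (the cost f, constraint h, and minimizer xstar are made-up placeholders, not from your problem):

```julia
using ForwardDiff

# Made-up cost and a single equality constraint, for illustration only.
f(x) = x[1]^2 + x[2]^2      # cost to minimize
h(x) = x[1] + x[2] - 1      # equality constraint h(x) = 0

xstar = [0.5, 0.5]          # suppose your optimizer returned this minimizer

∇f = ForwardDiff.gradient(f, xstar)
∇h = ForwardDiff.gradient(h, xstar)

# Stationarity: ∇f + ν∇h ≈ 0, so the multiplier is the (least-squares)
# coefficient of ∇f expanded over ∇h:
ν = -(∇h \ ∇f)              # here ν == -1.0 (sign depends on your convention)
```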

The stock answer to numerical optimization in Julia is Optim. I’ve linked to the section of the manual that says how to apply constraints. This is still under development, and you might need to write your own functions to project (tangent) vectors onto the manifold that satisfies your constraint. When I’ve struggled with Optim, the developers have been very helpful.
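
If it helps, here is a rough sketch of Optim’s interior-point solver with simple box constraints, as I understand the current API (check the constraints section of the manual for general nonlinear constraints):

```julia
using Optim

# Made-up objective, for illustration only.
f(x) = (x[1] - 1)^2 + (x[2] - 2)^2

x0  = [0.5, 0.5]                                   # must start strictly inside the box
df  = TwiceDifferentiable(f, x0; autodiff = :forward)
dfc = TwiceDifferentiableConstraints([0.0, 0.0], [1.0, 1.0])  # 0 ≤ x ≤ 1

res = optimize(df, dfc, x0, IPNewton())
Optim.minimizer(res)   # ≈ [1.0, 1.0]: the projection of (1, 2) onto the box
```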

I hope that is what you were looking for.

3 Likes

It is not clear from your question which symbols are the decision variables and which are constants, and which terms come from equality constraints and which come from inequality constraints. These will determine which package to use, e.g. a linear programming solver, a convex optimization solver or a generic nonlinear optimization solver. JuMP + Ipopt or JuMP + NLopt are probably good places to start if you are new to Julia as the syntax is easy. Optim also has a constrained optimization solver but using it is a bit harder.
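
For reference, a JuMP + Ipopt model looks something like this (a toy convex model; note that `Model(Ipopt.Optimizer)` is the newer syntax, while JuMP 0.19 used `Model(with_optimizer(Ipopt.Optimizer))`):

```julia
using JuMP, Ipopt

# Toy convex model, for illustration only.
model = Model(Ipopt.Optimizer)
@variable(model, x[1:2] >= 0)
@objective(model, Min, x[1]^2 + x[2]^2)
@constraint(model, con, x[1] + x[2] == 1)  # named so its dual can be queried
optimize!(model)

value.(x), dual(con)   # dual(con) is the multiplier (up to sign convention)
```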

1 Like

I am solving a power system problem. The objective function and the equality and inequality constraints are all linear.

I care about the multipliers of the equality and inequality constraints, so I want a solver that lets me retrieve their values.

The problem is related to power systems; it is linear, and I am interested in the method of Lagrange multipliers because the values of the multipliers reflect the price of power.

[image: problem formulation]
This is the problem formulation
P and theta are my variables, but D and zeta are given

Any linear programming solver can give you the Lagrange multipliers without explicitly forming the Lagrangian function. For example, you can use JuMP with GLPK or Clp and use the dual function to get the dual variable for any constraint (Query Solutions · JuMP).
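
Something like this, as a sketch (a made-up two-generator dispatch LP standing in for your power-system model):

```julia
using JuMP, Clp

# Toy economic-dispatch LP: two generators, one demand-balance equality.
model = Model(Clp.Optimizer)
@variable(model, 0 <= p[1:2] <= 10)             # generator outputs, capacity 10
@objective(model, Min, 1.0 * p[1] + 2.0 * p[2]) # marginal costs 1 and 2
@constraint(model, balance, p[1] + p[2] == 12)  # demand D = 12
optimize!(model)

dual(balance)   # multiplier on the balance constraint: the marginal power price
```

GLPK works the same way; just swap in GLPK.Optimizer.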

1 Like

Okay, I will check.

Can’t I use it with Cbc?

You could. Cbc is an extension of Clp for mixed integer optimization.

I can’t figure out the error here, at line 62.

Yeah, for the duals you might want to use Clp. I haven’t used this stuff for a while, so you will get a better response from the experts if you change the domain of the topic to #domain:opt.

1 Like

Thanks a lot

I’ve gone ahead and changed the domain

Cbc is for mixed-integer problems. It doesn’t compute duals. Use Clp or GLPK.

Depending on the sign you expect from the Lagrangian, you may want to use shadow_price instead of dual: Constraints · JuMP
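
For the toy dispatch model above, both queries are one-liners. Roughly, shadow_price gives the classical sensitivity of the objective to the right-hand side, while dual follows conic-duality conventions; see the linked docs for the exact sign rules:

```julia
dual(balance)          # conic-duality sign convention
shadow_price(balance)  # classical LP sensitivity sign convention
```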

Note that, with any optimization algorithm, you can easily compute the Lagrange multipliers yourself once you have the optimum: it’s just a system of linear equations (from the KKT conditions) in the gradients of the objective and constraints at the optimum, which you can solve with a backslash operation in Julia.

For example, if you are minimizing f_0(x) subject to h_i(x)=0 for i=1\ldots p, and your optimization algorithm gives you an approximate minimum x^*, then to find the Lagrange multipliers \nu_i for your constraints you would solve

\nu = \begin{bmatrix} \nabla h_1 & \cdots & \nabla h_p \end{bmatrix} \setminus -\nabla f_0

where we have a matrix whose columns are the constraint gradients (evaluated at x^*) and \ denotes the Julia backslash operation, which will solve the equations in the least-squares sense assuming you have fewer constraints than variables x (this is necessary since x^* will only be an approximate minimum).

If you have inequality constraints, it is the same thing except that you need only include active constraints in your linear system; the Lagrange multipliers of inactive constraints are zero (“complementary slackness”).
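
As a concrete sketch of that recipe (a made-up problem with three variables and two equality constraints):

```julia
using ForwardDiff

# Made-up problem, for illustration: min f0(x)  s.t.  h1(x) = h2(x) = 0.
f0(x) = x[1]^2 + 2x[2]^2 + 3x[3]^2
h1(x) = x[1] + x[2] + x[3] - 1
h2(x) = x[1] - x[3]

xstar = [1/3, 1/3, 1/3]   # (approximate) minimizer returned by your solver

# Matrix whose columns are the constraint gradients at x*:
A = hcat(ForwardDiff.gradient(h1, xstar),
         ForwardDiff.gradient(h2, xstar))

ν = A \ (-ForwardDiff.gradient(f0, xstar))  # least-squares solve for multipliers
# For this toy problem ν ≈ [-4/3, 2/3].
```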

6 Likes