Least squares fitting that accounts for covariance matrix of all input parameters

@cgeoga is unfortunately correct.

There’s limited documentation and you’ll have to get pretty low-level:


Thank you @cgeoga and @odow !

I am not sure how to do this in a way that accounts for the constraints. The objective function itself does not depend on the unknown parameters, so its Hessian alone would not be enough. Maybe taking the Hessian of the function Φ in the paper mentioned above (page 3, objective function + Lagrange multiplier term) would do the trick, but I am missing the mathematical background and am merely guessing :man_shrugging: Plus, I do not have the values of the Lagrange multipliers.
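For what it's worth, the quantity that `eval_hessian_lagrangian` evaluates is, per the MOI nonlinear API, the Hessian of the Lagrangian

```latex
\nabla^2_{xx} L(x, \sigma, \mu) = \sigma \, \nabla^2 f(x) + \sum_i \mu_i \, \nabla^2 g_i(x)
```

so `sigma` weights the objective Hessian and each `mu[i]` weights the Hessian of constraint `g_i`. At a solution one would presumably pass `sigma = 1` and the multipliers reported by the solver (in JuMP, `dual.(...)` on the constraints) rather than a vector of ones.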

Thank you for pointing me to the right place in the docs. Based on the given examples, I tried the following:

# get variable values in correct order
vars = [zs; ζs; [f, A, m1, m2, m3, m4]]
indx = [JuMP.index(v).value for v in vars]
vals = zeros(length(vars))
for (v, i) in zip(vars, indx)
    vals[i] = value(v) # looks good, order is just as I entered the variables in the model
end

# make evaluator object
evl = NLPEvaluator(model)
# request hessian calculation
MathOptInterface.initialize(evl, [:Hess])
# get index to place elements from vector returned by eval_hessian_lagrangian into a matrix
hess_struct = MathOptInterface.hessian_lagrangian_structure(evl)
# flat vector, to be filled with hessian elements
hess_flat = zeros(length(hess_struct))
# retrieve the hessian
MathOptInterface.eval_hessian_lagrangian(evl, hess_flat, vals, 1.0, ones(19))
# reshape from vector to matrix
hessian = zeros((length(vars), length(vars)))
for (i, (k, l)) in enumerate(hess_struct)
    hessian[k, l] += hess_flat[i] # the structure may list an index pair more than once, so accumulate
end

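One pitfall I should flag here: `hessian_lagrangian_structure` returns only the lower triangle of the Hessian, and the same `(k, l)` pair may appear more than once, so the entries have to be summed rather than overwritten, and the matrix mirrored before it can be inverted. A minimal, self-contained sketch of that assembly, with made-up structure and value vectors standing in for what `eval_hessian_lagrangian` would return:

```julia
using LinearAlgebra

# Hypothetical output of hessian_lagrangian_structure / eval_hessian_lagrangian:
# lower-triangle coordinates, with the (1, 1) entry listed twice.
hess_struct = [(1, 1), (2, 1), (2, 2), (1, 1)]
hess_flat   = [1.0, 2.0, 3.0, 4.0]

n = 2
H = zeros(n, n)
for (i, (k, l)) in enumerate(hess_struct)
    H[k, l] += hess_flat[i]  # duplicates must be accumulated, not overwritten
end
H = Symmetric(H, :L)  # mirror the lower triangle into a full symmetric matrix
```

Leaving the upper triangle at zero (or overwriting duplicates) produces a matrix that is not the true Hessian, which could by itself explain strange-looking values.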
It runs through w/o an error, but the numbers I get span many orders of magnitude, which seems wrong:

# print diagonal elements together with corresponding variables
for (v, h) in zip(vars, [hessian[i, i] for i in 1:52])
    println(v, " ", h)
end
Iₒ[1] 3.780718336483931e9
Iₒ[2] 3.780718336483932e9
Iₒ[3] 3.7807183364839325e9
Iₒ[4] 3.780718336483932e9
Iₒ[5] 3.780718336483932e9
Iₒ[6] 3.780718336483931e9
Iₒ[7] 3.7807183364839315e9
Iₒ[8] 3.7807183364839315e9
Iₒ[9] 3.780718336483932e9
Iₒ[10] 3.780718336483931e9
Iₒ[11] 3.7807183364839315e9
Iₒ[12] 3.7807183364839325e9
Iₒ[13] 3.7807183364839315e9
Iₒ[14] 3.780718336483931e9
Iₒ[15] 3.780718336483931e9
Iₒ[16] 3.780718336483932e9
Irₒ[1] 3.780718336483931e9
Irₒ[2] 3.780718336483932e9
msₒ 2.0e10
mrₒ 3.125e10
ρₒ 23.781212841854938
ρrₒ 3.9674667724657806
aₒ 163265.306122449
I[1] 8.580578122520386e-9
I[2] 8.580578122520386e-9
I[3] 8.580578122520386e-9
I[4] 8.580578122520386e-9
I[5] 8.580578122520386e-9
I[6] 8.580578122520386e-9
I[7] 8.580578122520386e-9
I[8] 8.580578122520386e-9
I[9] 8.580578122520386e-9
I[10] 8.580578122520386e-9
I[11] 8.580578122520386e-9
I[12] 8.580578122520386e-9
I[13] 8.580578122520386e-9
I[14] 8.580578122520386e-9
I[15] 8.580578122520386e-9
I[16] 8.580578122520386e-9
Ir[1] 8.580578122520386e-9
Ir[2] 8.580578122520386e-9
ms 2.0e10
mr 0.0
ρ 3.9568198152688134e-12
ρr 3.9674667724657806
a 0.0
f 0.0
A 0.0
m1 0.0
m2 0.0
m3 0.0
m4 0.0

The returned Hessian cannot be inverted; I get a LAPACKException.
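If the Hessian really is singular (e.g. because some variables never appear in a second-order term, as the zero diagonal entries above suggest), a Moore–Penrose pseudo-inverse can at least be computed, though whether it is meaningful as a covariance estimate depends on the problem. A sketch with a toy singular matrix standing in for the assembled Hessian:

```julia
using LinearAlgebra

# Toy singular symmetric matrix standing in for the assembled Hessian.
hessian = [2.0 0.0; 0.0 0.0]

# inv(hessian) would throw here; pinv falls back to the
# Moore–Penrose pseudo-inverse computed via an SVD.
cov = pinv(hessian)
```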

Maybe I did something obviously wrong, but I don't see it, and my lack of background knowledge in this field is not helping … :slight_smile: I am also not sure what values the weighting factors passed to eval_hessian_lagrangian (sigma and mu) should take, so I set them all to 1.
I noticed that I can manipulate these values to "compress" the range that the values in the Hessian span, but even then I still cannot invert it (plus I am pretty sure I cannot just plug in whatever values I like here).

Since I did not make considerable progress on the JuMP and ModelingToolkit fronts, I am posting here my "homebrew" solution, which does what I want:


It is basically the code fragment I posted above, together with some convenience functionality to declare the input parameters and the model more easily.
I don't expect it to be very performant on larger problems, but for my personal use it is good enough. Perhaps it will be helpful for others as well.