Hi,

Yesterday I worked on a script provided by Maxime to find optimal parameters for an equivalent circuit. For this purpose there is the established Python package impedance.py,

but this package fails to compete with a commercial fitting tool.

The solution provided by Maxime already matches the quality of the commercial reference software and outperforms the impedance.py package.

The number of fitting parameters is 11. Besides the optimization package BlackBoxOptim, are there other suggestions for optimizing such problems?

Here is the link to my code:

EC_optimisation_experiment_Maxime.jl

Dependency:

Currently the master version of EquivalentCircuits.jl must be installed to run it.

What’s the nature of this optimization problem? Is it

- Nonconvex?
- Does it have constraints? Linear, nonlinear?
- Does it have integer variables?
- Can you compute gradients of the loss and constraint functions?

It is a continuous parameter space with a rugged objective landscape. The parameter space is constrained (all parameters are positive). The objective function is nonlinear. The variables are Float64.

Can you compute gradients of the loss and constraint functions?

I have no experience with this task.

If you write your problem using Optimization.jl you can try out a large number of solvers rather easily. You can use BBO through this interface, but also MultistartOptimization.jl and some of the global optimization algorithms from NLopt and Evolutionary.jl etc.

The docs have a table of solvers and the problem features they support.

If a particular solver does not support enforcing positivity of the parameters, a common trick is to optimize p_l = \log(p) instead, that way the parameter seen by the optimizer is unconstrained, and you compute p = \exp(p_l) in the cost function to get a positive parameter.
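To make the log-transform trick concrete, here is a minimal sketch with a hypothetical two-parameter cost function (the cost, the solver loop, and the parameter values are made up for illustration; in practice you would hand `cost_unconstrained` to your optimizer of choice, e.g. `Optim.optimize`):

```julia
# Hypothetical two-parameter cost standing in for the circuit-fit objective;
# it is only meaningful for positive parameters.
cost(p) = (p[1] - 2.0)^2 + (p[2] - 0.5)^2

# The optimizer works on unconstrained log-parameters p_l, and the cost
# receives p = exp.(p_l), which is positive by construction.
cost_unconstrained(p_l) = cost(exp.(p_l))

# Crude coordinate search in log-space, just to show the transform in action;
# a real solver would replace this loop.
function fit_logspace(f; iters = 2000, step = 0.01)
    p_l = zeros(2)
    for _ in 1:iters, i in 1:2, s in (-step, step)
        trial = copy(p_l)
        trial[i] += s
        f(trial) < f(p_l) && (p_l = trial)
    end
    return exp.(p_l)   # back-transform: guaranteed positive
end

p_opt = fit_logspace(cost_unconstrained)   # ≈ [2.0, 0.5], strictly positive
```

The optimizer never sees the positivity constraint at all; it simply explores all of log-space, and the exponential maps every iterate back into the feasible region.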

thanks

I have started to use the package `Optimization.jl`

but the objective function generated by the package EquivalentCircuits.jl does not seem to be compatible with `Optimization.OptimizationProblem()`.

do you have a suggestion?

Here is my code: EC_optimisation_via_evolution_strategy.jl

That’s a rather large amount of code, and it’s not obvious to me where the cost function is defined. What problem do you encounter?

Thanks for your reply! The code can be executed without error. If you uncomment line 103:

```
prob = Optimization.OptimizationProblem(objective, initial_parameters, SciMLBase.NullParameters(); lb = lower, ub = upper)
```

The message starts with:

```
ERROR: All methods for the model function `f` had too few arguments. [...]
```

The objective function `objective` that causes the issue is compatible with `Optim.optimize()`, but not with `Optimization.OptimizationProblem()`.

Optimization.jl assumes that your objective function takes two arguments, `cost(x, p)`:

- the optimization variables `x`
- other parameters `p`

If you have no “other parameters”, you can create an anonymous function that just discards the extra parameters like this

```
obj = (x, p) -> objective(x)
```

and pass `obj` into Optimization.jl.
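Put together, a minimal sketch looks like this (here `objective` is a toy stand-in for the one-argument cost returned by EquivalentCircuits.jl, and the bounds and iteration count are made up; BBO is just one of the solvers reachable through this interface):

```julia
using Optimization, OptimizationBBO

# Toy stand-in for the EquivalentCircuits.jl objective, minimized at [1, 2, 3]
objective(x) = sum(abs2, x .- [1.0, 2.0, 3.0])

# Optimization.jl expects cost(x, p); discard the unused parameter slot
obj = (x, p) -> objective(x)

x0 = fill(0.5, 3)
prob = Optimization.OptimizationProblem(obj, x0; lb = zeros(3), ub = fill(10.0, 3))

# Solve with a BlackBoxOptim algorithm via the common interface
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited(); maxiters = 5_000)
sol.u   # should land near [1.0, 2.0, 3.0]
```

Swapping in a different global solver (MultistartOptimization.jl, NLopt, Evolutionary.jl, …) then only changes the argument to `solve`.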

I submitted a PR to clarify the documentation on this point.

Thanks for the advice, the error message is gone, but the results are not yet promising.

P.S.:

What worked very well to improve the results is a weighting functionality inside the objective function in the package `EquivalentCircuits`:

```
using Statistics: mean

function objectivefunction(circuitfunc, measurements, frequencies, weights = nothing)
    # Default to uniform weights when none are given
    if isnothing(weights)
        weights = ones(length(frequencies))
    end
    function objective(x)
        model_output = [circuitfunc(x, fr) for fr in frequencies]
        # Weighted, normalized squared residual between measurement and model
        return mean(weights .* abs.(measurements - model_output).^2 ./
                    (abs.(measurements).^2 .+ abs.(model_output).^2))
    end
    return objective
end
```
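As a quick sanity check, the weighted objective above can be exercised with a hypothetical circuit model (the single-RC impedance, frequency grid, and weight scheme below are made up for illustration; `objectivefunction` is the function defined above):

```julia
using Statistics: mean  # used inside objectivefunction

# Hypothetical RC-circuit impedance, x = [R, C] (illustration only)
circuitfunc(x, fr) = x[1] / (1 + im * 2π * fr * x[1] * x[2])

frequencies = 10 .^ range(0, 5; length = 30)
measurements = [circuitfunc([100.0, 1e-6], fr) for fr in frequencies]

# Emphasize the low-frequency points with larger weights
weights = [fr < 1e3 ? 2.0 : 1.0 for fr in frequencies]

obj = objectivefunction(circuitfunc, measurements, frequencies, weights)
obj([100.0, 1e-6])   # → 0.0 at the exact parameters
```

Because each term is normalized by `abs.(measurements).^2 .+ abs.(model_output).^2`, small- and large-magnitude impedance points contribute comparably, and the weights then shift the emphasis between frequency regions.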