[ANN] AccessibleOptimization: optimization with arbitrary objects instead of vectors


Combining Accessors.jl + Optimization.jl to enable function optimization with arbitrary structs. Vary struct parameters, combinations and transformations of them. Uniform and composable, zero overhead.


Suppose you need to optimize parameters of a function that takes some Julia object as its input. Maybe it’s a model you want to fit to some data, maybe it’s just an optimization problem.

There are various optimization packages in Julia, and surely they can help here. But they typically work with parameters as a vector, or at least a vector-like object, so arbitrary user- or package-defined structs must be manually converted to/from vectors.

Several packages aim to help with this conversion. AFAIK, the most generic, composable, and extensible approach is optics, as provided by Accessors.jl. Even with Accessors, there is a lot of boilerplate for back-and-forth conversions, which is especially noticeable for small ad-hoc problems.
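To make that concrete, here is a hypothetical sketch of the manual object ↔ vector boilerplate such conversion entails (the struct and helper names are invented for illustration, not taken from any package):

```julia
# Hypothetical sketch of the manual object <-> vector boilerplate.
# The struct and helper names here are invented for illustration.
struct Gaussian
    scale::Float64
    shift::Float64
end

# flatten the model into the parameter vector the optimizer expects...
to_vector(m::Gaussian) = [m.scale, m.shift]
# ...and reconstruct the model inside the objective
from_vector(v) = Gaussian(v[1], v[2])

# the optimizer only ever sees plain vectors:
objective(v) = abs2(from_vector(v).scale - 2.0)

v0 = to_vector(Gaussian(1.0, 0.0))
objective(v0)  # 1.0
```

Every change to what is optimized (say, fixing `shift`) requires editing these conversion functions by hand; optics remove exactly this coupling.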

AccessibleOptimization is a thin wrapper around Optimization.jl that uses the Accessors + AccessorsExtra machinery to flexibly define target parameters for optimization, using arbitrary structs as function inputs.


A simple example, using just (named) tuples:

# define a model (sum-of-sqexps) and a loss function to optimize:
expsum(m::Tuple, x) = sum(c -> c.scale * exp(-(x - c.shift)^2), m)
loss(m, data) = sum(d -> abs2(d.y - expsum(m, d.x)), data)

data = ...  # collection of points with x and y

using IntervalSets
using AccessibleOptimization  # reexports everything from Optimization and AccessorsExtra

# define which parameters to optimize, and what are their bounds
vars = OptArgs(
    # component shifts - values from 0..10:
    @o(_[∗].shift) => 0..10.,
    # component scales - positive-only (log10 transformation), from 10^-1 to 10^1:
    @o(log10(_[∗].scale)) => -1..1,
)

# create and solve the optimization problem, interface very similar to Optimization.jl
mod0 = (
    (scale=1, shift=1),
    (scale=1, shift=2),
    (scale=1, shift=3),
)
ops = OptProblemSpec(Base.Fix2(loss, data), mod0, vars)
sol = solve(ops, ECA(), maxiters=300)
sol.uobj  # the optimal model
loss(sol.uobj, data)
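For completeness, the `data` placeholder above could be filled with any collection of points having `x` and `y` fields. A purely illustrative way to generate noiseless synthetic data from a known model (the `true_model` values are made up):

```julia
# Purely illustrative synthetic data: sample a known two-component
# model on a grid, using the same expsum definition as above.
expsum(m::Tuple, x) = sum(c -> c.scale * exp(-(x - c.shift)^2), m)

true_model = ((scale=2.0, shift=3.0), (scale=0.5, shift=7.0))
data = [(x=x, y=expsum(true_model, x)) for x in range(0, 10, length=100)]
```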

See the README for more details, and the Pluto notebook for examples that include custom structs, more involved parameter transformations, and constraints.

See also

  • Accessors and AccessorsExtra packages for generically referring to parts and transformations of arbitrary objects (so-called “optics”)
  • PlutoTables using a very similar approach to define tabular editing UI for Julia objects

AccessibleOptimization.jl has just been registered in General.


This looks great. Could we just merge it into Optimization.jl / SciMLBase? We’ve been wanting to handle some canonicalization for a while.


It could make sense, but I’m not sure what the interface should be, or whether it could be used for something else in addition to Optimization.
The current state of AccessibleOptimization just grew out of my annoyance at manually putting Accessors.getall/setall everywhere and keeping track of the parameter vector. Maybe something better is possible; open to suggestions.
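For context, that boilerplate is the plain Accessors workflow: `getall` extracts the values selected by an optic as a flat collection, and `setall` writes an updated collection back. A minimal sketch with a single named tuple and a trivial optic (assumes Accessors.jl is installed; this is not AccessibleOptimization code):

```julia
using Accessors  # assumed available

obj = (scale=1.0, shift=2.0)

# extract the values selected by an optic as a flat collection...
vals = getall(obj, @o _.shift)          # a collection containing 2.0
# ...and write updated values back, returning a new object
obj2 = setall(obj, @o _.shift, (5.0,))  # shift replaced, scale unchanged
```

AccessibleOptimization does exactly this plumbing behind the scenes, plus tracking how the optic-selected values map into a single parameter vector.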

I don’t understand the SciML internals and type hierarchy well… AccessibleOptimization defines its own types for the “optimization problem” and its solution, to convert object ↔ vector automatically when needed.
Btw, no code there really depends on Optimization, only on SciMLBase. Optimization is loaded just to reexport it, because actually using AccessibleOptimization requires it anyway.

Shameless plug: Nonconvex.jl (JuliaNonconvex/Nonconvex.jl, a toolbox for non-convex constrained optimization) already supports arbitrary decision variables. It is even documented.


I may have missed that package, but looking at the docs I don’t see whether it supports the same flexibility as AccessibleOptimization.
Can you give a simple example comparable to the one in the README here? That is, where the model struct and the target function are defined first, without any regard to optimization and with no restrictions, and only afterwards one specifies which parts of the model struct to vary in optimization.

Not quite the same but similar enough.

struct ExpModel{A,B}
    scale::A
    shift::B
end

(m::ExpModel)(x) = m.scale * exp(-(x - m.shift)^2)

lb = [ExpModel(0.3, 0.0)]
ub = [ExpModel(10.0, 10.0)]
x0 = [ExpModel(5.0, 5.0)]

data = (x = rand(10), y = rand(10))

loss(m::Vector{<:ExpModel}, data) = sum(abs2.(data.y - m[1].(data.x)))

obj = Base.Fix2(loss, data)

using Nonconvex
Nonconvex.@load Metaheuristics

alg = MetaheuristicsAlg(ECA)

options = MetaheuristicsOptions(N = 1000)

model = Model(obj)

addvar!(model, lb, ub)

res = optimize(model, alg, x0; options)


Yes, in the simple case it’s similar indeed.
AccessibleOptimization also lets you choose which parameters to vary in optimization, and add variable transformations.
No model modification is needed; just change the optimization variables definition:

# vary shifts and scales of all components
vars = OptArgs(
    @o(_[∗].shift) => 0..10.,
    @o(_[∗].scale) => 0..10,
)

# same, but log-transform scales:
vars = OptArgs(
    @o(_[∗].shift) => 0..10.,
    @o(log10(_[∗].scale)) => -1..1,
)

# keep the first component parameters fixed, only vary 2 and 3:
vars = OptArgs(
    @o(_[2:3][∗].shift) => 0..10.,
    @o(log10(_[2:3][∗].scale)) => -1..1,
)

# only vary scales, keep shifts fixed:
vars = OptArgs(
    @o(log10(_[∗].scale)) => -1..1,
)

and so on.

Can probably do that already by using a lower bound that’s equal to the upper bound. Internally, we can then eliminate such variables, and many solvers will already eliminate decision variables whose lower and upper bounds are the same. You can also easily define custom ways to flatten structs to vectors and back in Nonconvex.jl by overloading a method. But I will stop derailing this post now. Congratulations on your package!

That’s not derailing, always nice to see alternative approaches to fundamentally the same problem!

Maybe it’s a case of “with a hammer, everything looks like a nail”, but I really like how Accessors deals with selecting and transforming values. AccessibleOptimization itself is a thin wrapper; the real work is done downstream.


@aplavin Nice package!

It might still be nice to have a limited version of this in Optimization.jl. I’m working with a student who might be interested in doing a small PR to Optimization.jl (maybe less powerful than your interface).

I wanted to check with you, so that we don’t both end up doing the same thing!


AccessibleOptimization in its current state already covers everything I originally wanted from it, so I don’t plan to add any other major features or integrations. This may change over time, of course.
Please let me know if some related features are missing or the interface could be better; there’s likely something I didn’t imagine (:

It would be nice to include handling of arbitrary objects in Optimization.jl itself, but I don’t feel familiar with the codebase, and a separate package works just fine for me.


Sounds good. So, it would be fine with you if we work on the integration (maybe borrowing some of your ideas)?

Sure! That’s the point of open source and permissive licenses.


Looking forward to seeing the PR! Feel free to ping me on Slack if you run into any blockers.