# Nonlinear equations on abstract vectors?

I need to solve a (smooth, unconstrained) system of nonlinear equations for an unknown defined on multiple subdomains (to wit, domain decomposition for a PDE). A natural representation of the unknown might be a vector of arrays of different sizes.

My first thought was that I would need to give a true Vector to the nonlinear solver and then view it as the subdomain-aware version for my stuff. Something like a PseudoBlockVector with a resize operation within each subdomain. Doable, but I never look forward to making multiple views into the same data without creating unwanted copies (or failing to create needed copies, or deep copies…).

Then I thought a cooler Julian solution would be to subtype AbstractVector for my representation and generalize `+`, `-`, `length`, and so on, so that the nonlinear solver would never have to know it wasn't working with a true Vector.

Does anyone know if the NLsolve or Optim packages, or another not on my radar, can handle that? And what the "and so on" set of necessary vector operations would be?
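For concreteness, here is a rough sketch of what such a subtype could look like (the type name and layout are hypothetical, not from any package): a flat `AbstractVector` facade over per-subdomain arrays of different sizes. With `size`, `getindex`, `setindex!`, and `similar` defined, generic fallbacks give you broadcasting, `norm`, `dot`, etc. for free, though the linear indexing shown here is slow:

```julia
using LinearAlgebra

# Hypothetical sketch: a flat AbstractVector view over per-subdomain arrays.
struct SubdomainVector{T} <: AbstractVector{T}
    blocks::Vector{Vector{T}}   # one array per subdomain
end

Base.size(v::SubdomainVector) = (sum(length, v.blocks),)

# Linear indexing across blocks (generic but slow; for performance you
# would specialize the operations the solver actually uses).
function Base.getindex(v::SubdomainVector, i::Int)
    for b in v.blocks
        i <= length(b) && return b[i]
        i -= length(b)
    end
    throw(BoundsError(v, i))
end

function Base.setindex!(v::SubdomainVector, x, i::Int)
    for b in v.blocks
        if i <= length(b)
            b[i] = x
            return v
        end
        i -= length(b)
    end
    throw(BoundsError(v, i))
end

Base.similar(v::SubdomainVector{T}) where {T} =
    SubdomainVector([similar(b) for b in v.blocks])
```

With this, `norm(SubdomainVector([rand(3), rand(5)]))` already works through the `AbstractArray` fallbacks.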


Doesnât `BlockArray` from https://github.com/JuliaArrays/BlockArrays.jl do what you want?


You donât need to make it an `AbstractVector`, only a subset of the methods is required for a `newton` solve. For example in my package PseudoArcLengthContinuation.jl, I am relying on the same set of methods as KrylovKit.jl. It provides `newton` method and others for ``generalââ vectors.

You can also provide your own linear solver if you wish.


Again, that was my first thought, and it should be fine to start. Eventually, though, I might want different parts of the vector to live on different processes. Wouldn't that leave me stuck if the solver wants to, say, add vectors together, unless it is willing to accept an `AbstractVector` in the first place?


@rveltz, it looks like KrylovKit's description of vector-like behavior is the kind of thing I'm after. Your PALC might fit the bill, then. It looks like nice work! I must say that after reading your front-page disclaimer about performance, I was very surprised by the extent to which you seem to defy it in your examples.


It could be, but I would not worry too much about this. For example, see `DArray{T,N,A} <: AbstractArray{T,N}` in DistributedArrays.jl. See also the (WIP) MPIArrays.jl, which gives an `AbstractArray`. Knowing a bit where you're headed would help: for example, do you want a cluster of GPUs, a cluster of CPUs, multithreading…

For example, using `AbstractArray` allows some performance improvement by calling BLAS (search in `KrylovKit` about this).

You might be interested in RecursiveArrayTools.jl for your partition problem.

When I first wrote PseudoArcLengthContinuation.jl, I was using `ArrayFire.jl`, `KrylovKit.jl` and `ApproxFun.jl`; that's why the package does not require `AbstractArray`.
I don't use IterativeSolvers.jl much because it lacks an eigensolver; it is otherwise a wonderful package.

Thank you!

Well, I do research with this tool, studying some challenging PDEs… So I guess I toned down the performance disclaimer too much. Most of the good performance is squeezed out of the linear solver, though…

That said, @ChrisRackauckas explains very well why this kind of "simple" package performs well in this post. Also, if you are interested in PALC, you should check out the excellent Bifurcations.jl.

KrylovKit.jl is for linear problems, eigenvalue problems, and related iterative methods. For non-linear optimization problems, I was also working on OptimKit.jl, but it's not registered and still a work in progress (though the basic functionality is working). It goes a bit further than just requiring you to overload `dot`, `axpy!`, and these kinds of functions for your custom vector-like type (which does not need to be `<:AbstractVector` or `<:AbstractArray`). You can just specify which method it has to use for those, so your function for adding "vectors" can even have a different name, and you don't need to go through the trouble of defining a new custom type. If the parameters of your optimization problem are best captured as, say, a tuple of a scalar, a matrix, and some other thing, you can just use that, i.e. `x = (scalar, matrix, otherobj)`, and then define functions for adding and scaling these, and for computing inner products.
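To illustrate the tuple idea (a hypothetical sketch, not OptimKit.jl's actual API or function names), the handful of operations a gradient-based solver needs can be plain functions on a tuple, with no custom type at all:

```julia
using LinearAlgebra

# x = (scalar, matrix, vector); define add, scale, and inner product directly.
add(x, y, α = 1) = (x[1] + α * y[1], x[2] .+ α .* y[2], x[3] .+ α .* y[3])
scale(x, α)      = (α * x[1], α .* x[2], α .* x[3])
inner(x, y)      = x[1] * y[1] + dot(x[2], y[2]) + dot(x[3], y[3])

x = (1.0, ones(2, 2), [1.0, 2.0])
inner(x, x)  # 1 + 4 + 5 = 10.0
```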


We should tease you a bit more so we can get cool new packages more often.
