In Base, this routine is called `A_ldiv_B!`

# Unified Interface for Linear Solving

**barche**#22

You're absolutely right that initialization is necessary. In Trilinos, the `A` is also something called a "linear operator with solve", which is typically an initialized factorization, so it would present the same workflow, except that there is an extra parameters option for passing parameters you might want to vary at each step and that don't require recomputing the initialized operator, maybe things like the tolerance or the number of iterations.

That said, I like your example and it would fit my future use case (solving the Navier-Stokes equations) well, but I think it would be nice to also have a more advanced syntax that allows passing in extra parameters at each step if needed.

How standard is the `@into!` syntax? It is much nicer than having to remember a non-intuitive ordering of function arguments; I think it would be good to have macros like this in Base so they are more universally recognised.

**ChrisRackauckas**#23

We'd have to try to standardize some arguments if we want that to work nicely. Can you give an example of what you're thinking of?

`@into!` is just a nice way from InplaceOps.jl to use `A_ldiv_B!`. `A_ldiv_B!` is standardized. I prefer the `\` look with a macro in front. Pretty sure this macro isn't standard, but I would hope it would become so.

**barche**#24

I was thinking of just adding an extra argument `parameters` with an abstract type (`abstract SolveParameters`?), so using multiple dispatch it could be specialized to contain anything.
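A minimal sketch of what I have in mind (all type and function names here are hypothetical, using the pre-0.6 `abstract`/`immutable` syntax as in the rest of this thread):

```
abstract SolveParameters

# One possible specialization: parameters for an iterative application
immutable IterativeParams <: SolveParameters
    tolerance::Float64
    max_iterations::Int
end

# Solvers then dispatch on the concrete parameter type
function solve!(x, K, b, p::IterativeParams)
    # ... run the iteration using p.tolerance and p.max_iterations
end
```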

**ChrisRackauckas**#25

What would that give us over the method using "God types" and closures? That's the example I'm looking for.

**barche**#26

You could set e.g. tolerance at each step:

```
function OneStep(f,...,linear_solver=lufact)
    ...
    A = Jacobian(f(...)) # A comes from some Jacobian
    K = linear_solver(A) # Build the factorization object
    ... # Some calculations
    @into! x1 = K\b1 # First linear solve, default settings from the factorization object
    ... # use x1 a bit
    solve!(x2,K,b2,MaxIterations(10)) # Second linear solve only needs an approximation: stop after 10 iterations
    ... # use x2 a bit
    solve!(x3,K,b3,Transpose(true)) # Third linear solve uses the transpose of A
    ... # Do some stuff to end
end
```

The same could probably be achieved by modifying `A`, but this would be trickier if the solutions could happen in parallel. In general, I think it's prudent to have a provision for passing parameters related to the application of `A` for a specific RHS but that are not directly a property of `A` itself.

**ChrisRackauckas**#27

What about just requiring that every factorization object implements an `update!` function?

The only thing required to make this work well would be a PR to Base convincing them that a blank dispatch for `update!` would make this code much easier to write (and maybe make a new spot on the "Interfaces" page of the documentation about this interface).
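As a rough sketch of how I read this (the `MyIterativeFactorization` type is made up for illustration): the blank dispatch would be a no-op, so generic stepper code works unchanged for factorizations that have no tunable parameters:

```
# Blank fallback: factorizations without parameters just ignore the call
update!(K; kwargs...) = K

# A package specializes it for its own factorization type
function update!(K::MyIterativeFactorization; tolerance=K.tolerance)
    K.tolerance = tolerance
    K
end

# In the stepper:
update!(K, tolerance=1e-6)
@into! x1 = K\b1
```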

**barche**#28

(I realize I wrote `A` instead of `K` in my previous post, will correct after sending this.)

This does not address the case (a bit far-fetched, admittedly, and not something I require in my code) where the solves would happen in parallel: different parameterizations of `K` would require making copies. Also, conceptually, things like the number of iterations and the tolerance aren't really a property of the factorization object but of a specific application of that object, so I feel it makes sense to be able to set these parameters when they matter (allowing, of course, for defaults to be set in `K`, so that the `\` variant still works).

Regarding `update!`: how about a `parameters` function that returns parameters that can be manipulated using `setindex!`:

```
params = parameters(K)
setindex!(params, 1e-6, :tolerance)
```
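A package could back this with something as simple as a `Dict` stored in the factorization object (purely illustrative; `MyFactorization` is a made-up name):

```
type MyFactorization
    fact               # the underlying factorization
    params::Dict{Symbol,Any}
end

# parameters just exposes the Dict, so setindex! works on it directly
parameters(K::MyFactorization) = K.params

params = parameters(K)
params[:tolerance] = 1e-6   # same as setindex!(params, 1e-6, :tolerance)
```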

The name `update!` seems a bit ambiguous; it could also be interpreted as "I changed the matrix, update the factorization", for example.

Also, is it necessary to have all the interface for this in Base? Couldn't a package describing the interface be created? It should then be easier to make changes if necessary.

**ChrisRackauckas**#29

If we want to use `\` and `A_ldiv_B!` (which I still think is sensible), we would only need a minimal amount of Base compatibility (i.e. just ways for it to ignore the parameters instead of throwing an error when they're there).

**barche**#30

Yes, the parameters should always be optional, with defaults configurable in `K`. I agree that `\` and `A_ldiv_B!` should stay, of course, but the `solve!` interface would be a nice addition for a little more flexibility. I must admit I never would have thought to look for a function named `A_ldiv_B!`; would it be a totally crazy idea to rename it to `solve!`?

**dlfivefifty**#31

I'm not a huge fan of the naming scheme, but it's consistent with `A_mul_B!`, `Ac_mul_B!`, etc. The notation `solve!` on its own is not as versatile.

**ChrisRackauckas**#32

In parallel to the discussion on parameters, I think we should bring up the discussion of linear maps. Not every linear system to be solved is a matrix; in many cases one might want to solve with a linear operator or a linear map. Generalizing like this is good computationally because things like matrix-free methods can be much faster and/or take much less memory.

IterativeSolvers.jl seems to want to go in this direction by using LinearMaps.jl.

I think it would be nice if more packages tried to support at least something like linear maps. If I understand correctly, a `WrappedMap` is just a matrix wrapped as a linear map, so I assume that the package handles the cases needed to make this work anywhere a matrix `\` would work (does it?).

Then the push would really be about trying to get other libraries to support something like the `FunctionMap`, where you define a function for `A*x`. It's clear that iterative solvers "can" work with this, but what packages can be made compatible with something like this? Do things like CUSOLVER and PETSc strictly require a matrix?
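For concreteness, the matrix-free style I mean looks roughly like this with LinearMaps.jl (I'm going from memory of its README, so the exact constructor signature may differ):

```
using LinearMaps

n = 100
# A "matrix" defined only by its action v -> A*v, here a cyclic shift
A = LinearMap(v -> [v[2:end]; v[1]], n)

# An iterative solver only ever needs A*x, so something like
# IterativeSolvers.jl can use this without forming the n-by-n matrix.
```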

**dlfivefifty**#33

I'm confused why LinearMaps.jl doesn't do the following:

```
abstract AbstractLinearMap{T} <: AbstractMatrix{T}
# Entry (k,j) is the k-th component of A applied to the j-th standard basis vector
getindex(A::AbstractLinearMap,k::Integer,j::Integer) = (A*[zeros(j-1);1;zeros(size(A,2)-j)])[k]
```

**ChrisRackauckas**#34

An update on this: the factorization-type style has been implemented in DifferentialEquations.jl. I tested it out, and it works really nicely as a cross-package solution. While it's currently only enabled on master, docs are already up which show how to specify it:

http://docs.juliadiffeq.org/latest/features/linear_nonlinear.html

I'll see if I get around to putting up some PRs to make this work with more packages. All that needs to happen is that the type interface, like the one for PETSc.jl, needs to be created. There's a thread going in IterativeSolvers.jl for this:

I'll also be poking around for packages which should use this interface, like NLsolve.jl.

In total, it's really painless to set up and is a really cool feature for Julia. Thanks everyone for the ideas!

**JaredCrean2**#35

@ChrisRackauckas Re. your earlier question: PETSc requires a matrix object. Matrix objects can have different formats, including various flavors of sparse storage as well as MatShell, which doesn't require an explicit matrix but instead a callback function that performs matrix-vector products.