Explicit Jacobian on DAEs?

Hey,
So after getting rid of the dispatch and changing back to the default linear solver, it works:

using Sundials

p = [2, 1.2, 3]
function f(res, du, u, p, t)
  res[1] = du[1] - p[1]*u[1] + p[2]*u[1]*u[2]
  res[2] = du[2] - p[3]*u[2] - u[1]*u[2]
  res[3] = u[1] + u[2] - u[3]
end

function g(J, du, u, p, gamma, t)
  J[1,1] = gamma - p[1] + p[2]*u[2]
  J[1,2] = p[2]*u[1]
  J[1,3] = 0
  J[2,1] = -u[2]
  J[2,2] = gamma - p[3] - u[1]
  J[2,3] = 0
  J[3,1] = 1
  J[3,2] = 1
  J[3,3] = -1
  nothing
end

fg = DAEFunction(f; jac = g)

dIC = [0.8, 4.0, 0.0]  # consistent du0 (makes res = 0 at t = 0); the entry for the algebraic variable is arbitrary
IC = [1.0, 1.0, 2.0]
tspan = (0.0, 10.0)
diff_vars = [true, true, false]

testProb2 = DAEProblem(fg, dIC, IC, tspan, p, differential_vars = diff_vars)
sol = solve(testProb2,IDA())

So that’s using your Jacobian; you can check by putting a print statement in it. But now you want to scale to a large problem. Instead of going straight to GMRES, if you’re willing to define your Jacobian, it’s usually more efficient to define a sparse Jacobian. You do that by passing a matrix with the right sparsity pattern as jac_prototype.
Then you can directly write the Jacobian pieces and call a sparse factorization method (IDA(linear_solver = :KLU)). To speed up the Jacobian updates, loop directly through the stored nonzeros (e.g. nonzeros(J) on a SparseMatrixCSC) instead of using Cartesian indexing.
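Continuing from the script above, a minimal sketch of what that looks like for this 3x3 system (assuming f, dIC, IC, tspan, diff_vars, and p are already defined; the jac_sparsity values are just placeholders marking structural nonzeros):

```julia
using Sundials, SparseArrays

# Sparsity pattern of the Jacobian above: 7 structural nonzeros.
# Rows (1,3) and (2,3) are structurally zero and left out.
jac_sparsity = sparse([1, 1, 2, 2, 3, 3, 3],
                      [1, 2, 1, 2, 1, 2, 3],
                      ones(7))

# Same Jacobian as before, but only the stored entries are written.
function g_sparse(J, du, u, p, gamma, t)
  J[1,1] = gamma - p[1] + p[2]*u[2]
  J[1,2] = p[2]*u[1]
  J[2,1] = -u[2]
  J[2,2] = gamma - p[3] - u[1]
  J[3,1] = 1
  J[3,2] = 1
  J[3,3] = -1
  nothing
end

fg_sparse = DAEFunction(f; jac = g_sparse, jac_prototype = jac_sparsity)
prob_sparse = DAEProblem(fg_sparse, dIC, IC, tspan, p, differential_vars = diff_vars)
sol_sparse = solve(prob_sparse, IDA(linear_solver = :KLU))
```

At this size it won't matter, but on a large problem the sparse factorization only touches the stored entries, which is where the savings come from.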

If you are willing to do this work, though, a sparse factorization with a 5th-order Rosenbrock method, writing the DAE in mass matrix form, can be more efficient (it’ll get more miles per factorization!), but of course that can be a pretty substantial rewrite.
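For reference, a sketch of that rewrite for this system (assuming OrdinaryDiffEq is available; the residuals above are solved for du, and the algebraic constraint becomes a row with a zero in the mass matrix):

```julia
using OrdinaryDiffEq

# Mass matrix form: M*du = f_mm(u, p, t).
# The first two rows are the differential equations solved for du;
# the third row is the algebraic constraint 0 = u1 + u2 - u3.
function f_mm(du, u, p, t)
  du[1] = p[1]*u[1] - p[2]*u[1]*u[2]
  du[2] = p[3]*u[2] + u[1]*u[2]
  du[3] = u[1] + u[2] - u[3]
  nothing
end

M = [1.0 0.0 0.0
     0.0 1.0 0.0
     0.0 0.0 0.0]  # the zero row marks the algebraic equation

fmm = ODEFunction(f_mm, mass_matrix = M)
prob_mm = ODEProblem(fmm, [1.0, 1.0, 2.0], (0.0, 10.0), [2, 1.2, 3])
sol_mm = solve(prob_mm, Rodas5())  # 5th-order Rosenbrock
```

No jac is needed here to get started; Rosenbrock methods can also take a jac and jac_prototype the same way if you want the sparse factorization on top.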