Handling sparsity properly in Jacobians and Hessians is not implemented in Nonconvex yet. For now, the best you can do is use ForwardDiff via https://github.com/JuliaNonconvex/NonconvexUtils.jl. That said, with the right combination of AbstractDifferentiation (AD) and ModelingToolkit (MTK) it should be possible to generically detect and exploit sparsity in your functions. It will take some time though, because the group of people familiar with autodiff internals, MTK, AD and optimisation is small in the Julia community, and they are usually busy with other things. I am glad the JuMP team and LANL are starting this new project, and I am sure it will help advance the Julia ecosystem for nonlinear optimisation.
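To illustrate what "detect and exploit sparsity" means in practice, here is a minimal sketch of the classic column-coloring + compressed finite-difference technique that tools like SparseDiffTools.jl automate. This is not Nonconvex, NonconvexUtils or MTK code — it is a language-agnostic illustration written in Python, and every function name in it (`sparsity_pattern`, `greedy_color`, `sparse_jacobian`) is made up for this sketch:

```python
import numpy as np

def f(x):
    # Example residual with a bidiagonal (sparse) Jacobian:
    # f_i = x_i^2 + x_{i+1} for i < n-1, and f_{n-1} = x_{n-1}^2.
    n = len(x)
    out = np.empty(n)
    out[:-1] = x[:-1] ** 2 + x[1:]
    out[-1] = x[-1] ** 2
    return out

def sparsity_pattern(n):
    # Known structurally: J[i, i] and J[i, i+1] are the only nonzeros.
    # (In Julia this detection can be automated, e.g. symbolically via MTK.)
    pattern = np.zeros((n, n), dtype=bool)
    idx = np.arange(n)
    pattern[idx, idx] = True
    pattern[idx[:-1], idx[:-1] + 1] = True
    return pattern

def greedy_color(pattern):
    # Color columns so that no two columns sharing a nonzero row get the
    # same color; columns of one color can then be probed together.
    n = pattern.shape[1]
    colors = -np.ones(n, dtype=int)
    for j in range(n):
        rows = np.nonzero(pattern[:, j])[0]
        forbidden = {colors[k] for k in range(j) if pattern[rows, k].any()}
        c = 0
        while c in forbidden:
            c += 1
        colors[j] = c
    return colors

def sparse_jacobian(f, x, pattern, colors, h=1e-7):
    # One forward difference per color instead of one per column.
    f0 = f(x)
    J = np.zeros((len(f0), len(x)))
    for c in range(colors.max() + 1):
        d = (colors == c).astype(float)          # probe all same-color columns at once
        df = (f(x + h * d) - f0) / h
        for j in np.nonzero(colors == c)[0]:
            rows = np.nonzero(pattern[:, j])[0]  # unscramble using the pattern
            J[rows, j] = df[rows]
    return J
```

For the bidiagonal example above, two colors suffice, so the full Jacobian is recovered from just two extra function evaluations regardless of the problem size — that is the payoff a generic sparsity-aware AD pipeline would deliver automatically.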