LsqFit.jl usually has a lot of instability issues. For curve fitting I’d just use an optimizer directly.
Ah, I don’t think I know how to do that. I’ll try to figure it out, or maybe I can use SciPy.jl?
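For anyone else wondering what “use an optimizer directly” means in practice: you write the sum of squared residuals yourself and hand it to a general-purpose minimizer. A minimal sketch in Python with SciPy (the model and data here are made up purely for illustration; the same pattern applies with a Julia optimizer):

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic data from a known model y = a * exp(b * x) (made-up example)
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = 2.5 * np.exp(1.3 * x) + 0.01 * rng.normal(size=x.size)

def model(p, x):
    a, b = p
    return a * np.exp(b * x)

def sse(p):
    # Sum of squared residuals: the objective a general optimizer minimizes
    r = y - model(p, x)
    return r @ r

res = minimize(sse, x0=[1.0, 1.0], method="BFGS")
a_hat, b_hat = res.x
```

The point is that curve fitting is just minimization of a particular objective, so any robust optimizer can do it; a curve-fitting package mostly saves you from writing the residual boilerplate.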
This has been my question for a while. It would be nice to have a simple package for least squares and/or curve fitting for those who are just doing basic curve fits. I’ve been slowly building a wrapper for Optimization.jl to reduce some boilerplate code (and I think SciPy’s optimize.curve_fit is a wrapper for optimize.least_squares, which is where I’m coming from).
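That relationship is easy to see on the Python side: curve_fit essentially builds a residual function from your model and forwards it to least_squares. A hedged sketch with a made-up noiseless linear model, showing that the two calls land on the same answer:

```python
import numpy as np
from scipy.optimize import curve_fit, least_squares

# Made-up noiseless data: y = 0.5 + 2.0 * x
x = np.linspace(0, 1, 30)
y = 0.5 + 2.0 * x

def f(x, a, b):
    return a + b * x

# High-level interface: model signature f(x, *params)
popt, pcov = curve_fit(f, x, y, p0=[0.0, 0.0])

# Roughly the residual call curve_fit constructs under the hood
res = least_squares(lambda p: f(x, *p) - y, x0=[0.0, 0.0])
```

So the boilerplate a convenience package removes is exactly that residual-building step.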
I don’t have the knowledge to make such a package (yet, at least), but it would be immensely useful if it existed. At the moment PyCall.jl has a bug where you have to wrap your model function in a Python lambda, as outlined in issue 367. That is fine for me, but I can’t really recommend that my peers use Julia when I can’t give them a curve-fitting library that just works, because we use curve fitting in undergrad labs all the time. I absolutely love all the wonderful work that has been done in the Julia ecosystem; I just wanted to put this out as a feature request.
I have used this package: EasyFit.jl.
This is just a wrapper for LsqFit.jl, which, as @ChrisRackauckas mentioned, has instabilities.
Yeah, exactly. When we were putting together the curve fit in DataInterpolations, a lot of cases were failing, so we switched from LsqFit.jl to Optim.jl and they started succeeding. I’d double-check whether that’s still the case, but it was a pretty clear trend, at least around 2020.
Does Optim.jl have a method to get standard errors of the optimized parameters? I couldn’t find anything like that in the docs. Maybe I’m just looking in the wrong place.
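Not that I know of, but the usual recipe works with any least-squares solver that exposes the Jacobian of the residuals at the optimum: the parameter covariance is approximately s²(JᵀJ)⁻¹, with s² the residual variance, and the standard errors are the square roots of its diagonal (this is essentially what SciPy’s curve_fit returns as pcov). A sketch in Python on made-up linear data:

```python
import numpy as np
from scipy.optimize import least_squares

# Made-up data: y = 3.0 + 0.7 * x plus Gaussian noise
rng = np.random.default_rng(1)
x = np.linspace(0, 10, 100)
y = 3.0 + 0.7 * x + rng.normal(scale=0.5, size=x.size)

def resid(p):
    return (p[0] + p[1] * x) - y

fit = least_squares(resid, x0=[0.0, 0.0])

# Covariance of the estimates: s^2 * (J^T J)^{-1},
# where s^2 = SSE / degrees of freedom
J = fit.jac
dof = len(x) - len(fit.x)
s2 = 2 * fit.cost / dof          # fit.cost is 0.5 * sum(residuals^2)
cov = s2 * np.linalg.inv(J.T @ J)
stderr = np.sqrt(np.diag(cov))   # standard errors of the parameters
```

The same few lines of linear algebra apply after an Optim.jl fit if you compute the residual Jacobian yourself (e.g. via automatic differentiation).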
Trixi.jl does not concentrate on finite volume methods but (mostly) on discontinuous Galerkin methods (which are arguably a merge of finite volumes and finite elements).
Yeah, probably a bit oversimplified. It should be discontinuous Galerkin. BTW, we’ll be doing WENO in MethodOfLines.jl, so it’ll be nice to test these against each other (once the symbolic IR form of MTK is completed for scaling).
Cool! Let me know when you’re ready; I would like to take a look (and assist in setting up nice test cases).
You’re almost there: Optim.jl.
That’s very neat!
A small nitpick: I would describe Turing.jl as “Bayesian Modeling”. When I read “Bayesian Inference” I think more of inference algorithms.
Thanks, those changes will be in the next doc build.
I suggest that the curated list of packages from these discussions be kept updated in the top-most post, to make it easy for educators and students to select the packages best suited to their applications. This is not intended to promote or recommend certain packages over others, but to help users pick the ones that are most complete and have had the most bugs fixed.
Good idea. I just did it.