Repo for Scientific Machine Learning - UDE paper does not seem to give consistent results on rerunning

Hello,

I was going through the code behind the Scientific Machine Learning paper, specifically `scenario_1.jl` on the master branch of the ChrisRackauckas/universal_differential_equations GitHub repository,

and I am not able to reproduce the sparse regression. Instead of recovering the two-parameter model, I get a 22-parameter model. I have not changed any code (aside from the `allow_f_increases` setting for BFGS, because the model was unable to finish training without that change) or any dependencies, and I am using Julia 1.7.3, which I believe matches the top-level Manifest file. Does anyone know what might be going on here? I would have thought a pinned Manifest file would avoid reproducibility issues. Happy to provide more details if needed.
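For what it's worth, a pinned Manifest only controls package versions if the script is actually run inside that environment; it does not pin things like RNG streams across Julia releases. A minimal sketch of the setup step (run from the repository root; the path is an assumption):

```julia
# Hedged sketch: activate the repo's pinned environment before rerunning
# the script, so the exact Manifest.toml versions are used.
using Pkg
Pkg.activate(".")    # picks up the repo's Project.toml / Manifest.toml
Pkg.instantiate()    # installs the exact pinned package versions
Pkg.status()         # sanity-check that resolved versions match the Manifest
```

Even with identical package versions, results that depend on random initialization (e.g. neural network weights) can differ across Julia versions unless the seed and RNG implementation also match.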


I would recommend following the tutorial which is tested and displayed here:

https://docs.sciml.ai/Overview/stable/showcase/missing_physics/

@Julius_Martensen can we work to get the Hudson Bay examples into the SciMLSensitivity docs? It would be much easier to maintain these pieces if they were all tested tutorials.


Thanks very much!

It seems like the main difference is that the ADMM optimizer is used instead of STLSQ. Do you know why STLSQ no longer performs well when it used to?


STLSQ just needs a lot of tuning. You need to choose lambda correctly for it to lop off the extra terms, and how high lambda needs to be depends on the quality of the fit. That's one of the major reasons why I don't prefer STLSQ these days.
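To make the lambda sensitivity concrete, here is a minimal sketch of sequentially thresholded least squares (STLSQ), the sparse-regression step used in SINDy-style methods: fit densely, zero out coefficients below `lambda`, refit on the surviving terms, and repeat. The function and variable names here are illustrative, not the DataDrivenDiffEq.jl API.

```julia
using LinearAlgebra, Random

# Sequentially thresholded least squares: Theta is the candidate-term
# library (samples × terms), dX the targets (samples × states), and
# lambda the hard threshold that decides which terms survive.
function stlsq(Theta, dX, lambda; maxiter = 10)
    Xi = Theta \ dX                       # initial dense least-squares fit
    for _ in 1:maxiter
        small = abs.(Xi) .< lambda        # coefficients to lop off
        Xi[small] .= 0.0
        for k in axes(dX, 2)              # refit each state on surviving terms
            big = .!small[:, k]
            Xi[big, k] = Theta[:, big] \ dX[:, k]
        end
    end
    return Xi
end

# Toy problem: the derivative truly depends on 2 of 5 candidate terms.
Random.seed!(1)
Theta   = randn(100, 5)
Xi_true = [1.5, 0.0, 0.0, -0.8, 0.0]
dX      = Theta * Xi_true .+ 0.001 .* randn(100)

Xi = stlsq(Theta, reshape(dX, :, 1), 0.1)
count(!iszero, Xi)
```

With a well-chosen lambda (here 0.1) only the two true terms survive, but if the upstream fit is noisy the spurious coefficients grow, and a lambda that worked before can suddenly let extra terms through — which matches the 22-parameter model described above.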

1 Like