Learning Surrogates.jl or Flux?

Proceeding through my Julia learning, I have a specific goal: to produce a real-time surrogate of the complex soft-body dynamics produced by packages such as deal.II or similar specialized FEM codes.
This is just an example of the kind of simulation deal.II can produce.

At this stage I will use the FEM package as a black box, so I will not try to replicate the PDE solving in Julia.

My doubt is whether it would be better to invest time in learning something such as Surrogates.jl (but do I need to explicitly define the PDE?) or Flux to achieve my goal.

Or is there another approach I would be better off considering?

It depends on what you’re trying to do. Generally, if you spend enough time with a deep neural network you can get something that’s a “little better”, but Surrogates.jl has a lot of the classic methods like GEKPLS that just kind of work without too much tweaking. They are different kinds of methods, and which one you use depends on the situation.
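
To make the “sample, fit, then query cheaply” workflow concrete, here is a minimal Surrogates.jl sketch using Kriging; the black-box function `f`, the bounds, and the sample count are illustrative placeholders, not anything from this thread.

```julia
# Minimal Surrogates.jl workflow: sample a cheap stand-in "black box",
# fit a Kriging surrogate, then evaluate it at new points.
using Surrogates

f(x) = sin(x) + 0.1 * x^2                # stand-in for an expensive solver call
lb, ub = 0.0, 10.0                       # bounds of the design space

xs = sample(32, lb, ub, SobolSample())   # quasi-random design points
ys = f.(xs)

krig = Kriging(xs, ys, lb, ub; p = 1.9)  # fit the surrogate to the samples
krig(2.5)                                # cheap approximate evaluation
```

The same sample/fit/query pattern applies to the other surrogate types in the package.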

We find reservoir computing approaches tend to be the most robust.
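
For readers unfamiliar with the term, below is a from-scratch echo state network sketch in plain Julia that shows the core reservoir-computing idea: a fixed random recurrent “reservoir” plus a linear readout, where only the readout is trained. All sizes, hyperparameters, and the random data are illustrative; in practice one would use a dedicated package such as ReservoirComputing.jl rather than a hand-rolled version.

```julia
# Echo state network sketch: fixed random reservoir, ridge-regression readout.
using LinearAlgebra, Random

Random.seed!(1)
in_dim, res_dim, out_dim = 6, 200, 6            # illustrative dimensions

Win = 0.1 .* (rand(res_dim, in_dim) .- 0.5)     # fixed random input weights
W = randn(res_dim, res_dim)
W .*= 0.9 / maximum(abs, eigvals(W))            # scale spectral radius below 1

# Drive the reservoir with an input sequence and collect its states.
function reservoir_states(U)                    # U is in_dim × T
    X = zeros(res_dim, size(U, 2))
    x = zeros(res_dim)
    for t in axes(U, 2)
        x = tanh.(Win * U[:, t] .+ W * x)
        X[:, t] = x
    end
    return X
end

U = randn(in_dim, 500)       # stand-in input sequence (e.g. current states)
Y = randn(out_dim, 500)      # stand-in targets (e.g. next states)
X = reservoir_states(U)

# Only the linear readout is trained, via ridge regression.
λ = 1e-6
Wout = (Y * X') / (X * X' + λ * I)
predictions = Wout * X
```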

So far I am learning these things: the pros and cons of the various approaches, the pipeline each one needs, and their respective effort.
At this stage I see my learning path as Surrogates > Flux > SciML, in order of growing complexity and accuracy.
SciML is probably what we will need.

Just to post an update that may be useful for others: since I need to create a surrogate for multidimensional data, I will skip Surrogates, which is really not focused on this kind of data, and go directly for Flux.

It is. Not super high-dimensional, but radial basis functions work well for tens or hundreds of dimensions.
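
As a concrete illustration, here is a minimal sketch of a radial basis surrogate over a multidimensional input with Surrogates.jl; the 10-dimensional test function, bounds, and sample size are illustrative stand-ins, not FEM data.

```julia
# Radial basis surrogate over a 10-dimensional input with Surrogates.jl.
using Surrogates

d = 10
f(x) = sum(sin, x)                       # stand-in black box taking a d-tuple
lb = fill(-1.0, d)
ub = fill(1.0, d)

xs = sample(256, lb, ub, SobolSample())  # vector of d-tuples
ys = f.(xs)

rbf = RadialBasis(xs, ys, lb, ub)        # fit the radial basis surrogate
rbf(ntuple(_ -> 0.3, d))                 # query at a new point
```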

My data will be the positions, velocities, and accelerations of a set of hundreds or thousands (for a prototype) of nodes in a mesh.
A short exchange with @ludoro gave me the impression (maybe my fault) that at the time dealing with this kind of data was not a strong point of Surrogates, but maybe it has improved in the last couple of years and is worth trying; I don’t know.
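
For comparison, here is a minimal sketch of what a Flux surrogate for this kind of data could look like, assuming the per-node positions, velocities, and accelerations are flattened into one state vector and the network learns the map from the state at time t to the state at t + 1. The dimensions, architecture, and random arrays are placeholders for actual FEM snapshots.

```julia
# MLP surrogate: flattened mesh state at time t  ->  state at time t + 1.
using Flux

n_nodes = 100                       # illustrative prototype mesh size
state_dim = 9 * n_nodes             # 3D position + velocity + acceleration per node

model = Chain(
    Dense(state_dim => 256, relu),
    Dense(256 => 256, relu),
    Dense(256 => state_dim),
)

# Random arrays standing in for snapshot pairs exported from the FEM black box:
# each column of X is a state at time t, the same column of Y is the state at t + 1.
X = randn(Float32, state_dim, 1000)
Y = randn(Float32, state_dim, 1000)

opt_state = Flux.setup(Adam(1f-3), model)
loader = Flux.DataLoader((X, Y); batchsize = 64, shuffle = true)

for epoch in 1:20
    Flux.train!(model, loader, opt_state) do m, x, y
        Flux.Losses.mse(m(x), y)
    end
end

# Roll the surrogate forward from an initial state (one call per time step).
state = X[:, 1]
for _ in 1:10
    global state = model(state)
end
```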

Algorithmically, radial basis functions and such are fine with high dimensions. The original implementations were not, but much of the core package code has since been replaced to scale better.

Thanks, this changes the perspective of course, and it is surely worth a try.
In the meantime I can sort out my Flux learning path; I am a bit stuck in the JuliaAcademy notebooks.