I am computing a map of the largest Lyapunov exponent (LE) and of extreme events (EV), i.e. I vary two parameters within some range. Julia uses over 2 GB of memory when computing the LE and EV maps. How can I fix this? I use Jupyter. (I can't remove the output flushing because I need to keep track of the results.)

You’re saving a dense interpolation for tens of millions of iterations. Is that what you meant to do, instead of using `saveat`? The memory usage seems to be exactly the amount of data you are telling it to save.
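To illustrate, a minimal sketch assuming a DifferentialEquations.jl-style setup (the toy ODE `f`, time span, and save interval below are placeholders, not the actual system):

```julia
using OrdinaryDiffEq  # any DifferentialEquations.jl frontend works

# Toy linear ODE standing in for the real system
f(u, p, t) = -p[1] * u
prob = ODEProblem(f, 1.0, (0.0, 1.0e6), [0.1])

# Default: dense output keeps interpolation data for every internal step,
# which grows without bound over a long integration.
# sol = solve(prob, Tsit5())

# Instead, save only at the times you need; passing `saveat` turns the
# default dense interpolant off, so memory stays proportional to the
# number of saved points:
sol = solve(prob, Tsit5(); saveat = 100.0)
```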

Is the maximum of the time span `Inf` or something?

How can I check this?

I tried using the default algorithm and the error went away, but I need `Feagin12` or another high-accuracy algorithm.

And how exactly do you expect to achieve a `1e-25` tolerance with 64-bit floating point numbers on values of order 1, when machine epsilon is about `1e-16`? The error estimator is probably just going insane, because that is simply not possible with 64-bit floats. It’s surprising any method could give something at all (I assume Vern9 would be more stable here, since its error estimator matches its stability regions better, and stability will matter in weird cases like this).

But anyway, it should be clear that you need to use a different precision if you actually want `1e-25` relative error (which is what a `1e-25` absolute tolerance on values of order 1 would require).
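For reference, in plain Julia (the numbers below are properties of the floating-point types themselves, not of any particular solver):

```julia
# Float64 carries ~16 significant decimal digits, so a 1e-25 tolerance
# on values of order 1 is far below machine epsilon:
println(eps(Float64))   # ≈ 2.2e-16

# Reaching 1e-25 genuinely requires a wider number type, e.g. BigFloat;
# DifferentialEquations.jl propagates the element type of u0 through
# the integration:
println(eps(BigFloat))  # ≈ 1.7e-77 at the default 256-bit precision
```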

Thank you. I will try Vern9.

Please don’t use `saveat` or other output-controlling keywords with `trajectory`. Instead, use `solve` directly. `trajectory` is nothing more than a convenience wrapper around `solve` that ensures equal time steps and returns the output as a `Dataset`. There is no reason to use it besides quick convenience; go to the source, `solve`, which gives you much more power if what you are interested in is simply evolving the system with a lot of control.
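Concretely, a sketch of the `solve`-based workflow (the 2D system, parameters, and tolerances here are placeholders for the actual equations of motion):

```julia
using OrdinaryDiffEq

# Placeholder 2D system; substitute the actual equations of motion
function rule!(du, u, p, t)
    du[1] = u[2]
    du[2] = -p[1] * u[1]
end

prob = ODEProblem(rule!, [1.0, 0.0], (0.0, 10_000.0), [1.0])

# Equally spaced output, no dense interpolant, full algorithm control:
sol = solve(prob, Vern9(); abstol = 1e-12, reltol = 1e-12, saveat = 0.1)
```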

You shouldn’t save a trajectory anyway. If you want to compute extreme events, you should use `integrator`: simply step the integrator, detect the events, and store only those events. It is very inefficient to make a trajectory of 100000000000 time steps and *then* loop through it to find the events.
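A sketch of that pattern (the toy oscillator and threshold criterion are hypothetical; `init`/`step!` are the standard DifferentialEquations.jl integrator interface):

```julia
using OrdinaryDiffEq

# Toy oscillator standing in for the real system
f!(du, u, p, t) = (du[1] = u[2]; du[2] = -u[1])
prob = ODEProblem(f!, [1.0, 0.0], (0.0, 10_000.0))

# save_everystep = false: the integrator keeps only its current state,
# so memory stays constant no matter how long you integrate
integ = init(prob, Vern9(); save_everystep = false)

event_times = Float64[]
threshold = 0.9   # hypothetical extreme-event criterion
while integ.t < 10_000.0
    step!(integ)
    if integ.u[1] > threshold
        push!(event_times, integ.t)  # store only the events
    end
end
```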

By the way, I would be very interested in a function `extreme_events(ds::DynamicalSystem, ...)` that computes extreme events given some quantifier of the event/state. This could be done with callbacks or with integrator looping; it is very common in nonlinear dynamics. Perhaps you would like to contribute this? In any case, you should participate in the discussion here and mention your use case, so I can see what kinds of use cases people are interested in:

https://github.com/JuliaDynamics/DynamicalSystems.jl/issues/189
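As one possible callback-based sketch of such event detection (the condition function and threshold are hypothetical; `ContinuousCallback` is the standard DifferentialEquations.jl event-handling mechanism):

```julia
using OrdinaryDiffEq

# Toy oscillator as a stand-in system
f!(du, u, p, t) = (du[1] = u[2]; du[2] = -u[1])
prob = ODEProblem(f!, [1.0, 0.0], (0.0, 10_000.0))

event_times = Float64[]
threshold = 0.9  # hypothetical event quantifier

# The event fires whenever the condition crosses zero, i.e. whenever
# u[1] crosses the threshold; only the event times are stored.
condition(u, t, integ) = u[1] - threshold
affect!(integ) = push!(event_times, integ.t)
cb = ContinuousCallback(condition, affect!)

sol = solve(prob, Tsit5(); callback = cb, save_everystep = false)
```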

Thank you. I can try to contribute this. I know some basic criteria for extreme events.