I am looking at a biochemical reaction network and how it behaves under noise. The code looks something like:

```
gr()
...
prob_sde = SDEProblem(model,u0,(0.,1000.))
sol = solve(prob_sde,ImplicitEM(),dt=1//2^(7),callback=cb)
plot(sol)
```

(the callback ensures that the solution does not go into negative values)
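For reference, a sketch of the kind of positivity callback I mean (my actual `cb` differs in the details; the clamp here is just illustrative):

```julia
using DifferentialEquations

# Illustrative positivity callback: whenever any component goes negative,
# clamp the state back to zero at the end of the step.
condition(u, t, integrator) = any(x -> x < 0, u)
affect!(integrator) = (integrator.u .= max.(integrator.u, 0.0))
cb = DiscreteCallback(condition, affect!)
```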

I have a term which I use to linearly scale the amount of noise. One problem I have had is that if the noise term is large enough, I get an error:

```
WARNING: Newton steps could not converge and algorithm is not adaptive. Use a lower dt.
```

However, when I decrease the `dt` to `dt=1//2^(10)`, things get *very* sluggish and Jupyter starts to crash. I have identified that this is due to the plotting (if I remove it, things work). With `dt=1//2^(10)` and a time interval `(0.0,1000.0)`, that means about a million timepoints (multiply that by 5, the number of variables in the system), so maybe I should not be surprised that I am having problems?
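Back-of-envelope, the raw storage seems modest; the pain presumably comes from asking the plot to draw that many points:

```julia
# Rough count for dt = 1//2^10 over t in (0, 1000):
nsteps = 1000 * 2^10              # 1_024_000 saved timepoints
nvars  = 5
mb = nsteps * nvars * 8 / 1e6     # Float64 storage, ≈ 41 MB
# ~41 MB of state is fine, but GR still has to render ~10^6-point series
```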

What is the correct approach to my issues? For the deterministic solution I can use the `plotdensity` flag, but not here. Maybe that is with good reason, as interpolation is not as obvious when I have a stochastic problem? I have a feeling that I might have to be more careful here than in the deterministic case… Does anyone have any input? Surely there should be some way to use small time steps and still have something useful to plot?
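One thing I have been experimenting with (unsure whether it is the right approach for an SDE) is decoupling the save grid from the step size with `saveat`, so the solver still steps at the small `dt` but the solution object only keeps points on a coarse grid:

```julia
# Step at the small fixed dt, but only save every 1.0 time units,
# so sol (and the plot) holds ~1000 points instead of ~10^6.
sol = solve(prob_sde, ImplicitEM(), dt=1//2^(10),
            saveat=1.0, callback=cb)
plot(sol)
```

Is subsampling like this statistically safe here, or does it throw away something I should keep?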