Future.randjump and accumulate?

I am getting the following warning with nightly:

 Warning: `randjump(rng, steps::Integer, len::Integer)` is deprecated; use `Future.randjump` and `accumulate` instead

The only Future I could find is Distributed.Future, which is a DataType.

The warnings initially complained about `randjump(rng, n*(gpt+1))`, saying to use `Random.randjump(rng, big(10)^20, n*(gpt+1))` instead, which, when I did, gave the above warning. Any clues or pointers to relevant docs?

My version of master appears to have it:

julia> VERSION

julia> using Future

help?> Future.randjump
  randjump(r::MersenneTwister, steps::Integer) -> MersenneTwister

Create an initialized MersenneTwister object, whose state is moved forward (without generating numbers) from r by steps steps. One such step corresponds to the generation of
two Float64 numbers. For each different value of steps, a large polynomial has to be generated internally. One is already pre-computed for steps=big(10)^20.
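
For reference, a minimal sketch of the replacement call, assuming a Julia version where `Future` is available as a stdlib:

```julia
using Future, Random

rng = MersenneTwister(1234)

# Jump forward by the pre-computed number of steps; this returns a
# new, independent MersenneTwister and leaves `rng` untouched.
rng2 = Future.randjump(rng, big(10)^20)

rng2 isa MersenneTwister  # true
rng2 !== rng              # true: a separate generator object
```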

Found it: it was masked by an "existing identifier" conflict, the single most annoying error when you forget to import some stdlib package and have to restart the REPL to fix it. Thanks.

For the record, this is how I managed to replicate the default behavior of the old randjump (first attempt, later corrected per the reply below):

 oldrandjump(mt, len) = accumulate(Future.randjump, [0; [big(10)^20 for i in 1:len-1]], init = mt)

corrected:

 oldrandjump(mt, len) = [mt; accumulate(Future.randjump, [big(10)^20 for i in 1:len-1], init = mt)]

I am not sure if there is a more obvious way, as this one relies on the non-obvious behavior of accumulate with a function whose two arguments have different types.
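
That accumulate pattern, where the accumulator type differs from the element type, can be seen in a simpler setting; here a String accumulator over Ints stands in for the MersenneTwister over step counts:

```julia
# accumulate(f, xs; init) threads an accumulator through xs:
# f's first argument is the running accumulator (here a String),
# its second is the current element (here an Int).
result = accumulate((acc, x) -> acc * string(x), [1, 2, 3]; init = "s")
# result == ["s1", "s12", "s123"]
```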


This is not exactly correct. The old randjump returned the source object as the first element of the array, while your function returns a copy. So you should prepend mt to the generated sequence instead of adding a 0 element inside.

Actually, there is an example of this in the Julia manual, in the last example of this section: https://docs.julialang.org/en/latest/manual/parallel-computing/#The-@threads-Macro-1.
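
The difference is easy to check: with the corrected version, the first element is the source RNG itself, not a jumped copy (a small sketch, reusing the corrected oldrandjump from above):

```julia
using Future, Random

# Corrected replacement for the old randjump(rng, len):
# the source RNG is prepended, so element 1 is `mt` itself.
oldrandjump(mt, len) = [mt; accumulate(Future.randjump, [big(10)^20 for _ in 1:len-1]; init = mt)]

mt = MersenneTwister(0)
rngs = oldrandjump(mt, 4)

rngs[1] === mt     # true: the same object, not a copy
length(rngs) == 4  # one independent RNG per consumer
```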


Corrected and thanks for the pointer.
