I have a specific question. I am working on abductive explanations of a neural network f. An abductive explanation is a subset of indices I of the input x that guarantees the model provides the same output, i.e. (∀z)((∀i ∈ I) x[i] == z[i]) ⟹ (f(x) == f(z)) — in other words, no adversarial sample exists. The search requires repeatedly showing that no adversarial sample exists, for which I use MILP (currently HiGHS, but once I am back in the office I will switch to Gurobi). I have found that one frequently finds an adversarial sample by sheer luck, and random sampling is very effective. But I would like something like random sampling with certification: for small degrees of freedom, I would like to certifiably exhaust the input space. For that, I would like to iterate over all values in 0:2^n-1, but ideally in random order. So my question is whether anyone knows an algorithm that allows this without materializing all values in memory — i.e. I am not interested in the solution `randperm(2^n)`, because I want to iterate in a few batches.

Thanks in advance for an answer (a negative one is fine, because then I would not need to think about it).

This is something I had in mind. But I need to be certain that I iterate over each value exactly once and that I provably cover all values in 0:2^N-1, for varying N.

Depending on how you want to divide it into batches, you can save the state of the RNG to use in the next batch, or just put it in a for loop if you’re iterating all at once.

An ideal block cipher on N bits is indistinguishable from a random permutation of 0:2^N-1. In other words, you get your random permutation as permuted(key, n) = encrypt(key, n).

So you need a block cipher with N as its block size, for non-standard N. You get that with e.g. Luby-Rackoff / Feistel, where you simply plug in any reasonable keyed function as the round function (e.g. AES if you have hardware acceleration, otherwise ChaCha8 or something like that).
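A minimal sketch of this construction, in Python for concreteness (the thread’s `randperm` suggests Julia, but the idea translates directly). The round function here is a keyed BLAKE2b hash from the standard library, standing in for the AES/ChaCha round functions suggested above; the names `feistel_permute` and `_round` are illustrative, not from any library:

```python
# Balanced Feistel permutation on {0, ..., 2^N - 1}, for even N.
# A Feistel network is a bijection for ANY round function, so this
# visits every value exactly once regardless of the hash used.
import hashlib

def _round(key: bytes, r: int, half: int, bits: int) -> int:
    """Keyed round function: hash (key, round index, half-block) to `bits` bits."""
    h = hashlib.blake2b(half.to_bytes(8, "big"), key=key,
                        person=bytes([r]), digest_size=8)
    return int.from_bytes(h.digest(), "big") & ((1 << bits) - 1)

def feistel_permute(key: bytes, x: int, n_bits: int, rounds: int = 4) -> int:
    """Bijectively map x in [0, 2^n_bits) to the same range (n_bits even)."""
    assert n_bits % 2 == 0
    half_bits = n_bits // 2
    mask = (1 << half_bits) - 1
    left, right = x >> half_bits, x & mask
    for r in range(rounds):
        # Standard Feistel step: (L, R) -> (R, L xor F(R)); always invertible.
        left, right = right, left ^ _round(key, r, right, half_bits)
    return (left << half_bits) | right
```

Iterating `feistel_permute(key, i, N)` for `i = 0 .. 2^N - 1` then enumerates all of `0:2^N-1` exactly once, in a key-dependent pseudorandom order, with no state to materialize — so batching is just splitting the index range `i`.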

This is certainly not optimal (you’re wrapping another layer around AES, which already has 10–14 rounds of its own), but it should still be reasonably fast.

PS. I am assuming that you need the 30 < N < 70 regime? For smaller N, just do the usual randperm; for larger N, just use random outputs, since you’re not going to exhaust the stream anyway.

This sounds like a nice solution.
Do I have a guarantee that I will iterate over all values?

My N is currently upper-bounded by 24. With that, I can linearly try all 2^24 values in about 30 s, which is OK. Since I am searching for the first adversarial sample (existence is sufficient), I was thinking that I might benefit from random search rather than linear search. But the overhead has to be small, since testing one value takes around 2 μs.

If you need a permutation of 0:2^N-1, modular arithmetic is your friend. Multiply by an odd number, add another number, and take the result modulo 2^24, and you have permuted 0…2^24-1. This isn’t cryptographically random, but it is random-ish (what is the adversary here?).
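The affine map can be sketched as follows (Python for concreteness; the constants `a` and `b` and the name `affine_permute` are arbitrary illustrative choices, not from the thread):

```python
# Affine permutation of 0 .. 2^n_bits - 1: i -> (a*i + b) mod 2^n_bits.
# Any odd multiplier a is coprime to a power of two, hence invertible
# modulo 2^n_bits, so the map is a bijection.
def affine_permute(i: int, n_bits: int = 24,
                   a: int = 0x9E3779B1 | 1,  # arbitrary odd multiplier
                   b: int = 0xDEADBEEF) -> int:  # arbitrary offset
    mask = (1 << n_bits) - 1
    return (a * i + b) & mask
```

Iterating `affine_permute(i)` for `i = 0 .. 2^n_bits - 1` visits every value exactly once, at the cost of one multiply, one add, and one mask per value — well within the ~2 μs per-value budget mentioned above, at the price of a much less random-looking order than a Feistel cipher gives.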