How dangerous is it to run `Polyester.@batch` inside of a `Threads.@threads` loop?
An example where I might need this: a library implements a low-level function that uses Polyester for multithreading, and I want to run multiple such functions in parallel.
Furthermore, to make this actually useful (i.e., not letting a single `Polyester.@batch` hog all the threads), is it possible to control the number of Polyester threads with a context manager of some kind? E.g., how do I implement this pseudocode:
```julia
function f(arg)
    println(arg)
    Polyester.@batch for i in 1:10000
        do_something()
    end
end

Threads.@threads for j in 1:2
    set_max_polyester_threads(2)  # pseudocode: cap Polyester at 2 threads within this task
    f(j)
end
```
I want this code to cause 4 threads to run in parallel: two threads dedicated to `f(1)` and two threads dedicated to `f(2)`. I want this to work in a situation in which I cannot modify `f` itself. Is this possible?
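The closest thing I have found so far is `Polyester.disable_polyester_threads`, which (if I read the Polyester README correctly) takes a closure and makes any `@batch` inside it run serially. That disables the inner threading entirely rather than capping it at 2, so it only gets me 2 threads total instead of the 4 I want, but it at least composes with an outer `Threads.@threads` loop without modifying `f` (here `do_something` is a placeholder as above):

```julia
using Base.Threads: @threads
using Polyester

do_something() = nothing  # placeholder for the library's inner work

function f(arg)  # stand-in for the library function I cannot modify
    println(arg)
    Polyester.@batch for i in 1:10000
        do_something()
    end
end

# Each outer Julia thread runs f with Polyester's internal
# threading switched off inside the closure, so the nested
# @batch degrades to a serial loop instead of oversubscribing.
@threads for j in 1:2
    Polyester.disable_polyester_threads() do
        f(j)
    end
end
```

Is there a variant of this that limits `@batch` to N threads instead of disabling it?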
Background: I am using Polyester because benchmarking has shown that cheap threads for the “inner” problem do provide a big performance gain. On the other hand, the “inner” problem cannot effectively use more than a handful of threads. The outer problem (running the inner problem multiple times) is embarrassingly parallel.