I have 3 models with 10 parameters in total. The data I'm using is very noisy, and for each parameter set (particle), i.e. for each iteration, the models have to be simulated 24 times to produce the output that is fed into the likelihood. As a result, 6000 iterations take about 1 hour, while 60000 take about 2.5 days. The main issue is not the runtime itself but waiting blindly: there are no intermediate results until the whole run is done. If I need to check, say, three different prior combinations, this approach costs 7.5 days, which is not workable.
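For concreteness, the shape of the likelihood I mean is roughly the following. This is only a sketch: `simulate`, the Gaussian scoring, and all numbers are illustrative stand-ins, not my actual simulator.

```python
import numpy as np

N_SIM = 24  # noisy forward runs averaged per parameter set, as described above

def simulate(theta, rng):
    # Toy stand-in for one noisy run of the models; the real simulator
    # is expensive, this line is purely illustrative.
    return theta.sum() + rng.normal(scale=1.0)

def log_likelihood(theta, data, rng):
    # Average N_SIM simulations to get a stable summary, then score the
    # data against it (a simple synthetic-likelihood-style Gaussian score).
    sims = np.array([simulate(theta, rng) for _ in range(N_SIM)])
    mu, sigma = sims.mean(), sims.std(ddof=1) + 1e-9
    return -0.5 * np.sum((data - mu) ** 2 / sigma ** 2
                         + np.log(2 * np.pi * sigma ** 2))
```

Each likelihood evaluation therefore costs 24 simulator calls, which is where all the runtime goes.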
If there were a way to get intermediate results, say after every 6000 iterations, that would be ideal. At the very least I could stop the inference early if I see that things are not working out.
So my question to everyone is: is there any way to do this?
I tried restarting the sampling in a loop after every 1000 iterations, but it does not seem to do exactly what I want.
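To make it concrete, the behaviour I'm after is something like the loop below. This is only a sketch using emcee, one library whose `EnsembleSampler` returns a state that can be resumed between blocks; my actual sampler is different, and the numbers are illustrative.

```python
import numpy as np
import emcee

ndim, nwalkers, chunk = 10, 32, 1000  # 10 parameters as in my setup; rest illustrative

def log_prob(theta):
    # Placeholder target; in my case this would call the 24-simulation
    # likelihood from above plus the log prior.
    return -0.5 * np.sum(theta ** 2)

rng = np.random.default_rng(0)
state = rng.normal(size=(nwalkers, ndim))  # initial walker positions

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
for block in range(6):  # e.g. 6 blocks x 1000 = 6000 iterations
    # Resume from the last state instead of restarting from scratch.
    state = sampler.run_mcmc(state, chunk, progress=False)
    chain = sampler.get_chain(flat=True)
    np.save("checkpoint.npy", chain)  # crude checkpoint; emcee's HDFBackend also works
    print(f"block {block}: {chain.shape[0]} draws, "
          f"mean acceptance {sampler.acceptance_fraction.mean():.2f}")
    # At this point I can inspect the intermediate chain and abort early
    # if the run looks hopeless.
```

The key point of this pattern is feeding the returned `state` back into `run_mcmc`, so each block continues the same chain rather than restarting it, which I suspect is what my loop was getting wrong.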