In my field we usually do imaging, where the detector input signals often have only ~8 bit of depth, so a reconstructed image in 32-bit precision quite often works totally fine.
Further, since we optimize over 2D or 3D arrays, we have ~millions of parameters, and without AD and gradient-descent-like schemes we can’t do much.
Optimization with millions of decision variables? That’s remarkable! I had no idea the state of the art for optimization had progressed so far. Do you need any special hardware setup? What optimizer do you use? Can you point me to any papers that describe such an optimization? TIA!
We reconstruct whole arrays; for example, an array of size 512×512×512 can be reconstructed on modern CPUs or GPUs within minutes or less.
This paper is under review, but it shows how we do a deconvolution (i.e., an optimization) with Optim.jl, Zygote, and CUDA.
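To make the idea concrete, here is a minimal sketch of deconvolution posed as a gradient-based optimization with Optim.jl and Zygote, as described above. The blur kernel, signal size, and plain least-squares loss are illustrative assumptions on my part, not the paper’s actual setup (which would also move the arrays to the GPU via CUDA.jl):

```julia
using Optim, Zygote

# Forward model: a toy 1D "blur" that averages each sample with its
# neighbors (a stand-in for a real PSF convolution).
blur(x) = (x .+ circshift(x, 1) .+ circshift(x, -1)) ./ 3

truth    = rand(Float32, 64)   # unknown signal we try to recover
measured = blur(truth)         # simulated measurement

# Least-squares data fidelity; real pipelines typically add a regularizer.
loss(x) = sum(abs2, blur(x) .- measured)

# Zygote supplies the gradient, Optim.jl runs the descent (here L-BFGS).
g!(G, x) = copyto!(G, first(Zygote.gradient(loss, x)))
res = optimize(loss, g!, zeros(Float32, 64), LBFGS())
```

The same pattern scales to 2D/3D arrays: only the forward model and the array sizes change, while AD keeps the gradient code automatic.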
This is just one use case, but in Deep Learning people optimize upwards of a billion parameters.