Performing max-min optimization with Optimization.jl

Hello,

Is it possible to solve minimax optimization problems for neural networks using Optimization.jl? I am currently running the optimization with Optimization.solve(...). Specifically, I want to multiply the gradients of the loss function with respect to a subset of the parameters by -1, so that the loss is maximized with respect to those parameters and minimized with respect to the rest. Since I am using solve(...), I assume this sign flip would have to happen in the callback function? For clarification: my code is quite long, and I would prefer to keep it in its current form based on Optimization.solve(...).
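One way to sketch the sign-flip idea without touching the callback is to supply a custom gradient to `OptimizationFunction`, which computes the ordinary gradient and then negates the entries belonging to the "max" parameters. The toy loss, the index set `max_idx`, and the choice of Zygote/Adam below are all assumptions for illustration, not your actual setup:

```julia
using Optimization, OptimizationOptimisers, Zygote

# Hypothetical toy objective: minimize over u[1], maximize over u[2].
# Replace with your network loss.
loss(u, p) = (u[1] - 1)^2 - (u[2] - 2)^2

max_idx = [2]  # assumed indices of the parameters to maximize over

# Custom in-place gradient: compute the true gradient, then flip the
# sign on the max-player's coordinates, so a descent-based optimizer
# effectively ascends along those directions.
function flipped_grad!(G, u, p)
    G .= Zygote.gradient(v -> loss(v, p), u)[1]
    G[max_idx] .*= -1
    return nothing
end

optf = OptimizationFunction(loss; grad = flipped_grad!)
prob = OptimizationProblem(optf, zeros(2))
sol = Optimization.solve(prob, Adam(0.05); maxiters = 2000)
# sol.u[1] should move toward 1 (min player),
# sol.u[2] should move toward 2 (max player)
```

Note that this simultaneous gradient descent-ascent is only a heuristic for minimax problems; it can cycle or diverge on some objectives, so treat it as a starting point rather than a guaranteed solver.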