I have a large matrix containing sampled values of a convex 2D function. I want to find its minimum efficiently, without visiting every element, given the indices of a good starting guess close to the minimum.
For example:
x = LinRange(-3,3,100)
y = LinRange(-3,3,100)
z = (x .- 1.1).^2 .+ (y' .- 1.1).^2 # my matrices are much larger than 100x100
julia> findmin(z)
(0.0008999081726354381, CartesianIndex(69, 69))
julia> findminopt(z, CartesianIndex(65, 67))
# same answer but uses local search starting at (65,67)
Is there a package that provides simple hill climbing over matrices? Something like Optim.jl, except for discrete rather than continuous optimization (and taking a matrix instead of an explicit function to minimize).
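To make it concrete, here is a rough sketch of what I imagine findminopt doing (findminopt is just a name I made up): greedy descent over the 8 neighbours of the current index, assuming convexity of the sampled values so the local search reaches the global minimum.

function findminopt(z::AbstractMatrix, start::CartesianIndex{2})
    best = start
    bestval = z[best]
    while true
        improved = false
        # look at the 8 neighbours of the current best index
        for d in CartesianIndices((-1:1, -1:1))
            d == CartesianIndex(0, 0) && continue
            n = best + d
            if checkbounds(Bool, z, n) && z[n] < bestval
                bestval, best = z[n], n
                improved = true
            end
        end
        # stop when no neighbour improves on the current value
        improved || return (bestval, best)
    end
end

The point is that the cost scales with the length of the descent path from the starting guess rather than with the size of the matrix, which is what I'm after; I'd just prefer an existing, well-tested implementation over rolling my own.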