JuMP optimization with vector input and analytical gradient

Docs:
https://jump.dev/JuMP.jl/stable/manual/nlp/#Multivariate-functions

I guess you want something like the following (I have not run it, so there may be typos):

using JuMP

# Objective written as a splatted function of scalar arguments
f(x...) = (x[1] - 1)^2 + (x[2] - 2)^2

# Analytical gradient: fills `g` in place and returns nothing
function ∇f(g::AbstractVector{T}, x::T...) where {T}
    g[1] = 2 * (x[1] - 1)
    g[2] = 2 * (x[2] - 2)
    return
end

model = Model()
# Register the 2-argument function together with its gradient
register(model, :my_square, 2, f, ∇f)
@variable(model, x[1:2] >= 0)
@NLobjective(model, Min, my_square(x...))
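
To actually solve it, attach a nonlinear solver and call optimize!. The choice of Ipopt below is just an assumption; any NLP solver that works with user-defined functions would do:

using Ipopt
set_optimizer(model, Ipopt.Optimizer)
optimize!(model)
value.(x)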

You should also read:
https://jump.dev/JuMP.jl/stable/background/should_i_use/#Black-box,-derivative-free,-or-unconstrained-optimization
There are other tools in Julia that may be better suited if you have an unconstrained problem.
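
For example, if your real problem has no constraints (unlike the x >= 0 bounds above), Optim.jl takes an objective plus an in-place gradient directly. A rough, untested sketch of the same problem:

using Optim

f_vec(x) = (x[1] - 1)^2 + (x[2] - 2)^2
function g!(g, x)
    g[1] = 2 * (x[1] - 1)
    g[2] = 2 * (x[2] - 2)
end
result = optimize(f_vec, g!, [0.0, 0.0], LBFGS())
Optim.minimizer(result)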