check_gradient in Manopt.jl

I am working on an optimization problem and want to verify that the gradient I have is correct.

When using the check_gradient function, I find that the gradient check only passes some of the time (about 90% of runs).

Here is code to reproduce the problem:

using LinearAlgebra, Manifolds, Manopt

for test in 1:100
    k = rand(1:20)
    n = k + rand(1:10)
    A = rand(k, n)
    # A is k×n, so size(A, 2) == n; X lives on St(n, p) with p < n - k + 1
    M = Stiefel(n, rand(1:n-k))
    # cost: squared Frobenius norm of A*X
    function f(M, X)
        return norm(A * X, 2)^2
    end
    # Riemannian gradient: project the Euclidean gradient onto the tangent space
    function grad_f(M, X)
        euclidean_grad = 2 * transpose(A) * A * X
        return project(M, X, euclidean_grad)
    end
    grad_result = check_gradient(M, f, grad_f)
    println("gradient matches at test $(test): ", grad_result)
end
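As an independent sanity check (outside Manopt), the same candidate gradient can be compared against a finite-difference directional derivative along a tangent direction. A minimal NumPy sketch of that idea; the dimensions, seed, and the QR construction of the Stiefel point are arbitrary choices of mine, not taken from the code above:

```python
import numpy as np

rng = np.random.default_rng(0)
k, n, p = 4, 7, 2                      # arbitrary small dimensions
A = rng.standard_normal((k, n))

# A point on the Stiefel manifold St(n, p): orthonormal columns via QR.
X, _ = np.linalg.qr(rng.standard_normal((n, p)))

def f(X):
    # ||A X||_F^2, matching norm(A*X, 2)^2 above
    return np.linalg.norm(A @ X, "fro") ** 2

def proj(X, G):
    # Projection onto the tangent space of St(n, p) at X (embedded metric):
    # proj_X(G) = G - X * sym(X' G)
    S = X.T @ G
    return G - X @ (S + S.T) / 2

egrad = 2 * A.T @ A @ X                # candidate Euclidean gradient
rgrad = proj(X, egrad)                 # candidate Riemannian gradient

# Finite-difference directional derivative along a random tangent direction V.
V = proj(X, rng.standard_normal((n, p)))
t = 1e-6
fd = (f(X + t * V) - f(X - t * V)) / (2 * t)
ip = np.sum(rgrad * V)                 # <grad f(X), V> in the embedded metric

print(abs(fd - ip))                    # tiny if the gradient is correct
```

Since V is tangent at X, the inner product with the projected gradient equals the one with the Euclidean gradient, so the two numbers should agree up to floating-point error at a single point; it does not probe the random dimensions the loop above exercises.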

There is a chance that my gradient is simply wrong, but going by Example 4.34 in Boumal's book (https://www.nicolasboumal.net/book/IntroOptimManifolds_Boumal_2023.pdf) I do not think that is the case (that said, my matrix A is not symmetric…). Is there a chance I am misunderstanding the check_gradient function, and that this high pass rate actually indicates it is working correctly?

Thanks!