Calculate SVM dual loss efficiently

I am trying to implement the SVM dual optimization problem in Julia. My data set is the 60000×784 MNIST digit-recognition dataset. Here are my functions for computing the dual loss.
The linear version is very fast:

function dual_loss(α::Vec, X::Mat, y::Vec) where {T<:AbstractFloat, Vec<:AbstractVector{T}, Mat<:AbstractMatrix{T}}
    t = α .* y
    # the 0.5 factor matches the kernel version below; without it the two disagree
    return sum(α) - 0.5 * t' * X * X' * t
end
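For what it's worth, here is a minimal self-contained sketch (toy random data and sizes of my choosing, not MNIST) of why this version stays fast: because the chain starts from the adjoint vector `t'`, every intermediate is a small row vector, and the enormous `X * X'` matrix is never materialized.

```julia
using LinearAlgebra, Random

Random.seed!(0)
n, d = 200, 30                  # toy sizes; the real problem is 60000×784
X = rand(n, d)
t = rand(n)

# Chained form: (t' * X) is 1×d, so every intermediate stays small.
fast = t' * X * X' * t

# Same value, but X * X' materializes a full n×n matrix (fatal at n = 60000).
slow = t' * (X * X') * t
```

Both expressions agree up to floating-point error (`fast ≈ slow`); only the shapes of the intermediates differ.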

The kernel version, however, is extremely slow, and I am not patient enough to wait for it to finish:

function dual_loss(α::Vec, X::Mat, y::Vec, kernel::Function=dot) where {T<:AbstractFloat, Vec<:AbstractVector{T}, Mat<:AbstractMatrix{T}}
    t = α .* y
    return sum(α) - sum(@views 0.5 * t[i] * t[j] * kernel(X[i, :], X[j, :]) for i=1:size(X, 1), j=1:size(X, 1))
end
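For scale: that generator evaluates the kernel `size(X, 1)^2 ≈ 3.6 × 10^9` times on the full data, so it is dominated by per-call overhead, and the full 60000×60000 Gram matrix would need roughly 29 GB in Float64 anyway. One common restructuring is to build the Gram matrix once and take a single quadratic form; a sketch on a small random subsample (my suggestion, not code from the post; the name `dual_loss_gram` is hypothetical):

```julia
using LinearAlgebra, Random

# Hypothetical restructuring: precompute the Gram matrix K once, then the
# loss is a single quadratic form instead of n^2 scalar kernel calls.
function dual_loss_gram(α, X, y, kernel=dot)
    n = size(X, 1)
    K = [kernel(view(X, i, :), view(X, j, :)) for i in 1:n, j in 1:n]
    t = α .* y
    return sum(α) - 0.5 * t' * K * t
end

Random.seed!(1)
n, d = 100, 20                  # small subsample; the full Gram matrix would not fit in RAM
X = rand(n, d)
y = rand([-1.0, 1.0], n)
α = rand(n)

loss = dual_loss_gram(α, X, y)
```

This gives the same value as the generator-based formula on the same data; it mainly trades redundant slicing and per-call overhead for one dense, BLAS-friendly quadratic form.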

I also tried the MLKernels.jl library, and it is still very slow. I wonder whether this is due to the scale of the problem itself (I'm on a laptop, running on the CPU) or to my code not being optimized enough. If it is the latter, is there any way to improve its performance?