I tested multiplication of two Double64 matrices and saw no effects on performance by using BLAS.set_num_threads(n).
Do you mean Float64? If so, yes. What size matrices?
No, I mean Double64 in DoubleFloats.jl.
It isn’t threaded, and it isn’t using BLAS either: BLAS only supports Float32 and Float64.
To be precise, it supports:

```julia
julia> LinearAlgebra.BlasFloat
Union{Float32, Float64, ComplexF64, ComplexF32}
```
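You can check this directly: a sketch, assuming DoubleFloats.jl is installed, showing why `BLAS.set_num_threads` has no effect on `Double64` matrix multiplication:

```julia
using LinearAlgebra
using DoubleFloats  # provides Double64

# BLAS-backed matmul is only dispatched for BlasFloat element types.
# Double64 is not one, so A * B falls back to Julia's generic matmul,
# which ignores BLAS.set_num_threads entirely.
Float64  <: LinearAlgebra.BlasFloat  # true
Double64 <: LinearAlgebra.BlasFloat  # false
```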