Announcing my first Julia package, 'HypergeoMat'

Hello,

Just to announce my first Julia package.

Cheers.

This is awesome! The math is way over my head, but new packages with detailed docs are a beautiful thing to see. Congrats!

Thanks for sharing this package. In addition to the references provided, it would be nice, if you will, to indicate some applications of these hypergeometric functions of a matrix argument and to provide some links in the documentation to use cases where such code is required.

A small blurb describing the package is generally helpful in the announcement (you can still edit your post).

The package is described in the doc :smile:

I don’t know a ton of applications. The only ones I know are related to random matrix theory: these hypergeometric functions appear in the density functions, Laplace transforms, or Fourier transforms of several kinds of random matrices.
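
To give a concrete flavour of what that looks like in code, here is a rough, hedged sketch; the entry point hypergeomPQ(m, a, b, x) is assumed to mirror the R package of the same name, so check the package docs for the exact API:

using HypergeoMat

# Truncated 1F1 of matrix argument, the kind of series that shows up in
# Wishart-type densities and eigenvalue distributions.
# Assumed signature: hypergeomPQ(m, a, b, x), with m the truncation order,
# a/b the upper/lower parameters, and x a symmetric matrix (or its eigenvalues).
X = [0.5 0.1; 0.1 0.3]            # symmetric matrix argument
hypergeomPQ(20, [2.5], [4.0], X)  # ≈ 1F1(2.5; 4.0; X), series truncated at order 20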

That’s an excellent enough application in itself!

Thanks.

Here is my second Julia package (Jack polynomials), related to the first one.

Damn, fast production! :slight_smile:

Lol, no, I had this code for a while, and I just packaged it now that I know how to make a package :slight_smile:

Now my R package jack can calculate the Jack polynomials with Julia (thanks to the JuliaConnectoR package). Observe the difference in speed (Julia 1.6):

## Unit: milliseconds
##   expr          min           lq       mean     median         uq       max
##      R 15767.690900 16131.731601 16546.8662 16381.9088 17123.5399 17329.460
##  Julia     8.395101     9.733001   458.8062    11.0415    90.6334  2174.228

About 1,500× faster than R (comparing medians: 16382 ms vs 11 ms)! Amazing.

That’s strange. On my Linux laptop I have Julia 1.7 and it is slower:

Unit: milliseconds
  expr         min          lq       mean      median          uq       max neval cld
     R 21411.06068 21418.31391 21475.3985 21423.03423 21531.32397 21593.260     5   b
 Julia    48.71635    49.13159   312.7035    52.17541    84.16258  1329.331     5  a 

Hmm, it looks like the slowness comes from the R side. If I benchmark directly in Julia 1.7, it takes only 3 ms.

julia> @benchmark Jack([1/2; 2/3; 1; 2/3; -1; -2; 1], [5;3;2;2;1], 3.0)
BenchmarkTools.Trial: 1575 samples with 1 evaluation.
 Range (min … max):  2.823 ms …   7.333 ms  ┊ GC (min … max): 0.00% … 47.20%
 Time  (median):     2.873 ms               ┊ GC (median):    0.00%
 Time  (mean ± σ):   3.162 ms ± 945.253 μs  ┊ GC (mean ± σ):  8.44% ± 14.43%

Update of JackPolynomials.jl: it can now deal with a symbolic parameter alpha:

julia> jack = JackPolynomial(2, [2 ; 1])
(alpha + 2)*x_1^2*x_2 + (alpha + 2)*x_1*x_2^2

julia> jack(1,2)
6*alpha + 12

Thanks to AbstractAlgebra.
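
In case anyone wonders how a symbolic alpha can be handled, here is a rough sketch of the general idea with a recent AbstractAlgebra (an illustration only, not necessarily the package's actual internals): work over the rational function field Q(alpha) and build the polynomial in x_1, x_2 with coefficients in that field.

using AbstractAlgebra

# Sketch: represent alpha symbolically via the fraction field Q(alpha).
R, t = polynomial_ring(QQ, "alpha")               # Q[alpha]
F = fraction_field(R)                             # Q(alpha)
S, (x1, x2) = polynomial_ring(F, ["x_1", "x_2"])  # Q(alpha)[x_1, x_2]

alpha = F(t)
jack_21 = (alpha + 2) * x1^2 * x2 + (alpha + 2) * x1 * x2^2  # the polynomial shown above

evaluate(jack_21, [F(1), F(2)])                   # 6*alpha + 12, as in the example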
