[ANN] EntropyHub.jl

This post announces the release of EntropyHub.jl, a package for estimating various information-theoretic entropy measures from time series and image data.

EntropyHub.jl (v0.1) features functions for computing:

  • standard (base) entropy measures
    e.g. sample entropy, fuzzy entropy, permutation entropy, slope entropy, and much more.
  • cross-entropy methods
    e.g. cross-approximate entropy, cross-Kolmogorov entropy, and more.
  • multiscale entropy using any of the standard entropies
    e.g. multiscale dispersion entropy, refined multiscale approximate entropy, and much more.
  • multiscale cross-entropy using any of the cross-entropies
    e.g. multiscale cross-conditional entropy, composite multiscale cross-distribution entropy, and more.
  • bidimensional entropy measures for matrix (image) data
    e.g. bidimensional fuzzy entropy, bidimensional dispersion entropy.

The EntropyHub project aims to bring together the many entropy measures in the scientific literature under one complete package, available in Julia, Python, and MATLAB.
:juliabouncer: :snake:


EntropyHub is in the Julia Registry and can be installed in the Pkg REPL as follows:

pkg> add EntropyHub

Documentation for EntropyHub.jl, with descriptions of function syntax and examples of use, can be found at:
MattWillFlood.github.io/EntropyHub.jl/stable and in the EntropyHub Guide.pdf.
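For a quick taste, here is a minimal sketch of computing permutation entropy. The `PermEn` call signature matches the one used later in this thread; the names of the three return values follow the EntropyHub documentation, so please verify them against the docs:

```julia
using EntropyHub   # assumes `pkg> add EntropyHub` has been run

x = rand(5000)     # a toy time series

# Permutation entropy for embedding dimensions 1:3 with time delay 1.
# Per the EntropyHub docs, this returns the raw entropy, the normalized
# entropy, and the conditional permutation entropy for each dimension.
Perm, Pnorm, cPE = PermEn(x, m = 3, tau = 1)
```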

EntropyHub aims to integrate as many established entropy methods as possible into one comprehensive package. We welcome suggestions and support from all in helping to achieve this :slightly_smiling_face:



Are you aware of Entropies.jl? Have you compared any implementations, especially w.r.t. performance? I’d be interested to see the results.


I just ran a benchmark; if you can post your own timing for permutation entropy, I’d appreciate it.

julia> using DynamicalSystems, BenchmarkTools

julia> m = 3; τ = 1; x = rand(10000);

julia> @btime permentropy(x, m; τ) # base e
  163.200 μs (28 allocations: 470.09 KiB)

Are you also aware that the naming convention you use conflicts with standard Julia style, and may confuse potential Julia users?

Julia recommends: Style Guide · The Julia Language

But every function I’ve checked uses CamelCaseNotation instead of lower_case_notation. If the CamelCase is not really necessary due to field-related conventions, I believe you should definitely consider following the lowercase version.

Hi @Datseris :slightly_smiling_face:

Yes, we are aware of Entropies.jl and the wonderful functionality it provides.
While Entropies.jl provides many ways of estimating Shannon, Rényi, and similar generalized entropies, EntropyHub.jl focuses on the broad array of entropy measures introduced over the last few decades for various time series applications (see Entropy | Free Full-Text | The Entropy Universe).

Comparing permentropy to PermEn, the former is indeed a lot faster:

@btime permentropy(x, m; τ)
  151.700 μs (27 allocations: 470.03 KiB)

@btime PermEn(x, m=3, tau=1, Logx=0)
  7.858 ms (90229 allocations: 7.95 MiB)
([-0.0, 0.6931449751172578, 1.7916762374702377], ...)

Actually, it appears that permentropy has a bug: it returns the same value regardless of m or τ.

julia> permentropy(x, 3; τ = 1)

julia> permentropy(x, 3; τ = 4)

julia> permentropy(x, 4; τ = 4)

julia> permentropy(x, 7; τ = 4)

The EntropyHub version of permutation entropy returns the entropy values for all embedding dimensions from 1:m, which probably slows it down.

However, one big advantage of the EntropyHub version is that it provides numerous additional arguments so one can estimate the edge [1], modified [2], amplitude-aware [3], weighted [4], uniform-quantization [5] and fine-grained [6] variants of permutation entropy too!
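For illustration, a hedged sketch of selecting one of these variants; the keyword name (`Typex`) and its string values are taken from the EntropyHub documentation, so double-check them there:

```julia
using EntropyHub

x = rand(5000)

# Weighted permutation entropy [4]; per the EntropyHub docs, other values
# of `Typex` reportedly include "edge", "modified", "ampaware",
# "uniquant", and "finegrain".
Perm_w, _, _ = PermEn(x, m = 4, tau = 2, Typex = "weighted")
```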

As EntropyHub was originally developed in MATLAB and Python, we wanted to keep the syntax consistent across all platforms; that’s why CamelCase was used (otherwise we would have followed the style guide exactly :wink:)

Thanks :slightly_smiling_face:
I hope you find the package useful! :+1:t3:

[1] Zhiqiang Huo, et al., "Edge Permutation Entropy: An Improved Entropy Measure for Time-Series Analysis," 45th Annual Conference of the IEEE Industrial Electronics Society (2019): 5998-6003.
[2] Chunhua Bian, et al., "Modified permutation-entropy analysis of heartbeat dynamics," Physical Review E 85.2 (2012): 021906.
[3] Hamed Azami and Javier Escudero, "Amplitude-aware permutation entropy: Illustration in spike detection and signal segmentation," Computer Methods and Programs in Biomedicine 128 (2016): 40-51.
[4] Bilal Fadlallah, et al., "Weighted-permutation entropy: A complexity measure for time series incorporating amplitude information," Physical Review E 87.2 (2013): 022911.
[5] Zhe Chen, et al., "Improved permutation entropy for measuring complexity of time series under noisy condition," Complexity 1403829 (2019).
[6] Xiao-Feng Liu and Wang Yue, "Fine-grained permutation entropy as a measure of natural complexity for time series," Chinese Physics B 18.7 (2009): 2690.


I have a question on the Licensing, you state:

License and Terms of Use

EntropyHub is licensed under the Apache License (Version 2.0) and is free to use by all on condition that the following reference be included on any outputs realized using the software:

    Matthew W. Flood and Bernd Grimm, 
    EntropyHub: An Open-Source Toolkit for Entropic Time Series Analysis,
    2021 www.EntropyHub.xyz

Does this mean that anybody using your code in their projects must return something like:

PermEnNewProject(x, m=3, tau=1, Logx=0)

Matthew W. Flood and Bernd Grimm, 
    EntropyHub: An Open-Source Toolkit for Entropic Time Series Analysis,
    2021 www.EntropyHub.xyz

([-0.0, 0.6931449751172578, 1.7916762374702377],...)

Hi @viraltux :slightly_smiling_face:

Good question :+1:t3:
That statement just means that if you use EntropyHub in your research, you should include that citation in any scientific outputs (journal/conference papers, presentations, etc.)



Please provide underscore_case versions of functions to reduce unnecessary confusion for Julia users.

New packages that expand Julia’s application ecosystem are always welcome!


Sounds like metaprogramming. Does anyone know how to write a macro (or something similar) to generate underscore_case aliases from CamelCase for exported functions?

If so, that macro should exist for camel case, no case, and snake case. :blush:
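A minimal sketch of such a macro, assuming the simple CamelCase → snake_case rule of inserting an underscore before any uppercase letter that follows a lowercase letter or digit (the names `snake_case` and `@alias_snake` are hypothetical, not part of EntropyHub):

```julia
# Convert a CamelCase name to snake_case, e.g. "PermEn" -> "perm_en".
function snake_case(name::AbstractString)
    return lowercase(replace(name, r"(?<=[a-z0-9])([A-Z])" => s"_\1"))
end

# Define a snake_case alias for an existing function.
macro alias_snake(f)
    alias = Symbol(snake_case(String(f)))
    return esc(:(const $alias = $f))
end

# Usage (with a stand-in function):
PermEn(x; m = 3) = length(x) + m   # stub for illustration
@alias_snake PermEn                # defines `perm_en` as an alias of PermEn
perm_en([1, 2, 3])                 # -> 6
```

Because the alias is bound with `const`, calls through `perm_en` dispatch identically to `PermEn` with no runtime overhead.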

Whooooooooooops, sorry about that! Someone was pretty stupid; check out how I defined the “convenience function” permentropy:

function permentropy(x, m = 3; τ = 1, base = Base.MathConstants.e)
    # Bug: τ and m are hardcoded below, so the arguments (and `base`) are ignored!
    Entropies.genentropy(x, SymbolicPermutation(; τ = 1, m = 3))
end


Fixed now, thanks! Thankfully this has no impact on performance (assuming we measure with order 3), so the Entropies.jl version is still a lot faster. We had an original version based on combinatorics that was really slow; then @kahaaga re-wrote the algorithm with a different approach based on symbolizing. I was kinda hoping you guys would have something even faster so we could steal it from you :stuck_out_tongue: But now please do feel free to steal our implementation, as the performance difference will only become bigger with higher m and larger time series, especially due to the difference in allocations.
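For completeness, the fix presumably amounts to forwarding the arguments instead of hardcoding them, along these lines (a sketch against the Entropies.jl API used in the snippet above; verify the exact `genentropy` keyword signature against its docs):

```julia
using Entropies   # assumed available, as in the buggy snippet above

function permentropy(x, m = 3; τ = 1, base = Base.MathConstants.e)
    # Forward the user's m and τ to the estimator, and the log base to
    # genentropy, instead of hardcoding τ = 1, m = 3.
    return Entropies.genentropy(x, SymbolicPermutation(; τ, m); base)
end
```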

What is a permutation entropy of order 1? At least according to the original definition of Bandt and Pompe, there is no such thing as permutation entropy of order 1.

We also have amplitude-aware and weighted versions of the entropy, but not the rest; good to know!

I am being annoyingly persistent, but I want to challenge that. Who does this help? A python user that already uses the Python version doesn’t really have a reason to use the Julia version, and so doesn’t really benefit from using the same names. Similarly a Julia user would want to use the Julia version, and once again has no benefit from having the same names.

Absolutely! I hope you don’t take my remarks the wrong way, I’m playing devil’s advocate here to (hopefully) help you make it even better.