I created Memoization.jl to address some of the limitations of Memoize.jl. It's API-compatible with Memoize.jl (just @memoize your function definitions):
julia> using Memoization
julia> @memoize f(x) = (println("Computed $x"); x)
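Continuing the session above, calling the function twice shows the effect: the body runs only on the first call with a given argument, and subsequent calls return the cached value.

```julia
julia> f(2)
Computed 2
2

julia> f(2)
2
```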
As of this writing, Memoization.jl has the following advantages:
It allows more function definition forms, including with keyword arguments and type parameters.
It does not issue any warnings, and allows you to memoize different methods of the same function across different modules.
Closures work, meaning different instances of closures with different closed-over variables can be separately and simultaneously memoized.
You can empty memoization caches.
Lookup is a bit faster.
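To illustrate the closure point above, here is a sketch of the kind of usage this enables (names are illustrative; the separate-cache-per-closure behavior is as described in this announcement):

```julia
using Memoization

# Keyword arguments and type parameters are allowed in memoized definitions:
@memoize g(x::T; offset=0) where {T<:Number} = (println("Computed"); x + offset)

# Each closure returned by make_adder closes over a different `a`,
# and each instance gets its own cache:
make_adder(a) = @memoize add(x) = (println("Computed $a + $x"); a + x)

add2 = make_adder(2)
add3 = make_adder(3)
add2(1)  # computes 2 + 1
add2(1)  # cached
add3(1)  # computes 3 + 1 using a separate cache
```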
Feel free to take it for a spin and let me know if you run into any issues!
eh… so you guys can’t agree on how to combine the two into one package?
No, I just made no attempt to combine the packages or submit a PR to that one, since it's a 100% rewrite.
I think that remarks like this are unwarranted. It is perfectly fine to develop a new package for something that already exists, especially if it is a complete rewrite.
The package ecosystem benefited a lot from people exploring new ways of doing something. Eventually parallel approaches may get merged, or develop a common API, but it is also OK to just have slightly parallel or overlapping functionality.
Marious311, kindly inform your audience on your package's information page whether Memoization.jl is thread-safe for use with multi-threading in Julia 1.3.
Good question. The answer is that it is not threadsafe with either the default IdDict or with Dict; I can update the docs.
If we had a ThreadSafeDict data type (to my knowledge this doesn't currently exist in Julia, although ThreadSafeDataStructures.jl was on the right path), then @memoize ThreadSafeDict f(x) = ... should already be threadsafe for top-level functions with no changes to this package (and it could be made threadsafe for closures too with some minor tweaks).
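A minimal sketch of what such a hypothetical ThreadSafeDict could look like, assuming a plain Dict guarded by a ReentrantLock, with just the methods a memoization cache needs (get!, empty!, and enough of the AbstractDict interface to print):

```julia
# Hypothetical ThreadSafeDict sketch: a Dict behind a lock.
# Not an existing package; illustrative only.
struct ThreadSafeDict{K,V} <: AbstractDict{K,V}
    lock::ReentrantLock
    dict::Dict{K,V}
    ThreadSafeDict{K,V}() where {K,V} = new(ReentrantLock(), Dict{K,V}())
end
ThreadSafeDict() = ThreadSafeDict{Any,Any}()

# get! is the operation a memoization cache relies on: look up `key`,
# computing and storing `default()` if it is missing, all under the lock.
function Base.get!(default::Base.Callable, d::ThreadSafeDict, key)
    lock(d.lock) do
        get!(default, d.dict, key)
    end
end

function Base.empty!(d::ThreadSafeDict)
    lock(d.lock) do
        empty!(d.dict)
    end
    return d
end

Base.length(d::ThreadSafeDict) = length(d.dict)
Base.iterate(d::ThreadSafeDict, args...) = iterate(d.dict, args...)
```

Note that holding the lock while `default()` runs serializes cache misses; a finer-grained design could release the lock during the computation at the cost of possibly computing a value twice.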
Have you thought about adding an option to save the content to disk instead to some dictionary?
The annoying warnings with precompilation are gone! Thanks!
It probably wouldn't be crazy hard to build some sort of MmappedDict which could be plugged into this package.
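As a rough illustration of the idea (not the MmappedDict mentioned above, and not part of any package), here is a toy disk-backed dict using the stdlib Serialization module, which naively rewrites the whole file on every insert:

```julia
using Serialization

# Toy disk-backed cache sketch; illustrative only.
struct DiskDict{K,V} <: AbstractDict{K,V}
    path::String
    dict::Dict{K,V}
end

function DiskDict{K,V}(path::String) where {K,V}
    # Reload any previously saved cache from disk.
    dict = isfile(path) ? deserialize(path) : Dict{K,V}()
    DiskDict{K,V}(path, dict)
end

function Base.get!(default::Base.Callable, d::DiskDict, key)
    haskey(d.dict, key) && return d.dict[key]
    val = d.dict[key] = default()
    serialize(d.path, d.dict)  # naive full rewrite; fine for small caches
    return val
end

Base.length(d::DiskDict) = length(d.dict)
Base.iterate(d::DiskDict, args...) = iterate(d.dict, args...)
```

A real implementation would want incremental writes and some care around concurrent access, but this shows how little interface a pluggable cache actually needs.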
Just to say that there are some more memoization packages listed here: [ANN] Caching.jl, yet another memoizer with more features, and Caching.jl mentions disk support.
Yeah, I think that'd be the right approach. It almost already works with the output of e.g. jldopen, which returns a Dict-like thing. I will play with it.
Thanks for pointing that out, I wasn't aware of either Caching.jl or Anamnesis.jl linked there, both of which have some complementary features like disk caching which are really nice. One thing which appears to be unique to this package is that it doesn't change your function's name or replace it with a non-function object, which is what lets this package memoize on a per-method basis.
Great package, thanks for sharing!
Quick update as I just tagged a new version (0.1.6) that addresses one limitation this package had. Previously, if you wanted to e.g. use a ThreadSafeDict as a cache, you had to load that package before Memoization, otherwise you got a world-age error. This is no longer required: you can load packages in any order you like.
Since I wrote this package, Memoize.jl has improved so I think the main remaining differences / advantages of this package are that:
- It can memoize closures and callable objects.
- The memoization cache for a given function is all stored in a single place, and can be emptied with Memoization.empty_cache!(func) regardless of which module the definition and the call are in.
- All the caches for every function/closure/callable are stored in one place, Memoization.caches, and there is also Memoization.empty_all_caches!() to clear everything.
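Putting the cache-clearing API described above together (function names are as given in this thread; the printed behavior follows from memoization being reset):

```julia
using Memoization

@memoize g(x) = (println("Computed $x"); x^2)

g(3); g(3)                       # body runs once; second call is cached
Memoization.empty_cache!(g)      # clear just g's cache
g(3)                             # recomputed
Memoization.empty_all_caches!()  # clear every cache in one go
```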
If none of these are an issue for you, the two packages continue to be basically drop-in replacements for each other. Happy memoizing!