"Homomorphic machine learning" in Haskell; goal to be more generic and faster than Julia

The author explains a bit about why it’s not being developed anymore.

https://news.ycombinator.com/item?id=14409595

" Hi everyone, author of HLearn here :slight_smile:

This is a bit awkward for me as I’ve paused the development of HLearn and emphatically do not recommend anyone use it. The main problem is that Haskell (which I otherwise love) has poor support for numerical computing."

Later, on why the Haskell type system doesn’t do what he wants:

“For example, I want the compiler to automatically rewrite my code to be much more efficient and numerically stable”
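To make that concrete (my own illustration, not the author’s): both functions below compute log(sum(exp xᵢ)), but the naive version overflows for large inputs, and the algebraically equal max-shifted rewrite is the sort of transformation he’d like the compiler to find on its own:

```haskell
-- Two algebraically equal ways to compute log (sum (map exp xs));
-- only the second survives large inputs without overflowing to Infinity.
logSumExpNaive :: [Double] -> Double
logSumExpNaive xs = log (sum (map exp xs))

logSumExpStable :: [Double] -> Double
logSumExpStable xs = m + log (sum [exp (x - m) | x <- xs])
  where m = maximum xs

-- logSumExpNaive  [1000, 1000]  overflows to Infinity
-- logSumExpStable [1000, 1000]  == 1000 + log 2 ≈ 1000.693
```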

and, further on:

“It’s common in machine learning to define a parameter space \Theta that is a subset of Euclidean space with a number of constraints. For a simple example, \Theta could be an ellipse embedded in R^2. In existing Haskell, it is easy to make R^2 correspond to a type, and then do automatic differentiation (i.e. backpropagation) over the space to learn the model. If, however, I want to learn over \Theta instead, then I need to completely rewrite all my code. In my ideal language, it would be easy to define complex types like \Theta that are subtypes of R^2, and have all my existing code automatically work on this constrained parameter space.”
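A minimal sketch of the situation he describes, using the `ad` package’s `Numeric.AD.grad` for the automatic differentiation (the loss function and the particular ellipse are my own illustrative choices, not anything from HLearn):

```haskell
import Numeric.AD (grad)

-- A loss function defined on all of R^2 (hypothetical example).
loss :: Floating a => [a] -> a
loss [x, y] = (x - 3) ^ 2 + (y + 1) ^ 2
loss _      = error "loss expects a point in R^2"

-- Unconstrained learning: one gradient-descent step over R^2 "just works".
stepR2 :: Double -> [Double] -> [Double]
stepR2 eta p = zipWith (\x g -> x - eta * g) p (grad loss p)

-- Constrained to \Theta = {(2 cos t, sin t)}, an ellipse in R^2: the same
-- loss has to be pulled back to the angle t by hand before differentiating,
-- i.e. the existing code does not carry over automatically.
lossOnTheta :: Floating a => [a] -> a
lossOnTheta [t] = loss [2 * cos t, sin t]
lossOnTheta _   = error "lossOnTheta expects a single angle"

stepTheta :: Double -> Double -> Double
stepTheta eta t = t - eta * head (grad lossOnTheta [t])
```

The point being: `stepR2` works for any loss on R^2 for free, but the restriction to \Theta has to be threaded through by hand as a reparameterization rather than being stated once as a subtype.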

Sounds like Julia hits most of these notes, no?
