Introducing NNUE to the Julia community

Background:
Local search has been a backbone of many optimization problems. When it comes to heuristic guidance, however, machine learning is difficult to integrate into the field because of the time it takes to evaluate a neural network, the latency of CPU-GPU transfers, and the batch size required for efficient inference. Contrast this with local search algorithms, which typically evaluate simple heuristics on the CPU with a batch size of one.
NNUE comes to the rescue:
Due to the niche nature of the chess/shogi/etc. programming community, advancements in the field have remained relatively obscure. However, I would like to bring one advancement of interest to light: the NNUE.
The core principle of NNUE is a sparse first layer combined with incremental updates: when a local move changes only a few input features, the first-layer activations are patched rather than recomputed from scratch. This makes it potentially suitable for local search optimization and similar settings, and could bring machine learning to algorithms previously untouched by it.
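To make the idea concrete, here is a minimal sketch of the incremental first-layer update in plain Julia. All names here (`Accumulator`, `add_feature!`, `remove_feature!`, `full_forward`) are illustrative, not an existing API; a real NNUE also uses quantized integer weights and a clipped activation, which I omit.

```julia
# Illustrative sketch of an NNUE-style incrementally updated first layer.
# The input is a sparse binary feature vector; only the active features matter.

struct Accumulator
    acc::Vector{Float32}   # running pre-activations of the first layer
end

# Full recomputation: O(hidden_size × num_active_features).
# W is hidden_size × num_features, b is the bias vector.
full_forward(W, b, active) = b .+ sum(W[:, f] for f in active)

# Incremental patches: O(hidden_size) per toggled feature.
add_feature!(a::Accumulator, W, f) = (a.acc .+= @view W[:, f]; a)
remove_feature!(a::Accumulator, W, f) = (a.acc .-= @view W[:, f]; a)

# A local-search "move" that swaps one active feature for another only
# needs one subtraction and one addition of a weight column.
hidden, nfeat = 8, 32
W = randn(Float32, hidden, nfeat)
b = zeros(Float32, hidden)
active = [3, 17, 25]
a = Accumulator(full_forward(W, b, active))
remove_feature!(a, W, 17)   # feature 17 switched off by the move
add_feature!(a, W, 9)       # feature 9 switched on by the move
@assert a.acc ≈ full_forward(W, b, [3, 25, 9])
```

The point of the sketch is the cost asymmetry: the patched accumulator matches the full forward pass, but each move touches only two weight columns instead of the whole layer.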

Julia support for it:
Julia has demonstrated its capability for building small, fast CPU neural networks with the SimpleChains package. However, direct support for NNUE has yet to be implemented.

Unfortunately, I do not have a use case for it myself. A chess engine in Julia, for example, is unlikely to win a chess engine competition, and I do not have sufficient background in other fields that could benefit from it. However, if you do, you may want to give it a try.

PS: do I have the expertise to suggest such a thing?:
I am not better than you in your field of expertise; the only thing I bring is awareness that this technique exists. I am not in a position to implement it, since I only know that it might work in theory. If anyone wants to try, do so with your own assessment.

Are you interested? What do you think?


@fredrikekre

NNUE stands for Efficiently Updatable Neural Network (the acronym is reversed due to a Japanese wordplay). It is a neural network and therefore a machine learning technique, so I'm moving this topic back to the machine learning category.