Random math function

Is there a Pkg that allows generating random mathematical functions from operators and constants or Symbols?

I don’t know of such a package (in Julia), but it shouldn’t be terribly hard to make. Can I ask what it’s for? I ask because I recall something similar. While it seems useless to just generate a random function, generating one with some goal in mind seems, and is, worthwhile.

I’ve personally been on the lookout for the best (single) activation function for neural networks, and looking up the work on randomly generated ones, I found some new papers.

ReLU isn’t always best, and it also depends on the dataset (and, I see, on the initialization method too). Usually ReLU or ReLU-related functions aren’t even among the top 3:

Neural Architecture Search (NAS) algorithms aim to take the human out of the loop by automatically finding a good set of hyper-parameters for the problem at hand. These algorithms have mostly focused on hyper-parameters such as the architectural configurations of the hidden layers and the connectivity of the hidden neurons, but there has been relatively little work on automating the search for completely new activation functions, which are one of the most crucial hyperparameters to choose. […]
The work in the literature has mostly focused on designing new activation functions by hand, or choosing from a set of predefined functions while this work presents an evolutionary algorithm to automate the search for completely new activation functions. We compare these new evolved activation functions to other existing and commonly used activation functions.
[…]
Our experiments consist of 3 classification problems on image-based datasets, 4 classification problems on non-image-based datasets, and 3 regression problems

I thought sin(x) and cos(x) had been ruled out as good activation functions (for “periodic” functions, or modified versions of them, I recall sin²(x) + x being used instead and working better, to avoid the problems of a plain sine). Depending on the dataset, good functions can range from complex ones like that (see also Fig. 5(b), the plot of the best 3 activation functions found on the Magic Telescope dataset) down to simple ones like f(x) = x - abs(x) [which is ReLU rotated].
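
For concreteness, those two simple candidates look like this in Julia (the names are my own, just for illustration):

# The two activation functions mentioned above (names are mine):
snake_like(x) = x + sin(x)^2     # sin²(x) + x, avoids the problems of a plain sine
rotated_relu(x) = x - abs(x)     # 0 for x ≥ 0, 2x for x < 0 (ReLU rotated)

snake_like(1.0), rotated_relu(1.0), rotated_relu(-1.0)
# (1.708..., 0.0, -2.0)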

I WAS actually very surprised to find this now (claimed):

[The training time is 2-3 times slower than with ReLU (but inference matters more), and I’m not sure this is best-in-class, since it only tries to approximate a few common functions with 5 parameters.]

Universal activation function for machine learning
https://www.nature.com/articles/s41598-021-96723-8.pdf

Conclusion and future work
The UAF was developed as a generic activation function that can approximate many others such as the identity, ReLU, LeakyReLU, sigmoid, tanh, softplus, and Gaussian as well as to evolve to a unique shape. This versatility allows the UAF to achieve near optimal performance in classification, quantification, and reinforcement learning. As demonstrated, incorporating the UAF in a neural network leads to best or close-to-best performance, without the need to try many different activation functions in the design.

The choice of activation functions can have a significant effect on the performance of a neural network. Although the researchers have been developing novel activation functions, Rectified Linear Unit (ReLU) remains the most common one in practice. This paper shows that evolutionary algorithms can discover new activation functions for side-channel analysis (SCA) that outperform ReLU. Using Genetic Programming (GP), candidate activation functions are defined and explored (neuroevolution). As far as we know, this is the first attempt to develop custom activation functions for SCA.
[…]
Modern digital systems are commonly equipped with cryptographic primitives, acting as the foundation of security, trust, and privacy protocols. While such primitives are proven to be mathematically secure, poor implementation choices can make them vulnerable to attackers. Such vulnerabilities are commonly known as leakage [MOP06]. Side-channel leakage exploits various sources of information leakage in the device, where some common examples of leakage are timing [Koc96], power [KJJ99b], and electromagnetic (EM) emanation [QS01], and the attacker is a passive one. The researchers proposed several side-channel analysis (SCA) approaches to exploit those leakages in the last few decades.


Depending on your specific needs, this may be helpful: [ANN] SymbolicRegression.jl - distributed symbolic regression


Can you show me an example?

A mathematical expression is a tree. I suppose I have to generate a random tree with a given number of nodes and then fill it with variables, constants, and operators.
Do you know of a package to generate random trees in Julia?
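
For what it’s worth, here is a minimal hand-rolled sketch of the idea I have in mind (the operator lists and the plain-Expr representation are just placeholders, not from any package):

# Recursively build a random expression with a budget of operator nodes,
# filling the leaves with variables or random constants (illustrative only).
const BINARY_OPS = [:+, :-, :*, :/]
const UNARY_OPS = [:sin, :cos, :exp]

function random_expr(num_ops, num_vars)
    if num_ops == 0
        # Leaf: either a variable symbol or a random constant
        return rand(Bool) ? Symbol("x", rand(1:num_vars)) : round(randn(), digits=3)
    elseif rand(Bool)
        # Unary operator node
        return Expr(:call, rand(UNARY_OPS), random_expr(num_ops - 1, num_vars))
    else
        # Binary operator node: split the remaining budget between the children
        left = rand(0:num_ops - 1)
        return Expr(:call, rand(BINARY_OPS),
            random_expr(left, num_vars),
            random_expr(num_ops - 1 - left, num_vars))
    end
end

random_expr(4, 3)   # e.g. :(sin(x2) * (x1 - 0.417))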

In SymbolicRegression.jl, you can do this:

using SymbolicRegression

# Define the available operators:
options = Options(;
    binary_operators=(+, -, *, /, ^),
    unary_operators=(cos, sin, exp),
)

First, let’s construct a tree manually:

x1, x2, x3 = Node("x1"), Node("x2"), Node("x3")
tree = x1 - x2 * 3.2

println(tree)
# (x1 - (x2 * 3.2))
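
If I remember correctly (this may depend on the package version), the unary operators declared in the Options can be composed the same way when building a tree by hand:

# Assuming the operators from `options` are usable on nodes (my understanding):
tree2 = cos(x1 * 3.2) - x2
println(tree2)
# Prints something like: (cos(x1 * 3.2) - x2)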

Now, let’s generate it randomly. Let’s say we want 5 nodes total. Say that we have 3 feature nodes to use (and any number of constant nodes). We also specify the type of constants in a given tree:

num_nodes = 5
num_features = 3
type = Float64
tree = gen_random_tree_fixed_size(num_nodes, options, num_features, type)
println(tree)
# exp(cos(x3) + 0.1950986787131656)

We can also evaluate the tree over an array:

# Evaluate the tree:
X = randn(Float64, 3, 1000);
out, did_complete = eval_tree_array(tree, X, options)
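
As I understand the return values, out holds the evaluated values for each column of X, and did_complete is false if the evaluation hit a numerical problem (e.g. division by zero), so you would typically guard on it:

# Only use the output if evaluation completed without NaN/Inf (my reading of the API):
if did_complete
    println("mean output: ", sum(out) / length(out))
else
    println("evaluation hit a numerical issue")
end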

And do other things like counting nodes, constants, etc.:

count_nodes(tree)
# ^ 5
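
If I recall correctly, there is a count_constants as well (worth double-checking against the docs):

count_constants(tree)
# ^ 1 for the randomly generated example above (the single constant 0.195...)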