Thank you very much for the Linux/Unix pipe syntax, which is a very helpful foundation and a natural fit for distributed processing. I hope, though, that we can elevate the language syntax to a slightly higher abstraction layer using mathematical notation, while keeping the automatic-vectorization gears hidden/encapsulated to achieve this >> Automatic vectorization - Wikipedia
So I would also like to request syntax support for Function composition (computer science), described here >>
" The ability to easily compose functions encourages factoring (breaking apart) functions for maintainability and code reuse. More generally, big systems might be built by composing whole programs ."
More generally, being able to recompose/resequence function calls via recomposable functional programming is desirable. For example, I am presently rewriting a lot of my mostly procedural Julia .jl script code into functions and function calls, so that when f(g(h(x))) turns out to be less effective for machine-learning cross validation etc. than h(g(f(x))), reordering the operations is very easy (a small sketch of this is shown below).
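For illustration, here is a minimal, hypothetical sketch (f, g, h are stand-in stages, not from any package) of how Julia's built-in `∘` operator lets a pipeline be resequenced by editing a single composition expression:

```julia
# Three stand-in processing stages (hypothetical examples).
f(x) = x .+ 1          # e.g. a feature shift
g(x) = 2 .* x          # e.g. a rescaling
h(x) = x .^ 2          # e.g. a nonlinear transform

# The pipeline is just a composed function, so reordering the
# stages only means reordering the names in the composition.
pipeline_a = f ∘ g ∘ h     # f(g(h(x)))
pipeline_b = h ∘ g ∘ f     # h(g(f(x)))

x = [1.0, 2.0, 3.0]
@show pipeline_a(x)        # [3.0, 9.0, 19.0]
@show pipeline_b(x)        # [16.0, 36.0, 64.0]
```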
Generally I see computer-language syntax and mathematical notation as critical for expressivity – a major benefit of writing Julia code, and actually at the heart of this thread's subject, namely >> “Custom XGBoost Loss function w/ Zygote. Julia Computing blog post” << because cross validation of Zygote loss functions will involve notation in terms of the derivatives f'(x) and g'(x) and the chain rule for composed functions: f(g(x))' = f'(g(x)) * g'(x)
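As a quick sanity check (a sketch with made-up example functions, not code from the blog post), Zygote applies exactly this chain rule when differentiating a composition:

```julia
using Zygote

# Hypothetical example functions; any differentiable pair works.
f(x) = sin(x)
g(x) = x^2

composed = f ∘ g                       # f(g(x)) = sin(x^2)

x = 1.3
lhs = Zygote.gradient(composed, x)[1]  # (f ∘ g)'(x) via AD
rhs = cos(g(x)) * 2x                   # f'(g(x)) * g'(x) by hand

@assert isapprox(lhs, rhs)             # the chain rule holds
```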
A quick example:
Present Pipe syntax:
#Get models.
sk= AutoMLPipeline.SKLearners.learner_dict |> keys |> collect;
Proposed function composition / functional programming pseudo-code syntax:
#Get models.
sk= collect(keys(AutoMLPipeline.SKLearners.learner_dict))
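Note that the nested-call form above is already valid Julia, and an equivalent point-free form using the composition operator works today as well (assuming AutoMLPipeline is loaded and learner_dict is as in the snippet above):

```julia
using AutoMLPipeline

# Nested-call form (plain function application).
sk = collect(keys(AutoMLPipeline.SKLearners.learner_dict))

# Equivalent point-free form using Julia's composition operator.
sk = (collect ∘ keys)(AutoMLPipeline.SKLearners.learner_dict)
```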
BTW, it may be that the functional-programming pseudo-code syntax above already (almost?) works, but even so I believe it's still important to get in the habit of writing function-composition example code that supports the calculus derivative notation, e.g.
f(g(x))' = f'(g(x)) * g'(x)
so that we can most easily write the XGBoost custom loss functions here the way other boosting methods do, where they generalize by allowing optimization of an arbitrary differentiable loss function.
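A hedged sketch of that direction (assumptions: a stand-in squared-error loss, Zygote for both derivative orders, and the fact that XGBoost-style custom objectives return per-sample gradient and hessian vectors; the exact way to register the objective with XGBoost.jl is left to its documentation, and nested Zygote differentiation can be fragile, so a hand-written hessian is a safe fallback):

```julia
using Zygote

# Hypothetical custom loss on a single (prediction, label) pair;
# swap in whatever differentiable loss you are cross-validating.
custom_loss(pred, label) = (pred - label)^2

# First derivative w.r.t. the prediction, via Zygote.
grad1(pred, label) = Zygote.gradient(p -> custom_loss(p, label), pred)[1]

# Second derivative, by differentiating the gradient again.
grad2(pred, label) = Zygote.gradient(p -> grad1(p, label), pred)[1]

# Boosting libraries such as XGBoost expect per-sample gradient and
# hessian vectors from a custom objective; how to pass this function
# to XGBoost.jl is left to that package's documentation.
function custom_objective(preds::Vector{Float64}, labels::Vector{Float64})
    grad = grad1.(preds, labels)
    hess = grad2.(preds, labels)
    return grad, hess
end
```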
HTH
Ps> I believe we'll have to get the math notation **as compact and expressive as possible** to hide/encapsulate complexity, because next up is "How to define a Fitness Function in a Genetic Algorithm?"
One mathematical description here >>
and here >>
How to define a Fitness Function in a Genetic Algorithm? | by Vijini Mallawaarachchi | Towards Data Science