How to properly define a function with changeable parameters within a struct?

Is there a way to define a function within a struct such that the function depends on some of the other fields in the struct? For example, I currently have

mutable struct Net
    a::Float64
    b::Float64
    f::Function
end

function Net()
    a = rand()
    b = rand()
    f(x, y) = a * x + b * y
    Net(a, b, f)
end

Then if I create an instance of Net with u = Net(), I can call the internal function as u.f(1.0, 2.0). With this approach, if I update one of the fields, like u.a = 0, the function u.f still uses the old parameter value for a. Is there a way to do this so that updating a or b would also update f?
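For reference, a minimal demonstration of the behaviour described (the closure f captures the local a and b from the constructor, not the struct fields, so mutating the fields has no effect on it):

u = Net()
u.f(1.0, 2.0)   # uses the a and b captured when Net() ran

u.a = 0.0
u.f(1.0, 2.0)   # same result as before: f still closes over the old local a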

Maybe the functor approach would be useful here

mutable struct Net
    a::Float64
    b::Float64
end

(n::Net)(x, y) = n.a * x + n.b * y

u = Net(rand(), rand())
u(1.0, 2.0)
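Since the call operator reads the fields each time it is invoked, updating a field is reflected on the next call:

u.a = 0.0
u(1.0, 2.0)   # now computes 0.0 * 1.0 + u.b * 2.0, using the updated field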

This sounds like something you would do in an OO language, but not in Julia. There shouldn't be an "internal" function that "belongs" to a certain struct.

I personally think it's an anti-pattern to have a function as one of the fields of a struct while some "self-modification" happens at the same time.
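A minimal sketch of the pattern that comment points at (the name Params is just illustrative): keep only data in the struct and write ordinary functions that take it as an argument, so they always read the current field values.

mutable struct Params
    a::Float64
    b::Float64
end

# an ordinary function over the struct; it reads the fields at call time
f(p::Params, x, y) = p.a * x + p.b * y

p = Params(rand(), rand())
f(p, 1.0, 2.0)
p.a = 0.0
f(p, 1.0, 2.0)   # reflects the update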


I think if you look at Flux.jl, a model is like your Net:

model = Chain(Dense(5, 5), softmax)

The model has parameters, params(model), but when updating them you don't call model.train!(x, y); instead, the pattern is:
Flux.train!(loss, params(model), data, opt)
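A rough sketch of that pattern, using Flux's older implicit-parameters API (exact names and signatures can differ between Flux versions, and the toy data here is purely illustrative):

using Flux

model = Chain(Dense(5, 5), softmax)

x = rand(Float32, 5)            # toy input
y = Float32[0, 0, 1, 0, 0]      # toy one-hot target
data = [(x, y)]

loss(x, y) = Flux.Losses.crossentropy(model(x), y)
opt = Descent(0.1)

# the parameters live in `model`; train! mutates them through params(model)
Flux.train!(loss, Flux.params(model), data, opt)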

Funny enough, that is precisely what I am doing. I actually use the functor approach to define a neural network. I am looking at physics-informed PDEs, and it is convenient to have derivatives of the NN w.r.t. input arguments explicitly defined as functions. I was just getting tired of writing them out, so I was hoping to make a NeuralNet struct that would hold everything for me.


Thank you, this definitely works, but I have used this approach to define the feed-forward pass of my neural net, and I am not sure I can use it to define the NN derivatives as well.
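One possible way around that, sketched with the toy functor Net from above (the wrapper names Dx and Dy are just illustrative, not from any package): put each derivative in its own small callable struct that holds the network, so it also reads the current fields at call time.

struct Dx{N}
    net::N
end
struct Dy{N}
    net::N
end

# for the linear toy Net, the partial derivatives are just the coefficients
(d::Dx)(x, y) = d.net.a
(d::Dy)(x, y) = d.net.b

u = Net(rand(), rand())
du_dx = Dx(u)
du_dx(1.0, 2.0)   # == u.a, and stays in sync if u.a is later mutated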