An elegant way to construct a recursive objective?

I need to fit a model to a vector f of length n, where f[i] is recursively calculated from f[i-1]. There is another vector d of length n, related to f in the following way:
f[i] = f0 + d[i]
d[1] = d0 + k * log(1 + d0 / k)
d[i] = d[i-1] + k * log(1 + d[i-1] / k)
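For concreteness, here is a plain-Julia sketch of that recursion, assuming f0, d0, k, and the length n are already known (in the actual fit they are the unknowns):

# Purely illustrative: computes f from given f0, d0, k and length n.
function simulate_f(f0, d0, k, n)
    d = zeros(n)
    d[1] = d0 + k * log(1 + d0 / k)
    for i in 2:n
        d[i] = d[i-1] + k * log(1 + d[i-1] / k)
    end
    return f0 .+ d   # f[i] = f0 + d[i]
end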
Here is the JuMP code I have so far for this model fitting.
jmp_model = Model(; kwargs_Model...)
@variable(jmp_model, f0)
@variable(jmp_model, d0 >= 0)
@variable(jmp_model, k)
Is there an elegant way to construct the objective for this type of model? Thanks!

Hello.
You could define constraints to set the relationships. For example

m = Model()

@variable(m, f0)
@variable(m, d0 >= 0)
@variable(m, k)

@constraint(m, frel[i in I], f[i] == f0 + d[i])
@constraint(m, dinit, d[1] == d0 + k*log(1 + d0 / k))
@constraint(m, drel[i in 2:I[end]], d[i] == d[i-1] + k * log(1 + d[i-1] / k))

However, the standard way to fit a model is to minimize, for example, the sum of squared differences between your predictions and your observations. So f should also be a variable, and you should add an objective such as

@objective(m, Min, sum( (f[i] - fobs[i])^2 for i in I))
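For the snippet above to run, fobs, the index set I, and the variables f and d also have to be defined (before the constraints that use them). One possible sketch, with placeholder data, is:

fobs = rand(10)        # placeholder observations; use your own data here
I = 1:length(fobs)

@variable(m, f[I])
@variable(m, d[I])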

I hope it helps. Regards

Thank you very much for the help!

You said “f should also be a variable”. I guess d should also be a variable? And should I define them like this?
@variable(m, f[I])
@variable(m, d[I])
where I is a vector containing all the indices for fobs.

Thanks again!

In addition, should I use @objective or @NLobjective? I’m not sure, because k * log(1 + d[i-1] / k) is nonlinear, although the sum of squares itself is only quadratic. Thanks!

Please don’t worry about my follow-up questions any more.

I tried this:
@constraint(m, dinit, d[1] == d0 + k * log(1 + d0 / k))
@constraint(m, drel[i in 2:I[end]], d[i] == d[i-1] + k * log(1 + d[i-1] / k))
@objective(...)
and got an error asking me to use @NLconstraint and @NLobjective.

I changed to this:
@NLconstraint(m, dinit, d[1] == d0 + k * log(1 + d0 / k))
@NLconstraint(m, drel[i in 2:I[end]], d[i] == d[i-1] + k * log(1 + d[i-1] / k))
@NLobjective(...)
and it returned successfully.

In addition, I used:
@variable(m, f[I])
@variable(m, d[I])
where I is a vector containing all the indices for fobs.

And I think I got the model fitting to work. Thanks again!
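For reference, a minimal end-to-end sketch along these lines might look as follows. Ipopt is just one nonlinear solver that can handle this, the data is a placeholder, and the small lower bound on k is my own assumption to keep log(1 + d / k) well defined:

using JuMP, Ipopt

fobs = rand(10)                     # placeholder observations; use your own data
I = 1:length(fobs)

m = Model(Ipopt.Optimizer)
@variable(m, f0)
@variable(m, d0 >= 0)
@variable(m, k >= 1e-4)             # assumption: keep k strictly positive
@variable(m, f[I])
@variable(m, d[I])

@constraint(m, frel[i in I], f[i] == f0 + d[i])
@NLconstraint(m, dinit, d[1] == d0 + k * log(1 + d0 / k))
@NLconstraint(m, drel[i in 2:length(fobs)], d[i] == d[i-1] + k * log(1 + d[i-1] / k))
@NLobjective(m, Min, sum((f[i] - fobs[i])^2 for i in I))

optimize!(m)
value(f0), value(d0), value(k)      # fitted parameters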

I’m glad it worked. Personally, I would prefer that @NLconstraint and @NLobjective didn’t exist. It can be confusing, and no other algebraic modeling language requires you to be explicit about this. I imagine there is a good reason why both are required.

Regards!