I’m trying to build a GARCH(1, 1) model, which requires constructing the variance recursively:

\sigma_t^2 = \alpha + \beta \epsilon_{t-1}^2 + \gamma \sigma_{t-1}^2

Since the variances are unobserved (only the epsilons are), I have to pull \sigma_1^2 out of thin air to initialize the recursion. Thus, I can’t do:

```
@NLexpression(
    model,
    variances[t = 2:T], α + β * series[t - 1]^2 + γ * variances[t - 1]
)
```

…because the variable `variances` doesn’t exist here. Doing:

```
@NLexpressions(
    model,
    begin
        variances[1], initial_variance
        variances[t = 2:T], α + β * series[t - 1]^2 + γ * variances[t - 1]
    end
)
```

…doesn’t work because I can’t redefine `variances` in the second line. The error message suggests using the anonymous construction syntax, but I specifically *need* to reference `variances` by name.

This doesn’t work either, because `variances` is undefined on the RHS:

```
@NLexpression(
    model,
    variances[t = 1:T],
    α + β * series[t - 1]^2 + γ * (t - 1 < 1 ? initial_variance : variances[t - 1])
)
```
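For concreteness, here is the recursion I’m after as plain Julia, with no JuMP involved (the parameter values below are made-up placeholders, not estimates):

```julia
# Numeric version of the σ_t² recursion:
#   σ_t² = α + β ε_{t-1}² + γ σ_{t-1}²,  σ₁² given.
function garch_variances(series, α, β, γ, initial_variance)
    variances = similar(series, Float64)
    variances[1] = initial_variance  # σ₁² pulled out of thin air
    for t in 2:length(series)
        variances[t] = α + β * series[t - 1]^2 + γ * variances[t - 1]
    end
    return variances
end

garch_variances([0.1, -0.2, 0.3], 0.01, 0.1, 0.8, 0.02)
# → [0.02, 0.027, 0.0356]
```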

Currently I’m creating a lot of expressions in a loop:

```
variances = Vector{JuMP.NonlinearExpression}(undef, T)
for t ∈ eachindex(series)
    variances[t] = if t ≤ 1
        @NLexpression(model, initial_variance)
    else
        @NLexpression(model, α + β * series[t - 1]^2 + γ * variances[t - 1])
    end
end
```

`variances` is used only once, to construct the log-likelihood, but inspecting `model.nlp_data.nlexpr` reveals that the intermediate expressions I created in the loop are all stored in the model (and there are thousands of them), even though I don’t need them anymore.

Is it possible to create expressions *without* storing them in the model? Are there better ways of computing recursive expressions like the one for \sigma_t^2?
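One alternative I’ve considered (plain algebra, nothing JuMP-specific): the recursion unrolls into a closed form, \sigma_t^2 = \gamma^{t-1} \sigma_1^2 + \sum_{i=1}^{t-1} \gamma^{i-1} (\alpha + \beta \epsilon_{t-i}^2), which could be written as a single expression per t without referencing earlier variances. A quick numeric sanity check against the recursive definition, again with placeholder values:

```julia
# Unrolled form: σ_t² = γ^(t-1) σ₁² + Σ_{i=1}^{t-1} γ^(i-1) (α + β ε_{t-i}²).
function unrolled_variance(series, α, β, γ, initial_variance, t)
    return γ^(t - 1) * initial_variance +
           sum(γ^(i - 1) * (α + β * series[t - i]^2) for i in 1:(t - 1); init = 0.0)
end

series = [0.1, -0.2, 0.3, -0.1]
α, β, γ, init_var = 0.01, 0.1, 0.8, 0.02

# Recursive reference values for comparison.
ref = [init_var]
for t in 2:length(series)
    push!(ref, α + β * series[t - 1]^2 + γ * ref[t - 1])
end

all(t -> isapprox(unrolled_variance(series, α, β, γ, init_var, t), ref[t]),
    eachindex(series))
# → true
```

Each unrolled expression has O(t) terms, though, so summed over t this is O(T²) work in total, and I’m not sure it would actually be cheaper than the loop above.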