I’ve tried to implement a simple linear regression with a Normal observation model, effectively a combination of two examples: Linear Regression and Univariate Normal Mixture.

When I define both RVs for the location and scale of the observed variable, I get a `RuleMethodError` (probably related to the `initmessages` I pass to the `inference()` function; see the full example at the bottom).

However, the rule seems to exist in `rules.jl` on GitHub, but with a `q_tau` keyword rather than the `m_tau` that my call site looks for (unclear why).

```
RuleMethodError: no method matching rule for the given arguments
Possible fix, define:
@rule NormalMeanPrecision(:μ, Marginalisation) (q_out::PointMass, m_τ::GammaShapeRate, ) = begin
return ...
end
Stacktrace:
[1] rule(fform::Type, on::Type, vconstraint::Marginalisation, mnames::Type, messages::Tuple{Message{GammaShapeRate{Float64}}}, qnames::Type, marginals::Tuple{Marginal{PointMass{Float64}}}, meta::Nothing, __node::FactorNode{Type{NormalMeanPrecision}, Tuple{NodeInterface, NodeInterface, NodeInterface}, Tuple{Tuple{Int64}, Tuple{Int64, Int64}}, ReactiveMP.FactorNodeLocalMarginals{Tuple{ReactiveMP.FactorNodeLocalMarginal, ReactiveMP.FactorNodeLocalMarginal}}, Nothing, ReactiveMP.FactorNodePipeline{DefaultFunctionalDependencies, EmptyPipelineStage}})
@ ReactiveMP ~/.julia/packages/ReactiveMP/yX58s/src/rule.jl:744
[2] materialize!(mapping::ReactiveMP.MessageMapping{NormalMeanPrecision, DataType, Marginalisation, DataType, DataType, Nothing, FactorNode{Type{NormalMeanPrecision}, Tuple{NodeInterface, NodeInterface, NodeInterface}, Tuple{Tuple{Int64}, Tuple{Int64, Int64}}, ReactiveMP.FactorNodeLocalMarginals{Tuple{ReactiveMP.FactorNodeLocalMarginal, ReactiveMP.FactorNodeLocalMarginal}}, Nothing, ReactiveMP.FactorNodePipeline{DefaultFunctionalDependencies, EmptyPipelineStage}}}, dependencies::Tuple{Tuple{Message{GammaShapeRate{Float64}}}, Tuple{Marginal{PointMass{Float64}}}})
@ ReactiveMP ~/.julia/packages/ReactiveMP/yX58s/src/message.jl:243
[3] materialize!(vmessage::VariationalMessage{Tuple{ReactiveMP.MessageObservable{Message}}, Tuple{ProxyObservable{Marginal, Rocket.RecentSubjectInstance{Message{PointMass{Float64}}, Subject{Message{PointMass{Float64}}, AsapScheduler, AsapScheduler}}, Rocket.MapProxy{Message{PointMass{Float64}}, typeof(as_marginal)}}}, ReactiveMP.MessageMapping{NormalMeanPrecision, DataType, Marginalisation, DataType, DataType, Nothing, FactorNode{Type{NormalMeanPrecision}, Tuple{NodeInterface, NodeInterface, NodeInterface}, Tuple{Tuple{Int64}, Tuple{Int64, Int64}}, ReactiveMP.FactorNodeLocalMarginals{Tuple{ReactiveMP.FactorNodeLocalMarginal, ReactiveMP.FactorNodeLocalMarginal}}, Nothing, ReactiveMP.FactorNodePipeline{DefaultFunctionalDependencies, EmptyPipelineStage}}}})
@ ReactiveMP ~/.julia/packages/ReactiveMP/yX58s/src/message.jl:162
[4] as_message(vmessage::VariationalMessage{Tuple{ReactiveMP.MessageObservable{Message}}, Tuple{ProxyObservable{Marginal, Rocket.RecentSubjectInstance{Message{PointMass{Float64}}, Subject{Message{PointMass{Float64}}, AsapScheduler, AsapScheduler}}, Rocket.MapProxy{Message{PointMass{Float64}}, typeof(as_marginal)}}}, ReactiveMP.MessageMapping{NormalMeanPrecision, DataType, Marginalisation, DataType, DataType, Nothing, FactorNode{Type{NormalMeanPrecision}, Tuple{NodeInterface, NodeInterface, NodeInterface}, Tuple{Tuple{Int64}, Tuple{Int64, Int64}}, ReactiveMP.FactorNodeLocalMarginals{Tuple{ReactiveMP.FactorNodeLocalMarginal, ReactiveMP.FactorNodeLocalMarginal}}, Nothing, ReactiveMP.FactorNodePipeline{DefaultFunctionalDependencies, EmptyPipelineStage}}}})
@ ReactiveMP ~/.julia/packages/ReactiveMP/yX58s/src/message.jl:170
```

I’ve tried to provide an explicit factorization (mean field both as a global model parameter and via a `where` clause), but it didn’t help.
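For context, the two factorization variants I tried looked roughly like this (syntax per the ReactiveMP v2 docs as I understood them, so I may be misusing it; `linreg_mf` is just a renamed copy of the model from the MRE below):

```
using Rocket, ReactiveMP, GraphPPL

# Variant 1: mean-field factorization as a global model option
@model [ default_factorisation = MeanField() ] function linreg_mf(n)
    a ~ NormalMeanVariance(0.0, 10.0)
    b ~ NormalMeanVariance(0.0, 10.0)
    sigma ~ GammaShapeRate(1.0, 1.0)
    time_index = datavar(Float64, n)
    y = datavar(Float64, n)
    for i in 1:n
        # Variant 2: per-node factorization via a `where` clause
        y[i] ~ NormalMeanPrecision(a * time_index[i] + b, sigma) where { q = MeanField() }
    end
    return a, b, time_index, y
end
```

Both variants produced the same `RuleMethodError` as above.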

When I remove the `sigma` and `b` messages from the `initmessages` of the inference function, I get the following error instead:

```
Variables [ a, b, sigma ] have not been updated after a single inference iteration.
Therefore, make sure to initialize all required marginals and messages. See `initmarginals` and `initmessages` keyword arguments for the `inference` function.
Stacktrace:
[1] error(s::String)
@ Base ./error.jl:33
[2] inference(; model::ReactiveMP.ModelGenerator{linreg, Tuple{Int64}, Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}}, Nothing, Nothing, Nothing}, data::NamedTuple{(:y, :time_index), Tuple{Vector{Float64}, Vector{Float64}}}, initmarginals::Nothing, initmessages::NamedTuple{(:a,), Tuple{NormalMeanVariance{Float64}}}, constraints::Nothing, meta::Nothing, options::NamedTuple{(), Tuple{}}, returnvars::NamedTuple{(:a, :b, :sigma), Tuple{KeepLast, KeepLast, KeepLast}}, iterations::Int64, free_energy::Bool, showprogress::Bool, callbacks::Nothing, warn::Bool)
@ ReactiveMP ~/.julia/packages/ReactiveMP/yX58s/src/inference.jl:361
[3] top-level scope
@ In[226]:1
[4] eval
@ ./boot.jl:373 [inlined]
[5] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
@ Base ./loading.jl:1196
```

I feel like I’m missing something obvious, because I haven’t found anything in the introductory paper or the documentation suggesting this case is not possible.

(I don’t think I need `initmarginals` as well as `initmessages` in this case, but I wasn’t sure what else to try.)

Thank you for your help!

MRE:

```
# generate data
a = 0.2
b = 5
time_index = collect(1.:20.)
noise_scale = 0.5
y = time_index .* a .+ b + noise_scale * randn(size(time_index, 1))

# define model
@model function linreg(n)
    a ~ NormalMeanVariance(0.0, 10.0)
    b ~ NormalMeanVariance(0.0, 10.0)
    sigma ~ GammaShapeRate(1.0, 1.0)

    time_index = datavar(Float64, n)
    y = datavar(Float64, n)

    for i in 1:n
        y[i] ~ NormalMeanPrecision(a * time_index[i] + b, sigma)
    end

    return a, b, time_index, y
end

# run variational inference
results = inference(
    model = Model(linreg, length(time_index)),
    data = (y = y, time_index = time_index),
    initmessages = (
        a = vague(NormalMeanVariance),
        b = vague(NormalMeanVariance),
        sigma = vague(GammaShapeRate)
    ),
    initmarginals = (
        a = vague(NormalMeanVariance),
        b = vague(NormalMeanVariance),
        sigma = vague(GammaShapeRate)
    ),
    returnvars = (a = KeepLast(), b = KeepLast(), sigma = KeepLast()),
    iterations = 20,
);
```

System:

```
> Pkg.status()
Status `~/reactive-mp/Project.toml`
[6e4b80f9] BenchmarkTools v1.3.1
[31c24e10] Distributions v0.25.59
[b3f8163a] GraphPPL v2.0.1
[91a5bcdd] Plots v1.29.0
[a194aa59] ReactiveMP v2.0.3
[df971d30] Rocket v1.3.22
[5a560754] Splines2 v0.2.1
[860ef19b] StableRNGs v1.0.0
[9a3f8284] Random
> versioninfo()
Julia Version 1.7.2
Commit bf53498635 (2022-02-06 15:21 UTC)
Platform Info:
OS: macOS (x86_64-apple-darwin19.5.0)
CPU: Apple M1 Pro
WORD_SIZE: 64
LIBM: libopenlibm
LLVM: libLLVM-12.0.1 (ORCJIT, westmere)
Environment:
JULIA_NUM_THREADS = 4
```