Firstly, I apologise for asking what looks like such a silly question; any help is very welcome. I have a function that uses a double loop to compute the entries of a matrix `L`. I know the matrix's values from a paper I'm reading, but I can't seem to reproduce them. As a first step towards finding the problem, I computed the first entry by hand, then wrote a function to calculate it, and got exactly the value I wanted.
Somehow, in moving to the function that calculates the whole matrix, I've made a mistake, and unfortunately I don't see where. The value I expect is 0.002073, but what I calculate in `find_L` is 0.002512. The relative difference is large and throws off the rest of my code. The function is:
```julia
using Parameters
using QuantEcon
using DelimitedFiles

Model = @with_kw (ef = readdlm("eff.txt"),
                  s_prob = readdlm("prob.txt"),
                  init_dist = readdlm("init_dist.txt"),
                  mc = tauchen(9, 0.96, sqrt(0.045), 0, 2),
                  P = mc.p,
                  states = mc.state_values,
                  work = 40,
                  T = 60)

m = Model();

function find_L(m)
    # Note: `hbar` was dropped from @unpack, since it is not a field of
    # the Model named tuple as pasted above and is unused in the body.
    @unpack P, ef, states, init_dist, work, s_prob = m

    ϕ = zeros(work, length(states))
    ϕ[1, :] = init_dist
    for t in 2:work
        ϕ[t, :] = transpose(P) * ϕ[t-1, :] * s_prob[t]
    end

    L = zeros(work, length(states))
    for t in 1:work
        for i in 1:length(states)
            L[t, i] = exp(states[i]) * ef[t] * ϕ[t, i]
        end
    end
    return L
end
```
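Since `ϕ[1, :] = init_dist`, the `(1, 1)` entry of `L` never touches the recursion: it reduces to `exp(states[1]) * ef[1] * init_dist[1]`. As a quick sanity check, this can be evaluated standalone from the scalar values quoted below (taken from the question, not the data files):

```julia
# Direct check of L[1, 1] = exp(states[1]) * ef[1] * ϕ[1, 1],
# using the scalar values quoted in the question.
state1 = -1.515228816828316   # states[1]
ef1 = 0.599232388             # ef[1]
init_dist1 = 0.0157471        # init_dist[1]

L11 = exp(state1) * ef1 * init_dist1
# ≈ 0.002073673367670175, the hand-computed value
```

So if `find_L(m)[1, 1]` differs from this number, the discrepancy is in the inputs fed to the formula (e.g. how `ef` or `init_dist` are read), not in the ϕ recursion.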
As a minimal working example that calculates the first entry:
```julia
using QuantEcon

mc = tauchen(9, 0.96, sqrt(0.045), 0, 2)
P = mc.p
states = mc.state_values

# Scalar inputs for the first entry
ef = 0.599232388
s_prob = 1
state1 = -1.515228816828316   # states[1]
init_dist = 0.0157471

# Renamed from `first` to avoid shadowing Base.first
function first_entry(ef, s_prob, state, P, init_dist)
    ϕ = init_dist
    return exp(state) * ef * ϕ
end

first_entry(ef, s_prob, state1, P, init_dist)
# 0.002073673367670175
```
which is exactly correct. It would be helpful to make the full-matrix function runnable, but it reads data from .txt files and I wasn't sure how to include those in my question. I should also mention that I know the `ϕ` matrix is correct (and in any case, the first element of `L` doesn't depend on anything computed in the recursion).
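For what it's worth, one way to make the full function runnable for others without sharing the real data is to write synthetic placeholder files of the same shape. The sizes below are assumptions based on `work = 40` and the 9-state Tauchen chain; the values are random, not the real calibration:

```julia
using DelimitedFiles

# Placeholder data files with the shapes find_L expects.
writedlm("eff.txt", rand(40))      # age-efficiency profile ef[t], t = 1:40
writedlm("prob.txt", rand(40))     # survival probabilities s_prob[t]

init = rand(9)                      # distribution over the 9 states
init ./= sum(init)                  # normalise so it sums to 1
writedlm("init_dist.txt", init)
```

With these files in place, `m = Model(); find_L(m)` runs end to end, even though the numbers are meaningless.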