Comparing GMM results

I am working on implementing a linear GMM estimator in Julia. As a first step, I compare two ways of getting parameter estimates: numerically minimizing the loss function

using LinearAlgebra, Optim  # diagm from LinearAlgebra, optimize from Optim

function LossFn(θ, y, X, Z)
    W = diagm(0 => ones(size(Z, 2)))
    g = [(y[i] - X[i, :]'*θ)*Z[i] for i=1:size(Z, 2)]
    loss = g'*W*g
    return loss
end
gmm   = optimize(θ1 -> LossFn(θ1, y, x, z), ones(6))

and using the analytical solution:

inv(X'*Z*W*Z'*X)*X'*Z*W*Z'*y
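
(W is the L-by-L identity here, matching the loss function above.) For reference, a minimal self-contained setup, with simulated data standing in for my actual dataset (the data-generating process and the dimensions N = 500, K = 6, L = 8 are made up for illustration):

using LinearAlgebra

N, K, L = 500, 6, 8
z = randn(N, L)                       # instruments
x = z[:, 1:K] .+ 0.1 .* randn(N, K)   # regressors correlated with the instruments
y = x * ones(K) .+ randn(N)           # true θ is a vector of ones

W = diagm(0 => ones(L))               # identity weighting matrix
θ_analytical = inv(x'*z*W*z'*x) * x'*z*W*z'*y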

I was surprised to find that the results are different.

Different how? Like obviously or slightly different? If they compare as equivalent with isapprox then they are sufficiently the same for this purpose.


Obviously different, making me wonder where my code is wrong lol

I think the error is size(Z, 2).

Yea, is Z just a vector here or is it a matrix? You call size(Z, 2), which indicates it has a second dimension, but you index it with only Z[i], which suggests it does not.
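
For example, linear indexing a matrix picks out a single element in column-major order rather than a row:

Z = [1 2 3; 4 5 6]   # a 2-by-3 matrix
Z[2]                 # 4 -- a scalar, the second element in column-major order
Z[2, :]              # [4, 5, 6] -- the second row as a length-3 vector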


Z is an N-by-L matrix of instruments, in which each row is an observation and each column is an exogenous variable.

using Statistics  # for mean

function LossFn(θ, y, X, Z)
    W = diagm(0 => ones(size(Z, 2)))
    g = [(y[i] - X[i, :]'*θ)*Z[i, :] for i=1:size(Z, 1)]
    gbar = mean(g, dims = 1)[1]
    loss = gbar'*W*gbar
    return loss
end

I think this should get it right. Here g[i] is the value of the moment function for observation i:
g_i = (y_i - x_i'\theta)z_i
and gbar gives sample moment
\bar{g} = \frac{1}{N}\sum_{i=1}^N g_i.
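
As a quick sanity check (reusing the simulated data from my first post), the loss should be near zero at the true θ and clearly larger elsewhere:

LossFn(ones(6), y, x, z)    # ≈ 0 at the true θ, up to sampling noise
LossFn(zeros(6), y, x, z)   # clearly larger away from the truth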

Likely you want .* when constructing g

y is N-by-1 and X is N-by-K. The model is overidentified so we have K<L. Therefore

(y[i] - X[i, :]'*θ)*Z[i, :]

is a scalar multiplying a vector of length L.
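
Scalar-times-vector already works with plain *, for example:

0.5 * [1.0, 2.0]   # [0.5, 1.0] -- same result as 0.5 .* [1.0, 2.0]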

I think this should be right, but the results are still quite different. Don’t know where I made another stupid mistake.

Still haven’t figured out why the previous function was wrong, but found that the following matrix-based function works correctly:

function LossFn(θ, y, X, Z)
    W = diagm(0 => ones(size(Z, 2)))  # L-by-L identity weighting matrix
    g = Z'*(y - X*θ)                  # stacked sample moments, N times \bar{g}
    loss = g'*W*g
    return loss
end
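
With this version the two approaches can be compared directly, as suggested above (a minimal check reusing the simulated data and W from my first post; optimize defaults to Nelder-Mead when called like this):

using Optim

gmm = optimize(θ1 -> LossFn(θ1, y, x, z), ones(6))
θ_numerical  = Optim.minimizer(gmm)
θ_analytical = inv(x'*z*W*z'*x) * x'*z*W*z'*y

isapprox(θ_numerical, θ_analytical; rtol = 1e-3)   # should be true, up to optimizer tolerance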