My code works, but I am frustrated by how slow it is. I am trying to find the “right value” of δ by minimizing the squared distance between observed binary choices and model-predicted choice probabilities. Here is my code for the model-predicted probability:

```julia
using HCubature  # provides hcubature

function choice_prob_given_θ2(δ, θ2, y, c, d, p0)
    # Unpack the fixed second-stage parameters
    βY = θ2[1:5]
    γ0 = θ2[6]
    γ1 = θ2[7:11]
    τ  = θ2[12]
    κ  = θ2[13]
    # Signal and posterior probability given the first shock
    s(ϵ1) = d + 1 / sqrt(τ) * ϵ1
    p(ϵ1) = p0 / (p0 + (1 - p0) * exp((0.5 - s(ϵ1)) * τ))
    # Expected utility given both shocks
    Eu(ϵ1, ϵ2) = δ + y' * βY - (γ0 + y' * γ1) * (1 - c) * p(ϵ1) + ϵ2 - (log(κ) - 1)
    # Choose (b = 1) whenever expected utility is positive
    b(ϵ1, ϵ2) = Eu(ϵ1, ϵ2) > 0
    # Integrate the choice indicator against the bivariate standard normal density
    integrand(ϵ1, ϵ2) = b(ϵ1, ϵ2) * (1 / (2π) * exp(-(ϵ1^2 + ϵ2^2) / 2))
    (expectation, err) = hcubature(ϵ -> integrand(ϵ[1], ϵ[2]), [-3, -3], [3, 3], rtol = 0.01)
    return expectation
end
```
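To be explicit, the integral this computes is the probability mass of the region where expected utility is positive, under independent standard normal shocks truncated to $[-3, 3]^2$:

$$
\Pr(b = 1 \mid \delta, \theta_2) \;\approx\; \int_{-3}^{3}\!\int_{-3}^{3} \mathbf{1}\{Eu(\epsilon_1, \epsilon_2) > 0\}\,\frac{1}{2\pi}\, e^{-(\epsilon_1^2 + \epsilon_2^2)/2}\, d\epsilon_1\, d\epsilon_2
$$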

and here is the sum of squared distances:

```julia
function nls_obj(δ, θ2, y, c, d, p0, B)
    # Predicted choice probability for each observation i
    prob_given_θ2 = [choice_prob_given_θ2(δ, θ2, y[i, :], c[i], d[i], p0[i]) for i = 1:size(y, 1)]
    # Sum of squared deviations from the observed choices B
    return (B - prob_given_θ2)' * (B - prob_given_θ2)
end
```

Each call to `nls_obj` takes a long time, especially once δ gets close to the “right value”. Is there a way to speed it up further? Do you have a suggested way to run the minimization?
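For the minimization itself, since δ is a scalar, I was considering a bracketed 1-D search. Here is a minimal golden-section sketch on a toy quadratic; for the real problem I would pass `δ -> nls_obj(δ, θ2, y, c, d, p0, B)` with a plausible bracket (the bracket and tolerance below are placeholders):

```julia
# Golden-section search: minimize a unimodal f on [a, b].
function golden_section(f, a, b; tol = 1e-8)
    ϕ = (sqrt(5) - 1) / 2            # inverse golden ratio ≈ 0.618
    c = b - ϕ * (b - a)
    d = a + ϕ * (b - a)
    fc, fd = f(c), f(d)
    while b - a > tol
        if fc < fd
            # Minimum lies in [a, d]; reuse c as the new right interior point
            b, d, fd = d, c, fc
            c = b - ϕ * (b - a)
            fc = f(c)
        else
            # Minimum lies in [c, b]; reuse d as the new left interior point
            a, c, fc = c, d, fd
            d = a + ϕ * (b - a)
            fd = f(d)
        end
    end
    return (a + b) / 2
end

# Toy check on a quadratic with minimum at 2.0
xmin = golden_section(x -> (x - 2)^2, 0.0, 5.0)
```

Is a derivative-free 1-D method like this reasonable here, given that the objective is evaluated by noisy adaptive quadrature?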