Julia vs R vs Python: simple MLE problem

Is there an example of someone defining their own number type in R? I don’t know of an example other than Rmpfr, which is quite complicated, and I don’t know of things like bindings for Arb in R, or how such a type could be used with the standard library functions.

I looked at Rmpfr briefly, and didn’t see a way to invert matrices.
They also need to implement their own uniroot, integrate, etc…

I know that last summer a professor at my university was interested in tropical geometry, where + is min and * is +. I don’t know what the applications are; I think the goal was to sample from a space of phylogenetic trees. Getting creative like that would be a lot easier in Julia.
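For instance, here is a rough sketch (purely illustrative, not from the thread) of how little code a min-plus “number” takes in Julia; generic functions such as sum and prod then work on it unchanged:

# A minimal tropical (min-plus) number type, for illustration only.
struct Tropical{T<:Real} <: Number
    val::T
end

Base.:+(a::Tropical, b::Tropical) = Tropical(min(a.val, b.val))   # tropical addition is min
Base.:*(a::Tropical, b::Tropical) = Tropical(a.val + b.val)       # tropical multiplication is ordinary +
Base.zero(::Type{Tropical{T}}) where {T} = Tropical(typemax(T))   # additive identity (+Inf)
Base.one(::Type{Tropical{T}}) where {T}  = Tropical(zero(T))      # multiplicative identity (0)

xs = Tropical.([3.0, 1.0, 2.0])
sum(xs)    # Tropical(1.0): the minimum
prod(xs)   # Tropical(6.0): the ordinary sum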

By the way, did you use (L)BFGS in Python? There seems to be an inverse Hessian approximation in the output as well as a gradient.

I honestly don’t know what that means and/or why it might be important. I just know I can specify bounds :sweat_smile:

Ah, okay. It’s the optimization algorithm.

This is really cool; I’ve been looking for a good way to make an fminunc/con mirror in Julia so this is great to see! I may be in touch with a PR on your GitHub repository at some point in the future.

Thanks, PRs are welcome. Note that the fminunc/con bits are just frontends to Optim and NLOpt, which do the real work. They are intended to ease the transition for students who are used to Matlab.

Sounds good. I’ve been in search of a good interface for Julia to do common econometric MLE problems (stuff that’s not in GLM, e.g. multinomial logit). I’ve tried implementing them in JuMP, but the Hessian computation was overly burdensome in terms of memory usage (see this Discourse thread). So it seems like using Optim instead of JuMP may be an improvement.

This should be pretty easy, in principle. You just need to set the likelihood function following this example:

model = β -> poisson(β, y, x)
mleresults(model, β, "simple ML example");

where the likelihood returns the log likelihood for each observation, in an array. Example likelihood functions are in https://github.com/mcreel/Econometrics.jl/tree/master/src/ML/Likelihoods.
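For concreteness, a per-observation Poisson log likelihood could look roughly like the following (names and signature are illustrative; see the linked repository for the actual definitions):

using SpecialFunctions  # for loggamma

# y: n-vector of counts, x: n-by-k regressor matrix, β: k-vector of parameters
function poisson(β, y, x)
    xb = x * β
    λ  = exp.(xb)                               # conditional mean E[y | x]
    return y .* xb .- λ .- loggamma.(y .+ 1)    # n-vector of log-likelihood contributions
end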

mleresults() actually uses LBFGS from Optim for the maximization. Currently, there’s no overt support for constrained ML, but that would be pretty easy to add.
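Roughly, the maximization step amounts to something like the following (an illustrative sketch, not the actual mleresults source; β0 stands for a hypothetical vector of starting values):

using Optim

negobj(β) = -sum(model(β))                                      # minimize the negative sum of log likelihoods
res = Optim.optimize(negobj, β0, LBFGS(); autodiff = :forward)  # LBFGS with forward-mode AD gradients
β̂ = Optim.minimizer(res)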

1 Like

Using JuMP shouldn’t be conflated with using methods that require second-order derivatives. JuMP is perfectly happy to skip the Hessian computation if the solver doesn’t ask for it. In this case you won’t see the corresponding memory usage. You could use JuMP to compute first-order derivatives for the methods in Optim that need them, but JuMP isn’t really targeting the unconstrained case so this link hasn’t been well developed, and as far as I know the AD solutions in Optim are good enough already.
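To illustrate the point, here is a rough sketch (not from this thread, using a toy objective and JuMP 0.19-era API): if you initialize JuMP’s NLP evaluator with only the gradient feature, no Hessian structures are ever built, and a first-order method in Optim can use the derivatives directly.

using JuMP, Optim
const MOI = JuMP.MOI

m = Model()
@variable(m, θ[1:2])
@NLobjective(m, Min, (θ[1] - 1.0)^2 + (θ[2] - 2.0)^2)   # stand-in for a negative log likelihood

d = NLPEvaluator(m)
MOI.initialize(d, [:Grad])                               # request gradients only; no Hessian

f(x) = MOI.eval_objective(d, x)
g!(G, x) = MOI.eval_objective_gradient(d, G, x)

Optim.optimize(f, g!, zeros(2), LBFGS())                 # first-order method, Hessian never computed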

1 Like

Thanks, Miles! I was unaware that Optim didn’t have an AD option; that’s really good to know and makes me want to go back to using JuMP.

I was under the impression that @NLobjective (which is the workhorse for MLE models) requires the Hessian for optimization. Is that correct?

And, if not, is it possible to compute the Hessian at the optimum post-estimation? I really only need it at the end for doing statistical inference.

This is false, and I’m not sure that’s what Miles is implying either.

1 Like

My post was hastily written, I’ve edited it.

1 Like

An example of computing the Hessian post optimization, using the Calculus.jl package, is here: https://github.com/mcreel/Econometrics.jl/blob/master/src/ML/mle.jl, line 21.
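The same idea in a rough sketch (illustrative names, not the linked code): compute a numerical Hessian of the negative log likelihood at the estimate and invert it to get the usual asymptotic covariance for inference. Here model is assumed to return the per-observation log likelihoods and θhat is the estimated parameter vector.

using Calculus, LinearAlgebra

negloglik(θ) = -sum(model(θ))           # negative sum of per-observation log likelihoods
H = Calculus.hessian(negloglik, θhat)   # numerical Hessian at the optimum
vcv = inv(H)                            # estimated asymptotic covariance matrix
se = sqrt.(diag(vcv))                   # standard errors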

1 Like

Thanks for the clarification, @miles.lubin and @pkofod!

You have Optim loaded already, so you could also create a TwiceDifferentiable object:

td = TwiceDifferentiable(obj, θ; autodiff = :forward)  # obj is the objective, θ the parameter vector; forward-mode AD
value_gradient!(td, θ)                                  # evaluate objective value and gradient at θ
hessian!(td, θ)                                         # evaluate the Hessian at θ
L, dL, d2L = value(td), gradient(td), hessian(td)       # retrieve the cached value, gradient, and Hessian

though whether this is horrible depends on the user, I guess :slight_smile: It’s sort of just based on the way Optim does this stuff internally.

Happy to report that the compilation latency with Optim in Julia 1.1 has gone from 7.5 seconds to under 1 second for me now.

So I’ve updated the post:

https://www.codementor.io/zhuojiadai/julia-vs-r-vs-python-simple-optimization-gnqi4njro

Now Julia is the clear winner for me when doing simple optimisation problems.

11 Likes

Wow! Any idea what reduced the compilation latency of Optim? Some of my own private packages tend to suffer from long compilation times, and I’ve wondered whether there are standard strategies for improving them.

Something’s happened to Optim compile times, right?

A lot of work has gone into improving compilation and startup times: see the issues in the JuliaLang/julia repository on GitHub.

6 Likes