An Idea for Optimization as if Solving ODE

I should say that turning a problem into a differential equation, and then studying the properties of the paths that the equation traces out to discover facts about a mathematical object, is a super effective strategy for sampling from many Bayesian posterior distributions. This is known as Hamiltonian Monte Carlo, and for many problems it’s the best thing going, so you might want to look at that field.
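To make the "follow a differential equation's path" idea concrete, here's a minimal sketch of HMC with a leapfrog integrator. It's not a reference implementation, just a toy: the `log_prob`/`grad_log_prob` interface and all the parameter names are my own choices for illustration.

```python
import numpy as np

def hmc_sample(log_prob, grad_log_prob, x0, n_samples=1000,
               step_size=0.1, n_leapfrog=20, rng=None):
    """Toy HMC: simulate Hamiltonian dynamics with leapfrog steps,
    then accept/reject to correct for discretization error."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)            # fresh momentum each iteration
        x_new, p_new = x.copy(), p.copy()
        # Leapfrog integration of dx/dt = p, dp/dt = grad log pi(x)
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step_size * p_new
            p_new += step_size * grad_log_prob(x_new)
        x_new += step_size * p_new
        p_new += 0.5 * step_size * grad_log_prob(x_new)
        # Metropolis correction using the Hamiltonian (potential + kinetic energy)
        h_old = -log_prob(x) + 0.5 * p @ p
        h_new = -log_prob(x_new) + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h_old - h_new:
            x = x_new
        samples.append(x.copy())
    return np.array(samples)

# Example: draw samples from a standard 2-D Gaussian "posterior"
log_prob = lambda x: -0.5 * x @ x
grad_log_prob = lambda x: -x
draws = hmc_sample(log_prob, grad_log_prob, x0=np.zeros(2))
```

The point for this thread is that every position visited along the simulated trajectory (after the accept/reject step) becomes a sample, so the whole path carries information.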

A key difference there is that you actually do care about the function at every point along the path… as opposed to optimization, where ideally you’d just consult an oracle and it’d tell you the location x at which the minimum occurs.
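For contrast, here's the optimization-as-ODE version of the same idea: integrating the gradient-flow ODE dx/dt = −∇f(x) with forward Euler (which is just gradient descent). Again a hypothetical sketch with made-up names; note that everything along the path is thrown away and only the endpoint matters.

```python
import numpy as np

def gradient_flow_minimize(grad_f, x0, step_size=0.01, n_steps=5000):
    """Integrate dx/dt = -grad f(x) with forward Euler.
    Intermediate points on the path are discarded; only the final
    iterate (the approximate minimizer) is returned."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        x = x - step_size * grad_f(x)
    return x

# Example: minimize f(x) = (x - 3)^2, whose minimum is at x = 3
approx_min = gradient_flow_minimize(lambda x: 2 * (x - 3), x0=np.array([0.0]))
```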
