I am stuck trying to use the Optim package in Julia to minimize a cost function. The cost function is for an L2-regularised logistic regression, and it is constructed as follows:
using Optim
# Logistic (sigmoid) activation, applied element-wise
sigmoid(z) = 1 ./ (1 .+ exp.(-z))

function regularised_cost(X, y, θ, λ)
    m = length(y)
    # Sigmoid predictions for the current weights
    h = sigmoid(X * θ)
    # Left side of the cost function
    positive_class_cost = (-y)' * log.(h)
    # Right side of the cost function
    negative_class_cost = (1 .- y)' * log.(1 .- h)
    # L2 penalty, excluding the intercept term θ[1]
    lambda_regularization = (λ / (2m)) * sum(θ[2:end] .^ 2)
    # Current batch cost
    𝐉 = (1/m) * (positive_class_cost - negative_class_cost) + lambda_regularization
    # Gradients for all the theta members, with regularization
    ∇𝐉 = (1/m) * X' * (h - y) + (λ/m) * θ
    # Overwrite the intercept's gradient so the constant is not regularized
    ∇𝐉[1] = (1/m) * X[:, 1]' * (h - y)
    return (𝐉, ∇𝐉)
end
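For reference, calling it on a tiny hand-made batch behaves as I expect, returning the scalar cost together with the gradient vector (the numbers below are made up purely to illustrate the shapes):

# Hypothetical toy batch, only to check shapes and return values
X_toy = [1.0 0.5; 1.0 -0.3; 1.0 1.2]   # 3 examples, intercept column first
y_toy = [1.0, 0.0, 1.0]
θ_toy = zeros(2)

J, ∇J = regularised_cost(X_toy, y_toy, θ_toy, 0.01)
# J is a scalar (≈ log(2) for all-zero weights); ∇J has the same length as θ_toy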
I would like to use the LBFGS algorithm as the solver to find the weights that minimize this function, based on my training examples and labels, which are defined as:
opt_train = [ones(size(X_train_scaled, 1)) X_train_scaled] # added intercept
initial_theta = zeros(size(opt_train, 2))
Having read the documentation, here is my current attempt, which does not work:
res = optimize(b -> regularised_cost(opt_train, y_train, initial_theta, 0.01),
               method = LBFGS(),
               Optim.Options(show_trace = true, iterations = 1000))
How do I pass my training examples and labels, along with the gradients, so that the solver (LBFGS) can find the best values for θ?
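From my reading of the Optim/NLSolversBase documentation, I suspect the answer involves closing over the data and wrapping a fused objective/gradient function in Optim.only_fg!. Below is a sketch of what I have in mind (make_fg! is just my own helper name), but I am not certain the plumbing is right:

# Sketch based on my reading of the only_fg! docs -- may well be wrong.
# Closes over the data so the objective only takes θ.
function make_fg!(X, y, λ)
    m = length(y)
    function fg!(F, G, θ)
        h = sigmoid(X * θ)
        if G !== nothing
            # Regularised gradient; intercept left unregularised
            G .= (1/m) * X' * (h - y) .+ (λ/m) .* θ
            G[1] = (1/m) * X[:, 1]' * (h - y)
        end
        if F !== nothing
            # Regularised cost, intercept excluded from the penalty
            return (1/m) * ((-y)' * log.(h) - (1 .- y)' * log.(1 .- h)) +
                   (λ/(2m)) * sum(θ[2:end] .^ 2)
        end
    end
    return fg!
end

res = optimize(Optim.only_fg!(make_fg!(opt_train, y_train, 0.01)),
               initial_theta, LBFGS(),
               Optim.Options(show_trace = true, iterations = 1000))

The idea, as I understand it, is that Optim passes θ in and asks for the cost, the gradient, or both, so the training data never has to be an argument of the objective itself. Is this the right approach?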