I have a script that has two functions with the following relationship:
function optimize(w::Array{Float64,2}, b::Float64, X::Array{Float64,2}, Y::Array{Int64,2};
num_iterations::Int64=2000, learning_rate::Float64=0.5, print_cost::Bool=false)…
end
and
# GRADED FUNCTION: model
function model(X_train::Array{Float64,2}, Y_train::Array{Int64,2}, X_test::Array{Float64,2}, Y_test::Array{Int64,2};
num_iterations::Int64 = 2000, learning_rate::Float64 = 0.5, print_cost::Bool = false)
"""
Builds the logistic regression model by calling the function you've implemented previously.

Arguments:
X_train -- training set represented by a numpy array of shape (num_px * num_px * 3, m_train)
Y_train -- training labels represented by a numpy array (vector) of shape (1, m_train)
X_test -- test set represented by a numpy array of shape (num_px * num_px * 3, m_test)
Y_test -- test labels represented by a numpy array (vector) of shape (1, m_test)
num_iterations -- hyperparameter representing the number of iterations to optimize the parameters
learning_rate -- hyperparameter representing the learning rate used in the update rule of optimize()
print_cost -- set to true to print the cost every 100 iterations

Returns:
d -- dictionary containing information about the model.
"""
### START CODE HERE ###
# initialize parameters with zeros (≈ 1 line of code)
w, b = initialize_with_zeros(size(X_train)[1])
# Gradient descent (≈ 1 line of code)
println(typeof(w))
println(typeof(b))
println(typeof(X_train))
println(typeof(Y_train))
println(typeof(num_iterations))
println(typeof(learning_rate))
println(typeof(print_cost))
parameters, grads, costs = optimize(w, b, X_train, Y_train, num_iterations, learning_rate, print_cost)
…
end
I then call
> d = model(train_set_x, train_set_y, test_set_x, test_set_y, num_iterations=2000, learning_rate=0.005, print_cost=true)
which prints the intermediate type information and then crashes:
> Array{Float64,2}
> Float64
> Array{Float64,2}
> Array{Int64,2}
> Int64
> Float64
> Bool
> MethodError: no method matching optimize(::Array{Float64,2}, ::Float64, ::Array{Float64,2}, ::Array{Int64,2}, ::Int64, ::Float64, ::Bool)
> Closest candidates are:
> optimize(::Array{Float64,2}, ::Float64, ::Array{Float64,2}, ::Array{Int64,2}; num_iterations, learning_rate, print_cost) at In[24]:5
>
> Stacktrace:
> [1] #model#8(::Int64, ::Float64, ::Bool, ::Function, ::Array{Float64,2}, ::Array{Int64,2}, ::Array{Float64,2}, ::Array{Int64,2}) at .\In[28]:34
> [2] (::getfield(Main, Symbol("#kw##model")))(::NamedTuple{(:num_iterations, :learning_rate, :print_cost),Tuple{Int64,Float64,Bool}}, ::typeof(model), ::Array{Float64,2}, ::Array{Int64,2}, ::Array{Float64,2}, ::Array{Int64,2}) at .\none:0
> [3] top-level scope at In[29]:1
Can someone help me understand why the method is not recognized even though the argument types appear to match the signature?
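For context, I suspect it is related to how Julia separates positional from keyword arguments at the semicolon in a signature: keyword arguments do not take part in positional dispatch, so passing them by position looks for a method with more positional arguments than any that exists. A minimal sketch of that behavior (the function `f` here is made up for illustration):

```julia
# Everything after the `;` in the signature is a keyword argument.
f(a; k::Int=1) = a + k

f(2; k=3)   # works: keyword passed by name
f(2, 3)     # MethodError: no method matching f(::Int64, ::Int64)
```

Under that reading, the call inside `model` would need to name the keywords, e.g. `optimize(w, b, X_train, Y_train; num_iterations=num_iterations, learning_rate=learning_rate, print_cost=print_cost)` — but I would like confirmation that this is what is going on.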