I’m so sad… I tried to apply your suggestions to one of my functions and it doesn’t work:
using Optim

X = rand(50, 6)
h(beta0,beta1,beta2,beta3,beta4,beta5,c1,c2,c3,c4,c5)=-sum(X*[beta0,c1*beta1,c2*beta2,c3*beta3,c4*beta4,c5*beta5])+sum(log.(1 .+exp.(X*[beta0,c1*beta1,c2*beta2,c3*beta3,c4*beta4,c5*beta5])))+(sum([c1,c2,c3,c4,c5])+1)/2*log(2*pi)+1/2*sum([beta0,c1*beta1,c2*beta2,c3*beta3,c4*beta4,c5*beta5].^2)
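For readability, here is the same objective with the betas and the c’s grouped into vectors (h_vec is just a name I use here; it is only a restatement of the formula above, where beta stands for [beta0,…,beta5] and c for [c1,…,c5], reusing the X defined above):

# Same objective as h, written with vector arguments; a restatement only, not new behaviour
function h_vec(beta, c)
    b  = vcat(beta[1], c .* beta[2:6])        # [beta0, c1*beta1, ..., c5*beta5]
    Xb = X * b                                 # reuses the X defined above
    return -sum(Xb) + sum(log.(1 .+ exp.(Xb))) +
           (sum(c) + 1) / 2 * log(2pi) + sum(b .^ 2) / 2
end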
To test the function, I tried:
optimize((beta0,beta1,beta2,beta3,beta4,beta5) -> h(beta0,beta1,beta2,beta3,beta4,beta5,1.0,1.0,1.0,1.0,1.0), ones(6), ConjugateGradient())
but I get the following error:
ERROR: MethodError: no method matching (::getfield(Main, Symbol("##57#58")))(::Array{Float64,1})
Closest candidates are:
#57(::Any, ::Any, ::Any, ::Any, ::Any, ::Any) at none:1
Stacktrace:
[1] #finite_difference_gradient!#22(::Float64, ::Float64, ::Bool, ::Function, ::Array{Float64,1}, ::getfield(Main, Symbol("##57#58")), ::Array{Float64,1}, ::DiffEqDiffTools.GradientCache{Nothing,Nothing,Nothing,Val{:central},Float64,Val{true}}) at C:\Users\User\.julia\packages\DiffEqDiffTools\uD0fb\src\gradients.jl:321
[2] finite_difference_gradient! at C:\Users\User\.julia\packages\DiffEqDiffTools\uD0fb\src\gradients.jl:273 [inlined]
[3] (::getfield(NLSolversBase, Symbol("#g!#15")){getfield(Main, Symbol("##57#58")),DiffEqDiffTools.GradientCache{Nothing,Nothing,Nothing,Val{:central},Float64,Val{true}}})(::Array{Float64,1}, ::Array{Float64,1}) at C:\Users\User\.julia\packages\NLSolversBase\NsXIC\src\objective_types\oncedifferentiable.jl:56
[4] (::getfield(NLSolversBase, Symbol("#fg!#16")){getfield(Main, Symbol("##57#58"))})(::Array{Float64,1}, ::Array{Float64,1}) at C:\Users\User\.julia\packages\NLSolversBase\NsXIC\src\objective_types\oncedifferentiable.jl:60
[5] value_gradient!!(::OnceDifferentiable{Float64,Array{Float64,1},Array{Float64,1}}, ::Array{Float64,1}) at C:\Users\User\.julia\packages\NLSolversBase\NsXIC\src\interface.jl:82
[6] initial_state(::ConjugateGradient{Float64,Nothing,getfield(Optim, Symbol("##30#32")),LineSearches.InitialHagerZhang{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}}}, ::Optim.Options{Float64,Nothing}, ::OnceDifferentiable{Float64,Array{Float64,1},Array{Float64,1}}, ::Array{Float64,1}) at C:\Users\User\.julia\packages\Optim\L5T76\src\multivariate\solvers\first_order\cg.jl:113
[7] optimize at C:\Users\User\.julia\packages\Optim\L5T76\src\multivariate\optimize\optimize.jl:33 [inlined]
[8] #optimize#93(::Bool, ::Symbol, ::Function, ::Function, ::Array{Float64,1}, ::ConjugateGradient{Float64,Nothing,getfield(Optim, Symbol("##30#32")),LineSearches.InitialHagerZhang{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}}}, ::Optim.Options{Float64,Nothing}) at C:\Users\User\.julia\packages\Optim\L5T76\src\multivariate\optimize\interface.jl:116
[9] optimize(::Function, ::Array{Float64,1}, ::ConjugateGradient{Float64,Nothing,getfield(Optim, Symbol("##30#32")),LineSearches.InitialHagerZhang{Float64},LineSearches.HagerZhang{Float64,Base.RefValue{Bool}}}, ::Optim.Options{Float64,Nothing}) at C:\Users\User\.julia\packages\Optim\L5T76\src\multivariate\optimize\interface.jl:115 (repeats 2 times)
[10] top-level scope at none:0
I think it perhaps comes from an Integer vs. Float problem, but I don’t see where… When I write
h(2,3,2,2,2,3,1,1,1,1,1)
for example, it works and returns a value.
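For comparison, the same call written with Float literals would be:
h(2.0, 3.0, 2.0, 2.0, 2.0, 3.0, 1.0, 1.0, 1.0, 1.0, 1.0)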
Otherwise, how should I set up possible_configurations in the for loop so that Julia takes (c1, c2, c3, c4, c5) as a vector of values at each iteration? With zip?
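To make that last question concrete, here is roughly the shape of loop I have in mind (the candidate values for the c’s below are just placeholders, not my real configurations):

# Sketch only: placeholder lists of candidate values for each c
c1s = [0.0, 1.0]; c2s = [0.0, 1.0]; c3s = [0.0, 1.0]; c4s = [0.0, 1.0]; c5s = [0.0, 1.0]
# zip pairs the lists element by element, so each iteration yields one tuple (c1, ..., c5)
for cs in zip(c1s, c2s, c3s, c4s, c5s)
    c = collect(cs)   # turn the tuple into a Vector{Float64}
    @show c
end

If what I actually need is every combination of the c’s rather than element-wise pairs, I suppose Iterators.product would be the alternative to zip, but that is exactly the part I am unsure about.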