Gradient of a scalar (NOT vector-valued) function that has vector arguments

Hi guys,

I am trying to write some code in Julia (my first Julia project) and I am having trouble computing the gradient of a function with vector arguments.

I would like to compute the derivative with respect to only one of those vector arguments.

If you need further details of my code (or pseudocode), please let me know and I will be happy to provide them.

I should also mention that I have already tried ForwardDiff, but without success.

I thank you in advance for your time and support.



Please refer to this topic and post an MWE - I don’t think your question can be answered at present without additional information.


Just use ForwardDiff.gradient:

julia> using ForwardDiff

julia> f(x) = sum(x)
f (generic function with 1 method)

julia> ForwardDiff.gradient(f, [1., 2.])
2-element Array{Float64,1}:
 1.0
 1.0

Edit: hadn’t noticed you only want the derivative with respect to one of the arguments.

You can use a closure:

julia> using ForwardDiff

julia> f(x, y) = sum(x .* y)
f (generic function with 1 method)

julia> ForwardDiff.gradient(x -> f(x, [3, 4]), [1., 2.])
2-element Array{Float64,1}:
 3.0
 4.0

@nilshg Hi nilshg, and thank you for your advice. I was trying to create a minimal working example that represents what I would like to achieve with my code, but in the meantime @tkoolen answered, and his suggestion is in line with what I was experimenting with. I will carry on from there.

@tkoolen Hi tkoolen, and thank you very much for your answer. This part of your answer:

ForwardDiff.gradient(x -> f(x, [3, 4]), [1, 2])

is exactly what I was interested in, but when I tried to implement it I got the same error as when I implemented it myself as follows:

julia> using ForwardDiff
julia> f(x, y) = sum(x .* y)
julia> f2(x) = f(x, [3, 4])
julia> ForwardDiff.gradient(f2, [1, 2])

It seems to me that the error has to do with the value of x, which comes out to be:

::Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")

The “PreImageProblems” is a module of my project.



It would be a lot easier to help you if you posted the exact error message. Everything works fine on my machine, so I would first try ]up ForwardDiff to make sure you are running the latest version. Also make sure you are running a fresh Julia session. If that doesn’t work, the output of versioninfo() and ]st ForwardDiff would be helpful.


I agree with simeonschaub. This code runs fine in a new Julia session. The fact that the tag is getfield(PreImageProblems, Symbol("##4#5") means that the code you posted cannot be what produced the error, regardless of ForwardDiff version.

@simeonschaub and @tkoolen Hi guys, and thank you very much for your help. Maybe I wasn’t clear enough, and I am sorry for that. As you both say, the code tkoolen provided and the code I posted work in a fresh Julia session. But when I try to incorporate it into my own code, it generates the following error:

ERROR: LoadError: MethodError: no method matching (::PolynomialKernel{Int64,Int64})(::Array{Float64,1}, ::Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2},1})
Closest candidates are:
  PolynomialKernel(::Array{T₃<:Real,1}, ::Array{T₃<:Real,1}) where T₃<:Real at /home/x_user/.julia/dev/PreImageProblems/src/PreImageProblems.jl:28
  PolynomialKernel(::Union{AbstractArray{T₃<:Real,1}, AbstractArray{T₃<:Real,2}}, ::Union{AbstractArray{T₃<:Real,1}, AbstractArray{T₃<:Real,2}}) where T₃<:Real at /home/x_user/.julia/dev/PreImageProblems/src/PreImageProblems.jl:35
 [1] cost_function(::KernelPCA{Float64}, ::Type{Denoising}, ::Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2},1}) at /home/x_user/.julia/dev/PreImageProblems/src/pre_image_problem.jl:33
 [2] (::getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)})(::Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2},1}) at /home/x_user/.julia/dev/PreImageProblems/src/pre_image_problem.jl:87
 [3] vector_mode_gradient(::getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)}, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2},1}}) at /home/x_user/.julia/packages/ForwardDiff/N0wMF/src/apiutils.jl:37
 [4] gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2},1}}, ::Val{true}) at /home/x_user/.julia/packages/ForwardDiff/N0wMF/src/gradient.jl:17
 [5] gradient(::Function, ::Array{Float64,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2,Array{ForwardDiff.Dual{ForwardDiff.Tag{getfield(PreImageProblems, Symbol("##4#5")){KernelPCA{Float64},DataType,typeof(PreImageProblems.cost_function)},Float64},Float64,2},1}}) at /home/x_user/.julia/packages/ForwardDiff/N0wMF/src/gradient.jl:15 (repeats 2 times)
 [6] #update_step#3(::Float64, ::Float64, ::Function, ::KernelPCA{Float64}, ::Type{WithRespectToData}, ::Type{Denoising}, ::Array{Float64,1}, ::Int64, ::typeof(PreImageProblems.cost_function)) at /home/x_user/.julia/dev/PreImageProblems/src/pre_image_problem.jl:87
 [7] update_step at /home/x_user/.julia/dev/PreImageProblems/src/pre_image_problem.jl:71 [inlined]
 [8] pre_image_non_neg_constraint(::Type{KernelPCA}, ::PolynomialKernel{Int64,Int64}, ::Type{Denoising}, ::Type{WithRespectToData}, ::Array{Float64,2}, ::Int64) at /home/x_user/.julia/dev/PreImageProblems/src/pre_image_problem.jl:50
 [9] top-level scope at none:0
in expression starting at /home/x_user/.julia/dev/PreImageProblems/src/polyDataDenoisExp.jl:76

I believe Julia and the other packages are up to date, but in any case here is the version information:

julia> versioninfo()
Julia Version 1.1.0
Commit 80516ca202 (2019-01-21 21:24 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: Intel(R) Core(TM) i7-8700K CPU @ 3.70GHz
  LIBM: libopenlibm
  LLVM: libLLVM-6.0.1 (ORCJIT, skylake)
  JULIA_EDITOR = atom  -a

If you have the patience to investigate this further, I will be happy to provide details so we can work toward a minimal working example.

Again I thank you very much for your support.



It looks like you designed the PolynomialKernel constructor such that the element type of the first and second argument must be the same, namely T₃. But when you use ForwardDiff, the constructor is called with an array with element type ForwardDiff.Dual as the second argument. One way to fix this is to allow heterogeneous element types in the PolynomialKernel constructor (and possibly the type itself). Another is to promote the first argument to an array of the same element type as the second argument. Basically, your code is currently not sufficiently generic.
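To illustrate the first fix, here is a minimal, hypothetical sketch (the real PolynomialKernel definition isn’t shown in this thread, so the struct name, fields, and kernel formula below are assumptions): the call signature accepts any Real element types for the two arguments independently, so a Vector{Float64} can be mixed with a Vector{ForwardDiff.Dual}:

```julia
using ForwardDiff

# Hypothetical polynomial kernel k(x, y) = (x ⋅ y + c)^d.
struct PolyKernel{C<:Real,D<:Integer}
    c::C
    d::D
end

# Generic call: x and y may have *different* Real element types,
# e.g. Vector{Float64} and Vector{ForwardDiff.Dual{...}}.
# (ForwardDiff.Dual <: Real, so the <:Real constraint still admits it.)
(k::PolyKernel)(x::AbstractVector{<:Real}, y::AbstractVector{<:Real}) =
    (sum(x .* y) + k.c)^k.d

k = PolyKernel(1.0, 2)
g = ForwardDiff.gradient(x -> k(x, [3.0, 4.0]), [1.0, 2.0])
# g == [72.0, 96.0], since ∇ₓ(x ⋅ y + c)^2 = 2(x ⋅ y + c) y = 2 · 12 · [3, 4]
```

The key point is that neither the struct nor the call method ties both arguments to a single shared type parameter, so ForwardDiff can pass in Dual numbers for the argument being differentiated while the other stays Float64.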


@tkoolen Hi tkoolen and thank you very much for your help. I wasn’t aware of:

struct Partials{N,V} <: AbstractVector{V}


struct Dual{T,V<:Real,N} <: Real

in ForwardDiff.

It seems I have to generalize the PolynomialKernel constructor so that it accepts not only arguments sharing one T₃<:Real element type but also mixed Real/Dual arguments.

I will let you know if I am successful in implementing it.
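For completeness, the promotion alternative tkoolen mentioned can also be sketched as a plain function (again hypothetical: the function name and kernel formula are assumptions, not the thread’s actual code). Both arrays are widened to a common element type before the computation, so Dual always wins over Float64:

```julia
using ForwardDiff

# Hypothetical kernel k(x, y) = (x ⋅ y + c)^d, with explicit promotion.
function poly_kernel(x::AbstractVector{<:Real}, y::AbstractVector{<:Real};
                     c = 1.0, d = 2)
    T = promote_type(eltype(x), eltype(y))   # Dual wins over Float64
    xs = convert(AbstractVector{T}, x)
    ys = convert(AbstractVector{T}, y)
    return (sum(xs .* ys) + c)^d
end

# Differentiate with respect to the *second* argument this time:
g = ForwardDiff.gradient(y -> poly_kernel([3.0, 4.0], y), [1.0, 2.0])
# g == [72.0, 96.0]
```

This keeps a strictly-typed struct intact and handles the widening at the call site instead; either approach resolves the MethodError above.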




Hi tkoolen,

sorry for the delay in getting back to you, but just to confirm that your suggestion worked fine.

Once more thank you very much.


