How to do partial derivatives?

I’m totally new to Julia. I’m currently working on derivatives. I found how to do second derivatives (e.g. a function like derivative_sec_fd(f, x)), but I still couldn’t find how to do a partial derivative…
Could you tell me which command I should use?

Are you trying to do numeric, automatic, or symbolic differentiation?
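For example (not from the thread; the function names here are just illustrative), a numeric finite-difference partial versus an automatic one:

```julia
using ForwardDiff

f(x, y) = 2y^2 + x^2

# Numeric: central finite difference in x (the step size h is a tuning choice)
fd_partial_x(f, x, y; h=1e-6) = (f(x + h, y) - f(x - h, y)) / 2h

# Automatic: exact to floating-point precision, via dual numbers
ad_partial_x(f, x, y) = ForwardDiff.derivative(x -> f(x, y), x)

fd_partial_x(f, 1.0, 2.0)  # ≈ 2.0
ad_partial_x(f, 1.0, 2.0)  # 2.0
```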


Related (maybe xref?): https://stackoverflow.com/questions/54277219/partial-derivatives-in-julia


You could use automatic differentiation to calculate the partial derivatives of a function:

julia> using ForwardDiff

julia> f(x) = 2*x[2]^2+x[1]^2 # some function
f (generic function with 1 method)

julia> g = x -> ForwardDiff.gradient(f, x); # g is now a function representing the gradient of f

julia> g([1,2]) # evaluate the partial derivatives (gradient) at some point x
2-element Array{Int64,1}:
 2
 8

Thank you for all of the replies. I’ll try it!

Is there anything wrong with doing it this way?

julia> using ForwardDiff

julia> f(x, y) = 2y^2 + x^2
f (generic function with 1 method)

julia> ∂f_∂x(x, y) = ForwardDiff.derivative(x -> f(x, y), x)
∂f_∂x (generic function with 1 method)

julia> ∂f_∂y(x, y) = ForwardDiff.derivative(y -> f(x, y), y)
∂f_∂y (generic function with 1 method)

julia> ∂f_∂x(1, 2)
2

julia> ∂f_∂y(1, 2)
8

Just thought I’d ask to check if it’s unhealthy to define partial derivatives like so.


I think it’s fine. You’re formally introducing y via a closure, which is usually pretty efficient in Julia.

Disclaimer: I do this myself for some of my codes.


Not particularly, but if you need both partial derivatives anyway, just define a function on a vector and use ForwardDiff.gradient.


This is true, but given f(x, y),

g(x) = f(x[1], x[2])
∇g(x) = ForwardDiff.gradient(g, x)
∂f_∂x(x, y) = ∇g([x, y])[1]
∂f_∂y(x, y) = ∇g([x, y])[2]

is noticeably longer and somewhat less elegant than

∂f_∂x(x, y) = ForwardDiff.derivative(x -> f(x, y), x)
∂f_∂y(x, y) = ForwardDiff.derivative(y -> f(x, y), y)

in my opinion.

I’ve needed second derivatives too, so the latter method is much less verbose (and accomplishes the same thing, AFAIU).
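If second derivatives are what’s needed, the closure style nests naturally (a sketch, not from the thread):

```julia
using ForwardDiff

f(x, y) = 2y^2 + x^2

∂f_∂y(x, y) = ForwardDiff.derivative(y -> f(x, y), y)
# Differentiate the first partial again to get the second partial
∂²f_∂y²(x, y) = ForwardDiff.derivative(y -> ∂f_∂y(x, y), y)

∂²f_∂y²(1.0, 2.0)  # 4.0, since ∂²(2y²)/∂y² = 4
```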


Someone on Slack (sorry, I forgot who) gave me this snippet:

using ForwardDiff: derivative, partials, Dual

# Partial derivative of f (a function of a vector x) with respect to x[i]:
# seed only the i-th coordinate with a unit perturbation and read off the partial.
partiali(f, x, i) = partials(f([Dual{}(x[j], 1.0*(i==j)) for j in eachindex(x)]))[]


struct ∂{k, F} <: Function
    f::F
    ∂{k}(f::F) where {k, F} = new{k, F}(f)
end
# ∂{k}(f) is callable: the partial derivative of f with respect to its k-th argument
function (pd::∂{k})(args...) where {k}
    N = length(args)
    derivative(argk -> pd.f(ntuple(i -> i == k ? argk : args[i], Val(N))...), args[k])
end
f(x, y, z, w) = x + 2y + 3z + 4w
∂{3}(f)(1, 1, 1, 1) # ∂f/∂z = 3
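Quick usage check for the one-line partiali (not from the thread; g is just an illustrative function of a vector):

```julia
using ForwardDiff
using ForwardDiff: Dual

# Partial of f with respect to x[i]: seed only coordinate i
partiali(f, x, i) = ForwardDiff.partials(f([Dual{}(x[j], 1.0*(i==j)) for j in eachindex(x)]))[]

g(x) = x[1]^2 + 2x[2]^2
partiali(g, [1.0, 2.0], 2)  # 8.0, i.e. ∂g/∂x₂ = 4x₂ at x₂ = 2
```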

Addendum via @cscherrer

julia> using StructArrays

julia> using ForwardDiff

julia> function partiali(n,i)
           ith = zeros(n)
           ith[i] += 1
           function (f,x)
               sa = StructArray{ForwardDiff.Dual{}}((x, ith))
               return f(sa)
           end
       end
partiali (generic function with 1 method)

julia> f(x) = sum(x -> x^2, x)
f (generic function with 1 method)

julia> x = randn(1000);

julia> ∂₅₀ = partiali(1000,50);

julia> x[50]
2.1129129387789667

julia> using BenchmarkTools

julia> @btime $f($x)
  82.987 ns (0 allocations: 0 bytes)
1000.4848272576928

julia> @btime $∂₅₀($f,$x)
  167.038 ns (0 allocations: 0 bytes)
Dual{Nothing}(1000.4848272576928,4.225825877557933)

Thanks @mschauer 🙂

This is the approach I’d go with if you just need one partial at a time, say for coordinate descent or Gibbs sampling. A big advantage of doing it this way is that by using a StructArray you can use the original data with no new allocations.
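For completeness (not from the thread): pulling the value and the seeded partial out of the returned Dual, reusing the same construction:

```julia
using ForwardDiff, StructArrays

# As in the thread: view the original data plus a seed vector as an array of
# dual numbers via StructArray, so no copy of x is made
function partiali(n, i)
    ith = zeros(n)
    ith[i] = 1.0
    function (f, x)
        sa = StructArray{ForwardDiff.Dual{}}((x, ith))
        return f(sa)
    end
end

f(x) = sum(x -> x^2, x)
x = [1.0, 2.0, 3.0]
d = partiali(3, 2)(f, x)

ForwardDiff.value(d)        # 14.0 = f(x)
ForwardDiff.partials(d)[1]  # 4.0 = ∂f/∂x₂ = 2x₂
```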