Is it possible to multiply with a derivative operator?

Sorry for opening this thread again; I’ve just been thinking so much about this lately because it has been so exciting for me. So I made a little script to test some different things out:

# https://discourse.julialang.org/t/is-it-possible-to-multiply-with-a-derivative-operator/28914/10
# julia> DirectSum.printindices(stdout,DirectSum.indices(UInt(2^62-1)))
#v₁₂₃₄₅₆₇₈₉₀abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ
using Leibniz, DirectSum, SymEngine, LinearAlgebra
import Leibniz: Monomial
printindices(V,D) = (io=IOBuffer();DirectSum.printindices(io,V,D,true);String(take!(io)))
Base.:*(d::Monomial{V,G,D,0} where G,r::Basic) where {V,D} = diff(r,symbols(Symbol(printindices(V,D))))
Base.:*(d::Monomial{V,1,D,1},r::Basic) where {V,D} = diff(r,symbols(Symbol((printindices(V,D)))))
# x and y become re-expressed as v1 and v2 symbolically - this doesn't really
# matter, since in the end the expression gives a number. Shape functions can
# look a bit annoying though.
# ℕ(symbols(:x),symbols(:y)), to make it easy to see, so now it is fine
x,y = symbols(:v1),symbols(:v2)
a=symbols(:a); b=symbols(:b);

∂ₓ = ∂(ℝ,1)
∂ᵧ = ∂(ℝ,2) #\_gamma

#x = v1
#y = v2
expr = 2*x^2*y*a*b

# 1st derivatives work

println("Derivative with respect to x:  ",∂ₓ*(expr))

# Double derivatives also work, but multiplication order matters

println("This x-x derivative does not work:  ",(∂ₓ*∂ₓ)*(expr))
println("This x-x derivative works: ",∂ₓ*(∂ₓ*expr))

# Same is true for mixed derivatives

println("This x-y derivative does not work:  ",(∂ₓ*∂ᵧ)*(expr))
println("This x-y derivative works: ",∂ₓ*(∂ᵧ*expr))

# This is fine with me, because this ensures that "sloppy code" is avoided, and
# that the order of differentiation is always visible

# Imagine having a vector as:

vec = [2*x^3;4*b*y;3*a*b]

nabla = [∂ₓ;∂ᵧ;0]

# Taking the dot product, these two approaches give the same result

approach1 = nabla .* vec #Couldn't get dot product function to work..
approach2 = vec   .* nabla

println()
println(transpose(approach1))
println(transpose(approach2))

# But in theory, approach 2 should be [2*x^3*∂ₓ, 4*b*y*∂ᵧ, 0]

The first parts of the script are pretty basic, but what I wanted to talk about is the last part, showing approaches 1 and 2. Approach 2 is in theory wrong: an operator should not be applied to the expression on its left, only to the one on its right, so the result should have been [2*x^3*∂ₓ, 4*b*y*∂ᵧ, 0]. I know that matrix multiplication in Julia is able to handle such cases, so I just wondered whether you think there is an easy way to get this behaviour in Julia. It could be quite neat for writing, for example, the friction term in the Navier-Stokes equations. So one would have:

2*x^3*∂ₓ*(another expression) = 2*x^3*(derivative of another expression)
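For what it’s worth, here is a minimal, purely hypothetical sketch (plain Julia, not the Leibniz API, with a finite difference standing in for symbolic differentiation) of how `expr * ∂ₓ` could be made to defer, so that the operator only ever acts on what is to its right:

```julia
# Hypothetical sketch: a right-acting derivative operator. Multiplying an
# expression (here just a Julia function) by the operator does NOT
# differentiate it; it builds a new operator f(x)*d/dx that waits for an
# argument on its right.
struct Dx end                     # stands in for ∂ₓ
struct ScaledDx{F}                # represents f(x)*∂ₓ, still unapplied
    f::F
end

# central finite difference as a stand-in for symbolic differentiation
deriv(g, x; h=1e-6) = (g(x + h) - g(x - h)) / (2h)

Base.:*(f::Function, ::Dx) = ScaledDx(f)                       # defer
Base.:*(s::ScaledDx, g::Function) = x -> s.f(x) * deriv(g, x)  # apply

∂ₓ  = Dx()
op  = (x -> 2x^3) * ∂ₓ            # 2x³*∂ₓ remains an operator
res = op * (x -> x^2)             # 2x³ * d/dx(x²) = 4x⁴
res(2.0)                         # ≈ 64.0
```

Here the deferred product is just a wrapper struct; in a symbolic setting `ScaledDx` would hold the expression `2*x^3` instead of a closure.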

I don’t want to distract you from developing Grassmann.jl, so if this is very difficult to implement, then please don’t bother. And thanks regarding the Leibniz package: it is a nice feature to be able to write something very “math like” in Julia, even if it costs a little in performance. It makes it very easy in engineering work to verify an algorithm against some physical laws, so I really do hope you keep Leibniz in a form in which this feature is not lost.

Kind regards


In fact, I already have a workaround for this; however, it is not yet fully implemented.

The solution is to use differential forms and tensor fields… this is what I am working on implementing in the Grassmann package, for which Leibniz is a mere prototype. Further development to fix this issue will happen in Grassmann, but it is not ready yet, as I have many other things in life.

It is not too difficult to implement, and I already know how to do it; it’s simply a matter of taking the time to do it when I also have lots of other features and things in life to worry about. The Grassmann package is going to have this built in properly. I am not sure yet what I will do about maintaining the Leibniz package in the future… remember that I am doing this work for free, so please consider donating at liberapay.com/chakravala

Very exciting to hear, thanks for elaborating a bit. And please do not feel pressured by me :slight_smile: I only comment on this topic because I find it really exciting; you of course have other things to do, and I can in no way expect you to work for free, so please don’t think that is the goal of all this. I only got interested in this a few days ago, when I discovered I could use Greek letters in Julia, and from there I have just been moving forward, trying to learn more step by step.

I am looking forward to testing your package in the future. I hope you are able to get funding or become part of the Julia developers, so that you can keep doing all of these neat things.

Kind regards, have a good weekend


https://github.com/chakravala/Grassmann.jl/issues/33

Alright, so the functionality is designed around the concept of differential operators and tensor fields.

julia> using Reduce, Grassmann # can also use SymPy or SymEngine, or others if setup

julia> @mixedbasis tangent(ℝ^2,3,2); # 2D space, 3rd order, 2 variables

Here we load the 2D geometric algebra having 3rd order differentiability with respect to 2 variables.

This gives us some new Basis elements, e.g. ∂1 (a differential operator) and ϵ2 (a tensor field).

julia> ∂1 * (:(x1^2+x2^2)*ϵ2)
(2x1)∂₁ϵ²

julia> ∂1 * ans
2∂₁∂₁ϵ²

julia> ∂1 * ans
0vv

Roughly speaking, multiplying a differential operator by a tensor field applies that particular differential operator to the tensor field with respect to its indices.
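To illustrate that rule with a toy model (hypothetical plain Julia, with finite differences standing in for the symbolic machinery, not the Grassmann implementation): if operator-times-operator composes while operator-times-field applies, then `(∂*∂)*f` and `∂*(∂*f)` agree, which is exactly the associativity the earlier Leibniz script was missing:

```julia
# Hypothetical sketch: operator*operator composes, operator*function applies.
struct Op
    apply::Function               # maps a function to its derivative function
end

deriv(g; h=1e-5) = x -> (g(x + h) - g(x - h)) / (2h)   # finite difference

Base.:*(a::Op, b::Op) = Op(g -> a.apply(b.apply(g)))   # composition, no application
Base.:*(a::Op, g::Function) = a.apply(g)               # application to a field

∂ = Op(deriv)
f = x -> x^3
((∂ * ∂) * f)(1.0)                # ≈ 6.0, same as (∂ * (∂ * f))(1.0)
```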

This feature is very new and hasn’t been completely tested yet. It only correctly applies first-order differential operators at the moment, so it will take a few more commits before it’s fully correct.

Very exciting! I will see if I can make some tests of it in one or two weeks’ time; then I will let you hear back.

Kind regards