Defining own math rules

Hey guys, I wondered if anyone could help explain this to me. I am trying to do something more advanced than I am used to: I want to define my own math rules in Julia via some kind of operator. Imagine I have a case where I want:

using SymEngine

x,a,∂ = symbols(:x),symbols(:a),symbols(:∂) #\partial

# I want to have that:

∂*(a*x) = derivative of a*x

# And that if I change the order:

(a*x)*∂ = (a*x) * (derivative not yet taken) 

Such that depending on the order of multiplication I get two different results - a simple example would be to compare it with matrices, where a different ordering can also give different results. Could anyone point me to some resources on how to do something like this, or provide a simple example defining some arbitrary math rule? I've been trying to read the mathematical operations part of the wiki, but I can't really see how to implement it in Julia.

I have knowledge of how to make the differential operator, my question is basically just about how to make a rule which acknowledges the order of multiplication.

Kind regards


Have you had a look at this? It seems to be exactly what you're after.

Thanks will take a look at it, and then be back.

Kind regards

As I said in my other post, I have an implementation that will address this issue. However, I recommend against making multiplication by a partial differential operator non-commutative; that idea stems from conflating a differential operator with a differential form. The solution I will be implementing in Grassmann.jl is to use a basis for differential forms: operator symbolic expressions are treated separately from the dependent variables of a multivariable tensor field, with one basis for differential operators and another for differential forms. Multiplying differential operators with each other is then different from multiplying a differential operator with a differential form, and commutativity doesn't matter. As I mentioned before, this is not fully implemented yet. The non-commutative operator you propose becomes unnecessary once you distinguish differential operators from differential forms.

Anyway, you can easily define non-commutative operations in Julia without any package; it is already possible with the built-in language. My point is that this is the wrong approach for the behavior you want: what you want is differential operators and differential forms.
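To illustrate the "no package needed" point, here is a minimal sketch in plain Julia. The names `D`, `Poly`, and `Pending` are toy types I made up for illustration, not anything from Leibniz or Grassmann: `*` with the operator on the left differentiates immediately, while `*` with the operator on the right just wraps the product unevaluated, so the two orders dispatch to different methods.

```julia
# Toy derivative operator: order-dependent `*` via multiple dispatch.
struct D end                       # stands in for the operator symbol ∂

struct Poly                        # toy univariate polynomial
    c::Vector{Float64}             # c[1] + c[2]*x + c[3]*x^2 + ...
end

# Operator on the left: take the derivative immediately.
Base.:*(::D, p::Poly) = Poly([i * p.c[i+1] for i in 1:length(p.c)-1])

# Operator on the right: keep the product unevaluated.
struct Pending                     # represents p * ∂, derivative not yet taken
    p::Poly
end
Base.:*(p::Poly, ::D) = Pending(p)

∂ = D()
p = Poly([0.0, 3.0, 1.0])          # 3x + x^2
∂ * p                              # Poly([3.0, 2.0]), i.e. 3 + 2x
p * ∂                              # Pending(p): same symbols, different order
```

Since Julia dispatches on the types of both arguments, `∂ * p` and `p * ∂` are simply two different methods; there is no built-in assumption that `*` commutes.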

Not sure when I will implement this feature, but I’d like to be able to demonstrate it soon.

However, I have found that to get the behavior you want using my original Leibniz-based script, all you need to do is additionally define the operation

Base.:*(r::Basic,d::Monomial{V,G,D,O}) where {V,G,D,O} = Monomial{V,G,D,O,Basic}(r)

then you will have the non-commutative property you desired. However, as I mentioned in my previous post, this is not how things will work when I design the packages, because I use differential forms. I'm also not sure whether all of the other dispatch will work correctly with this, as I haven't spent time testing it further.

Ah okay, I just googled a bit and I understand that a "differential operator" would be, for example, nabla, but I am having a bit of trouble understanding what a differential form is, i.e. why there is this distinction between the two concepts.
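From what I can gather so far (please correct me if my reading is off), the distinction seems to be roughly:

$$\partial_x : f \mapsto \frac{\partial f}{\partial x}, \qquad df = \frac{\partial f}{\partial x}\,dx$$

i.e. the operator $\partial_x$ consumes a function and returns a function, while $dx$ is a basis covector that the derivative gets attached to in the form $df$, so they live in different spaces and multiplying them is a different kind of product.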

I’ve tried to add your line to the previous code (in the link you referenced), but I don’t feel like I am getting the right result. The basic example now works:

using Leibniz, DirectSum, SymEngine
import Leibniz: Monomial
printindices(V,D) = (io=IOBuffer();DirectSum.printindices(io,V,D,true);String(take!(io)))
Base.:*(d::Monomial{V,G,D,0} where G,r::Basic) where {V,D} = diff(r,symbols(Symbol(printindices(V,D))))
Base.:*(d::Monomial{V,1,D,1},r::Basic) where {V,D} = diff(r,symbols(Symbol((printindices(V,D)))))
Base.:*(r::Basic,d::Monomial{V,G,D,O}) where {V,G,D,O} = Monomial{V,G,D,O,Basic}(r)
# x and y become re-expressed as v1 and v2 symbolically - this doesn't really
# matter, since in the end the expression gives a number. The shape function can
# look a bit annoying though.
# ℕ(symbols(:x),symbols(:y)), to make it easy to see, so now it is fine
x,y = symbols(:v1),symbols(:v2)

∂ₓ = ∂(ℝ,1)
∂ᵧ = ∂(ℝ,2) #\_gamma
vec = [2*x;4*y;0]
nabla = [∂ₓ;∂ᵧ;0]

# This gives the derivative of vec = [2,4,0] as expected

approach1 = nabla .* vec

# This gives the "function" in vec * the operator as expected, [2*v1∂₁;4*v2∂₂;0]:

approach2 = vec   .* nabla

# But now trying to use this as such:

approach3 = (vec  .* nabla).*vec

# I expect basically approach2 * d(approach1)/dR, i.e. [2*v1∂₁;4*v2∂₂;0] .* [2*x;4*y;0] = [4*v1;8*v2;0]
# Instead I get:

approach3 = (vec  .* nabla).*vec #[2;4;0]

Maybe I am defining something a bit wrong or my understanding is still off - I am having a lot of fun trying to understand it, thanks for taking time to explain.

Kind regards

Yes, I can see that you are having fun with it. This prototype is only a crude approximation of what I am trying to do with the Grassmann package. With the next update, the inner products you are trying to compute should be possible using Grassmann instead of Leibniz, and then I would like to work on answering your question (I no longer want to support Leibniz; Grassmann will be the main software). It would be great if you could test and play around with this functionality after I finalize the differential operator implementation in Grassmann. I will try to have this feature available within the next few days or a week, because I want to try this out as well. It will take some testing to find all the bugs.

Awesome - feel free to message me when you need someone to test; I would love to contribute however I can, which is probably by testing :)

Kind regards