Julia: 0^0 = 1
Python: 0^0 = 0 (to be fair, `^` in Python is bitwise XOR; its power operator gives 0**0 == 1)
Who is right? A plot shows where Python falls down:
using Plots
X = 1:-0.0001:0.0001          # step down from 1 toward 0
Y = [x^x for x in X]
plot(X, Y, title="Death curve of Python")
println(minimum(Y))           # ≈ 0.6922
println(Y[end])               # x = 0.0001: back up to ≈ 0.9991
x^x does indeed get smaller as x gets smaller, as Python thinks, but then it turns around and grows again as x shrinks further: the minimum is e^(-1/e) ≈ 0.692, reached at x = 1/e ≈ 0.368, after which x^x climbs back toward 1 as x approaches 0.
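Calculus locates the turning point: since x^x = exp(x ln x), its derivative x^x (ln x + 1) vanishes at x = 1/e. A minimal sketch that checks this numerically, written in Python itself (using `**`, Python's actual power operator) over the same grid as the Julia plot:

```python
import math

# Sample x^x on a grid from 0.0001 up to 1.0, mirroring the Julia range.
xs = [i / 10000 for i in range(1, 10001)]
ys = [x ** x for x in xs]

# Find where the minimum occurs.
i_min = min(range(len(ys)), key=ys.__getitem__)

print(xs[i_min])   # near 1/e ≈ 0.3679
print(ys[i_min])   # near e^(-1/e) ≈ 0.6922
print(ys[0])       # at x = 0.0001, x^x has climbed back to ≈ 0.9991
```

So even Python agrees with the picture once you spell the power operator correctly: the curve bottoms out at (1/e, e^(-1/e)) and heads back up toward 1.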
You can see this with calculus, but once computing reaches the smallest Planck limit, and it becomes obvious this is a digital universe, calculus will only be taught in history classes, alongside other myths like the ancient religions. Newton pulled the wool over everyone's eyes to get where he needed to go, and millions of math freshmen have suffered through the fantasy of infinitesimals ever since. We will all soon be free of them.