Backprop error: Complex numbers in neuron weights?


#1

Any thoughts on how I may have introduced complex numbers into the algorithm by accident?
Ah, it seems to have disappeared now =/ but it was an error like:

minimum(matrixX) ^ 0.2
Domain Error: Complex result requires a complex argument
[1] in math.jl line 300
[2] in math.jl line 699
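For what it's worth, this is easy to reproduce whenever the matrix contains a negative entry — minimum then returns a negative number, and raising a negative Float64 to a fractional power throws. A minimal sketch (matrixX here is a made-up example, not the original dataset):

```julia
# Hypothetical matrix with a negative entry; any matrix whose minimum
# is negative triggers the same DomainError.
matrixX = [1.0 -2.0; 3.0 4.0]

# minimum(matrixX) is -2.0; a negative real base with a fractional
# exponent throws a DomainError in Julia.
try
    minimum(matrixX) ^ 0.2
catch err
    println(err isa DomainError)  # true
end
```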


#2

I thought it might be a complex number hiding in the dataset, but it doesn’t seem to be that either


#3

There's a confusing parsing problem going on in your example. -2.68e15^0.2 is parsed as -(2.68e15^0.2), which is a perfectly good real number. Try (-2.68e15)^0.2 and it'll error; (-2)^0.2 is an even simpler example. Fractional exponentiation is defined via the complex log function. The easiest example to think of here is that (-1)^0.5 = sqrt(-1) = im, so in this domain the result cannot be guaranteed to be real. It gets even worse: exponentials/logs don't always have unique solutions, so you have to take principal branches and so on. Here's a blurb:
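The precedence point above is easy to check at the REPL — unary minus binds more loosely than ^, so only the parenthesized versions actually hit the domain error:

```julia
# Parsed as -(2.68e15^0.2): a real (negative) number, no error.
a = -2.68e15^0.2
println(a isa Float64)        # true

# Parenthesize the negative base and exponentiation throws.
for x in (-2.68e15, -2)
    try
        x^0.2
    catch err
        println(typeof(err))  # DomainError
    end
end
```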

But long story short, negative numbers raised to non-integer exponents are generally defined as complex valued (via the complex exponential defined through the principal branch of the complex log), which is what causes this trouble.
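In practice there are two ways out, depending on what you actually want. If a complex result is acceptable, promote the base to Complex and Julia will use the principal branch; if you need to stay real (as you usually do in backprop), guard the base first. The signed-root trick below is one hypothetical workaround, not a universal fix:

```julia
# Option 1: explicitly opt in to the complex principal-branch result.
z = Complex(-2.0)^0.2
println(z isa Complex)               # true

# Option 2 (hypothetical real-valued alternative): an odd-root-style
# "signed root" that keeps the computation in the reals.
x = -2.0
y = sign(x) * abs(x)^0.2
println(y < 0)                       # true
```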