# Custom Flux loss function: `culiteral_pow()` error

Hi, I’m trying to build an MLP for a regression problem. A sample of my input looks like this:

$$
x = \begin{bmatrix} \hat{m}_1 \\ \vdots \\ \hat{m}_k \\ \hat{n}_1 \\ \vdots \\ \hat{n}_k \\ P \\ Q \end{bmatrix},
$$

where $P$ and $Q$ similarly run from $1$ to $k$.
A sample of my target looks like this:

$$
y = \begin{bmatrix} m_1 \\ \vdots \\ m_k \\ n_1 \\ \vdots \\ n_k \end{bmatrix}.
$$

For this particular problem, $\hat{m}_i$ and $m_i$ are already very close before any training, but $\hat{n}_i$ and $n_i$ are not. There is a second model that consumes the final test-set predictions, and $\mathrm{model}(\hat{n}_i)$ needs to be as close to $n_i$ as possible; I care less about $\hat{m}_i$ matching $m_i$ (obviously I want them close, but it’s not as important). So I tried modifying the plain MSE loss into this:

```julia
loss = Flux.mse(model(x)[1:k, :], y[1:k, :]) +
       λ * Flux.mse(model(x)[k+1:end, :], y[k+1:end, :])
```

where $\lambda > 1$ and is subject to tuning.
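In case it helps to see it in context, here is a minimal runnable sketch of this weighted loss on the CPU. The layer sizes, `k`, and `λ` below are made-up values just for illustration; calling the model once and slicing the result also saves the second forward pass:

```julia
using Flux

k = 3                       # hypothetical block size
λ = 10.0                    # extra weight on the n-block, subject to tuning
model = Chain(Dense(2k + 2, 16, relu), Dense(16, 2k))

# Weighted MSE: one forward pass, then slice rows.
# Rows 1:k are the m-block, rows k+1:end the n-block.
function weighted_loss(x, y)
    ŷ = model(x)
    Flux.mse(ŷ[1:k, :], y[1:k, :]) +
        λ * Flux.mse(ŷ[k+1:end, :], y[k+1:end, :])
end

x = rand(Float32, 2k + 2, 5)    # 5 dummy samples
y = rand(Float32, 2k, 5)
weighted_loss(x, y)
```

As far as I know, range indexing on the model output is differentiable, so the definition itself should be fine; the error below comes from the GPU broadcast, not the slicing.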

However, I’m getting the following error:

```
ERROR: MethodError: no method matching culiteral_pow(::typeof(^), ::ForwardDiff.Dual{Nothing,Float32,1}, ::Val{2})
Closest candidates are:
culiteral_pow(::typeof(^), ::Union{Float32, Float64}, ::Val{2}) at C:\Users\me\.julia\packages\CuArrays\eFBar\src\broadcast.jl:46
culiteral_pow(::typeof(^), ::Union{Float32, Float64}, ::Val{p}) where p at C:\Users\me\.julia\packages\CuArrays\eFBar\src\broadcast.jl:48
culiteral_pow(::typeof(^), ::Union{Float32, Float64}, ::Val{0}) at C:\Users\me\.julia\packages\CuArrays\eFBar\src\broadcast.jl:44
...
Stacktrace:
[12] #547 at C:\Users\me\.julia\packages\Tracker\RRYy6\src\lib\array.jl:488 [inlined]
[13] macro expansion at .\sysimg.jl:300 [inlined]
[14] ntuple at .\sysimg.jl:296 [inlined]
[17] back(::Tracker.Tracked{CuArray{Float32,2}}, ::CuArray{Float64,2}, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:58
[18] #13 at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:38 [inlined]
[19] foreach at .\abstractarray.jl:1867 [inlined]
[20] back_(::Tracker.Call{getfield(Tracker, Symbol("##482#483")){TrackedArray{…,CuArray{Float32,2}}},Tuple{Tracker.Tracked{CuArray{Float32,2}}}}, ::Float64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:8
[21] back(::Tracker.Tracked{Float32}, ::Float64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:58
[22] #13 at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:38 [inlined]
[23] foreach at .\abstractarray.jl:1867 [inlined]
[24] back_(::Tracker.Call{getfield(Tracker, Symbol("##278#281")){Rational{Int64}},Tuple{Tracker.Tracked{Float32},Nothing}}, ::Float64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:38
[25] back(::Tracker.Tracked{Float32}, ::Float64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:58
[26] #13 at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:38 [inlined]
[27] foreach at .\abstractarray.jl:1867 [inlined]
[28] back_(::Tracker.Call{getfield(Tracker, Symbol("##279#282")){Float64},Tuple{Nothing,Tracker.Tracked{Float32}}}, ::Float64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:38
[29] back(::Tracker.Tracked{Float64}, ::Float64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:58
[30] #13 at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:38 [inlined]
[31] foreach at .\abstractarray.jl:1867 [inlined]
[32] back_(::Tracker.Call{getfield(Tracker, Symbol("##253#256")),Tuple{Tracker.Tracked{Float64},Tracker.Tracked{Float32}}}, ::Float64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:38
[33] back(::Tracker.Tracked{Float64}, ::Int64, ::Bool) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:58
[34] #back!#15 at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:77 [inlined]
[35] #back! at .\none:0 [inlined]
[36] #back!#32 at C:\Users\me\.julia\packages\Tracker\RRYy6\src\lib\real.jl:16 [inlined]
[37] back!(::Tracker.TrackedReal{Float64}) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\lib\real.jl:14
[38] gradient_(::getfield(Flux.Optimise, Symbol("##14#20")){getfield(Main, Symbol("#loss#23")){Float64},Tuple{CuArray{Float32,2},CuArray{Float32,2}}}, ::Tracker.Params) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:4
[39] #gradient#24(::Bool, ::Function, ::Function, ::Tracker.Params) at C:\Users\me\.julia\packages\Tracker\RRYy6\src\back.jl:164
[41] macro expansion at C:\Users\me\.julia\packages\Flux\qXNjB\src\optimise\train.jl:71 [inlined]
[42] macro expansion at C:\Users\me\.julia\packages\Juno\Drrdg\src\progress.jl:119 [inlined]
[43] #train!#12(::getfield(Flux.Optimise, Symbol("##16#22")), ::Function, ::Function, ::Tracker.Params, ::Array{Tuple{CuArray{Float32,2},CuArray{Float32,2}},1}, ::ADAM) at C:\Users\me\.julia\packages\Flux\qXNjB\src\optimise\train.jl:69
[44] train!(::Function, ::Tracker.Params, ::Array{Tuple{CuArray{Float32,2},CuArray{Float32,2}},1}, ::ADAM) at C:\Users\me\.julia\packages\Flux\qXNjB\src\optimise\train.jl:67
[45] train_net(::String, ::Array{Float32,2}, ::Array{Float32,2}, ::Float64, ::Int64, ::Int64, ::Float64, ::Int64, ::Int64, ::Bool) at F:\work\Merge\train_nc.jl:127
[46] train_net(::String, ::Array{Float32,2}, ::Array{Float32,2}, ::Float64, ::Int64, ::Int64) at F:\work\Merge\train_nc.jl:52
[47] main(::Array{String,1}) at .\util.jl:156
[48] top-level scope at none:0
```


I’d appreciate any suggestions on how to solve this, and/or a better custom loss function.


Found a corresponding issue on GitHub:

Looks like there is a fix but it hasn’t been merged yet.
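Until that fix is merged, one possible workaround (my assumption, since the error comes from broadcasting a literal `^2` over a `CuArray` during the backward pass, and `Flux.mse` is written with `.^ 2`) is to spell out the squared error without a literal power:

```julia
# Hypothetical MSE replacement that avoids the x .^ 2 broadcast,
# which lowers to literal_pow and triggers the CuArrays MethodError.
function mse_noliteralpow(ŷ, y)
    d = ŷ .- y
    sum(d .* d) / length(y)    # same value as Flux.mse(ŷ, y)
end

# Quick sanity check on plain CPU arrays:
mse_noliteralpow([1.0, 2.0], [1.0, 4.0])   # (0 + 4) / 2 = 2.0
```

The custom loss could then call `mse_noliteralpow` in place of `Flux.mse` until the upstream patch lands.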


I actually saw these earlier too, but thanks!

Do you know if my new loss definition is correct, though? I’m not sure whether regular array indexing works on the tracked model output.
