Hi, I was trying to build the berHu loss function from this paper:

I wanted to see if it outperforms the squared L2 norm in another regression problem I'm working on. Here is my attempt:

```julia
function berhu(x, y)
    x = model(x)
    loss = Tracker.collect(zeros(Float32, size(x)))
    bound = 0.2 * maximum(abs.(x - y))
    inbound = abs.(x - y) .<= bound
    loss[inbound] .= norm.((x - y)[inbound], 1)
    loss[.!inbound] .= (((x - y)[.!inbound]).^2 .+ bound^2) ./ (2 * bound)
    return loss
end
```

It works as intended with dummy variables when I comment out the `x = model(x)` line. When I run it with the model, the problem is the assignment into `loss[inbound]`, which was brought up in issue #93 on the Flux repo.

Stack trace of error:

```
ERROR: Can't differentiate `setindex!`
Stacktrace:
[1] #setindex!#369(::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}, ::Function, ::TrackedArray{…,SubArray{Float32,1,Array{Float32,1},Tuple{Array{Int64,1}},false}}, ::Float64, ::Int64) at C:\Users\me\.julia\packages\Tracker\JhqMQ\src\lib\array.jl:65
[2] setindex!(::TrackedArray{…,SubArray{Float32,1,Array{Float32,1},Tuple{Array{Int64,1}},false}}, ::Float64, ::Int64) at C:\Users\me\.julia\packages\Tracker\JhqMQ\src\lib\array.jl:65
[3] macro expansion at .\broadcast.jl:843 [inlined]
[4] macro expansion at .\simdloop.jl:73 [inlined]
[5] copyto! at .\broadcast.jl:842 [inlined]
[6] copyto! at .\broadcast.jl:797 [inlined]
[7] materialize!(::TrackedArray{…,SubArray{Float32,1,Array{Float32,1},Tuple{Array{Int64,1}},false}}, ::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1},Nothing,typeof(norm),Tuple{Array{Float64,1},Int64}}) at .\broadcast.jl:756
[8] top-level scope at none:0
```

How can I implement this loss function correctly?
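From issue #93 I gather that indexed assignment (`setindex!`) into a `TrackedArray` can't be differentiated, so maybe a mutation-free, broadcast-only formulation would sidestep the error. Here is a sketch of what I mean (the name `berhu_loss` is my own, and I haven't verified that the gradients come out right):

```julia
# Mutation-free berHu (reverse Huber) loss: selects between the L1 and
# scaled-L2 branches with a broadcast `ifelse`, so there is no `setindex!`
# for the AD tracker to choke on.
function berhu_loss(yhat, y)
    r = abs.(yhat .- y)   # elementwise absolute residuals
    c = 0.2 * maximum(r)  # threshold: 20% of the largest residual
    return ifelse.(r .<= c, r, (r .^ 2 .+ c^2) ./ (2c))
end
```

With the model, I would then call it as `loss(x, y) = sum(berhu_loss(model(x), y))`.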