I’m using Convex.jl for convex optimisation.
When constructing a function that I need to optimise, I have sometimes extended Convex's methods so that the same function is also compatible with plain arrays, for example:
```julia
# extend Convex method to accept plain arrays
function Convex.pos(x::Array)
    return max.(x, 0.0)
end
```
Then `pos(x)` can also be used like `relu` in Flux.
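To illustrate what I mean, here is a minimal, self-contained sketch (it repeats the extension above; `Convex` is assumed to be installed):

```julia
using Convex

# Same extension as above: let Convex's pos also accept
# plain numeric arrays, applied element-wise.
function Convex.pos(x::Array)
    return max.(x, 0.0)
end

pos([-1.0, 2.0, -3.0])  # element-wise, like relu → [0.0, 2.0, 0.0]
pos(Variable(3))        # still builds the usual Convex max atom
```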
I wonder whether it's possible to go the other way, extending functions from other packages for this purpose, and to apply them element-wise. For example,
```julia
function Flux.relu(x::Convex.Variable)
    # blahblah...
end
```
and
```julia
relu.(x)  # x: array
```
My simple test doesn't seem to work:
```julia
julia> function Flux.relu(x::Convex.Variable)
           return pos(x)
       end

julia> x = Variable(3)
Variable
size: (3, 1)
sign: real
vexity: affine
id: 451…306

julia> Flux.relu(x)
max (convex; positive)
├─ 3-element real variable (id: 451…306)
└─ 0

julia> Flux.relu.(x)
ERROR: MethodError: no method matching zero(::Convex.IndexAtom)
Closest candidates are:
  zero(::Type{Pkg.Resolve.VersionWeight}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/versionweights.jl:15
  zero(::Type{Pkg.Resolve.FieldValue}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.5/Pkg/src/Resolve/fieldvalues.jl:38
  zero(::Type{ModelingToolkit.TermCombination}) at /home/jinrae/.julia/packages/ModelingToolkit/1qEYb/src/linearity.jl:67
  ...
Stacktrace:
 [1] relu(::Convex.IndexAtom) at /home/jinrae/.julia/packages/NNlib/2Wxlq/src/activation.jl:63
 [2] _broadcast_getindex_evalf at ./broadcast.jl:648 [inlined]
 [3] _broadcast_getindex at ./broadcast.jl:621 [inlined]
 [4] getindex at ./broadcast.jl:575 [inlined]
 [5] copy at ./broadcast.jl:876 [inlined]
 [6] materialize(::Base.Broadcast.Broadcasted{Base.Broadcast.DefaultArrayStyle{1},Nothing,typeof(relu),Tuple{Array{Any,1}}}) at ./broadcast.jl:837
 [7] top-level scope at REPL[12]:1
```
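For what it's worth, my reading of the stacktrace: broadcasting splits the `Variable` into `Convex.IndexAtom`s and then hits NNlib's scalar `relu`, which compares against `zero(x)`, and no `zero` method exists for Convex expressions. A guess at a workaround (note this is type piracy on Convex's types, so I'm not sure it's safe; `Convex.Constant` is assumed to be the constant-expression type in my Convex version):

```julia
using Convex, Flux

# Give Convex expressions a zero, so NNlib's scalar
# relu (which compares x against zero(x)) has something
# to build a max atom against.
Base.zero(::Convex.AbstractExpr) = Convex.Constant(0)

x = Variable(3)
Flux.relu.(x)  # hoping for an array of max atoms, one per element
```

Is this the right direction, or is there an intended way to make element-wise extensions like this?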