Hello all, I have a use case where I need some custom layers in a Flux.Chain that depend on the initial vector passed to the chain (which changes between calls). I can construct the chain easily as below (with some dummy layers just for illustration), but I can't call Flux.destructure on it:
import Flux

n = 10
L1 = Flux.Dense(n, n, tanh)
L2 = Flux.Dense(n, 1, tanh)

# the middle layer is a closure over the input vector v
cust_chain(v) = Flux.Chain(L1, x -> v .* x, L2)

### test the code
test_in = rand(n)
cust_chain(test_in)(test_in)
### so far so good, but this fails:
Flux.destructure(cust_chain)
ERROR: ArgumentError: reducing over an empty collection is not allowed
Stacktrace:
  [1] _empty_reduce_error()
    @ Base ./reduce.jl:301
  [2] mapreduce_empty(f::Function, op::Function, T::Type)
    @ Base ./reduce.jl:344
  [3] reduce_empty(op::Base.MappingRF{typeof(eltype), typeof(promote_type)}, #unused#::Type{AbstractVector})
    @ Base ./reduce.jl:331
  [4] reduce_empty_iter
    @ ./reduce.jl:357 [inlined]
  [5] mapreduce_empty_iter(f::Function, op::Function, itr::Vector{AbstractVector}, ItrEltype::Base.HasEltype)
    @ Base ./reduce.jl:353
  [6] _mapreduce(f::typeof(eltype), op::typeof(promote_type), #unused#::IndexLinear, A::Vector{AbstractVector})
    @ Base ./reduce.jl:402
  [7] _mapreduce_dim
    @ ./reducedim.jl:330 [inlined]
  [8] #mapreduce#725
    @ ./reducedim.jl:322 [inlined]
  [9] mapreduce
    @ ./reducedim.jl:322 [inlined]
 [10] reduce(#unused#::typeof(vcat), A::Vector{AbstractVector})
    @ Base ./abstractarray.jl:1621
 [11] _flatten
    @ ~/.julia/packages/Optimisers/cLLV1/src/destructure.jl:67 [inlined]
 [12] destructure(x::Function)
    @ Optimisers ~/.julia/packages/Optimisers/cLLV1/src/destructure.jl:22
 [13] top-level scope
    @ REPL[68]:1
 [14] top-level scope
    @ ~/.julia/packages/CUDA/Uurn4/src/initialization.jl:52
Is there any way I can either bypass this and code it differently, or destructure the chain in some other way? I want to use the chain and optimize its parameters via DiffEqFlux.sciml_train.
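For context, one direction I was considering as a workaround (not sure it's idiomatic, and the `ScaleByVec` name is just something I made up) is to replace the anonymous closure with a small callable struct registered via `Flux.@functor`, and to destructure a concrete chain instance rather than the `cust_chain` function itself:

```julia
import Flux

n = 10

# Hypothetical wrapper layer: holds the vector v as a field
# instead of capturing it in an anonymous closure, so Functors
# can traverse it like any other layer.
struct ScaleByVec{V}
    v::V
end
(s::ScaleByVec)(x) = s.v .* x
Flux.@functor ScaleByVec

L1 = Flux.Dense(n, n, tanh)
L2 = Flux.Dense(n, 1, tanh)
cust_chain(v) = Flux.Chain(L1, ScaleByVec(v), L2)

test_in = rand(n)
chain = cust_chain(test_in)

# destructure a concrete Chain instance, not the cust_chain Function
p, re = Flux.destructure(chain)
```

One caveat I'm aware of: registering the struct with `@functor` would make `v` itself part of the flattened parameter vector `p`, which may or may not be what I want here.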