Axes not working on reshaped array?

Hi,
I’m confused about this behavior of axes:

using Distributions
using FillArrays
using LinearAlgebra
Nz = 5
My = 10
a = reshape(MvNormal(Fill(0.5, My*Nz), 0.01 * I), (My, Nz))
axes(a)
axes(a,1)

Calling axes on a works fine, but it throws a MethodError when I specify a dimension:

julia> axes(a)
(Base.OneTo(10), Base.OneTo(5))

julia> axes(a,1)
ERROR: MethodError: no method matching axes(::MatrixReshaped{Continuous, MvNormal{Float64, PDMats.ScalMat{Float64}, Fill{Float64, 1, Tuple{…}}}}, ::Int64)

Closest candidates are:
  axes(::Core.SimpleVector, ::Integer)
   @ Base essentials.jl:784
  axes(::SparseArrays.ReadOnly, ::Any...)
   @ SparseArrays ~/.julia/juliaup/julia-1.10.4+0.x64.linux.gnu/share/julia/stdlib/v1.10/SparseArrays/src/readonly.jl:20
  axes(::Base.Broadcast.Broadcasted{<:Any, <:Tuple{Vararg{T, N}} where T}, ::Integer) where N
   @ Base broadcast.jl:239
  ...

Stacktrace:
 [1] top-level scope
   @ REPL[4]:1
Some type information was truncated. Use `show(err)` to see complete types.

The problem is that the type of a is a MatrixReshaped, which despite the name is not an ordinary reshaped array: it is defined by Distributions.jl to be a subtype of Distribution, and in particular it is not a subtype of AbstractArray.

axes(a) works because the default axes(a) method in Base works for any argument type that defines size(a), whereas the default axes(a, d) method requires a::AbstractArray (probably because it implements the special AbstractArray behavior where d is allowed to exceed ndims(a), in which case it returns the singleton axis Base.OneTo(1)).

Possible solutions:

  1. add a method Base.axes(a::MatrixReshaped, d::Integer) = axes(a)[d] to Distributions.jl — update: filed as JuliaStats/Distributions.jl#1892 (“add missing axes(::ReshapedDistribution, k) method”)
  2. add a fallback method axes(A, d::Integer) = axes(A)[d] to Base

Option (1) will be quicker to get in the short term, and is a good thing anyway. Basically, if Distributions.jl wants to define its own “array-ish” type which is not an AbstractArray, they are responsible for defining the applicable array methods.
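
The fallback asymmetry is easy to reproduce with a minimal sketch (SizeOnly is a made-up type, not part of any package): a non-AbstractArray type that defines only size gets the generic axes(x) for free, but axes(x, d) is missing until a method like the one in option (1) is defined.

```julia
# Hypothetical "array-ish" type that is deliberately NOT an AbstractArray:
struct SizeOnly
    dims::Tuple{Int,Int}
end
Base.size(x::SizeOnly) = x.dims

x = SizeOnly((10, 5))

axes(x)      # works via the generic size-based fallback: (Base.OneTo(10), Base.OneTo(5))
# axes(x, 1) would throw a MethodError here, since that fallback requires an AbstractArray

# The fix from option (1), applied to our own type:
Base.axes(x::SizeOnly, d::Integer) = axes(x)[d]

axes(x, 1)   # now returns Base.OneTo(10)
```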

PS. The fact that the type is printed as MatrixReshaped at all is somewhat deceptive — this is a deprecated alias for ReshapedDistribution. Possibly Julia’s type printing could be improved so that it doesn’t show deprecated names. Update: this works if Distributions.jl uses Base.@deprecate_binding, so I’ve filed JuliaStats/Distributions.jl#1893 (“mark the MatrixReshaped type as deprecated”).


Thank you for clarifying. My confusion arises from trying to optimize a Turing model using this tip.

In my model I need to calculate a response function like

response(θ, sa) = sum(θ .* sa)

where sa is a Vector.

When I define my model as

@model function forward_model(signal,  sa_matrix)

    θ_guess = Array{Float64}(undef, My*Nz)
    for ix ∈ eachindex(θ_guess)
        θ_guess[ix] ~ Uniform(0.0, 1.0)
    end

    # Set variance prior
    σ² ~ truncated(Normal(0.0, 0.01); lower=0)

    μ = [response(θ_guess, s) for s ∈ eachslice(sa_matrix, dims=1)]
    signal ~ MvNormal(μ, σ² * I)
end

it works fine. The indexing errors come up when I try to use a Multivariate distribution:

θ_guess = MvNormal(Fill(0.5, My*Nz), 0.01 * I)

Does it work if you define the missing method

Base.axes(d::ReshapedDistribution, k::Integer) = Base.oneto(d.dims[k])

?

It should work if you use

θ_guess ~ MvNormal(Fill(0.5, My*Nz), 0.01 * I)

Otherwise Turing does not sample θ_guess but only sets it to the distribution - which then leads to the errors.


@stevengj Yes, that works. Thanks!

@devmotion You’re right, it works. I probably had an older definition of the function loaded in my REPL session as I was rewriting my code.

A minor comment: this can be written as

response(θ, sa) = dot(θ, sa)

which avoids the intermediate allocation (dot comes from LinearAlgebra).
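
As a quick sanity check (made-up numbers chosen to be exact in floating point), both forms agree, while dot skips the temporary array created by θ .* sa:

```julia
using LinearAlgebra

θ  = [0.5, 0.25, 0.25]
sa = [2.0, 4.0, 8.0]

sum(θ .* sa)  # allocates the intermediate vector θ .* sa → 4.0
dot(θ, sa)    # same result with no intermediate allocation → 4.0
```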
