Stripping type parameters


#1

Is there a way to strip the type parameters from a type? I have in mind something like this (pseudocode for the intent, not valid dispatch syntax):

f(x::T{U}) where T where U = T

#2

Usually you don’t need to. Do you have an example to illustrate a possible need for it?


#3

Here’s a hack.

julia> f(x) = eval(x.name.name)
julia> f(Complex{Int64})
Complex

I have no idea why this works, but it does.
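A possible explanation of why the hack works (this pokes at undocumented internals, so field names may differ across Julia versions): `.name` on a `DataType` is a `Core.TypeName`, whose own `name` field is the unqualified symbol, and `eval` then resolves that symbol in the current module.

```julia
# `Complex{Int64}.name` is a `Core.TypeName`; its `name` field is the
# bare symbol for the type, with the parameters already gone.
tn = (Complex{Int64}).name
tn.name            # :Complex

# `eval` looks the symbol up in the current module, which resolves back
# to the unparameterized type -- provided it is in scope there.
eval(tn.name)      # Complex
```

Note this only works when the type's name is actually bound in the module where `eval` runs, which is why it can feel fragile.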


#4

My use case is a bit complicated. I’m trying to write code that generically finds the maximum likelihood estimate of a distribution’s parameters from a sample. The optimizer needs derivatives, so I’m using ReverseDiff. ReverseDiff doesn’t work on distribution types that have already been instantiated with concrete parameter types. For example, the first function below doesn’t work, while the second (using the stripped `Normal` type) does:

using ReverseDiff
using Distributions

n = Normal()
sample = rand(n,1000);
f(x) = loglikelihood( typeof(n)(x...), sample )
ReverseDiff.gradient( f, [params(n)...] ) # doesn't work; returns zeros

g(x) = loglikelihood( Normal(x...), sample )
ReverseDiff.gradient( g, [params(n)...] ) # works

#5

Try

f(x) = loglikelihood( typeof(n).instance(x...), sample )

#6

That didn’t work for me. I’m on v0.6:

julia> h(x) = loglikelihood( typeof(n).instance(x...), sample )
h (generic function with 1 method)

julia> h([params(n)...])
ERROR: UndefRefError: access to undefined reference
Stacktrace:
 [1] h(::Array{Float64,1}) at ./none:1

julia> 

#7

Sorry about that. `typeof(Normal())` is `Distributions.Normal{Float64}`, which is not a function.
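For context, `instance` is a field on `DataType` that is only set for singleton types (structs with no fields), which is why the access threw `UndefRefError` rather than a method error. A small illustration (`Point` is just a made-up type for the example):

```julia
# Singleton types like Nothing carry their unique instance in the
# `instance` field of the DataType:
isdefined(Nothing, :instance)        # true

# A parameterized struct with fields is not a singleton, so the field
# is left undefined and accessing `.instance` throws UndefRefError:
struct Point{T}
    x::T
end
isdefined(Point{Int}, :instance)     # false
```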


#8

I finally got this:

using ReverseDiff
using Distributions

n = Normal()
sample = rand(n,1000);
f(x) = loglikelihood( (typeof(n).name.wrapper)(x...), sample )

julia> ReverseDiff.gradient( f, [params(n)...] )
2-element Array{Float64,1}:
  2.29938
 35.7245 

Is that what you expected?
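For posterity, the `.name.wrapper` trick can be packaged into a small helper. This is a sketch built on undocumented internals (`unparameterize` is my own name, not an API; `Base.typename` also handles `UnionAll` inputs on recent Julia versions):

```julia
# Strip the parameters from a type, returning the unparameterized
# UnionAll wrapper. Relies on internals: `Base.typename(T)` returns the
# `Core.TypeName`, whose `wrapper` field is the type without parameters.
unparameterize(T::Type) = Base.typename(T).wrapper

unparameterize(Complex{Int64})       # -> Complex
unparameterize(Array{Float64,2})     # -> Array
unparameterize(Array{T,1} where T)   # -> Array (partially applied types work too)
```

Unlike the `eval(x.name.name)` hack, this doesn’t depend on the type’s name being in scope in the calling module.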