Stripping type parameters

Is there a way to strip type parameters from a type? I have in mind something like this:

f(x::T{U}) where T where U = T

Usually you don’t need to. Do you have an example to illustrate a possible need for it?

Here’s a hack.

julia> f(x) = eval(x.name.name)
julia> f(Complex{Int64})
Complex

I have no idea why this works, but it does.
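
For what it’s worth, I think it works because `x.name` is the type’s `TypeName` object and `x.name.name` is just the bare `Symbol` `:Complex`; `eval` then looks that symbol up in the current module (which is also why it only works when that name is visible where you call `eval`). A quick sketch:

T = Complex{Int64}
T.name            # the type's TypeName object
T.name.name       # the bare Symbol :Complex
eval(T.name.name) # resolves :Complex in the current module, giving Complex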

My use case is a bit complicated. I’m trying to build some code to generically find the maximum likelihood estimate for a distribution on a sample. The optimizer needs derivatives, so I’m trying to use ReverseDiff. ReverseDiff doesn’t work on distributions that have been instantiated with concrete type parameters. For example, the first snippet below is simplified code that doesn’t work; the second (using the stripped `Normal` type) does:

using ReverseDiff
using Distributions

n = Normal()
sample = rand(n,1000);
f(x) = loglikelihood( typeof(n)(x...), sample )
ReverseDiff.gradient( f, [params(n)...] ) # doesn't work; returns zeros

g(x) = loglikelihood( Normal(x...), sample )
ReverseDiff.gradient( g, [params(n)...] ) # works
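
I suspect the difference is that `typeof(n)` is the fully parameterised `Normal{Float64}`, so constructing through it pins the fields to `Float64` and ReverseDiff’s tracked numbers get converted back to plain floats, while the bare `Normal` lets the type parameter follow whatever you pass in. The same thing happens with any parametric type (a toy sketch, not Distributions’ actual definition):

struct Point{T<:Real}
    x::T
    y::T
end

p = Point(1.0, 2.0)     # Point{Float64}
typeof(p)(1//2, 1//3)   # still Point{Float64}: the Rationals are converted to Float64
Point(1//2, 1//3)       # Point{Rational{Int64}}: the bare name lets T follow the inputs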

Try

f(x) = loglikelihood( typeof(n).instance(x...), sample )

That didn’t work for me. I’m on v0.6:

julia> h(x) = loglikelihood( typeof(n).instance(x...), sample )
h (generic function with 1 method)

julia> h([params(n)...])
ERROR: UndefRefError: access to undefined reference
Stacktrace:
 [1] h(::Array{Float64,1}) at ./none:1


Sorry about that. `typeof(Normal())` is `Distributions.Normal{Float64}`, which isn’t a singleton type, so it has no `instance` field.
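
For reference, the `instance` field is only set on singleton (zero-field) types, which you can check with `isdefined` (a quick sketch, using a made-up `Solo` type):

using Distributions

struct Solo end                                      # a singleton type
isdefined(Solo, :instance)                           # true; Solo.instance === Solo()
isdefined(Distributions.Normal{Float64}, :instance)  # false, hence the UndefRefError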

I finally got this:

using ReverseDiff
using Distributions

n = Normal()
sample = rand(n,1000);
f(x) = loglikelihood( (typeof(n).name.wrapper)(x...), sample )

julia> ReverseDiff.gradient( f, [params(n)...] )
2-element Array{Float64,1}:
  2.29938
 35.7245 

Is that what you expected?
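
By the way, the same `.name.wrapper` trick seems to answer the original question in general, at least for plain `DataType`s (a sketch, with Distributions loaded as above; I haven’t checked corner cases like `UnionAll` types):

striptype(T::DataType) = T.name.wrapper   # drop the parameters, keep the unparameterised type

striptype(Complex{Int64})      # Complex
striptype(typeof(Normal()))    # Distributions.Normal
striptype(Array{Float64,2})    # Array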