Symbolic random variables and distributions

Hi, I’m not sure how to phrase this question, so please bear with me:

I’ve written a function f which takes an input x and returns a vector [p_1(x), ..., p_K(x)], whose entries are the first K orthonormal polynomials with respect to some distribution \mu. I can check that my implementation is correct by sampling many X_i \sim \mu and verifying that the sample mean and sample covariance matrix of the vectors f(X_i) are roughly the zero vector and the identity matrix, respectively.
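For concreteness, here is a minimal sketch of that Monte Carlo check. The function f here is a hypothetical stand-in (the first three normalized probabilists' Hermite polynomials, which are orthonormal with respect to the standard normal); your actual f would replace it:

```julia
using Distributions, Statistics, LinearAlgebra

# Hypothetical stand-in for f: the first K = 3 normalized (probabilists')
# Hermite polynomials, orthonormal w.r.t. the standard normal distribution.
f(x) = [x, (x^2 - 1) / sqrt(2), (x^3 - 3x) / sqrt(6)]

μ = Normal()
N = 10^6
samples = [f(rand(μ)) for _ in 1:N]
A = reduce(hcat, samples)          # K × N matrix whose columns are f(X_i)

m = vec(mean(A; dims=2))           # sample mean: should be ≈ zero vector
C = cov(A; dims=2)                 # sample covariance: should be ≈ identity
```

The drawback, of course, is that "roughly zero" depends on N, which is what motivates the symbolic question below.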

I was wondering if there is a more systematic way of doing this, where I’m willing to restrict myself to distributions in Distributions.jl. My first thought, explained informally: define a symbolic variable X and pass it to f to get a vector of symbolic polynomials [p_1(X), ..., p_K(X)]; then compute the expected value of each polynomial under \mu and simplify, checking that each is zero. You could get the variance-covariance matrix in a similar way, since, with zero means, it is just the matrix of second moments E[p_i(X) p_j(X)].
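One way to sketch this idea without a full CAS: since f is built from arithmetic operations, you can pass it a `Polynomial` from Polynomials.jl as the "symbolic" variable X, read off the monomial coefficients, and apply linearity of expectation. Distributions.jl has (to my knowledge) no generic raw-moment function, so the sketch below computes E[X^n] by quadrature, and again uses a hypothetical Hermite-based f as a placeholder:

```julia
using Distributions, Polynomials, QuadGK, LinearAlgebra

# Hypothetical stand-in for f (first 3 normalized Hermite polynomials).
f(x) = [x, (x^2 - 1) / sqrt(2), (x^3 - 3x) / sqrt(6)]

# nth raw moment E[X^n], computed by quadrature against the pdf
# (a stand-in; Distributions.jl has no generic raw-moment function).
raw_moment(d, n) = n == 0 ? 1.0 :
    quadgk(x -> x^n * pdf(d, x), minimum(d), maximum(d))[1]

# E[p(X)] for a polynomial p, by linearity over its monomials.
# coeffs(p)[i] is the coefficient of x^(i-1).
expval(p::Polynomial, d) =
    sum(c * raw_moment(d, i - 1) for (i, c) in enumerate(coeffs(p)))

d = Normal()
X = Polynomial([0.0, 1.0])                  # the "symbolic" variable X
ps = f(X)                                   # vector of Polynomial objects

m = [expval(p, d) for p in ps]              # should be the zero vector
C = [expval(p * q, d) for p in ps, q in ps] # second moments; equals the
                                            # covariance when the means vanish
```

Here the check is exact up to quadrature error rather than Monte Carlo error; a true symbolic route (e.g. via Symbolics.jl) would follow the same structure but manipulate expressions instead of coefficient vectors.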

I’m curious to hear your thoughts!