Hello, good afternoon.

Is there an r2 utility function for a glm `GeneralizedLinearModel`?

`r2` only works with `LinearModel`.

Also, the coefficients from sklearn differ somewhat in value from glm's. Does anyone know if sklearn does some optimization behind the scenes?

I don’t think there is an unambiguous definition of R² for a generalized linear model. The definition for a linear model depends strongly on the properties of a linear model. So the fact that there isn’t an extractor for R² from a GLM is probably a good thing.

The implementation of glm produces the maximum likelihood estimates of the coefficients. I don’t know what the properties of the estimates returned from scikit-learn are but my guess is that they have some sort of regularization applied by default.
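That guess is easy to check. As a minimal sketch (assuming a recent scikit-learn), `LogisticRegression` does apply an L2 penalty by default, which shrinks the coefficients relative to a plain maximum-likelihood fit; turning the penalty off should bring the estimates closer to what glm returns:

```python
from sklearn.linear_model import LogisticRegression

# scikit-learn's LogisticRegression regularizes by default
# (penalty="l2", C=1.0), so its coefficients are shrunk
# relative to an unpenalized maximum-likelihood fit.
default_model = LogisticRegression()
print(default_model.penalty, default_model.C)  # l2 1.0

# Disabling the penalty (sklearn >= 1.2 accepts penalty=None)
# gives a plain MLE, which should be much closer to GLM's output.
mle_model = LogisticRegression(penalty=None)
```

So the "value differences" are expected unless the penalty is switched off.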


OP, perhaps you are confusing `lm` (vanilla OLS) with `glm` (an MLE estimator where errors are assumed normal).

I'm following a book in Python and programming the exercises in Julia. I got differences between sklearn and glm. The logistic curve is well fitted using sklearn; glm is a bit less precise.

Hi. No. I'm using glm in a logistic regression model. It's possible to obtain R² from a logistic regression. I already calculated it with logfit and the log data, but it would be nice to have a utility for that.

It looks like you have to choose a particular form of the pseudo-R-squared. Try `r2(model, :McFadden)`. See `?r2` for more details.
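For reference, McFadden's pseudo-R² (the `:McFadden` variant above) compares the log-likelihood of the fitted model against that of the null (intercept-only) model:

```latex
R^2_{\text{McFadden}} = 1 - \frac{\log \hat{L}(M_{\text{fitted}})}{\log \hat{L}(M_{\text{null}})}
```

It is 0 when the model explains nothing beyond the intercept and approaches 1 as the fit improves, but it is not the same quantity as OLS R².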


It works correctly. Thank you. Nice to see there's a symbol that makes it work with glm models.

That's it.