I think it’s important to understand what causes what. The issue discussed here does not originate in MLJ but is a known Julia issue; I’m just surprised more people haven’t encountered it. It’s bound to be fixed some day, and in the meantime the workaround should help.
Then there’s another question, which is whether what you get via MLJ and via scikit-learn differs. This is an entirely separate matter: there’s basically no reason they should differ, as MLJ calls the exact same code. The only possible source of difference is the default hyperparameters, which you do not set here. I aimed to match sklearn’s defaults for everything, but may have made a mistake somewhere in the hundreds of hyperparameters I came across.
Here you can help: run your script feeding the same standardised data to both MLJ and scikit-learn. If there’s a difference, it’s because of hyperparameters, not because MLJ does something weird.
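To make that comparison concrete, here is a rough sketch of what I mean. It is illustrative only: the model name (`ARDRegressor` / `ARDRegression`), the toy data, and the table-to-matrix conversion are my assumptions; adapt it to whatever model and standardised data you are actually using, and check `models()` for the exact MLJ names.

```julia
using MLJ
using ScikitLearn: @sk_import, fit!, predict

# Toy standardised data -- replace with your own X, y.
X = (x1 = randn(100), x2 = randn(100))
y = randn(100)

# Route 1: through MLJ's ScikitLearn wrapper, default hyperparameters.
ARDRegressor = @load ARDRegressor pkg=ScikitLearn
mach = machine(ARDRegressor(), X, y)
MLJ.fit!(mach)
yhat_mlj = MLJ.predict(mach, X)

# Route 2: ScikitLearn.jl directly, sklearn's own defaults.
@sk_import linear_model: ARDRegression
Xmat = hcat(X.x1, X.x2)
skl = ARDRegression()
fit!(skl, Xmat, y)
yhat_skl = predict(skl, Xmat)

# If the wrapper sets the same defaults, this should be ~0;
# a real gap points at a hyperparameter mismatch worth reporting.
@show maximum(abs.(yhat_mlj .- yhat_skl))
```

If the two outputs disagree, printing both models' hyperparameters side by side is usually enough to spot which default diverges.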
Finally, you should also be aware that some of these scikit-learn models may not be appropriate for how you use them, e.g. ARD; see my previous message. Again, it would be useful to compare the exact input/output between MLJ and scikit-learn and flag any differences in an issue on MLJModels; that will narrow down which model, if any, has improperly set hyperparameters.
Hope this makes sense.