Hi, I am wondering if there is any package with a 100% Julia “learning to rank” feature besides XGBoost.jl? (My dataset would be too wide to use XGBoost.)
I have checked MLRanking.jl, but it has been unmaintained for six years.
LightGBM.jl hasn’t exposed this feature yet.
CatBoost.jl just uses PyCall (with a Conda environment) to do it.
JLBoost.jl is deprecated (2020) and doesn’t work with DataFrames 1.0.
Beware that “learning to rank” can have another meaning: in recent papers, it has been used to describe models which output permutations. I guess you mean ranking features according to their importance?
Yes, something like lightgbm.LGBMRanker() but for Julia.
The most complete list of Julia ML models that I know is hosted by MLJ: List of Supported Models · MLJ
Perhaps one of them has the functionality you’re looking for?
Thanks, I’ll go through it, though just using the search/find feature I don’t see anything “rank”-related.
You won’t necessarily find “rank” in the name; for instance, random forests are known to provide feature rankings in a natural way.
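To make that concrete, here is a minimal sketch (pure Base Julia, with made-up importance scores as placeholders for whatever a random forest would report) of what ranking features by importance looks like:

```julia
# Hypothetical importance scores for four features (placeholder values,
# standing in for a model's feature-importance output)
importances = [0.10, 0.45, 0.05, 0.40]

# Feature indices ordered from most to least important
order = sortperm(importances; rev=true)  # [2, 4, 1, 3]

# Rank of each feature (1 = most important)
ranks = invperm(order)                   # [3, 1, 4, 2]
```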
I am not sure @Billpete002 means feature ranking.
If I understand the StackOverflow example correctly, lightgbm.LGBMRanker is about Learning to Rank.
I took “features” to mean the results, but yes, I mean LTR as per your wiki link.
Alright, this changes things! I don’t know if it is exactly what you want but my current research project allows you to take a non-differentiable operator, such as
ranking(x) = invperm(sortperm(x))
and insert it into a differentiable Flux model. See https://github.com/axelparmentier/InferOpt.jl for more details.
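As a quick sanity check, that operator just maps a score vector to the rank of each entry (1 = smallest value):

```julia
# Maps each entry of x to its rank among all entries (1 = smallest)
ranking(x) = invperm(sortperm(x))

ranking([0.3, 0.1, 0.5])  # [2, 1, 3]: 0.1 gets rank 1, 0.3 rank 2, 0.5 rank 3
```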