We are pleased to announce that we are publishing a fork of LightGBM.jl, a package originally authored by allardvm and modified by wakakausa for compatibility with Julia >= 1.0.
As of writing, the package is registered in the Julia General registry at version 0.2.1, and can be installed from the Pkg REPL:

```julia
] add LightGBM
```
The package is a C-FFI wrapper around Microsoft's LightGBM, a machine learning library implementing gradient-boosted decision trees in several variations (the tree-growing algorithms themselves also come in variations).
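To give a flavour of the wrapper's API, here is a minimal sketch of fitting a regression booster. The estimator name `LGBMRegression` and the keyword arguments shown are assumptions based on the package's documented interface and may differ between versions:

```julia
using LightGBM

# Small synthetic regression problem (purely illustrative).
X_train = randn(1000, 10)
y_train = randn(1000)

# Construct an estimator; keyword names here are assumptions and may
# differ in your installed version of the package.
estimator = LightGBM.LGBMRegression(
    objective = "regression",
    num_iterations = 100,
    learning_rate = 0.1,
)

# Fit against the training data, then predict on the same matrix.
LightGBM.fit!(estimator, X_train, y_train)
y_pred = LightGBM.predict(estimator, X_train)
```

Under the hood, calls like `fit!` and `predict` dispatch through the C FFI to the prebuilt LightGBM binary fetched at build time.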
This work is important to our group because we use LightGBM extensively for internal projects. As such, we intend to ensure this package is actively developed, kept up to date and maintained.
The improvements made to the package are as follows:
- Improved test coverage of the module; specifically, a much broader range of the FFI entry points is now tested
- Added a build step which downloads a prebuilt LightGBM binary from Microsoft's GitHub releases page
- Added a GitHub Actions setup to verify that incoming changes work correctly on macOS, Ubuntu, and Windows, and on all Julia versions >= 1.0
- Implemented an interface to the MLJ software ecosystem (tested with the help of the Alan Turing Institute team). This allows LightGBM to be easily chained as part of larger machine learning pipelines
- Removed the distinction between binary and multiclass classifiers
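The MLJ integration mentioned above lets the booster slot into MLJ's machine/model workflow. The following is a hedged sketch; the model name `LGBMClassifier` and its parameters are assumptions about the interface and may differ between versions:

```julia
using MLJ

# Look up the classifier in MLJ's model registry; the model name
# LGBMClassifier is an assumption here.
Booster = @load LGBMClassifier pkg=LightGBM

model = Booster(num_iterations = 50)

# Fit on MLJ's built-in iris demo dataset and predict class labels.
X, y = @load_iris
mach = machine(model, X, y)
fit!(mach)
yhat = predict_mode(mach, X)
```

Because the model participates in MLJ's standard `machine`/`fit!`/`predict` protocol, it can be composed with scalers, encoders, and tuning wrappers like any other MLJ model.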
Some features are still missing; for example, some users might wish to supply their own prebuilt binaries (for GPU acceleration, say). The issues page tracks these and other items. There is no specific roadmap for any of these features, but they will be worked on as capacity allows.
We welcome and encourage contributions from the wider community, whether that is a bug report, a discussion of a feature or a feature request, a pull request implementing a feature or fixing a bug, or simply a heads up that you've used or are using the package.
We are excited to become active contributors to the wider open source ecosystem and thank you for taking the time to read this. We would also like to thank the Alan Turing Institute team for their support with this project, in particular assistance with the MLJ integration.