| Field | Value |
|---|---|
| Crates.io | lightgbm3 |
| lib.rs | lightgbm3 |
| version | 1.0.2 |
| source | src |
| created_at | 2023-06-24 08:45:19.235553 |
| updated_at | 2023-07-19 05:39:47.370909 |
| description | Rust bindings for LightGBM library |
| homepage | |
| repository | https://github.com/Mottl/lightgbm3-rs |
| max_upload_size | |
| id | 898855 |
| size | 93,949 |
`lightgbm3` is based on the `lightgbm` crate (which is no longer supported), but it is not backward-compatible with it.

Add it to your project with:

```sh
cargo add lightgbm3
```
Since `lightgbm3` compiles LightGBM from source, you also need to install development libraries.

For Linux:

```sh
apt install -y cmake clang libclang-dev libc++-dev gcc-multilib
```

For macOS:

```sh
brew install cmake
brew install libomp # only required if you compile with the "openmp" feature
```

For Windows, set the `LIBCLANG_PATH` environment variable (e.g. `C:\Program Files\LLVM\bin`). Please see below for details.
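As an illustration (not taken from the crate's documentation), the variable can be set persistently from a Windows command prompt, assuming LLVM is installed in its default location:

```
rem Hypothetical example: persist LIBCLANG_PATH for the current user
rem (assumes LLVM was installed to C:\Program Files\LLVM)
setx LIBCLANG_PATH "C:\Program Files\LLVM\bin"
```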
Training:

```rust
use lightgbm3::{Dataset, Booster};
use serde_json::json;

let features = vec![vec![1.0, 0.1, 0.2],
                    vec![0.7, 0.4, 0.5],
                    vec![0.9, 0.8, 0.5],
                    vec![0.2, 0.2, 0.8],
                    vec![0.1, 0.7, 1.0]];
let labels = vec![0.0, 0.0, 0.0, 1.0, 1.0];
let dataset = Dataset::from_vec_of_vec(features, labels, true).unwrap();
let params = json!{
    {
        "num_iterations": 10,
        "objective": "binary",
        "metric": "auc",
    }
};
let bst = Booster::train(dataset, &params).unwrap();
bst.save_file("path/to/model.lgb").unwrap();
```
Inference:

```rust
use lightgbm3::Booster;

let bst = Booster::from_file("path/to/model.lgb").unwrap();
let features = vec![1.0, 2.0, -5.0];
let n_features = features.len();
let y_pred = bst
    .predict_with_params(&features, n_features as i32, true, "num_threads=1")
    .unwrap()[0];
```
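The flat slice passed to `predict_with_params` can also hold several rows laid out row-major (row after row), which is how the underlying LightGBM C API treats dense inputs. The sketch below assumes that behaviour and uses only the calls shown above:

```rust
use lightgbm3::Booster;

// Sketch: batch prediction with two rows flattened into one row-major slice.
// Assumption: the number of rows is inferred as slice_len / n_features,
// as the LightGBM C API does for dense matrices.
let bst = Booster::from_file("path/to/model.lgb").unwrap();
let n_features = 3;
let batch = vec![
    1.0, 2.0, -5.0, // row 0
    0.5, 0.1, 0.3,  // row 1
];
let preds = bst
    .predict_with_params(&batch, n_features, true, "num_threads=1")
    .unwrap();
println!("{preds:?}"); // predictions for both rows
```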
Look in the `./examples/` folder for more details.
`lightgbm3` supports the following features:

- `polars` for polars support
- `openmp` for multithreading via OpenMP
- `gpu` for GPU support
- `cuda` for experimental CUDA support

To run benchmarks:

```sh
cargo bench
```

Add `--features=openmp`, `--features=gpu` and `--features=cuda` as appropriate.
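The same feature names apply when you depend on the crate from your own project; for example (a hypothetical choice of feature, pick whichever matches your toolchain):

```sh
# Hypothetical example: enable the OpenMP feature when adding the dependency
cargo add lightgbm3 --features=openmp
```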
To work on the bindings themselves, clone the repository together with its submodules:

```sh
git clone --recursive https://github.com/Mottl/lightgbm3-rs.git
```
Great respect to vaaaaanquish for the original LightGBM Rust package, which unfortunately is no longer supported. Much reference was made to its implementation and documentation. Thanks.