| Field | Value |
|---|---|
| Crates.io | xgboost |
| lib.rs | xgboost |
| version | 0.1.4 |
| source | src |
| created_at | 2018-10-01 14:40:16.525725 |
| updated_at | 2019-03-05 14:41:09.792364 |
| description | Machine learning using XGBoost |
| homepage | https://github.com/davechallis/rust-xgboost |
| repository | https://github.com/davechallis/rust-xgboost |
| max_upload_size | |
| id | 87421 |
| size | 114,103 |
Rust bindings for the XGBoost gradient boosting library.
Basic usage example:
```rust
extern crate xgboost;

use xgboost::{parameters, dmatrix::DMatrix, booster::Booster};

fn main() {
    // training matrix with 5 training examples and 3 features
    let x_train = &[1.0, 1.0, 1.0,
                    1.0, 1.0, 0.0,
                    1.0, 1.0, 1.0,
                    0.0, 0.0, 0.0,
                    1.0, 1.0, 1.0];
    let num_rows = 5;
    let y_train = &[1.0, 1.0, 1.0, 0.0, 1.0];

    // convert training data into XGBoost's matrix format
    let mut dtrain = DMatrix::from_dense(x_train, num_rows).unwrap();

    // set ground truth labels for the training matrix
    dtrain.set_labels(y_train).unwrap();

    // test matrix with 1 row
    let x_test = &[0.7, 0.9, 0.6];
    let num_rows = 1;
    let y_test = &[1.0];
    let mut dtest = DMatrix::from_dense(x_test, num_rows).unwrap();
    dtest.set_labels(y_test).unwrap();

    // build overall training parameters
    let params = parameters::ParametersBuilder::default().build().unwrap();

    // specify datasets to evaluate against during training
    let evaluation_sets = &[(&dtrain, "train"), (&dtest, "test")];

    // train model, and print evaluation data
    let bst = Booster::train(&params, &dtrain, 3, evaluation_sets).unwrap();

    println!("{:?}", bst.predict(&dtest).unwrap());
}
```
See the examples directory for more detailed examples of different features.
Currently in a very early stage of development, so the API is likely to change as usability issues are found and new features are added.
Builds against XGBoost 0.81.
Tested:
Unsupported: