| field | value |
|-------|-------|
| Crates.io | egobox-moe |
| lib.rs | egobox-moe |
| version | 0.24.0 |
| source | src |
| created_at | 2022-04-12 13:41:51.133461 |
| updated_at | 2024-11-12 16:06:07.31017 |
| description | A library for mixture of expert gaussian processes |
| homepage | |
| repository | https://github.com/relf/egobox |
| max_upload_size | |
| id | 566438 |
| size | 197,809 |
`egobox-moe` provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts method of the SMT Python library.
`egobox-moe` is a library crate in the top-level package `egobox`.
`egobox-moe` currently implements a mixture of Gaussian processes: the experts are the Gaussian process models provided by `egobox-gp`, and the input space is clustered with a Gaussian mixture model (provided by `linfa-clustering/gmm`).
There are some usage examples in the examples/ directory. To run one, use:
$ cargo run --release --example clustering
Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).