| Field | Value |
|-----------------|-------|
| Crates.io | egobox-moe |
| lib.rs | egobox-moe |
| version | |
| source | src |
| created_at | 2022-04-12 13:41:51.133461 |
| updated_at | 2025-02-14 14:46:37.360817 |
| description | A library for mixture of expert gaussian processes |
| homepage | https://github.com/relf/egobox |
| repository | https://github.com/relf/egobox/crates/moe |
| max_upload_size | |
| id | 566438 |
egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts from the SMT Python library.
egobox-moe is a library crate in the top-level package egobox.
egobox-moe currently implements a mixture of Gaussian processes provided by egobox-gp, with clustering handled by linfa-clustering/gmm.
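As a rough conceptual sketch (not the egobox-moe API), the snippet below shows what a mixture of experts computes: Gaussian gating weights, playing the role of the GMM clustering, score each query point, and the local experts' predictions are blended accordingly. The two expert closures and the gating means and standard deviations are invented for illustration; in egobox-moe the experts would be Gaussian process surrogates fitted by egobox-gp.

```rust
use std::f64::consts::PI;

/// Density of a 1-D Gaussian; stands in for a GMM component's responsibility.
fn gaussian(x: f64, mean: f64, std_dev: f64) -> f64 {
    let z = (x - mean) / std_dev;
    (-0.5 * z * z).exp() / (std_dev * (2.0 * PI).sqrt())
}

/// Mixture-of-experts prediction: each local expert is weighted by the
/// normalized responsibility of its cluster at the query point `x`.
fn mixture_predict(x: f64) -> f64 {
    // Two hypothetical local surrogates (in egobox-moe these would be GP experts).
    let expert_left = |x: f64| x * x;
    let expert_right = |x: f64| 2.0 * x - 1.0;

    // Gating weights from two Gaussian components (means/std-devs made up here).
    let w_left = gaussian(x, -1.0, 1.0);
    let w_right = gaussian(x, 2.0, 1.0);
    let total = w_left + w_right;

    // Smooth recombination: weighted sum of the experts' predictions.
    (w_left * expert_left(x) + w_right * expert_right(x)) / total
}

fn main() {
    for &x in &[-2.0, 0.0, 1.0, 3.0] {
        println!("f({x:>4}) = {:.3}", mixture_predict(x));
    }
}
```

The weighted sum corresponds to a smooth recombination of the experts; the crate's actual fitting and prediction API is exercised in the examples mentioned below.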
There are some usage examples in the examples/ directory. To run one, use:
```console
$ cargo run --release --example clustering
```
Licensed under the Apache License, Version 2.0 (http://www.apache.org/licenses/LICENSE-2.0).