egobox-moe

Crates.io: egobox-moe
lib.rs: egobox-moe
version: 0.32.0
created_at: 2022-04-12 13:41:51.133461+00
updated_at: 2025-08-22 13:49:13.625711+00
description: A library for mixture of expert gaussian processes
homepage: https://github.com/relf/egobox
repository: https://github.com/relf/egobox/crates/moe
max_upload_size:
id: 566438
size: 209,467
owner: Rémi Lafage (relf)

documentation

README

Mixture of experts


egobox-moe provides a Rust implementation of the mixture of experts algorithm. It is a Rust port of the mixture of experts implementation from the SMT Python library.

The big picture

egobox-moe is a library crate in the top-level package egobox.

Current state

egobox-moe currently implements a mixture of the Gaussian process models provided by egobox-gp:

  • Clustering (linfa-clustering/gmm)
  • Hard recombination / Smooth recombination (see the sketch after this list)
  • Gaussian process model choice: specify the allowed regression and correlation models

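The recombination mode determines how the per-cluster experts are combined into a single prediction. The following standalone sketch only illustrates that idea with two hand-written experts; it does not use the egobox-moe API, and every name in it is made up for the example.

```rust
// Standalone illustration of hard vs smooth recombination of two experts.
// This is NOT the egobox-moe API; all names here are invented for the example.
fn main() {
    // Two local experts, each modelling the same function on its own cluster
    let experts: [fn(f64) -> f64; 2] = [|x| 2.0 * x, |x| 1.0 + 0.5 * x];

    // Cluster responsibilities at the query point (e.g. from a Gaussian
    // mixture model), normalized to sum to 1
    let x = 0.6;
    let resp = [0.3, 0.7];

    // Hard recombination: only the most responsible expert predicts
    let best = if resp[0] >= resp[1] { 0 } else { 1 };
    let y_hard = experts[best](x);

    // Smooth recombination: responsibility-weighted sum of all experts
    let y_smooth: f64 = experts.iter().zip(resp).map(|(f, w)| w * f(x)).sum();

    println!("hard = {y_hard:.3}, smooth = {y_smooth:.3}");
}
```

In egobox-moe the responsibilities come from the Gaussian mixture clustering (linfa-clustering/gmm) and the experts are the Gaussian process models fitted by egobox-gp.
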
Examples

There are some usage examples in the examples/ directory. To run one, use:

$ cargo run --release --example clustering

License

Licensed under the Apache License, Version 2.0 http://www.apache.org/licenses/LICENSE-2.0
