| Crates.io | rhmm |
| lib.rs | rhmm |
| version | 0.0.2 |
| created_at | 2026-01-19 12:05:02.301736+00 |
| updated_at | 2026-01-22 09:37:14.039664+00 |
| description | A rust implementation of hidden markov models. |
| homepage | |
| repository | https://github.com/unico-serein/rhmm |
| max_upload_size | |
| id | 2054438 |
| size | 140,896 |
A Rust library for Hidden Markov Models (HMM), inspired by Python's hmmlearn. This library provides efficient implementations of various HMM models and algorithms using ndarray for numerical computations.
Features:

- Multiple HMM model types
- Standard HMM algorithms
- Efficient implementation built on ndarray for fast numerical operations

Add this to your Cargo.toml:
```toml
[dependencies]
rhmm = "0.0.2"
```
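If you prefer the command line, `cargo add` will pull in the latest published version:

```sh
cargo add rhmm
```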
Or install from source:

```sh
git clone https://github.com/unico-serein/rhmm.git
cd rhmm
cargo build --release
```
Dependencies:

- ndarray - N-dimensional arrays
- ndarray-linalg - Linear algebra operations
- rand - Random number generation
- rand_distr - Probability distributions
- thiserror - Error handling
- serde - Serialization support

Basic usage with a Gaussian HMM:

```rust
use ndarray::array;
use rhmm::models::GaussianHMM;
use rhmm::base::HiddenMarkovModel;

fn main() {
    // Create training data
    let observations = array![
        [0.5, 1.0],
        [0.6, 1.1],
        [5.0, 6.0],
        [5.1, 6.2],
    ];

    // Create and train model with 2 hidden states
    let mut model = GaussianHMM::new(2);
    model.fit(&observations, None).unwrap();

    // Predict hidden states
    let states = model.predict(&observations).unwrap();
    println!("Predicted states: {:?}", states);

    // Calculate log-likelihood
    let log_prob = model.score(&observations).unwrap();
    println!("Log probability: {:.4}", log_prob);

    // Generate synthetic data
    let (sampled_obs, _sampled_states) = model.sample(10).unwrap();
    println!("Generated {} samples", sampled_obs.nrows());
}
```
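The example above calls `predict` and `score` separately; the trait's `decode` method (see the API section below) returns the best-path log probability and the state sequence in one call. A minimal sketch reusing the same toy data:

```rust
use ndarray::array;
use rhmm::models::GaussianHMM;
use rhmm::base::HiddenMarkovModel;

fn main() {
    let observations = array![
        [0.5, 1.0],
        [0.6, 1.1],
        [5.0, 6.0],
        [5.1, 6.2],
    ];

    let mut model = GaussianHMM::new(2);
    model.fit(&observations, None).unwrap();

    // decode returns the log probability of the best state path together with the path itself
    let (log_prob, state_path) = model.decode(&observations).unwrap();
    println!("Best-path log probability: {:.4}", log_prob);
    println!("State path: {:?}", state_path);
}
```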
A Beta HMM models observations bounded between 0 and 1, such as conversion rates:

```rust
use ndarray::array;
use rhmm::models::BetaHMM;
use rhmm::base::HiddenMarkovModel;

fn main() {
    // Conversion rates (values between 0 and 1)
    let observations = array![
        [0.12, 0.15], // Low conversion
        [0.10, 0.13], // Low conversion
        [0.75, 0.82], // High conversion
        [0.78, 0.85], // High conversion
    ];

    // Create and train model
    let mut model = BetaHMM::new(2);
    model.fit(&observations, None).unwrap();

    // Predict states
    let states = model.predict(&observations).unwrap();
    println!("States: {:?}", states);

    // Get learned parameters
    if let (Some(alphas), Some(betas)) = (model.alphas(), model.betas()) {
        println!("Alpha parameters: {:?}", alphas);
        println!("Beta parameters: {:?}", betas);
    }
}
```
All HMM models implement the `HiddenMarkovModel` trait:

```rust
pub trait HiddenMarkovModel {
    /// Get the number of hidden states
    fn n_states(&self) -> usize;

    /// Get the number of features/dimensions
    fn n_features(&self) -> usize;

    /// Fit the model to observed data
    fn fit(&mut self, observations: &Array2<f64>, lengths: Option<&[usize]>) -> Result<()>;

    /// Predict the most likely state sequence (Viterbi)
    fn predict(&self, observations: &Array2<f64>) -> Result<Array1<usize>>;

    /// Compute the log probability of observations
    fn score(&self, observations: &Array2<f64>) -> Result<f64>;

    /// Sample from the model
    fn sample(&self, n_samples: usize) -> Result<(Array2<f64>, Array1<usize>)>;

    /// Decode the most likely state sequence
    fn decode(&self, observations: &Array2<f64>) -> Result<(f64, Array1<usize>)>;
}
```
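Because every model implements this trait, downstream code can stay generic over the concrete model type. A minimal sketch using only the methods listed above (the `report` helper is ours, not part of the crate):

```rust
use ndarray::{array, Array2};
use rhmm::models::GaussianHMM;
use rhmm::base::HiddenMarkovModel;

// Hypothetical helper: works with any model implementing the trait.
fn report<M: HiddenMarkovModel>(model: &M, observations: &Array2<f64>) {
    println!("{} states over {} features", model.n_states(), model.n_features());

    // Total log-likelihood of the observations under the model
    let log_prob = model.score(observations).unwrap();
    println!("log P(observations) = {:.4}", log_prob);

    // Most likely hidden state for each observation (Viterbi)
    let states = model.predict(observations).unwrap();
    println!("states = {:?}", states);
}

fn main() {
    let observations = array![[0.5, 1.0], [0.6, 1.1], [5.0, 6.0], [5.1, 6.2]];
    let mut model = GaussianHMM::new(2);
    model.fit(&observations, None).unwrap();
    report(&model, &observations);
}
```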
Constructors for the available models:

```rust
// Gaussian HMM with continuous emissions
let model = GaussianHMM::new(n_states);
let model = GaussianHMM::with_covariance_type(n_states, CovarianceType::Diagonal);

// Beta HMM for observations between 0 and 1
let model = BetaHMM::new(n_states);

// Multinomial HMM for discrete observations
let model = MultinomialHMM::new(n_states, n_features);

// Gaussian mixture HMM with n_mix mixture components per state
let model = GaussianMixtureHMM::new(n_states, n_mix);
```
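None of the examples above exercise `MultinomialHMM`. The following is a speculative sketch that assumes it lives in `rhmm::models` alongside the other models and accepts per-symbol counts encoded as `f64` in an `Array2` (mirroring hmmlearn's MultinomialHMM); treat the data layout as an assumption, not documented behavior:

```rust
use ndarray::array;
use rhmm::models::MultinomialHMM;
use rhmm::base::HiddenMarkovModel;

fn main() {
    // Assumed layout: one row per time step, one column per symbol, values are counts.
    let observations = array![
        [3.0, 0.0, 1.0],
        [2.0, 1.0, 0.0],
        [0.0, 4.0, 2.0],
        [0.0, 3.0, 3.0],
    ];

    // 2 hidden states, 3 features/symbols
    let mut model = MultinomialHMM::new(2, 3);
    model.fit(&observations, None).unwrap();

    let states = model.predict(&observations).unwrap();
    println!("States: {:?}", states);
}
```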
Run the included examples:

```sh
# Beta HMM example (conversion rate analysis)
cargo run --example beta_hmm_example

# Polars integration example
cargo run --example polars_example
```
Run the test suite:

```sh
# Run all tests
cargo test

# Run with output
cargo test -- --nocapture

# Run specific test
cargo test integration_tests
```
The library is optimized for performance, using ndarray for efficient numerical operations.

Train on multiple sequences of different lengths by concatenating them row-wise and passing the length of each sequence to `fit`:

```rust
let observations = array![/* concatenated sequences */];
let lengths = vec![10, 15, 20]; // Length of each sequence
model.fit(&observations, Some(&lengths)).unwrap();
```
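A concrete sketch of the same idea (the values are made up; the sequence lengths must sum to the number of rows):

```rust
use ndarray::array;
use rhmm::models::GaussianHMM;
use rhmm::base::HiddenMarkovModel;

fn main() {
    // Two sequences concatenated row-wise: 3 observations + 4 observations
    let observations = array![
        // sequence 1
        [0.5, 1.0],
        [0.6, 1.1],
        [0.4, 0.9],
        // sequence 2
        [5.0, 6.0],
        [5.1, 6.2],
        [4.9, 5.8],
        [5.2, 6.1],
    ];
    let lengths = vec![3, 4]; // must sum to observations.nrows()

    let mut model = GaussianHMM::new(2);
    model.fit(&observations, Some(lengths.as_slice())).unwrap();

    println!("States: {:?}", model.predict(&observations).unwrap());
}
```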
Custom initialization before fitting:

```rust
let mut model = GaussianHMM::new(3);

// Set custom initial parameters before fitting
// model.set_start_prob(...);
// model.set_transition_matrix(...);

model.fit(&observations, None).unwrap();
```
`GaussianHMM` supports several covariance structures:

```rust
use rhmm::base::CovarianceType;

// Diagonal covariance (default)
let model = GaussianHMM::with_covariance_type(3, CovarianceType::Diagonal);

// Spherical covariance (single variance)
let model = GaussianHMM::with_covariance_type(3, CovarianceType::Spherical);

// Full covariance matrix
let model = GaussianHMM::with_covariance_type(3, CovarianceType::Full);
```
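One way to choose between them is to fit each variant and compare scores, ideally on held-out data. A rough sketch using only the constructor and `score` shown above (the data here is purely illustrative):

```rust
use ndarray::array;
use rhmm::models::GaussianHMM;
use rhmm::base::{CovarianceType, HiddenMarkovModel};

fn main() {
    // Illustrative data; in practice compare on a held-out set.
    let observations = array![
        [0.5, 1.0],
        [0.6, 1.1],
        [5.0, 6.0],
        [5.1, 6.2],
    ];

    let variants = [
        ("spherical", CovarianceType::Spherical),
        ("diagonal", CovarianceType::Diagonal),
        ("full", CovarianceType::Full),
    ];

    for (name, cov) in variants {
        let mut model = GaussianHMM::with_covariance_type(2, cov);
        model.fit(&observations, None).unwrap();
        println!("{name}: log-likelihood = {:.4}", model.score(&observations).unwrap());
    }
}
```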
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
To contribute, create a feature branch (`git checkout -b feature/AmazingFeature`), commit your changes (`git commit -m 'Add some AmazingFeature'`), push the branch (`git push origin feature/AmazingFeature`), and open a Pull Request.

This project is licensed under the MIT License - see the LICENSE file for details.
Star ⭐ this repository if you find it helpful!