| Crates.io | sklears-covariance |
| lib.rs | sklears-covariance |
| version | 0.1.0-beta.1 |
| created_at | 2025-10-13 14:53:11.162256+00 |
| updated_at | 2026-01-01 21:36:26.967599+00 |
| description | Covariance estimation algorithms |
| homepage | https://github.com/cool-japan/sklears |
| repository | https://github.com/cool-japan/sklears |
| max_upload_size | |
| id | 1880627 |
| size | 1,819,699 |
Latest release: 0.1.0-beta.1 (January 1, 2026). See the workspace release notes for highlights and upgrade guidance.
sklears-covariance is the most comprehensive covariance estimation library available in any programming language, featuring 90+ algorithms that range from classic statistical methods to cutting-edge quantum-inspired and privacy-preserving techniques. Built in Rust for maximum performance, it delivers 5-20x speedups over scikit-learn while maintaining a scikit-learn-compatible API.
Quick start: fit an empirical estimator and a Ledoit-Wolf shrinkage estimator on the same data.

```rust
use sklears_covariance::{EmpiricalCovariance, LedoitWolf};
use sklears_core::traits::Fit;
use scirs2_core::ndarray::Array2;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Generate or load your data: 100 samples x 10 features
    let X = Array2::from_shape_vec((100, 10), (0..1000).map(|x| x as f64).collect())?;

    // Empirical covariance estimation
    let empirical = EmpiricalCovariance::new();
    let fitted = empirical.fit(&X.view(), &())?;
    let cov = fitted.get_covariance();

    // Shrinkage estimation with Ledoit-Wolf
    let lw = LedoitWolf::new();
    let fitted_lw = lw.fit(&X.view(), &())?;
    let shrunk_cov = fitted_lw.get_covariance();
    let shrinkage = fitted_lw.get_shrinkage(); // estimated shrinkage intensity
    Ok(())
}
```
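Conceptually, the Ledoit-Wolf estimate is a convex combination of the sample covariance S with a scaled identity target: Σ̂ = (1 − δ)·S + δ·(tr(S)/p)·I, where δ is the shrinkage intensity returned by get_shrinkage(). Below is a minimal standalone sketch of that combination with a fixed illustrative δ (the library estimates δ from the data); shrink_toward_identity is a hypothetical helper, not a crate API, and it assumes ArrayView2 is re-exported by scirs2_core::ndarray alongside Array2.

```rust
use scirs2_core::ndarray::{Array2, ArrayView2};

/// Convex combination of the sample covariance with a scaled-identity
/// target: (1 - delta) * S + delta * (tr(S) / p) * I.
/// Hypothetical helper for illustration only.
fn shrink_toward_identity(s: ArrayView2<f64>, delta: f64) -> Array2<f64> {
    let p = s.nrows();
    let mu = s.diag().sum() / p as f64; // tr(S) / p, the average variance
    let mut out = s.to_owned() * (1.0 - delta);
    for i in 0..p {
        out[(i, i)] += delta * mu; // pull the diagonal toward mu
    }
    out
}
```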
DataFrame integration: attach feature names and metadata, then fit directly on the frame.

```rust
use sklears_covariance::{CovarianceDataFrame, DataFrameEstimator, LedoitWolf};
use scirs2_core::ndarray::Array2;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Two named features, 50 samples
    let data = Array2::from_shape_vec((50, 2), (0..100).map(|x| x as f64).collect())?;

    // Create DataFrame with metadata (None = no extra metadata)
    let df = CovarianceDataFrame::new(
        data,
        vec!["feature1".to_string(), "feature2".to_string()],
        None,
    )?;

    // Fit estimator with DataFrame
    let estimator = LedoitWolf::new();
    let result = estimator.fit_dataframe(&df)?;

    // Rich result with feature names and metadata
    println!("Covariance shape: {:?}", result.covariance.shape());
    println!("Feature names: {:?}", result.feature_names);
    println!("Estimator: {}", result.estimator_info.name);
    Ok(())
}
```
Hyperparameter tuning: cross-validated search over a declared parameter space.

```rust
use sklears_covariance::{
    CovarianceHyperparameterTuner, ParameterSpec, ScoringMethod,
    SearchStrategy, TuningConfig,
};
use scirs2_core::ndarray::Array2;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let X = Array2::from_shape_vec((100, 10), (0..1000).map(|x| x as f64).collect())?;

    // Configure hyperparameter tuning: 5-fold CV, Bayesian search
    let config = TuningConfig {
        n_cv_folds: 5,
        scoring: ScoringMethod::LogLikelihood,
        search_strategy: SearchStrategy::BayesianOptimization { n_initial: 10, n_iter: 50 },
        ..Default::default()
    };

    // Define the parameter space: one continuous shrinkage parameter
    let params = vec![
        ParameterSpec::Continuous {
            name: "alpha".to_string(),
            low: 0.01,
            high: 1.0,
        },
    ];

    // Run tuning
    let tuner = CovarianceHyperparameterTuner::new(config);
    let result = tuner.tune(&X, params)?;
    println!("Best parameters: {:?}", result.best_params);
    println!("Best score: {}", result.best_score);
    Ok(())
}
```
Automatic model selection: let a selector characterize the data and pick an estimator.

```rust
use sklears_covariance::{AutoCovarianceSelector, model_selection_presets};

// Use a preset selector tuned for high-dimensional data...
let selector = model_selection_presets::high_dimensional_selector();

// ...or build a custom selector (factory closures elided)
let selector = AutoCovarianceSelector::builder()
    .add_estimator("EmpiricalCovariance", |data| { /* factory fn */ })
    .add_estimator("LedoitWolf", |data| { /* factory fn */ })
    .add_estimator("GraphicalLasso", |data| { /* factory fn */ })
    .build();

// Select the best model for the data and report why it won
let result = selector.select(&X)?;
println!("Selected: {}", result.best_estimator);
println!("Reason: {}", result.selection_reason);
```
Check out the comprehensive examples in the examples/ directory:

- advanced_covariance_analysis.rs: Complete pipeline with matrix analysis, benchmarking, and cross-validation
- polars_dataframe_demo.rs: DataFrame integration with financial data analysis
- covariance_hyperparameter_tuning_demo.rs: Advanced hyperparameter optimization strategies
- automatic_model_selection_demo.rs: Intelligent model selection with data characterization
- comprehensive_cookbook.rs: 6 complete recipes from quick start to production deployment

The 90+ estimators span the following families:

- Empirical and shrinkage: EmpiricalCovariance, ShrunkCovariance, LedoitWolf, OAS, Rao-Blackwell Ledoit-Wolf, Chen-Stein, Nonlinear Shrinkage, Rotation-Equivariant Shrinkage
- Robust and regularized: MinCovDet, FastMCD, EllipticEnvelope, Huber, Ridge, Lasso, Elastic Net, Adaptive Lasso, Group Lasso
- Sparse precision and graphical models: GraphicalLasso, CLIME, Neighborhood Selection, SPACE, TIGER, BigQUIC, Robust PCA, Low-Rank + Sparse
- Decomposition and factor models: PCA (6 variants), ICA (4 algorithms), NMF (5 algorithms), Sparse Factor Models, Factor Analysis
- Missing data and matrix optimization: EM for Missing Data, IPF, Alternating Projections (5 variants), Frank-Wolfe (6 variants), Coordinate Descent
- Probabilistic and adaptive: Bayesian (5 methods), Time-Varying (5 methods), Non-parametric (8 methods)
- Specialized techniques: Differential Privacy, Information Theory, Meta-Learning, Quantum-inspired, Federated Learning, Adversarial Robustness
Contributions are welcome! See the main sklears repository for contribution guidelines.
Licensed under either of Apache License, Version 2.0 or MIT license at your option.