| Crates.io | amari-optimization |
| lib.rs | amari-optimization |
| version | 0.17.0 |
| created_at | 2025-10-22 01:07:17.567253+00 |
| updated_at | 2026-01-11 22:47:20.063605+00 |
| description | Geometric optimization algorithms leveraging Amari's Tropical-Dual-Clifford fusion |
| homepage | https://github.com/justinelliottcobb/Amari |
| repository | https://github.com/justinelliottcobb/Amari |
| max_upload_size | |
| id | 1894738 |
| size | 288,721 |
Advanced optimization algorithms for geometric computing with information geometry, tropical algebra, and multi-objective optimization.
Amari Optimization provides state-of-the-art optimization algorithms designed for mathematical and scientific computing. The library emphasizes type safety, performance, and mathematical rigor with compile-time verification through phantom types.
Add to your `Cargo.toml` (version per the crate metadata above):

```toml
[dependencies]
amari-optimization = "0.17"
```
```rust
use amari_optimization::prelude::*;

// Natural gradient optimization
let config = NaturalGradientConfig::default();
let optimizer = NaturalGradientOptimizer::new(config);

// Multi-objective optimization
let nsga2_config = MultiObjectiveConfig::default();
let nsga2 = NsgaII::new(nsga2_config);

// Tropical optimization
let tropical_optimizer = TropicalOptimizer::with_default_config();
```
```rust
use amari_optimization::prelude::*;
use amari_optimization::phantom::{NonConvex, SingleObjective, Statistical, Unconstrained};

struct GaussianMLE {
    data: Vec<f64>,
}

impl ObjectiveWithFisher<f64> for GaussianMLE {
    // Negative log-likelihood of N(mu, sigma^2) with params = [mu, log_sigma]
    fn evaluate(&self, params: &[f64]) -> f64 {
        let (mu, log_sigma) = (params[0], params[1]);
        let sigma2 = (2.0 * log_sigma).exp();
        let n = self.data.len() as f64;
        let sse: f64 = self.data.iter().map(|x| (x - mu).powi(2)).sum();
        n * log_sigma + sse / (2.0 * sigma2) + 0.5 * n * (2.0 * std::f64::consts::PI).ln()
    }

    // Gradient of the negative log-likelihood in (mu, log_sigma)
    fn gradient(&self, params: &[f64]) -> Vec<f64> {
        let (mu, log_sigma) = (params[0], params[1]);
        let sigma2 = (2.0 * log_sigma).exp();
        let n = self.data.len() as f64;
        let sum_dev: f64 = self.data.iter().map(|x| x - mu).sum();
        let sse: f64 = self.data.iter().map(|x| (x - mu).powi(2)).sum();
        vec![-sum_dev / sigma2, n - sse / sigma2]
    }

    // Fisher information of (mu, log_sigma) is diagonal: diag(n / sigma^2, 2n)
    fn fisher_information(&self, params: &[f64]) -> Vec<Vec<f64>> {
        let sigma2 = (2.0 * params[1]).exp();
        let n = self.data.len() as f64;
        vec![vec![n / sigma2, 0.0], vec![0.0, 2.0 * n]]
    }
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let data = vec![1.0, 2.0, 3.0, 2.5, 1.8]; // Your data
    let objective = GaussianMLE { data };

    let config = NaturalGradientConfig::default();
    let optimizer = NaturalGradientOptimizer::new(config);

    // Create a type-safe problem specification
    let problem: OptimizationProblem<2, Unconstrained, SingleObjective, NonConvex, Statistical> =
        OptimizationProblem::new();

    let initial_params = vec![0.0, 0.0]; // mu, log_sigma
    let result = optimizer.optimize_statistical(&problem, &objective, initial_params)?;

    println!("Converged: {}", result.converged);
    println!("Final parameters: {:?}", result.parameters);
    Ok(())
}
```
```rust
use amari_optimization::prelude::*;
use amari_optimization::phantom::{Euclidean, MultiObjective, NonConvex, Unconstrained};

struct EngineeringDesign;

impl MultiObjectiveFunction<f64> for EngineeringDesign {
    fn num_objectives(&self) -> usize { 3 }
    fn num_variables(&self) -> usize { 3 }

    fn evaluate(&self, variables: &[f64]) -> Vec<f64> {
        let thickness = variables[0];
        let width = variables[1];
        let height = variables[2];

        let volume = thickness * width * height;
        let weight = volume * 2.7; // kg
        let cost = weight * 3.5; // $
        let strength = -(thickness.sqrt() * width * height).powf(0.3); // Negated for minimization

        vec![weight, cost, strength]
    }

    fn variable_bounds(&self) -> Vec<(f64, f64)> {
        vec![(0.001, 0.1), (0.1, 2.0), (0.1, 2.0)]
    }
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let problem_fn = EngineeringDesign;
    let config = MultiObjectiveConfig::default();
    let nsga2 = NsgaII::new(config);

    let problem: OptimizationProblem<3, Unconstrained, MultiObjective, NonConvex, Euclidean> =
        OptimizationProblem::new();

    let result = nsga2.optimize(&problem, &problem_fn)?;

    println!("Pareto front size: {}", result.pareto_front.solutions.len());
    if let Some(hv) = result.pareto_front.hypervolume {
        println!("Hypervolume: {:.6}", hv);
    }
    Ok(())
}
```
```rust
use amari_optimization::prelude::*;
use amari_tropical::TropicalMatrix;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Task scheduling with dependencies: entry (i, j) is the delay from
    // task i to task j, with NEG_INFINITY marking the absence of an edge
    let dependency_data = vec![
        vec![0.0, 2.0, f64::NEG_INFINITY, 1.0],
        vec![f64::NEG_INFINITY, 0.0, 1.0, 2.0],
        vec![3.0, f64::NEG_INFINITY, 0.0, 1.0],
        vec![f64::NEG_INFINITY, f64::NEG_INFINITY, f64::NEG_INFINITY, 0.0],
    ];
    let dependency_matrix = TropicalMatrix::from_log_probs(&dependency_data);

    let optimizer = TropicalOptimizer::with_default_config();

    // Solve for the critical cycle time (tropical eigenvalue)
    let eigen_result = optimizer.solve_tropical_eigenvalue(&dependency_matrix)?;
    if let Some(critical_time) = eigen_result.eigenvalue {
        println!("Critical cycle time: {:.2} hours", critical_time);
    }
    Ok(())
}
```
The `examples/` directory contains comprehensive demonstrations:

- `natural_gradient_example.rs`: Complete Gaussian parameter estimation with Fisher information
- `tropical_optimization_example.rs`: Task scheduling, resource allocation, and shortest path problems
- `multi_objective_example.rs`: Engineering design trade-offs and environmental optimization

Run examples with:

```bash
cargo run --example natural_gradient_example
cargo run --example tropical_optimization_example
cargo run --example multi_objective_example
```
`ObjectiveWithFisher<T>`

For natural gradient optimization on statistical manifolds:

```rust
trait ObjectiveWithFisher<T: Float> {
    fn evaluate(&self, params: &[T]) -> T;
    fn gradient(&self, params: &[T]) -> Vec<T>;
    fn fisher_information(&self, params: &[T]) -> Vec<Vec<T>>;
}
```
`MultiObjectiveFunction<T>`

For multi-objective Pareto optimization:

```rust
trait MultiObjectiveFunction<T: Float> {
    fn num_objectives(&self) -> usize;
    fn num_variables(&self) -> usize;
    fn evaluate(&self, variables: &[T]) -> Vec<T>;
    fn variable_bounds(&self) -> Vec<(T, T)>;
    fn evaluate_constraints(&self, variables: &[T]) -> Vec<T> { vec![] }
    fn ideal_point(&self) -> Option<Vec<T>> { None }
}
```
`ConstrainedObjective<T>`

For constrained optimization problems:

```rust
trait ConstrainedObjective<T: Float> {
    fn evaluate(&self, x: &[T]) -> T;
    fn gradient(&self, x: &[T]) -> Vec<T>;
    fn evaluate_constraints(&self, x: &[T]) -> Vec<T>;
    fn constraint_gradients(&self, x: &[T]) -> Vec<Vec<T>>;
    fn variable_bounds(&self) -> Vec<(T, T)>;
}
```
`NaturalGradientOptimizer<T>`

Information-geometric optimization:

```rust
impl<T: Float> NaturalGradientOptimizer<T> {
    pub fn new(config: NaturalGradientConfig<T>) -> Self;
    pub fn optimize_statistical<const DIM: usize, O: ObjectiveWithFisher<T>>(
        &self,
        problem: &OptimizationProblem<DIM, Unconstrained, SingleObjective, NonConvex, Statistical>,
        objective: &O,
        initial_params: Vec<T>,
    ) -> Result<NaturalGradientResult<T>, OptimizationError>;
}
```
`NsgaII<T>`

Multi-objective evolutionary optimization:

```rust
impl<T: Float> NsgaII<T> {
    pub fn new(config: MultiObjectiveConfig<T>) -> Self;
    pub fn optimize<const DIM: usize, O: MultiObjectiveFunction<T>>(
        &self,
        problem: &OptimizationProblem<DIM, Unconstrained, MultiObjective, NonConvex, Euclidean>,
        objective: &O,
    ) -> Result<MultiObjectiveResult<T>, OptimizationError>;
}
```
`TropicalOptimizer`

Max-plus algebra optimization:

```rust
impl TropicalOptimizer {
    pub fn with_default_config() -> Self;
    pub fn solve_tropical_eigenvalue(
        &self,
        matrix: &TropicalMatrix,
    ) -> Result<TropicalEigenResult, OptimizationError>;
    pub fn solve_tropical_linear_program(
        &self,
        objective: &[TropicalNumber],
        constraints: &TropicalMatrix,
        rhs: &[TropicalNumber],
    ) -> Result<TropicalLPResult, OptimizationError>;
    pub fn solve_shortest_path(
        &self,
        matrix: &TropicalMatrix,
        start: usize,
        end: usize,
    ) -> Result<ShortestPathResult, OptimizationError>;
}
```
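A minimal usage sketch for `solve_shortest_path`, reusing the `from_log_probs` constructor from the tropical example above. The graph data here is hypothetical, and the fields of `ShortestPathResult` are not documented in this README, so the result is only checked for success:

```rust
use amari_optimization::prelude::*;
use amari_tropical::TropicalMatrix;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Adjacency matrix in the tropical encoding used earlier:
    // NEG_INFINITY marks a missing edge.
    let edges = vec![
        vec![0.0, 4.0, f64::NEG_INFINITY],
        vec![f64::NEG_INFINITY, 0.0, 3.0],
        vec![f64::NEG_INFINITY, f64::NEG_INFINITY, 0.0],
    ];
    let graph = TropicalMatrix::from_log_probs(&edges);

    let optimizer = TropicalOptimizer::with_default_config();
    let _result = optimizer.solve_shortest_path(&graph, 0, 2)?;
    Ok(())
}
```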
The library uses phantom types for compile-time verification:

```rust
// Constraint states
struct Constrained;
struct Unconstrained;

// Objective states
struct SingleObjective;
struct MultiObjective;

// Convexity states
struct Convex;
struct NonConvex;

// Manifold types
struct Euclidean;
struct Statistical;

// Problem specification
struct OptimizationProblem<const DIM: usize, C, O, V, M> {
    // Zero-sized phantom fields
}
```
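The payoff is that mismatched solver/problem pairs are rejected at compile time. A sketch, reusing only the signatures shown above:

```rust
use amari_optimization::phantom::{
    Euclidean, MultiObjective, NonConvex, SingleObjective, Statistical, Unconstrained,
};
use amari_optimization::prelude::*;

fn main() {
    // A statistical, single-objective problem: accepted by
    // `NaturalGradientOptimizer::optimize_statistical`.
    let _stat: OptimizationProblem<2, Unconstrained, SingleObjective, NonConvex, Statistical> =
        OptimizationProblem::new();

    // A Euclidean, multi-objective problem: accepted by `NsgaII::optimize`.
    // Passing this to `optimize_statistical` instead would be a
    // compile-time error, because the phantom parameters do not match
    // that method's signature.
    let _multi: OptimizationProblem<3, Unconstrained, MultiObjective, NonConvex, Euclidean> =
        OptimizationProblem::new();
}
```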
```rust
pub struct NaturalGradientConfig<T: Float> {
    pub learning_rate: T,
    pub max_iterations: usize,
    pub gradient_tolerance: T,
    pub parameter_tolerance: T,
    pub fisher_regularization: T,
    pub use_line_search: bool,
    pub line_search_beta: T,
    pub line_search_alpha: T,
}

pub struct MultiObjectiveConfig<T: Float> {
    pub population_size: usize,
    pub max_generations: usize,
    pub crossover_probability: T,
    pub mutation_probability: T,
    pub mutation_strength: T,
    pub elite_ratio: T,
    pub convergence_tolerance: T,
    pub reference_point: Option<Vec<T>>,
    pub preserve_diversity: bool,
}

pub struct ConstrainedConfig<T: Float> {
    pub method: ConstrainedMethod,
    pub penalty_parameter: T,
    pub penalty_increase_factor: T,
    pub barrier_parameter: T,
    pub max_outer_iterations: usize,
    pub max_inner_iterations: usize,
    pub tolerance: T,
    pub gradient_tolerance: T,
}
```
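Since the fields are public and the quick start shows `Default` implementations, configs can be customized from their defaults via struct-update syntax. A sketch with illustrative values, not recommendations:

```rust
use amari_optimization::prelude::*;

fn main() {
    // Start from the defaults and override selected fields.
    let config = MultiObjectiveConfig::<f64> {
        population_size: 200,
        max_generations: 500,
        ..MultiObjectiveConfig::default()
    };
    let _nsga2 = NsgaII::new(config);
}
```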
The library is optimized for performance. Run benchmarks with:

```bash
cargo bench
```
Enable optional features in `Cargo.toml`:

```toml
[dependencies]
amari-optimization = { version = "0.17", features = ["gpu", "serde", "parallel"] }
```
Available features:

- `gpu`: GPU acceleration support
- `serde`: Serialization support for configurations and results
- `parallel`: Parallel processing for multi-objective algorithms
- `linalg`: Enhanced linear algebra operations

Natural gradient optimization uses the Fisher information metric to define gradients on statistical manifolds, providing parameter-invariant optimization that converges faster than standard gradient descent for statistical problems.
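To make the update concrete, here is a minimal self-contained sketch of a single natural gradient step, θ ← θ − η F(θ)⁻¹∇L(θ), specialized to a diagonal Fisher matrix. It illustrates the rule itself, not the crate's internal implementation:

```rust
/// One natural gradient step with a diagonal Fisher matrix.
/// `fisher_diag[i]` holds F_ii(theta); a small ridge term keeps the
/// inverse well-conditioned (cf. `fisher_regularization` in the config).
fn natural_gradient_step(
    params: &mut [f64],
    gradient: &[f64],
    fisher_diag: &[f64],
    learning_rate: f64,
    regularization: f64,
) {
    for ((p, g), f) in params.iter_mut().zip(gradient).zip(fisher_diag) {
        // Precondition the Euclidean gradient by the inverse Fisher metric.
        *p -= learning_rate * g / (f + regularization);
    }
}
```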
Tropical optimization uses the max-plus semiring where addition becomes max and multiplication becomes addition. This algebra is naturally suited for scheduling, timing analysis, and discrete optimization problems.
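For intuition, here is the semiring in miniature, in plain Rust and independent of `amari_tropical`'s actual types:

```rust
/// Max-plus ("tropical") semiring on f64, with NEG_INFINITY as the
/// additive identity. Illustrative only; see `amari_tropical` for the
/// crate's `TropicalNumber` type.
fn tropical_add(a: f64, b: f64) -> f64 {
    a.max(b) // tropical addition is max
}

fn tropical_mul(a: f64, b: f64) -> f64 {
    a + b // tropical multiplication is ordinary addition
}

fn main() {
    // Longest of two two-edge paths: max(2 + 1, 3 + 4) = 7
    let path_a = tropical_mul(2.0, 1.0);
    let path_b = tropical_mul(3.0, 4.0);
    assert_eq!(tropical_add(path_a, path_b), 7.0);
}
```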
NSGA-II uses evolutionary principles with Pareto dominance ranking and crowding distance to maintain diversity while converging to the Pareto front of optimal trade-off solutions.
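The core comparison behind that ranking is Pareto dominance. A minimal sketch, assuming every objective is minimized as in the examples above:

```rust
/// Returns true if `a` Pareto-dominates `b`: no objective is worse
/// and at least one is strictly better (all objectives minimized).
fn dominates(a: &[f64], b: &[f64]) -> bool {
    let no_worse = a.iter().zip(b).all(|(x, y)| x <= y);
    let strictly_better = a.iter().zip(b).any(|(x, y)| x < y);
    no_worse && strictly_better
}

fn main() {
    assert!(dominates(&[1.0, 2.0], &[1.5, 2.0])); // better weight, equal cost
    assert!(!dominates(&[1.0, 3.0], &[1.5, 2.0])); // a trade-off: neither dominates
}
```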
The library implements classical constrained optimization methods, including penalty and barrier approaches (see the `penalty_parameter`, `penalty_increase_factor`, and `barrier_parameter` fields of `ConstrainedConfig` above); a sketch of the penalty formulation follows.
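The textbook quadratic-penalty formulation, with constraints gᵢ(x) ≤ 0, penalty weight μ matching `penalty_parameter`, and growth factor β matching `penalty_increase_factor` (the crate's exact scheme may differ):

```latex
\min_{x} \; f(x) + \mu \sum_{i} \max\bigl(0,\; g_i(x)\bigr)^2,
\qquad \mu \leftarrow \beta \mu \ \text{after each outer iteration}
```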
The library includes comprehensive testing:

```bash
# Unit tests
cargo test

# Integration tests
cargo test --test integration_tests

# Property-based tests
cargo test --test property_tests

# Documentation tests
cargo test --doc
```
Contributions are welcome! Please ensure:
- `cargo test`
- `cargo fmt`
- `cargo clippy`
- `cargo bench`

Licensed under either of
- Apache License, Version 2.0
- MIT license

at your option.
For more information, visit the [Amari](https://github.com/justinelliottcobb/Amari) project documentation.