| Crates.io | hdbscan |
| lib.rs | hdbscan |
| version | 0.9.0 |
| source | src |
| created_at | 2024-02-14 12:24:32.99931 |
| updated_at | 2024-11-13 18:22:05.987761 |
| description | HDBSCAN clustering in pure Rust. A huge improvement on DBSCAN, capable of identifying clusters of varying densities. |
| repository | https://github.com/tom-whitehead/hdbscan |
| id | 1139825 |
| size | 87,089 |
Hierarchical Density-Based Spatial Clustering of Applications with Noise ("HDBSCAN")
HDBSCAN clustering algorithm in pure Rust. Generic over floating point numeric types.
HDBSCAN is a powerful clustering algorithm that can be used to effectively find clusters in real world data. The main benefits of HDBSCAN are that:
- It does not require the number of clusters to be specified up front.
- Unlike DBSCAN, it can identify clusters of varying densities.
- Outliers are labelled as noise, rather than being forced into an ill-fitting cluster.
This implementation owes a debt to the Python scikit-learn implementation of this algorithm, without which this implementation would not have been possible. The "How HDBSCAN Works" article referenced below is invaluable in understanding this algorithm better.
Several variations of HDBSCAN are possible. Notably, a nearest neighbours algorithm is used to calculate the distance from a point to its k-th nearest neighbour. This is a crucial input for estimating the density of points in the vector space. Currently, this implementation only supports the K-d Tree nearest neighbours algorithm for this. While K-d Tree is the best candidate for most use cases, in the future I hope to support other nearest neighbour algorithms to make this implementation more flexible (as per the scikit-learn Python implementation).
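To make the k-th nearest neighbour distance concrete, here is a minimal, brute-force sketch of the idea (this is an illustration of the concept, not the crate's internals, which use a K-d Tree). In HDBSCAN terminology this quantity is the point's core distance: densely packed points have small core distances, while isolated points have large ones.

```rust
// Euclidean distance between two points of equal dimension.
fn euclidean(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| (x - y).powi(2)).sum::<f64>().sqrt()
}

// Brute-force distance from point `idx` to its k-th nearest neighbour.
// A K-d Tree answers the same query without scanning every point.
fn core_distance(data: &[Vec<f64>], idx: usize, k: usize) -> f64 {
    let mut dists: Vec<f64> = data
        .iter()
        .enumerate()
        .filter(|(i, _)| *i != idx)
        .map(|(_, p)| euclidean(&data[idx], p))
        .collect();
    dists.sort_by(|a, b| a.partial_cmp(b).unwrap());
    dists[k - 1] // k-th smallest distance
}

fn main() {
    let data = vec![
        vec![1.0, 1.0],
        vec![1.1, 1.0],
        vec![1.2, 1.1],
        vec![10.0, 10.0], // isolated outlier
    ];
    // The outlier's core distance dwarfs that of the dense points.
    let dense = core_distance(&data, 0, 2);
    let sparse = core_distance(&data, 3, 2);
    assert!(sparse > dense);
}
```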
Further, this implementation uses Prim's algorithm to find the minimum spanning tree of the points. Prim's algorithm performs best for dense vectors, and therefore for most use cases. However, Kruskal's algorithm is an alternative that would perform better on sparse vectors.
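As an illustration of the Prim's step (a concept sketch, not the crate's internals), the array-based O(n²) form of Prim's algorithm below builds a minimum spanning tree over a fully connected graph of points. HDBSCAN runs this on mutual reachability distances; plain Euclidean distance is used here for brevity.

```rust
fn euclidean(a: &[f64], b: &[f64]) -> f64 {
    a.iter().zip(b).map(|(x, y)| (x - y).powi(2)).sum::<f64>().sqrt()
}

/// Returns the MST as (from, to, weight) edges using Prim's algorithm.
fn prims_mst(data: &[Vec<f64>]) -> Vec<(usize, usize, f64)> {
    let n = data.len();
    let mut in_tree = vec![false; n];
    // best[i] = (cheapest known edge weight into i, tree vertex it connects to)
    let mut best = vec![(f64::INFINITY, 0usize); n];
    in_tree[0] = true;
    for i in 1..n {
        best[i] = (euclidean(&data[0], &data[i]), 0);
    }
    let mut mst = Vec::with_capacity(n - 1);
    for _ in 1..n {
        // Pick the cheapest vertex not yet in the tree.
        let (next, _) = best
            .iter()
            .enumerate()
            .filter(|(i, _)| !in_tree[*i])
            .min_by(|a, b| a.1.0.partial_cmp(&b.1.0).unwrap())
            .unwrap();
        in_tree[next] = true;
        mst.push((best[next].1, next, best[next].0));
        // Relax edges from the newly added vertex.
        for i in 0..n {
            if !in_tree[i] {
                let d = euclidean(&data[next], &data[i]);
                if d < best[i].0 {
                    best[i] = (d, next);
                }
            }
        }
    }
    mst
}

fn main() {
    let data = vec![vec![0.0, 0.0], vec![0.0, 1.0], vec![5.0, 0.0], vec![5.0, 1.0]];
    let mst = prims_mst(&data);
    // n - 1 edges; the single longest edge bridges the two point pairs.
    assert_eq!(3, mst.len());
    let max_w = mst.iter().map(|e| e.2).fold(0.0_f64, f64::max);
    assert!((max_w - 5.0).abs() < 1e-9);
}
```

In HDBSCAN, the heaviest edges of this tree are the ones cut first when extracting the cluster hierarchy.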
Basic usage, with default hyper parameters:

```rust
use std::collections::HashSet;
use hdbscan::Hdbscan;

let data: Vec<Vec<f32>> = vec![
    vec![1.5, 2.2],
    vec![1.0, 1.1],
    vec![1.2, 1.4],
    vec![0.8, 1.0],
    vec![1.1, 1.0],
    vec![3.7, 4.0],
    vec![3.9, 3.9],
    vec![3.6, 4.1],
    vec![3.8, 3.9],
    vec![4.0, 4.1],
    vec![10.0, 10.0],
];
let clusterer = Hdbscan::default_hyper_params(&data);
let labels = clusterer.cluster().unwrap();
// First five points form one cluster
assert_eq!(1, labels[..5].iter().collect::<HashSet<_>>().len());
// Next five points are a second cluster
assert_eq!(1, labels[5..10].iter().collect::<HashSet<_>>().len());
// The final point is noise
assert_eq!(-1, labels[10]);
```
The hyper parameters can be customised via a builder:

```rust
use std::collections::HashSet;
use hdbscan::{DistanceMetric, Hdbscan, HdbscanHyperParams, NnAlgorithm};

let data: Vec<Vec<f32>> = vec![
    vec![1.3, 1.1],
    vec![1.3, 1.2],
    vec![1.2, 1.2],
    vec![1.0, 1.1],
    vec![0.9, 1.0],
    vec![0.9, 1.0],
    vec![3.7, 4.0],
];
let hyper_params = HdbscanHyperParams::builder()
    .min_cluster_size(3)
    .min_samples(2)
    .dist_metric(DistanceMetric::Manhattan)
    .nn_algorithm(NnAlgorithm::BruteForce)
    .build();
let clusterer = Hdbscan::new(&data, hyper_params);
let labels = clusterer.cluster().unwrap();
// First three points form one cluster
assert_eq!(1, labels[..3].iter().collect::<HashSet<_>>().len());
// Next three points are a second cluster
assert_eq!(1, labels[3..6].iter().collect::<HashSet<_>>().len());
// The final point is noise
assert_eq!(-1, labels[6]);
```
Cluster centers can also be calculated once the data has been clustered:

```rust
use hdbscan::{Center, Hdbscan};

let data: Vec<Vec<f32>> = vec![
    vec![1.5, 2.2],
    vec![1.0, 1.1],
    vec![1.2, 1.4],
    vec![0.8, 1.0],
    vec![1.1, 1.0],
    vec![3.7, 4.0],
    vec![3.9, 3.9],
    vec![3.6, 4.1],
    vec![3.8, 3.9],
    vec![4.0, 4.1],
    vec![10.0, 10.0],
];
let clusterer = Hdbscan::default_hyper_params(&data);
let labels = clusterer.cluster().unwrap();
let centroids = clusterer.calc_centers(Center::Centroid, &labels).unwrap();
// One centroid per cluster; noise points are excluded
assert_eq!(2, centroids.len());
assert!(centroids.contains(&vec![3.8, 4.0]) && centroids.contains(&vec![1.12, 1.34]));
```
Campello, R.J.G.B.; Moulavi, D.; Sander, J. Density-based clustering based on hierarchical density estimates. PAKDD 2013.
How HDBSCAN Works. Leland McInnes, John Healy, Steve Astels.
Dual-licensed to be compatible with the Rust project.
Licensed under the Apache License, Version 2.0 http://www.apache.org/licenses/LICENSE-2.0 or the MIT license http://opensource.org/licenses/MIT, at your option. This file may not be copied, modified, or distributed except according to those terms.