| Crates.io | catsu |
| --- | --- |
| lib.rs | catsu |
| version | 0.1.7 |
| created_at | 2026-01-07 08:39:14.962454+00 |
| updated_at | 2026-01-16 00:24:40.724305+00 |
| description | High-performance embeddings client for multiple providers |
| homepage | |
| repository | https://github.com/chonkie-inc/catsu |
| max_upload_size | |
| id | 2027809 |
| size | 225,038 |

A unified, batteries-included client for embedding APIs that actually works.
The world of embedding API clients is broken.
Catsu fixes this. It's a high-performance, unified client built specifically for embeddings with:
- 🎯 A clean, consistent API across all providers
- 🔄 Built-in retry logic with exponential backoff
- 💰 Automatic usage and cost tracking
- 📚 Rich model metadata and capability discovery
- ⚡ Rust core with Python bindings for maximum performance
Add to your Cargo.toml:

```toml
[dependencies]
catsu = "0.1"
tokio = { version = "1", features = ["full"] }
```
Then generate embeddings:

```rust
use catsu::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create a client (reads API keys from the environment)
    let client = Client::new()?;

    // Generate embeddings
    let response = client.embed(
        "openai:text-embedding-3-small",
        vec!["Hello, world!".to_string(), "How are you?".to_string()],
    ).await?;

    println!("Dimensions: {}", response.dimensions);
    println!("Tokens used: {}", response.usage.tokens);
    println!("Embedding: {:?}", &response.embeddings[0][..5]);

    Ok(())
}
```
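Since `embed` takes a `Vec<String>`, you can send a whole batch of chunks in one call and pair each text with its vector afterwards. A minimal sketch using only the calls shown above; it assumes the vectors in `response.embeddings` come back in the same order as the inputs (the pairing loop is ours, not a catsu API):

```rust
use catsu::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new()?;

    // A small batch of document chunks, embedded in a single request.
    let chunks: Vec<String> = vec![
        "Catsu is an embeddings client.".to_string(),
        "It speaks to multiple providers.".to_string(),
        "Responses include usage information.".to_string(),
    ];

    let response = client.embed("openai:text-embedding-3-small", chunks.clone()).await?;

    // Assumes one vector per input, returned in input order.
    for (text, vector) in chunks.iter().zip(response.embeddings.iter()) {
        println!("{} -> {} floats", text, vector.len());
    }

    Ok(())
}
```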
For finer control, pass an input-type hint and a target output dimension with `embed_with_options`:

```rust
use catsu::{Client, InputType};

let response = client.embed_with_options(
    "openai:text-embedding-3-small",
    vec!["Search query".to_string()],
    Some(InputType::Query), // input type hint
    Some(256),              // output dimensions
).await?;
```
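A common retrieval pattern is to embed documents and queries at the same output dimensionality and rank by cosine similarity. A sketch of that flow under two assumptions not confirmed by the snippets above: that `None` is accepted for the input-type hint, and that embeddings are returned as `Vec<f32>`; the `cosine` helper is ours, not part of catsu:

```rust
use catsu::{Client, InputType};

// Plain cosine similarity; not a catsu API, assumes f32 vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (na * nb)
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new()?;
    let model = "openai:text-embedding-3-small";

    // Embed documents at the same reduced dimensionality as the query.
    let docs = vec![
        "Rust is a systems programming language.".to_string(),
        "Catsu wraps several embedding providers.".to_string(),
    ];
    let doc_resp = client.embed_with_options(model, docs.clone(), None, Some(256)).await?;

    // Embed the query with the Query input-type hint.
    let query_resp = client.embed_with_options(
        model,
        vec!["What does catsu do?".to_string()],
        Some(InputType::Query),
        Some(256),
    ).await?;

    for (doc, vector) in docs.iter().zip(doc_resp.embeddings.iter()) {
        println!("{:.3}  {}", cosine(&query_resp.embeddings[0], vector), doc);
    }

    Ok(())
}
```

Matching the output dimensions on both sides matters when you request reduced dimensions, since vectors of different lengths can't be compared directly.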
Discover available models and their capabilities:

```rust
// List all available models
let models = client.list_models(None);

// Filter by provider
let openai_models = client.list_models(Some("openai"));
for model in openai_models {
    println!("{}: {} dims", model.name, model.dimensions);
}
```
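The metadata from `list_models` can also drive model selection in code. A small sketch that picks the lowest-dimensional OpenAI model, using only the `name` and `dimensions` fields shown above and assuming `dimensions` is an integer:

```rust
use catsu::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new()?;

    // Pick the smallest OpenAI model by embedding dimensions.
    let openai_models = client.list_models(Some("openai"));
    if let Some(smallest) = openai_models.iter().min_by_key(|m| m.dimensions) {
        println!("Smallest OpenAI model: {} ({} dims)", smallest.name, smallest.dimensions);
    }

    Ok(())
}
```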
Looking for Python? See the Python documentation.

```bash
pip install catsu
```

```python
from catsu import Client

client = Client()
response = client.embed("openai:text-embedding-3-small", ["Hello, world!"])
print(f"Dimensions: {response.dimensions}")
```
Can't find your favorite model or provider? Open an issue and we'll add it!
For guidelines on contributing, see CONTRIBUTING.md.
If you found this helpful, consider giving it a ⭐!
made with ❤️ by chonkie, inc.