| Crates.io | openrouter-rs |
| lib.rs | openrouter-rs |
| version | 0.4.5 |
| created_at | 2025-03-23 13:36:49.64741+00 |
| updated_at | 2025-07-29 04:13:01.951839+00 |
| description | A type-safe OpenRouter Rust SDK |
| homepage | https://github.com/realmorrisliu/openrouter-rs |
| repository | https://github.com/realmorrisliu/openrouter-rs |
| max_upload_size | |
| id | 1602676 |
| size | 232,635 |
A type-safe, async Rust SDK for the OpenRouter API - Access 200+ AI models with ease
Built on `tokio` for high-performance concurrent operations, with `futures` powering the streaming API.

Add to your `Cargo.toml`:

```toml
[dependencies]
openrouter-rs = "0.4.5"
tokio = { version = "1", features = ["full"] }
futures-util = "0.3" # needed for the streaming examples below
```
```rust
use openrouter_rs::{OpenRouterClient, api::chat::*, types::Role};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create client
    let client = OpenRouterClient::builder()
        .api_key("your_api_key")
        .build()?;

    // Send chat completion
    let request = ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![
            Message::new(Role::User, "Explain Rust ownership in simple terms"),
        ])
        .build()?;

    let response = client.send_chat_completion(&request).await?;
    println!("{}", response.choices[0].content().unwrap_or(""));

    Ok(())
}
```
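Hard-coding the key is fine for a demo, but you'll usually want it out of source code. A small sketch (assuming the builder accepts any string-like key, as the literal above suggests) that reads it from the environment instead:

```rust
// Sketch: read the API key from the environment instead of hard-coding it.
// Assumes OPENROUTER_API_KEY is set, as in the example commands further down.
let api_key = std::env::var("OPENROUTER_API_KEY")?;
let client = OpenRouterClient::builder()
    .api_key(&api_key)
    .build()?;
```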
Leverage chain-of-thought processing with reasoning tokens:
```rust
use openrouter_rs::types::Effort;

let request = ChatCompletionRequest::builder()
    .model("deepseek/deepseek-r1")
    .messages(vec![Message::new(Role::User, "What's bigger: 9.9 or 9.11?")])
    .reasoning_effort(Effort::High) // Enable deep reasoning
    .reasoning_max_tokens(2000)     // Control reasoning depth
    .build()?;

let response = client.send_chat_completion(&request).await?;

// Access both reasoning and final answer
println!("🧠 Reasoning: {}", response.choices[0].reasoning().unwrap_or(""));
println!("💡 Answer: {}", response.choices[0].content().unwrap_or(""));
```
Process responses as they arrive:
```rust
use futures_util::StreamExt;

let stream = client.stream_chat_completion(&request).await?;

stream
    .filter_map(|event| async { event.ok() })
    .for_each(|chunk| async move {
        if let Some(content) = chunk.choices[0].content() {
            print!("{}", content); // Print as it arrives
        }
    })
    .await;
```
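If you'd rather collect the streamed text than print it, a minimal sketch using `StreamExt::fold` over the same stream shown above:

```rust
use futures_util::StreamExt;

// Sketch: accumulate streamed content into a single String.
let full_text = client
    .stream_chat_completion(&request)
    .await?
    .filter_map(|event| async { event.ok() })
    .fold(String::new(), |mut acc, chunk| async move {
        if let Some(content) = chunk.choices[0].content() {
            acc.push_str(content);
        }
        acc
    })
    .await;

println!("{full_text}");
```

Because `fold` moves the accumulator through each step, it sidesteps the mutable-capture restrictions that `for_each` closures run into.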
Use curated model collections:
```rust
use openrouter_rs::config::OpenRouterConfig;

let config = OpenRouterConfig::default();

// Three built-in presets:
// • programming: Code generation and development
// • reasoning: Advanced problem-solving models
// • free: Free-tier models for experimentation
println!("Available models: {:?}", config.get_resolved_models());
```
Handle OpenRouter-specific failures with typed error variants:

```rust
use openrouter_rs::error::OpenRouterError;

match client.send_chat_completion(&request).await {
    Ok(_) => println!("Success!"),
    Err(OpenRouterError::ModerationError { reasons, .. }) => {
        eprintln!("Content flagged: {:?}", reasons);
    }
    Err(OpenRouterError::ApiError { code, message }) => {
        eprintln!("API error {}: {}", code, message);
    }
    Err(e) => eprintln!("Other error: {}", e),
}
```
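Building on the same call, a minimal retry sketch (not part of the SDK; it assumes every error is transient, which real code should narrow by matching the variants above):

```rust
use std::time::Duration;

// Sketch: retry up to 3 times with linear backoff.
// Assumes any Err is retryable; match specific variants in real code.
let mut response = None;
for attempt in 1u64..=3 {
    match client.send_chat_completion(&request).await {
        Ok(r) => {
            response = Some(r);
            break;
        }
        Err(e) => {
            eprintln!("Attempt {attempt} failed: {e}");
            tokio::time::sleep(Duration::from_secs(attempt)).await;
        }
    }
}

if let Some(r) = response {
    println!("{}", r.choices[0].content().unwrap_or(""));
}
```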
| Feature | Status | Module |
|---|---|---|
| Chat Completions | ✅ | `api::chat` |
| Text Completions | ✅ | `api::completion` |
| Reasoning Tokens | ✅ | `api::chat` |
| Streaming Responses | ✅ | `api::chat` |
| Model Information | ✅ | `api::models` |
| API Key Management | ✅ | `api::api_keys` |
| Credit Management | ✅ | `api::credits` |
| Authentication | ✅ | `api::auth` |
Browse models by category:

```rust
use openrouter_rs::types::ModelCategory;

let models = client
    .list_models_by_category(ModelCategory::Programming)
    .await?;
println!("Found {} programming models", models.len());
```
Customize the client with attribution headers and a custom endpoint:

```rust
let client = OpenRouterClient::builder()
    .api_key("your_key")
    .http_referer("https://yourapp.com")
    .x_title("My AI App")
    .base_url("https://openrouter.ai/api/v1") // Custom endpoint
    .build()?;
```
Combine streaming with reasoning tokens to watch both channels in real time:

```rust
use futures_util::StreamExt;

// Box::pin makes the stream Unpin so we can call .next() on it directly.
let mut stream = Box::pin(client.stream_chat_completion(
    &ChatCompletionRequest::builder()
        .model("anthropic/claude-sonnet-4")
        .messages(vec![Message::new(Role::User, "Solve this step by step: 2x + 5 = 13")])
        .reasoning_effort(Effort::High)
        .build()?,
).await?);

let mut reasoning_buffer = String::new();
let mut content_buffer = String::new();

// A while-let loop lets us push into both buffers; a for_each closure
// could not hold mutable borrows of them across its async blocks.
while let Some(event) = stream.next().await {
    let Ok(chunk) = event else { continue };
    if let Some(reasoning) = chunk.choices[0].reasoning() {
        reasoning_buffer.push_str(reasoning);
        print!("🧠"); // Show reasoning progress
    }
    if let Some(content) = chunk.choices[0].content() {
        content_buffer.push_str(content);
        print!("💬"); // Show content progress
    }
}

println!("\n🧠 Reasoning: {}", reasoning_buffer);
println!("💡 Answer: {}", content_buffer);
```
Run the bundled examples from a checkout of the repository:

```sh
# Clone the repository to run the examples
git clone https://github.com/realmorrisliu/openrouter-rs
cd openrouter-rs

# Set your API key
export OPENROUTER_API_KEY="your_key_here"

# Basic chat completion
cargo run --example send_chat_completion

# Reasoning tokens demo
cargo run --example chat_with_reasoning

# Streaming responses
cargo run --example stream_chat_completion

# Streaming with reasoning
cargo run --example stream_chat_with_reasoning
```
Found a bug? Please open an issue with a description of the problem, steps to reproduce, and your Rust version (`rustc --version`). We love hearing your ideas! Start a discussion to share feature requests and feedback.
Contributions are welcome! Please see our contributing guidelines.

If this SDK helps your project, consider starring it on GitHub.
This project is licensed under the MIT License - see the LICENSE file for details.
This is a third-party SDK not officially affiliated with OpenRouter. Use at your own discretion.
Made with ❤️ for the Rust community
⭐ Star us on GitHub | 📦 Find us on Crates.io | 📖 Read the Docs