| Crates.io | mixtape-anthropic-sdk |
| lib.rs | mixtape-anthropic-sdk |
| version | 0.2.1 |
| created_at | 2026-01-05 04:01:35.09775+00 |
| updated_at | 2026-01-06 06:52:35.419873+00 |
| description | Minimal Anthropic API client for the mixtape agent framework |
| homepage | |
| repository | https://github.com/adlio/mixtape |
| max_upload_size | |
| id | 2023061 |
| size | 243,950 |
A minimal Anthropic API client. Supports messages, streaming, tools, batching, and token counting.

Most mixtape users should use the main `mixtape-core` crate with the `anthropic` feature instead; this crate is the low-level client that powers it.
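To depend on this crate directly, add it to `Cargo.toml`. This is a sketch; the `tokio` features shown are an assumption based on the `#[tokio::main]` usage below:

```toml
[dependencies]
mixtape-anthropic-sdk = "0.2"
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```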
```rust
use mixtape_anthropic_sdk::{Anthropic, MessageCreateParams};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Anthropic::from_env()?; // Uses ANTHROPIC_API_KEY
    let params = MessageCreateParams::builder("claude-sonnet-4-20250514", 1024)
        .user("Hello, Claude!")
        .build();
    let response = client.messages().create(params).await?;
    println!("{:?}", response);
    Ok(())
}
```
For long responses, streaming delivers tokens as they arrive instead of waiting for the complete reply:

```rust
let stream = client.messages().stream(params).await?;
let text = stream.collect_text().await?;

// Or get the full message with stop reason and usage
let stream = client.messages().stream(params).await?;
let message = stream.collect_message().await?;
```
```rust
use mixtape_anthropic_sdk::{Tool, ToolInputSchema, ToolChoice, ContentBlock};

let tool = Tool {
    name: "get_weather".to_string(),
    description: Some("Get weather for a location".to_string()),
    input_schema: ToolInputSchema::new(),
    cache_control: None,
    tool_type: None,
};

let params = MessageCreateParams::builder("claude-sonnet-4-20250514", 1024)
    .user("What's the weather in Tokyo?")
    .tools(vec![tool])
    .tool_choice(ToolChoice::auto())
    .build();

let response = client.messages().create(params).await?;
for block in &response.content {
    if let ContentBlock::ToolUse { name, input, .. } = block {
        println!("{}: {}", name, input);
    }
}
```
For complex reasoning tasks, enable extended thinking with a token budget:

```rust
let params = MessageCreateParams::builder("claude-sonnet-4-20250514", 16000)
    .user("Solve this problem...")
    .thinking(4096) // Token budget for thinking
    .build();
```
Access rate limit headers and the request ID for debugging:

```rust
let response = client.messages().create_with_metadata(params).await?;
if let Some(rate_limit) = response.rate_limit() {
    println!("Remaining: {:?}", rate_limit.requests_remaining);
}
if let Some(request_id) = response.request_id() {
    println!("Request ID: {}", request_id);
}
```
Retry behavior is configurable:

```rust
use mixtape_anthropic_sdk::RetryConfig;
use std::time::Duration;

// Simple: just cap the number of retries
let client = Anthropic::builder()
    .api_key("your-api-key")
    .max_retries(5)
    .build()?;

// Full control over the retry schedule
let client = Anthropic::builder()
    .api_key("your-api-key")
    .retry_config(RetryConfig {
        max_retries: 3,
        base_delay: Duration::from_millis(500),
        max_delay: Duration::from_secs(10),
        jitter: 0.25,
    })
    .build()?;
```
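As a rough, hypothetical illustration (not necessarily the crate's actual algorithm), a `RetryConfig` like the one above typically maps to an exponential-backoff schedule: each attempt doubles `base_delay`, capped at `max_delay`, with `jitter` randomizing the final value:

```rust
use std::time::Duration;

/// Hypothetical sketch of an exponential-backoff schedule; the crate's
/// real retry logic may differ. `jitter` would normally scale the result
/// by a random factor (e.g. in [1 - jitter, 1 + jitter]); it is ignored
/// here to keep the example deterministic.
fn backoff_delay(attempt: u32, base: Duration, max: Duration, _jitter: f64) -> Duration {
    base.saturating_mul(2u32.saturating_pow(attempt)).min(max)
}

fn main() {
    let (base, max) = (Duration::from_millis(500), Duration::from_secs(10));
    for attempt in 0..6 {
        // Doubles each attempt: 500ms, 1s, 2s, 4s, 8s, then capped at 10s
        println!("attempt {attempt}: {:?}", backoff_delay(attempt, base, max, 0.25));
    }
}
```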
For high-volume workloads, submit many requests as a single batch:

```rust
use mixtape_anthropic_sdk::{BatchCreateParams, BatchRequest};

// params1 and params2 are MessageCreateParams built as above
let requests = vec![
    BatchRequest::new("req-1", params1),
    BatchRequest::new("req-2", params2),
];
let batch = client.batches().create(BatchCreateParams::new(requests)).await?;
println!("Batch ID: {}", batch.id);
```
| Feature | Description |
|---|---|
| `schemars` | Enable `JsonSchema` derives for tool inputs |
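The optional feature is enabled in `Cargo.toml`. A sketch, assuming the feature name matches the table above:

```toml
[dependencies]
mixtape-anthropic-sdk = { version = "0.2", features = ["schemars"] }
```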