| Crates.io | anthropic-tools |
| lib.rs | anthropic-tools |
| version | 1.0.1 |
| created_at | 2026-01-25 04:32:34.203763+00 |
| updated_at | 2026-01-25 04:44:17.871136+00 |
| description | A Rust library for interacting with the Anthropic API |
| homepage | |
| repository | https://github.com/akitenkrad/rs-anthropic-tools |
| max_upload_size | |
| id | 2068141 |
| size | 1,953,708 |
A Rust library for interacting with the Anthropic API, with a Model enum covering all Claude models and automatic API key loading from a .env file.

Add to your Cargo.toml:
[dependencies]
anthropic-tools = "1.0"
Set your API key via environment variable or .env file:
# Environment variable
export ANTHROPIC_API_KEY="sk-ant-..."
# Or create .env file in project root
echo 'ANTHROPIC_API_KEY=sk-ant-...' > .env
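If the client cannot find the key at runtime, a quick sanity check with the standard library (plain std::env, not part of this crate's API) confirms the variable is visible to your process:

use std::env;

fn main() {
    // anthropic-tools reads the key for you; this only checks it is set.
    match env::var("ANTHROPIC_API_KEY") {
        Ok(key) if key.starts_with("sk-ant-") => println!("API key found"),
        Ok(_) => eprintln!("ANTHROPIC_API_KEY is set but does not look like an Anthropic key"),
        Err(_) => eprintln!("ANTHROPIC_API_KEY is not set"),
    }
}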
Basic usage:

use anthropic_tools::prelude::*;

#[tokio::main]
async fn main() -> Result<()> {
    let mut client = Messages::new();
    client
        .model(Model::Sonnet4) // Type-safe model selection
        .max_tokens(1024)
        .system("You are a helpful assistant.")
        .user("Hello, how are you?");

    let response = client.post().await?;
    println!("{}", response.get_text());
    Ok(())
}
Extended thinking:

use anthropic_tools::prelude::*;

#[tokio::main]
async fn main() -> Result<()> {
    let mut client = Messages::new();
    client
        .model(Model::Sonnet4)
        .max_tokens(16000)
        .thinking(10000) // Enable extended thinking with a 10k-token budget (the budget must stay below max_tokens)
        .user("Solve this complex problem step by step...");

    let response = client.post().await?;

    // Access thinking content if available
    if response.has_thinking() {
        println!("Thinking: {}", response.get_thinking().unwrap_or_default());
    }
    println!("Response: {}", response.get_text());
    Ok(())
}
Tool use:

use anthropic_tools::prelude::*;

#[tokio::main]
async fn main() -> Result<()> {
    // Define a tool
    let mut tool = Tool::new("get_weather");
    tool.description("Get the current weather for a location")
        .add_string_property("location", Some("City name"), true);

    // Create a client with the tool attached
    let mut client = Messages::new();
    client
        .model(Model::Sonnet4)
        .max_tokens(1024)
        .tools(vec![tool.to_value()])
        .user("What's the weather in Tokyo?");

    let response = client.post().await?;

    // Check whether the model requested a tool call
    if response.has_tool_use() {
        for tool_use in response.get_tool_uses() {
            if let ContentBlock::ToolUse { name, input, .. } = tool_use {
                println!("Tool: {}, Input: {}", name, input);
            }
        }
    }
    Ok(())
}
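A tool_use block only tells you which tool the model wants to call; your code still has to execute it. The sketch below dispatches on the tool name and reads the location argument, assuming the input field can be treated as a serde_json::Value (the exact type is not shown here, so this is an illustration rather than the crate's API):

use serde_json::Value;

// Hypothetical dispatcher: run the tool the model asked for.
fn run_tool(name: &str, input: &Value) -> String {
    match name {
        "get_weather" => {
            let location = input["location"].as_str().unwrap_or("unknown");
            // Call a real weather API here; a fixed string keeps the sketch self-contained.
            format!("Sunny, 22°C in {location}")
        }
        other => format!("unknown tool: {other}"),
    }
}

The resulting string would then be sent back to the model as a tool result in a follow-up request; see the content block types under messages/request for how results are represented.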
Sending an image by URL:

use anthropic_tools::prelude::*;

#[tokio::main]
async fn main() -> Result<()> {
    let mut client = Messages::new();
    client
        .model(Model::Sonnet4)
        .max_tokens(1024)
        .user_with_image_url(
            "Describe this image",
            "https://example.com/image.png",
        );

    let response = client.post().await?;
    println!("{}", response.get_text());
    Ok(())
}
Available models:

use anthropic_tools::prelude::Model;

// Claude 4.5 Family
Model::Opus45   // claude-opus-4-5-20251101

// Claude 4 Family
Model::Opus4    // claude-opus-4-20250514
Model::Sonnet4  // claude-sonnet-4-20250514 (default)

// Claude 3 Family
Model::Opus3    // claude-3-opus-20240229
Model::Sonnet3  // claude-3-sonnet-20240229
Model::Haiku3   // claude-3-haiku-20240307

// Custom/Future models
Model::Other("custom-model".to_string())
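Any variant plugs into the same builder call shown above; Model::Other is the escape hatch for ids the enum does not cover yet (the id below is a placeholder, not a real model):

use anthropic_tools::prelude::*;

fn main() {
    let mut client = Messages::new();
    // An enum variant and a raw model id string are interchangeable here.
    client.model(Model::Other("claude-future-model-20270101".to_string()));
}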
| Variable | Description |
|---|---|
| ANTHROPIC_API_KEY | Your Anthropic API key (required) |

Supports loading from a .env file automatically.
Project structure:

anthropic-tools
├── common/
│ ├── errors.rs - Error types (AnthropicToolError)
│ ├── tool.rs - Tool definitions (Tool, JsonSchema)
│ └── usage.rs - Token usage tracking
└── messages/
├── request/
│ ├── mod.rs - Messages client
│ ├── body.rs - Request body, ThinkingConfig
│ ├── content.rs - Content blocks (text, image, tool_use, etc.)
│ ├── message.rs - Message and SystemPrompt types
│ ├── model.rs - Model enum
│ ├── role.rs - Role enum (User, Assistant)
│ └── mcp.rs - MCP server configuration
├── response.rs - API response types
└── streaming.rs - SSE streaming types
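Errors from the client surface as the crate's AnthropicToolError (defined in common/errors.rs). A minimal sketch of handling them at the call site, assuming the error type implements Display as most Rust error types do:

use anthropic_tools::prelude::*;

#[tokio::main]
async fn main() {
    let mut client = Messages::new();
    client.model(Model::Sonnet4).max_tokens(1024).user("Hello");

    // Match instead of `?` when you want to report the error yourself.
    match client.post().await {
        Ok(response) => println!("{}", response.get_text()),
        Err(err) => eprintln!("Anthropic request failed: {err}"),
    }
}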
License: MIT