| Field | Value |
|---|---|
| Crates.io | adk-model |
| lib.rs | adk-model |
| version | 0.2.1 |
| created_at | 2025-11-30 13:48:27.933526+00 |
| updated_at | 2026-01-22 03:36:26.981852+00 |
| description | LLM model integrations for Rust Agent Development Kit (ADK-Rust) (Gemini, OpenAI, Claude, DeepSeek, etc.) |
| repository | https://github.com/zavora-ai/adk-rust |
| size | 249,004 |
adk-model provides LLM integrations for the Rust Agent Development Kit (ADK-Rust), supporting all major providers: Google Gemini, OpenAI, Anthropic Claude, DeepSeek, Groq, and Ollama (local models). The crate implements the `Llm` trait from adk-core, so models can be used interchangeably.
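A minimal sketch of that interchangeability, assuming the trait is exported as `adk_core::Llm`, that the agent type is named `LlmAgent`, and that `LlmAgentBuilder::model` accepts an `Arc<dyn Llm>` (as the examples below suggest):

```rust
use std::sync::Arc;

use adk_agent::{LlmAgent, LlmAgentBuilder};
use adk_core::Llm;

// Build the same agent regardless of which provider backs the model.
// Any type implementing `Llm` (Gemini, OpenAI, Claude, ...) can be passed in.
fn build_assistant(model: Arc<dyn Llm>) -> Result<LlmAgent, Box<dyn std::error::Error>> {
    let agent = LlmAgentBuilder::new("assistant")
        .model(model)
        .build()?;
    Ok(agent)
}
```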
## Installation

```toml
[dependencies]
adk-model = "0.2.1"
```
Or use the meta-crate:

```toml
[dependencies]
adk-rust = { version = "0.2.1", features = ["models"] }
```
## Usage

### Gemini

```rust
use adk_model::GeminiModel;
use adk_agent::LlmAgentBuilder;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("GOOGLE_API_KEY")?;
    let model = GeminiModel::new(&api_key, "gemini-2.5-flash")?;

    let agent = LlmAgentBuilder::new("assistant")
        .model(Arc::new(model))
        .build()?;

    Ok(())
}
```
### OpenAI

```rust
use adk_model::openai::{OpenAIClient, OpenAIConfig};
use adk_agent::LlmAgentBuilder;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("OPENAI_API_KEY")?;
    let model = OpenAIClient::new(OpenAIConfig::new(api_key, "gpt-4o"))?;

    let agent = LlmAgentBuilder::new("assistant")
        .model(Arc::new(model))
        .build()?;

    Ok(())
}
```
### Anthropic (Claude)

```rust
use adk_model::anthropic::{AnthropicClient, AnthropicConfig};
use adk_agent::LlmAgentBuilder;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("ANTHROPIC_API_KEY")?;
    let model = AnthropicClient::new(AnthropicConfig::new(api_key, "claude-sonnet-4-20250514"))?;

    let agent = LlmAgentBuilder::new("assistant")
        .model(Arc::new(model))
        .build()?;

    Ok(())
}
```
### DeepSeek

```rust
use adk_model::deepseek::DeepSeekClient;
use adk_agent::LlmAgentBuilder;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("DEEPSEEK_API_KEY")?;

    // Standard chat model
    let model = DeepSeekClient::chat(api_key)?;
    // Or use the reasoner model with chain-of-thought:
    // let model = DeepSeekClient::reasoner(api_key)?;

    let agent = LlmAgentBuilder::new("assistant")
        .model(Arc::new(model))
        .build()?;

    Ok(())
}
```
### Groq

```rust
use adk_model::groq::{GroqClient, GroqConfig};
use adk_agent::LlmAgentBuilder;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("GROQ_API_KEY")?;
    let model = GroqClient::new(GroqConfig::llama70b(api_key))?;

    let agent = LlmAgentBuilder::new("assistant")
        .model(Arc::new(model))
        .build()?;

    Ok(())
}
```
### Ollama (Local)

```rust
use adk_model::ollama::{OllamaModel, OllamaConfig};
use adk_agent::LlmAgentBuilder;
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Requires: ollama serve && ollama pull llama3.2
    let model = OllamaModel::new(OllamaConfig::new("llama3.2"))?;

    let agent = LlmAgentBuilder::new("assistant")
        .model(Arc::new(model))
        .build()?;

    Ok(())
}
```
## Supported Models

### Gemini

| Model | Description |
|---|---|
| `gemini-3-pro-preview` | Most intelligent multimodal model with agentic capabilities |
| `gemini-2.5-pro` | Advanced reasoning model |
| `gemini-2.5-flash` | Latest fast model (recommended) |
| `gemini-2.5-flash-lite` | Lightweight, cost-effective |
| `gemini-2.0-flash` | Fast and efficient |

See the Gemini models documentation for the full list.
### OpenAI

| Model | Description |
|---|---|
| `gpt-5.2` | Latest GPT-5 with enhanced reasoning |
| `gpt-5.1` | GPT-5 with improved tool use |
| `gpt-5` | GPT-5 base model |
| `gpt-4o` | Most capable GPT-4 model |
| `gpt-4o-mini` | Fast, cost-effective GPT-4 |

See the OpenAI models documentation for the full list.
### Anthropic

| Model | Description |
|---|---|
| `claude-opus-4-20250514` | Claude Opus 4 - Most capable |
| `claude-sonnet-4-20250514` | Claude Sonnet 4 - Balanced |
| `claude-3-5-sonnet-20241022` | Claude 3.5 Sonnet |
| `claude-3-opus-20240229` | Claude 3 Opus |

See the Anthropic models documentation for the full list.
### DeepSeek

| Model | Description |
|---|---|
| `deepseek-chat` | General-purpose chat model |
| `deepseek-reasoner` | Reasoning model with chain-of-thought |

Features:
- Chain-of-thought output exposed via `<thinking>` tags

See the DeepSeek API documentation for the full list.
### Groq

| Model | Description |
|---|---|
| `llama-3.3-70b-versatile` | LLaMA 3.3 70B - Most capable |
| `llama-3.1-8b-instant` | LLaMA 3.1 8B - Ultra fast |
| `mixtral-8x7b-32768` | Mixtral 8x7B - 32K context |
| `gemma2-9b-it` | Gemma 2 9B |

Features:
- Ultra-low-latency inference on Groq's LPU hardware

See the Groq documentation for the full list.
### Ollama (Local)

| Model | Description |
|---|---|
| `llama3.2` | LLaMA 3.2 - Fast and capable |
| `mistral` | Mistral 7B |
| `qwen2.5:7b` | Qwen 2.5 with excellent tool support (recommended) |
| `gemma2` | Gemma 2 |

Features:
- Runs fully locally; no API key required

See the Ollama library for all available models.
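Because all of the clients above implement the same trait, switching providers at runtime is straightforward. A sketch, assuming the trait lives at `adk_core::Llm`; `MODEL_PROVIDER` is a hypothetical variable for this example, not something adk-model reads itself:

```rust
use std::sync::Arc;

use adk_core::Llm;
use adk_model::GeminiModel;
use adk_model::ollama::{OllamaConfig, OllamaModel};

// Choose a backend at runtime from an (illustrative) MODEL_PROVIDER variable.
fn select_model() -> Result<Arc<dyn Llm>, Box<dyn std::error::Error>> {
    match std::env::var("MODEL_PROVIDER").as_deref() {
        // Local model: requires `ollama serve` and a pulled model.
        Ok("ollama") => Ok(Arc::new(OllamaModel::new(OllamaConfig::new("llama3.2"))?)),
        // Default to Gemini with the key from the environment.
        _ => {
            let api_key = std::env::var("GOOGLE_API_KEY")?;
            Ok(Arc::new(GeminiModel::new(&api_key, "gemini-2.5-flash")?))
        }
    }
}
```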
## Environment Variables

```bash
# Google Gemini
GOOGLE_API_KEY=your-google-api-key

# OpenAI
OPENAI_API_KEY=your-openai-api-key

# Anthropic
ANTHROPIC_API_KEY=your-anthropic-api-key

# DeepSeek
DEEPSEEK_API_KEY=your-deepseek-api-key

# Groq
GROQ_API_KEY=your-groq-api-key

# Ollama (no key needed, just start the server)
# ollama serve
```
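For local development it is common to keep these keys in a `.env` file. A small sketch using the third-party `dotenvy` crate (not a dependency of adk-model; add it to your own `[dependencies]`):

```rust
fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Load variables from a local .env file, if one exists.
    dotenvy::dotenv().ok();

    // Keys are then available through the normal environment API.
    let api_key = std::env::var("GOOGLE_API_KEY")?;
    println!("GOOGLE_API_KEY loaded ({} chars)", api_key.len());
    Ok(())
}
```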
## Feature Flags

Enable specific providers with feature flags:

```toml
[dependencies]
# All providers (default)
adk-model = { version = "0.2.1", features = ["all-providers"] }

# Individual providers
adk-model = { version = "0.2.1", features = ["gemini"] }
adk-model = { version = "0.2.1", features = ["openai"] }
adk-model = { version = "0.2.1", features = ["anthropic"] }
adk-model = { version = "0.2.1", features = ["deepseek"] }
adk-model = { version = "0.2.1", features = ["groq"] }
adk-model = { version = "0.2.1", features = ["ollama"] }
```
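If you disable default features, downstream code can gate provider-specific paths on a Cargo feature of its own that forwards to the matching adk-model feature (e.g. `gemini = ["adk-model/gemini"]` in your own Cargo.toml). A sketch of such a gate; the `gemini` feature here belongs to your crate, not to adk-model:

```rust
// Compiled only when this crate's own `gemini` feature (which forwards
// to adk-model/gemini) is enabled; otherwise the function does not exist.
#[cfg(feature = "gemini")]
fn gemini_model(api_key: &str) -> Result<adk_model::GeminiModel, Box<dyn std::error::Error>> {
    let model = adk_model::GeminiModel::new(api_key, "gemini-2.5-flash")?;
    Ok(model)
}
```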
All models implement the `Llm` trait from adk-core.

## License

Apache-2.0
This crate is part of the ADK-Rust framework for building AI agents in Rust.