| Field | Value |
|---|---|
| Crates.io | genai-rs-macros |
| lib.rs | genai-rs-macros |
| version | 0.7.2 |
| created_at | 2026-01-09 00:27:59.808863+00 |
| updated_at | 2026-01-17 19:36:15.366258+00 |
| description | Procedural macros for genai-rs: automatic function declaration generation for Gemini API tool calling |
| homepage | |
| repository | https://github.com/evansenter/genai-rs |
| max_upload_size | |
| id | 2031338 |
| size | 38,004 |
A Rust client library for Google's Generative AI (Gemini) API using the Interactions API.
```rust
use genai_rs::Client;

#[tokio::main]
async fn main() -> Result<(), genai_rs::GenaiError> {
    let client = Client::new(std::env::var("GEMINI_API_KEY").unwrap());
    let response = client
        .interaction()
        .with_model("gemini-3-flash-preview")
        .with_text("Explain Rust's ownership model in one sentence.")
        .create()
        .await?;
    println!("{}", response.as_text().unwrap_or_default());
    Ok(())
}
```
| Feature | Description |
|---|---|
| Streaming | Real-time token streaming with resume capability |
| Stateful Conversations | Multi-turn context via `previous_interaction_id` |
| Function Calling | Auto-discovery with the `#[tool]` macro, or manual control |
| Structured Output | JSON schema enforcement with `with_response_format()` |
| Thinking Mode | Access model reasoning at a configurable depth |
| Tool | Method | Use Case |
|---|---|---|
| Google Search | `with_google_search()` | Real-time web grounding |
| Code Execution | `with_code_execution()` | Python sandbox |
| URL Context | `with_url_context()` | Web page analysis |

| Input | Output |
|---|---|
| Images, Audio, Video, PDFs | Text, Images, Audio (TTS) |
```toml
[dependencies]
genai-rs = "0.7"
tokio = { version = "1.0", features = ["full"] }

# Optional
genai-rs-macros = "0.7" # For the #[tool] macro
futures-util = "0.3"    # For streaming
```
Requirements: Rust 1.88+ (edition 2024), Gemini API key
Runnable examples covering all features:
```sh
export GEMINI_API_KEY=your-key
cargo run --example simple_interaction
```
Quick Reference:
| I want to... | Example |
|---|---|
| Make my first API call | `simple_interaction` |
| Stream responses | `streaming` |
| Use function calling | `auto_function_calling` |
| Multi-turn conversations | `stateful_interaction` |
| Generate images | `image_generation` |
| Text to speech | `text_to_speech` |
| Get structured JSON | `structured_output` |
| Implement retry logic | `retry_with_backoff` |
See Examples Index for the complete categorized list.
```rust
use futures_util::StreamExt;
use genai_rs::StreamChunk;

let mut stream = client.interaction()
    .with_text("Write a haiku about Rust.")
    .create_stream();

while let Some(Ok(event)) = stream.next().await {
    if let StreamChunk::Delta(delta) = &event.chunk {
        if let Some(text) = delta.as_text() {
            print!("{}", text);
        }
    }
}
```
```rust
use genai_rs_macros::tool;

#[tool(location(description = "City name, e.g. Tokyo"))]
fn get_weather(location: String) -> String {
    r#"{"temp": 72, "conditions": "sunny"}"#.to_string()
}

let result = client.interaction()
    .with_text("What's the weather in Tokyo?")
    .add_function(GetWeatherCallable.declaration())
    .create_with_auto_functions()
    .await?;
```
```rust
// First turn (enable storage for multi-turn)
let r1 = client.interaction()
    .with_system_instruction("You are a helpful assistant.")
    .with_text("My name is Alice.")
    .with_store_enabled()
    .create().await?;

// Continue the conversation (r1.id is Option<String>)
let r2 = client.interaction()
    .with_previous_interaction(r1.id.as_ref().expect("stored interactions have IDs"))
    .with_text("What's my name?") // Remembers: Alice
    .create().await?;
```
```rust
use genai_rs::ThinkingLevel;

let response = client.interaction()
    .with_thinking_level(ThinkingLevel::High)
    .with_text("What's 15% of 847?")
    .create().await?;

// Check whether the model used reasoning (thoughts carry cryptographic signatures)
if response.has_thoughts() {
    println!("Model used {} thought blocks", response.thought_signatures().count());
}
```
```rust
use genai_rs::InteractionRequest;

// Build the request without executing it (Clone + Serialize)
let request: InteractionRequest = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("Hello!")
    .build()?;

// Execute separately - enables retry loops
let response = client.execute(request.clone()).await?;
// On error, check whether it's retryable: error.is_retryable()
```
See the `retry_with_backoff` example for a complete retry implementation using the `backon` crate.
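Separating build from execute is what makes a retry loop possible: the same request can be re-sent until it succeeds. The shape of such a loop can be sketched in plain Rust; note that `retry_with_backoff` below is an illustrative helper written for this sketch (the real example uses the `backon` crate and `error.is_retryable()`), with a generic closure standing in for `client.execute(request.clone())`.

```rust
use std::thread::sleep;
use std::time::Duration;

// Illustrative retry helper: re-run `attempt` with exponential backoff
// while the error is considered retryable, up to `max_retries` retries.
fn retry_with_backoff<T, E>(
    mut attempt: impl FnMut() -> Result<T, E>,
    is_retryable: impl Fn(&E) -> bool,
    max_retries: u32,
) -> Result<T, E> {
    let mut delay = Duration::from_millis(100);
    let mut retries = 0;
    loop {
        match attempt() {
            Ok(value) => return Ok(value),
            Err(e) if retries < max_retries && is_retryable(&e) => {
                sleep(delay);
                delay *= 2; // exponential backoff
                retries += 1;
            }
            Err(e) => return Err(e), // exhausted or non-retryable
        }
    }
}

fn main() {
    // Simulate a call that fails twice with a transient error, then succeeds.
    let mut calls = 0;
    let result = retry_with_backoff(
        || {
            calls += 1;
            if calls < 3 { Err("transient") } else { Ok("response") }
        },
        |_| true,
        5,
    );
    assert_eq!(result, Ok("response"));
    assert_eq!(calls, 3);
    println!("succeeded after {calls} attempts");
}
```

In the real client the closure body would be an async `client.execute(request.clone()).await`, so an async-aware retry crate such as `backon` is the better fit there.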
| Guide | Description |
|---|---|
| Examples Index | All examples, categorized |
| Function Calling | #[tool] macro, ToolService, manual execution |
| Multi-Turn Patterns | Stateful/stateless, inheritance rules |
| Streaming API | Stream types, resume, auto-functions |
| Multimodal | Images, audio, video, PDFs |
| Output Modalities | Image generation, text-to-speech |
| Thinking Mode | Reasoning depth, thought signatures |
| Built-in Tools | Google Search, code execution, URL context |
| Configuration | Client options, generation config |
| Conversation Patterns | Multi-turn, context management |
| Document | Description |
|---|---|
| Error Handling | Error types, recovery patterns |
| Reliability Patterns | Retries, timeouts, resilience |
| Logging Strategy | Log levels, LOUD_WIRE debugging |
| Testing Guide | Test strategies, assertions |
| Agents & Background | Long-running tasks, polling |
| API Reference | Generated API documentation |
| Resource | Description |
|---|---|
| Interactions API Reference | Official API specification |
| Interactions API Guide | Usage patterns |
| Function Calling Guide | Google's function calling docs |
```sh
# Wire-level request/response logging
LOUD_WIRE=1 cargo run --example simple_interaction

# Library debug logs
RUST_LOG=genai_rs=debug cargo run --example simple_interaction
```
See Logging Strategy for details.
This library follows the Evergreen philosophy: unknown API types deserialize into Unknown variants instead of failing. Always include wildcard arms:
```rust
match content {
    Content::Text { text } => println!("{}", text.unwrap_or_default()),
    _ => {} // Handles future variants gracefully
}
```
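The Evergreen fallback can be sketched in plain Rust. The `Kind` enum below is illustrative, not the crate's actual `Content` type: the point is that an unrecognized tag maps to an `Unknown` variant carrying the raw value instead of producing a parse error.

```rust
// Illustrative sketch of the Evergreen pattern: a catch-all Unknown
// variant so that kinds the API adds later don't break older clients.
#[derive(Debug, PartialEq)]
enum Kind {
    Text,
    Image,
    Unknown(String), // future variants land here instead of failing
}

impl Kind {
    fn parse(tag: &str) -> Kind {
        match tag {
            "text" => Kind::Text,
            "image" => Kind::Image,
            other => Kind::Unknown(other.to_string()),
        }
    }
}

fn main() {
    assert_eq!(Kind::parse("text"), Kind::Text);
    // A kind this client has never seen still parses:
    assert_eq!(Kind::parse("hologram"), Kind::Unknown("hologram".into()));
    println!("unknown kinds deserialize instead of erroring");
}
```

The wildcard `_ => {}` arm shown above is the consumer-side half of the same contract: the library never fails on new variants, and calling code never assumes the variant list is closed.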
```sh
make test      # Unit tests (uses cargo-nextest)
make test-all  # Full integration suite (requires GEMINI_API_KEY)
```
```
genai-rs/        # Main crate: Client, InteractionBuilder, types
genai-rs-macros/ # Procedural macro for #[tool]
docs/            # Comprehensive guides
examples/        # Runnable examples
```
Contributions welcome! Please read:
Common issues and solutions are documented in TROUBLESHOOTING.md.
Quick fixes:
- Check that `GEMINI_API_KEY` is set
- Model name: `gemini-3-flash-preview`
- Automatic tool execution: `create_with_auto_functions()`