genai-rs-macros

version: 0.7.2
created_at: 2026-01-09 00:27:59 UTC
updated_at: 2026-01-17 19:36:15 UTC
description: Procedural macros for genai-rs: automatic function declaration generation for Gemini API tool calling
repository: https://github.com/evansenter/genai-rs
size: 38,004 bytes
owner: Evan Senter (evansenter)
documentation: https://docs.rs/genai-rs-macros

README

genai-rs


A Rust client library for Google's Generative AI (Gemini) API using the Interactions API.

Quick Start

```rust
use genai_rs::Client;

#[tokio::main]
async fn main() -> Result<(), genai_rs::GenaiError> {
    let client = Client::new(std::env::var("GEMINI_API_KEY").unwrap());

    let response = client
        .interaction()
        .with_model("gemini-3-flash-preview")
        .with_text("Explain Rust's ownership model in one sentence.")
        .create()
        .await?;

    println!("{}", response.as_text().unwrap_or_default());
    Ok(())
}
```

Features

Core Capabilities

| Feature | Description |
|---------|-------------|
| Streaming | Real-time token streaming with resume capability |
| Stateful Conversations | Multi-turn context via `previous_interaction_id` |
| Function Calling | Auto-discovery with the `#[tool]` macro, or manual control |
| Structured Output | JSON schema enforcement with `with_response_format()` |
| Thinking Mode | Access model reasoning with configurable depth |

Built-in Tools

| Tool | Method | Use Case |
|------|--------|----------|
| Google Search | `with_google_search()` | Real-time web grounding |
| Code Execution | `with_code_execution()` | Python sandbox |
| URL Context | `with_url_context()` | Web page analysis |

Multimodal I/O

| Input | Output |
|-------|--------|
| Images, Audio, Video, PDFs | Text, Images, Audio (TTS) |

Installation

```toml
[dependencies]
genai-rs = "0.7"
tokio = { version = "1.0", features = ["full"] }

# Optional
genai-rs-macros = "0.7"  # For the #[tool] macro
futures-util = "0.3"     # For streaming
```

Requirements: Rust 1.88+ (edition 2024), Gemini API key

Examples

Runnable examples covering all features:

```sh
export GEMINI_API_KEY=your-key
cargo run --example simple_interaction
```

Quick Reference:

| I want to... | Example |
|--------------|---------|
| Make my first API call | `simple_interaction` |
| Stream responses | `streaming` |
| Use function calling | `auto_function_calling` |
| Multi-turn conversations | `stateful_interaction` |
| Generate images | `image_generation` |
| Text to speech | `text_to_speech` |
| Get structured JSON | `structured_output` |
| Implement retry logic | `retry_with_backoff` |

See Examples Index for the complete categorized list.

Usage Highlights

Streaming

```rust
use futures_util::StreamExt;
use genai_rs::StreamChunk;

let mut stream = client.interaction()
    .with_text("Write a haiku about Rust.")
    .create_stream();

while let Some(Ok(event)) = stream.next().await {
    if let StreamChunk::Delta(delta) = &event.chunk {
        if let Some(text) = delta.as_text() {
            print!("{}", text);
        }
    }
}
```

Function Calling with #[tool]

```rust
use genai_rs_macros::tool;

#[tool(location(description = "City name, e.g. Tokyo"))]
fn get_weather(location: String) -> String {
    r#"{"temp": 72, "conditions": "sunny"}"#.to_string()
}

let result = client.interaction()
    .with_text("What's the weather in Tokyo?")
    .add_function(GetWeatherCallable.declaration())
    .create_with_auto_functions()
    .await?;
```

Stateful Conversations

```rust
// First turn (enable storage for multi-turn)
let r1 = client.interaction()
    .with_system_instruction("You are a helpful assistant.")
    .with_text("My name is Alice.")
    .with_store_enabled()
    .create().await?;

// Continue conversation (r1.id is Option<String>)
let r2 = client.interaction()
    .with_previous_interaction(r1.id.as_ref().expect("stored interactions have IDs"))
    .with_text("What's my name?")  // Remembers: Alice
    .create().await?;
```

Thinking Mode

```rust
use genai_rs::ThinkingLevel;

let response = client.interaction()
    .with_thinking_level(ThinkingLevel::High)
    .with_text("What's 15% of 847?")
    .create().await?;

// Check if model used reasoning (thoughts contain cryptographic signatures)
if response.has_thoughts() {
    println!("Model used {} thought blocks", response.thought_signatures().count());
}
```

Build & Execute (for Retries)

```rust
use genai_rs::InteractionRequest;

// Build request without executing (Clone + Serialize)
let request: InteractionRequest = client.interaction()
    .with_model("gemini-3-flash-preview")
    .with_text("Hello!")
    .build()?;

// Execute separately - enables retry loops
let response = client.execute(request.clone()).await?;

// On error, check if retryable: error.is_retryable()
```

See retry_with_backoff for a complete retry example using the backon crate.

Documentation

Guides

| Guide | Description |
|-------|-------------|
| Examples Index | All examples, categorized |
| Function Calling | `#[tool]` macro, ToolService, manual execution |
| Multi-Turn Patterns | Stateful/stateless, inheritance rules |
| Streaming API | Stream types, resume, auto-functions |
| Multimodal | Images, audio, video, PDFs |
| Output Modalities | Image generation, text-to-speech |
| Thinking Mode | Reasoning depth, thought signatures |
| Built-in Tools | Google Search, code execution, URL context |
| Configuration | Client options, generation config |
| Conversation Patterns | Multi-turn, context management |

Reference

| Document | Description |
|----------|-------------|
| Error Handling | Error types, recovery patterns |
| Reliability Patterns | Retries, timeouts, resilience |
| Logging Strategy | Log levels, LOUD_WIRE debugging |
| Testing Guide | Test strategies, assertions |
| Agents & Background | Long-running tasks, polling |
| API Reference | Generated API documentation |

External Resources

| Resource | Description |
|----------|-------------|
| Interactions API Reference | Official API specification |
| Interactions API Guide | Usage patterns |
| Function Calling Guide | Google's function calling docs |

Debugging

```sh
# Wire-level request/response logging
LOUD_WIRE=1 cargo run --example simple_interaction

# Library debug logs
RUST_LOG=genai_rs=debug cargo run --example simple_interaction
```

See Logging Strategy for details.

Forward Compatibility

This library follows the Evergreen philosophy: unknown API types deserialize into `Unknown` variants instead of failing. Always include wildcard arms:

```rust
match content {
    Content::Text { text } => println!("{}", text.unwrap_or_default()),
    _ => {}  // Handles future variants gracefully
}
```

Testing

```sh
make test      # Unit tests (uses cargo-nextest)
make test-all  # Full integration suite (requires GEMINI_API_KEY)
```

Project Structure

```text
genai-rs/           # Main crate: Client, InteractionBuilder, types
genai-rs-macros/    # Procedural macro for #[tool]
docs/               # Comprehensive guides
examples/           # Runnable examples
```

Contributing

Contributions welcome!

Troubleshooting

Common issues and solutions are documented in TROUBLESHOOTING.md.

Quick fixes:

- "API key not valid" - check that `GEMINI_API_KEY` is set
- "Model not found" - use `gemini-3-flash-preview`
- Functions not executing - use `create_with_auto_functions()`

License

MIT
