ai_client

A Rust crate for interacting with AI language model APIs, supporting multiple providers (Grok, Anthropic, OpenAI) through a unified ChatCompletionClient trait.
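The unified-trait idea can be sketched as follows. This is a simplified stand-in, not the crate's real API: the actual `send_chat_completion` method is async and returns a full response struct, and `EchoClient` and `complete` here are hypothetical names used only to illustrate the pattern of several providers behind one trait.

```rust
// Minimal sketch of "one trait, many providers" (simplified, synchronous;
// the crate's real trait is async and richer than this).
trait ChatCompletionClient {
    fn complete(&self, prompt: &str) -> String;
}

// Stand-in for a concrete provider such as GrokClient.
struct EchoClient;

impl ChatCompletionClient for EchoClient {
    fn complete(&self, prompt: &str) -> String {
        format!("echo: {prompt}")
    }
}

fn main() {
    // Any provider can sit behind the same trait object.
    let client: Box<dyn ChatCompletionClient> = Box::new(EchoClient);
    println!("{}", client.complete("hi")); // prints "echo: hi"
}
```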

Features

  • Unified interface for chat completions across different LLM providers
  • Response caching via an LRU cache
  • Exponential backoff for retrying failed requests
  • Metrics tracking for requests, successes, errors, and cache hits
  • Environment-based configuration
  • Robust error handling
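The retry behavior follows the usual exponential-backoff shape: wait, double the wait, try again. The crate's actual retry policy and tuning are internal; the function below is a generic, synchronous sketch with illustrative values (`max_retries`, a 100 ms base delay), not the crate's implementation.

```rust
use std::thread::sleep;
use std::time::Duration;

// Generic retry-with-exponential-backoff sketch: on each failure, sleep,
// then double the delay before the next attempt, up to `max_retries` retries.
fn retry_with_backoff<T, E>(
    max_retries: u32,
    mut op: impl FnMut() -> Result<T, E>,
) -> Result<T, E> {
    let mut delay = Duration::from_millis(100);
    let mut attempt = 0;
    loop {
        match op() {
            Ok(v) => return Ok(v),
            Err(e) if attempt >= max_retries => return Err(e),
            Err(_) => {
                sleep(delay);
                delay *= 2; // double the wait after each failure
                attempt += 1;
            }
        }
    }
}

fn main() {
    // Succeeds on the third call, after two backoff sleeps.
    let mut calls = 0;
    let result: Result<&str, &str> = retry_with_backoff(3, || {
        calls += 1;
        if calls < 3 { Err("transient") } else { Ok("ok") }
    });
    println!("{result:?} after {calls} calls");
}
```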

Installation

Add the following to your Cargo.toml:

[dependencies]
ai_client = { path = "/path/to/ai_client" }
tokio = { version = "1.0", features = ["full"] }

Or, if published to crates.io:

[dependencies]
ai_client = "0.1.0"
tokio = { version = "1.0", features = ["full"] }

Usage

use ai_client::clients::{ChatCompletionClient, GrokClient};
use ai_client::entities::Message;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = GrokClient::new()?;
    let messages = vec![
        Message {
            role: "system".to_string(),
            content: "You are a helpful assistant.".to_string(),
        },
        Message {
            role: "user".to_string(),
            content: "What is 101*3?".to_string(),
        },
    ];
    let response = client.send_chat_completion(messages, "low").await?;
    println!("Response: {:?}", response.choices[0].message.content);
    Ok(())
}

Environment Variables

  • GROK_API_KEY: API key for Grok (required)
  • GROK_API_ENDPOINT: API endpoint (default: https://api.x.ai/v1/chat/completions)
  • GROK_MODEL: Model name (default: grok-3-mini-fast-latest)
  • GROK_CACHE_SIZE: Cache size for responses (default: 100)
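Reading these variables with their documented defaults looks roughly like this. The variable names and defaults come from the list above; the helper functions are illustrative, not the crate's internal config type.

```rust
use std::env;

// Sketch of resolving the documented environment variables, falling back
// to the defaults listed above when a variable is unset or unparsable.
fn grok_endpoint() -> String {
    env::var("GROK_API_ENDPOINT")
        .unwrap_or_else(|_| "https://api.x.ai/v1/chat/completions".to_string())
}

fn grok_cache_size() -> usize {
    env::var("GROK_CACHE_SIZE")
        .ok()
        .and_then(|s| s.parse().ok())
        .unwrap_or(100)
}

fn main() {
    println!("endpoint:   {}", grok_endpoint());
    println!("cache size: {}", grok_cache_size());
}
```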

Building and Testing

cargo build
cargo test

License

Licensed under either MIT or Apache-2.0 at your option.
