| Crates.io | ai_client |
| lib.rs | ai_client |
| version | 0.1.1 |
| created_at | 2025-05-06 12:23:18.261238+00 |
| updated_at | 2025-05-06 12:24:54.447733+00 |
| description | A Rust crate for interacting with AI language model APIs |
| homepage | |
| repository | |
| max_upload_size | |
| id | 1662308 |
| size | 65,764 |
A Rust crate for interacting with AI language model APIs, supporting multiple providers (Grok, Anthropic, OpenAI) through a unified ChatCompletionClient trait.
Add the following to your Cargo.toml:

```toml
[dependencies]
ai_client = { path = "/path/to/ai_client" }
tokio = { version = "1.0", features = ["full"] }
```
Or, if published to crates.io:

```toml
[dependencies]
ai_client = "0.1.1"
tokio = { version = "1.0", features = ["full"] }
```
Example usage:

```rust
use ai_client::clients::{ChatCompletionClient, GrokClient};
use ai_client::entities::Message;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = GrokClient::new()?;

    let messages = vec![
        Message {
            role: "system".to_string(),
            content: "You are a helpful assistant.".to_string(),
        },
        Message {
            role: "user".to_string(),
            content: "What is 101*3?".to_string(),
        },
    ];

    let response = client.send_chat_completion(messages, "low").await?;
    println!("Response: {:?}", response.choices[0].message.content);

    Ok(())
}
```
The client is configured through environment variables:

- GROK_API_KEY: API key for Grok (required)
- GROK_API_ENDPOINT: API endpoint (default: https://api.x.ai/v1/chat/completions)
- GROK_MODEL: Model name (default: grok-3-mini-fast-latest)
- GROK_CACHE_SIZE: Cache size for responses (default: 100)

Build and test:

```shell
cargo build
cargo test
```
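A typical environment setup might look like the following. The API key is a placeholder you must replace; the other three variables show the documented defaults and can be omitted entirely:

```shell
# Placeholder -- substitute your real Grok API key.
export GROK_API_KEY="your-key-here"

# Optional overrides; the values shown are the documented defaults.
export GROK_API_ENDPOINT="https://api.x.ai/v1/chat/completions"
export GROK_MODEL="grok-3-mini-fast-latest"
export GROK_CACHE_SIZE="100"

cargo run
```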
Licensed under either MIT or Apache-2.0 at your option.