| Crates.io | simple-groq-rs |
| lib.rs | simple-groq-rs |
| version | 0.1.3 |
| created_at | 2025-12-30 15:13:39.201272+00 |
| updated_at | 2025-12-30 15:53:06.90437+00 |
| description | A simple, async Rust client for the Groq API (OpenAI-compatible) |
| homepage | |
| repository | https://github.com/olatunbosunoyeleke94/simple-groq-rs |
| max_upload_size | |
| id | 2012807 |
| size | 50,457 |
A minimal, ergonomic, async Rust client for the Groq inference API.
Groq provides lightning-fast inference for open-source models (Llama 3.1, Mixtral, Gemma, etc.) and is fully compatible with the OpenAI API format. This crate gives you a clean, lightweight way to call Groq from Rust with almost zero overhead.
Add this to your Cargo.toml:

```toml
[dependencies]
simple-groq-rs = "0.1.3"
```
Set your API key in the environment:

```sh
export GROQ_API_KEY="gsk_your_key_here"
```
```rust
use simple_groq_rs::{GroqClient, Message};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = GroqClient::from_env()?; // Loads GROQ_API_KEY automatically

    let messages = vec![
        Message::system("You are a helpful and concise assistant."),
        Message::user("Explain Rust's borrow checker in one short paragraph."),
    ];

    let response = client
        .chat_completion("llama-3.1-70b-versatile", messages, None, None)
        .await?;

    println!("Groq response:\n{}", response);
    Ok(())
}
```
Running Examples:
This crate includes ready-to-run examples:
```sh
# Set your key (once per terminal session)
export GROQ_API_KEY="gsk_your_key_here"

# Simple chat example
cargo run --example simple

# List all models available to your account
cargo run --example list_models
```