openai-ergonomic

Crates.io: openai-ergonomic
lib.rs: openai-ergonomic
version: 0.1.0
created_at: 2025-09-21 19:48:50 UTC
updated_at: 2025-09-21 19:48:50 UTC
description: Ergonomic Rust wrapper for OpenAI API
homepage: https://github.com/genai-rs/openai-ergonomic
repository: https://github.com/genai-rs/openai-ergonomic
id: 1849074
size: 189,857
owner: Tim Van Wassenhove (timvw)
documentation: https://docs.rs/openai-ergonomic

README

openai-ergonomic


Ergonomic Rust wrapper for the OpenAI API, providing type-safe builder patterns and async/await support.

Features

  • Type-safe - Full type safety with builder patterns using bon
  • Async/await - Built on tokio and reqwest for modern async Rust
  • Streaming - First-class support for streaming responses
  • Comprehensive - Covers all OpenAI API endpoints
  • Well-tested - Extensive test coverage with mock support
  • Well-documented - Rich documentation with examples
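
To illustrate what "type safety with builder patterns" buys you, here is a minimal hand-rolled typestate-builder sketch. This is not the crate's actual code (openai-ergonomic uses bon to generate its builders); the `ChatRequest`/`ChatRequestBuilder` names are hypothetical and exist only to show the compile-time guarantee:

```rust
// A request that is only constructible through its builder.
struct ChatRequest {
    model: String,
    messages: Vec<(String, String)>, // (role, content)
}

// The type parameter tracks whether the required `model` field is set.
struct ChatRequestBuilder<ModelState> {
    model: ModelState,
    messages: Vec<(String, String)>,
}

// Marker state: no model has been provided yet.
struct NoModel;

impl ChatRequestBuilder<NoModel> {
    fn new() -> Self {
        ChatRequestBuilder { model: NoModel, messages: Vec::new() }
    }

    // Setting the model moves the builder into the "model present" state.
    fn model(self, model: &str) -> ChatRequestBuilder<String> {
        ChatRequestBuilder { model: model.to_string(), messages: self.messages }
    }
}

impl<M> ChatRequestBuilder<M> {
    // Messages can be appended in any state.
    fn message(mut self, role: &str, content: &str) -> Self {
        self.messages.push((role.to_string(), content.to_string()));
        self
    }
}

// `build` exists only once the model is set, so forgetting `.model(...)`
// is a compile-time error rather than a runtime one.
impl ChatRequestBuilder<String> {
    fn build(self) -> ChatRequest {
        ChatRequest { model: self.model, messages: self.messages }
    }
}

fn main() {
    let request = ChatRequestBuilder::new()
        .model("gpt-4")
        .message("user", "Hello, world!")
        .build();
    println!("{} ({} message)", request.model, request.messages.len());
}
```

Crates like bon generate this machinery from an attribute, so required fields are enforced without writing the marker types by hand.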

Status

Under Construction - This crate is currently being developed and is not yet ready for production use.

Quick Start

Add openai-ergonomic to your Cargo.toml:

[dependencies]
openai-ergonomic = "0.1"
tokio = { version = "1.0", features = ["full"] }
# Only needed for the streaming example below:
futures = "0.3"

Basic Usage

use openai_ergonomic::Client;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // `from_env` reads the API key from the environment,
    // so there is no need to set it again on the builder.
    let client = Client::from_env()?.build();

    let response = client
        .chat_completions()
        .model("gpt-4")
        .message("user", "Hello, world!")
        .send()
        .await?;

    println!("{}", response.choices[0].message.content);
    Ok(())
}

Streaming Example

use openai_ergonomic::Client;
use futures::StreamExt;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // As above, the API key comes from the environment.
    let client = Client::from_env()?.build();

    let mut stream = client
        .chat_completions()
        .model("gpt-4")
        .message("user", "Tell me a story")
        .stream()
        .await?;

    while let Some(chunk) = stream.next().await {
        let chunk = chunk?;
        // Borrow the delta's content so the chunk is not partially moved.
        if let Some(content) = &chunk.choices[0].delta.content {
            print!("{}", content);
        }
    }
    Ok(())
}

Documentation

API documentation is available at https://docs.rs/openai-ergonomic.

Examples

The examples/ directory contains comprehensive examples for all major OpenAI API features:

  • Basic Usage: Simple chat completions and responses
  • Streaming: Real-time response streaming
  • Function Calling: Tool integration and function calling
  • Vision: Image understanding and analysis
  • Audio: Speech-to-text and text-to-speech
  • Assistants: Assistant API with file handling
  • Embeddings: Vector embeddings generation
  • Images: Image generation and editing

Run an example:

cargo run --example quickstart
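
Since the examples construct the client with `Client::from_env()`, set your API key in the environment first (assuming the conventional `OPENAI_API_KEY` variable name, which this README does not spell out explicitly):

```shell
# Export the key for the current shell session, then run an example.
export OPENAI_API_KEY="your-api-key-here"
cargo run --example streaming
```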

Contributing

We welcome contributions! Please see our Contributing Guide for details.

License

Licensed under either of

  • Apache License, Version 2.0
  • MIT license

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in the work by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.
