| Crates.io | api_openai |
| lib.rs | api_openai |
| version | 0.3.0 |
| created_at | 2025-04-09 12:42:55.017213+00 |
| updated_at | 2025-11-30 06:39:08.4837+00 |
| description | OpenAI's API for accessing large language models (LLMs). |
| homepage | https://github.com/Wandalen/api_llm/tree/master/api/openai |
| repository | https://github.com/Wandalen/api_llm/tree/master/api/openai |
| max_upload_size | |
| id | 1626673 |
| size | 2,560,018 |
# api_openai

Comprehensive, type-safe Rust client for OpenAI's API with enterprise reliability features.
This API crate is designed as a stateless HTTP client with zero persistence requirements. This keeps deployments lightweight and container-friendly and eliminates operational complexity.
The design goal is to expose all server-side functionality transparently while maintaining zero client-side intelligence or automatic behaviors.

Key principles:

- tokio for high-performance async operations

```rust
use api_openai::exposed::{Client, Secret, components::responses::*, environment::*};

#[tokio::main]
async fn main() -> Result< (), Box< dyn std::error::Error > >
{
  // Initialize client with official OpenAI endpoints
  let secret = Secret::load_from_env("OPENAI_API_KEY")?;
  let env = OpenaiEnvironmentImpl::build(
    secret,
    None,
    None,
    OpenAIRecommended::base_url().to_string(),
    OpenAIRecommended::realtime_base_url().to_string(),
  )?;
  let client = Client::build(env)?;

  // Create a response
  let request = CreateResponseRequest::former()
    .model("gpt-5.1-chat-latest".to_string())
    .input(ResponseInput::String("Hello, world!".to_string()))
    .form();
  let response = client.responses().create(request).await?;
  println!("Response: {}", response.id);

  Ok(())
}
```
```rust
use api_openai::exposed::{Client, Secret, environment::OpenaiEnvironmentImpl};

#[tokio::main]
async fn main() -> Result< (), Box< dyn std::error::Error > >
{
  let secret = Secret::load_from_env("OPENAI_API_KEY")?;

  // Azure OpenAI Service
  let env = OpenaiEnvironmentImpl::build(
    secret,
    None,
    None,
    "https://your-resource.openai.azure.com/".to_string(),
    "https://your-resource.openai.azure.com/realtime/".to_string(),
  )?;

  // Or OpenAI-compatible API (LocalAI, Ollama, etc.)
  // let env = OpenaiEnvironmentImpl::build(
  //   secret, None, None,
  //   "http://localhost:8080/v1/".to_string(),
  //   "http://localhost:8080/realtime/".to_string(),
  // )?;

  let client = Client::build(env)?;
  // Use client normally - all APIs work with custom base URLs
  // ...
  Ok(())
}
```
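When supplying custom base URLs like the ones above, trailing-slash handling matters once endpoint paths get appended. A small, hypothetical helper (not part of the crate) showing one safe way to join a base URL and a path:

```rust
// Joins an API base URL and an endpoint path, normalizing slashes so that
// bases with or without a trailing slash produce the same result.
fn join_url(base: &str, path: &str) -> String {
    format!(
        "{}/{}",
        base.trim_end_matches('/'),
        path.trim_start_matches('/')
    )
}

fn main() {
    assert_eq!(
        join_url("http://localhost:8080/v1/", "/responses"),
        "http://localhost:8080/v1/responses"
    );
    assert_eq!(
        join_url("https://your-resource.openai.azure.com", "responses"),
        "https://your-resource.openai.azure.com/responses"
    );
    println!("ok");
}
```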
See the examples/ directory for comprehensive examples of all API endpoints:

- `responses_create.rs` - Basic response creation
- `responses_create_stream.rs` - Streaming responses
- `responses_create_with_tools.rs` - Function calling
- `responses_create_image_input.rs` - Multimodal input
- `responses_get.rs` - Retrieve responses
- `responses_update.rs` - Update responses
- `responses_delete.rs` - Delete responses
- `responses_cancel.rs` - Cancel responses
- `realtime_response_create.rs` - Real-time responses
- `realtime_input_audio_buffer_append.rs` - Audio streaming
- `realtime_session_update.rs` - Session management

Run any example with:

```sh
cargo run --example responses_create
```
The crate includes comprehensive tests for all API endpoints with a 100% pass rate:

```sh
# Run all tests (683 tests)
cargo nextest run --all-features

# Run with strict warnings
RUSTFLAGS="-D warnings" cargo nextest run --all-features

# Run clippy
cargo clippy --all-targets --all-features -- -D warnings

# Full verification (ctest3)
RUSTFLAGS="-D warnings" cargo nextest run --all-features && \
RUSTDOCFLAGS="-D warnings" cargo test --doc --all-features && \
cargo clippy --all-targets --all-features -- -D warnings
```
Test Statistics: 683 tests covering all API endpoints, 100% pass rate.

Note: Integration tests require a valid OpenAI API key. Tests fail loudly if credentials are unavailable (no silent fallbacks).
The crate supports multiple authentication methods via a comprehensive fallback chain:

```sh
# Create workspace secrets file
mkdir -p ../../secret
echo 'export OPENAI_API_KEY="your-api-key-here"' > ../../secret/-secrets.sh
chmod 600 ../../secret/-secrets.sh

# Or set the environment variable directly
export OPENAI_API_KEY="your-api-key-here"
```
```rust
use api_openai::Secret;

let secret = Secret::load_with_fallbacks("OPENAI_API_KEY")?; // Tries all methods

// Or explicitly:
let secret = Secret::load_from_env("OPENAI_API_KEY")?;
let secret = Secret::new("sk-...".to_string())?; // With validation
```
Fallback Chain Order:

1. Workspace secrets file (`../../secret/-secrets.sh`)
2. Environment variable (`OPENAI_API_KEY`)
3. Local files (`secrets.sh`, `.env`)

Security Features:

- API key format validation (keys must start with `sk-`)
- Keys protected in memory via the `secrecy` crate

The library provides comprehensive error handling:
```rust
use api_openai::exposed::OpenAIError;

match client.responses().create(request).await
{
  Ok(response) => println!("Success: {}", response.id),
  Err(OpenAIError::Api(api_error)) =>
  {
    eprintln!("API Error: {}", api_error.message);
  },
  Err(OpenAIError::Reqwest(http_error)) =>
  {
    eprintln!("HTTP Error: {}", http_error);
  },
  Err(e) => eprintln!("Other Error: {:?}", e),
}
```
The crate follows a layered architecture using the mod_interface pattern:

- Client facade layer (`Client`)
- Domain API modules (`Responses`, `Chat`)

See the License file for details.