| Crates.io | async-openai-compat |
| lib.rs | async-openai-compat |
| version | 0.29.6 |
| created_at | 2025-02-06 13:45:16.60353+00 |
| updated_at | 2025-09-25 08:05:57.330014+00 |
| description | Rust library for OpenAI |
| homepage | https://github.com/64bit/async-openai |
| repository | https://github.com/64bit/async-openai |
| max_upload_size | |
| id | 1545555 |
| size | 600,334 |
Note: This is a fork of async-openai focused on making the client compatible with other LLM providers. This functionality cannot be merged upstream because upstream strictly follows OpenAI's API specifications - see this PR discussion for more details.
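For instance, targeting a different OpenAI-compatible provider is typically a matter of overriding the base URL. A minimal sketch, assuming a hypothetical endpoint at https://llm.example.com/v1:

use async_openai::{config::OpenAIConfig, Client};

// Hypothetical provider URL; substitute your provider's OpenAI-compatible endpoint.
let config = OpenAIConfig::new().with_api_base("https://llm.example.com/v1");
let client = Client::with_config(config);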
Async Rust library for OpenAI
async-openai is an unofficial Rust library for OpenAI.
The library reads the API key from the environment variable OPENAI_API_KEY.
# On macOS/Linux
export OPENAI_API_KEY='sk-...'
# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
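If you prefer not to rely on OPENAI_API_KEY, the key can also be supplied programmatically. A minimal sketch, assuming the key is kept in a hypothetical MY_OPENAI_KEY variable:

use async_openai::{config::OpenAIConfig, Client};

// Read the key from a custom environment variable instead of OPENAI_API_KEY.
let api_key = std::env::var("MY_OPENAI_KEY").expect("MY_OPENAI_KEY is not set");
let client = Client::with_config(OpenAIConfig::new().with_api_key(api_key));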
Only types for the Realtime API are implemented; they can be enabled with the feature flag realtime.
These types were written before OpenAI released official specs.
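A sketch of enabling the flag in Cargo.toml, using the crate name and version published on crates.io:

[dependencies]
async-openai-compat = { version = "0.29.6", features = ["realtime"] }

The following example generates images from a text prompt and saves them to the ./data directory: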
use async_openai::{
    types::{CreateImageRequestArgs, ImageSize, ImageResponseFormat},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // create client, reads OPENAI_API_KEY environment variable for API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ImageResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to ./data directory.
    // Each url is downloaded and saved in dedicated Tokio task.
    // Directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
Enable methods whose inputs and outputs are generic with the byot (bring your own types) feature. It creates a new method with the same name and a _byot suffix.
For example, to use serde_json::Value as the request and response type:
use serde_json::{json, Value};

// `client` is a Client created earlier, e.g. with Client::new().
let response: Value = client
    .chat()
    .create_byot(json!({
        "messages": [
            {
                "role": "developer",
                "content": "You are a helpful assistant"
            },
            {
                "role": "user",
                "content": "What do you think about life?"
            }
        ],
        "model": "gpt-4o",
        "store": false
    }))
    .await?;
This can be useful in many scenarios, for example to extend existing types in this crate with new fields via serde. Visit the examples/bring-your-own-type directory to learn more.
For any struct that implements the Config trait, you can wrap it in a smart pointer and cast the pointer to a dyn Config trait object; your client can then accept any wrapped configuration type.
For example,
use async_openai::{Client, config::Config, config::OpenAIConfig};
let openai_config = OpenAIConfig::default();
// You can use `std::sync::Arc` to wrap the config as well
let config = Box::new(openai_config) as Box<dyn Config>;
let client: Client<Box<dyn Config>> = Client::with_config(config);
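This makes it possible to choose a provider at runtime while keeping a single client type. A minimal sketch, where the make_client helper and its flag are illustrative (a real Azure setup would also configure its API base, deployment, and version):

use async_openai::{config::{AzureConfig, Config, OpenAIConfig}, Client};

// Both configs are erased to Box<dyn Config>, so one Client type serves either backend.
fn make_client(use_azure: bool) -> Client<Box<dyn Config>> {
    let config: Box<dyn Config> = if use_azure {
        Box::new(AzureConfig::new())
    } else {
        Box::new(OpenAIConfig::new())
    };
    Client::with_config(config)
}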
Thank you for taking the time to contribute and improve the project. I'd be happy to have you!
All forms of contribution, such as new feature requests, bug fixes, issues, documentation, testing, comments, examples, etc., are welcome.
A good starting point would be to look at existing open issues.
To maintain the quality of the project, code contributions must meet a minimum set of requirements.
This project adheres to the Rust Code of Conduct.
This project is licensed under the MIT license.