| Field | Value |
|---|---|
| version | 0.21.0 |
| source | src |
| created_at | 2022-12-02 08:26:48.919974 |
| updated_at | 2024-05-07 05:43:43.624677 |
| description | Rust library for OpenAI |
| homepage | https://github.com/64bit/async-openai |
| repository | https://github.com/64bit/async-openai |
| max_upload_size | |
| id | 728237 |
| size | 277,495 |
# async-openai

Async Rust library for OpenAI.

`async-openai` is an unofficial Rust library for OpenAI. Work-in-progress features live on the `experiments` branch.

**Note on Azure OpenAI Service (AOS):** `async-openai` primarily implements the OpenAI spec, and doesn't try to maintain parity with the AOS spec.
The library reads the API key from the environment variable `OPENAI_API_KEY`:

```sh
# On macOS/Linux
export OPENAI_API_KEY='sk-...'

# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
```
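If you'd rather not rely on the environment variable, the key can be supplied explicitly through `OpenAIConfig`. The sketch below assumes the `config::OpenAIConfig` builder from this crate; the key and organization ID values are placeholders:

```rust
use async_openai::{config::OpenAIConfig, Client};

fn main() {
    // Build a client with an explicit API key instead of reading
    // the OPENAI_API_KEY environment variable. The key and org ID
    // shown here are placeholders, not real credentials.
    let config = OpenAIConfig::new()
        .with_api_key("sk-...")
        .with_org_id("org-...");

    let _client = Client::with_config(config);
}
```

This is handy when the key comes from a secrets manager or a config file rather than the process environment.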
An example of image generation with `async-openai`:

```rust
use async_openai::{
    types::{CreateImageRequestArgs, ImageSize, ResponseFormat},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Create client; reads the OPENAI_API_KEY environment variable for the API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to the ./data directory.
    // Each URL is downloaded and saved in a dedicated Tokio task.
    // The directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
```
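The same builder pattern applies to other endpoints. The sketch below shows a chat completion under the same `OPENAI_API_KEY` setup; the model name and prompt are placeholder choices, not recommendations:

```rust
use async_openai::{
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Reads OPENAI_API_KEY from the environment, as above.
    let client = Client::new();

    // Build the request with the same *Args builder pattern
    // used by the image example.
    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-3.5-turbo")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Write a one-line haiku about Rust.")
            .build()?
            .into()])
        .build()?;

    let response = client.chat().create(request).await?;

    // content is optional on the response message, so unwrap defensively.
    if let Some(choice) = response.choices.first() {
        println!("{}", choice.message.content.as_deref().unwrap_or(""));
    }

    Ok(())
}
```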
## Contributing

Thank you for taking the time to contribute and improve the project. I'd be happy to have you!

All forms of contribution are welcome: new feature requests, bug fixes, issues, documentation, testing, comments, examples, etc.

A good starting point would be to look at existing open issues.

To maintain the quality of the project, code contributions must meet a minimum set of requirements.

This project adheres to the Rust Code of Conduct.
`openai-func-enums` provides procedural macros that make it easier to use this library with the OpenAI API's tool-calling feature. It also provides derive macros you can add to existing clap application subcommands for natural-language use of command-line tools. It supports OpenAI's parallel tool calls and lets you choose between running multiple tool calls concurrently or on their own OS threads.
This project is licensed under the MIT license.