# flynn-openai

Async Rust library for OpenAI
## Overview
`flynn-openai` is an unofficial Rust library for OpenAI.
- It's based on [OpenAI OpenAPI spec](https://github.com/openai/openai-openapi)
- Current features:
  - [x] Assistants (v2)
  - [x] Audio
  - [x] Batch
  - [x] Chat
  - [x] Completions (Legacy)
  - [x] Embeddings
  - [x] Files
  - [x] Fine-Tuning
  - [x] Images
  - [x] Models
  - [x] Moderations
  - [ ] Organizations | Administration
  - [x] Realtime API types (Beta)
  - [ ] Uploads
- SSE streaming on available APIs
- Requests (except SSE streaming), including form submissions, are retried with exponential backoff when [rate limited](https://platform.openai.com/docs/guides/rate-limits).
- Ergonomic builder pattern for all request objects.
- Support for Microsoft Azure OpenAI Service (only for APIs matching the OpenAI spec)
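As a sketch of the builder pattern mentioned above, a chat completion request might look like the following. The type and method names here (`CreateChatCompletionRequestArgs`, `ChatCompletionRequestUserMessageArgs`, `client.chat()`, and the `"gpt-4o-mini"` model) are assumptions based on this library's naming conventions, not confirmed API; check [docs.rs/flynn-openai](https://docs.rs/flynn-openai) for the exact signatures:

```rust
use flynn_openai::{
    types::{ChatCompletionRequestUserMessageArgs, CreateChatCompletionRequestArgs},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Reads OPENAI_API_KEY from the environment.
    let client = Client::new();

    // Build the request with the builder pattern; build() validates required fields.
    let request = CreateChatCompletionRequestArgs::default()
        .model("gpt-4o-mini")
        .messages([ChatCompletionRequestUserMessageArgs::default()
            .content("Say hello in one short sentence.")
            .build()?
            .into()])
        .build()?;

    let response = client.chat().create(request).await?;
    if let Some(choice) = response.choices.first() {
        println!("{:?}", choice.message.content);
    }
    Ok(())
}
```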
## Usage
The library reads the [API key](https://platform.openai.com/account/api-keys) from the environment variable `OPENAI_API_KEY`.
```bash
# On macOS/Linux
export OPENAI_API_KEY='sk-...'
```
```powershell
# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
```
- Visit the [examples](https://github.com/64bit/flynn-openai/tree/main/examples) directory to see how to use `flynn-openai`.
- Visit [docs.rs/flynn-openai](https://docs.rs/flynn-openai) for the docs.
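To pull the library into a project, add it to `Cargo.toml`. The version numbers below are illustrative, not a specific release; enable the `realtime` feature only if you need the Realtime API types described in the next section:

```toml
[dependencies]
# Versions are illustrative; check crates.io for the latest releases.
flynn-openai = { version = "0", features = ["realtime"] }
tokio = { version = "1", features = ["full"] }
```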
## Realtime API
Only the types for the Realtime API are implemented; they can be enabled with the feature flag `realtime`.
These types may change when OpenAI releases an official spec for them.
## Image Generation Example
```rust
use flynn_openai::{
    types::{CreateImageRequestArgs, ImageResponseFormat, ImageSize},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Create a client; it reads the OPENAI_API_KEY environment variable for the API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ImageResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("flynn-openai")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save the images to the ./data directory.
    // Each URL is downloaded and saved in a dedicated Tokio task.
    // The directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
```
## Contributing
Thank you for taking the time to contribute and improve the project. I'd be happy to have you!
All forms of contribution are welcome: feature requests, bug fixes, issues, documentation, testing, comments, [examples](../examples), etc.
A good starting point would be to look at the existing [open issues](https://github.com/64bit/flynn-openai/issues).
To maintain the quality of the project, code contributions must meet at least the following requirements:
- **Names & Documentation**: All struct names, field names, and doc comments come from the OpenAPI spec. Nested objects that the spec leaves unnamed leave room for choosing an appropriate name.
- **Tested**: Changes require supporting test(s) and/or an example. Existing examples, doc tests, unit tests, and integration tests should be updated to work with the changes where applicable.
- **Scope**: Keep the scope limited to APIs available in official documents such as the [API Reference](https://platform.openai.com/docs/api-reference) or the [OpenAPI spec](https://github.com/openai/openai-openapi/). Other LLM or AI providers offer OpenAI-compatible APIs, but they may not always have full parity; in such cases, the OpenAI spec takes precedence.
- **Consistency**: Keep the code style consistent across all the "APIs" the library exposes; it creates a great developer experience.
This project adheres to the [Rust Code of Conduct](https://www.rust-lang.org/policies/code-of-conduct).
## Complimentary Crates
- [openai-func-enums](https://github.com/frankfralick/openai-func-enums) provides procedural macros that make it easier to use this library with the OpenAI API's tool calling feature. It also provides derive macros you can add to existing [clap](https://github.com/clap-rs/clap) application subcommands for natural-language use of command-line tools. It also supports OpenAI's [parallel tool calls](https://platform.openai.com/docs/guides/function-calling/parallel-function-calling) and lets you choose between running multiple tool calls concurrently or on their own OS threads.
- [flynn-openai-wasm](https://github.com/ifsheldon/flynn-openai-wasm) provides WASM support.
## License
This project is licensed under [MIT license](https://github.com/64bit/flynn-openai/blob/main/LICENSE).