| | |
|---|---|
| Crates.io | async-openai-wasm |
| lib.rs | async-openai-wasm |
| version | 0.26.0 |
| source | src |
| created_at | 2024-04-16 11:49:29.224579 |
| updated_at | 2024-11-23 04:20:39.09254 |
| description | Rust library for OpenAI on WASM |
| homepage | https://github.com/ifsheldon/async-openai-wasm |
| repository | https://github.com/ifsheldon/async-openai-wasm |
| max_upload_size | |
| id | 1210217 |
| size | 415,339 |
# async-openai-wasm

Async Rust library for OpenAI on WASM.

`async-openai-wasm` is a FORK of [async-openai](https://github.com/64bit/async-openai) that supports WASM targets by targeting `wasm32-unknown-unknown`.
That means >99% of the codebase should be attributed to the original project. Synchronization with the original project is, and will continue to be, done manually whenever `async-openai` releases a new version. Versions are kept in sync with `async-openai` releases: when `async-openai` releases `x.y.z`, `async-openai-wasm` also releases an `x.y.z` version.
`async-openai-wasm` is an unofficial Rust library for OpenAI.

Note on Azure OpenAI Service (AOS): `async-openai-wasm` primarily implements the OpenAI spec and doesn't try to maintain parity with the AOS spec, just like `async-openai`.
## Differences from `async-openai`

Added:
* WASM support
* WASM examples
* Realtime API: does not bundle a specific WebSocket implementation. You need to convert a client event into a WS message yourself, which is as simple as `your_ws_impl::Message::Text(some_client_event.into_text())`.

Removed:
* Tokio
* Non-WASM examples: please refer to the original project [async-openai](https://github.com/64bit/async-openai/).
* Built-in backoff retries: removed due to [this issue](https://github.com/ihrwein/backoff/issues/61). Recommendation: use `backon` with the `gloo-timers-sleep` feature instead.
* File saving: `wasm32-unknown-unknown` on browsers doesn't have access to the filesystem.
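The Realtime conversion mentioned above can be sketched with stand-in types. In a real application, the client event would come from this crate's realtime types and `Message` from your chosen WS crate (e.g. `tungstenite` or `gloo-net`); both are mocked here for illustration, so the names below are assumptions, not the crate's API:

```rust
/// Stand-in for a realtime client event that serializes itself to JSON text.
/// (Illustrative only; the real events live in the crate's realtime types.)
struct SessionUpdateEvent {
    instructions: String,
}

impl SessionUpdateEvent {
    /// Mirrors the `into_text()` pattern described above: produce the JSON payload.
    fn into_text(self) -> String {
        format!(
            r#"{{"type":"session.update","session":{{"instructions":"{}"}}}}"#,
            self.instructions
        )
    }
}

/// Stand-in for a WS message type such as `tungstenite::Message`.
enum Message {
    Text(String),
}

/// The one-liner the README describes: wrap the event's text in a WS text frame.
fn to_ws_message(event: SessionUpdateEvent) -> Message {
    Message::Text(event.into_text())
}

fn main() {
    let event = SessionUpdateEvent {
        instructions: "be brief".to_string(),
    };
    match to_ws_message(event) {
        Message::Text(text) => println!("{text}"),
    }
}
```

Once wrapped, the message is sent over whatever WS connection your implementation provides; the crate deliberately stays out of that choice.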
The library reads the API key from the environment variable `OPENAI_API_KEY`:

```shell
# On macOS/Linux
export OPENAI_API_KEY='sk-...'
# On Windows PowerShell
$Env:OPENAI_API_KEY='sk-...'
```
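The default `Client::new()` performs this environment lookup for you. A minimal, dependency-free sketch of the fallback pattern (the `resolve_api_key` helper is illustrative, not part of the crate):

```rust
use std::env;

/// Resolve an API key: prefer an explicitly supplied key, otherwise fall back
/// to the OPENAI_API_KEY environment variable. (Sketch of the lookup pattern;
/// the real lookup happens inside the client.)
fn resolve_api_key(explicit: Option<String>) -> Option<String> {
    explicit.or_else(|| env::var("OPENAI_API_KEY").ok())
}

fn main() {
    // An explicitly passed key wins over the environment.
    let key = resolve_api_key(Some("sk-demo".to_string()));
    println!("{}", key.unwrap_or_else(|| "no key found".to_string()));
}
```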
For examples, see the examples of [async-openai](https://github.com/64bit/async-openai/), and the WASM examples in `async-openai-wasm`.

Only types for the Realtime API are implemented; they can be enabled with the feature flag `realtime`. These types may change if/when OpenAI releases official specs for them. Again, the types do not bundle a specific WS implementation; you need to convert a client event into a WS message yourself, which is as simple as `your_ws_impl::Message::Text(some_client_event.into_text())`.
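To enable the Realtime API types, turn on the `realtime` feature in your `Cargo.toml`. A sketch (the version shown matches this release; adjust as needed):

```toml
[dependencies]
async-openai-wasm = { version = "0.26.0", features = ["realtime"] }
```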
```rust
use async_openai_wasm::{
    types::{CreateImageRequestArgs, ImageResponseFormat, ImageSize},
    Client,
};
use std::error::Error;

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Create a client; reads the OPENAI_API_KEY environment variable for the API key.
    let client = Client::new();

    let request = CreateImageRequestArgs::default()
        .prompt("cats on sofa and carpet in living room")
        .n(2)
        .response_format(ImageResponseFormat::Url)
        .size(ImageSize::S256x256)
        .user("async-openai-wasm")
        .build()?;

    let response = client.images().create(request).await?;

    // Download and save images to the ./data directory.
    // Each URL is downloaded and saved in a dedicated Tokio task.
    // The directory is created if it doesn't exist.
    let paths = response.save("./data").await?;

    paths
        .iter()
        .for_each(|path| println!("Image file path: {}", path.display()));

    Ok(())
}
```
This repo will only accept issues and PRs related to WASM support. For other issues and PRs, please visit the original project [async-openai](https://github.com/64bit/async-openai/).

This project adheres to the [Rust Code of Conduct](https://www.rust-lang.org/policies/code-of-conduct).
## Why `async-openai-wasm`?

Because I wanted to develop and release a crate that depends on the wasm feature in the experiments branch of async-openai, but the pace of stabilizing the wasm feature differed from what I expected.
The additional modifications are licensed under the MIT license; the original project is also licensed under the MIT license.