Crates.io | openai-rst |
lib.rs | openai-rst |
version | 0.2.0 |
source | src |
created_at | 2024-05-01 23:02:52.311543 |
updated_at | 2024-07-16 23:44:24.069364 |
description | OpenAI API Rust client |
homepage | https://github.com/ra0x3/openai-rst |
repository | https://github.com/ra0x3/openai-rst |
max_upload_size | |
id | 1227307 |
size | 192,831 |
⚠️ Forked from openai-rs-api.
This library provides convenient access to the OpenAI API from Rust applications.
Check out the documentation on docs.rs.
Cargo.toml
[dependencies]
openai-rst = "0.2.0"
The library needs to be configured with your account's secret API key, which is available on the OpenAI website. We recommend setting it as an environment variable. Here's an example of initializing the library with the API key loaded from an environment variable and creating a chat completion:
$ export OPENAI_API_KEY=sk-xxxxxxx
let client = Client::from_env().unwrap();

// Single-message request
let req = ChatCompletionRequest::new(
    Model::GPT4(GPT4::GPT4),
    ChatCompletionMessage {
        role: MessageRole::User,
        content: Content::Text(String::from("What is bitcoin?")),
        name: None,
    },
);

// Multi-message request
let req = ChatCompletionRequest::new_multi(
    Model::GPT4(GPT4::GPT4),
    vec![ChatCompletionMessage {
        role: MessageRole::User,
        content: Content::Text(String::from("What is bitcoin?")),
        name: None,
    }],
);

let result = client.chat_completion(req).await?;
println!("Content: {:?}", result.get_choice());
Optionally, the API base URL can also be overridden with an environment variable:
$ export OPENAI_API_BASE=https://api.openai.com/v1
A complete example, putting it all together:
use openai_rst::{chat_completion::ChatCompletionRequest, client::Client, models::{Model, GPT4}};

// `chat_completion` is awaited, so an async runtime is required; tokio is assumed here.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::from_env().unwrap();
    let req = "What is bitcoin?".into();
    let result = client.chat_completion(req).await?;
    println!("Content: {:?}", result.get_choice());
    println!("Response Headers: {:?}", result.headers);
    Ok(())
}
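For reference, the multi-message constructor from the first snippet can be wired into the same program structure to send a short conversation. This is a minimal sketch rather than verbatim crate documentation: the tokio runtime, the module paths for ChatCompletionMessage, Content, and MessageRole, and the MessageRole::System variant are assumptions.

use openai_rst::{
    chat_completion::{ChatCompletionMessage, ChatCompletionRequest, Content, MessageRole},
    client::Client,
    models::{Model, GPT4},
};

// Minimal end-to-end sketch of a multi-message (conversation-style) request.
// Assumed, not confirmed by this README: the tokio runtime, the module path of
// ChatCompletionMessage / Content / MessageRole, and the MessageRole::System variant.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::from_env().unwrap();

    // Build one request from several messages, as with `new_multi` above.
    let req = ChatCompletionRequest::new_multi(
        Model::GPT4(GPT4::GPT4),
        vec![
            ChatCompletionMessage {
                role: MessageRole::System, // assumed variant; only `User` appears above
                content: Content::Text(String::from("You are a concise assistant.")),
                name: None,
            },
            ChatCompletionMessage {
                role: MessageRole::User,
                content: Content::Text(String::from("What is bitcoin?")),
                name: None,
            },
        ],
    );

    let result = client.chat_completion(req).await?;
    println!("Content: {:?}", result.get_choice());
    Ok(())
}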
More examples are available in the examples directory of the repository.
Check out the full API documentation for examples of all the available functions.
This project is licensed under the MIT license.