etke_openai_api_rust

version: 0.1.9
created_at: 2024-09-12 18:47:31.815683
updated_at: 2024-09-12 18:47:31.815683
description: (Temporary etke.cc fork) A very simple Rust library for OpenAI API, free from complex async operations and redundant dependencies.
repository: https://github.com/etkecc/openai_api_rust
size: 574,729
owner: Slavi Pantaleev (spantaleev)

README

OpenAI API for Rust


NOTE: This is a temporary fork of openai_api_rust (https://github.com/openai-rs/openai-api), which improves the functionality of the image-generation API.

A community-maintained library that provides a simple and convenient way to interact with the OpenAI API, free from complex async operations and redundant dependencies.

API

Check the official API reference.

API Support
Models ✔️
Completions ✔️
Chat ✔️
Images ✔️
Embeddings ✔️
Audio ✔️
Files
Fine-tunes
Moderations
Engines

Usage

Add the following to your Cargo.toml file:

[dependencies]
openai_api_rust = "0.1.9"

Export your API key as an environment variable:

export OPENAI_API_KEY=<your_api_key>

Then use the crate in your Rust code:

use openai_api_rust::*;
use openai_api_rust::chat::*;
use openai_api_rust::completions::*;

fn main() {
    // Load the API key from the OPENAI_API_KEY environment variable.
    // You can also hardcode it via `Auth::new(<your_api_key>)`, but that is not recommended.
    let auth = Auth::from_env().unwrap();
    let openai = OpenAI::new(auth, "https://api.openai.com/v1/");
    let body = ChatBody {
        model: "gpt-3.5-turbo".to_string(),
        max_tokens: Some(7),
        temperature: Some(0_f32),
        top_p: Some(0_f32),
        n: Some(2),
        stream: Some(false),
        stop: None,
        presence_penalty: None,
        frequency_penalty: None,
        logit_bias: None,
        user: None,
        messages: vec![Message { role: Role::User, content: "Hello!".to_string() }],
    };
    let rs = openai.chat_completion_create(&body);
    let choice = rs.unwrap().choices;
    let message = &choice[0].message.as_ref().unwrap();
    assert!(message.content.contains("Hello"));
}
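
Image generation

Since this fork's main change is the image-generation API, here is a minimal sketch of an image request. The ImagesBody fields and the image_create call below are assumptions based on the upstream openai_api_rust crate; this fork may expose additional or different fields, so check the crate documentation before relying on it.

use openai_api_rust::*;
use openai_api_rust::images::*;

fn main() {
    // Load the API key from the OPENAI_API_KEY environment variable.
    let auth = Auth::from_env().unwrap();
    let openai = OpenAI::new(auth, "https://api.openai.com/v1/");
    // Hypothetical request body; field names follow the upstream crate and may differ in this fork.
    let body = ImagesBody {
        prompt: "A cute baby sea otter".to_string(),
        n: Some(1),
        size: Some("512x512".to_string()),
        response_format: None,
        user: None,
    };
    let rs = openai.image_create(&body);
    // The response carries URLs (or base64 data) for the generated images.
    println!("{:?}", rs.unwrap());
}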

Use proxy

Load proxy from env

let openai = OpenAI::new(auth, "https://api.openai.com/v1/")
        .use_env_proxy();

Set the proxy manually

let openai = OpenAI::new(auth, "https://api.openai.com/v1/")
        .set_proxy("http://127.0.0.1:1080");

License

This library is distributed under the terms of the MIT license. See LICENSE for details.
