openai_responses

Description: Rust SDK for the OpenAI Responses API
Homepage: https://github.com/m1guelpf/openai-responses-rs
Repository: https://github.com/m1guelpf/openai-responses-rs
Author: Miguel Piedrafita (m1guelpf)

README

OpenAI Responses SDK

An unofficial Rust SDK for the OpenAI Responses API.

Usage

To get started, create a `Client` and call its `create` method with a `Request`. The `Request` holds the parameters for the API call, such as the model, instructions, and input; `create` returns a `Response` containing the output of the call.

use openai_responses::{Client, Request, types::{Input, Model}};

let response = Client::from_env()?.create(Request {
    model: Model::GPT4o,
    input: Input::Text("Are semicolons optional in JavaScript?".to_string()),
    instructions: Some("You are a coding assistant that talks like a pirate".to_string()),
    ..Default::default()
}).await?;

println!("{}", response.output_text());
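
The snippet above omits its surroundings. A minimal end-to-end sketch, assuming the tokio runtime, that `Client::from_env` reads the API key from the environment (presumably `OPENAI_API_KEY`), and that the SDK's error types box into `std::error::Error`:

use openai_responses::{Client, Request, types::{Input, Model}};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // `from_env` builds the client from environment configuration
    // (presumably the OPENAI_API_KEY variable).
    let client = Client::from_env()?;

    let response = client
        .create(Request {
            model: Model::GPT4o,
            input: Input::Text("Are semicolons optional in JavaScript?".to_string()),
            instructions: Some("You are a coding assistant that talks like a pirate".to_string()),
            ..Default::default()
        })
        .await?;

    // Print the text output of the response.
    println!("{}", response.output_text());
    Ok(())
}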

To stream the response as it is generated, use the `stream` method:

use openai_responses::{Client, Request};

// You can also build the `Request` struct with a fluent interface
let mut stream = Client::from_env()?.stream(
    Request::builder()
        .model("gpt-4o")
        .input("Are semicolons optional in JavaScript?")
        .instructions("You are a coding assistant that talks like a pirate")
        .build()
);

while let Some(event) = stream.next().await {
    dbg!(event?);
}
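
Like the first example, this is a fragment: `.next()` comes from a `Stream` extension trait, and the loop has to run inside an async context. A minimal sketch, assuming the returned stream implements `futures::Stream` (so `futures::StreamExt` supplies `.next()`) and that each item is a `Result` whose error boxes into `std::error::Error`, wrapped in a hypothetical helper function:

use futures::StreamExt; // brings `.next()` into scope for `Stream` implementors
use openai_responses::{Client, Request};

// Hypothetical helper: drains the event stream, printing each event as it arrives.
async fn print_stream() -> Result<(), Box<dyn std::error::Error>> {
    let mut stream = Client::from_env()?.stream(
        Request::builder()
            .model("gpt-4o")
            .input("Are semicolons optional in JavaScript?")
            .build(),
    );

    // Stop on the first error; otherwise dump each event for inspection.
    while let Some(event) = stream.next().await {
        dbg!(event?);
    }

    Ok(())
}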

License

This project is licensed under the MIT License - see the LICENSE file for details.
