| Crates.io | openai_responses |
| lib.rs | openai_responses |
| version | 0.1.6 |
| created_at | 2025-03-12 07:51:48.779583+00 |
| updated_at | 2025-05-02 21:32:25.431713+00 |
| description | Rust SDK for the OpenAI Responses API |
| homepage | https://github.com/m1guelpf/openai-responses-rs |
| repository | https://github.com/m1guelpf/openai-responses-rs |
| max_upload_size | |
| id | 1589457 |
| size | 127,710 |
An unofficial Rust SDK for the OpenAI Responses API.
To get started, create a new `Client` and call the `create` method with a `Request`. The `Request` holds the parameters for the API call, such as the model, instructions, and input. `create` returns a `Response`, which contains the output of the API call.
```rust
use openai_responses::{Client, Request, types::{Input, Model}};

let response = Client::from_env()?.create(Request {
    model: Model::GPT4o,
    input: Input::Text("Are semicolons optional in JavaScript?".to_string()),
    instructions: Some("You are a coding assistant that talks like a pirate".to_string()),
    ..Default::default()
}).await?;

println!("{}", response.output_text());
```
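Since the snippet above uses `.await` and `?` at the top level, it has to run inside an async function on an async runtime. Below is a minimal sketch of a complete program, assuming Tokio as the runtime and boxed errors; neither choice is mandated by the crate.

```rust
use openai_responses::{Client, Request, types::{Input, Model}};

// Sketch only: Tokio and boxed errors are assumptions, not requirements of the crate.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build a client from environment configuration (typically the OpenAI API key).
    let response = Client::from_env()?
        .create(Request {
            model: Model::GPT4o,
            input: Input::Text("Are semicolons optional in JavaScript?".to_string()),
            instructions: Some("You are a coding assistant that talks like a pirate".to_string()),
            ..Default::default()
        })
        .await?;

    // Print the concatenated text output of the response.
    println!("{}", response.output_text());
    Ok(())
}
```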
To stream the response as it is generated, use the `stream` method:
```rust
use openai_responses::{Client, Request};

// You can also build the `Request` struct with a fluent interface
let mut stream = Client::from_env()?.stream(
    Request::builder()
        .model("gpt-4o")
        .input("Are semicolons optional in JavaScript?")
        .instructions("You are a coding assistant that talks like a pirate")
        .build()
);

while let Some(event) = stream.next().await {
    dbg!(event?);
}
```
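As with the first example, the streaming snippet needs an async runtime, and calling `.next()` on the returned stream typically requires the `futures::StreamExt` trait in scope (an assumption about the stream type, not something stated above). A minimal runnable sketch under those assumptions:

```rust
use futures::StreamExt; // assumed: provides `.next()` if the stream implements `futures::Stream`
use openai_responses::{Client, Request};

// Sketch only: Tokio and the `futures` crate are assumed as dependencies.
#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let mut stream = Client::from_env()?.stream(
        Request::builder()
            .model("gpt-4o")
            .input("Are semicolons optional in JavaScript?")
            .instructions("You are a coding assistant that talks like a pirate")
            .build(),
    );

    // Events arrive incrementally as the model generates output; here each
    // one is simply debug-printed, as in the snippet above.
    while let Some(event) = stream.next().await {
        dbg!(event?);
    }

    Ok(())
}
```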
This project is licensed under the MIT License - see the LICENSE file for details.