| Crates.io | rsoai |
| lib.rs | rsoai |
| version | 0.1.0 |
| created_at | 2025-05-10 08:09:55.655645+00 |
| updated_at | 2025-05-10 08:09:55.655645+00 |
| description | Rust client for the OpenAI API with multimodal and schema-based output |
| homepage | https://github.com/San-Juna-H/rsoai |
| repository | https://github.com/San-Juna-H/rsoai |
| max_upload_size | |
| id | 1668188 |
| size | 61,833 |
rsoai is a Rust client library for the OpenAI API, offering simple, structured, and multimodal (text + image) input support with robust error handling and JSON schema-based output deserialization.
Add rsoai to your Cargo.toml:
cargo add rsoai --path path/to/rsoai
cargo add dotenv
Set your OpenAI API key in a .env file:
OPENAI_API_KEY=your_openai_api_key_here
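Under the hood, dotenv reads KEY=VALUE lines from the .env file into the process environment. A minimal std-only sketch of that line parsing (the helper name parse_env_line is illustrative, not part of rsoai or dotenv):

```rust
/// Parse a single `KEY=VALUE` line from a .env file.
/// Returns None for blank lines and `#` comments.
fn parse_env_line(line: &str) -> Option<(String, String)> {
    let line = line.trim();
    if line.is_empty() || line.starts_with('#') {
        return None;
    }
    let (key, value) = line.split_once('=')?;
    Some((key.trim().to_string(), value.trim().to_string()))
}

fn main() {
    let parsed = parse_env_line("OPENAI_API_KEY=sk-test-123");
    println!("{:?}", parsed);
}
```

In practice you would call dotenv::dotenv().ok() once at startup and then read the key with std::env::var("OPENAI_API_KEY").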
use rsoai::openai::{OpenAIChatClient, ChatClient};

fn main() {
    let mut client = OpenAIChatClient::new("gpt-4o");
    client.push_user_message("What is the capital of Japan?");
    match client.run_text() {
        Ok(reply) => println!("Assistant: {}", reply),
        Err(e) => eprintln!("Error: {}", e),
    }
}
let mut client = OpenAIChatClient::new("gpt-4o");
client.put_text("user", "What do you see in this image?");
client.put_image("user", "https://example.com/image.jpg");
let result = client.run_text();
First, define the attribute macro in a proc-macro crate:
use proc_macro::TokenStream;
use quote::quote;
use syn::{parse_macro_input, ItemStruct};

/// Attribute macro to add common derives and serde settings.
#[proc_macro_attribute]
pub fn schema_struct(_attr: TokenStream, item: TokenStream) -> TokenStream {
    let input = parse_macro_input!(item as ItemStruct);
    let ident = &input.ident;
    let generics = &input.generics;
    let fields = &input.fields;
    let vis = &input.vis;
    let attrs = &input.attrs;

    let expanded = quote! {
        #[derive(Debug, serde::Serialize, serde::Deserialize, schemars::JsonSchema)]
        #[serde(deny_unknown_fields)]
        #(#attrs)*
        #vis struct #ident #generics #fields
    };

    TokenStream::from(expanded)
}
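For reference, applying the macro to a struct expands to roughly the following (a sketch of the generated code derived from the quote! body above; it compiles only alongside the serde and schemars crates):

```rust
#[derive(Debug, serde::Serialize, serde::Deserialize, schemars::JsonSchema)]
#[serde(deny_unknown_fields)]
struct Ingredient {
    name: String,
    quantity: String,
}
```

The deny_unknown_fields setting makes deserialization fail on unexpected keys, which keeps the model's JSON output strictly aligned with the declared schema.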
Use it to deserialize OpenAI output into typed Rust structs:
#[schema_struct]
struct Ingredient {
    name: String,
    quantity: String,
}

#[schema_struct]
struct Recipe {
    title: String,
    ingredients: Vec<Ingredient>,
}
Then run the structured query:
client.set_structured_mode::<Recipe>("Recipe");
client.push_user_message("Give me a French toast recipe.");
let recipe: Recipe = client.run_structured().unwrap();
println!("{:#?}", recipe);
Switch between modes during the session:
client.set_plain_text(); // Free-form text output
client.set_structured_mode::<T>("SchemaName"); // Structured JSON output
Track all interactions:
for msg in client.history() {
    println!("{}: {:?}", msg.role, msg.content);
}
All errors are typed via ChatError:
enum ChatError {
    MissingApiKey,
    RequestFailed,
    ApiError,
    ReadBodyFailed,
    JsonParseFailed,
    NoOutputText,
}
Use the standard ? operator or match to handle errors cleanly.
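For an error enum like ChatError to compose with ? and Box<dyn Error>, it would typically implement Display and std::error::Error. A hedged std-only sketch (the message strings are illustrative, not rsoai's actual wording):

```rust
use std::fmt;

#[derive(Debug, PartialEq)]
enum ChatError {
    MissingApiKey,
    RequestFailed,
    ApiError,
    ReadBodyFailed,
    JsonParseFailed,
    NoOutputText,
}

impl fmt::Display for ChatError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Map each variant to a human-readable message.
        let msg = match self {
            ChatError::MissingApiKey => "OPENAI_API_KEY is not set",
            ChatError::RequestFailed => "HTTP request failed",
            ChatError::ApiError => "OpenAI API returned an error",
            ChatError::ReadBodyFailed => "failed to read response body",
            ChatError::JsonParseFailed => "failed to parse response JSON",
            ChatError::NoOutputText => "response contained no output text",
        };
        write!(f, "{}", msg)
    }
}

impl std::error::Error for ChatError {}

fn main() {
    let err = ChatError::MissingApiKey;
    println!("Error: {}", err);
}
```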
rsoai/
├── src/
│ ├── ioschema.rs # Schema, types, error handling
│ └── openai.rs # Client logic
project/
└── src/main.rs # Examples
.env file with OPENAI_API_KEY
MIT License © 2025
See LICENSE for details.