| Field | Value |
|---|---|
| Crates.io | rsai |
| lib.rs | rsai |
| version | 0.3.0 |
| created_at | 2025-10-31 12:54:04.384588+00 |
| updated_at | 2025-12-10 15:18:12.610165+00 |
| description | Predictable development for unpredictable Gen-AI models. Let the compiler handle the chaos. |
| homepage | https://rsai.dev |
| repository | https://github.com/caluckenbach/rsai |
| max_upload_size | |
| id | 1909995 |
| size | 290,176 |
⚠️ WARNING: This is a pre-release version with an unstable API. Breaking changes may occur between versions. Use with caution and pin to specific versions in production applications.
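Given the unstable API, pinning an exact version in `Cargo.toml` looks like this (0.3.0 matches the version shown above; adjust as needed):

```toml
[dependencies]
# "=" pins an exact version, so `cargo update` won't pull in breaking changes
rsai = "=0.3.0"
```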
This library offers an opinionated feature set rather than trying to be a general-purpose LLM client.
| Provider | API Type | Notes |
|---|---|---|
| OpenAI | Responses API | Uses the `/responses` endpoint for structured interactions. |
| OpenRouter | Responses API | Uses the `/responses` endpoint, supporting a wide range of models. |
```rust
use rsai::{llm, Message, ChatRole, ApiKey, Provider, completion_schema};

#[completion_schema]
struct Analysis {
    sentiment: String,
    confidence: f32,
}

let analysis = llm::with(Provider::OpenAI)
    .api_key(ApiKey::Default)?
    .model("gpt-4o-mini")
    .messages(vec![Message {
        role: ChatRole::User,
        content: "Analyze: 'This library is amazing!'".to_string(),
    }])
    .complete::<Analysis>()
    .await?;
```
The `#[completion_schema]` macro automatically adds the necessary derives (`Deserialize`, `JsonSchema`) and attributes for structured output. It also supports enums, including variants that carry data:
```rust
#[completion_schema]
enum TaskStatus {
    NotStarted,
    InProgress { percentage: u8 },
    Completed { date: String },
    Blocked { reason: String },
}

let status = llm::with(Provider::OpenAI)
    .api_key(ApiKey::Default)?
    .model("gpt-4o-mini")
    .messages(vec![/* ... */])
    .complete::<TaskStatus>()
    .await?;
```
Note: The library automatically handles provider-specific requirements (e.g., wrapping non-object types for OpenAI).
For plain, unstructured text, use `TextResponse`:

```rust
use rsai::{llm, TextResponse, /* ... */};

let response = llm::with(Provider::OpenAI)
    // ... configuration ...
    .complete::<TextResponse>()
    .await?;

println!("{}", response.text);
```
See the `examples/` directory in the repository for more runnable examples.
This project is licensed under the MIT License - see the LICENSE file for details.