| Crates.io | unia |
| lib.rs | unia |
| version | 0.1.0 |
| created_at | 2026-01-06 00:28:21.865632+00 |
| updated_at | 2026-01-06 00:28:21.865632+00 |
| description | A pragmatic, provider-agnostic Rust LLM client. |
| homepage | |
| repository | https://github.com/geodic/unia |
| max_upload_size | |
| id | 2024861 |
| size | 318,996 |
⚠️ Warning: Heavy Development
This library is currently under active and heavy development. APIs are subject to change, and future updates may introduce breaking changes. Use with caution in production environments.
unia is a pragmatic, provider-agnostic Rust library designed to unify interactions with various Large Language Model (LLM) providers. It abstracts away the differences between APIs (OpenAI, Anthropic, Gemini, etc.) into a single, consistent interface, while providing powerful features like automatic tool execution (Agents) and Model Context Protocol (MCP) integration.
Write your code once and switch providers with a single line of configuration. unia normalizes:
Message, Part, and Response structs.

The Agent struct wraps any Client to provide an autonomous loop:
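As a conceptual sketch of what such an autonomous loop does — call the model, execute any tool it requests, feed the result back, repeat until a final answer — here is a self-contained illustration. The types and names (`ModelTurn`, `run_tool`, etc.) are illustrative only, not unia's actual API:

```rust
// Illustrative agent loop (names here are NOT unia's real types):
// the model either answers or requests a tool; tool results are fed
// back into the history until the model produces a final answer.
enum ModelTurn {
    ToolCall { name: String, arg: i64 },
    Final(String),
}

// Stand-in "model": asks for one tool call, then answers.
fn fake_model(history: &[String]) -> ModelTurn {
    if history.iter().any(|m| m.starts_with("tool:")) {
        ModelTurn::Final(format!("done after {} turns", history.len()))
    } else {
        ModelTurn::ToolCall { name: "double".into(), arg: 21 }
    }
}

// Execute a requested tool locally.
fn run_tool(name: &str, arg: i64) -> i64 {
    match name {
        "double" => arg * 2,
        _ => panic!("unknown tool"),
    }
}

// The loop itself: model -> tool -> model, until a final answer.
fn agent_loop(prompt: &str) -> String {
    let mut history = vec![format!("user: {prompt}")];
    loop {
        match fake_model(&history) {
            ModelTurn::ToolCall { name, arg } => {
                let result = run_tool(&name, arg);
                history.push(format!("tool: {name} -> {result}"));
            }
            ModelTurn::Final(answer) => return answer,
        }
    }
}

fn main() {
    println!("{}", agent_loop("what is 21 doubled?"));
}
```

An Agent in unia presumably automates exactly this plumbing on top of a real Client, so callers only supply tools and a prompt.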
Built-in support for the Model Context Protocol.
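For context, MCP messages are JSON-RPC 2.0 requests; a tool invocation uses the `tools/call` method with a tool name and arguments. The helper below is purely illustrative of that envelope (how unia wires MCP internally is not shown in this README):

```rust
// Sketch of the JSON-RPC 2.0 envelope MCP uses to invoke a tool.
// Field names follow the MCP spec; this helper is illustrative only.
fn mcp_tool_call(id: u64, tool: &str, args_json: &str) -> String {
    format!(
        r#"{{"jsonrpc":"2.0","id":{id},"method":"tools/call","params":{{"name":"{tool}","arguments":{args_json}}}}}"#
    )
}

fn main() {
    // Example request to a hypothetical "get_weather" tool.
    let req = mcp_tool_call(1, "get_weather", r#"{"city":"Paris"}"#);
    println!("{req}");
}
```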
Here is the list of providers currently supported (more to come):
Add this to your Cargo.toml:
```toml
[dependencies]
unia = "0.1.0"
tokio = { version = "1.0", features = ["full"] }
```
```rust
use unia::client::Client;
use unia::model::{Message, Part};
use unia::providers::{OpenAI, Provider};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let api_key = std::env::var("OPENAI_API_KEY")?;

    // Create a client using the provider factory
    let client = OpenAI::create(api_key, "gpt-5".to_string());

    // Build the conversation
    let messages = vec![Message::User(vec![Part::Text {
        content: "Hello!".to_string(),
        finished: true,
    }])];

    // Send the request (no tools)
    let response = client.request(messages, vec![]).await?;

    // Print the response content
    if let Some(msg) = response.data.first() {
        if let Some(content) = msg.content() {
            println!("Response: {}", content);
        }
    }

    Ok(())
}
```
See the examples/ directory for more detailed usage examples.
This project is licensed under the MIT License - see the LICENSE file for details.