| Crates.io | onellm |
| lib.rs | onellm |
| version | 1.0.3 |
| created_at | 2025-07-10 06:16:05.095511+00 |
| updated_at | 2025-08-13 10:01:45.892779+00 |
| description | Official Rust crate to communicate with the OneLLM API |
| homepage | |
| repository | https://github.com/OneLLM-dev/onellm-crate.git |
| max_upload_size | |
| id | 1745940 |
| size | 52,500 |
This is a Rust client for interacting with the OneLLM API.

Add this to your `Cargo.toml`:

```toml
[dependencies]
onellm = "1.0.3"
```
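The example below is driven by `#[tokio::main]`, so an async runtime must also be declared. A typical dependency line (the feature set here is a common choice, not something the crate mandates):

```toml
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```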
Example usage:

```rust
use onellm::input::{self, Message};

#[tokio::main]
async fn main() {
    // Build the request: endpoint URL, model, message history, and max tokens.
    let output = input::APIInput::new(
        "https://api.deepseek.com/chat/completions".to_string(),
        input::Model::DeepSeekV3,
        vec![Message {
            role: "user".to_string(),
            content: "hi there!".to_string(),
        }],
        200,
    )
    // Send the request with your API key; `send` is async and may fail.
    .send(String::from("YOUR API KEY HERE"))
    .await
    .expect("Error obtaining result");

    println!("Output: {output:#?}");
}
```
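Rather than hardcoding the key as above, you can read it from the environment at startup. A minimal sketch, assuming an environment variable name of your choosing (`ONELLM_API_KEY` here is illustrative, not something the crate defines):

```rust
use std::env;

fn main() {
    // Read the key from the environment, falling back to a placeholder
    // when the variable is unset (e.g. in local development).
    let api_key = env::var("ONELLM_API_KEY")
        .unwrap_or_else(|_| "YOUR API KEY HERE".to_string());
    println!("{api_key}");
}
```

The resulting `api_key` string can then be passed to `.send(api_key)` in place of the hardcoded literal.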