| Field | Value |
| --- | --- |
| Crates.io | openllm |
| lib.rs | openllm |
| version | 0.5.2 |
| created_at | 2025-02-16 09:19:17.994164+00 |
| updated_at | 2025-05-06 09:25:48.340961+00 |
| description | A simple SDK for OpenAI compatible API. |
| homepage | https://github.com/thlstsul/llm-sdk |
| repository | https://github.com/thlstsul/llm-sdk |
| max_upload_size | |
| id | 1557500 |
| size | 243,601 |
A simple SDK for OpenAI-compatible APIs.

Add `openllm` to your project with `cargo add openllm`.

Since the Assistant API is still in beta and quite slow, we have no plans to support it (or the related file APIs) for now.
Here are some examples of how to use the SDK:
```rust
use openllm::{ChatCompleteModel, ChatCompletionMessage, ChatCompletionRequest, LlmSdk};

// Point the SDK at any OpenAI-compatible endpoint.
let sdk = LlmSdk::new_with_base_url("your-api-key", "https://api.deepseek.com");

// chat completion
let messages = vec![
    ChatCompletionMessage::new_system("I can answer any question you ask me.", ""),
    ChatCompletionMessage::new_user("What is human life expectancy in the world?", "user1"),
];
let req = ChatCompletionRequest::new(ChatCompleteModel::DeepSeekChat, messages);
let res = sdk.chat_completion(req).await?;

// stream: `chat_completion` took ownership of the request above, so build a fresh one
let messages = vec![
    ChatCompletionMessage::new_system("I can answer any question you ask me.", ""),
    ChatCompletionMessage::new_user("What is human life expectancy in the world?", "user1"),
];
let req = ChatCompletionRequest::new(ChatCompleteModel::DeepSeekChat, messages);
sdk.chat_stream(req, |msg| {
    println!("Received message: {:?}", msg.choices);
})
.await?;
```
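Both calls use `.await`, so they must run inside an async context. Below is a minimal sketch of a complete program; the tokio runtime and anyhow error type are assumptions made for this example (not dependencies required by the crate), and it assumes the SDK's error type converts into `anyhow::Error` via `?`.

```rust
// Minimal runnable sketch. Assumes `tokio` (with the "macros" and "rt-multi-thread"
// features) and `anyhow` are added as dependencies; only the openllm items already
// shown above are used.
use openllm::{ChatCompleteModel, ChatCompletionMessage, ChatCompletionRequest, LlmSdk};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let sdk = LlmSdk::new_with_base_url("your-api-key", "https://api.deepseek.com");

    let messages = vec![ChatCompletionMessage::new_user(
        "What is human life expectancy in the world?",
        "user1",
    )];
    let req = ChatCompletionRequest::new(ChatCompleteModel::DeepSeekChat, messages);

    // Non-streaming completion; inspect `_res` as needed.
    let _res = sdk.chat_completion(req).await?;

    Ok(())
}
```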
For more usage examples, please check the test cases.