| Field | Value |
|-------|-------|
| Crates.io | rustylms |
| lib.rs | rustylms |
| version | 0.1.0 |
| source | src |
| created_at | 2024-06-22 12:29:38.472981 |
| updated_at | 2024-06-22 12:29:38.472981 |
| description | A library used to communicate with lm-studio servers |
| homepage | |
| repository | https://github.com/BaxoPlenty/rustylms |
| max_upload_size | |
| id | 1280427 |
| size | 22,629 |
ℹ️ If you are looking for an Ollama API wrapper, consider ollama-rs instead.

⚠️ This project is not finished yet! Bugs may occur.
This library provides support for LM Studio servers. All features are implemented according to the official documentation.
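To try the examples, the crate and an async runtime must be declared as dependencies. Here is a minimal `Cargo.toml` sketch, assuming the version published on crates.io and tokio's `macros` and `rt-multi-thread` features, which `#[tokio::main]` requires:

```toml
[dependencies]
# Version taken from the crate metadata above
rustylms = "0.1.0"
# `#[tokio::main]` needs the `macros` and `rt-multi-thread` features
tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
```

With these dependencies in place, the following example retrieves all models available on the server: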
```rust
use rustylms::lmsserver::LMSServer;

#[tokio::main]
async fn main() {
    // Connect to the LM Studio server on its default port
    let server = LMSServer::new("http://localhost:1234");

    // Fetch the list of models the server currently provides
    let models = server
        .get_models()
        .await
        .expect("Unable to retrieve models");

    println!("{:#?}", models);
}
```
The following example builds a chat and requests a completion from the server:

```rust
use rustylms::{chat::Chat, lmsserver::LMSServer};

#[tokio::main]
async fn main() {
    let server = LMSServer::new("http://localhost:1234");

    // Build a chat with a system prompt and an initial user message
    let chat = Chat::new("model-name")
        .system_prompt(
            "You are a helpful assistant that gives information on any programming-related topic.",
        )
        .user_prompt("what is rust?");

    // Ask the server to generate the assistant's reply
    let completion = chat
        .get_completions(&server)
        .await
        .expect("could not get completions");

    let message = completion.get_message().unwrap();

    println!("The assistant answered: {}", message.content);
}
```
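Both examples panic on failure via `expect` and `unwrap`. For anything beyond a quick test, you may prefer to handle errors explicitly. Here is a minimal sketch, assuming (as the calls above suggest) that `get_completions` returns a `Result`, that `get_message` returns an `Option`, and that the error type implements `Debug`:

```rust
use rustylms::{chat::Chat, lmsserver::LMSServer};

#[tokio::main]
async fn main() {
    let server = LMSServer::new("http://localhost:1234");
    let chat = Chat::new("model-name").user_prompt("what is rust?");

    // Match on the result instead of panicking with `expect`
    match chat.get_completions(&server).await {
        Ok(completion) => match completion.get_message() {
            Some(message) => println!("The assistant answered: {}", message.content),
            None => eprintln!("The completion contained no message"),
        },
        Err(err) => eprintln!("Could not get completions: {err:?}"),
    }
}
```

This keeps the program alive when the server is unreachable or returns no message, which matters given the crate's warning that bugs may occur.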