rustylms - An LM-Studio API wrapper written in Rust

ℹ️ If you are looking for an Ollama API wrapper, consider ollama-rs

⚠️ This project is not finished yet; bugs may occur

This library provides support for LM Studio servers. All features are implemented according to the official documentation.

Feature List

  • Generating completions using chats
  • Retrieving all models from the server

To-Do List

  • Generating completions
    • Supporting streams as responses
  • Creating embeddings

Examples
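
Both examples are asynchronous and rely on the tokio runtime for the #[tokio::main] macro. Below is a minimal Cargo.toml sketch for running them; the tokio version and feature selection are assumptions, not taken from this crate's documentation:

[dependencies]
rustylms = "0.1.0"
# Assumption: "full" enables the multi-threaded runtime and macros.
tokio = { version = "1", features = ["full"] }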

Retrieving models

use rustylms::lmsserver::LMSServer;

#[tokio::main]
async fn main() {
    // Connect to the LM Studio server (1234 is its default port).
    let server = LMSServer::new("http://localhost:1234");

    // Ask the server for every model it currently serves.
    let models = server.get_models().await.expect("Unable to retrieve models");

    println!("{:#?}", models);
}
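
If you would rather not panic when the server is unreachable, the Result returned by get_models can be matched explicitly. This is a minimal sketch using only the API shown above; printing the error with its Debug formatting is an assumption based on the expect call:

use rustylms::lmsserver::LMSServer;

#[tokio::main]
async fn main() {
    let server = LMSServer::new("http://localhost:1234");

    // Handle a failed request instead of panicking.
    match server.get_models().await {
        Ok(models) => println!("{:#?}", models),
        Err(e) => eprintln!("Could not retrieve models: {:?}", e),
    }
}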

Generating a chat completion

use rustylms::{chat::Chat, lmsserver::LMSServer};

#[tokio::main]
async fn main() {
    let server = LMSServer::new("http://localhost:1234");

    // Build the conversation: a system prompt followed by a user message.
    let chat = Chat::new("model-name")
        .system_prompt(
            "You are a helpful assistant that gives information about any programming-related topic.",
        )
        .user_prompt("What is Rust?");

    // Send the chat to the server and wait for the completion.
    let completion = chat
        .get_completions(&server)
        .await
        .expect("could not get completions");
    let message = completion.get_message().unwrap();

    println!("The assistant answered: {}", message.content);
}
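
Because Chat is a builder, the whole flow is easy to wrap in a small helper. The ask function below is a hypothetical convenience wrapper, not part of the rustylms API; it reuses only the calls already shown:

use rustylms::{chat::Chat, lmsserver::LMSServer};

// Hypothetical helper: send one question and print the assistant's answer.
// `ask` is illustrative and not provided by rustylms itself.
async fn ask(server: &LMSServer, question: &str) {
    let chat = Chat::new("model-name")
        .system_prompt("You are a helpful assistant for programming topics.")
        .user_prompt(question);

    let completion = chat
        .get_completions(server)
        .await
        .expect("could not get completions");
    let message = completion.get_message().unwrap();

    println!("The assistant answered: {}", message.content);
}

#[tokio::main]
async fn main() {
    let server = LMSServer::new("http://localhost:1234");

    for question in ["What is Rust?", "What is Cargo?"] {
        ask(&server, question).await;
    }
}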