🧠 LM Studio API (Extended)

This crate extends rs-lm-studio-api by Bulat Sh., adding embedding support. Use it to interact with locally hosted LM Studio models: send prompts, receive completions (streaming or not), and now... generate embeddings as well!


Features

  • ✅ Chat completions (fully async, with or without streaming)
  • ✅ Embedding support (POST /v1/embeddings, sync-style usage; see the request sketch after this list)
  • ✅ Minimal setup — just point to your LM Studio server
  • ✅ Customizable prompts, model selection, and context memory
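
For a sense of what the embedding feature does over the wire, here is a minimal raw-HTTP sketch of the request the crate wraps (built with reqwest, tokio, and serde_json). Only the POST /v1/embeddings path comes from the feature list above; the payload shape assumes LM Studio's OpenAI-compatible schema, and the port and model id are placeholders:

// Raw-HTTP sketch of an embeddings request (not this crate's own API).
// Assumes LM Studio's OpenAI-compatible schema on its default port 1234;
// the model id is a placeholder for whatever embedding model you load.
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let resp = reqwest::Client::new()
        .post("http://localhost:1234/v1/embeddings")
        .json(&json!({
            "model": "nomic-embed-text-v1.5", // placeholder model id
            "input": "Hello from Rust"
        }))
        .send()
        .await?
        .json::<serde_json::Value>()
        .await?;

    // In the OpenAI schema, the vector sits at data[0].embedding.
    println!("{}", resp["data"][0]["embedding"]);
    Ok(())
}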

Quick Start

Add it to your Cargo.toml under [dependencies]:

[dependencies]
lm-studio-api-extended = "0.1.3"
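
The chat API is fully async, so your application also needs an async executor. tokio is assumed here (the README does not name a specific runtime), added alongside the dependency above:

tokio = { version = "1", features = ["full"] }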

Examples

Check out the examples/ directory for working demos. You can run them like this:

cargo run --example embedding_test
cargo run --example chat_completion
cargo run --example chat_completion_streaming
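
For context on what the chat demos exercise over the wire, here is a sketch of the non-streaming request behind chat_completion, again as raw HTTP rather than this crate's API. The /v1/chat/completions path, port, and model id are assumptions based on LM Studio's OpenAI-compatible server; the streaming demo adds "stream": true and consumes the reply as server-sent events instead:

// Raw-HTTP sketch of a non-streaming chat completion (not this crate's API).
// Assumes LM Studio's OpenAI-compatible endpoint on its default port 1234;
// the model id is a placeholder for whatever chat model you load.
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let resp = reqwest::Client::new()
        .post("http://localhost:1234/v1/chat/completions")
        .json(&json!({
            "model": "llama-3.2-1b-instruct", // placeholder model id
            "messages": [{ "role": "user", "content": "Say hello." }]
        }))
        .send()
        .await?
        .json::<serde_json::Value>()
        .await?;

    // First choice's message text in the OpenAI schema.
    println!("{}", resp["choices"][0]["message"]["content"]);
    Ok(())
}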

Licensing

This project is licensed under the MIT License, inherited from the original work by Bulat Sh. See LICENSE for details.

Feedback

You can contact the original creator via GitHub or message them on Telegram at @fuderis.
The library is constantly evolving, and suggestions and feedback are welcome.

Versions

See CHANGELOG.md for version history.
