| Crates.io | lm-studio-api-extended |
| lib.rs | lm-studio-api-extended |
| version | 0.1.3 |
| created_at | 2025-07-27 20:07:38.748718+00 |
| updated_at | 2025-09-10 20:38:17.784448+00 |
| description | Unofficial Rust client for LM Studio with text embedding support. |
| homepage | |
| repository | https://github.com/pmrch/rs-lm-studio-api |
| max_upload_size | |
| id | 1770373 |
| size | 83,177 |
This crate extends rs-lm-studio-api by Bulat Sh., adding text embedding support.
Use it to interact with locally hosted LM Studio models: send prompts, receive completions
(streaming or non-streaming), and generate embeddings.
Add it to your Cargo.toml (the version published alongside this page is 0.1.3):

```toml
[dependencies]
lm-studio-api-extended = "0.1.3"
```
Check out the examples/ directory for working demos. You can run them like this:

```bash
cargo run --example embedding_test
cargo run --example chat_completion
cargo run --example chat_completion_streaming
```
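For context on what the chat examples exercise, the underlying call is a standard OpenAI-style chat completion against LM Studio's local server. The rough sketch below again bypasses the crate and hits `/v1/chat/completions` directly; the port and model id are assumptions.

```rust
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let body = json!({
        "model": "qwen2.5-7b-instruct", // assumed: whichever chat model is loaded in LM Studio
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Say hello in one sentence."}
        ],
        "stream": false // the streaming example flips this and reads server-sent chunks
    });

    let resp: Value = reqwest::Client::new()
        .post("http://localhost:1234/v1/chat/completions")
        .json(&body)
        .send()
        .await?
        .json()
        .await?;

    // The reply text sits at choices[0].message.content in the OpenAI-style response.
    let reply = resp["choices"][0]["message"]["content"]
        .as_str()
        .unwrap_or("")
        .to_string();
    println!("{reply}");
    Ok(())
}
```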
This project is licensed under the MIT License, inheriting from the original work by Bulat Sh. See LICENSE for details.
You can contact the original creator via GitHub or via Telegram (@fuderis).
The library is still evolving, and suggestions and feedback are welcome.
See CHANGELOG.md for version history.