| Crates.io | lm-studio-api |
| lib.rs | lm-studio-api |
| version | 0.1.5 |
| created_at | 2025-05-26 22:48:45.081921+00 |
| updated_at | 2025-11-13 11:53:47.79034+00 |
| description | This API is designed for interacting with LM Studio |
| homepage | |
| repository | https://github.com/fuderis/rs-lm-studio-api |
| max_upload_size | |
| id | 1690346 |
| size | 103,134 |
`lm-studio-api` is a high-performance, user-friendly library for interacting with locally running Llama-based language models via LM Studio. It lets you send requests to models, receive responses either in full or in streaming mode, and manage model parameters.
- Support for both regular and streaming response modes.
- Context management and system prompt customization.
- Flexible configuration of request and model parameters.
- Structured responses via JSON schemas.
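Under the hood, LM Studio exposes an OpenAI-compatible HTTP server, so a chat request ultimately corresponds to a JSON payload roughly like the following. This is only a sketch following the OpenAI chat-completions convention; the exact body this crate sends is an assumption, not taken from its source:

```json
{
  "model": "gemma-3-4b",
  "messages": [
    { "role": "system", "content": "You're Jarvis ..." },
    { "role": "user", "content": "What is shown in the picture?" }
  ],
  "stream": true,
  "max_tokens": 8192
}
```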
```rust
use lm_studio_api::prelude::*;

/// The system prompt
struct SystemPrompt;

impl SystemInfo for SystemPrompt {
    fn new() -> Box<Self> {
        Box::new(Self {})
    }

    fn update(&mut self) -> String {
        format!(r##"
            You're Jarvis, a personal assistant created by the programmer 'Fuderis'.
            Respond briefly and clearly.
            Response language: English.

            Actual system info:
            * datetime: 1969-10-29 22:30:00.
            * location: Russian Federation, Moscow.
        "##)
    }
}

#[tokio::main]
async fn main() -> Result<()> {
    // init chat:
    let mut chat = Chat::new(
        Model::Gemma3_4b,                        // AI model
        Context::new(SystemPrompt::new(), 8192), // system prompt + max tokens
        9090,                                    // server port
    );

    // generate the request:
    let request = Messages {
        messages: vec![
            Message {
                role: Role::User,
                content: vec![
                    Content::Text { text: "What is shown in the picture?".into() },
                    Content::Image { image_url: Image::from_file("rust-logo.png").unwrap() }
                ]
            }
        ],
        context: true,
        stream: true,
        /* format: Some(Format::json(
            "commands",
            vec![
                Schema::object(
                    "datetime",
                    "returns the actual datetime",
                    macron::hash_map! {
                        "time": Schema::string("only time", Some("time")),
                        "date": Schema::string("only date", Some("date")),
                    }
                ),
                Schema::object(
                    "location",
                    "returns the user's geolocation",
                    macron::hash_map! {
                        "location": Schema::string("user geolocation", None),
                    }
                ),
            ],
            false
        )), */
        ..Default::default()
    };

    // send the request:
    let _ = chat.send(request.into()).await?;

    // read the streamed results:
    while let Some(result) = chat.next().await {
        match result {
            Ok(r) => if let Some(text) = r.text() { eprint!("{text}"); },
            Err(e) => eprintln!("Error: {e}"),
        }
    }

    Ok(())
}
```
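The streaming loop above prints each chunk as it arrives. If you also want the complete reply at the end, you can accumulate the chunks into a `String`. A minimal sketch, with placeholder chunks standing in for the pieces yielded by `chat.next()`:

```rust
fn main() {
    // Placeholder chunks; in real use these come from `chat.next()`.
    let chunks = ["Hello", ", ", "world", "!"];

    let mut full_reply = String::new();
    for chunk in chunks {
        print!("{chunk}");          // stream each chunk to the user as it arrives
        full_reply.push_str(chunk); // keep the complete reply for later use
    }
    println!();

    assert_eq!(full_reply, "Hello, world!");
}
```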
Distributed under the MIT license.
You can find me here; also see my channel. I welcome your suggestions and feedback!
Copyright (c) 2025 Bulat Sh. (fuderis)