| | |
|---|---|
| Crates.io | foundry-local |
| lib.rs | foundry-local |
| version | 0.1.0 |
| created_at | 2025-05-27 16:24:47 UTC |
| updated_at | 2025-05-30 17:01:22 UTC |
| description | SDK for Microsoft Foundry Local service |
| repository | https://github.com/microsoft/Foundry-Local |
| id | 1691317 |
| size | 97,728 |
A Rust SDK for interacting with the Microsoft Foundry Local service, which lets you manage and run AI models locally on your device. See the Foundry Local documentation for more information.
```rust
use anyhow::Result;
use foundry_local::FoundryLocalManager;

#[tokio::main]
async fn main() -> Result<()> {
    // Create a FoundryLocalManager instance, automatically downloading
    // the model and starting the service if needed.
    let mut manager = FoundryLocalManager::builder()
        .alias_or_model_id("phi-3.5-mini")
        .bootstrap(true)
        .build()
        .await?;

    // Resolve the alias to the concrete model registered with the service.
    let model_info = manager.get_model_info("phi-3.5-mini", true).await?;
    let prompt = "What is the golden ratio?";

    // Use the OpenAI-compatible API to interact with the model.
    let client = reqwest::Client::new();
    let response = client
        .post(format!("{}/chat/completions", manager.endpoint()?))
        .json(&serde_json::json!({
            "model": model_info.id,
            "messages": [{"role": "user", "content": prompt}],
        }))
        .send()
        .await?;

    let result = response.json::<serde_json::Value>().await?;
    println!("{}", result["choices"][0]["message"]["content"]);
    Ok(())
}
```
Add the following to your `Cargo.toml` (the example above also uses `anyhow`, `reqwest`, `serde_json`, and `tokio`):

```toml
[dependencies]
foundry-local = "0.1.0"
anyhow = "1"
reqwest = { version = "0.12", features = ["json"] }
serde_json = "1"
tokio = { version = "1", features = ["full"] }
```
Foundry Local must be installed on the machine. On Windows, it can be installed with winget:

```shell
winget install Microsoft.FoundryLocal
```
Licensed under the MIT License.