| Crates.io | mini-prompt |
| lib.rs | mini-prompt |
| version | 0.0.1 |
| created_at | 2025-06-10 20:44:51.687572+00 |
| updated_at | 2025-06-10 20:44:51.687572+00 |
| description | Lightweight abstractions for using LLMs via a provider's API. |
| homepage | https://github.com/twitchyliquid64/mini-prompt |
| repository | https://github.com/twitchyliquid64/mini-prompt |
| max_upload_size | |
| id | 1707740 |
| size | 112,822 |
Lightweight abstractions for using LLMs via a provider's API.
Simple calls:

```rust
let mut backend = callers::Openrouter::<models::Gemma27B3>::default();
let resp =
    backend.simple_call("How much wood could a wood-chuck chop").await;
```
If you need more control over the input, use `call` instead of `simple_call`.
With tools:

```rust
let backend = callers::Anthropic::<models::ClaudeHaiku35>::default();
let mut session = ToolsSession::new(
    backend,
    vec![(
        ToolInfo::new("flubb", "Performs the flubb action.", None).into(),
        Box::new(move |_args| {
            r#"{"status": "success", "message": "flubb completed successfully"}"#
                .to_string()
        }),
    )],
);

let resp =
    session.simple_call("Go ahead and flubb for me").await;
```
Structured output:

```rust
let mut backend = callers::Openrouter::<models::Gemma27B3>::default();
let resp = backend
    .simple_call(
        "What's 2+2? Output the final answer as JSON within triple backticks \
         (a markdown code block with json as the language).",
    )
    .await;

let json = markdown_codeblock(&resp.unwrap(), &MarkdownOptions::json()).unwrap();
let p: serde_json::Value = serde_json_lenient::from_str(&json).expect("json decode");
```
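For illustration, the core of the extraction step can be sketched in plain std-only Rust. This is a hypothetical minimal version, not mini-prompt's actual implementation — `markdown_codeblock` and `MarkdownOptions` exist precisely so you don't have to hand-roll this:

```rust
/// Minimal sketch: return the contents of the first fenced code block
/// tagged with `lang` in a markdown string, if one exists.
fn extract_fenced(md: &str, lang: &str) -> Option<String> {
    let fence = "`".repeat(3); // the ``` delimiter, built to avoid literal backticks
    let open = format!("{fence}{lang}");
    let start = md.find(&open)? + open.len();
    let rest = &md[start..];
    let end = rest.find(&fence)?;
    Some(rest[..end].trim().to_string())
}

fn main() {
    // Simulated model response containing a fenced json block.
    let fence = "`".repeat(3);
    let resp = format!("The answer is:\n{fence}json\n{{\"answer\": 4}}\n{fence}\n");
    let json = extract_fenced(&resp, "json").expect("no json block");
    println!("{json}"); // prints {"answer": 4}
}
```

A real implementation also has to cope with indentation, longer fences, and missing closing fences, which is why deferring to `markdown_codeblock` (paired with the lenient `serde_json_lenient` parser) is the more robust path.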
License: MIT OR Apache-2.0