mini-prompt

Crates.io: mini-prompt
lib.rs: mini-prompt
version: 0.0.1
created_at: 2025-06-10 20:44:51 UTC
updated_at: 2025-06-10 20:44:51 UTC
description: Lightweight abstractions for using LLMs via a provider's API.
homepage: https://github.com/twitchyliquid64/mini-prompt
repository: https://github.com/twitchyliquid64/mini-prompt
id: 1707740
size: 112,822
owner: Tom (twitchyliquid64)
documentation: https://docs.rs/mini_prompt

README

mini-prompt

Lightweight abstractions for using LLMs via a provider's API.

Simple calls:

let mut backend = callers::Openrouter::<models::Gemma27B3>::default();
let resp =
    backend.simple_call("How much wood could a wood-chuck chop").await;

If you are looking for more control over the input, you can use `call` instead of `simple_call`.

With tools:

let backend = callers::Anthropic::<models::ClaudeHaiku35>::default();
let mut session = ToolsSession::new(
    backend,
    vec![(
        ToolInfo::new("flubb", "Performs the flubb action.", None).into(),
        Box::new(move |_args| {
            r#"{"status": "success", "message": "flubb completed successfully"}"#
                .to_string()
        }),
    )],
);

let resp =
    session.simple_call("Go ahead and flubb for me").await;

Structured output:

let mut backend = callers::Openrouter::<models::Gemma27B3>::default();
let resp = backend
    .simple_call(
        "What's 2+2? Output the final answer as JSON within triple backticks \
         (a markdown code block with json as the language).",
    )
    .await;

let json = markdown_codeblock(&resp.unwrap(), &MarkdownOptions::json()).unwrap();
let p: serde_json::Value = serde_json_lenient::from_str(&json).expect("json decode");

License: MIT OR Apache-2.0

Commit count: 25
