ask_llm

version: 0.1.4
created: 2025-03-14
updated: 2025-06-29
description: make a request to whatever LLM is the best these days, without hardcoding model/provider
homepage: https://github.com/valeratrades/ask_llm
repository: https://github.com/valeratrades/ask_llm
author: Valera (valeratrades)
documentation: https://github.com/valeratrades/ask_llm/tree/master/README.md

README

ask_llm


Layer for LLM requests, generic over models and providers.

Usage

Lib

Provides two simple primitives:

the `oneshot` and `conversation` functions, which follow the standard request logic for LLM interactions that most providers share.

The model is then chosen automatically based on whether you care most about cost, speed, or quality. Currently this is expressed by choosing `Model::{Fast, Medium, Slow}`; the concrete model behind each tier is hardcoded in the current implementation.
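The tier-to-model mapping described above can be sketched as follows. This is a minimal illustration only: the model identifiers are placeholder assumptions, not the crate's actual hardcoded picks.

```rust
// Hypothetical sketch of the cost/speed/quality tiers described above.
// The string identifiers are illustrative placeholders, not ask_llm's
// real hardcoded model choices.
#[derive(Debug, Clone, Copy)]
enum Model {
    Fast,   // cheapest / quickest
    Medium, // balanced
    Slow,   // highest quality
}

impl Model {
    // Resolve a tier to a concrete, provider-specific model identifier.
    fn resolve(self) -> &'static str {
        match self {
            Model::Fast => "provider/fast-model",
            Model::Medium => "provider/balanced-model",
            Model::Slow => "provider/best-model",
        }
    }
}

fn main() {
    // The caller only states intent (Fast/Medium/Slow); the mapping to a
    // concrete model stays inside the library, so callers don't hardcode
    // a model or provider.
    println!("{}", Model::Fast.resolve());
}
```

The design point is that the caller expresses a preference, not a model name, so the library can swap in "whatever LLM is the best these days" without breaking callers.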

When used as a lib, import with

ask_llm = { version = "*", default-features = false }

as `clap` would be brought in otherwise; it is only necessary for the CLI part to function.

Cli

Wraps the lib with `clap`. Uses `oneshot` by default; if you need a `conversation`, read/write it from/to JSON files.

Semver

Note that, due to specifics of the implementation, minor version bumps can change effective behavior by changing which model processes the request. Only changes to the API boundary will be marked with major versions.


This repository follows my best practices and Tiger Style (except for "proper capitalization for acronyms" (VsrState, not VSRState) and formatting).

License

Licensed under Blue Oak 1.0.0
Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in this crate by you, as defined in the Apache-2.0 license, shall be licensed as above, without any additional terms or conditions.