Crates.io        | llmterm
lib.rs           | llmterm
description      | Your friendly local LLM terminal companion
repository       | https://github.com/timschmidt/llmterm
created_at       | 2025-03-10 03:02:07 UTC
updated_at       | 2025-03-10 03:40:31 UTC
Cargo.toml error | TOML parse error at line 18, column 1: unknown field `autolib`
The kalosm crate is used to interface with the LLM. CUDA support is enabled by default; to choose another inference backend, edit the Cargo.toml file and rebuild, as in the sketch below.
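For example, switching the kalosm dependency from CUDA to Metal acceleration might look roughly like this in Cargo.toml. The version and feature names shown here (cuda, metal, language) are assumptions for illustration; check llmterm's actual Cargo.toml and the kalosm documentation for the exact set.

[dependencies]
# Default: GPU inference via CUDA (assumed feature name)
# kalosm = { version = "0.3", features = ["language", "cuda"] }
# Alternative: Metal acceleration on macOS (assumed feature name)
kalosm = { version = "0.3", features = ["language", "metal"] }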
Build and run from a local checkout of the repository:

cargo build --release
cargo run --release

Or install the published crate from crates.io:

cargo install llmterm

To select a specific model, pass the --model flag:

cargo run --release -- --model llama_3_1_8b_chat
To exit llmterm, press Ctrl-C or type exit or quit.