| Field | Value |
|---|---|
| Crates.io | llmvm-chat |
| lib.rs | llmvm-chat |
| version | 0.1.1 |
| source | src |
| created_at | 2023-08-09 20:16:05.526623 |
| updated_at | 2024-01-23 07:45:58.014274 |
| description | An llmvm frontend that acts as a CLI chat interface. |
| homepage | |
| repository | https://github.com/djandries/llmvm |
| max_upload_size | |
| id | 940334 |
| size | 52,517 |
An llmvm frontend that acts as a CLI chat interface.
Install this application using cargo:

```
cargo install llmvm-chat
```
The llmvm core must be installed. If you have not done so, the core may be installed via:

```
cargo install llmvm-core
```
A backend must be installed and configured. The llmvm-outsource backend is recommended for OpenAI requests. Currently, the default model preset is gpt-3.5-chat, which uses this backend.
Run `llmvm-chat` to use the interface. Press CTRL-C when you are finished with your chat. The chat thread will be persisted, and the thread ID will be output.

Use the `-h` flag to see all options.

Use the `-l` flag to load the last chat thread.
Run the chat executable to generate a configuration file at:

- Linux: `~/.config/llmvm/chat.toml`
- macOS: `~/Library/Application Support/com.djandries.llmvm/chat.toml`
- Windows: `AppData\Roaming\djandries\llmvm\config\chat.toml`
| Key | Required? | Description |
|---|---|---|
| `stdio_core` | No | Stdio client configuration for communicating with llmvm core. See llmvm-protocol for details. |
| `http_core` | No | HTTP client configuration for communicating with llmvm core. See llmvm-protocol for details. |
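As a rough illustration, a `chat.toml` that points the frontend at an llmvm core over HTTP might look like the sketch below. Only the `http_core` table name comes from the table above; the `base_url` field name and port are placeholder assumptions, so consult llmvm-protocol for the actual client configuration schema.

```toml
# Example chat.toml (Linux path: ~/.config/llmvm/chat.toml)
# Both stdio_core and http_core are optional; omit both to use defaults.

[http_core]
# Hypothetical field name -- see llmvm-protocol for the real schema.
base_url = "http://127.0.0.1:8080"
```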