llmvm-chat

Crates.io: llmvm-chat
lib.rs: llmvm-chat
Version: 0.1.1
Description: An llmvm frontend that acts as a CLI chat interface.
Repository: https://github.com/djandries/llmvm
Author: Darnell Andries (DJAndries)

README

llmvm-chat


An llmvm frontend that acts as a CLI chat interface.

Demo

[asciinema demo recording]

Installation

Install this application using cargo.

cargo install llmvm-chat

The llmvm core must be installed. If you have not done so, the core may be installed via

cargo install llmvm-core

A backend must be installed and configured. The llmvm-outsource backend is recommended for OpenAI requests. Currently, the default model preset is gpt-3.5-chat, which uses this backend.
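If that backend is published to crates.io as llmvm-outsource (check the llmvm repository for the authoritative install and configuration steps), it can likely be installed the same way:

    cargo install llmvm-outsource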

Usage

Run llmvm-chat to use the interface. Press CTRL-C when you are finished with your chat. The chat thread will be persisted, and the thread ID will be printed.

Use the -h flag to see all options.

Use the -l flag to load the last chat thread.
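A typical session might look like the following (flag behavior as described above; run llmvm-chat -h to confirm the options available in your version):

    # start a new chat; the thread ID is printed when you exit with CTRL-C
    llmvm-chat

    # resume the most recently persisted chat thread
    llmvm-chat -l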

Configuration

Run the chat executable to generate a configuration file at:

  • Linux: ~/.config/llmvm/chat.toml
  • macOS: ~/Library/Application Support/com.djandries.llmvm/chat.toml
  • Windows: AppData\Roaming\djandries\llmvm\config\chat.toml
Key         Required?  Description
stdio_core  No         Stdio client configuration for communicating with llmvm core. See llmvm-protocol for details.
http_core   No         HTTP client configuration for communicating with llmvm core. See llmvm-protocol for details.
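As a rough sketch, pointing the frontend at an HTTP core could look like the file below. The section and field names here are illustrative assumptions only; consult llmvm-protocol for the actual client configuration schema.

    # ~/.config/llmvm/chat.toml (illustrative sketch, not the authoritative schema)
    [http_core]
    # hypothetical field name; see llmvm-protocol for the real HTTP client options
    base_url = "http://localhost:8080"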

License

Mozilla Public License, version 2.0
