| Field | Value |
|---|---|
| Crates.io | llm-lsp |
| lib.rs | llm-lsp |
| version | 0.1.1 |
| source | src |
| created_at | 2024-04-19 19:01:32.93758 |
| updated_at | 2024-11-10 19:37:46.695803 |
| description | LSP server to communicate with LLMs |
| homepage | |
| repository | https://github.com/rosarp/llm-lsp |
| max_upload_size | |
| id | 1214016 |
| size | 94,592 |
Language Server Protocol for Large Language Models
Editors can use this as an LSP server for code completions.
- [x] Code completion using codeium.ai
- [x] Asynchronous implementation
- [x] The same binary can be used with different AI models via a CLI option (see the sketch after the editor configuration below)
- [x] Saves multiple configs that can be used by different instances of llm-lsp
Build from source:

```sh
cargo build --release
```

Check the available options and version:

```sh
llm-lsp -h
llm-lsp -V
```
Once llm-lsp is executed, it will guide you through the process of generating an API_KEY (in the case of Codeium) and save the relevant configuration in OS-specific TOML config files.
```sh
llm-lsp generate-config
```
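What exactly gets written is up to llm-lsp; as a purely hypothetical sketch, a saved Codeium entry in the OS-specific TOML config might hold little more than the generated key:

```toml
# Hypothetical sketch only: llm-lsp decides the real file path and key names
# (the file lives in the platform's standard config directory).
[codeium]
api_key = "<API_KEY produced by the guided setup>"
```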
Then start the server, selecting a provider:

```sh
llm-lsp server -p codeium
```
Example editor configuration (Helix `languages.toml`):

```toml
[language-server.llm-lsp]
command = "llm-lsp"
args = ["server", "-p", "codeium"]

[[language]]
name = "rust"
language-servers = [
  "rust-analyzer",
  "llm-lsp",
]
```
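To illustrate the "same binary, multiple configs" features from the list above, a second language-server entry could point the same binary at a different provider; the provider name below is a placeholder, since only codeium is documented here.

```toml
# Hypothetical second instance: the same llm-lsp binary started with a
# different provider ("another-provider" is a placeholder, not a real option).
[language-server.llm-lsp-other]
command = "llm-lsp"
args = ["server", "-p", "another-provider"]
```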
- [ ] More documentation
- [ ] Add chat support on the CLI