| Crates.io | llmur |
| lib.rs | llmur |
| version | 0.0.4 |
| created_at | 2024-07-17 17:16:28.035584+00 |
| updated_at | 2026-01-03 23:08:19.399286+00 |
| description | Lightweight LLM Proxy - Binary application for running an LLM proxy server |
| homepage | https://github.com/llmur/llmur |
| repository | https://github.com/llmur/llmur |
| max_upload_size | |
| id | 1306357 |
| size | 505,395 |
LLMUR (Lightweight LLM Proxy) is a self-hostable proxy service that provides a unified interface for interacting with multiple Large Language Model (LLM) providers. It offers OpenAI-compatible API endpoints while adding powerful features like rate limiting, load balancing, and multi-tenant management.
LLMUR acts as a middleware layer between your applications and LLM providers, giving you:
- A single, consistent API (e.g. `/v1/chat/completions`) regardless of the underlying provider

Clone the repository (if you haven't already):
```shell
git clone https://github.com/llmur/llmur.git
cd llmur
```
Create a configuration file (`config.yaml`):

```yaml
application_secret: your-secret-here
log_level: info
host: 0.0.0.0
port: 8082
database_configuration:
  engine: postgres
  host: db
  port: 5432
  database: llmur
  username: postgres
  password: postgres
cache_configuration:
  engine: redis
  host: cache
  port: 6379
  username: default
  password: redispassword
```
Start the services:

```shell
docker-compose up -d db cache
```
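The `db` and `cache` service names used here (and as hosts in `config.yaml`) are assumed to be defined in a `docker-compose.yaml` in the repository. A minimal sketch of what such a file could look like, matching the credentials in the configuration above (image tags are assumptions; note that if the proxy runs directly on the host rather than inside the Compose network, the `host` fields in `config.yaml` would point at `localhost` instead of the service names):

```yaml
services:
  db:
    image: postgres:16            # assumption: any recent Postgres release
    environment:
      POSTGRES_DB: llmur
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
    ports:
      - "5432:5432"
  cache:
    image: redis:7                # assumption: any recent Redis release
    command: ["redis-server", "--requirepass", "redispassword"]
    ports:
      - "6379:6379"
```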
Build and run the proxy:

```shell
cargo build --release
./target/release/llmur --configuration config.yaml
```
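Once the proxy is running, any OpenAI-compatible client can point at it. A minimal sketch using only the Python standard library (the base URL matches the `port` in `config.yaml` above; the API key and model name are placeholders for whatever your LLMUR deployment issues and routes):

```python
import json
import urllib.request

# Hypothetical values: adjust the host/port and key to your deployment.
BASE_URL = "http://localhost:8082/v1"   # port taken from config.yaml above
API_KEY = "your-llmur-api-key"          # assumption: a key issued by LLMUR

# Standard OpenAI-style chat completions payload.
payload = {
    "model": "gpt-4o-mini",  # assumption: LLMUR maps this to a configured provider
    "messages": [{"role": "user", "content": "Hello!"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)

# Requires a running proxy; uncomment to actually send the request:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the endpoint shape is OpenAI-compatible, existing SDKs can generally be reused by overriding their base URL to point at the proxy.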
For more information, take a look at the docs.