kubellm

version: 0.0.1
created_at: 2025-02-05 14:48:57.076347+00
updated_at: 2025-02-05 14:48:57.076347+00
description: KubeLLM is a simple LLM proxy designed to run on Kubernetes or standalone
homepage: https://github.com/bcvanmeurs/kubellm
repository: https://github.com/bcvanmeurs/kubellm
id: 1544066
size: 50,317
owner: Bram van Meurs (bcvanmeurs)

README

KubeLLM

An opinionated LLM proxy written in Rust, designed to run on Kubernetes or standalone.

> [!WARNING]
> Under development, not ready for production.

Design goals

  • An API for calling different LLM providers, based on the OpenAI spec (see the sketch after this list)
  • Transparent configuration in code
  • A way to generate virtual keys in Kubernetes
  • Spend and usage tracking
  • Logging to a local database
  • Prometheus metrics
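
Because the API follows the OpenAI spec, any OpenAI-compatible HTTP call should be able to go through the proxy. Below is a minimal client sketch in Rust; the listen address, the `/v1/chat/completions` path, the model name, and the virtual key are illustrative assumptions, not confirmed details of KubeLLM. It assumes `tokio`, `reqwest` (with the `json` feature), and `serde_json` as dependencies.

```rust
use serde_json::json;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = reqwest::Client::new();

    // Assumes the proxy listens locally and forwards OpenAI-spec chat
    // completion requests to the configured provider.
    let response = client
        .post("http://localhost:8080/v1/chat/completions") // hypothetical address
        .bearer_auth("vk-example-virtual-key")              // hypothetical virtual key
        .json(&json!({
            "model": "gpt-4o-mini",
            "messages": [
                { "role": "user", "content": "Hello from KubeLLM!" }
            ]
        }))
        .send()
        .await?
        .error_for_status()?;

    // Print the raw JSON body; a real client would deserialize it into typed structs.
    println!("{}", response.text().await?);
    Ok(())
}
```

Routing clients through a single proxy endpoint like this is what makes the per-key spend and usage tracking described above possible on the proxy side.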

Personal goals

  • Learning Rust