llm-observatory-collector

version: 0.1.1
description: OpenTelemetry collector with LLM-specific processors for LLM Observatory
repository: https://github.com/globalbusinessadvisors/llm-observatory
documentation: https://docs.llm-observatory.io
owner: GBA (globalbusinessadvisors)

README

llm-observatory-collector

OpenTelemetry collector with LLM-specific processors for LLM Observatory.

Overview

A high-performance telemetry collector designed specifically for LLM applications:

  • OTLP Receiver: gRPC and HTTP endpoints for OpenTelemetry data
  • LLM-Aware Processing: Token counting, cost calculation, PII redaction
  • Intelligent Sampling: Head and tail sampling strategies (see the sketch after this list)
  • Context Propagation: Distributed tracing support
  • Metric Enrichment: Automatic LLM-specific metadata
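
As a rough illustration of the head-sampling idea, the Rust sketch below makes a keep/drop decision from a hash of the trace ID so that the same trace is sampled consistently. The function name, hashing scheme, and threshold logic are assumptions for illustration only, not code taken from this crate.

// Hypothetical head-sampling sketch, not the crate's actual implementation:
// a span is kept when a hash of its trace ID falls below a threshold derived
// from the configured sampling ratio.
use std::collections::hash_map::DefaultHasher;
use std::hash::{Hash, Hasher};

fn keep_span(trace_id: &str, sample_ratio: f64) -> bool {
    let mut hasher = DefaultHasher::new();
    trace_id.hash(&mut hasher);
    // Map the hash onto [0, 1) and compare against the configured ratio.
    let bucket = (hasher.finish() % 10_000) as f64 / 10_000.0;
    bucket < sample_ratio
}

fn main() {
    // Keep roughly 25% of traces, decided once at the head of the trace.
    let kept = keep_span("4bf92f3577b34da6a3ce929d0e0e4736", 0.25);
    println!("sampled: {kept}");
}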

Features

  • 100k+ spans/sec: High-throughput processing
  • PII Redaction: Optional privacy-preserving transformations
  • Cost Calculation: Real-time LLM cost tracking (a worked example follows this list)
  • Multiple Backends: TimescaleDB, Tempo, Loki support
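
The cost-calculation feature can be pictured as token counts multiplied by per-model rates. The Rust sketch below shows that arithmetic; the ModelPricing struct, field names, and the prices used in main are hypothetical placeholders, not rates or types shipped with the crate.

// Illustrative cost calculation only; prices below are placeholders.
struct ModelPricing {
    input_per_1k: f64,  // USD per 1k prompt tokens
    output_per_1k: f64, // USD per 1k completion tokens
}

fn span_cost(pricing: &ModelPricing, prompt_tokens: u64, completion_tokens: u64) -> f64 {
    (prompt_tokens as f64 / 1000.0) * pricing.input_per_1k
        + (completion_tokens as f64 / 1000.0) * pricing.output_per_1k
}

fn main() {
    let pricing = ModelPricing { input_per_1k: 0.0005, output_per_1k: 0.0015 };
    // e.g. a span recording 1,200 prompt tokens and 300 completion tokens
    println!("cost: ${:.6}", span_cost(&pricing, 1200, 300));
}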

Usage

# Run collector
llm-observatory-collector --config config.yaml

Configuration:

receivers:
  otlp:
    grpc:
      endpoint: 0.0.0.0:4317
    http:
      endpoint: 0.0.0.0:4318

processors:
  llm_enrichment:
    enable_cost: true
    enable_pii_redaction: false

exporters:
  timescaledb:
    connection_string: postgres://...
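
For a sense of how a configuration block like llm_enrichment maps onto typed settings, the Rust sketch below deserializes it with serde and serde_yaml. The LlmEnrichmentConfig struct is an assumed shape for illustration, not the crate's actual configuration type.

// Minimal sketch assuming serde (with the "derive" feature) and serde_yaml
// as dependencies; the struct layout mirrors the YAML above but is an
// assumption, not the crate's real config types.
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct LlmEnrichmentConfig {
    enable_cost: bool,
    enable_pii_redaction: bool,
}

fn main() -> Result<(), serde_yaml::Error> {
    let yaml = "enable_cost: true\nenable_pii_redaction: false\n";
    let cfg: LlmEnrichmentConfig = serde_yaml::from_str(yaml)?;
    println!("{cfg:?}");
    Ok(())
}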

Documentation

See the collector documentation at https://docs.llm-observatory.io for detailed configuration options.

License

Apache-2.0
