braintrust-sdk-rust

Crate: braintrust-sdk-rust (crates.io / lib.rs)
Version: 0.1.0-alpha.2
Created: 2025-12-30
Updated: 2026-01-20
Description: Rust SDK for Braintrust logging and tracing
Repository: https://github.com/braintrustdata/braintrust-sdk-rust
Size: 189,312 bytes

README

Rust SDK for Braintrust logging and tracing.

Early Development Notice: This SDK is in early development (alpha). Expect backwards-incompatible changes between releases until we reach 1.0.

Installation

Add this to your Cargo.toml:

[dependencies]
braintrust-sdk-rust = "0.1.0-alpha.2"

Usage

use braintrust_sdk_rust::{BraintrustClient, BraintrustClientConfig, SpanLog};
use serde_json::json;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create a client pointing to your Braintrust instance
    let client = BraintrustClient::new(
        BraintrustClientConfig::new("https://api.braintrust.dev")
    )?;

    // Create a span for logging
    let span = client
        .span_builder("your-api-token", "your-org-id")
        .project_name("my-project")
        .build();

    // Log input/output using span.log()
    span.log(SpanLog {
        input: Some(json!({"prompt": "Hello, world!"})),
        output: Some(json!({"response": "Hi there!"})),
        ..Default::default()
    }).await;

    // Flush the span data to Braintrust
    span.flush().await?;
    client.flush().await?;

    Ok(())
}

Features

  • Span-based logging: Create spans to track LLM calls and other operations
  • Usage metrics extraction: Built-in extractors for OpenAI and Anthropic usage metrics
  • Async-first: Built on Tokio for high-performance async operations
  • Background submission: Logs are submitted in the background to minimize latency

Extracting Usage Metrics

The SDK includes helpers for extracting usage metrics from provider responses:

use braintrust_sdk_rust::{extract_openai_usage, extract_anthropic_usage};
use serde_json::json;

// Extract from OpenAI response
let openai_response = json!({
    "usage": {
        "prompt_tokens": 100,
        "completion_tokens": 50,
        "total_tokens": 150
    }
});
let usage = extract_openai_usage(&openai_response);

// Extract from Anthropic response
let anthropic_response = json!({
    "usage": {
        "input_tokens": 100,
        "output_tokens": 50
    }
});
let usage = extract_anthropic_usage(&anthropic_response);

License

This project is licensed under the Apache License, Version 2.0.
