mecha10-video

WebRTC-based low-latency video streaming for robotics applications.

Features

  • WebRTC Streaming: H.264 video encoding with OpenH264
  • Multi-client Support: Broadcast frames to multiple WebRTC connections
  • Low Latency: 30-50ms glass-to-glass latency
  • Zero-copy Frame Sharing: Efficient frame distribution using Arc
  • Optional Diagnostics: Enable with diagnostics feature flag

Architecture

┌─────────────┐
│   Source    │ (Godot, Camera, etc.)
│  (Frames)   │
└──────┬──────┘
       │ mpsc::channel
       ▼
┌─────────────────────┐
│  WebRTCServer       │
│  (Frame Broadcast)  │
└──────┬──────────────┘
       │ broadcast::channel
       ├─────────┬─────────┐
       ▼         ▼         ▼
  ┌────────┐ ┌────────┐ ┌────────┐
  │ Conn 1 │ │ Conn 2 │ │ Conn 3 │
  │ (H.264)│ │ (H.264)│ │ (H.264)│
  └────┬───┘ └────┬───┘ └────┬───┘
       │          │          │
       ▼          ▼          ▼
   Browser    Browser    Browser

Usage

Basic Example

use mecha10_video::{CameraFrame, ImageFormat, WebRTCServer};
use mecha10_video::signaling::start_signaling_server;
use tokio::sync::mpsc;
use std::sync::Arc;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create frame channel
    let (frame_tx, frame_rx) = mpsc::channel(10);

    // Create WebRTC server (without diagnostics)
    let webrtc_server = WebRTCServer::new(frame_rx).await?;

    // Start signaling server for SDP exchange
    let server = Arc::new(webrtc_server);
    tokio::spawn(async move {
        start_signaling_server(11010, server).await.unwrap();
    });

    // Send frames to WebRTC server
    let frame = CameraFrame {
        camera_id: "camera0".to_string(),
        width: 640,
        height: 480,
        timestamp: 0,
        image_bytes: Arc::new(vec![0u8; 640 * 480 * 3]),
        format: ImageFormat::Rgb,
    };
    frame_tx.send(frame).await?;

    Ok(())
}

With Diagnostics

Enable the diagnostics feature in your Cargo.toml:

[dependencies]
mecha10-video = { path = "../video", features = ["diagnostics"] }

Then create the server with diagnostics:

use mecha10_diagnostics::prelude::StreamingCollector;
use std::sync::Arc;

let diagnostics = Arc::new(StreamingCollector::new("my-node"));
let webrtc_server = WebRTCServer::new(frame_rx, diagnostics).await?;

Dual-Path Publishing (CameraPublisher)

For nodes that need to publish frames to multiple destinations (e.g., WebRTC for dashboards + Redis for telemetry), use CameraPublisher:

use mecha10_diagnostics::prelude::StreamingCollector;
use mecha10_video::{CameraPublisher, ImageFormat, WebRTCServer};
use tokio::sync::mpsc;
use std::sync::Arc;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create channels (the Redis receiver would be drained by a
    // separate telemetry task, omitted here)
    let (webrtc_tx, webrtc_rx) = mpsc::channel(1);
    let (redis_tx, _redis_rx) = mpsc::channel(50);

    // Create WebRTC server
    let diagnostics = Arc::new(StreamingCollector::new("my-node"));
    let webrtc_server = WebRTCServer::new(webrtc_rx, diagnostics.clone()).await?;

    // Create camera publisher for dual-path streaming
    let camera_publisher = CameraPublisher::new(webrtc_tx, diagnostics)
        .with_secondary_publisher(redis_tx);

    // Publish frame (zero-copy Arc sharing between WebRTC and Redis!)
    let stats = camera_publisher.publish(
        "camera0".to_string(),
        vec![0u8; 640 * 480 * 3],
        ImageFormat::Rgb,
        640,
        480,
        12345,
    )?;

    println!("Published to {} destinations", stats.destinations);

    Ok(())
}

Key Features:

  • Zero-copy sharing: Wraps frame data in Arc once, shares between all destinations
  • Best-effort secondary: Redis/telemetry publishing is non-blocking and drops frames when the channel is full (see the sketch after this list)
  • Automatic diagnostics: Tracks frame sent/dropped metrics (when diagnostics feature enabled)
  • Simple API: Just call publish() with raw frame data
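
The best-effort behavior boils down to a non-blocking try_send on a bounded channel, while the primary path applies normal back-pressure. The sketch below illustrates that pattern in isolation; the function name and channel types are illustrative, not the crate's internals.

use std::sync::Arc;
use tokio::sync::mpsc;

// Illustrative frame payload; the real crate uses CameraFrame.
type FramePayload = Arc<Vec<u8>>;

/// Sketch of best-effort publishing: the primary path awaits channel
/// capacity, while the secondary path drops the frame if its channel
/// is full instead of blocking the capture loop.
async fn publish_best_effort(
    primary: &mpsc::Sender<FramePayload>,
    secondary: Option<&mpsc::Sender<FramePayload>>,
    bytes: Vec<u8>,
) -> anyhow::Result<usize> {
    // Wrap the frame once; every destination shares the same allocation.
    let payload: FramePayload = Arc::new(bytes);
    let mut destinations = 0;

    // Primary (WebRTC) path: back-pressure is acceptable here.
    primary.send(payload.clone()).await?;
    destinations += 1;

    // Secondary (telemetry) path: non-blocking, drop on full channel.
    if let Some(tx) = secondary {
        if tx.try_send(payload).is_ok() {
            destinations += 1;
        }
    }

    Ok(destinations)
}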

Performance

  • Encoding: ~8-15ms per frame (160×120 @ 30 FPS)
  • Latency: 30-50ms glass-to-glass
  • Bitrate: 400 Kbps (configurable in source)
  • Codec: H.264 (OpenH264)
  • Resolution: Supports any resolution (tested with 160×120, 640×480)
  • FPS: Configurable (tested with 20-30 FPS)

API Documentation

Frame Types

CameraFrame

Represents a single camera frame.

pub struct CameraFrame {
    pub camera_id: String,
    pub width: u32,
    pub height: u32,
    pub timestamp: u64,
    pub image_bytes: Arc<Vec<u8>>,
    pub format: ImageFormat,
}

ImageFormat

Image data format.

pub enum ImageFormat {
    Jpeg,  // JPEG-compressed image
    Rgb,   // Raw RGB pixels (width × height × 3 bytes)
}
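
Raw Rgb frames have a payload size fixed by the dimensions, while Jpeg payloads vary with image content. A quick sanity check can be written as below; this helper is illustrative and not part of the crate.

use mecha10_video::ImageFormat;

/// Illustrative helper: expected payload size for a frame, if it is
/// determined by the format alone. Jpeg is compressed, so its length
/// depends on the image content.
fn expected_len(format: &ImageFormat, width: u32, height: u32) -> Option<usize> {
    match format {
        ImageFormat::Rgb => Some((width as usize) * (height as usize) * 3),
        ImageFormat::Jpeg => None,
    }
}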

FrameBroadcaster

Efficiently broadcasts frames to multiple subscribers.

let broadcaster = FrameBroadcaster::new(1); // capacity = 1 for minimal latency
broadcaster.broadcast(frame)?;
let mut rx = broadcaster.subscribe();
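
With capacity 1, a slow subscriber lags and skips straight to the newest frame instead of queueing stale ones. A minimal sketch of that pattern, using tokio::sync::broadcast directly (FrameBroadcaster's internals and receiver type are an assumption here):

use tokio::sync::broadcast;

#[tokio::main]
async fn main() {
    // Capacity 1 keeps only the newest frame: a slow subscriber skips
    // ahead instead of accumulating latency.
    let (tx, mut rx) = broadcast::channel::<u64>(1);

    let consumer = tokio::spawn(async move {
        loop {
            match rx.recv().await {
                Ok(frame_id) => println!("got frame {frame_id}"),
                // Lagged: older frames were overwritten; continue with
                // the newest available one.
                Err(broadcast::error::RecvError::Lagged(_)) => continue,
                Err(broadcast::error::RecvError::Closed) => break,
            }
        }
    });

    for frame_id in 0..3u64 {
        // send() only errors when there are no subscribers.
        let _ = tx.send(frame_id);
    }
    drop(tx);
    consumer.await.unwrap();
}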

WebRTC Server

WebRTCServer

Creates and manages WebRTC connections.

// Without diagnostics
let server = WebRTCServer::new(frame_rx).await?;

// With diagnostics (requires "diagnostics" feature)
let server = WebRTCServer::new(frame_rx, diagnostics).await?;

// Create connection for a client
let connection = server.create_connection().await?;

WebRTCConnection

Represents a single client connection.

// Create SDP offer
let offer_sdp = connection.create_offer().await?;

// Handle SDP answer from browser
connection.handle_answer(answer_sdp).await?;

// Add ICE candidate
connection.add_ice_candidate(candidate, sdp_mid, sdp_mline_index).await?;

// Start streaming (blocks until connection closes)
connection.run_streaming_loop().await?;

Signaling Server

start_signaling_server

Starts a WebSocket-based WebRTC signaling server.

use mecha10_video::start_signaling_server;

let webrtc_server = Arc::new(webrtc_server);
tokio::spawn(async move {
    start_signaling_server(11010, webrtc_server).await.unwrap();
});

Protocol:

  • Endpoint: ws://localhost:11010/webrtc
  • Single-connection mode (disconnects the previous client when a new one connects)
  • Server-initiated SDP offer flow

Integration

Simulation Bridge

The simulation-bridge node uses this package for camera streaming:

use mecha10_video::{CameraFrame, WebRTCServer, start_signaling_server};

// Create WebRTC server with diagnostics
let webrtc_server = WebRTCServer::new(frame_rx, diagnostics).await?;

// Start signaling server
tokio::spawn(async move {
    start_signaling_server(11010, Arc::new(webrtc_server)).await
});

Edge Robot Cameras

For physical cameras on edge robots, the flow is as follows (a sketch follows these steps):

  1. Capture frames from camera driver
  2. Convert to CameraFrame with RGB or JPEG format
  3. Send to WebRTC server via mpsc channel
  4. Dashboard connects via WebRTC signaling server
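
A rough sketch of steps 1-3 is shown below. capture_rgb_frame is a hypothetical stand-in for your camera driver; in a real node the receiving end of the channel is handed to WebRTCServer and a signaling server is started exactly as in the Basic Example.

use mecha10_video::{CameraFrame, ImageFormat};
use std::sync::Arc;
use tokio::sync::mpsc;

// Hypothetical stand-in for a camera driver; replace with a V4L2 or
// vendor SDK capture call that returns raw RGB bytes.
fn capture_rgb_frame(width: u32, height: u32) -> Vec<u8> {
    vec![0u8; (width * height * 3) as usize]
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // In a real node, _frame_rx is passed to WebRTCServer::new.
    let (frame_tx, _frame_rx) = mpsc::channel::<CameraFrame>(10);

    let (width, height) = (640, 480);
    for timestamp in 0..3u64 {
        let frame = CameraFrame {
            camera_id: "camera0".to_string(),
            width,
            height,
            timestamp,
            image_bytes: Arc::new(capture_rgb_frame(width, height)),
            format: ImageFormat::Rgb,
        };
        frame_tx.send(frame).await?;
    }

    Ok(())
}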

Source of Truth

This package was extracted from simulation-bridge/src/webrtc_server.rs and simulation-bridge/src/signaling_server.rs, which are the production-tested reference implementations.

See docs/VIDEO_EXTRACTION_PLAN.md for extraction history and rationale.

Dependencies

  • webrtc 0.14 - WebRTC implementation
  • openh264 0.9 - Fast H.264 encoder
  • axum 0.7 - WebSocket signaling server
  • tokio - Async runtime
  • image 0.25 - JPEG decoding (minimal features)
  • mecha10-diagnostics (optional) - Streaming metrics

License

MIT
