| Crates.io | mecha10-video |
| lib.rs | mecha10-video |
| version | 0.1.25 |
| created_at | 2025-11-25 02:24:43.187781+00 |
| updated_at | 2026-01-01 02:01:21.162233+00 |
| description | WebRTC video streaming for Mecha10 - camera frame capture and broadcasting |
| homepage | |
| repository | https://github.com/mecha10/mecha10 |
| max_upload_size | |
| id | 1949092 |
| size | 191,248 |
WebRTC-based low-latency video streaming for robotics applications. Diagnostics support is gated behind the `diagnostics` feature flag.

## Architecture

```
┌─────────────┐
│   Source    │  (Godot, Camera, etc.)
│  (Frames)   │
└──────┬──────┘
       │ mpsc::channel
       ▼
┌─────────────────────┐
│    WebRTCServer     │
│  (Frame Broadcast)  │
└──────┬──────────────┘
       │ broadcast::channel
       ├──────────┬──────────┐
       ▼          ▼          ▼
  ┌────────┐ ┌────────┐ ┌────────┐
  │ Conn 1 │ │ Conn 2 │ │ Conn 3 │
  │ (H.264)│ │ (H.264)│ │ (H.264)│
  └────┬───┘ └────┬───┘ └────┬───┘
       │          │          │
       ▼          ▼          ▼
   Browser    Browser    Browser
```
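The pipeline above can be sketched with plain standard-library channels and threads (the crate itself uses tokio's `mpsc` and `broadcast` channels); integer frame ids stand in for real camera frames:

```rust
// Minimal sketch of the fan-out topology: one source feeds a server
// task, which re-sends every frame to each per-connection queue.
use std::sync::mpsc;
use std::thread;

fn main() {
    // Source -> server channel (the mpsc::channel edge in the diagram).
    let (frame_tx, frame_rx) = mpsc::channel::<u32>();

    // One queue per "connection" (the broadcast::channel edge).
    let (conn1_tx, conn1_rx) = mpsc::channel::<u32>();
    let (conn2_tx, conn2_rx) = mpsc::channel::<u32>();

    // Server: forward each incoming frame to every connection.
    let server = thread::spawn(move || {
        for frame in frame_rx {
            conn1_tx.send(frame).unwrap();
            conn2_tx.send(frame).unwrap();
        }
    });

    // Source: emit three frame ids, then close the channel.
    for id in 0..3u32 {
        frame_tx.send(id).unwrap();
    }
    drop(frame_tx); // ends the server loop
    server.join().unwrap();

    // Each connection independently received every frame.
    assert_eq!(conn1_rx.iter().collect::<Vec<_>>(), vec![0, 1, 2]);
    assert_eq!(conn2_rx.iter().collect::<Vec<_>>(), vec![0, 1, 2]);
    println!("fan-out ok");
}
```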
## Quick Start

```rust
use mecha10_video::{CameraFrame, ImageFormat, WebRTCServer};
use mecha10_video::signaling::start_signaling_server;
use tokio::sync::mpsc;
use std::sync::Arc;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create the frame channel
    let (frame_tx, frame_rx) = mpsc::channel(10);

    // Create the WebRTC server (without diagnostics)
    let webrtc_server = WebRTCServer::new(frame_rx).await?;

    // Start the signaling server for SDP exchange
    let server = Arc::new(webrtc_server);
    tokio::spawn(async move {
        start_signaling_server(11010, server).await.unwrap();
    });

    // Send frames to the WebRTC server
    let frame = CameraFrame {
        camera_id: "camera0".to_string(),
        width: 640,
        height: 480,
        timestamp: 0,
        image_bytes: Arc::new(vec![0u8; 640 * 480 * 3]),
        format: ImageFormat::Rgb,
    };
    frame_tx.send(frame).await?;

    Ok(())
}
```
## Diagnostics

Enable the `diagnostics` feature in your `Cargo.toml`:

```toml
[dependencies]
mecha10-video = { path = "../video", features = ["diagnostics"] }
```
Then create the server with a diagnostics collector:

```rust
use mecha10_diagnostics::prelude::StreamingCollector;
use std::sync::Arc;

let diagnostics = Arc::new(StreamingCollector::new("my-node"));
let webrtc_server = WebRTCServer::new(frame_rx, diagnostics).await?;
```
## Dual-Path Publishing

For nodes that need to publish frames to multiple destinations (e.g., WebRTC for dashboards plus Redis for telemetry), use `CameraPublisher`:
```rust
use mecha10_diagnostics::prelude::StreamingCollector;
use mecha10_video::{CameraPublisher, ImageFormat, WebRTCServer};
use tokio::sync::mpsc;
use std::sync::Arc;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // Create the channels
    let (webrtc_tx, webrtc_rx) = mpsc::channel(1);
    let (redis_tx, redis_rx) = mpsc::channel(50);

    // Create the WebRTC server
    let diagnostics = Arc::new(StreamingCollector::new("my-node"));
    let webrtc_server = WebRTCServer::new(webrtc_rx, diagnostics.clone()).await?;

    // Create the camera publisher for dual-path streaming
    let camera_publisher = CameraPublisher::new(webrtc_tx, diagnostics)
        .with_secondary_publisher(redis_tx);

    // Publish a frame (zero-copy Arc sharing between WebRTC and Redis)
    let stats = camera_publisher.publish(
        "camera0".to_string(),
        vec![0u8; 640 * 480 * 3],
        ImageFormat::Rgb,
        640,
        480,
        12345,
    )?;
    println!("Published to {} destinations", stats.destinations);

    Ok(())
}
```
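The zero-copy sharing mentioned in the example above can be illustrated with plain `Arc` handles from the standard library; no mecha10-video types are involved:

```rust
// Cloning an Arc<Vec<u8>> copies a pointer, not the pixel buffer,
// so the WebRTC and Redis paths can hold the same allocation.
use std::sync::Arc;

fn main() {
    let image_bytes: Arc<Vec<u8>> = Arc::new(vec![0u8; 640 * 480 * 3]);

    // One handle per destination; no pixel data is copied.
    let webrtc_handle = Arc::clone(&image_bytes);
    let redis_handle = Arc::clone(&image_bytes);

    // All handles point at the same allocation.
    assert!(Arc::ptr_eq(&webrtc_handle, &redis_handle));
    assert_eq!(Arc::strong_count(&image_bytes), 3);
    println!("one buffer, {} handles", Arc::strong_count(&image_bytes));
}
```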
Key features:

- Zero-copy: the frame buffer is wrapped in an `Arc` once and shared between all destinations
- Simple API: call `publish()` with raw frame data

## API Reference

### CameraFrame

Represents a single camera frame.
```rust
pub struct CameraFrame {
    pub camera_id: String,
    pub width: u32,
    pub height: u32,
    pub timestamp: u64,
    pub image_bytes: Arc<Vec<u8>>,
    pub format: ImageFormat,
}
```
### ImageFormat

Image data format.
```rust
pub enum ImageFormat {
    Jpeg, // JPEG-compressed image
    Rgb,  // Raw RGB pixels (width × height × 3 bytes)
}
```
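For the `Rgb` variant, the expected buffer length follows directly from the dimensions. A quick sketch with a hypothetical `expected_rgb_len` helper (not part of the crate):

```rust
// Hypothetical helper: expected byte length of a raw RGB frame,
// per the width × height × 3 layout described above.
fn expected_rgb_len(width: u32, height: u32) -> usize {
    width as usize * height as usize * 3
}

fn main() {
    // A 640x480 RGB frame, as in the examples above.
    assert_eq!(expected_rgb_len(640, 480), 921_600);
    println!("640x480 RGB = {} bytes", expected_rgb_len(640, 480));
}
```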
### FrameBroadcaster

Efficiently broadcasts frames to multiple subscribers.
```rust
let broadcaster = FrameBroadcaster::new(1); // capacity = 1 for minimal latency
broadcaster.broadcast(frame)?;
let mut rx = broadcaster.subscribe();
```
### WebRTCServer

Creates and manages WebRTC connections.
```rust
// Without diagnostics
let server = WebRTCServer::new(frame_rx).await?;

// With diagnostics (requires the "diagnostics" feature)
let server = WebRTCServer::new(frame_rx, diagnostics).await?;

// Create a connection for a client
let connection = server.create_connection().await?;
```
### WebRTCConnection

Represents a single client connection.
```rust
// Create an SDP offer
let offer_sdp = connection.create_offer().await?;

// Handle the SDP answer from the browser
connection.handle_answer(answer_sdp).await?;

// Add an ICE candidate
connection.add_ice_candidate(candidate, sdp_mid, sdp_mline_index).await?;

// Start streaming (blocks until the connection closes)
connection.run_streaming_loop().await?;
```
### start_signaling_server

Starts a WebSocket-based WebRTC signaling server.
```rust
use mecha10_video::start_signaling_server;

let webrtc_server = Arc::new(webrtc_server);
tokio::spawn(async move {
    start_signaling_server(11010, webrtc_server).await.unwrap();
});
```
Protocol: clients connect over WebSocket at `ws://localhost:11010/webrtc`.

## Usage in Mecha10

The simulation-bridge node uses this package for camera streaming:
```rust
use mecha10_video::{CameraFrame, WebRTCServer, start_signaling_server};

// Create the WebRTC server with diagnostics
let webrtc_server = WebRTCServer::new(frame_rx, diagnostics).await?;

// Start the signaling server
tokio::spawn(async move {
    start_signaling_server(11010, Arc::new(webrtc_server)).await
});
```
For physical cameras on edge robots:

- Create a `CameraFrame` with RGB or JPEG format

## History

This package was extracted from `simulation-bridge/src/webrtc_server.rs` and `simulation-bridge/src/signaling_server.rs`, which are the production-tested reference implementations.
See docs/VIDEO_EXTRACTION_PLAN.md for extraction history and rationale.
## Dependencies

- `webrtc` 0.14 - WebRTC implementation
- `openh264` 0.9 - Fast H.264 encoder
- `axum` 0.7 - WebSocket signaling server
- `tokio` - Async runtime
- `image` 0.25 - JPEG decoding (minimal features)
- `mecha10-diagnostics` (optional) - Streaming metrics

## License

MIT