edgefirst-schemas

version: 1.4.1
created_at: 2024-04-13 16:59:30.491275+00
updated_at: 2025-11-19 00:08:27.772142+00
description: Message schemas for EdgeFirst Perception - ROS2 Common Interfaces, Foxglove, and custom types
homepage: https://doc.edgefirst.ai
repository: https://github.com/EdgeFirstAI/schemas
size: 340,264
owner: Sébastien Taylor (sebastient)
documentation: https://docs.rs/edgefirst-schemas/

README

EdgeFirst Perception Schemas


Core message schemas and language bindings for the EdgeFirst Perception middleware

EdgeFirst Perception Schemas provides the foundational message types used throughout the EdgeFirst Perception middleware stack. It delivers high-performance Rust and Python bindings for consuming and producing messages in EdgeFirst Perception applications. The library implements ROS2 Common Interfaces, Foxglove schemas, and custom EdgeFirst message types with CDR (Common Data Representation) serialization over Zenoh.

No ROS2 Required: EdgeFirst Perception applications work directly on Linux, Windows, and macOS without any ROS2 installation. For systems that do use ROS2, EdgeFirst Perception interoperates seamlessly through the Zenoh ROS2 DDS Bridge.

Features

  • 🔄 ROS2 Common Interfaces - Full compatibility with standard ROS2 message types (geometry_msgs, sensor_msgs, std_msgs, nav_msgs)
  • 📊 Foxglove Schema Support - Native visualization with Foxglove Studio
  • ⚡ Custom EdgeFirst Messages - Specialized types for edge AI (detection, tracking, DMA buffers, radar)
  • 🦀 High-Performance Rust Bindings - Zero-copy serialization with CDR encoding
  • 🐍 Python Bindings - Efficient point cloud decoding and message handling
  • 📡 Zenoh-Based Communication - Modern pub/sub over Zenoh middleware
  • 💻 Cross-Platform - Linux, Windows, and macOS support
  • 🚫 ROS2 Optional - No ROS2 installation required for EdgeFirst Perception applications

Quick Start

Installation

Rust (via crates.io):

cargo add edgefirst-schemas

Python (via pip, when published):

pip install edgefirst-schemas

For detailed installation instructions and troubleshooting, see the Developer Guide.

Consuming Messages (Primary Use Case)

Most applications consume messages from EdgeFirst Perception services. Here's how to decode sensor data:

Python Example - Consuming PointCloud2:

from edgefirst.schemas import PointCloud2, decode_pcd

# Receive a point cloud message from EdgeFirst Perception
# (via Zenoh subscriber - see samples for complete examples)
points = decode_pcd(point_cloud_msg)

# Access point data
for point in points:
    x, y, z = point.x, point.y, point.z
    # Process point data...

Rust Example - Consuming Detection Results:

use edgefirst_schemas::edgefirst_msgs::Detect;

fn process_detections(detect_msg: Detect) {
    for bbox in detect_msg.boxes {
        println!("Class: {}, Confidence: {:.2}",
                 bbox.class_id, bbox.confidence);
        // Process detection...
    }
}

Complete working examples: See the EdgeFirst Samples repository for full subscriber implementations, Zenoh configuration, and integration patterns.
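
Illustrative subscriber sketch: the snippet below outlines what such a subscriber could look like in Rust. It is a minimal sketch, assuming the zenoh 1.x API, the cdr crate for decoding, tokio as the async runtime, serde Deserialize support on the schema types, and a placeholder key expression "rt/detect"; the samples repository documents the actual topics and configuration.

Rust Sketch - Subscribing over Zenoh and Decoding CDR (illustrative):

use edgefirst_schemas::edgefirst_msgs::Detect;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Open a Zenoh session with the default configuration.
    let session = zenoh::open(zenoh::Config::default()).await?;
    // "rt/detect" is a placeholder key expression for this sketch.
    let subscriber = session.declare_subscriber("rt/detect").await?;

    while let Ok(sample) = subscriber.recv_async().await {
        // Payloads are CDR-encoded; decode them into the typed message.
        let payload = sample.payload().to_bytes();
        let detect: Detect = cdr::deserialize(&payload)?;
        for bbox in detect.boxes {
            println!("Class: {}, Confidence: {:.2}", bbox.class_id, bbox.confidence);
        }
    }
    Ok(())
}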

Producing Messages (Secondary Use Case)

Applications can also produce messages for custom perception pipelines:

Python Example - Creating Custom Messages:

from edgefirst.schemas.geometry_msgs import Pose, Point, Quaternion
from edgefirst.schemas.std_msgs import Header

# Create a pose message
pose = Pose(
    position=Point(x=1.0, y=2.0, z=0.5),
    orientation=Quaternion(x=0.0, y=0.0, z=0.0, w=1.0)
)

Rust Example - Building Detection Messages:

use edgefirst_schemas::edgefirst_msgs::{Detect, Box as BBox};
use edgefirst_schemas::std_msgs::Header;

fn create_detection() -> Detect {
    Detect {
        header: Header::default(),
        boxes: vec![
            BBox {
                class_id: 1,
                confidence: 0.95,
                x: 100, y: 100, w: 50, h: 50,
                ..Default::default()
            }
        ],
        tracks: vec![],
        model_info: Default::default(),
    }
}

Learn more: The Developer Guide covers serialization, Zenoh publishing, and message lifecycle management.
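
As a rough sketch of that flow, the snippet below CDR-encodes the Detect built above and publishes it over Zenoh. It assumes the zenoh 1.x API, the cdr crate, serde Serialize support on the schema types, and a placeholder key expression "rt/detect"; treat it as an illustration rather than the official publishing path.

Rust Sketch - Publishing a Detect Message over Zenoh (illustrative):

use edgefirst_schemas::edgefirst_msgs::Detect;

async fn publish_detection(detect: Detect) -> Result<(), Box<dyn std::error::Error>> {
    let session = zenoh::open(zenoh::Config::default()).await?;
    // "rt/detect" is a placeholder key expression for this sketch.
    let publisher = session.declare_publisher("rt/detect").await?;

    // Encode with little-endian CDR, the wire representation used by
    // EdgeFirst Perception.
    let payload = cdr::serialize::<_, _, cdr::CdrLe>(&detect, cdr::Infinite)?;
    publisher.put(payload).await?;
    Ok(())
}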

Building from Source

Rust:

cargo build --release
cargo test

Python:

python -m pip install -e .

ROS2 Debian Package (for ROS2 integration only):

cd edgefirst_msgs
source /opt/ros/humble/setup.bash
fakeroot debian/rules build
sudo dpkg -i ../ros-humble-edgefirst-msgs_*.deb

For complete build instructions, see CONTRIBUTING.md.

Message Schemas

EdgeFirst Perception Schemas combines three sources of message definitions:

1. ROS2 Common Interfaces

Standard ROS2 message types for broad interoperability:

  • std_msgs - Basic primitive types (Header, String, etc.)
  • geometry_msgs - Spatial messages (Pose, Transform, Twist, etc.)
  • sensor_msgs - Sensor data (PointCloud2, Image, CameraInfo, Imu, NavSatFix, etc.)
  • nav_msgs - Navigation (Odometry, Path)
  • builtin_interfaces - Time and Duration
  • rosgraph_msgs - Clock

Based on the ROS2 Humble Hawksbill LTS release.
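
As a quick illustration that the Rust bindings mirror these definitions, the sketch below composes a few standard types. The field names are assumed to match the ROS2 Humble .msg layouts (std_msgs/Header, builtin_interfaces/Time, geometry_msgs/Pose); check the API Reference for the exact bindings.

Rust Sketch - Composing Standard ROS2 Types (field names assumed from ROS2 Humble):

use edgefirst_schemas::builtin_interfaces::Time;
use edgefirst_schemas::geometry_msgs::{Point, Pose, Quaternion};
use edgefirst_schemas::std_msgs::Header;

// Build a stamped identity pose in the "base_link" frame.
fn stamped_origin() -> (Header, Pose) {
    let header = Header {
        stamp: Time { sec: 0, nanosec: 0 },
        frame_id: "base_link".to_string(),
    };
    let pose = Pose {
        position: Point { x: 0.0, y: 0.0, z: 0.0 },
        orientation: Quaternion { x: 0.0, y: 0.0, z: 0.0, w: 1.0 },
    };
    (header, pose)
}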

2. Foxglove Schemas

Visualization-focused message types from Foxglove Schemas:

  • Scene graph visualization - 3D rendering primitives
  • Annotation types - Bounding boxes, markers, text
  • Panel-specific messages - Optimized for Foxglove Studio

3. EdgeFirst Custom Messages

Specialized types for edge AI perception workflows:

  • Detect - Object detection results with bounding boxes and tracks
  • Box - 2D bounding box with confidence and class
  • Track - Object tracking information with unique IDs
  • DmaBuffer - Zero-copy DMA buffer sharing for hardware accelerators
  • RadarCube - Raw radar data cube for processing
  • RadarInfo - Radar sensor calibration and metadata
  • Model - Neural network model metadata
  • ModelInfo - Inference performance instrumentation

Full message definitions and field descriptions are in the API Reference.
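
As a small example of working with these types, the sketch below filters a Detect message down to its high-confidence boxes. It relies only on the fields already shown in the Quick Start examples (boxes, confidence) and is illustrative rather than canonical.

Rust Sketch - Filtering Detections by Confidence (illustrative):

use edgefirst_schemas::edgefirst_msgs::{Box as BBox, Detect};

// Keep only the boxes whose confidence clears an example threshold of 0.5.
fn confident_boxes(detect: &Detect) -> Vec<&BBox> {
    detect
        .boxes
        .iter()
        .filter(|bbox| bbox.confidence >= 0.5)
        .collect()
}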

Platform Support

EdgeFirst Perception Schemas works on:

  • Linux - Primary development and deployment platform
  • Windows - Full support for development and integration
  • macOS - Development and testing support

No ROS2 Required: Applications can consume and produce EdgeFirst Perception messages on any supported platform without installing ROS2. ROS2 is only needed if you want to bridge EdgeFirst Perception data into an existing ROS2 ecosystem.

See the EdgeFirst Samples repository for platform-specific examples and setup guides.

Hardware Platforms

EdgeFirst Perception is optimized for Au-Zone edge AI platforms such as the Maivin and Raivin edge AI computers.

These platforms provide hardware-accelerated inference and sensor integration. For custom hardware projects, contact Au-Zone for engineering services.

Use Cases

EdgeFirst Perception Schemas enables:

  • Consuming Sensor Data - Subscribe to camera, radar, lidar, IMU, GPS topics
  • Processing Detections - Receive object detection and tracking results
  • Custom Perception Services - Build new perception algorithms that integrate with EdgeFirst
  • Recording & Playback - Use with MCAP for data recording and analysis
  • Visualization - Connect Foxglove Studio for real-time monitoring
  • ROS2 Integration - Bridge to ROS2 systems when needed

Example applications: Explore the EdgeFirst Samples for complete implementations including camera subscribers, detection visualizers, sensor fusion examples, and custom service templates.

Documentation

Support

Community Resources

EdgeFirst Ecosystem

EdgeFirst Studio - Complete MLOps platform for edge AI:

  • Deploy models to devices running EdgeFirst Perception
  • Monitor inference performance in real-time
  • Manage fleets of edge devices
  • Record and replay sensor data with MCAP
  • Visualize messages with integrated Foxglove
  • Free tier available for development

EdgeFirst Hardware Platforms:

  • Maivin and Raivin edge AI computers
  • Custom carrier board design services
  • Rugged industrial enclosures for harsh environments

Professional Services

Au-Zone Technologies offers commercial support for production deployments:

  • Training & Workshops - Accelerate your team's development with EdgeFirst Perception
  • Custom Development - Tailored perception solutions and algorithm integration
  • Integration Services - Connect EdgeFirst Perception to your existing systems
  • Production Support - SLA-backed support for mission-critical applications

Contact: support@au-zone.com | Learn more: au-zone.com

Architecture

For detailed information about message serialization, CDR encoding, Zenoh communication patterns, and system architecture, see the ARCHITECTURE.md document.

Quick overview:

  • Messages are serialized using CDR (Common Data Representation)
  • Communication happens over Zenoh pub/sub middleware
  • ROS2 interoperability via Zenoh ROS2 DDS Bridge when needed
  • Zero-copy optimizations for embedded platforms using DMA buffers
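
For a concrete sense of the serialization layer, the sketch below round-trips a Detect message through CDR using the cdr crate. It assumes the schema types implement serde's Serialize and Deserialize, which CDR encoding implies; any helper APIs the crate itself provides may differ.

Rust Sketch - CDR Round-Trip with the cdr Crate (illustrative):

use edgefirst_schemas::edgefirst_msgs::Detect;

// Encode to little-endian CDR bytes, then decode back into a typed message.
fn cdr_roundtrip(detect: &Detect) -> Result<Detect, Box<dyn std::error::Error>> {
    let bytes = cdr::serialize::<_, _, cdr::CdrLe>(detect, cdr::Infinite)?;
    Ok(cdr::deserialize::<Detect>(&bytes)?)
}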

Contributing

We welcome contributions! Please see our Contributing Guidelines for details on:

  • Development setup and build process
  • Code style and testing requirements
  • Pull request process
  • Issue reporting guidelines

All contributors must follow our Code of Conduct.

Security

For reporting security vulnerabilities, please see our Security Policy. Do not report security issues through public GitHub issues.

License

Licensed under the Apache License, Version 2.0. See LICENSE for details.

Copyright © 2025 Au-Zone Technologies. All Rights Reserved.

Third-Party Acknowledgments

This project incorporates schemas and code from third-party sources, including the ROS2 Common Interfaces and Foxglove Schemas. See the NOTICE file for complete third-party license information.

Related Projects


EdgeFirst Perception is a trademark of Au-Zone Technologies Inc.
