| Crates.io | loggix |
| lib.rs | loggix |
| version | 1.0.4 |
| created_at | 2024-12-06 21:29:27.38125+00 |
| updated_at | 2025-04-10 18:02:25.194618+00 |
| description | A powerful, structured logging library for Rust inspired by Logrus. Features thread-safe logging, structured fields, custom formatters, and beautiful terminal output. |
| homepage | |
| repository | https://github.com/cploutarchou/loggix |
| max_upload_size | |
| id | 1474754 |
| size | 98,729 |
A high-performance, async-first logging framework for Rust with Kafka integration.
Loggix provides a log_async method for use in async contexts.
Add this to your Cargo.toml:
[dependencies]
loggix = "1.0"
use loggix::{Logger, Level, Fields};
// Create a logger
let logger = Logger::new().build();
// Log a message
let mut fields = Fields::new();
fields.insert("user_id".to_string(), "123".into());
logger.log(Level::Info, "User logged in", fields).unwrap();
// Async logging
let mut fields = Fields::new();
fields.insert("order_id".to_string(), "456".into());
logger.log_async(Level::Info, "Order processed", fields).await.unwrap();
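The async call above needs a runtime to drive the returned future. A minimal complete program, assuming Tokio (any executor that can await the future should work):
use loggix::{Logger, Level, Fields};

#[tokio::main]
async fn main() {
    let logger = Logger::new().build();

    let mut fields = Fields::new();
    fields.insert("order_id".to_string(), "456".into());

    // log_async returns a future; await it inside the async main
    logger.log_async(Level::Info, "Order processed", fields).await.unwrap();
}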
use loggix::{Logger, JSONFormatter, Level, Fields};
use serde_json::Value;
let logger = Logger::new()
.formatter(JSONFormatter::new())
.build();
let mut fields = Fields::new();
fields.insert("transaction_id".to_string(), Value::String("tx-123".to_string()));
fields.insert("amount".to_string(), Value::Number(100.into()));
logger.log(Level::Info, "Payment processed", fields).unwrap();
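With the JSON formatter each entry is serialized as a single JSON object. The exact field names are up to the formatter, but output roughly along these lines can be expected (illustrative only):
{"timestamp":"2024-12-06T21:29:27Z","level":"info","message":"Payment processed","transaction_id":"tx-123","amount":100}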
The Kafka examples below expect a broker listening on localhost:9092. With a suitable docker-compose.yml in place, start it with:
docker-compose up -d
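If you do not already have a compose file, a minimal single-node KRaft broker along these lines should be enough for local testing (a sketch, not the project's own file; the apache/kafka image advertises localhost:9092 by default):
services:
  kafka:
    image: apache/kafka:3.7.0
    ports:
      - "9092:9092"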
use loggix::{Logger, KafkaHook, Level, Fields};
use serde_json::Value;
// Create a Kafka hook with message key support
let kafka_hook = KafkaHook::new("localhost:9092", "logs")
.unwrap()
.with_key_field("correlation_id".to_string());
// Create a logger with the Kafka hook
let logger = Logger::new()
.add_hook(kafka_hook)
.build();
// Log a message with a correlation ID for message routing
let mut fields = Fields::new();
fields.insert("correlation_id".to_string(), Value::String("abc-123".to_string()));
fields.insert("user_id".to_string(), Value::String("456".to_string()));
logger.log_async(Level::Info, "User action", fields).await.unwrap();
The Kafka hook supports setting a field as the message key:
// Set up hook with a key field
let kafka_hook = KafkaHook::new("localhost:9092", "logs")
.unwrap()
.with_key_field("tenant_id".to_string());
// Any log message with the tenant_id field will use it as the Kafka message key
let mut fields = Fields::new();
fields.insert("tenant_id".to_string(), Value::String("tenant-1".to_string()));
logger.log_async(Level::Info, "Tenant action", fields).await.unwrap();
This enables key-based partitioning: all log entries carrying the same key (for example the same tenant_id) are routed to the same Kafka partition, so per-key ordering is preserved.
Both the logger and hooks support async operations:
// Async logging with hooks
logger.log_async(Level::Info, "Async message", fields).await?;
// Hooks automatically use async operations when available
impl Hook for MyHook {
    fn fire_async<'a>(&'a self, entry: &'a Entry) -> Pin<Box<dyn Future<Output = Result<(), Error>> + Send + 'a>> {
        // Async implementation: forward the entry to an external sink
        Box::pin(async move {
            // ... e.g. write `entry` to a network service ...
            Ok(())
        })
    }
}
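Once defined, a custom hook is attached the same way as the Kafka hook above (a sketch; MyHook::new() is a placeholder constructor):
let logger = Logger::new()
    .add_hook(MyHook::new())
    .build();

// The custom fire_async runs for every entry logged through this logger
logger.log_async(Level::Info, "Handled by MyHook", Fields::new()).await?;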
See the examples/ directory in the repository for more complete examples.
Run the benchmarks:
cargo bench
Create a config.yaml file:
kafka:
  bootstrap_servers: "localhost:9092"
  group_id: "logger_group"
  auto_offset_reset: "earliest"
  socket_timeout_ms: 3000
  session_timeout_ms: 6000
  replication_factor: 1
  partitions: 1
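Loggix's own config-loading API is not shown above; as a rough sketch, the file could be deserialized with serde_yaml into a matching struct (struct and field names here are assumptions based on the keys above):
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct KafkaConfig {
    bootstrap_servers: String,
    group_id: String,
    auto_offset_reset: String,
    socket_timeout_ms: u64,
    session_timeout_ms: u64,
    replication_factor: i32,
    partitions: i32,
}

#[derive(Debug, Deserialize)]
struct Config {
    kafka: KafkaConfig,
}

fn load_config() -> Result<Config, Box<dyn std::error::Error>> {
    // Read config.yaml and parse the YAML shown above into the structs
    let raw = std::fs::read_to_string("config.yaml")?;
    Ok(serde_yaml::from_str(&raw)?)
}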
For a detailed list of changes between versions, please see our CHANGELOG.
This project is licensed under the MIT License - see the LICENSE file for details.