Crates.io | hyperlane-log |
lib.rs | hyperlane-log |
version | 0.24.0 |
source | src |
created_at | 2024-12-28 13:49:39.566933 |
updated_at | 2025-01-09 00:49:44.50661 |
description | A Rust asynchronous logging library that runs on a dedicated thread to avoid blocking other threads. It supports multiple log levels (such as error, info, and debug), and allows custom log handling methods and configuration of log file paths. When a single log file reaches the specified size limit, a new log file will be automatically created. |
homepage | |
repository | https://github.com/ltpp-universe/hyperlane-log.git |
max_upload_size | |
id | 1497512 |
size | 19,876 |
A Rust asynchronous logging library that runs on a dedicated thread to avoid blocking other threads. It supports multiple log levels (such as error, info, and debug), and allows custom log handling methods and configuration of log file paths. When a single log file reaches the specified size limit, a new log file will be automatically created.
To use this crate, run the following command:
cargo add hyperlane-log
Three directories will be created under the user-specified directory: one for error logs, one for info logs, and one for debug logs. Each of these directories will contain a subdirectory named by date, and the log files within these subdirectories will be named in the format timestamp.index.log.
use hyperlane_log::*;

// Create a logger that writes under ./logs and starts a new file once the
// current one reaches the given size limit (in bytes).
let log: Log = Log::new("./logs", 1_024_000);
// Start the logging thread; all writes happen there, off the caller's thread.
let log_thread: JoinHandle<()> = log_run(&log);
// Each call passes a message together with a closure that builds the string
// actually written to the corresponding log file.
log.log_error("error data!", |error| {
    let write_data: String = format!("User error func => {:?}\n", error);
    write_data
});
log.log_info("info data!", |info| {
    let write_data: String = format!("User info func => {:?}\n", info);
    write_data
});
log.log_debug("debug data!", |debug| {
    let write_data: String = format!("User debug func => {:#?}\n", debug);
    write_data
});
// Wait for the logging thread to finish.
let _ = log_thread.join();
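Because the handlers are ordinary closures, a shared formatting helper can be reused across levels. The following is a minimal sketch under the same setup as the example above; format_entry is a hypothetical helper introduced here for illustration, not part of the crate:

use hyperlane_log::*;

// Hypothetical helper: prefix each entry with a level tag; the returned
// string is what the logging thread writes to the file.
fn format_entry<T: std::fmt::Debug>(level: &str, data: T) -> String {
    format!("[{}] {:?}\n", level, data)
}

let log: Log = Log::new("./logs", 1_024_000);
let log_thread: JoinHandle<()> = log_run(&log);
log.log_info("info data!", |info| format_entry("INFO", info));
log.log_error("error data!", |error| format_entry("ERROR", error));
let _ = log_thread.join();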
This project is licensed under the MIT License. See the LICENSE file for details.
Contributions are welcome! Please open an issue or submit a pull request.
For any inquiries, please reach out to the author, ltpp-universe, at root@ltpp.vip.