Crates.io | db_logger
lib.rs | db_logger
version | 0.1.0
source | src
created_at | 2022-04-12 16:29:33.125398
updated_at | 2022-04-12 16:29:33.125398
description | A database-backed logger for use with the log crate
repository | https://github.com/jmmv/db_logger
id | 566534
size | 93,329
db_logger is a Rust crate providing an implementation of the log crate's logging facade to write structured log entries to a database. Just add a few lines of code to the beginning of your program and all logging will be saved for later analysis, which is especially suited to (distributed) services.
db_logger currently supports PostgreSQL and SQLite and is backed by the sqlx crate.
The latest version of db_logger is 0.1.0 and was released on 2022-04-12.
To use db_logger, you need to add a dependency to your project with the right set of features, create a database with the expected schema, and then initialize the logger during program initialization.
As a logging facade implementation, db_logger should only be depended upon from binary crates (never from libraries).
Add the following to your list of dependencies in Cargo.toml:
[dependencies.db_logger]
version = "0.1"
default-features = false
features = ["postgres"]
Create a PostgreSQL database and initialize it with the schemas/postgres.sql schema.
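Using the standard PostgreSQL command-line tools, the setup might look like the following sketch (the database name `logs` is a placeholder; adjust host and credentials to your environment):

```shell
# Create the database (name is an example; pick your own).
createdb logs

# Load the reference schema shipped in the repository.
psql -d logs -f schemas/postgres.sql
```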
Initialize the logger in your code with one of:
Explicit configuration:
use db_logger::postgres;
let conn = postgres::connect_lazy(postgres::ConnectionOptions {
host: "some host".to_owned(),
port: 5432,
database: "some database".to_owned(),
username: "some username".to_owned(),
password: "some password".to_owned(),
..Default::default()
});
let _handle = db_logger::init(conn).await;
Environment-based configuration:
use db_logger::postgres;
let conn = postgres::connect_lazy(
postgres::ConnectionOptions::from_env("LOGGER").unwrap());
let _handle = db_logger::init(conn).await;
This will cause your program to recognize variables of the form LOGGER_HOST, LOGGER_PORT, LOGGER_DATABASE, LOGGER_USERNAME and LOGGER_PASSWORD to configure the PostgreSQL connection.
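For example, launching a program that uses the environment-based configuration with the `LOGGER` prefix could look like this (all values and the binary name are placeholders):

```shell
# Placeholder connection details; replace with your own.
export LOGGER_HOST=db.example.com
export LOGGER_PORT=5432
export LOGGER_DATABASE=logs
export LOGGER_USERNAME=logger
export LOGGER_PASSWORD=secret

# ./your-binary  # placeholder for your program built with db_logger
```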
Make sure to keep _handle alive for the duration of the program in an async context, because the handle keeps the background logging task alive.
Add the following to your list of dependencies in Cargo.toml:
[dependencies.db_logger]
version = "0.1"
default-features = false
features = ["sqlite"]
Create an SQLite database and initialize it with the schemas/sqlite.sql schema.
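Assuming the `sqlite3` command-line tool is available, creating the database could be as simple as (the database path is a placeholder):

```shell
# Create the database file and load the reference schema.
sqlite3 /path/to/database < schemas/sqlite.sql
```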
Initialize with:
use db_logger::sqlite;
let conn = sqlite::connect(sqlite::ConnectionOptions {
uri: "file:/path/to/database?mode=rw",
..Default::default()
}).await.unwrap();
let _handle = db_logger::init(conn).await;
Make sure to keep _handle alive for the duration of the program in an async context, because the handle keeps the background logging task alive.
db_logger recognizes the RUST_LOG environment variable to configure the maximum level of the log messages to record, the same way as the env_logger crate does.
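For example, to record messages up to the debug level (the binary name is a placeholder):

```shell
# Raise the maximum recorded log level to debug.
export RUST_LOG=debug

# ./your-binary  # placeholder for your program built with db_logger
```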
As indicated above, you should create the database and its schema by hand before establishing a connection, using the reference files provided in the schemas directory. This is a one-time operation.
You also have the option of invoking the Connection::create_schema() method to initialize the database schema. You probably don't want to do this in production, but it is useful if you are using ephemeral SQLite databases.
The code in this crate was extracted from the EndBASIC cloud service and then cleaned for separate publication. This code was written as a fun experiment as part of that service and it has not received a lot of real-world stress testing. As a result, expect this to have a bunch of limitations, including:
- Because db_logger runs in-process, the logger must filter out any log messages that it can generate while writing to the database (or else it would enter an infinite logging loop). This includes log messages from its own dependencies, such as those emitted by sqlx or tokio, which means that if you are using this crate, you won't get their logs saved into the database.
- Performance is not great. While the logger tries to avoid blocking caller code by buffering log messages, if you log a lot, your application will slow down. I'm positive some profiling could make this much better without a ton of effort, but I just haven't spent the time doing it.
- No tracing support. This crate may be a dead end. Saving raw logs to a database is interesting, but what would be more interesting is saving the traces generated when integrating with the tracing crate (so as to trace individual requests as they flow through an async server, which was the original desire).
- stderr pollution. Right now, any errors encountered while persisting logs to the database, as well as any log messages that are filtered out, are dumped to stderr in an ad-hoc format. This behavior should be configurable but isn't.