Crates.io | bonsaidb-macros |
lib.rs | bonsaidb-macros |
version | 0.5.0 |
source | src |
created_at | 2022-01-19 17:54:47.649018 |
updated_at | 2023-10-05 17:57:31.54238 |
description | Macros for use in BonsaiDb |
homepage | https://bonsaidb.io/ |
repository | https://github.com/khonsulabs/bonsaidb |
max_upload_size | |
id | 516980 |
size | 67,234 |
BonsaiDb is a developer-friendly document database for Rust that grows with you. It offers many of the features developers need out of the box.
BonsaiDb is considered alpha software. It is under active development. There may still be bugs that result in data loss. All users should regularly back up their data and test that restoring from backup works correctly.
Around May 2022, a bug and a mistake in benchmarking were discovered. The bug was promptly fixed, but the net result is that BonsaiDb's transactional write performance is significantly slower than other databases. Unless you're building a very write-heavy application, the performance will likely still be acceptable. Issue #251 on GitHub is where progress of the performance updates is being tracked. From a developer's perspective, migration is expected to be painless beyond the IO needed to copy the old database into the new format.
To get an idea of how it works, let's review the `view-examples` example. See the examples README for a list of all available examples.

The `view-examples` example shows how to define a simple schema containing a single collection (`Shape`) and a view that queries `Shape`s by their number of sides (`ShapesByNumberOfSides`), and it demonstrates multiple ways to query that view.
First, here's how the schema is defined:
#[derive(Debug, Serialize, Deserialize, Collection)]
#[collection(name = "shapes", views = [ShapesByNumberOfSides])]
struct Shape {
pub sides: u32,
}
#[derive(Debug, Clone, View, ViewSchema)]
#[view(collection = Shape, key = u32, value = usize, name = "by-number-of-sides")]
struct ShapesByNumberOfSides;
impl CollectionMapReduce for ShapesByNumberOfSides {
    fn map<'doc>(&self, document: CollectionDocument<Shape>) -> ViewMapResult<'doc, Self::View> {
        // Emit one entry per shape, keyed by its number of sides, with a value of 1.
        document
            .header
            .emit_key_and_value(document.contents.sides, 1)
    }

    fn reduce(
        &self,
        mappings: &[ViewMappedValue<'_, Self>],
        _rereduce: bool,
    ) -> ReduceResult<Self::View> {
        // Sum the emitted values to count the shapes that share a key.
        Ok(mappings.iter().map(|m| m.value).sum())
    }
}
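As an aside, the same macros crate also provides a `Schema` derive for grouping one or more collections under a named schema. The snippet below is only a sketch (the `ShapesSchema` name and `"shapes-demo"` schema name are made up for illustration); the example gets away without it because a single `Collection` such as `Shape` can be used as a schema on its own:
#[derive(Debug, Schema)]
#[schema(name = "shapes-demo", collections = [Shape])]
struct ShapesSchema;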
After you have your collection(s) and view(s) defined, you can open up a database and insert documents:
let db = Database::open::<Shape>(StorageConfiguration::new("view-examples.bonsaidb"))?;
// Insert a new document into the Shape collection.
Shape { sides: 3 }.push_into(&db)?;
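If you'd like more data to work with, `push_into` can be called repeatedly, and it returns the stored document, including its assigned id. A sketch (the extra shapes and the printed id are illustrative only):
// Insert a couple more shapes; `push_into` returns the stored document.
Shape { sides: 4 }.push_into(&db)?;
let square = Shape { sides: 4 }.push_into(&db)?;
println!("Inserted shape with id {}", square.header.id);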
And query data using the Map-Reduce-powered view:
let triangles = ShapesByNumberOfSides::entries(&db).with_key(&3).query()?;
println!("Number of triangles: {}", triangles.len());
You can review the full example in the repository, or see all available examples in the examples README.
Our user's guide is early in development, but is available at: https://dev.bonsaidb.io/v0.5.0/guide/
While this project is alpha, we are actively adopting the current version of
Rust. The current minimum version is 1.70.
No feature flags are enabled by default in the `bonsaidb` crate. This is because in most Rust executables, you will only need a subset of the functionality. If you'd prefer to enable everything, you can use the `full` feature:
[dependencies]
bonsaidb = { version = "*", features = ["full"] }
- `full`: Enables the features below and `local-full`, `server-full`, and `client-full`.
- `cli`: Enables the `bonsaidb` executable.
- `files`: Enables file storage support with `bonsaidb-files`.
- `password-hashing`: Enables the ability to use password authentication using Argon2 via `AnyConnection`.
- `token-authentication`: Enables the ability to authenticate using authentication tokens, which are similar to API keys.

All other feature flags, listed below, affect each crate individually, but can be safely combined.
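For example, a local-only project that wants async support and compressed storage could enable just those flags (a sketch, using feature names from the lists in this README):
[dependencies]
bonsaidb = { version = "*", features = ["local", "async", "compression"] }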
[dependencies]
bonsaidb = { version = "*", features = ["local-full"] }
All Cargo features that affect local databases:
- `local-full`: Enables all the flags below.
- `local`: Enables the `local` module, which re-exports the crate `bonsaidb-local`.
- `async`: Enables async support with Tokio.
- `cli`: Enables the `clap` structures for embedding database management commands into your own command-line interface.
- `compression`: Enables support for compressed storage using lz4.
- `encryption`: Enables at-rest encryption.
- `instrument`: Enables instrumenting with `tracing`.
- `password-hashing`: Enables the ability to use password authentication using Argon2.
- `token-authentication`: Enables the ability to authenticate using authentication tokens, which are similar to API keys.
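As a sketch of what the `async` flag provides (assuming `local` and `async` are both enabled and the code runs inside a Tokio runtime), `AsyncDatabase` (re-exported from `bonsaidb-local`) mirrors the blocking `Database` API:
// A minimal sketch; the database path and the inserted shape are illustrative.
let db = AsyncDatabase::open::<Shape>(StorageConfiguration::new("async-example.bonsaidb")).await?;
Shape { sides: 3 }.push_into_async(&db).await?;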
[dependencies]
bonsaidb = { version = "*", features = ["server-full"] }
All Cargo features that affect networked servers:
- `server-full`: Enables all the flags below.
- `server`: Enables the `server` module, which re-exports the crate `bonsaidb-server`.
- `acme`: Enables automatic certificate acquisition through ACME/LetsEncrypt.
- `cli`: Enables the `cli` module.
- `compression`: Enables support for compressed storage using lz4.
- `encryption`: Enables at-rest encryption.
- `hyper`: Enables convenience functions for upgrading websockets using `hyper`.
- `instrument`: Enables instrumenting with `tracing`.
- `pem`: Enables the ability to install a certificate using the PEM format.
- `websockets`: Enables `WebSocket` support.
- `password-hashing`: Enables the ability to use password authentication using Argon2.
- `token-authentication`: Enables the ability to authenticate using authentication tokens, which are similar to API keys.

[dependencies]
bonsaidb = { version = "*", features = ["client-full"] }
All Cargo features that affect networked clients:
- `client-full`: Enables all flags below.
- `client`: Enables the `client` module, which re-exports the crate `bonsaidb-client`.
- `trusted-dns`: Enables using trust-dns for DNS resolution. If not enabled, all DNS resolution is done with the OS's default name resolver.
- `websockets`: Enables `WebSocket` support for `bonsaidb-client`.
- `password-hashing`: Enables the ability to use password authentication using Argon2.
- `token-authentication`: Enables the ability to authenticate using authentication tokens, which are similar to API keys.

Unless there is a good reason not to, every feature in BonsaiDb should have thorough unit tests. Many tests are implemented in `bonsaidb_core::test_util` via a macro that allows the suite to run using various methods of accessing BonsaiDb.
Some features can't be tested using only the `Connection`, `StorageConnection`, `KeyValue`, and `PubSub` traits. If that's the case, you should add tests to whichever crate makes the most sense for testing the code. For example, if a feature can only be used in `bonsaidb-server`, the test should be somewhere in the `bonsaidb-server` crate.
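For instance, a check written only against the `Connection` trait can be shared across access methods. The following is an illustrative sketch, not part of the existing suite (the helper name and assertion are made up):
use bonsaidb::core::connection::Connection;

// Runs against anything implementing Connection: a local Database,
// a server-backed database, or a networked client.
fn check_shapes_by_sides<C: Connection>(db: &C) {
    Shape { sides: 4 }.push_into(db).unwrap();
    let quads = ShapesByNumberOfSides::entries(db)
        .with_key(&4)
        .query()
        .unwrap();
    assert!(!quads.is_empty());
}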
Tests that require both a client and server can be added to the `core-suite` test file in the `bonsaidb` crate.
We use `clippy` to give additional guidance on our code. Clippy should always pass with no errors, regardless of which feature flags are enabled:
cargo clippy --all-features
Our CI processes require that some commands succeed without warnings or errors. These checks can be performed manually by running:
cargo xtask test --fail-on-warnings
Or, if you would like to run all these checks before each commit, you can install the check as a pre-commit hook:
cargo xtask install-pre-commit-hook
We have a custom rustfmt configuration that enables several options only available in nightly builds:
cargo +nightly fmt
This project, like all projects from Khonsu Labs, is open-source. This repository is available under the MIT License or the Apache License 2.0.
To learn more about contributing, please see CONTRIBUTING.md.