Crates.io | diesel-streamer |
lib.rs | diesel-streamer |
version | 0.1.12 |
source | src |
created_at | 2023-07-20 23:40:18.325271 |
updated_at | 2023-09-17 16:13:50.242654 |
description | A tiny diesel add-on for streaming large tables |
homepage | |
repository | https://github.com/jurshsmith/diesel-streamer |
max_upload_size | |
id | 921878 |
size | 21,101 |
A tiny diesel add-on for streaming large tables. It currently allows streaming large serial tables using a cursor-based streaming strategy.
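Conceptually, the cursor-based strategy fetches the table in bounded chunks and advances a cursor over the serial column between fetches, so only one chunk is held in memory at a time. The loop below is only an illustrative sketch of that general pattern, not the crate's internals; fetch_chunk and serial_value are hypothetical helpers standing in for the actual diesel queries.

// Illustrative sketch of cursor-based chunking, not diesel-streamer's internal code.
fn stream_in_chunks<Row>(
    mut fetch_chunk: impl FnMut(i64, i64) -> Vec<Row>, // hypothetical: (cursor, chunk_size) -> rows with serial_field > cursor
    serial_value: impl Fn(&Row) -> i64,                // hypothetical: reads the row's serial field
    mut process: impl FnMut(Vec<Row>),                 // the caller's callback
    mut cursor: i64,
    chunk_size: i64,
) {
    loop {
        let rows = fetch_chunk(cursor, chunk_size);
        if rows.is_empty() {
            break; // nothing left beyond the cursor
        }
        // advance the cursor past the largest serial value in this chunk
        cursor = rows.iter().map(|row| serial_value(row)).max().unwrap_or(cursor);
        process(rows);
    }
}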
For a regular synchronous runtime:
[dependencies]
diesel-streamer = { version = "0.1.12", features = ["sync"] }
For the tokio async runtime:
[dependencies]
diesel-streamer = { version = "0.1.12", features = ["async"] }
Stream SomeTable that has a serial_field:
use diesel_streamer::stream_serial_table;

async fn main() {
    use crate::schema::some_table::dsl::{some_table, serial_field};

    // `pool` here is your async connection pool
    let mut conn = pool.get().await.unwrap();

    // with the default chunk size of 500
    stream_serial_table!(some_table, serial_field, conn, |streamed_table_data: Vec<SomeTable>| {
        // do work here
        dbg!(streamed_table_data);
    });

    // with a specified chunk size, 130
    stream_serial_table!(some_table, serial_field, conn, 130, |streamed_table_data: Vec<SomeTable>| {
        // do work here
        dbg!(streamed_table_data);
    });

    // with the cursor's beginning, 5
    stream_serial_table!(some_table, serial_field, conn, 130, 5, |streamed_table_data: Vec<SomeTable>| {
        // do work here
        dbg!(streamed_table_data);
    });

    // with the cursor's end, 50
    stream_serial_table!(some_table, serial_field, conn, 130, 5, 50, |streamed_table_data: Vec<SomeTable>| {
        // do work here
        dbg!(streamed_table_data);
    });
}
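When the "sync" feature is enabled, the same macro is used from a blocking context instead. The snippet below is a minimal sketch under the assumption that the sync variant takes the same arguments and that pool is a blocking (r2d2-style) pool; treat those details as assumptions rather than documented behaviour.

use diesel_streamer::stream_serial_table;

fn main() {
    use crate::schema::some_table::dsl::{some_table, serial_field};

    // Assumption: a blocking pool, so no .await is needed here.
    let mut conn = pool.get().unwrap();

    // Same argument shape as above: table, serial field, connection,
    // optional chunk size / cursor bounds, then the callback.
    stream_serial_table!(some_table, serial_field, conn, |streamed_table_data: Vec<SomeTable>| {
        dbg!(streamed_table_data);
    });
}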
Defaults:
- chunk size: 500
- cursor's beginning: the minimum serial_field in the table
- cursor's end: the maximum serial_field in the table

N.B.: Generally, streaming should only be considered when there is a possibility of hitting an OOM error while processing the table in question.
Spin up a test DB using docker-compose up, or simply specify a DB URL in .env as shown in .env.sample.
Run cargo test to run the tests.
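If you go the .env route, the file only needs the database connection string. The exact variable name and URL are defined by .env.sample; the line below uses diesel's conventional DATABASE_URL and a local Postgres URL purely as a hypothetical illustration.

# Hypothetical .env contents; check .env.sample for the actual variable name.
DATABASE_URL=postgres://postgres:postgres@localhost:5432/diesel_streamer_test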