| Crates.io | sqlx-batch |
| lib.rs | sqlx-batch |
| version | 0.1.1 |
| created_at | 2025-06-08 11:34:26.805976+00 |
| updated_at | 2025-06-09 14:19:31.197921+00 |
| description | Procedural macro for ergonomic insertion of vecs of structs into a PostgreSQL database. |
| homepage | |
| repository | https://github.com/chrismostert/sqlx-batch |
| max_upload_size | |
| id | 1704813 |
| size | 66,180 |
Procedural macro for ergonomic insertion of vecs of structs into a PostgreSQL database.
Bulk inserts in sqlx can be quite cumbersome to write manually. Consider the case where we have some vector of structs:
```rust
struct SomeStruct {
    elem_1: i32,
    elem_2: Option<String>,
    elem_3: Option<bool>,
}

let to_insert = vec![
    SomeStruct {
        elem_1: 123,
        elem_2: None,
        elem_3: Some(true),
    },
    SomeStruct {
        elem_1: 345,
        elem_2: Some(String::from("Hello")),
        elem_3: None,
    },
];
```
If we want to insert this vec of structs into a Postgres database we would first have to collect the struct fields into three individual vectors and manually write the UNNEST query.
Instead, we can use sqlx-batch to write this for us. Consider the same struct, annotated as follows:
```rust
use sqlx_batch::BatchInserter;

#[derive(BatchInserter)]
#[pgtable = "test_table"]
struct SomeStruct {
    elem_1: i32,
    elem_2: Option<String>,
    #[colname = "renamed_elem_3"]
    elem_3: Option<bool>,
}
```
It expands to the following struct definition, with helper functions that perform the transformation for us:
```rust
#[derive(Default, Debug, PartialEq)]
struct SomeStructInserter {
    elem_1: Vec<i32>,
    elem_2: Vec<Option<String>>,
    elem_3: Vec<Option<bool>>,
}

impl SomeStructInserter {
    fn new() -> Self {
        Self::default()
    }

    fn from(items: Vec<SomeStruct>) -> Self {
        items
            .into_iter()
            .fold(Self::default(), |mut inserter, item| {
                inserter.add(item);
                inserter
            })
    }

    fn add(&mut self, item: SomeStruct) {
        self.elem_1.push(item.elem_1);
        self.elem_2.push(item.elem_2);
        self.elem_3.push(item.elem_3);
    }

    fn build(self) -> sqlx::query::Query<'static, sqlx::Postgres, sqlx::postgres::PgArguments> {
        sqlx::query!(
            "INSERT INTO \"test_table\" (elem_1,elem_2,\"renamed_elem_3\") SELECT * FROM UNNEST ($1::integer[],$2::text[],$3::bool[])",
            &self.elem_1[..],
            &self.elem_2[..] as &[Option<String>],
            &self.elem_3[..] as &[Option<bool>]
        )
    }
}
```
Note that the name of the generated struct will always be `{ANNOTATED_STRUCT}Inserter`, so `SomeStruct` becomes `SomeStructInserter`.
We can then create the sqlx query by calling:
```rust
SomeStructInserter::from(elems).build()
```
You can also annotate one or more fields with the #[key] attribute to perform an upsert, where the marked fields form the primary key or uniqueness constraint. The following:
```rust
#[derive(BatchInserter)]
#[pgtable = "keyed_test_table"]
struct SomeKeyedStruct {
    elem_1: i32,
    elem_2: Option<String>,
    #[key]
    #[colname = "renamed_elem_3"]
    elem_3: bool,
}
```
will generate the following SQL in the resulting sqlx query after calling build():
```sql
INSERT INTO "keyed_test_table" (elem_1,elem_2,"renamed_elem_3") SELECT * FROM UNNEST ($1::integer[],$2::text[],$3::bool[])
ON CONFLICT ("renamed_elem_3") DO UPDATE
SET elem_1=excluded.elem_1,elem_2=excluded.elem_2
```
The mapping of Rust types to Postgres types (needed for the casts in the UNNEST clause) is far from exhaustive. Currently, the proc-macro simply panics when it encounters a type it does not know how to translate.