Crates.io | aws-dynamo-derive |
lib.rs | aws-dynamo-derive |
version | |
source | src |
created_at | 2024-09-04 10:07:14.318514 |
updated_at | 2025-02-10 06:03:09.643543 |
description | Helper crate for aws-sdk-dynamodb. |
homepage | |
repository | https://github.com/ffddw/aws-dynamo-derive |
max_upload_size | |
id | 1363133 |
size | 0 |
Helper crate for aws-sdk-dynamodb.
Generates conversion code from Rust primitive types to AWS DynamoDB types. Works well with nested types!
use aws_dynamo_derive::{Item, Table};
#[derive(Table)]
struct Foo {
#[aws_dynamo(hash_key)]
pub name: String,
pub value: Value
}
#[derive(Item, Clone)]
struct Value {
pub numbers: Vec<u64>,
pub list_of_ss: Vec<Vec<String>>,
}
This generates:
{
"Value": M(
{
"numbers": Ns(["1", "2", "3"]),
"list_of_ss": L([Ss(["one"]), Ss(["two"]), Ss(["three"])])
}
),
"name": S("foo_value")
}
By applying the #[aws_dynamo(rename = "SOME_CASE")] attribute to a Table or Item, you can change the case of all member fields.
use aws_dynamo_derive::Table;
#[derive(Table)]
#[aws_dynamo(rename = "PascalCase")]
struct FooTable {
#[aws_dynamo(hash_key)]
pub some_item: String
}
The FooTable member some_item becomes SomeItem as a result of the rename attribute. snake_case is the default.
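The renaming itself is an ordinary snake_case-to-PascalCase conversion of the field names. A minimal std-only sketch of that transformation (a hand-written helper for illustration, not the macro's actual implementation):

```rust
/// Convert a snake_case identifier to PascalCase, e.g. "some_item" -> "SomeItem".
fn to_pascal_case(s: &str) -> String {
    s.split('_')
        .filter(|seg| !seg.is_empty())
        .map(|seg| {
            // Uppercase the first character of each segment, keep the rest as-is.
            let mut chars = seg.chars();
            match chars.next() {
                Some(first) => first.to_uppercase().collect::<String>() + chars.as_str(),
                None => String::new(),
            }
        })
        .collect()
}

fn main() {
    // "some_item" becomes the attribute name "SomeItem", as in the example above.
    assert_eq!(to_pascal_case("some_item"), "SomeItem");
    assert_eq!(to_pascal_case("list_of_ss"), "ListOfSs");
}
```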
Struct fields decorated with #[aws_dynamo(hash_key)] add KeyType::Hash KeySchemas, and the macro maps the fields' data types to AttributeDefinitions.
Available KeySchemas:
range_key
hash_key
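Conceptually, the type-to-AttributeDefinition step is a small lookup from a field's Rust type to a DynamoDB attribute type name. A simplified std-only sketch of that mapping (with string type names standing in for the SDK's ScalarAttributeType; this is not the generated code):

```rust
// Simplified stand-in for the field types the macro inspects.
enum FieldType {
    String,
    Number, // i8 | u8 | .. | u128
    Blob,
}

/// Map a field type to its DynamoDB attribute type name,
/// mirroring the AttributeDefinition mappings listed below.
fn attribute_type(ty: &FieldType) -> &'static str {
    match ty {
        FieldType::String => "S",
        FieldType::Number => "N",
        FieldType::Blob => "B",
    }
}

fn main() {
    // A hash_key field of type String yields an "S" attribute definition.
    assert_eq!(attribute_type(&FieldType::String), "S");
    assert_eq!(attribute_type(&FieldType::Number), "N");
    assert_eq!(attribute_type(&FieldType::Blob), "B");
}
```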
AttributeDefinition mappings:
String -> S
i8 | u8 | .. | u128 -> N
Blob -> B
AttributeValue mappings:
String -> S
bool -> Bool
Blob -> B
i8 | u8 | .. | u128 -> N
Vec<String> -> Ss
Vec<T> where T: i8 | u8 | .. | u128 -> Ns
Vec<Blob> -> Bs
Option<()> -> Null
Vec<T> that is not Ss | Ns | Bs -> L
HashMap<String, T> -> M; inner values of the HashMap are automatically converted to AttributeValue types.
Fields whose types derive Item can also be converted into AttributeValue.
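Note that DynamoDB number sets carry their numbers as strings, which is why the Vec<u64> field in the earlier output appears as Ns(["1", "2", "3"]). A std-only sketch of that stringification step (for illustration; not the macro's actual code):

```rust
/// Render a slice of numbers as the list of strings stored in a DynamoDB Ns value.
fn to_number_set(numbers: &[u64]) -> Vec<String> {
    numbers.iter().map(|n| n.to_string()).collect()
}

fn main() {
    // Matches the Ns(["1", "2", "3"]) shape shown in the generated output above.
    assert_eq!(to_number_set(&[1, 2, 3]), vec!["1", "2", "3"]);
}
```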
KeySchemas and AttributeDefinitions for LSIs are parsed and expanded into create_table() if you use the following attributes:
#[aws_dynamo(local_secondary_index(index_name = "lsi1", hash_key))]
#[aws_dynamo(local_secondary_index(index_name = "lsi1", range_key))]
If you specify LSIs with the above attributes, you must attach LocalSecondaryIndexBuilders to the CreateTableFluentBuilder so that the LSIs are created upon table creation. You can get a Vec<KeySchemaElement> using get_local_secondary_index_key_schemas() and pass it to the set_key_schema() method of LocalSecondaryIndexBuilder.
Take a look at test_local() to learn how to use LSIs.
KeySchemas and AttributeDefinitions for GSIs are parsed and expanded into create_table() if you use the following attributes:
#[aws_dynamo(global_secondary_index(index_name = "gsi1", hash_key))]
#[aws_dynamo(global_secondary_index(index_name = "gsi1", range_key))]
If you specify GSIs with the above attributes, you must attach GlobalSecondaryIndexBuilders to the CreateTableFluentBuilder so that the GSIs are created upon table creation. You can get a Vec<KeySchemaElement> using get_global_secondary_index_key_schemas() and pass it to the set_key_schema() method of GlobalSecondaryIndexBuilder.
Take a look at test_local() to learn how to use GSIs.
from_attribute_value converts a HashMap<String, AttributeValue> into Rust types. If any field type does not match the given AttributeValue type, it returns Err(AttributeValue).
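That error contract can be pictured with a small stand-in enum (this AttrValue is a simplified stand-in for the SDK's AttributeValue, and as_string is a hypothetical helper, not part of the crate):

```rust
// Simplified stand-in for aws_sdk_dynamodb's AttributeValue.
#[derive(Debug, PartialEq)]
enum AttrValue {
    S(String),
    N(String),
}

/// Try to read a String field; on a type mismatch, hand the value back as Err,
/// mirroring from_attribute_value's Err(AttributeValue) behavior.
fn as_string(v: AttrValue) -> Result<String, AttrValue> {
    match v {
        AttrValue::S(s) => Ok(s),
        other => Err(other),
    }
}

fn main() {
    assert_eq!(as_string(AttrValue::S("foo".into())), Ok("foo".to_string()));
    // An N value cannot become a String field, so it comes back in Err.
    assert!(as_string(AttrValue::N("42".into())).is_err());
}
```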
The macro tries to convert all possible types, which incurs extra allocation while iterating over the items of collection types like Vec or HashMap. If the type is very complex or large, you might want to benchmark before using it.