| Field | Value |
| --- | --- |
| Crates.io | terrars |
| lib.rs | terrars |
| version | 0.1.13 |
| source | src |
| created_at | 2022-12-03 14:07:00.680039 |
| updated_at | 2024-07-21 19:51:42.347966 |
| description | Terraform in Rust |
| homepage | |
| repository | https://github.com/andrewbaxter/terrars |
| max_upload_size | |
| id | 729124 |
| size | 139,267 |
Terrars is a tool for building Terraform stacks in Rust. This is an alternative to the CDK.
See a working example in helloworld.
Current status: Usable, but may have some rough edges and missing features. I may continue to tweak things to improve ergonomics.
Why use this or the CDK instead of raw Terraform?

Why use this instead of the CDK?

The CDK requires `terraform`, a `cdk` CLI, Javascript tools, Javascript package directories, and, depending on which language you use, that language itself as well. CDK generation requires a JSON spec -> TypeScript -> generated JavaScript -> final language translation process. `terrars` only requires `terraform`, both during generation and at runtime, and goes directly from the JSON spec to Rust.

Why not use this instead of the CDK?
Note: There's a full, working example in helloworld.
Add `terrars` and pre-generated bindings such as `terrars-andrewbaxter-stripe`, or else generate your own (see Generation below), to your project. Enable the features you want to use in the bindings.
Develop your code (ex: `build.rs`).

Create a `Stack` and set up providers:
```rust
let mut stack = &mut BuildStack {}.build();
BuildProviderStripe {
    token: STRIPE_TOKEN,
}.build(stack);
```
The first provider instance for a provider type will be used by default for that provider's resources, so you don't need to bind it.
Then create resources:

```rust
let my_product = BuildProduct {
    name: "My Product".into(),
}.build(stack);
let my_price = BuildPrice {
    ...
}.build(stack);
my_price.set_product(my_product.id());
...
```
Finally, write the stack out:

```rust
fs::write("mystack.tf.json", &stack.serialize("state.json")?)?;
```
Call `terraform` as usual in the directory you generated `mystack.tf.json` in. (`Stack` also has methods `run()` and `get_output()` to call `terraform` for you. You must have `terraform` in your path.)
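If you'd rather drive `terraform` yourself from your Rust deploy binary instead of using `run()`, a minimal sketch might look like the following (illustrative only; it assumes `terraform` is on your PATH and `mystack.tf.json` was written to the current directory):

```rust
use std::process::Command;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Illustrative only: run terraform in the directory containing
    // mystack.tf.json; terraform must be on your PATH.
    for args in [vec!["init"], vec!["apply", "-auto-approve"]] {
        let status = Command::new("terraform").args(&args).status()?;
        if !status.success() {
            return Err(format!("terraform {:?} failed", args).into());
        }
    }
    Ok(())
}
```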
While there are premade crates for some providers, you can generate code for new providers locally using `terrars-generate`.
Install the generate CLI with `cargo install terrars`.

Create a config file. As an example, to use `hashicorp/aws`, create a JSON file (ex: `terrars_aws.json`) with the specification of what you want to generate:
```json
{
    "provider": "hashicorp/aws",
    "version": "4.48.0",
    "include": [
        "cognito_user_pool",
        "cognito_user_pool_client",
        "cognito_user_pool_domain",
        "cognito_user_pool_ui_customization",
        "route53_zone",
        "route53_record",
        "acm_certificate",
        "acm_certificate_validation"
    ],
    "dest": "src/bin/mydeploy/tfschema/aws"
}
```
`tfschema/aws` must be an otherwise unused directory - it will be wiped when you generate the code. If `include` is missing or empty, this will generate everything (alternatively, you can use `exclude` to blacklist resources/datasources). Resources and datasources don't include the provider prefix (`aws_` in this example). Datasources start with `data_`.
Make sure you have `terraform` in your `PATH`. Run `cargo install terrars`, then `terrars-generate terrars_aws.json`.
The first time you do this, create a `src/bin/mydeploy/tfschema/mod.rs` file with these contents to root the generated provider:

```rust
pub mod aws;
```
There are `Build*` structs containing required parameters and a `build` method for most schema items (resources, stack, variables, outputs, etc.). The `build` method registers the item in the `Stack` if applicable. Optional parameters can be set on the value returned from `build`.
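As a sketch of the pattern, reusing the Stripe example from above (the `set_description` setter here is hypothetical; the generated bindings derive their setter names and types from the provider schema):

```rust
// Required parameters live on the Build* struct; build() registers the
// item in the stack and returns a handle.
let product = BuildProduct {
    name: "My Product".into(),
}.build(stack);

// Optional parameters are set on the returned value.
// NOTE: set_description is a hypothetical setter used for illustration;
// check the generated bindings for the real setter names.
product.set_description("An example product");
```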
Background: In Terraform, all fields regardless of type can be assigned a string template expression for values computed during stack application. Since all strings can potentially be templates, non-template strings must be escaped to avoid accidental interpolation.
How `terrars` handles it: When defining resources and calling methods, `String` and `&str` will be treated as non-template strings and appropriately escaped. To avoid the escaping, you can produce a `PrimExpr` object via `stack.str_expr` (to produce an expr that evaluates to a string) or `stack.expr` for other expression types. To produce the expression body you can use `format!()` as usual, but note: you must call `.raw()` on any `PrimExpr`s you use in the new expression to avoid double-antiescaping issues.
If Terraform gives you an error about something with the text `_TERRARS_SENTINEL*`, it means you probably missed a `.raw()` call on that value (some expression was double-antiescaped).
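As a sketch of the rules above (the exact signatures of `str_expr` and the generated accessors may differ; treat this as the shape of the pattern, not the precise API):

```rust
// Plain &str / String values passed to setters are treated as literals
// and escaped automatically, so nothing special is needed for them.
//
// my_product.id() is an expression value, so call .raw() on it when
// embedding it in a new expression body to avoid double-antiescaping.
// (Signatures here are assumptions; consult the generated docs.)
let product_expr = stack.str_expr(&format!("product-{}", my_product.id().raw()));
```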
As a rule of thumb:

- Expression to string/field is OK. The expression gets turned into a sentinel value and interpolated during writing the template.
- String/field with no sentinel values (literals, etc.) to expression is OK.
- String/field containing sentinel values to expression is BAD. The sentinel replacement will happen twice and you'll have broken data. This can only happen if you convert an expression into a string and then back, so it shouldn't happen often.

Lists, sets, and record references have a `.map` method which takes care of all the different "for" methods in Terraform. Specifically:

- `.map` and define a resource: does resource-level for-each (per Terraform limitations, this cannot be done on lists derived from other resources, so it has very limited use; you should probably just use a `for` loop).
- `.map` and define a block element: does block-level for-each.
- `.map` and return an attribute reference: produces an attribute `for` expression.

`.map` always produces a list reference, but this can be assigned to set fields as well. `.map_rec` is similar to `.map` but results in a record.
There are two helper macros for generating vecs and maps of primitive values:

- `primvec![v, ...]` - creates a vec of primitive values, converting each value into a primitive if it is not. Use like `primvec!["stringone", "stringtwo"]` (easier than `vec!["stringone".into(), "stringtwo".into()]`).
- `primmap!{"k" = v, ...}` - creates a map of strings to primitive values, converting each value into a primitive if it is not. Same as above, it performs automatic conversion.

Terraform provides a method to output provider schemas as JSON. This tool uses that schema to generate structures that output matching JSON Terraform stack files.
Take as an example:

```rust
format!("{}{}", my_expr, verbatim_string)
```

This code would somehow need to escape the pattern and `verbatim_string`, while leaving `my_expr` unescaped, and the result would need to be treated as an "expression" to prevent escaping if it's used again in another `format!` or something similar. This applies not just to `format!` but also to serde serialization (JSON) and other methods.

For now, Terrars uses a simple (somewhat dirty) hack to avoid this. All expressions are put into a replacement table, and a sentinel string (ex: `_TERRARS_SENTINEL_99_`) is used instead. During final stack JSON serialization, the strings are escaped and then the original expression text is substituted back in, replacing the sentinel text.
This way, all normal string formatting methods should retain the expected expressions.
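As a rough illustration of that substitution step (a simplified, self-contained sketch, not terrars' actual implementation; the resource address below is made up):

```rust
use std::collections::HashMap;

// Simplified sketch of the sentinel approach: serialization escapes the
// literal text first, then each sentinel is swapped back for the
// original expression so it survives untouched.
fn substitute_sentinels(serialized: &str, exprs: &HashMap<String, String>) -> String {
    let mut out = serialized.to_string();
    for (sentinel, expr) in exprs {
        out = out.replace(sentinel, expr);
    }
    out
}

fn main() {
    let mut exprs = HashMap::new();
    exprs.insert(
        "_TERRARS_SENTINEL_0_".to_string(),
        "${stripe_product.my_product.id}".to_string(),
    );
    // The escaped JSON still contains the sentinel; substitution restores
    // the expression after escaping has already happened.
    let escaped = r#""product-_TERRARS_SENTINEL_0_""#;
    println!("{}", substitute_sentinels(escaped, &exprs));
}
```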
Current limitations:

- Not all Terraform features have been implemented. The only one I'm aware of missing at the moment is resource provisioning.
- `ignore_changes` takes strings rather than an enum.
- No variable or output static type checking. I'd like to add a derive macro for generating variables/outputs automatically from a structure at some point.
- Non-local deployment methods. I think this is easy, but I haven't looked into it yet.
I originally called this `terrarust`, but then I realized it sounded like "terrorist", so I decided to play it safe and chopped out the `u` and `t` (which stands for unreal tournament).