spark-config

A lightweight crate to manage Spark configuration in Rust applications.

Overview

The spark-config crate provides a structured way to load, validate, and access configuration settings for Spark applications. It handles Bitcoin network settings, operator configurations, service provider connections, and integrations with Mempool, Electrs, and LRC20 Node services.

Features

  • Load configuration from TOML files
  • Support for multiple Bitcoin networks (Mainnet, Testnet, Signet, Regtest)
  • Configuration for Spark operators and service providers
  • Integration with Mempool, Electrs, and LRC20 Node services
  • Automatic validation of configuration values
  • Flexible path resolution (absolute paths, HOME-relative paths)

Feature Flags

The crate provides the following optional features that can be enabled in your Cargo.toml:

  • with-serde: Enables serialization and deserialization support using serde. This is required for loading configuration from TOML files.
  • bitcoin-conversion: Enables conversion between BitcoinNetwork and bitcoin::Network types from the bitcoin crate.

Example usage in your Cargo.toml:

[dependencies]
spark-config = { version = "0.0.1", features = ["with-serde", "bitcoin-conversion"] }
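
As a rough illustration of the bitcoin-conversion feature, the sketch below assumes the conversion between BitcoinNetwork and bitcoin::Network is exposed through a standard From/Into implementation; the exact conversion API may differ, so treat this as a sketch rather than the crate's documented interface.

use spark_config::SparkConfig;
use anyhow::Result;

fn main() -> Result<()> {
    let config = SparkConfig::<String>::from_toml_file("config/config.toml")?;

    // Assumes the `bitcoin-conversion` feature provides an Into/From impl;
    // check the crate docs for the exact conversion mechanism.
    let network: bitcoin::Network = config.bitcoin_network().into();
    println!("Bitcoin network: {:?}", network);
    Ok(())
}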

Usage

Add spark-config to your Cargo.toml:

[dependencies]
# Option 1: Using local path
spark-config = { path = "crates/generic/spark-config" }

# Option 2: Using version and path
spark-config = { version = "0.0.1", path = "crates/generic/spark-config" }

# Option 3: From crates.io (when published)
spark-config = "0.0.1"

Loading Configuration

use spark_config::SparkConfig;
use anyhow::Result;

fn main() -> Result<()> {
    // Load config from a TOML file (absolute path)
    let config = SparkConfig::<String>::from_toml_file("/path/to/config.toml")?;

    // Or using a relative path (resolved relative to HOME)
    let config = SparkConfig::<String>::from_toml_file("config/config.toml")?;

    Ok(())
}

Note: Configuration loading returns a Result type using the anyhow crate for error handling. Errors can occur if:

  • The configuration file cannot be found or opened
  • The file cannot be read
  • The TOML syntax is invalid
  • Required fields are missing or have invalid values

You should handle these errors appropriately in your application.
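
One way to do that, shown here as a minimal sketch rather than a prescribed pattern, is to wrap the load in anyhow's Context so the failing path is attached to whatever I/O, TOML, or validation error bubbles up:

use spark_config::SparkConfig;
use anyhow::{Context, Result};

fn load_config(path: &str) -> Result<SparkConfig<String>> {
    // `with_context` attaches the path to the underlying error.
    SparkConfig::<String>::from_toml_file(path)
        .with_context(|| format!("failed to load Spark configuration from {path}"))
}

fn main() {
    match load_config("config/config.toml") {
        Ok(_config) => println!("configuration loaded"),
        // `{:#}` prints the full anyhow error chain.
        Err(err) => eprintln!("configuration error: {err:#}"),
    }
}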

Accessing Configuration Values

// Get Bitcoin network
let network = config.bitcoin_network();

// Get Mempool configuration
if let Some(mempool_url) = config.mempool_base_url() {
    println!("Mempool URL: {}", mempool_url);
}

// Get authentication details
if let Some((username, password)) = config.mempool_auth() {
    // Use credentials...
}

// Access service provider endpoint
let ssp_key = "lightspark";
if let Some(endpoint) = config.ssp_endpoint(&ssp_key.to_string()) {
    println!("SSP endpoint: {}", endpoint);
}

// Get coordinator operator
if let Some(coordinator) = config.coordinator_operator() {
    println!("Coordinator base URL: {}", coordinator.base_url());
}

Configuration File Format

The configuration is stored in TOML format. Here's an example structure:

# Bitcoin network (Mainnet, Testnet, Signet, Regtest)
bitcoin_network = "Regtest"

# Mempool configuration
[mempool]
base_url = "https://regtest-mempool.example.com"
username = "mempool-username"
password = "mempool-password"

# Electrs configuration
[electrs]
base_url = "https://regtest-mempool.example.com/api"
username = "electrs-username"
password = "electrs-password"

# LRC20 Node configuration
[lrc20_node]
base_url = "https://lrc20node.example.com/api"

# Service Provider configurations
[ssp_pool.lightspark]
base_url = "https://api.example.com"
schema_endpoint = "graphql"
identity_public_key = "02000000000000000000000000000000000000000000000000000000000000abcd"
running_authority = "Lightspark"

# Operator configurations
[[operators]]
id = 1
base_url = "https://0.spark.lightspark.com"
identity_public_key = "02000000000000000000000000000000000000000000000000000000000000abcd"
frost_identifier = "01000000000000000000000000000000000000000000000000000000000000abcd"
running_authority = "Lightspark"
is_coordinator = true

Environment Variables

When running tests, you can specify a custom configuration path using the SPARK_CONFIG_PATH environment variable:

SPARK_CONFIG_PATH=/path/to/test/config.toml cargo test
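
A minimal sketch of a test that honours this variable might look like the following; the fallback path and test name are illustrative, not part of the crate.

use spark_config::SparkConfig;

#[test]
fn loads_config_from_env_path() -> anyhow::Result<()> {
    // Fall back to a hypothetical default path when the variable is unset.
    let path = std::env::var("SPARK_CONFIG_PATH")
        .unwrap_or_else(|_| "config/config.toml".to_string());

    let config = SparkConfig::<String>::from_toml_file(&path)?;
    // Smoke check: the loaded configuration exposes a Bitcoin network.
    let _network = config.bitcoin_network();
    Ok(())
}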

License

Licensed under either of:
