| Field | Value |
|---|---|
| Crates.io | spark-rust |
| lib.rs | spark-rust |
| version | 0.1.11 |
| created_at | 2025-03-14 18:28:15.067213+00 |
| updated_at | 2025-04-03 20:37:37.813581+00 |
| description | Rust Development Kit for Spark |
| homepage | https://docs.spark.info |
| repository | https://github.com/polarityorg/spark-rs |
| max_upload_size | |
| id | 1592593 |
| size | 741,192 |
This workspace is the semi-official Rust development environment for Spark. This crate forms a complete wallet with all necessary Spark utilities. The cryptographic primitives are provided by the spark-cryptography crate.
The Spark Wallet SDK has five components:

- `config`: Configuration for the Spark wallet, found in the `config` directory.
- `handlers`: User-facing APIs for the Spark wallet, found in the `handlers` directory. Examples illustrating typical usage are provided below.
- `internal_handlers`: Internal service handlers for coordinating signing processes and Spark RPC communications, documented in the `internal_handlers` directory.
- `rpc`: An RPC client for establishing secure connections to Spark nodes, handling TLS configuration, and creating service-specific clients.
- `signer`: Comprehensive key management, storage, and signing capabilities, fully conforming to the traits found in `src/signer/traits`. A convenient built-in signer (`default_signer.rs`) is included for quick and straightforward integration.

Make sure that you have protobuf installed:

```sh
# Make sure you have protos installed
brew install protobuf
```
Also, make sure you have at least Rust 1.75.0; ideally, use the latest stable version.
```sh
# For version 1.75.0
rustup update
rustup install 1.75.0

# For the latest stable version
rustup update stable
```
The following example walks through the core wallet flow: initialize the SDK, deposit from L1, and claim incoming transfers.

```rust
use spark_rust::signer::default_signer::DefaultSigner;
use spark_rust::{SparkNetwork, SparkSdk, SparkSdkError}; // error re-export path may differ by version
use std::time::Duration;
use tokio::time::sleep;

#[tokio::main]
async fn main() -> Result<(), SparkSdkError> {
    // Initialize the default signer. Alternatively, you can create a custom signer,
    // as long as it implements all the necessary signing traits. In that case, it is
    // your responsibility to make sure the signer is safe to use and works as expected.
    let mnemonic = "abandon ability able about above absent absorb abstract absurd \
                    abuse access accident";
    let default_signer = DefaultSigner::from_mnemonic(mnemonic, SparkNetwork::Regtest).await?;
    let sdk = SparkSdk::new(SparkNetwork::Regtest, default_signer).await?;

    // Generate a deposit address. Note: this deposit address is one-time use only!
    let deposit_address = sdk.generate_deposit_address().await?;
    println!("Deposit address: {}", deposit_address.deposit_address);

    // Send a deposit to this address on L1 and Spark will detect it. You can choose
    // the amount of sats; this line sends 100,000 sats to the deposit address.
    // (`l1_wallet` stands in for whatever on-chain wallet you control.)
    let txid = l1_wallet.send_to_address(deposit_address.deposit_address, 100_000).await?;

    // For Regtest, sleep for 30 seconds while the deposit confirms.
    sleep(Duration::from_secs(30)).await;

    // Claim the deposit.
    let _deposits = sdk.claim_deposit(txid).await?;
    let balance = sdk.get_bitcoin_balance();
    assert_eq!(balance, 100_000);

    // Also, query the incoming transfers from other Spark users.
    println!("Querying pending transfers...");
    let pending = sdk.query_pending_transfers().await?;
    for transfer in &pending {
        println!("Transfer: {:?} satoshis", transfer.total_value);
    }

    // So far, you have NOT claimed these transfers.
    // Claim them by calling `sdk.claim_transfers()`.
    sdk.claim_transfers().await?;

    // And now, your Bitcoin balance should be updated.
    let balance = sdk.get_bitcoin_balance();
    println!("Balance after claiming transfers: {} satoshis", balance);
    Ok(())
}
```
The Spark network operates with fee structures that are different from traditional Bitcoin or Lightning wallets. All fees in Spark are service fees charged by the Spark Service Provider (SSP) for various operations they facilitate on behalf of users.
Each fee type has a corresponding estimation method that helps you determine the cost before performing the actual operation. The fee structure is designed to be transparent and predictable.
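As a sketch of that estimate-then-act pattern (assuming an initialized `sdk` and a BOLT11 `invoice` string; the individual fee-estimation methods are documented later on this page):

```rust
// Estimate the SSP service fee first, then act only if it is acceptable.
let fee_estimate = sdk.get_lightning_send_fee_estimate(invoice.clone()).await?;

// The 200-sat budget below is a hypothetical application policy, not an SDK constant.
if fee_estimate.fees <= 200 {
    sdk.pay_lightning_invoice(&invoice).await?;
} else {
    println!("Fee of {} sats exceeds budget; skipping payment", fee_estimate.fees);
}
```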
Note: You may want to revisit this section after reading the API docs and understanding how the SDK works through sequential API calls.
The Spark SDK includes locking mechanisms to coordinate wallet operations, but has important limitations when used in multi-threaded or concurrent environments.
Despite internal locking, concurrent operations that modify wallet state can lead to race conditions and unexpected behavior, particularly with leaf selection and swap operations.
The SDK uses internal locks to protect wallet state, but these locks have limitations in async environments. Complex operations involving network calls and leaf selection may not maintain exclusive access throughout their entire lifecycle.
When executing transfers, the SDK automatically performs leaf selection to find appropriate UTXOs for the transfer amount. If a leaf with the exact amount isn't available, the SDK requests a swap with the SSP to optimize denominations.
Example scenario:
```rust
// Assume the user has 2000 sats total in their wallet, divided into two leaves:
// - leaf1 = 1000 sats
// - leaf2 = 1000 sats
// Each leaf represents a UTXO that can be spent.

// UNSAFE: concurrent transfers that will likely fail
let (first_transfer_result, second_transfer_result) = tokio::join!(
    sender_sdk.transfer(800, &receiver_address),
    sender_sdk.transfer(1200, &receiver_address)
);
```
Here's what happens in this concurrent scenario: neither amount matches an existing leaf exactly, so each transfer triggers a swap with the SSP; the first swap locks the wallet's leaves, and the other operation fails with a LeafSelectionInsufficientFunds error. The key issue is that swap operations temporarily lock all available funds, not just the amount being transferred. This is necessary for the secure swap protocol with the SSP.
Solution: Sequential Operations
```rust
// SAFE: sequential transfers will both succeed
let result1 = sender_sdk.transfer(1200, &receiver_address).await?;

// After the first transfer completes, the wallet has 800 sats remaining
let result2 = sender_sdk.transfer(800, &receiver_address).await?;
```
With sequential operations, each transfer (including any swap it requires) completes before the next begins. Even though there are sufficient total funds for both operations (1200 + 800 = 2000), running them concurrently leads to failures because the swap protocol temporarily locks more funds than are being transferred.
During swap operations or other leaf-modifying processes, your wallet's available balance may appear lower than the actual total balance. This occurs because leaves are locked during operations but may not yet be fully processed.
Balance reporting is always conservative - your balance may appear less than it actually is, but never more.
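As a minimal illustration (assuming a swap or transfer may be in flight in another task):

```rust
// While leaves are locked by an in-flight operation, they are excluded from
// the reported balance, so the figure below may temporarily under-report.
let reported = sdk.get_bitcoin_balance();
println!("Spendable right now: {} sats (locked leaves excluded)", reported);
```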
To avoid race conditions and ensure predictable behavior:

- **Serialize leaf-modifying operations.** Run them one at a time, using the sequential pattern shown below.
- **Safe concurrent operations.** Read-only calls, such as balance and transfer-history queries, are generally safe to perform concurrently.
- **Operations that should be serialized.** Leaf-modifying calls such as `transfer`, `request_leaves_swap`, `claim_transfers`, and `withdraw` should never run concurrently with other leaf-modifying operations.
Instead of concurrent transfers, use sequential processing:
```rust
// UNSAFE: concurrent transfers that may both require swaps
// let (result1, result2) = tokio::join!(
//     sender_sdk.transfer(1200, &receiver_address),
//     sender_sdk.transfer(800, &receiver_address)
// );

// SAFE: sequential transfers
let result1 = sender_sdk.transfer(1200, &receiver_address).await?;
let result2 = sender_sdk.transfer(800, &receiver_address).await?;
```
We plan to enhance thread safety in future versions of the SDK.
Until these improvements are in place, applications using the Spark SDK should take care to avoid concurrent leaf-modifying operations.
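Until then, one simple pattern is to funnel all leaf-modifying calls through a single async mutex. A minimal sketch (not part of the SDK; assumes the `sdk` and `receiver_address` values from the earlier examples):

```rust
use std::sync::Arc;
use tokio::sync::Mutex;

// Wrap the SDK so only one leaf-modifying operation can run at a time.
let shared_sdk = Arc::new(Mutex::new(sdk));

let sdk_for_task = Arc::clone(&shared_sdk);
let handle = tokio::spawn(async move {
    // Holding the lock for the entire call prevents another task's transfer
    // from interleaving with this one.
    let guard = sdk_for_task.lock().await;
    guard.transfer(1_200, &receiver_address).await
});

// The current task must also take the lock before transferring.
{
    let guard = shared_sdk.lock().await;
    guard.transfer(800, &receiver_address).await?;
}
handle.await.unwrap()?; // propagate any error from the spawned task
```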
Below you will find the primary wallet API documentation. For developers interested in implementing custom signers, refer to the signer documentation at the end.
Use the new method to create a new instance of the Spark SDK. This is the main entry point for interacting with the Spark protocol.
• network: SparkNetwork - The Spark network to connect to (e.g., Regtest or Mainnet)
• signer: S where S: SparkSigner - Implementation of the SparkSigner trait for secure key management

Returns a Result<SparkSdk, SparkSdkError>, which contains:
• The initialized SparkSdk instance if successful
• A SparkSdkError if initialization fails
Internally, this constructor initializes the wallet state and the connections to Spark operators. To use it, call SparkSdk::new(network, signer) with the desired network and signer.

```rust
// Create a signer using a mnemonic phrase
let mnemonic = "abandon ability able about above absent absorb abstract absurd abuse access accident";
let network = SparkNetwork::Regtest;
let signer = DefaultSigner::from_mnemonic(mnemonic, network.clone()).await?;

// Initialize the SDK with the signer and network
let sdk = SparkSdk::new(network, signer).await?;

// The SDK is now ready to use
println!("SDK initialized successfully");
```
Use the get_spark_address method to retrieve the Spark address of the wallet, which is derived from the wallet's identity public key.
None - This method doesn't require any parameters.
Returns a Result<PublicKey, SparkSdkError>, which contains:
• A PublicKey representing the wallet's identity public key
• This key serves as the wallet's unique identifier on the Spark network
The Spark address uniquely identifies your wallet on the Spark network and is what other users need in order to send you funds. To use it, call sdk.get_spark_address().

```rust
// Get the wallet's Spark address
let spark_address = sdk.get_spark_address()?;

// This address is a compressed secp256k1 public key in SEC format (33 bytes)
println!("Your Spark address: {}", spark_address);

// You can share this address with others so they can send you funds
let serialized_address = spark_address.serialize();
assert_eq!(serialized_address.len(), 33); // 33-byte compressed format
```
Use the get_network method to retrieve the Bitcoin network that this wallet is connected to.
None - This method doesn't require any parameters.
Returns a SparkNetwork enum indicating whether this is a mainnet or regtest wallet.
The network determines which Spark operators the wallet communicates with and which Bitcoin network (mainnet or regtest) is used for transactions. It's set when creating the wallet and cannot be changed after initialization.
To use it, call sdk.get_network().

```rust
// Get the network this wallet is connected to
let network = sdk.get_network();

// You can use this to display appropriate information in your UI
match network {
    SparkNetwork::Mainnet => println!("Connected to Spark Mainnet"),
    SparkNetwork::Regtest => println!("Connected to Spark Regtest (testing network)"),
}

// Or use it for conditional logic
if network == SparkNetwork::Regtest {
    println!("This is a test wallet - don't use real funds!");
}
```
Use the generate_deposit_address method to obtain a unique, one-time-use deposit address for Spark. This method returns a
GenerateDepositAddressSdkResponse, which explicitly contains:
• A deposit address of type bitcoin::Address
• A signing public key of type bitcoin::secp256k1::PublicKey
• A verifying public key of type bitcoin::secp256k1::PublicKey
Internally, Spark combines the user's signing public key with a Spark Operator public key to derive a taproot address.
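The exact aggregation scheme is internal to Spark, but purely as an illustration, combining two public keys by point addition (a common aggregation primitive) looks like this with the secp256k1 API; the key names are hypothetical, and the real derivation, including the taproot tweak, happens inside the SDK:

```rust
use bitcoin::secp256k1::PublicKey;

// Hypothetical inputs assumed to be in scope: the user's signing key and an
// operator key. Point addition yields a combined key; this is illustrative only.
let combined: PublicKey = user_signing_pubkey
    .combine(&operator_pubkey)
    .expect("combining two valid public keys");
```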
To use it, call sdk.generate_deposit_address(); on success it returns a GenerateDepositAddressSdkResponse containing all three fields.

```rust
// 1. Call Spark to generate all three fields in GenerateDepositAddressSdkResponse.
let generate_deposit_response = sdk.generate_deposit_address().await?;

// 2. This deposit address (bitcoin::Address) is a one-time address you can use to send funds on L1.
let deposit_address = generate_deposit_response.deposit_address;

// 3. The signing public key (bitcoin::secp256k1::PublicKey) for the deposit address,
//    generally managed internally by the SDK.
let signing_public_key = generate_deposit_response.signing_public_key;

// 4. The verifying public key (bitcoin::secp256k1::PublicKey),
//    used to verify threshold signatures (not typically needed directly).
let verifying_public_key = generate_deposit_response.verifying_public_key;
```
Use the claim_deposit method to claim funds that have been deposited to a Spark deposit address.
• txid: String - The transaction ID of the L1 transaction that sent funds to the Spark deposit address

Returns a Result<Vec<TreeNode>, SparkSdkError>, which explicitly contains:
• A vector of TreeNode objects representing the claimed deposits
• Each TreeNode is returned by the Spark Operators and contains details about the deposit, such as amount and status
• TreeNode is a tonic message type pre-compiled using Spark's official protobuf definitions
Internally, Spark processes the L1 transaction, verifies the deposit, and adds it to your wallet balance, making the funds available for use in the Spark network.
To use it, call sdk.claim_deposit(txid) with the transaction ID of your deposit; on success it returns the claimed TreeNode objects.

```rust
// 1. Generate a deposit address first (as shown in the previous example)
let deposit_address = sdk.generate_deposit_address().await?;

// 2. Send bitcoin to this address on L1 (using an L1 wallet)
let txid = l1_wallet.send_to_address(deposit_address.deposit_address, 100_000).await?;

// 3. Wait for the transaction to be confirmed.
//    For Regtest, this will take around 30 seconds.
sleep(Duration::from_secs(30)).await;

// 4. Claim the deposit using the transaction ID
let deposits = sdk.claim_deposit(txid).await?;

// 5. Verify the balance has been updated
let balance = sdk.get_bitcoin_balance();
assert_eq!(balance, 100_000);
```
Use the query_unused_deposit_addresses method to retrieve all unused deposit addresses that have been previously generated for your wallet. This helps you track deposit addresses that you've created but haven't received funds on yet.
None - This method doesn't require any parameters.
Returns a Result<Vec<DepositAddressQueryResult>, SparkSdkError>, which explicitly contains:
• A vector of DepositAddressQueryResult objects representing unused deposit addresses
• Each result contains the deposit address and associated metadata
• DepositAddressQueryResult is a tonic message type pre-compiled using Spark's official protobuf definitions
Internally, Spark queries the network for all deposit addresses associated with your identity public key that haven't been used for deposits yet.
To use it, call sdk.query_unused_deposit_addresses(); on success it returns a vector of DepositAddressQueryResult objects representing all unused deposit addresses.

```rust
// Query all unused deposit addresses associated with your wallet
let unused_addresses = sdk.query_unused_deposit_addresses().await?;

// Process each unused address (iterate by reference so the vector can be reused below)
for address_result in &unused_addresses {
    println!("Unused address: {}", address_result.deposit_address);
    // You might want to check if these addresses have received funds on L1,
    // or display them to users who are expected to make deposits.
}

// You can also count how many unused addresses you have
println!("You have {} unused deposit addresses", unused_addresses.len());
```
This is an advanced method for custom use cases: it finalizes the claiming process for funds deposited to a Spark deposit address without going through claim_deposit. Note: Users typically do not need to call this method directly, as claim_deposit calls it internally. It is provided for cases where you need to override the default claiming logic.
• signing_pubkey: Vec<u8> - Binary representation of the signing public key used for the deposit
• verifying_pubkey: Vec<u8> - Binary representation of the verifying public key used for the deposit
• deposit_tx: bitcoin::Transaction - The full Bitcoin transaction containing the deposit
• vout: u32 - The output index in the transaction that contains the deposit

Returns a Result<TreeNode, SparkSdkError>, which explicitly contains:
• A TreeNode object representing the finalized deposit
• Contains details about the deposit such as amount and status
• TreeNode is a tonic message type pre-compiled using Spark's official protobuf definitions
Internally, Spark finalizes the deposit process by submitting the provided parameters to Spark Operators, who verify and process the deposit, making the funds available in your wallet.
To use it, call sdk.finalize_deposit() with the required parameters; on success it returns the finalized TreeNode object.

```rust
// STANDARD APPROACH: in most cases, you would simply use claim_deposit:
// let deposits = sdk.claim_deposit(txid).await?;

// ADVANCED APPROACH: only if you need to bypass claim_deposit for custom logic:

// 1. Get the Bitcoin transaction containing the deposit
let deposit_tx = bitcoin_client.get_transaction(txid).await?;

// 2. Identify which output contains the deposit (custom logic)
let vout = 0; // Example: using custom logic to determine output index

// 3. Get the signing and verifying public keys from your deposit tracking system
let signing_pubkey = your_custom_storage.get_signing_pubkey_for_deposit(txid).await?;
let verifying_pubkey = your_custom_storage.get_verifying_pubkey_for_deposit(txid).await?;

// 4. Call finalize_deposit directly (bypassing claim_deposit)
let deposit = sdk.finalize_deposit(
    signing_pubkey,
    verifying_pubkey,
    deposit_tx,
    vout,
).await?;

// The funds are now available in your wallet
let balance = sdk.get_bitcoin_balance();
```
Use the query_pending_transfers method to retrieve all pending transfers where the current user is the receiver. A pending transfer represents funds that have been sent to the user but have not yet been claimed. The transfers remain in a pending state until the receiver claims them, at which point the funds become available in their wallet.
This function does not claim any pending transfers. To claim a transfer, you should call claim_transfers(). This will execute key tweaking, which is the core of Spark's security mechanism. Before the receiver tweaks the keys, the transfer is not final.
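In short (using only the calls documented on this page):

```rust
// Querying is read-only; claiming performs the key tweak that makes funds final.
let pending = sdk.query_pending_transfers().await?;
if !pending.is_empty() {
    sdk.claim_transfers().await?; // after this, the funds are spendable
}
```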
None - This method doesn't require any parameters.
Returns a Result<Vec<Transfer>, SparkSdkError>, which explicitly contains:
• A vector of Transfer objects representing pending transfers
• Each Transfer contains details about the pending transfer such as amount, sender, and status
• Transfer is a tonic message type pre-compiled using Spark's official protobuf definitions
Internally, Spark queries the network for all pending transfers associated with the user's identity public key.
To use it, call sdk.query_pending_transfers(); on success it returns a vector of Transfer objects representing all pending transfers.

```rust
// Query all pending transfers where the current user is the receiver
let pending = sdk.query_pending_transfers().await?;

// Process each pending transfer
for transfer in pending {
    println!("Pending transfer: {:?} satoshis", transfer.total_value);

    // You might want to automatically accept transfers or display them to the user.
    // For example:
    // if should_auto_accept(&transfer) {
    //     sdk.accept_transfer(transfer.id).await?;
    // }
}
```
Use the transfer method to send funds from your wallet to another Spark user. This initiates a transfer process where the funds are removed from your wallet and become available for the recipient to claim.
• amount: u64 - The amount to transfer in satoshis. Must be greater than the dust limit, and the wallet must have a leaf with exactly this amount.
• receiver_spark_address: &bitcoin::secp256k1::PublicKey - The Spark address identifying the receiver of the transfer. This should be the receiver's identity public key, not a regular Bitcoin public key.

Returns a Result<String, SparkSdkError>, which explicitly contains:
• A String representing the transfer ID if successful
• This ID can be used to track the status of the transfer
Internally, Spark handles the process of transferring funds by selecting appropriate leaves (UTXOs), locking them, generating new signing keys, creating and signing the transfer transaction, and removing the used leaves from your wallet.
To use it, call sdk.transfer(amount, &receiver_spark_address) with the amount and the receiver's Spark address.

```rust
// Define the amount to transfer (in satoshis)
let amount = 100_000;

// Get the recipient's Spark address (which is their public key).
// This can be shared between users in your application.
let receiver_spark_address = PublicKey::from_str(
    "02782d7ba8764306bd324e23082f785f7c880b7202cb10c85a2cb96496aedcaba7"
).unwrap();

// Send the transfer
let transfer_id_string = sdk.transfer(amount, &receiver_spark_address).await?;

// The transfer ID is a UUID string that can be parsed and stored
let transfer_id = Uuid::parse_str(&transfer_id_string).unwrap();
println!("Transfer successfully initiated with ID: {}", transfer_id);

// The recipient will need to call query_pending_transfers() and claim_transfer()
// to receive these funds
```
This is an advanced method intended for specialized use cases where you need precise control over which leaves (UTXOs) are used in a transfer. Most users should use the standard transfer(amount, receiver) method instead.
Use the transfer_leaf_ids method to transfer specific leaves from your wallet to another Spark user by directly providing the leaf IDs to be transferred.
• leaf_ids: Vec<String> - Vector of leaf IDs to transfer. Each ID identifies a specific UTXO in your wallet.
• receiver_identity_pubkey: &PublicKey - The Spark address identifying the receiver of the transfer. This should be the receiver's identity public key.

Returns a Result<String, SparkSdkError>, which explicitly contains:
• A String representing the transfer ID if successful
• This ID can be used to track the status of the transfer
Internally, this method follows a similar process to the standard transfer, but instead of selecting leaves based on an amount, it uses the exact leaves specified by their IDs.
To use it, call sdk.transfer_leaf_ids(leaf_ids, &receiver_spark_address) with the leaf IDs and the receiver's Spark address.

```rust
// Get specific leaf IDs from your wallet that you want to transfer.
// This requires knowledge of your wallet's internal leaf structure.
let leaf_ids = vec!["leaf_id_1".to_string(), "leaf_id_2".to_string()];

// Get the recipient's Spark address
let receiver_spark_address = PublicKey::from_str(
    "02782d7ba8764306bd324e23082f785f7c880b7202cb10c85a2cb96496aedcaba7"
).unwrap();

// Transfer the specified leaves
let transfer_id_string = sdk.transfer_leaf_ids(leaf_ids, &receiver_spark_address).await?;

// The transfer ID can be parsed and stored
let transfer_id = Uuid::parse_str(&transfer_id_string).unwrap();
println!("Leaf transfer initiated with ID: {}", transfer_id);
```
Use the claim_transfer method to claim a specific pending transfer that was sent to your wallet. This method processes a pending transfer and adds the funds to your wallet balance.
• transfer: Transfer - The pending transfer to claim; must be in SenderKeyTweaked status

Returns a Result<(), SparkSdkError>, which indicates:
• Success (Ok) if the transfer was successfully claimed
• Error (Err) if there was an issue during the claim process
Internally, Spark performs several security-critical steps, including tweaking the leaf keys so that only the receiver can spend the claimed funds. To use it, call sdk.claim_transfer(transfer) with a pending transfer object obtained from query_pending_transfers().

```rust
// First get pending transfers
let pending = sdk.query_pending_transfers().await?;

// Then claim each transfer individually
for transfer in pending {
    let transfer_id = transfer.id.clone();
    sdk.claim_transfer(transfer).await?;
    println!("Successfully claimed transfer: {}", transfer_id);
}

// Verify your updated balance
let balance = sdk.get_bitcoin_balance();
println!("Updated balance: {} satoshis", balance);
```
Use the claim_transfers method to claim all pending transfers sent to your wallet in a single operation. This convenience method automatically retrieves all pending transfers and claims them for you.
None - This method doesn't require any parameters.
Returns a Result<(), SparkSdkError>, which indicates:
• Success (Ok) if all transfers were successfully claimed
• Error (Err) if there was an issue during the claim process
Internally, this method calls query_pending_transfers() to get all pending transfers, then claims each one (as claim_transfer does for a single transfer). To use it, call sdk.claim_transfers().

```rust
// Claim all pending transfers in a single call
sdk.claim_transfers().await?;
println!("Successfully claimed all pending transfers");

// Verify your updated balance
let balance = sdk.get_bitcoin_balance();
println!("Updated balance: {} satoshis", balance);

// You can also check if there are any remaining pending transfers
// (there shouldn't be any if claim_transfers was successful)
let pending = sdk.query_pending_transfers().await?;
assert!(pending.is_empty(), "All transfers should have been claimed");
```
Use the get_all_transfers method to retrieve the history of all transfers (both sent and received) associated with your wallet. This method supports pagination to manage large transfer histories.
• limit: Option<u32> - Optional maximum number of transfers to return (defaults to 20 if not specified)
• offset: Option<u32> - Optional number of transfers to skip (defaults to 0 if not specified)

Returns a Result<QueryAllTransfersResponse, SparkSdkError>, which explicitly contains:
• A QueryAllTransfersResponse object containing the list of transfers
• This response includes both sent and received transfers
• Each transfer contains details such as amount, sender, receiver, status, and timestamp
• QueryAllTransfersResponse is a tonic message type pre-compiled using Spark's official protobuf definitions
Internally, Spark queries the network for all transfers associated with your identity public key and applies the pagination parameters.
To use it, call sdk.get_all_transfers(limit, offset) with optional pagination parameters; on success it returns a QueryAllTransfersResponse containing the requested transfers.

```rust
// Get the first 20 transfers (default pagination)
let first_page = sdk.get_all_transfers(None, None).await?;
println!("First page of transfers: {}", first_page.transfers.len());

// Display transfer details
for transfer in &first_page.transfers {
    println!("Transfer ID: {}, Amount: {} sats, Status: {}",
        transfer.id,
        transfer.total_value,
        transfer.status);
}

// Get the next 20 transfers (pagination)
let second_page = sdk.get_all_transfers(Some(20), Some(20)).await?;
println!("Second page of transfers: {}", second_page.transfers.len());

// You can implement pagination controls in your UI
let page_size = 10;
let page_number = 3; // 0-indexed
let transfers = sdk.get_all_transfers(
    Some(page_size),
    Some(page_size * page_number),
).await?;
```
Use the get_bitcoin_balance method to retrieve the current total balance of your wallet in satoshis.
None - This method doesn't require any parameters.
Returns a u64 value representing the total available balance in satoshis.
Internally, Spark calculates this by summing the value of all available leaves (UTXOs) in your wallet.
To use it, call sdk.get_bitcoin_balance().

```rust
// Get the current wallet balance
let balance = sdk.get_bitcoin_balance();
println!("Your current balance is {} satoshis", balance);

// You can also use this to check if you have enough funds for a transfer
let amount_to_send = 50_000;
if balance >= amount_to_send {
    sdk.transfer(amount_to_send, &receiver_spark_address).await?;
} else {
    println!("Insufficient funds: you need {} but only have {}",
        amount_to_send, balance);
}
```
Use the sync_wallet method to perform a comprehensive synchronization of your wallet with the Spark network. This is a convenience method that executes multiple synchronization operations in a single call.
None - This method doesn't require any parameters.
Returns a Result<(), SparkSdkError>, which indicates:
• Success (Ok) if all synchronization operations completed successfully
• Error (Err) if there was an issue during any synchronization step
Internally, this method performs several synchronization operations in sequence, including claiming any pending transfers. To use it, call sdk.sync_wallet().

```rust
// Perform a full wallet synchronization
sdk.sync_wallet().await?;
println!("Wallet successfully synchronized with the network");

// After syncing, you'll have the most up-to-date balance
let updated_balance = sdk.get_bitcoin_balance();
println!("Updated balance: {} satoshis", updated_balance);

// Your wallet will also have claimed all pending transfers
let pending = sdk.query_pending_transfers().await?;
assert!(pending.is_empty(), "All transfers should have been claimed during sync");
```
This is an advanced method that allows you to optimize your wallet's leaf structure by swapping your current leaves with the Spark Service Provider (SSP). This function is primarily used internally by the SDK when you need to transfer an amount that doesn't match any of your existing leaves.
For example, if you have a single leaf of 100,000 satoshis but need to send 80,000 satoshis, this function will swap with the SSP to get leaves totaling 100,000 satoshis but with denominations that include the 80,000 you need. The SSP typically provides leaves in power-of-2 denominations for optimal efficiency.
• target_amount: u64 - The amount (in satoshis) you want to have in a specific leaf after the swap

Returns a Result<String, SparkSdkError>, which explicitly contains:
• A String representing the ID of the newly created leaf with the target amount
• This leaf ID can be used for future transfers
Internally, this method transfers your existing leaves to the SSP and receives back leaves in denominations that include your target amount. To use it, call sdk.request_leaves_swap(target_amount) with your desired amount.

```rust
// Let's say you have a single leaf of 100,000 satoshis but need to send 80,000
let target_amount = 80_000;

// Request a swap with the SSP to get optimized leaves
let new_leaf_id = sdk.request_leaves_swap(target_amount).await?;
println!("Created new leaf with ID: {}", new_leaf_id);

// After the swap, your wallet balance still totals 100,000 satoshis,
// but in optimized denominations
let balance = sdk.get_bitcoin_balance();
assert_eq!(balance, 100_000);

// Now you can transfer exactly 80,000 satoshis
let receiver_spark_address = PublicKey::from_str(
    "02782d7ba8764306bd324e23082f785f7c880b7202cb10c85a2cb96496aedcaba7"
).unwrap();
sdk.transfer(target_amount, &receiver_spark_address).await?;
```
Use the pay_lightning_invoice method to pay a Lightning Network invoice using the Spark Service Provider (SSP) as an intermediary. Unlike traditional Lightning wallets, Spark doesn't directly connect to the Lightning Network. Instead, it uses a cooperative approach: you transfer leaves to the SSP, and the SSP makes the Lightning payment on your behalf.

• invoice: &String - A BOLT11 Lightning invoice string that you want to pay

Returns a Result<String, SparkSdkError>, which explicitly contains:
• A String representing the payment ID if successful
• This ID can be used to track the payment status
Internally, this method transfers your leaves to the SSP, which then pays the invoice on the Lightning Network. To use it, call sdk.pay_lightning_invoice(invoice) with the Lightning invoice string.

```rust
// Get a Lightning invoice from somewhere (e.g., a merchant)
let invoice = "lnbc1500n1p3zty3app5wkf0hagkc4egr8rl88msr4c5lp0ygt6gvzna5hdg4tpna65pzqdq0vehk7cnpwga5xzmnwvycqzpgxqyz5vqsp5v9ym7xsyf0qxqwzlmwjl3g0g9q2tg977h70hcheske9xlgfsggls9qyyssqtghx3qqpwm9zl4m398nm40wj8ryaz8v7v4rrdvczypdpy7qtc6rdrkklm9uxlkmtp3jf29yhqjw2vwmlp82y5ctft94k23cwgqd9llgy".to_string();

// Pay the invoice
let payment_id = sdk.pay_lightning_invoice(&invoice).await?;
println!("Lightning payment initiated with ID: {}", payment_id);

// Your leaves have been transferred to the SSP, and the SSP has made the Lightning payment
```
Use the create_lightning_invoice method to generate a Lightning Network invoice that others can pay to you. When someone pays this invoice via Lightning, the funds will be received by the SSP and then transferred to your Spark wallet.
• amount_sats: u64 - The amount in satoshis that you want to receive
• memo: Option<String> - Optional description/memo for the invoice
• expiry_seconds: Option<i32> - Optional expiry time in seconds (defaults to 30 days if not specified)

Returns a Result<Bolt11Invoice, SparkSdkError>, which explicitly contains:
• A Bolt11Invoice object representing the generated Lightning invoice
• This invoice can be shared with anyone who wants to pay you via Lightning
Internally, this method coordinates with the SSP to create an invoice that, when paid, results in a Spark transfer to your wallet. To use it, call sdk.create_lightning_invoice(amount, memo, expiry) with your desired parameters.

```rust
// Create an invoice for 50,000 satoshis
let amount_sats = 50_000;
let memo = Some("Payment for services".to_string());
let expiry = Some(3600 * 24); // 24 hours

// Generate the Lightning invoice
let invoice = sdk.create_lightning_invoice(amount_sats, memo, expiry).await?;

// Get the invoice string to share with the payer
let invoice_string = invoice.to_string();
println!("Lightning Invoice: {}", invoice_string);

// When someone pays this invoice via Lightning, the funds will automatically
// appear in your Spark wallet (after being processed by the SSP)
```
Use the withdraw method to transfer funds from your Spark wallet back to the Bitcoin blockchain through a cooperative process with the Spark Service Provider (SSP). This process, also known as a "cooperative exit," allows you to convert your Spark funds into regular on-chain Bitcoin.
• onchain_address: &Address - The Bitcoin address where the funds should be sent
• target_amount_sats: Option<u64> - Optional amount in satoshis to withdraw. If not specified, attempts to withdraw all available funds in your wallet

Returns a Result<CoopExitResponse, SparkSdkError>, which explicitly contains:
• A CoopExitResponse object with:
  - request_id: A CoopExitRequestId identifying this withdrawal request
  - exit_txid: The transaction ID of the exit transaction on the Bitcoin blockchain

Internally, this method coordinates the cooperative exit with the SSP, which results in an on-chain transaction paying your address.
To use it, call sdk.withdraw(&onchain_address, target_amount_sats) with the Bitcoin address and optional amount.

```rust
// Create a Bitcoin address to receive the withdrawn funds
let bitcoin_address = Address::from_str("bc1qw508d6qejxtdg4y5r3zarvary0c5xw7kv8f3t4")?;

// Option 1: Withdraw all available funds
let withdrawal_response = sdk.withdraw(&bitcoin_address, None).await?;
println!("Withdrawal initiated with request ID: {:?}", withdrawal_response.request_id);
println!("Exit transaction ID: {}", withdrawal_response.exit_txid);

// Option 2: Withdraw a specific amount (e.g., 50,000 satoshis)
let specific_amount = 50_000;
let withdrawal_response = sdk.withdraw(&bitcoin_address, Some(specific_amount)).await?;

// You can check the status of the Bitcoin transaction using the exit_txid
// with any Bitcoin block explorer or your Bitcoin wallet
```
Withdrawals incur a service fee charged by the SSP for facilitating the on-chain transaction. You can estimate this fee before initiating a withdrawal:
```rust
// Get the leaf IDs you want to withdraw
let leaf_ids: Vec<String> = sdk.leaf_manager
    .get_available_bitcoin_leaves(None, SparkNodeStatus::Available)
    .iter()
    .map(|leaf| leaf.get_id().clone())
    .collect();

// Get the Bitcoin address as a string
let bitcoin_address_string = bitcoin_address.to_string();

// Estimate the withdrawal fee
let fee_estimate = sdk.get_cooperative_exit_fee_estimate(leaf_ids, bitcoin_address_string).await?;
println!("Estimated withdrawal fee: {} satoshis", fee_estimate.fees);

// Decide whether to proceed based on the fee
if fee_estimate.fees < 5000 { // Example threshold
    sdk.withdraw(&bitcoin_address, None).await?;
} else {
    println!("Fee too high ({}), withdrawal aborted", fee_estimate.fees);
}
```
Note: withdrawals also involve DEFAULT_WITHDRAWAL_AMOUNT (typically 10,000 satoshis).

Use the get_lightning_send_fee_estimate method to estimate the fees associated with sending a Lightning payment through the Spark Service Provider (SSP).
• invoice: String - The Lightning invoice you want to pay

Returns a Result<SparkFeeEstimate, SparkSdkError>, which explicitly contains:
• A SparkFeeEstimate object with the estimated fees in satoshis
This helps you understand the cost of making a Lightning payment before you commit to it. The fee is a service fee charged by the SSP for facilitating the Lightning payment.
To use it, call sdk.get_lightning_send_fee_estimate(invoice) with the invoice.

```rust
// Get a Lightning invoice from somewhere
let invoice = "lnbc1500n1p3zty3app...".to_string();

// Get fee estimate before paying
let fee_estimate = sdk.get_lightning_send_fee_estimate(invoice.clone()).await?;
println!("Estimated fee: {} satoshis", fee_estimate.fees);

// Decide whether to proceed with the payment
if fee_estimate.fees < 100 {
    // Fee is acceptable, proceed with payment
    sdk.pay_lightning_invoice(&invoice).await?;
} else {
    println!("Fee too high, payment aborted");
}
```
Use the get_lightning_receive_fee_estimate method to estimate the fees associated with receiving a Lightning payment through the Spark Service Provider (SSP).
• amount: u64 - The amount in satoshis you want to receive

Returns a Result<SparkFeeEstimate, SparkSdkError>, which explicitly contains:
• A SparkFeeEstimate object with the estimated fees in satoshis
This helps you understand how much will be deducted from the payment amount as fees. The fee is a service fee charged by the SSP for facilitating the Lightning payment reception.
To use it, call sdk.get_lightning_receive_fee_estimate(amount) with the desired amount.

```rust
// Amount you want to receive
let amount_sats = 50_000;

// Get fee estimate for receiving this amount
let fee_estimate = sdk.get_lightning_receive_fee_estimate(amount_sats).await?;
println!("Estimated receive fee: {} satoshis", fee_estimate.fees);

// Calculate the net amount you'll receive after fees
let net_amount = amount_sats - fee_estimate.fees;
println!("You'll receive {} satoshis after fees", net_amount);

// Create invoice if fees are acceptable (less than 1% of the amount)
if fee_estimate.fees * 100 < amount_sats {
    sdk.create_lightning_invoice(amount_sats, None, None).await?;
}
```
Use the get_cooperative_exit_fee_estimate method to estimate the fees associated with withdrawing funds from Spark to an on-chain Bitcoin address through the Spark Service Provider (SSP).
• leaf_ids: Vec<String> - The specific leaf IDs you want to withdraw
• on_chain_address: String - The Bitcoin address where you want to receive the funds

Returns a Result<SparkFeeEstimate, SparkSdkError>, which explicitly contains:
• A SparkFeeEstimate object with the estimated fees in satoshis
This helps you understand the cost of withdrawing your funds back to the Bitcoin blockchain before initiating the withdrawal. The fee is a service fee charged by the SSP for facilitating the on-chain exit.
To use it, call sdk.get_cooperative_exit_fee_estimate(leaf_ids, on_chain_address) with the leaf IDs and address.

```rust
// Identify the leaves you want to withdraw
let leaf_ids = vec!["leaf_id_1".to_string(), "leaf_id_2".to_string()];

// Specify the Bitcoin address to receive funds
let onchain_address = "bc1q...".to_string();

// Get fee estimate before withdrawing
let fee_estimate = sdk.get_cooperative_exit_fee_estimate(leaf_ids.clone(), onchain_address.clone()).await?;
println!("Estimated withdrawal fee: {} satoshis", fee_estimate.fees);

// Decide whether to proceed with the withdrawal
if fee_estimate.fees < 1000 { // Example threshold
    // Fee is acceptable, proceed with withdrawal
    let bitcoin_address = Address::from_str(&onchain_address).unwrap();
    sdk.withdraw(&bitcoin_address, None).await?;
} else {
    println!("Fee too high, withdrawal aborted");
}
```
Use the get_leaves_swap_fee_estimate method to estimate the fees associated with optimizing your wallet's leaf structure by swapping your leaves with the Spark Service Provider (SSP).
• total_amount_sats: u64 - The total amount in satoshis that will be involved in the swap

Returns a Result<SparkFeeEstimate, SparkSdkError>, which explicitly contains:
• A SparkFeeEstimate object with the estimated fees in satoshis
This helps you understand the cost of optimizing your leaf structure before initiating the swap. The fee is a service fee charged by the SSP for facilitating the leaves swap operation.
To use it, call sdk.get_leaves_swap_fee_estimate(total_amount_sats) with the total amount.

```rust
// Total amount to be swapped
let total_amount_sats = 100_000;

// Get fee estimate before swapping leaves
let fee_estimate = sdk.get_leaves_swap_fee_estimate(total_amount_sats).await?;
println!("Estimated swap fee: {} satoshis", fee_estimate.fees);

// Decide whether to proceed with the swap (fee less than 0.5% of the total)
if fee_estimate.fees * 200 < total_amount_sats {
    // Fee is acceptable, proceed with swap
    let target_amount = 80_000; // The specific denomination you need
    sdk.request_leaves_swap(target_amount).await?;
} else {
    println!("Fee too high, swap aborted");
}
```
The signing system is a critical component of the Spark wallet, handling all cryptographic operations including key derivation, transaction signing, and threshold signatures via the FROST protocol. This documentation is intended for developers who need to implement custom signers or understand the internal signing architecture.
The signer in Spark follows a trait-based architecture, where various cryptographic capabilities are separated into distinct traits that together form a complete signing system:
```text
SparkSigner
├── SparkSignerDerivationPath - Key derivation path handling
├── SparkSignerEcdsa - ECDSA signature operations
├── SparkSignerEcies - Encryption/decryption of secret keys
├── SparkSignerFrost - FROST nonce and commitment management
├── SparkSignerFrostSigning - FROST threshold signature operations
├── SparkSignerSecp256k1 - Secp256k1 keypair operations
└── SparkSignerShamir - Verifiable secret sharing operations
```
The SDK includes a DefaultSigner implementation that manages keys in memory. While this implementation works well for most use cases, you may implement your own signer for specialized needs such as remote signing or integration with custom key management systems.
The Spark security model requires that both the user and the Spark Operators participate in signing Bitcoin transactions. This ensures that neither the user nor the operators alone can spend funds, providing a secure multi-party computation model for Bitcoin transactions.
To create a custom signer, you must implement the SparkSigner trait and all its associated sub-traits. The implementation details will depend on your specific requirements, but there are some important considerations:
The derivation path scheme is critical for compatibility with other Spark wallets. The scheme follows:
```text
m/8797555'/account'/key_type'/[leaf_index']
```

Where:

- 8797555' is the purpose value (derived from "spark")
- account' is the account index (hardened, starting from 0)
- key_type' is the key type:
  - 0' for identity key
  - 1' for base signing key
  - 2' for temporary signing key
- leaf_index' is a hash-derived index for leaf-specific keys (optional)

All indices use hardened derivation for enhanced security.
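For example, a minimal sketch of constructing the identity-key path for account 0 with the bitcoin crate's bip32 types (bitcoin 0.31+; older versions expose this as bitcoin::util::bip32). Only the path string comes from the scheme above; the actual key derivation is the signer's job:

```rust
use bitcoin::bip32::DerivationPath;
use std::str::FromStr;

// m/8797555'/0'/0' — purpose / account 0 / key_type 0 (identity key), all hardened.
let identity_path = DerivationPath::from_str("m/8797555'/0'/0'")
    .expect("valid hardened derivation path");
println!("Identity derivation path: {}", identity_path);
```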
The FROST implementation in Spark is customized to support Taproot tweaking. The process generally follows these steps: nonce pairs and commitments are generated first (SparkSignerFrost), and the threshold signing and aggregation are then performed (SparkSignerFrostSigning). Your custom signer will need to properly implement these steps while maintaining the security properties of the FROST protocol.
When implementing a custom signer, carefully consider the requirements of each of the traits below.
Handles the derivation of keys according to Spark's custom path scheme.
```rust
fn get_deposit_signing_key(&self, network: Network) -> Result<PublicKey, SparkSdkError>;

fn derive_spark_key(
    leaf_id: Option<String>,
    account: u32,
    seed_bytes: &[u8],
    key_type: SparkKeyType,
    network: Network,
) -> Result<SecretKey, SparkSdkError>;

fn get_identity_derivation_path(account_index: u32) -> Result<SparkDerivationPath, SparkSdkError>;
```
Provides ECDSA signature capabilities for identity verification and other non-threshold operations.
```rust
fn sign_message_ecdsa_with_identity_key<T: AsRef<[u8]>>(
    &self,
    message: T,
    apply_hashing: bool,
    network: Network,
) -> Result<Signature, SparkSdkError>;

fn sign_message_ecdsa_with_key<T: AsRef<[u8]>>(
    &self,
    message: T,
    public_key_for_signing_key: &PublicKey,
    apply_hashing: bool,
) -> Result<Signature, SparkSdkError>;
```
Handles encryption and decryption of secret keys for secure exchange between parties.
```rust
fn encrypt_secret_key_with_ecies(
    &self,
    receiver_public_key: &PublicKey,
    pubkey_for_sk_to_encrypt: &PublicKey,
) -> Result<Vec<u8>, SparkSdkError>;

fn decrypt_secret_key_with_ecies<T>(
    &self,
    ciphertext: T,
    network: Network,
) -> Result<SecretKey, SparkSdkError>
where
    T: AsRef<[u8]>;
```
Manages FROST nonce pairs and commitments for threshold signing.
```rust
fn new_frost_signing_noncepair(&self) -> Result<SigningCommitments, SparkSdkError>;

fn sensitive_expose_nonces_from_commitments<T>(
    &self,
    signing_commitments: &T,
) -> Result<SigningNonces, SparkSdkError>
where
    T: AsRef<[u8]>;

fn sensitive_create_if_not_found_expose_nonces_from_commitments(
    &self,
    signing_commitments: Option<&[u8]>,
) -> Result<SigningNonces, SparkSdkError>;
```
Performs the actual FROST threshold signing operations, including signing and aggregation.
```rust
fn sign_frost(&self, signing_jobs: Vec<FrostSigningJob>) -> Result<SignFrostResponse, SparkSdkError>;

fn aggregate_frost(&self, request: AggregateFrostRequest) -> Result<AggregateFrostResponse, SparkSdkError>;

// Additional specialized signing methods...
```
Manages secp256k1 keypairs for various wallet operations.
```rust
fn get_identity_public_key(&self, account_index: u32, network: Network) -> Result<PublicKey, SparkSdkError>;

fn new_secp256k1_keypair(
    &self,
    leaf_id: String,
    key_type: SparkKeyType,
    account_index: u32,
    network: Network,
) -> Result<PublicKey, SparkSdkError>;

fn insert_secp256k1_keypair_from_secret_key(&self, secret_key: &SecretKey) -> Result<PublicKey, SparkSdkError>;

// Additional keypair management methods...
```
Provides verifiable secret sharing operations for secure key distribution.
```rust
fn split_with_verifiable_secret_sharing(
    &self,
    message: Vec<u8>,
    threshold: usize,
    num_shares: usize,
) -> Result<Vec<VerifiableSecretShare>, SparkSdkError>;

fn split_from_public_key_with_verifiable_secret_sharing(
    &self,
    public_key: &PublicKey,
    threshold: usize,
    num_shares: usize,
) -> Result<Vec<VerifiableSecretShare>, SparkSdkError>;
```
For most applications, the provided DefaultSigner implementation will be sufficient:
```rust
// Create a DefaultSigner from a mnemonic
let mnemonic = "abandon ability able about above absent absorb abstract absurd abuse access accident";
let network = SparkNetwork::Regtest;
let signer = DefaultSigner::from_mnemonic(mnemonic, network.clone()).await?;

// Initialize the SDK with the signer
let sdk = SparkSdk::new(network, signer).await?;
```
This is an early version of the Spark signing system. The architecture may undergo optimizations and refinements in future releases while maintaining backward compatibility where possible. The current implementation prioritizes security and correctness over performance optimization.
For most users, the provided DefaultSigner will be sufficient. Custom signer implementations should be undertaken only when specific requirements necessitate it, such as integration with remote signing services or hardware security modules.