| Field | Value |
|---|---|
| Crates.io | herosal-rhai-lib |
| lib.rs | herosal-rhai-lib |
| version | 0.1.1 |
| created_at | 2025-12-13 11:19:22.236054+00 |
| updated_at | 2025-12-14 13:33:26.654413+00 |
| description | Rhai scripting library with Redis-based task queue |
| homepage | |
| repository | https://github.com/threefoldtech/sal |
| max_upload_size | |
| id | 1982823 |
| size | 235,799 |
rhailib provides a robust infrastructure for executing Rhai scripts in a distributed manner, primarily designed to integrate with and extend the HeroModels ecosystem. It enables dynamic scripting, computation offloading, and flexible automation.
The rhailib system is composed of three main components working together, leveraging Redis for task queuing and state management:
Rhai Engine (src/engine):
This crate is the core of the scripting capability. It provides a Rhai engine pre-configured with various HeroModels modules (e.g., Calendar, Flow, Legal). Scripts executed within this engine can interact directly with HeroModels data and logic. The engine is utilized by the rhai_worker to process tasks.
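To make the pattern concrete, here is a minimal sketch using the upstream `rhai` crate directly. A hand-registered function stands in for the HeroModels modules that rhailib's `rhai_engine` would pre-register; the function name used here is purely illustrative and not part of the actual modules.

```rust
use rhai::Engine;

fn main() {
    // rhailib's `rhai_engine` would hand back an engine with the HeroModels
    // modules (Calendar, Flow, Legal, ...) already registered; here a single
    // hand-rolled function stands in for that registration step.
    let mut engine = Engine::new();
    engine.register_fn("greet_attendee", |name: &str| format!("Welcome, {name}!"));

    // Scripts evaluated on this engine can call the registered function
    // exactly as they would call a HeroModels module function.
    let result: String = engine
        .eval(r#"greet_attendee("Alice")"#)
        .expect("script evaluation failed");
    println!("{result}");
}
```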
Rhai Client (src/client):
This crate offers an interface for applications to submit Rhai scripts as tasks to the distributed execution system. Clients can send scripts to named Redis queues (referred to as "contexts"), optionally wait for results, and handle timeouts.
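The sketch below shows, at the Redis level, roughly what such a submission amounts to, using the `redis` crate: the task details go into a hash, the task ID is pushed onto the context queue, and the client polls for the outcome. The `rhai:task:<id>` key and the `script`/`status`/`output` field names are assumptions made for illustration; only the `rhai:queue:<context>` naming appears in this documentation, and the actual `rhai_dispatcher` crate exposes a higher-level interface over these details.

```rust
use redis::Commands;

fn main() -> redis::RedisResult<()> {
    let client = redis::Client::open("redis://127.0.0.1/")?;
    let mut con = client.get_connection()?;

    // Store the task details in a hash, then push the task ID onto the
    // context queue. The `rhai:task:<id>` key and field names are guesses.
    let task_id = "task-123";
    let _: () = con.hset_multiple(
        format!("rhai:task:{task_id}"),
        &[("script", "40 + 2"), ("status", "pending")],
    )?;
    let _: () = con.rpush("rhai:queue:my_context", task_id)?;

    // Poll the hash until a worker marks the task finished.
    loop {
        let status: String = con.hget(format!("rhai:task:{task_id}"), "status")?;
        if status == "completed" || status == "failed" {
            let output: Option<String> = con.hget(format!("rhai:task:{task_id}"), "output")?;
            println!("task {task_id} {status}: {output:?}");
            break;
        }
        std::thread::sleep(std::time::Duration::from_millis(200));
    }
    Ok(())
}
```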
Rhai Worker (src/worker):
This executable component listens to one or more Redis queues ("contexts") for incoming tasks. When a task (a Rhai script) is received, the worker fetches its details, uses the rhai_engine to execute the script, and then updates the task's status and results back into Redis. Multiple worker instances can be deployed to scale script execution.
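A worker's main loop, stripped to its essentials, might look like the following sketch. It is not the actual `rhai_worker` implementation: it assumes a recent `redis` crate (where `BLPOP` takes a float timeout), uses a plain `rhai::Engine` instead of the pre-configured `rhai_engine`, guesses the `rhai:task:<id>` hash layout, and simplifies error handling.

```rust
use redis::Commands;
use rhai::Engine;

fn main() -> redis::RedisResult<()> {
    let client = redis::Client::open("redis://127.0.0.1/")?;
    let mut con = client.get_connection()?;
    let engine = Engine::new(); // rhailib would use the pre-configured rhai_engine here

    loop {
        // Block until a task ID appears on the context queue (timeout 0 = wait forever).
        let (_queue, task_id): (String, String) = con.blpop("rhai:queue:my_context", 0.0)?;

        // Fetch the script from the task hash (key and field names are assumptions).
        let script: String = con.hget(format!("rhai:task:{task_id}"), "script")?;

        // Execute the script, then write status and result/error back into the hash.
        let (status, output) = match engine.eval::<rhai::Dynamic>(&script) {
            Ok(value) => ("completed", value.to_string()),
            Err(err) => ("failed", err.to_string()),
        };
        let _: () = con.hset_multiple(
            format!("rhai:task:{task_id}"),
            &[("status", status), ("output", output.as_str())],
        )?;
    }
}
```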
The typical workflow is as follows:
1. The rhai_dispatcher (client) submits a Rhai script to a specific Redis list (e.g., rhai:queue:my_context). Task details, including the script and status, are stored in a Redis hash.
2. A rhai_worker instance, configured to listen to rhai:queue:my_context, picks up the task ID from the queue using a blocking pop operation.
3. The worker fetches the task details and executes the script using the rhai_engine, which provides the necessary HeroModels context for the script.
4. On completion or failure, the worker updates the task's status (completed, failed) and stores any return value or error message in the corresponding Redis hash.
5. The rhai_dispatcher can poll the Redis hash for the task's status and retrieve the results once available.

This architecture allows for asynchronous, decoupled script execution that scales horizontally: additional rhai_worker instances can simply be pointed at the same queues.
The core components are organized as separate crates within the src/ directory:
- src/client/: Contains the rhai_dispatcher library.
- src/engine/: Contains the rhai_engine library.
- src/worker/: Contains the rhai_worker library and its executable.

Each of these directories contains its own README.md file with more detailed information about its specific functionality, setup, and usage.
To work with this project:
- Ensure a Redis instance is running and accessible; the client and worker components use it to communicate.
- See the README files in src/client/, src/worker/, and src/engine/ for detailed instructions on building, configuring, and running each component.

You can typically build all components using:
```bash
cargo build --workspace
```
Or build and run specific examples or binaries as detailed in their respective READMEs.
rhailib includes a powerful async architecture that enables Rhai scripts to perform HTTP API calls despite Rhai's synchronous nature. This allows scripts to integrate with external services such as payment processors (e.g., Stripe) and other REST/GraphQL APIs. For example, a script can drive the Stripe API like this:
```rhai
// Configure API client
configure_stripe(STRIPE_API_KEY);

// Create a product with pricing
let product = new_product()
    .name("Premium Software License")
    .description("Professional software solution")
    .metadata("category", "software");
let product_id = product.create();

// Create subscription pricing
let monthly_price = new_price()
    .amount(2999) // $29.99 in cents
    .currency("usd")
    .product(product_id)
    .recurring("month");
let price_id = monthly_price.create();

// Create a subscription
let subscription = new_subscription()
    .customer("cus_customer_id")
    .add_price(price_id)
    .trial_days(14)
    .create();
```
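rhailib's actual async bridge is not shown here, but the general technique for calling async HTTP code from a synchronous Rhai function is worth sketching: the registered function hands the request to a background thread running an async runtime over a channel and blocks until the reply arrives. The sketch below assumes the `tokio` and `reqwest` crates and a hypothetical `http_get` script function; none of these names come from rhailib itself.

```rust
// A minimal sketch (not rhailib's implementation) of bridging synchronous Rhai
// to async HTTP: requests travel over a channel to a background Tokio thread,
// and the Rhai side blocks until the response comes back.
use std::sync::mpsc;
use std::thread;

use rhai::Engine;

// One request: a URL plus a channel to send the response body back on.
struct HttpRequest {
    url: String,
    reply: mpsc::Sender<String>,
}

fn main() {
    // Channel the synchronous Rhai side uses to hand work to the async side.
    let (tx, rx) = mpsc::channel::<HttpRequest>();

    // Background thread owning a Tokio runtime and an async HTTP client.
    thread::spawn(move || {
        let rt = tokio::runtime::Runtime::new().expect("tokio runtime");
        let client = reqwest::Client::new();
        while let Ok(req) = rx.recv() {
            let body = rt
                .block_on(async { client.get(req.url.as_str()).send().await?.text().await })
                .unwrap_or_else(|e| format!("error: {e}"));
            let _ = req.reply.send(body);
        }
    });

    // Register a blocking `http_get` function for scripts to call.
    let mut engine = Engine::new();
    engine.register_fn("http_get", move |url: &str| -> String {
        let (reply_tx, reply_rx) = mpsc::channel();
        tx.send(HttpRequest { url: url.to_string(), reply: reply_tx })
            .expect("async worker gone");
        reply_rx.recv().expect("no response")
    });

    // The script stays fully synchronous from Rhai's point of view.
    let body: String = engine
        .eval(r#"http_get("https://api.example.com/health")"#)
        .expect("script failed");
    println!("{body}");
}
```

This keeps the script's view of the world synchronous while connection pooling, TLS, and concurrency stay inside the async runtime.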
rhailib aims to provide a flexible and powerful way to extend applications with custom logic written in Rhai, executed in a controlled and scalable environment. This is particularly useful for tasks such as automating workflows, offloading computation from the main application, and adding user-defined behavior on top of HeroModels data and logic.