| Crates.io | graph-sp |
| lib.rs | graph-sp |
| version | 0.1.0 |
| created_at | 2026-01-17 21:16:24.812678+00 |
| updated_at | 2026-01-19 02:31:56.848362+00 |
| description | A pure Rust graph executor supporting implicit node connections, branching, and config sweeps |
| homepage | https://github.com/briday1/graph-sp |
| repository | https://github.com/briday1/graph-sp |
| max_upload_size | |
| id | 2051179 |
| size | 188,843 |
graph-sp is a pure Rust node-graph executor and optimizer. It focuses on representing directed dataflow graphs, computing port mappings by graph inspection, and executing nodes efficiently in-process with parallel CPU execution.
Use `.branch()` to create parallel branches, `.variant()` to create parameter sweeps, and `.to_mermaid()` to render the DAG as a Mermaid diagram.

Add to your Cargo.toml:
```toml
[dependencies]
graph-sp = "0.1.0"
```
The library can also be used from Python via PyO3 bindings:
```bash
pip install graph_sp
```
Or build from source:
```bash
pip install maturin
maturin build --release --features python
pip install target/wheels/graph_sp-*.whl
```
Quick start in Rust:

```rust
use graph_sp::Graph;
use std::collections::HashMap;

fn data_source(_: &HashMap<String, String>, _: &HashMap<String, String>) -> HashMap<String, String> {
    let mut result = HashMap::new();
    result.insert("value".to_string(), "42".to_string());
    result
}

fn multiply(inputs: &HashMap<String, String>, _: &HashMap<String, String>) -> HashMap<String, String> {
    let mut result = HashMap::new();
    if let Some(val) = inputs.get("x").and_then(|s| s.parse::<i32>().ok()) {
        result.insert("doubled".to_string(), (val * 2).to_string());
    }
    result
}

fn main() {
    let mut graph = Graph::new();

    // Add source node
    graph.add(data_source, Some("DataSource"), None, Some(vec![("value", "data")]));

    // Add processing node
    graph.add(multiply, Some("Multiply"), Some(vec![("data", "x")]), Some(vec![("doubled", "result")]));

    let dag = graph.build();
    let context = dag.execute();
    println!("Result: {}", context.get("result").unwrap());
}
```
The same pipeline in Python:

```python
import graph_sp

def data_source(inputs, variant_params):
    return {"value": "42"}

def multiply(inputs, variant_params):
    val = int(inputs.get("x", "0"))
    return {"doubled": str(val * 2)}

# Create graph
graph = graph_sp.PyGraph()

# Add source node
graph.add(
    function=data_source,
    label="DataSource",
    inputs=None,
    outputs=[("value", "data")]
)

# Add processing node
graph.add(
    function=multiply,
    label="Multiply",
    inputs=[("data", "x")],
    outputs=[("doubled", "result")]
)

# Build and execute
dag = graph.build()
context = dag.execute()
print(f"Result: {context['result']}")
```
Mermaid visualization output:
```mermaid
graph TD
    0["DataSource"]
    1["Multiply"]
    0 -->|data → x| 1
```
Parallel branches let several downstream nodes consume the same upstream outputs:

```rust
let mut graph = Graph::new();

// Source node
graph.add(source_fn, Some("Source"), None, Some(vec![("data", "data")]));

// Create parallel branches
graph.branch();
graph.add(stats_fn, Some("Statistics"), Some(vec![("data", "input")]), Some(vec![("mean", "stats")]));

graph.branch();
graph.add(model_fn, Some("MLModel"), Some(vec![("data", "input")]), Some(vec![("prediction", "model")]));

graph.branch();
graph.add(viz_fn, Some("Visualization"), Some(vec![("data", "input")]), Some(vec![("plot", "viz")]));

let dag = graph.build();
```
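The branch functions (`source_fn`, `stats_fn`, `model_fn`, `viz_fn`) are left undefined above. As a hypothetical sketch only (the comma-separated encoding and the arithmetic are assumptions, not part of graph-sp), two of them could look like this, following the node signature from the quick start:

```rust
use std::collections::HashMap;

// Emits a small series under the implementation port "data" (illustrative encoding).
fn source_fn(_: &HashMap<String, String>, _: &HashMap<String, String>) -> HashMap<String, String> {
    let mut out = HashMap::new();
    out.insert("data".to_string(), "1,2,3,4".to_string());
    out
}

// Reads "input", computes a mean, and writes it to "mean".
fn stats_fn(inputs: &HashMap<String, String>, _: &HashMap<String, String>) -> HashMap<String, String> {
    let mut out = HashMap::new();
    if let Some(series) = inputs.get("input") {
        let values: Vec<f64> = series
            .split(',')
            .filter_map(|s| s.trim().parse::<f64>().ok())
            .collect();
        if !values.is_empty() {
            let mean = values.iter().sum::<f64>() / values.len() as f64;
            out.insert("mean".to_string(), mean.to_string());
        }
    }
    out
}
```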
Mermaid visualization output:
```mermaid
graph TD
    0["Source"]
    1["Statistics"]
    2["MLModel"]
    3["Visualization"]
    0 -->|data → input| 1
    0 -->|data → input| 2
    0 -->|data → input| 3
    style 1 fill:#e1f5ff
    style 2 fill:#e1f5ff
    style 3 fill:#e1f5ff
```
DAG statistics (node count, depth, parallelism, branches, variants) are available via `dag.stats()`.
Config sweeps: `graph.variant()` expands the next node added into one variant per parameter value:

```rust
use graph_sp::{Graph, Linspace};

let mut graph = Graph::new();

// Source node
graph.add(source_fn, Some("DataSource"), None, Some(vec![("value", "data")]));

// Create variants for different learning rates
let learning_rates = vec![0.001, 0.01, 0.1, 1.0];
graph.variant("learning_rate", learning_rates);
graph.add(scale_fn, Some("ScaleLR"), Some(vec![("data", "input")]), Some(vec![("scaled", "output")]));

let dag = graph.build();
```
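`scale_fn` is likewise not defined above. A hypothetical sketch that reads the swept `learning_rate` from the variant-parameter map (the second argument of every node function); treating the parameter as a stringified float is an assumption:

```rust
use std::collections::HashMap;

// Multiplies the incoming value by this variant's learning rate.
fn scale_fn(inputs: &HashMap<String, String>, variant_params: &HashMap<String, String>) -> HashMap<String, String> {
    let mut out = HashMap::new();
    let value = inputs.get("input").and_then(|s| s.parse::<f64>().ok());
    let lr = variant_params.get("learning_rate").and_then(|s| s.parse::<f64>().ok());
    if let (Some(v), Some(lr)) = (value, lr) {
        out.insert("scaled".to_string(), (v * lr).to_string());
    }
    out
}
```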
Mermaid visualization output:
```mermaid
graph TD
    0["DataSource"]
    1["ScaleLR (v0)"]
    2["ScaleLR (v1)"]
    3["ScaleLR (v2)"]
    4["ScaleLR (v3)"]
    0 -->|data → input| 1
    0 -->|data → input| 2
    0 -->|data → input| 3
    0 -->|data → input| 4
    style 1 fill:#e1f5ff
    style 2 fill:#e1f5ff
    style 3 fill:#e1f5ff
    style 4 fill:#e1f5ff
    style 1 fill:#ffe1e1
    style 2 fill:#e1ffe1
    style 3 fill:#ffe1ff
    style 4 fill:#ffffe1
```
DAG statistics for the sweep are likewise available via `dag.stats()`.
Rust API:

- `Graph::new()` - Create a new graph
- `graph.add(fn, name, inputs, outputs)` - Add a node
  - `fn`: Node function with signature `fn(&HashMap<String, String>, &HashMap<String, String>) -> HashMap<String, String>`
  - `name`: Optional node name
  - `inputs`: Optional vector of `(broadcast_var, impl_var)` tuples for input mappings
  - `outputs`: Optional vector of `(impl_var, broadcast_var)` tuples for output mappings
- `graph.branch()` - Create a new parallel branch
- `graph.variant(param_name, values)` - Create parameter sweep variants
- `graph.build()` - Build the DAG
- `dag.execute()` - Execute the graph and return the execution context
- `dag.stats()` - Get DAG statistics (nodes, depth, parallelism, branches, variants)
- `dag.to_mermaid()` - Generate a Mermaid diagram representation
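For orientation, a minimal self-contained sketch exercising `build()`, `to_mermaid()` and `execute()`; the assumption that `to_mermaid()` returns a printable `String` mirrors the `(str)` return listed for the Python bindings below:

```rust
use graph_sp::Graph;
use std::collections::HashMap;

// A single trivial node keeps the sketch self-contained.
fn emit(_: &HashMap<String, String>, _: &HashMap<String, String>) -> HashMap<String, String> {
    let mut out = HashMap::new();
    out.insert("value".to_string(), "1".to_string());
    out
}

fn main() {
    let mut graph = Graph::new();
    graph.add(emit, Some("Emit"), None, Some(vec![("value", "data")]));

    let dag = graph.build();
    println!("{}", dag.to_mermaid()); // inspect the DAG before running it (assumes a String return)

    let context = dag.execute();
    println!("{:?}", context.get("data")); // broadcast variable "data", as in the quick start
}
```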
The Python bindings provide a similar API:

- `PyGraph()` - Create a new graph
- `graph.add(function, label, inputs, outputs)` - Add a node
  - `function`: Python callable with signature `fn(inputs: dict, variant_params: dict) -> dict`
  - `label`: Optional node name (str)
  - `inputs`: Optional list of `(broadcast_var, impl_var)` tuples or dict
  - `outputs`: Optional list of `(impl_var, broadcast_var)` tuples or dict
- `graph.branch(subgraph)` - Create a new parallel branch with a subgraph
- `graph.build()` - Build the DAG and return a `PyDag`
- `dag.execute()` - Execute the graph and return the execution context (dict)
- `dag.execute_parallel()` - Execute with parallel execution where possible (dict)
- `dag.to_mermaid()` - Generate a Mermaid diagram representation (str)

The Python bindings are designed with proper GIL handling:
- Uses `pyo3::prepare_freethreaded_python()` (via `auto-initialize`) for multi-threaded safety

This means that while Python functions execute sequentially (due to the GIL), the Rust graph traversal and coordination happen in parallel without GIL contention.
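As a rough illustration of that pattern (a hypothetical sketch, not graph-sp's actual internals), a PyO3-based executor can do its Rust-side work with the GIL released and hold the GIL only while invoking a Python node function:

```rust
use pyo3::prelude::*;

// Illustrative only: release the GIL around Rust-side preparation, re-acquire it
// just for the Python callback. Names and data flow here are assumptions.
fn call_python_node(node_fn: Py<PyAny>, raw_input: String) -> PyResult<String> {
    Python::with_gil(|py| {
        // Rust-side work (scheduling, port mapping) can run with the GIL released.
        let prepared = py.allow_threads(move || raw_input.trim().to_string());

        // The GIL is held again only while the Python node function itself runs.
        let result = node_fn.call1(py, (prepared,))?;
        result.extract::<String>(py)
    })
}
```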
Prerequisites: a recent stable Rust toolchain (rustc and cargo).
Build and run tests:
```bash
cargo build --release
cargo test
```
Run examples:
```bash
cargo run --example comprehensive_demo
cargo run --example parallel_execution_demo
cargo run --example variant_demo_full
```
Prerequisites: Python 3 with pip, plus a Rust toolchain (the bindings are compiled with maturin).
Build Python bindings:
```bash
# Create virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install maturin
pip install maturin==1.2.0

# Build and install in development mode
maturin develop --release --features python

# Run Python example
python examples/python_demo.py
```
Build wheel for distribution:
```bash
maturin build --release --features python
# Wheel will be in target/wheels/
```
This repository is configured with GitHub Actions workflows to automatically publish to crates.io and PyPI when a release tag is pushed.
To enable automatic publishing, the repository owner must configure the following secrets in GitHub Settings → Secrets and variables → Actions:
- `CRATES_IO_TOKEN`: Your crates.io API token (obtain from https://crates.io/me)
- `PYPI_API_TOKEN`: Your PyPI API token (obtain from https://pypi.org/manage/account/token/)

The publish workflow (`.github/workflows/publish.yml`) runs automatically when a tag matching `v*` is pushed (e.g., `v0.1.0`, `v1.0.0`).

Creating a release:
```bash
# Ensure version numbers in Cargo.toml and pyproject.toml are correct
git tag -a v0.1.0 -m "Release v0.1.0"
git push origin v0.1.0
```
The workflow will:
- Build and publish Python wheels to PyPI (if `PYPI_API_TOKEN` is set) - prebuilt wheels mean end users do not need Rust
- Publish the crate to crates.io (if `CRATES_IO_TOKEN` is set)

Important notes:

- `pip install graph_sp` will not require Rust on the target machine because prebuilt, platform-specific wheels are published
- Wheels are built with `maturin build --release --features python`

If you prefer to publish manually or need to publish from a local machine:
To crates.io:
```bash
cargo publish --token YOUR_CRATES_IO_TOKEN
```
To PyPI:
```bash
# Install maturin
pip install maturin==1.2.0

# Build and publish wheels
maturin publish --username __token__ --password YOUR_PYPI_API_TOKEN --features python
```