Crates.io | deep_causality_tensor |
lib.rs | deep_causality_tensor |
version | 0.1.4 |
created_at | 2025-09-19 03:20:28.42556+00 |
updated_at | 2025-09-25 09:08:11.044938+00 |
description | Tensor data structure for the deep_causality crate. |
homepage | |
repository | https://github.com/deepcausality/deep_causality.rs |
max_upload_size | |
id | 1845732 |
size | 132,497 |
The `CausalTensor` provides a flexible, multi-dimensional array (tensor) backed by a single, contiguous `Vec<T>`. It is designed for efficient numerical computations, featuring a stride-based memory layout that supports broadcasting for element-wise binary operations. It offers a comprehensive API for shape manipulation, element access, and common reduction operations like `sum` and `mean`, making it a versatile tool for causal modeling and other data-intensive tasks.
`CausalTensor` is straightforward to use: you create it from a flat vector of data and a vector defining its shape.

```rust
use deep_causality_tensor::CausalTensor;

fn main() {
    // 1. Create a 2x3 tensor.
    let data = vec![1, 2, 3, 4, 5, 6];
    let shape = vec![2, 3];
    let tensor = CausalTensor::new(data, shape).unwrap();
    println!("Original Tensor: {}", tensor);

    // 2. Get an element
    let element = tensor.get(&[1, 2]).unwrap();
    assert_eq!(*element, 6);
    println!("Element at [1, 2]: {}", element);

    // 3. Reshape the tensor
    let reshaped = tensor.reshape(&[3, 2]).unwrap();
    assert_eq!(reshaped.shape(), &[3, 2]);
    println!("Reshaped to 3x2: {}", reshaped);

    // 4. Perform tensor-scalar addition
    let added = &tensor + 10;
    assert_eq!(added.as_slice(), &[11, 12, 13, 14, 15, 16]);
    println!("Tensor + 10: {}", added);

    // 5. Perform tensor-tensor addition with broadcasting
    let t1 = CausalTensor::new(vec![1, 2, 3, 4, 5, 6], vec![2, 3]).unwrap();
    // A [1, 3] tensor...
    let t2 = CausalTensor::new(vec![10, 20, 30], vec![1, 3]).unwrap();
    // ...is broadcast across the rows of the [2, 3] tensor.
    let result = (&t1 + &t2).unwrap();
    assert_eq!(result.as_slice(), &[11, 22, 33, 14, 25, 36]);
    println!("Tensor-Tensor Add with Broadcast: {}", result);

    // 6. Sum all elements in the tensor (full reduction)
    let sum = tensor.sum_axes(&[]).unwrap();
    assert_eq!(sum.as_slice(), &[21]);
    println!("Sum of all elements: {}", sum);
}
```
The following benchmarks were run on a `CausalTensor` of size 100x100 (10,000 `f64` elements).

Operation | Time | Notes |
---|---|---|
`tensor_get` | ~1.64 ns | Accessing a single element. |
`tensor_reshape` | ~786 ns | Metadata only, but clones data in the test. |
`tensor_scalar_add` | ~2.69 µs | Element-wise addition with a scalar. |
`tensor_tensor_add_broadcast` | ~37.7 µs | Element-wise addition with broadcasting. |
`tensor_sum_full_reduction` | ~6.86 µs | Summing all 10,000 elements of the tensor. |
- `get`: Access is extremely fast, demonstrating the efficiency of the stride-based index calculation.
- `reshape`: This operation is very fast, as it only adjusts metadata (shape and strides) and clones the underlying data vector.
- `binary_op`: The optimized `binary_op` function provides efficient broadcasting for tensor-tensor operations, avoiding allocations in hot loops.

The core of `CausalTensor` is its stride-based memory layout. For a given shape (e.g., `[d1, d2, d3]`), the strides represent the number of elements to skip in the flat data vector to move one step along a particular dimension. For a row-major layout, the strides would be `[d2*d3, d3, 1]`. This allows the tensor to calculate the flat index for any multi-dimensional index `[i, j, k]` with a simple dot product: `i*strides[0] + j*strides[1] + k*strides[2]`.
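The stride computation and flat-index lookup described above can be sketched in a few lines of plain Rust. The function names here are illustrative, not the crate's internals:

```rust
// Compute row-major strides for a shape: each stride is the product
// of all dimensions to its right.
fn row_major_strides(shape: &[usize]) -> Vec<usize> {
    let mut strides = vec![1usize; shape.len()];
    for i in (0..shape.len().saturating_sub(1)).rev() {
        strides[i] = strides[i + 1] * shape[i + 1];
    }
    strides
}

// Flat index = dot product of the multi-dimensional index with the strides.
fn flat_index(index: &[usize], strides: &[usize]) -> usize {
    index.iter().zip(strides).map(|(i, s)| i * s).sum()
}

fn main() {
    let shape = [2, 3, 4];
    let strides = row_major_strides(&shape);
    // For shape [d1, d2, d3] = [2, 3, 4], strides are [d2*d3, d3, 1] = [12, 4, 1].
    assert_eq!(strides, vec![12, 4, 1]);
    // [i, j, k] = [1, 2, 3] -> 1*12 + 2*4 + 3*1 = 23
    assert_eq!(flat_index(&[1, 2, 3], &strides), 23);
    println!("strides = {:?}", strides);
}
```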
Binary operations support broadcasting, which follows rules similar to those in libraries like NumPy. When operating on two tensors, `CausalTensor` compares their shapes dimension by dimension (from right to left). Two dimensions are compatible if they are equal, or if one of them is 1. The smaller tensor's data is conceptually "stretched" or repeated along the dimensions where its size is 1 to match the larger tensor's shape, without actually copying the data. The optimized `binary_op` implementation achieves this by manipulating how it calculates the flat index for each tensor inside the computation loop.
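This flat-index manipulation can be sketched as follows: any size-1 dimension effectively contributes stride 0, so its single value is reused rather than copied. This is a minimal sketch assuming equal ranks; the function and names are mine, not the crate's internals:

```rust
// Row-major strides: each stride is the product of the dimensions to its right.
fn row_major_strides(shape: &[usize]) -> Vec<usize> {
    let mut s = vec![1usize; shape.len()];
    for i in (0..shape.len().saturating_sub(1)).rev() {
        s[i] = s[i + 1] * shape[i + 1];
    }
    s
}

// Broadcast-aware element-wise addition (sketch, assumes equal ranks).
fn broadcast_add(a: &[i64], a_shape: &[usize], b: &[i64], b_shape: &[usize]) -> Vec<i64> {
    assert_eq!(a_shape.len(), b_shape.len(), "this sketch assumes equal ranks");
    // Output shape: per-dimension max (valid when dims are equal or one is 1).
    let out_shape: Vec<usize> =
        a_shape.iter().zip(b_shape).map(|(&x, &y)| x.max(y)).collect();
    let (sa, sb, so) = (
        row_major_strides(a_shape),
        row_major_strides(b_shape),
        row_major_strides(&out_shape),
    );
    let total: usize = out_shape.iter().product();
    let mut out = Vec::with_capacity(total);
    for flat in 0..total {
        // Decompose the output flat index into per-dimension coordinates,
        // then re-encode a flat index into each input, skipping size-1 dims.
        let (mut rem, mut ia, mut ib) = (flat, 0usize, 0usize);
        for d in 0..out_shape.len() {
            let coord = rem / so[d];
            rem %= so[d];
            if a_shape[d] != 1 { ia += coord * sa[d]; }
            if b_shape[d] != 1 { ib += coord * sb[d]; }
        }
        out.push(a[ia] + b[ib]);
    }
    out
}

fn main() {
    // [2, 3] + [1, 3]: the single row of b is reused for both rows of a.
    let r = broadcast_add(&[1, 2, 3, 4, 5, 6], &[2, 3], &[10, 20, 30], &[1, 3]);
    assert_eq!(r, vec![11, 22, 33, 14, 25, 36]);
    println!("{:?}", r);
}
```

No intermediate tensor is allocated for the "stretched" operand; the zero-stride trick keeps the hot loop allocation-free.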
The `CausalTensor` API is designed to be comprehensive and intuitive:

- Constructor: `CausalTensor::new(data: Vec<T>, shape: Vec<usize>)`
- Inspection: `shape()`, `num_dim()`, `len()`, `is_empty()`, `as_slice()`
- Element access: `get()`, `get_mut()`
- Shape manipulation: `reshape()`, `ravel()`
- Reductions: `sum_axes()`, `mean_axes()`, `arg_sort()`
- Arithmetic: `+`, `-`, `*`, `/` operators for both tensor-scalar and tensor-tensor operations.
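The mechanics behind an axis reduction like `sum_axes` can be illustrated over a flat buffer: each element contributes to the output cell obtained by dropping the reduced axis from its coordinates. This is a simplified 2D sketch with illustrative names, not the crate's implementation:

```rust
// Sum a row-major 2D buffer along one axis (sketch).
fn sum_axis_2d(data: &[i64], rows: usize, cols: usize, axis: usize) -> Vec<i64> {
    match axis {
        0 => {
            // Collapse rows: output has `cols` entries.
            let mut out = vec![0i64; cols];
            for r in 0..rows {
                for c in 0..cols {
                    out[c] += data[r * cols + c];
                }
            }
            out
        }
        _ => {
            // Collapse columns: output has `rows` entries.
            let mut out = vec![0i64; rows];
            for r in 0..rows {
                for c in 0..cols {
                    out[r] += data[r * cols + c];
                }
            }
            out
        }
    }
}

fn main() {
    let data = [1, 2, 3, 4, 5, 6]; // 2x3: [[1, 2, 3], [4, 5, 6]]
    assert_eq!(sum_axis_2d(&data, 2, 3, 0), vec![5, 7, 9]); // column sums
    assert_eq!(sum_axis_2d(&data, 2, 3, 1), vec![6, 15]);   // row sums
    println!("ok");
}
```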
Contributions are welcome, especially those related to documentation, example code, and fixes. If you're unsure where to start, just open an issue and ask.
Unless you explicitly state otherwise, any contribution intentionally submitted by you for inclusion in deep_causality shall be licensed under the MIT license, without any additional terms or conditions.
This project is licensed under the MIT license.
For details about security, please read the security policy.