| Crates.io | feagi |
| lib.rs | feagi |
| version | 0.0.1-beta.4 |
| created_at | 2025-12-23 23:54:57.473711+00 |
| updated_at | 2026-01-25 21:58:37.351369+00 |
| description | Framework for Evolutionary Artificial General Intelligence - Bio-Inspired Neural Computation |
| homepage | |
| repository | https://github.com/feagi/feagi-core |
| max_upload_size | |
| id | 2002552 |
| size | 1,944,089 |
Framework for Evolutionary Artificial General Intelligence - High-performance Rust libraries for bio-inspired neural computation.
FEAGI (Framework for Evolutionary Artificial General Intelligence) is a bio-inspired neural architecture that models brain structures and dynamics. FEAGI Core provides the foundational Rust libraries for building neural networks that learn and adapt like biological brains.
Unlike traditional layer-based neural networks, FEAGI models neural activity as discrete burst cycles over a connectome, mimicking the structure and dynamics of biological brains.
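For intuition, the "bio-inspired" part can be illustrated with a minimal leaky integrate-and-fire neuron, a standard textbook model (this is an illustrative sketch, not FEAGI's actual neuron implementation; all names and constants are made up):

```rust
/// Minimal leaky integrate-and-fire neuron. Parameter names and values
/// are illustrative only.
struct LifNeuron {
    potential: f64, // membrane potential
    leak: f64,      // decay factor per step, in (0, 1)
    threshold: f64, // firing threshold
}

impl LifNeuron {
    fn new(leak: f64, threshold: f64) -> Self {
        Self { potential: 0.0, leak, threshold }
    }

    /// Integrate one input; returns true if the neuron fires.
    fn step(&mut self, input: f64) -> bool {
        self.potential = self.potential * self.leak + input;
        if self.potential >= self.threshold {
            self.potential = 0.0; // reset after firing
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut n = LifNeuron::new(0.9, 1.0);
    // Sub-threshold inputs accumulate until the neuron fires and resets.
    let fired: Vec<bool> = (0..5).map(|_| n.step(0.4)).collect();
    println!("{:?}", fired); // [false, false, true, false, false]
}
```

Unlike a stateless matrix multiplication, the neuron carries state between steps: firing depends on the history of inputs, not just the current one.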
Add to your `Cargo.toml`:

```toml
[dependencies]
feagi = "0.0.1-beta.4"  # Umbrella crate (includes everything)
```

Or use individual building blocks:

```toml
[dependencies]
feagi-npu-burst-engine = "0.0.1-beta.4"  # Just the NPU
feagi-npu-neural = "0.0.1-beta.4"        # Just core types
```

Or the umbrella crate with specific features:

```toml
[dependencies]
feagi = { version = "0.0.1-beta.4", features = ["gpu"] }
```
```rust
use feagi::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Initialize the neural processing unit
    let mut npu = RustNPU::new(100_000, 1_000_000, 20)?;

    // Load brain configuration
    npu.load_connectome("brain.json")?;

    // Process one neural burst cycle
    npu.process_burst()?;

    Ok(())
}
```
```rust
#![no_std]

use feagi_neural::NeuronDynamics;
use feagi_runtime_embedded::EmbeddedRuntime;

// Configure for resource-constrained systems
let runtime = EmbeddedRuntime::new(1000, 5000);
let mut dynamics = NeuronDynamics::new(&runtime);
```
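On `no_std` targets there is no heap allocator by default, so per-step state updates must operate over fixed-size buffers. A minimal illustration of that constraint (the names, sizes, and constants below are invented for the sketch and are not part of the FEAGI API):

```rust
// Illustrative only: an allocation-free state update of the kind a
// `no_std` runtime needs. `N` and the constants are arbitrary.
const N: usize = 8;

/// Decay all membrane potentials in place and return how many neurons
/// remain above `threshold`. Uses only a fixed-size array, so no
/// allocator is required.
fn decay_and_count(potentials: &mut [f32; N], decay: f32, threshold: f32) -> usize {
    let mut above = 0;
    for p in potentials.iter_mut() {
        *p *= decay;
        if *p >= threshold {
            above += 1;
        }
    }
    above
}

fn main() {
    let mut pots = [0.2_f32; N];
    pots[0] = 2.0;
    pots[3] = 1.5;
    // After decay by 0.5, only 1.0 and 0.75 exceed the 0.6 threshold.
    let above = decay_and_count(&mut pots, 0.5, 0.6);
    println!("{above} neurons above threshold"); // 2
}
```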
```python
import feagi_rust

# Create the high-performance engine
engine = feagi_rust.SynapticPropagationEngine()

# Build synaptic connectivity
engine.build_index(source_neurons, target_neurons, weights, conductances, types, valid_mask)

# Process neural activity
result = engine.propagate(fired_neurons)
```
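Conceptually, a propagation step accumulates weighted input for every target of every fired neuron. A toy Rust sketch of that idea (the `HashMap` index here is a stand-in for illustration; it is not the engine's actual data layout, which favors flat, cache-friendly structures):

```rust
use std::collections::HashMap;

/// Toy synaptic index: for each source neuron, its (target, weight) pairs.
struct ToyPropagator {
    synapses: HashMap<u32, Vec<(u32, f32)>>,
}

impl ToyPropagator {
    /// Accumulate weighted input for every target of every fired neuron.
    fn propagate(&self, fired: &[u32]) -> HashMap<u32, f32> {
        let mut input = HashMap::new();
        for src in fired {
            if let Some(targets) = self.synapses.get(src) {
                for &(tgt, w) in targets {
                    *input.entry(tgt).or_insert(0.0) += w;
                }
            }
        }
        input
    }
}

fn main() {
    let mut synapses = HashMap::new();
    synapses.insert(0, vec![(2, 0.5), (3, 0.25)]);
    synapses.insert(1, vec![(2, 0.5)]);
    let engine = ToyPropagator { synapses };

    // Neurons 0 and 1 fired; neuron 2 receives 0.5 + 0.5 = 1.0.
    let input = engine.propagate(&[0, 1]);
    println!("{:?}", input.get(&2)); // Some(1.0)
}
```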
FEAGI Core is organized as a workspace of focused crates, including `feagi` (the umbrella crate), `feagi-npu-burst-engine`, `feagi-npu-neural`, and `feagi-runtime-embedded`.
FEAGI Core delivers significant performance improvements over interpreted implementations.
- Cache-friendly memory layout (`#[repr(C)]`, AoS patterns)
- Type-safe identifiers (`NeuronId`, `SynapseId`)

```toml
[features]
default = ["std", "full"]
std = [...]      # Standard library support
no_std = [...]   # Embedded/bare-metal
wasm = [...]     # WebAssembly target
full = ["compute", "io"]
compute = [...]  # Neural computation only
io = [...]       # I/O and networking
```
```toml
[features]
gpu = [...]      # Cross-platform GPU (WGPU)
cuda = [...]     # NVIDIA CUDA acceleration
all-gpu = [...]  # All GPU backends
```
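The type-safe identifiers mentioned above (`NeuronId`, `SynapseId`) suggest Rust's newtype pattern: distinct wrapper types that cannot be mixed up at compile time, at zero runtime cost. A sketch of how such IDs are typically defined (the field layout and derives here are illustrative, not the crate's actual definitions):

```rust
/// Newtype ID wrappers: layout-identical to the payload integer, but
/// distinct types, so a SynapseId cannot be passed where a NeuronId is
/// expected. Illustrative only.
#[repr(transparent)]
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct NeuronId(u32);

#[repr(transparent)]
#[derive(Clone, Copy, PartialEq, Eq, Debug)]
struct SynapseId(u32);

fn stimulate(neuron: NeuronId) -> NeuronId {
    // `stimulate(SynapseId(7))` would be rejected by the compiler even
    // though both wrap a u32.
    neuron
}

fn main() {
    let n = NeuronId(42);
    assert_eq!(stimulate(n), NeuronId(42));
    // Zero-cost: the wrapper has the same size as its payload.
    assert_eq!(std::mem::size_of::<NeuronId>(), std::mem::size_of::<u32>());
    println!("ok");
}
```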
```bash
# Clone repository
git clone https://github.com/feagi/feagi-core
cd feagi-core

# Build the crate
cargo build --release

# Run tests
cargo test --workspace

# Build with GPU support
cargo build --release --features gpu

# Generate documentation
cargo doc --open
```
We welcome contributions! Whether you're fixing bugs, adding features, improving documentation, or optimizing performance, your help is appreciated.
1. Create a feature branch (`git checkout -b feature/amazing-feature`)
2. Verify your changes locally (`cargo test && cargo clippy`)

All contributions must:

- Pass `cargo clippy` with zero warnings
- Pass `cargo test` (all tests)

```bash
# Check compilation
cargo check --workspace

# Run tests
cargo test --workspace

# Lint code
cargo clippy --workspace -- -D warnings

# Format code
cargo fmt --all

# Build release
cargo build --workspace --release
```
Generate local documentation:

```bash
cargo doc --open
```
```bash
# All tests
cargo test --workspace

# Specific crate
cargo test -p feagi-burst-engine

# With features
cargo test -p feagi-burst-engine --features gpu

# Benchmarks
cargo bench -p feagi-burst-engine
```
Version: 0.0.1-beta.4
Status: Active development
Minimum Rust Version: 1.75+
FEAGI Core is under active development. The core APIs are stabilizing, but breaking changes may occur in minor releases.
Licensed under the Apache License, Version 2.0. See LICENSE for details.
Copyright 2025 Neuraville Inc.
If you use FEAGI in your research, please cite:
```bibtex
@article{nadji2020brain,
  title={A brain-inspired framework for evolutionary artificial general intelligence},
  author={Nadji-Tehrani, Mohammad and Eslami, Ali},
  journal={IEEE Transactions on Neural Networks and Learning Systems},
  volume={31},
  number={12},
  pages={5257--5271},
  year={2020},
  publisher={IEEE}
}
```
Built with Rust for performance, safety, and portability.