| Crates.io | aipack |
|---|---|
| lib.rs | aipack |
| version | 0.8.1 |
| created_at | 2025-02-22 17:01:35.315765+00 |
| updated_at | 2025-09-01 00:33:38.164725+00 |
| description | Command Agent runner to accelerate production coding with genai. |
| homepage | https://aipack.ai |
| repository | https://github.com/aipack-ai/aipack |
| max_upload_size | |
| id | 1565573 |
| size | 1,748,567 |
Open-source Agentic Runtime to run, build, and share AI Packs.
Simple & Powerful – 1 Agent = 1 multi-stage Markdown file with built-in concurrency, Map-Reduce, and all APIs on a single aip-doc page.
Light & Lean – No bloat, < 20MB, single executable, ZERO dependencies.
Efficient – Engine written in Rust with a lightweight and efficient embedded Lua scripting layer. All `aip` functions are implemented in Rust.
Multi-AI – Supports all major AI providers and models at the native layer. For example, it can set a zero thinking budget for Gemini models.
Local or Cloud – Runs locally (completely IDE-agnostic) or in the cloud, on a server or serverless.
BIG UPDATE 0.8.x WITH NEW TUI
See https://aipack.ai for more information and links, including AIPACK News & Blog Posts.
See the AIPACK Lab repo for some cool examples.
Mac, Linux, and Windows, on both ARM & x86 platforms, are supported. See below for how to install the binaries.
(More info at aipack.ai/doc/install)
# Mac ARM / Apple Silicon
curl -O https://repo.aipack.ai/aip-dist/stable/latest/aarch64-apple-darwin/aip.tar.gz && \
tar -xvf aip.tar.gz && \
./aip self setup
# Mac x86
curl -O https://repo.aipack.ai/aip-dist/stable/latest/x86_64-apple-darwin/aip.tar.gz && \
tar -xvf aip.tar.gz && \
./aip self setup
# Linux x86
curl -O https://repo.aipack.ai/aip-dist/stable/latest/x86_64-unknown-linux-gnu/aip.tar.gz && \
tar -xvf aip.tar.gz && \
./aip self setup
# Linux ARM
curl -O https://repo.aipack.ai/aip-dist/stable/latest/aarch64-unknown-linux-gnu/aip.tar.gz && \
tar -xvf aip.tar.gz && \
./aip self setup
# Windows x86
Invoke-WebRequest -Uri "https://repo.aipack.ai/aip-dist/stable/latest/x86_64-pc-windows-msvc/aip.tar.gz" -OutFile "aip.tar.gz"
tar -xvf aip.tar.gz
.\aip.exe self setup
# Windows ARM
Invoke-WebRequest -Uri "https://repo.aipack.ai/aip-dist/stable/latest/aarch64-pc-windows-msvc/aip.tar.gz" -OutFile "aip.tar.gz"
tar -xvf aip.tar.gz
.\aip.exe self setup
You can also install by building directly from source via Rust; this works on all major OSes.
Install Rust: https://www.rust-lang.org/tools/install
Then install with `cargo install aipack`
# In the terminal, go to your project
cd /path/to/my/project/
# Initialize workspace .aipack/ and ~/.aipack-base
aip init
# Make sure to export the desired API key (no spaces around `=`, per Unix convention)
export OPENAI_API_KEY="sk...."
export ANTHROPIC_API_KEY="...."
export GEMINI_API_KEY="..."
# For more keys, see below
# Check the keys you set up
aip check-keys
# To proofread your README.md (namespace: demo, pack_name: proof)
aip run demo@proof -f ./README.md
# You can just use @pack_name if there is no other pack with this name
aip run @proof -f ./README.md
# To do some code crafting (will create `_craft-code.md`)
aip run demo@craft/code
# Or run your .aip file (you can omit the .aip extension)
aip run path/to/file.aip
# This is a good agent to run to ask questions about aipack
# It can even generate aipack code
aip run core@ask-aipack
# The prompt file will be at `.aipack/.prompt/core@ask-aipack/ask-prompt.md`
Stephane Philipakis, a key aipack collaborator.
David Horner for adding Windows support for Open Agent (with VSCode) (#30)
Diaa Kasem for `--non-interactive`/`--ni` (now, in v0.7.x, `-s` or `--single-shot`) mode (#28)
pro@coder — install it with `aip install pro@coder`, then run it with `aip run pro@coder` (or `aip run @coder` if you don't have any other `@coder` pack in a different namespace). This is the agent I use every day for my production coding.
IMPORTANT 1: Make sure everything is committed before use (at least while you are learning about aipack).
IMPORTANT 2: Make sure to have your API_KEY set as an environment variable (on Mac, there is experimental keychain support).
OPENAI_API_KEY
ANTHROPIC_API_KEY
GEMINI_API_KEY
XAI_API_KEY
DEEPSEEK_API_KEY
GROQ_API_KEY
COHERE_API_KEY
Website: https://aipack.ai
Built on top of the Rust genai library, which supports many top AI providers and models (OpenAI, Anthropic, Gemini, DeepSeek, Groq, Ollama, xAI, and Cohere).
See the full CHANGELOG.
An `.aip` agent file is just a Markdown file with sections for each stage of the agent's processing.

`aip run demo@proof -f "./*.md"` runs the `main.aip` of the `proof` pack in the `demo` namespace, installed at `~/.aipack-base/pack/installed/demo/proof/main.aip`.

Inputs can be passed to the agent with:
- `-f "path/with/optional/**/glob.*" -f "README.md"` (the Lua code will receive a `{path = .., name = ..}` FileInfo-like structure as input; see the sketch after this list)
- `-i "some string" -i "another input"` (the Lua code will receive these strings as input)

`aip run some/path/to/agent` runs an agent by path:
- If the path ends with `.aip`, it's treated as a direct file run.
- If it does not have the `.aip` extension, then:
  - `.../agent.aip` will be executed if it exists.
  - `.../agent/main.aip` will be executed if it exists.

Agents are just `.aip` files that can be placed anywhere on disk, e.g. `aip run ./my-path/to/my-agent.aip ...`
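For illustration, here is a minimal sketch of what the Lua `# Data` stage (described in the stages table below) could do with a `-f` file input. The `input.path`/`input.name` fields match the FileInfo-like structure above; the `aip.file.load` call and its `.content` field are assumptions about the built-in `aip` Lua API — check the aip-doc page for the actual functions.

```lua
-- # Data stage (Lua): runs once per input.
-- `input` is the FileInfo-like {path = .., name = ..} structure
-- produced by the `-f` flag.

-- ASSUMPTION: `aip.file.load` stands in for whatever file-loading
-- function the built-in `aip` Lua API actually provides.
local file = aip.file.load(input.path)

-- Whatever is returned here becomes the data available to the next
-- stages (e.g. the Handlebars `# Instruction` section).
return {
    file_name    = input.name,
    file_content = file.content,
}
```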
A single aipack file may comprise any of the following stages.
| Stage | Language | Description |
|---|---|---|
| `# Before All` | Lua | Reshape/generate inputs and add global data to the command scope (the "map" part of map/reduce). |
| `# Data` | Lua | Gather additional data per input and return it for the next stages. |
| `# System` | Handlebars | Customize the system prompt using data from `# Before All` and `# Data`. |
| `# Instruction` | Handlebars | Customize the instruction prompt using data from `# Before All` and `# Data`. |
| `# Assistant` | Handlebars | Optional for special customizations, such as the "Jedi Mind Trick." |
| `# Output` | Lua | Processes the `ai_response` from the LLM. If not defined, `ai_response.content` is output to the terminal. |
| `# After All` | Lua | Called with inputs and outputs for post-processing after all input processing is complete (the "reduce" part of map/reduce). |
`# Before All` and `# After All` act like the map and reduce steps, running before and after the main input processing loop, respectively.

See the aipack documentation at `core/doc/README.md` (with the Lua modules doc).
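As an example of the "reduce" side, an `# After All` stage could aggregate what was produced for each input. The `inputs`/`outputs` names come from the stage table above; that they are exposed as parallel Lua lists, and that `print` writes to the terminal, are assumptions of this sketch.

```lua
-- # After All stage (Lua): the "reduce" step, called once after every
-- input has been processed.
-- ASSUMPTION: `inputs` and `outputs` are parallel lists available here.

local lines = {}
for i, output in ipairs(outputs) do
    lines[#lines + 1] = string.format("%d. %s", i, tostring(output))
end

-- Emit a simple summary of what was produced for each input.
print("Processed " .. tostring(#inputs) .. " input(s):\n" .. table.concat(lines, "\n"))
```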
You can also run the `ask-aipack` agent.
# IMPORTANT: Make sure you have the `OPENAI_API_KEY` (or the key for your desired model) set in your environment
aip run core@ask-aipack
# The prompt file will be at `.aipack/.prompt/core@ask-aipack/ask-prompt.md`