| | |
|---|---|
| Crates.io | devai |
| lib.rs | devai |
| version | 0.1.0 |
| source | src |
| created_at | 2024-09-16 03:54:09.71665 |
| updated_at | 2024-09-27 02:58:26.084583 |
| description | Command Agent runner to accelerate production coding. File based, fully customizable, NOT for building snake games. |
| homepage | https://github.com/jeremychone/rust-devai |
| repository | https://github.com/jeremychone/rust-devai |
| max_upload_size | |
| id | 1375976 |
| size | 141,707 |
```sh
# Install
cargo install devai

# Fix all code comments in all matching files
devai run proof-comments -f "./src/m*.rs"

# How: this runs the installed Command Agent file ".devai/defaults/proof-comments.md" on all source files matching "./src/m*.rs"

# IMPORTANT: Make sure everything is committed before usage.
```
- **One Command Agent Markdown file** that defines the full agent flow (sketched below):
  - `items` get expanded from the `-f` file matches (more ways to generate items later).
  - `-> Data` scripting for getting full control over what data to put in the context.
  - `-> Instruction` templating (Handlebars) to have full control over the prompt layout.
  - `-> Output` scripting to get full control over how to manage the AI output.
  - `Data`, `Instruction`, and `Output` (and more later) are all defined in a single file (see below), which is called the Command Agent File.
- Supports all models/providers supported by the genai crate (see below for more information).
- You can customize the model and concurrency in `.devai/config.toml`.
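To make that flow concrete, a run such as `devai run proof-comments -f "./src/m*.rs"` can be pictured roughly as follows (the file names are hypothetical, and this is an illustration of the flow, not actual output):

```sh
# devai run proof-comments -f "./src/m*.rs"
#
#  items          -> one item per matched file, e.g. ./src/main.rs, ./src/my_mod.rs
#  # Data         -> Rhai block runs for each item to build the data for the prompt
#  # Instruction  -> Handlebars template renders the prompt from `item` and `data`
#  # Output       -> Rhai block decides what to do with the AI response
```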
- **IMPORTANT**: Make sure to run this command line when everything is committed, so that overwritten files can be reverted easily.
- STILL IN HEAVY DEVELOPMENT... but it's starting to get pretty cool.
- P.S. If possible, try to refrain from publishing `devai-custom` type crates, as this might be more confusing than helpful. However, any other name is great.
devai uses the genai crate, and therefore the simplest way to provide the API keys for each provider is via environment variables in the terminal when running devai. Here are the environment variable names used:

- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- `GEMINI_API_KEY`
- `GROQ_API_KEY`
- `COHERE_API_KEY`
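For example, a minimal terminal session could look like this (a sketch assuming a POSIX shell and the OpenAI provider; substitute the variable for your provider):

```sh
# Provide the key for the provider you plan to use (OpenAI shown here)
export OPENAI_API_KEY="sk-..."

# Run devai in the same terminal so it can pick up the key
devai run proof-comments -f "./src/m*.rs"
```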
Usage: `devai run proof-comments -f "./src/main.rs"` (or any glob, like `-f "./src/**/*.rs"`)

This creates a `.devai/defaults` folder with the "Command Agent Markdown" `proof-comments.md` (see `.devai/defaults/proof-comments.md`) and runs it with genai as follows:
-f "./src/**/*.rs"
: The -f
command line argument takes a glob and will create an "item" for each file, which can then be accessed in the # Data
scripting section.# Data
, which contains a rhai
block that will get executed with the item
value (the file reference in our example above).
rhai
, there are some utility functions to list files, load file content, and such that can then be used in the instruction section.# Instruction
, which is a Handlebars template section, has access to item
as well as the output of the # Data
section, accessible as the data
variable.
# Output
, which now executes another rhai
block, using the item
, data
, and ai_output
, which is the string returned by the AI.
gpt-4o-mini
and look for the OPENAI_API_KEY
environment variable.OPENAI_API_KEY
, ANTHROPIC_API_KEY
, COHERE_API_KEY
, GEMINI_API_KEY
, GROQ_API_KEY
..devai/defaults/proof-comments.md
(see .devai/defaults/proof-comments.md`)
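To give a feel for the format, here is a rough sketch of what such a Command Agent Markdown file might look like. The section names come from the description above, but the Rhai bodies and the exact shape of `item` and `data` are assumptions, so treat the installed `.devai/defaults/proof-comments.md` as the reference:

````md
# Data

```rhai
// `item` is the file reference for the current match (exact shape assumed).
// A real agent would typically use the provided file utilities here to load content.
item
```

# Instruction

Proofread and fix only the comments in the following Rust file, keeping the code itself unchanged.

{{data}}

# Output

```rhai
// `item`, `data`, and `ai_output` (the raw AI response string) are available here.
// A real agent would use the file utilities to write the result back to the file.
ai_output
```
````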
On `devai run` or `devai init`, a `.devai/config.toml` will be created with the following:
```toml
[genai]
# Required (any model the Rust genai crate supports).
model = "gpt-4o-mini"

[runtime]
# Defaults to 1 if absent. A great way to increase speed when using remote AI services.
items_concurrency = 1
```
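For example, to point at a different provider/model and process several files in parallel, the config could be edited like this (the model name and concurrency value are illustrative assumptions; any model the genai crate supports should work, with the matching API key set):

```toml
[genai]
# Illustrative: this particular model requires ANTHROPIC_API_KEY
model = "claude-3-haiku-20240307"

[runtime]
# Illustrative: run up to 4 items concurrently
items_concurrency = 4
```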
Future plans:

- `# Items` section with `yaml` or `Rhai`.
- `Rhai` modules/functions.
- `# Before All`, `# Before`, `# After`, and `# After All` (all `Rhai`).
- `--dry-req` will perform a dry run of the request by just saving the content of the request in a file.
- `--dry-res` will perform a real AI request but just capture the AI response in a file (the request will be captured as well).
- `--capture` will perform the normal run but capture the request and response in the request/response file.
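If and when these flags are available on `devai run` (the flag placement here is an assumption, not confirmed by the docs above), usage might look like:

```sh
# Save the rendered request to a file without sending it (assumed flag placement)
devai run proof-comments -f "./src/m*.rs" --dry-req

# Send the real request but also capture the request/response files (assumed flag placement)
devai run proof-comments -f "./src/m*.rs" --capture
```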