Crates.io | cai |
lib.rs | cai |
version | |
source | src |
created_at | 2024-03-28 23:22:38.37237 |
updated_at | 2024-12-08 19:47:33.880391 |
description | The fastest CLI tool for prompting LLMs |
homepage | |
repository | https://github.com/ad-si/cai |
max_upload_size | |
id | 1189482 |
cai - The fastest CLI tool for prompting LLMs

Install it with:

cargo install cai
Before using Cai, an API key must be set up.
Simply execute cai in your terminal and follow the instructions.

Cai supports the following APIs:

- Groq
- OpenAI
- Anthropic
- Llamafile (local server)
- Ollama (local server)
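The interactive setup above is the documented path. If you need to provide a key non-interactively (e.g. in CI), exporting the provider's standard environment variable is a common convention; whether Cai itself reads these variables is an assumption here, not documented behavior:

# Assumption, not documented Cai behavior; values are placeholders.
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."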
Afterwards, you can use cai to run prompts directly from the terminal:
cai List 10 fast CLI tools
Or address a specific model, like Anthropic's Claude Opus:
cai cl List 10 fast CLI tools
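The language shortcuts listed in the help output below prepend a development context to the prompt. For example, to ask a Rust question (the prompt wording is illustrative):

cai rs How do I read a file into a String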
Full help output:
$ cai help
Cai 0.7.0
The fastest CLI tool for prompting LLMs
Usage: cai [OPTIONS] [PROMPT]... [COMMAND]
Commands:
  groq       Groq [aliases: gr]
  ll         - Llama 3 shortcut (🏆 Default)
  mi         - Mixtral shortcut
  openai     OpenAI [aliases: op]
  gp         - GPT-4o shortcut
  gm         - GPT-4o mini shortcut
  anthropic  Anthropic [aliases: an]
  cl         - Claude Opus
  so         - Claude Sonnet
  ha         - Claude Haiku
  llamafile  Llamafile server hosted at http://localhost:8080 [aliases: lf]
  ollama     Ollama server hosted at http://localhost:11434 [aliases: ol]
  all        Simultaneously send prompt to each provider's default model:
             - Groq Llama 3.1
             - Anthropic Claude Sonnet 3.5
             - OpenAI GPT-4o mini
             - Ollama Llama 3
             - Llamafile
  changelog  Generate a changelog starting from a given commit using OpenAI's GPT-4o
  rename     Analyze and rename a file with a timestamp and description
  ocr        Extract text from an image
  bash       Use Bash development as the prompt context
  c          Use C development as the prompt context
  cpp        Use C++ development as the prompt context
  cs         Use C# development as the prompt context
  elm        Use Elm development as the prompt context
  fish       Use Fish development as the prompt context
  fs         Use F# development as the prompt context
  gd         Use Godot and GDScript development as the prompt context
  gl         Use Gleam development as the prompt context
  go         Use Go development as the prompt context
  hs         Use Haskell development as the prompt context
  java       Use Java development as the prompt context
  js         Use JavaScript development as the prompt context
  kt         Use Kotlin development as the prompt context
  lua        Use Lua development as the prompt context
  oc         Use OCaml development as the prompt context
  php        Use PHP development as the prompt context
  po         Use Postgres development as the prompt context
  ps         Use PureScript development as the prompt context
  py         Use Python development as the prompt context
  rb         Use Ruby development as the prompt context
  rs         Use Rust development as the prompt context
  sql        Use SQLite development as the prompt context
  sw         Use Swift development as the prompt context
  ts         Use TypeScript development as the prompt context
  wl         Use Wolfram Language and Mathematica development as the prompt context
  zig        Use Zig development as the prompt context
  help       Print this message or the help of the given subcommand(s)
Arguments:
  [PROMPT]...  The prompt to send to the AI model

Options:
  -r, --raw   Print raw response without any metadata
  -j, --json  Prompt LLM in JSON output mode
  -h, --help  Print help
Examples:
  # Send a prompt to the default model
  cai Which year did the Titanic sink

  # Send a prompt to each provider's default model
  cai all Which year did the Titanic sink

  # Send a prompt to Anthropic's Claude Opus
  cai anthropic claude-opus Which year did the Titanic sink
  cai an claude-opus Which year did the Titanic sink
  cai cl Which year did the Titanic sink
  cai anthropic claude-3-opus-latest Which year did the Titanic sink

  # Send a prompt to locally running Ollama server
  cai ollama llama3 Which year did the Titanic sink
  cai ol ll Which year did the Titanic sink

  # Add data via stdin
  cat main.rs | cai Explain this code
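Because cai reads stdin and the documented -r/--raw flag strips metadata, it slots into ordinary shell pipelines. A small sketch (file names and prompt wording are illustrative; it assumes a configured API key, and the jq line assumes the JSON mode's output is parseable JSON):

# Summarize staged changes with raw output, suitable for scripting
git diff --staged | cai -r Summarize these changes in one sentence

# Combine JSON output mode with jq (assumes jq is installed)
cai -j List three fast CLI tools | jq .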