cogni

Crates.io: cogni
lib.rs: cogni
version: 0.2.1
source: src
created_at: 2023-06-26 16:20:14.410828
updated_at: 2023-12-01 00:40:05.049169
description: Unix native interface for LLMs
repository: https://github.com/leoshimo/cogni
id: 900451
size: 100,291
owner: Leo Shimonaka (leoshimo)
README

cogni

Unix-minded interface for interacting with LLMs.

Focus

cogni brings language model scripting (prompting) into the familiar Unix environment by focusing on:

  • Ergonomics and accessibility in the Unix shell
  • Composability and interop with other programs, including cogni itself
  • Ease of language model programming in both ad-hoc and repeatable ways

For example, designing for IO redirection (stdin, stdout) allows cogni to work with files, editor buffers, clipboards, syslogs, sockets, and many external tools without bespoke integrations.
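As a sketch of what that design enables (the positional-prompt convention shown here is an assumption for illustration, not taken from cogni's documentation):

```
# Hypothetical invocations; the prompt-passing convention is an assumption.
# Summarize a file via stdin/stdout redirection:
cogni 'Summarize this text' < notes.txt > summary.txt

# Feed clipboard contents through the model (macOS pbpaste/pbcopy):
pbpaste | cogni 'Fix the grammar in this text' | pbcopy
```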

Features

  • Unix-minded Design (IO redirection, composability, interop)
  • Ad-hoc Language Model Scripting
  • Flexible input and output formats (Text, JSON, NDJSON)
  • Standalone binary - No Python required
  • 🚧 Repeatable Scripts via Templates
  • 🚧 Integration with external tools (Emacs, Raycast)
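Because NDJSON is line-oriented, output in that format composes with ordinary line filters. The record shape below is a mock-up (the real cogni output schema and field names are assumptions), but the pipeline pattern is the point:

```shell
# Mock NDJSON records standing in for cogni output; field names are assumptions.
printf '%s\n' \
  '{"role":"user","content":"What is 2+2?"}' \
  '{"role":"assistant","content":"4"}' |
  grep '"role":"assistant"'
```

Each record is a complete JSON value on its own line, so tools like grep, sed, or jq can select and transform records without parsing a surrounding array.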

Non-Features

  • Interactive use - instead, invoke cogni from within interactive environments (REPLs, Emacs, etc.)

Installation

# Install from crates.io
$ cargo install cogni

# From source
$ cargo install --path .

Setup

cogni expects an OpenAI API key to be supplied via the --apikey option or, more conveniently, the OPENAI_API_KEY environment variable:

# in shell configuration
export OPENAI_API_KEY=sk-DEADBEEF
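In scripts that wrap cogni, it can help to check for the key up front; this guard is plain POSIX shell, not a cogni feature:

```shell
# Warn early when the key is unset or empty, before any API call is attempted.
if [ -z "${OPENAI_API_KEY:-}" ]; then
  echo 'warning: OPENAI_API_KEY is not set' >&2
fi
```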

Tour of cogni

🚧 WIP
